In the wake of recent news regarding improper use of online consumer information, many businesses, governments, and civil society organizations are questioning what responsibilities companies of all sizes have to protect user data. Data privacy is a universal issue affecting citizens and the private sector across the globe. Officials in Kenya, Nigeria, the Philippines, the United Kingdom, and the United States are among those scrutinizing data harvesting techniques and other practices by political consulting firm Cambridge Analytica and social media giant Facebook. This follows reports that personal information for millions of users was shared without proper consent and may have been used to influence elections.
The private sector depends on consumer data and feedback to spur innovation and to make marketing more targeted and cost-effective. Estimates by the United Nations Conference on Trade and Development (UNCTAD) indicate that worldwide e-commerce sales reached nearly $26 trillion in 2016, with large increases expected in the coming years and emerging economies accounting for much of the growth. As more businesses digitize, this new reality raises an important question: How can companies in industries that rely on user data to improve their products and services (healthcare, retail, finance, and others) handle valuable consumer information ethically?
Many governments in emerging markets are pushing to improve business environments so that they can compete in the digital economy. Representatives from countries such as Egypt and Bangladesh, along with members of the Association of Southeast Asian Nations (ASEAN), will discuss their national or regional e-commerce strategies and challenges at UNCTAD's upcoming E-Commerce Week in Geneva.
While governments take up these discussions at international fora, the business community in local economies must also debate how to collect and use data for innovation while respecting privacy and data ownership beyond what is legally required. These dialogues should take place both inside companies and publicly with stakeholders.
Companies should treat data privacy and responsibility as part of corporate strategy, not simply as a compliance issue. Businesses are generally obligated to obey privacy laws, but going the extra mile to respect the human right to privacy in the digital age also makes good business sense: it strengthens brand reputation and builds trust with consumers and users. Businesses that neglect data privacy often face financial loss, reputational damage, or lawsuits. A recent U.S. cybersecurity survey found that only 25% of consumers polled believe companies handle personal information responsibly; the overwhelming majority said they avoid buying from companies with questionable security practices and would like to see more government regulation of data usage.
To make digital privacy an essential part of a company’s business strategy, departments across the firm, including legal and compliance, privacy, tech, research and development, and sales and marketing, must cooperate. Guides and resources offered by Global Digital Partners, for instance, provide good starting points for firms in emerging economies to deliberate on the business case for data privacy and practical ways they can act responsibly and ethically.
In addition, it is vital for companies to engage in public-private dialogues (PPDs) to understand stakeholders' demands and to collaboratively design policies or standards on data privacy. Fostering a strong, competitive market that responds to and respects government and consumer demand requires the public and private sectors to come together. Participating in PPDs benefits companies because governments are more likely to develop fair, business-friendly policies or standards when they listen to businesses in open and constructive conversations. The PPD process, a structured, transparent, and inclusive dialogue, also helps build trust among companies, governments, and consumers.
One example of a regulatory approach that governments could take is the European Union's General Data Protection Regulation (GDPR), a new privacy law set to take effect on May 25, 2018. GDPR restricts how personal data is collected and handled, ensuring that users know, understand, and consent to the data collected about them. Although the law protects individuals in the EU's 28 member countries, some companies are applying certain changes universally.
When it comes to data privacy challenges, we have seen just the tip of the iceberg. As more industries integrate artificial intelligence, the internet of things, and other emerging technologies into their operations, more data will be produced and used by citizens, businesses, and governments around the world. Incidents of consumer data being monetized without user consent, cybersecurity attacks, and data breaches will inevitably become more frequent as global digital transactions escalate. Now is the time for the business community in emerging markets to lead the way by putting digital privacy at the forefront of corporate strategy and by holding dialogues with stakeholders on how best to collect and use consumer data. Companies gain a competitive advantage when they proactively gather stakeholders' input on how to ethically manage user data, rather than asking for forgiveness after embarrassing incidents happen.
Maiko Nakagaki is a Program Officer for Global Programs at the Center for International Private Enterprise (CIPE).
Read about the joint initiative by CIPE, National Democratic Institute (NDI), and Center for International Media Assistance (CIMA) to identify and promote internet norms and principles that are fundamental to democratic governance.