What is the future of data privacy?

The implementation of the European Union’s General Data Protection Regulation (GDPR) was a milestone for regulators and businesses alike, with a major knock-on effect in Asia Pacific (APAC). In its wake, businesses made a significant move to step up their protection of personal data and their respect for individuals’ rights over their own information, or information about them.

However, factoring data protection into a post-GDPR world is a major undertaking. A recent report, ‘Facing the Future: Developing a response to regulatory change’, shows that two in five organisations in APAC are not ready to address new global regulations. Legal and compliance departments are struggling to deal with the complexity of new regulations and the speed of change. At the same time, businesses need to adapt to a changing consumer view: that data privacy is an individual human right, and one not to be superseded by commercial interest.

While the GDPR can seem burdensome, this legislation is only the beginning when it comes to defining privacy expectations. What further regulation might look like, and how it will shape the future of data privacy, is an important consideration.

Consumer activism

Consumer awareness and involvement will trigger a much greater conversation about what data privacy means and how it is applied. It would be beneficial if consumers were given more ways of tracking down where a company got their data from (much as services like ‘Have I Been Pwned’ can flag whether a user’s email address has appeared in a breach), along with more transparent information about individual rights, and easier processes to opt out and withdraw consent across the board.
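As a rough illustration of what that kind of self-service transparency looks like in practice, the sketch below queries the publicly documented Have I Been Pwned v3 API for breaches associated with an email address. The email address and API key are placeholders, the v3 breached-account endpoint requires a paid API key plus a user-agent header, and this is a minimal sketch rather than a production integration.

```python
import requests

# Minimal sketch: look up which known breaches an email address appears in,
# using the Have I Been Pwned v3 "breachedaccount" endpoint. The API key and
# email below are placeholders; a real key must be obtained from the service.
HIBP_URL = "https://haveibeenpwned.com/api/v3/breachedaccount/{account}"

def breaches_for(email: str, api_key: str) -> list[str]:
    resp = requests.get(
        HIBP_URL.format(account=email),
        headers={"hibp-api-key": api_key, "user-agent": "privacy-check-demo"},
        params={"truncateResponse": "true"},  # breach names only, no details
        timeout=10,
    )
    if resp.status_code == 404:  # 404 means the address is in no known breach
        return []
    resp.raise_for_status()
    return [breach["Name"] for breach in resp.json()]

if __name__ == "__main__":
    for name in breaches_for("someone@example.com", api_key="YOUR-HIBP-KEY"):
        print(f"Found in breach: {name}")
```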

This will form part of an evolving conversation about trust between consumers and organisations, ranging from corporations through to charities. The rise of high-profile data breaches, password leaks and data misuse cases in Singapore has eroded the trust that individuals place in government and businesses alike. The damage can be repaired, but it will take work on the part of organisations to win trust back through transparency and a candid relationship with consumers over how data is, and indeed is not, being used.

Consumers also have the power to shape how regulators enforce sanctions for GDPR non-compliance. It would be impossible for local privacy watchdogs, even dedicated bodies such as the Personal Data Protection Commission, to monitor the entire Internet for breaches of the policy, so it will fall to wronged citizens to flag the issues that matter to them. Although it is still uncertain which transgressions will attract which fines, a consumer groundswell will exert pressure on regulators to meet public demand.

International co-ordination will remain patchy at best

The formal introduction of the GDPR has indirectly prompted many ASEAN countries to revisit and adapt their own data protection regulations. Many ASEAN members already have data privacy laws, for example Singapore (Personal Data Protection Act 2012), Malaysia (Personal Data Protection Act 2010) and the Philippines (Data Privacy Act of 2012), although many of the proposed drafts and new regulations are less stringent than the GDPR. From an industry perspective, the culture of cross-border business would ideally lead to a global standard for data privacy, but today’s geopolitical landscape limits the scope and ability of an internationally recognised governing body.

Instead, on the global stage, the onus will fall on local enforcement to shape how international companies approach data privacy in their business practices. Some will do their best to apply the tightest standards globally, extending those protections to all their customers. Others will risk-assess and act country by country, while some will continue to pay lip service to the regulation, emboldened by the limited number of cases in which significant penalties have been enforced.

For example, British Airways was recently handed a £183 million fine for infringement of the GDPR following a data breach that affected 500,000 customers last year, a penalty amounting to around 1.5% of the airline’s annual revenue. Whether monetary penalties are an effective impetus for change among tech giants remains questionable. Meanwhile, many other large enterprises continue to operate in legacy data environments, unnoticed thanks to their relative obscurity.

Closer to home, Singapore’s PDPA came into full effect on 2 July 2014, with organisations that fail to comply facing fines of up to $1 million. Yet the total fines meted out in 2017 amounted to just $93,000, with every penalty centred on the same offence: inadequate security measures for personal data.

Ethical questions around automation

Another issue arises around the way privacy is determined and managed as automation, machine learning and AI are increasingly applied across the Internet of Things.

Anonymisation is often presented as the answer, yet even with anonymised data, businesses are still profiting from the use of a person’s information, just without their name attached. Consumers are waking up to how their information is being monetised: research published at the end of 2018 suggested that Facebook users would want to be paid more than $1,000 to deactivate their accounts for a year. Given the value that an individual’s data provides to a service like Facebook over the same period, the reality is that many services may end up paying individuals for the consensual use of their data.

As Singapore positions itself to tap into the vast opportunities represented by big data analytics, an ethical dilemma arises around anonymised data. For example, wearable health devices are gaining traction, tracking information on cardiovascular activity, which is then analysed anonymously by healthcare researchers using AI. If there is a correlation between a certain reading and a health risk, is there an ethical obligation for those researchers to inform the users who exhibit that pattern?
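To make the dilemma concrete, the sketch below shows one common approach: pseudonymisation rather than full anonymisation, in which a keyed hash replaces the user identifier before readings are shared with researchers. The field names and key handling are illustrative assumptions, not a description of any real product. Because the data controller retains the key, a risky pattern flagged by researchers could in principle be traced back to the affected user, something genuinely anonymised data would not allow.

```python
import hmac
import hashlib
from typing import Iterable

# Hypothetical sketch: pseudonymise wearable readings before sharing them for
# analysis. The controller keeps SECRET_KEY; researchers only ever see the
# HMAC pseudonym, so only the controller can map a flagged pattern back to a
# real user. True anonymisation would make even that impossible.
SECRET_KEY = b"replace-with-a-securely-stored-key"  # placeholder, not a real key

def pseudonym(user_id: str) -> str:
    """Derive a stable, non-reversible pseudonym for a user identifier."""
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

def pseudonymise(readings: Iterable[dict]) -> list[dict]:
    """Strip direct identifiers, keeping only the pseudonym and the measurement."""
    return [
        {"subject": pseudonym(r["user_id"]), "heart_rate": r["heart_rate"]}
        for r in readings
    ]

if __name__ == "__main__":
    raw = [{"user_id": "alice@example.com", "heart_rate": 172}]
    print(pseudonymise(raw))  # researchers receive no email address or name
```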

How these types of issues are handled will likely remain a topic of debate for years to come, as deep learning and AI generate more insight from increasingly sophisticated ways of collecting this sort of data. Singapore introduced its Model AI Governance Framework at the World Economic Forum in 2019 as guidance on how AI can be used ethically and responsibly. Such guides, however, represent only the infancy of ethical discussions around how anonymised, once personal, data is used and managed.

Why should businesses take note now, before these changes even come into effect? According to the ‘Facing the Future’ report, more than 70% of the experts surveyed agreed that stronger regulation on data usage, protection and privacy can have a beneficial impact on businesses. Taking a more stringent approach to data protection inevitably leads to better data management overall, which in turn enables more meaningful business insights, optimises costs and improves brand reputation. In that light, the future of data privacy will no doubt see businesses making an active, long-term commitment to a privacy-first mindset and, ultimately, regaining the trust of currently doubting consumers.