Data Privacy and Digital Trust


THE DILEMMA OF limiting personal data privacy against the need to share information in the interest of public safety has come into sharp focus in the current pandemic. While the rationale is well understood, there are concerns that this precedent may become an irreversible and automatic default.

The latest data breaches and scandals involving Cambridge Analytica, Facebook and Marriott have further alerted users to how data are collected, stored – and breached. In the case of Cambridge Analytica, a quiz app was used to harvest, without consent, the personal data of millions of Facebook users; the data were then analysed and used for political advertising in support of the 2016 Trump campaign. The scandal, which broke in 2018, was the straw that broke the camel’s back.

From Nonchalance to Anxiety

Of late, employment search firms, under the guise of better matching candidates to prospective jobs, have introduced psychographic tests that generate diagnostic profiles of behavioural strengths and weaknesses in a variety of work and leadership scenarios. Genomics companies collect DNA data to help customers trace their ancestry, or ascertain allergies or hereditary diseases.

Another trend is fitness wearables, which users employ to track calories burned and steps walked, and to monitor blood pressure, sleep patterns and heart rate. Presumably such data will aid health diagnosis or motivate healthier behaviours. The data from these wearables are typically fed into a centralised cloud database. However, the privacy policies of such product suppliers are vague, and until a breach occurs, the measures used to encrypt the data and deny unauthorised access are seldom scrutinised.

These examples may seem beneficial and the data collection innocent enough prima facie, but users should question how such biological and psychological insights and personal information are used, how they are stored or shared, and for how long. While a Facebook account can be deleted relatively easily, getting these companies to permanently erase personal data may be time-consuming, complicated and next to impossible.

Interestingly, and despite the cited scandals and breaches, digital ubiquity has put users at the mercy of such service providers. Personal data are given freely in exchange for the services received, and many users are increasingly willing to mechanically accept cookies and privacy policies in return for access. Cookies work by tracking user behaviour, e.g. how a person navigates a website, and by storing location-based data. The data collected by cookies can become a privacy risk: they can be used to analyse behavioural patterns, preferences and sentiments for companies’ subliminal targeting and persuasive advertising.
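
To make the mechanism concrete, here is a minimal sketch, using only Python’s standard library, of how a tracking cookie works. It is an illustration only: the cookie name VISITOR_ID and the record_visit helper are invented for this example, not taken from any real analytics product.

```python
import uuid
from http import cookies

# Server side: on a first visit, assign the browser a persistent ID.
jar = cookies.SimpleCookie()
jar["VISITOR_ID"] = str(uuid.uuid4())                   # hypothetical cookie name
jar["VISITOR_ID"]["max-age"] = str(60 * 60 * 24 * 365)  # persists for a year
print(jar.output())  # the Set-Cookie header the server would send

# On every later request the browser returns the cookie, letting the
# site stitch individual page views into a behavioural trail.
page_views = {}

def record_visit(cookie_header, page):
    received = cookies.SimpleCookie(cookie_header)
    visitor = received["VISITOR_ID"].value
    page_views.setdefault(visitor, []).append(page)

# Simulate one visitor browsing three pages.
header = "VISITOR_ID=" + jar["VISITOR_ID"].value
for page in ["/home", "/pricing", "/checkout"]:
    record_visit(header, page)
print(page_views)  # one browsing trail per visitor ID
```

It is this stitching together of page views under one persistent identifier, across visits and sometimes across sites, that turns a convenience feature into a profiling tool.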

Demand and Supply of Data

As technology advances, so will data collection methods and devices. In the future, we can envisage a presence-aware living environment built on the Internet of Things: a hyper-network of sensors and cameras continually collecting data on people, their movements and their surroundings. This was already happening in pre-pandemic China, where CCTVs and facial recognition software are used to track citizens’ movements.

When viewed through the lens of liberal standards, this approach is construed as the work of an intrusive Big Brother state. Authoritarian regimes, however, justify such surveillance on the grounds of the greater good: as part of the Social Contract, so the argument goes, the average law-abiding citizen who values safety is willing to make the necessary trade-off in return for a life free from crime, pandemics and disasters.

Artificial Intelligence (AI) algorithms depend on large volumes of data to train their learning models to make inferences and deductions. As these models grow in sophistication, so will the demand for data. However, their accuracy is subject to the quality of the input: feeding in bad data, or data deliberately coloured and tainted by human bias and prejudice, will influence what these algorithms decide and recommend. Likewise, if certain population segments are significantly more stringent about data-sharing than others, the outcomes of AI models may be skewed, as the toy sketch below illustrates.
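
The following toy simulation is a sketch with made-up numbers (the TRUE_RATE and OPT_IN figures are purely illustrative, not drawn from any real dataset); it shows how uneven opt-in rates alone can skew what a model learns:

```python
import random

random.seed(0)

TRUE_RATE = {"A": 0.2, "B": 0.6}  # assumed ground-truth outcome rates
OPT_IN    = {"A": 0.9, "B": 0.1}  # segment B rarely shares its data

# Simulate 10,000 people per segment; keep only those who opt in.
shared = []
for segment in ("A", "B"):
    for _ in range(10_000):
        if random.random() < OPT_IN[segment]:
            shared.append(random.random() < TRUE_RATE[segment])

# A naive "model": predict the overall rate seen in the shared data.
learned = sum(shared) / len(shared)
actual = (TRUE_RATE["A"] + TRUE_RATE["B"]) / 2

print(f"rate learned from shared data: {learned:.2f}")  # roughly 0.24
print(f"true population-wide rate:     {actual:.2f}")   # 0.40
```

Because segment B is barely represented in the shared data, the model’s estimate drifts towards segment A’s behaviour, and any decision built on it will under-serve the reticent group.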

"... digital ubiquity has put users at the mercy of such service providers. Personal data are given freely in exchange for the received services."

Data brokerage is a burgeoning industry that many are unaware of. Data brokers trawl the internet to scrape, aggregate and harvest data without consent, and monetise them through lucrative trades. They collect and trade data from publicly available sources such as property transactions, court cases, driver’s licence sites, company websites, browsing histories, social media connections and retail stores. The data amassed range from scarily accurate to false and out of date; all the same, they are still of some value to companies formulating their product and marketing strategies.

In The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, Prof Shoshana Zuboff writes, “Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioural data” that are packaged into “prediction products traded in behavioural futures marketplace”. Does that mean that we should guard our personal data tightly, drop off the grid and shy away from going digital?

Digital Trust

Developing trust is essential in the push towards going digital. Companies earn digital trust by demonstrating that they are serious and ethical in protecting the safety, privacy, security and reliability of personal data collected from online programmes or tracked through devices. The measures and steps taken in building digital trust and confidence ultimately contribute to brand equity and a reputation of trustworthiness.

According to Harvard Business Review1, digital trust can be measured in the dimensions of behaviour, attitudes, environment and experience. On this scale, Malaysia scored low trust in the attitudes and experience dimensions.

  • Behaviour refers to how users actually respond to frictions in their digital experiences and environment. There is some level of friction in any digital transaction, such as entering a security code or the latency in loading a webpage. In general, users are tolerant as they view these steps as necessary and will persist in completing the transaction.
  • Attitudes refer to how users feel about the digital trust environment – whether they trust governments and businesses to respect data privacy and use data responsibly to provide value. An example is the voluntary download of a Bluetooth app to facilitate contact tracing. For the app to be effective, at least 75% of the population must install it and keep Bluetooth on (the rough arithmetic sketched after this list shows why adoption matters so much). However, data from Singapore show that only 25% of the population did so, despite the government being transparent, opening the app’s source code for scrutiny and reassuring citizens of data privacy.
  • Digital Environment refers to the robustness of the mechanisms that guarantee accountability and prevent security breaches. Unlike the more stringent European Union General Data Protection Regulation 2018, Malaysia’s Personal Data Protection Act 2010 only protects against the inappropriate use of personal data for commercial purposes. Several gaps remain, including geolocation data and cookies, and data processed outside of Malaysia are not covered either. To strengthen digital trust, we need to bolster our legal framework and implement identity management for enforcement and accountability on par with other jurisdictions.
  • Digital User Experience refers to how users experience the digital trust environment, and to striking the right balance between security and friction. While more security measures may promote trust, they can also create negative friction by making transactions cumbersome; the aim is a seamless experience for online engagements.
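
Why the 75% threshold cited above matters can be seen with a rough back-of-envelope model. Assuming, as a simplification, that uptake is independent across people, an encounter is only logged when both parties are running the app with Bluetooth on, so encounter coverage scales with the square of the adoption rate:

```python
# Back-of-envelope: both parties to an encounter must run the app.
for adoption in (0.25, 0.75):
    coverage = adoption ** 2
    print(f"adoption {adoption:.0%} -> {coverage:.1%} of encounters traceable")

# adoption 25% -> 6.2% of encounters traceable
# adoption 75% -> 56.2% of encounters traceable
```

At Singapore’s reported 25% uptake, only about one encounter in sixteen is traceable, which is why attitudes, and the trust that shapes them, are so decisive for such schemes.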

The digital tide is unstoppable, and with digitisation, everything is going deep,2 e.g. deep learning, deep insights, deep surveillance, deep mind. Such advances are steering us towards uncharted territory and empowering us as never before. But this unparalleled promise also presents potential perils, and governments need to step up and regulate in order to ensure data privacy and maintain digital trust.

Tony Yeoh is the CEO of Digital Penang.


