Customer experience will drive your bottom line, and AI will help you win that race. Meanwhile, privacy is becoming the key to keeping those wins.
In today’s era of digital business, customer experience is often the sole factor that distinguishes your service from your competitors’. As McKinsey forecast the dawn of the hyper-personalisation era, Gartner published multiple reports within the span of a year remarking on the rising value of AI - after all, great, personalised experiences inevitably leverage AI across multiple functions, achieving astonishing results in the process.
At the same time, regulators were setting new benchmarks for the cost of privacy breaches. In 2021, Amazon was hit with a €746m fine for non-compliance with the General Data Protection Regulation (GDPR). Google paid a similar amount in 2020, and some of the biggest names have made headlines for privacy breaches - Twitter, Marriott, WhatsApp, H&M, British Airways, and Uber, amongst others. Fines, however, were not the only price these brands paid - WhatsApp, for instance, continues to pay with a loss of market share as its install rates fall and competing services gain close to a million new users a day.
If we are quick enough to connect the dots, here is the TL;DR: While AI becomes essential to run and grow business in a data-driven world, data privacy is the key to keeping it working that way.
The what of data privacy
So, what kind of information do enterprises need to keep under lock and key? In the early era of computing, privacy laws largely pertained to sensitive fields like employee IDs, customer IDs, social security numbers, and credit card numbers. Data privacy was a simpler deal a few years ago, but AI has rapidly changed the game: today, most people can be identified by connecting seemingly trivial dots. For instance, a person’s date of birth, ZIP code, and gender, taken together, can serve as a unique identifier - data mining techniques can identify over 87% of the US population in this manner.
Consider how the situation becomes more difficult when the context adds more clues to the mix. For instance, a patient whose gender is not known might still be identifiable via a disease that only affects women. Similarly, customers could be identified by the category of apparel they purchase. Personally identifiable information, or PII, is no longer a fixed set of fields to be kept under wraps - instead, it is determined by the intersection of the context to which the data corresponds and evolving data mining techniques.
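The combination effect described above is easy to demonstrate. The sketch below (toy, invented records - no real dataset) counts how many records share each combination of "harmless" fields; any combination occurring exactly once re-identifies a person even after the name is hidden:

```python
from collections import Counter

# Toy records: the names and values are purely illustrative.
records = [
    {"name": "A", "dob": "1984-03-12", "zip": "02139", "sex": "F"},
    {"name": "B", "dob": "1984-03-12", "zip": "02139", "sex": "M"},
    {"name": "C", "dob": "1991-07-04", "zip": "94016", "sex": "F"},
    {"name": "D", "dob": "1991-07-04", "zip": "94016", "sex": "F"},
]

def unique_by(records, fields):
    """Return records whose combination of `fields` occurs exactly once."""
    key = lambda r: tuple(r[f] for f in fields)
    counts = Counter(key(r) for r in records)
    return [r for r in records if counts[key(r)] == 1]

# dob + zip + sex singles out A and B even with the name column removed;
# C and D share the same triple, so they stay anonymous within the set.
exposed = unique_by(records, ["dob", "zip", "sex"])
```

This is the intuition behind quasi-identifiers: privacy depends not on any one field, but on how many records share a combination of fields.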
The why of data privacy
In response to the rising price tag of data privacy, legislators are now issuing and enforcing stricter compliance laws that require enterprises to pay close attention to their data strategies. Non-compliance with the GDPR in the EU has already cost tech giants and smaller players alike dearly; the California Consumer Privacy Act (CCPA) and the New York Privacy Act (NYPA) are foremost examples of strict local regulations. Other countries are enforcing privacy laws in their geographies as well - in the APAC region, Singapore’s Personal Data Protection Act (PDPA) and Japan’s upcoming Act on the Protection of Personal Information (APPI) deserve a mention.
While non-compliance can cost as much as $750 per compromised record, it is usually what follows that is costlier. Lost customers, negative media attention, eroded trust within the business, and waning employee confidence typically cost more than the settlement itself. What’s more, restoring a reputation takes more than a few press releases, and winning customers back becomes trickier still.
What’s the way forward then?
Redact, don’t react
The cost of reacting to data privacy breaches is massive, and it should deter businesses from treating data privacy reactively. Data redaction - the practice of substituting fields of data with placeholders (or simply blacking them out) - can help you stop leakages at the source, within work platforms like CRMs and ERPs, or virtually anywhere across the digital orchestration of your business. For instance, redaction may substitute a customer’s age in the CRM with placeholder characters. Or, an employee verifying a customer might request only the last six digits of their debit card while the others are blacked out - which is exactly what the employee sees on their screen.
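The card-number example above can be sketched in a few lines. This is a minimal illustration (the function name and the six-digit window are taken from the example, not from any specific product): every digit except the trailing ones is replaced before the value ever reaches the employee’s screen.

```python
def mask_card(number: str, visible: int = 6) -> str:
    """Replace all but the last `visible` digits with '*', keeping separators."""
    digits = [c for c in number if c.isdigit()]
    to_mask = len(digits) - visible
    out, masked = [], 0
    for c in number:
        if c.isdigit() and masked < to_mask:
            out.append("*")
            masked += 1
        else:
            out.append(c)  # separators and the trailing digits pass through
    return "".join(out)

mask_card("4111 1111 1111 1111")  # → "**** **** **11 1111"
```

The key design point is that masking happens server-side, so the full number never leaves the system of record.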
Data redaction must also work across several surfaces - within digital platforms, on real-time messaging channels, within attachments, or in call records, to name a few. Redaction works differently on unstructured data, where PII can be hidden within emails, instant messages, attachments, or pictures. AI is central here: a crucial step is identifying what counts as PII in the context in question and deciding how to redact it.
Note that redaction must be done in a context-aware manner. While redacting a customer’s age is straightforward to execute, that information might be critical to a sales rep, who approaches the customer accordingly. In such scenarios, AI plays a crucial role in preserving privacy: identifying other data that can be redacted without affecting the underlying process, while ensuring that customers cannot be identified from the information that remains. Similarly, verification systems must test for uniqueness while precluding the ability to identify an individual.
When AI technologies are deployed within the enterprise, their privacy-preserving strategies deserve close attention - especially because their purpose is usually to churn through massive amounts of data to power an underlying use-case. Enterprise AI solutions that serve as an overarching insights delivery service must bring equally mature redaction capabilities. Sainapse, for example, can not only identify confidential and sensitive information within structured and unstructured data, but also examine the context and take appropriate action to mask or substitute PII. Without such solutions, enterprises are bound to pay hefty fines, chase deserting customers, and draft press releases covering up for slips that shouldn’t have happened in the first place.