Are We Sacrificing Data Privacy for Personalization?

By Will Hayes, Lucidworks CEO

As technology advances, consumer expectations continue to rise. We all expect a highly personal digital experience at every touchpoint, from our work lives to our free time, even if we don't understand exactly how the technology behind it all works. Most people don't know why Instagram serves up that perfect gym membership ad, or how an online store recommends just the right outfit for the wedding you're attending three months from now in Mexico.

Companies across all industries are turning to machine learning and AI, using signals like customer behavior to make those expectations reality. When a site offers you the perfect product at the exact moment you need it, it's great. But it quickly gets creepy when you hit the "Accept" button on a pop-up without fully realizing that your data is going far beyond the page you're browsing.

And it's even scarier when you consider that the security of your entire digital, financial, and personal life could be at stake.

We've all breezed through those semi-sketchy-looking app downloads to access a free photo editor or get the answers to a very important personality quiz. Years ago, the data being collected was minimal and didn't raise (too many) red flags for consumers. Now that the technologies powering these digital experiences are developing so rapidly (data collection, image recognition, 5G, and so on) and becoming more integrated with one another, consumers don't always know what they're sacrificing in exchange for a more personal experience.

A study from Forbrukerrådet, the Norwegian Consumer Council, revealed just how far your data goes. For example, the dating app Grindr shared detailed user data, including advertising ID, IP address, age, GPS location, and gender, with a large number of third parties. Many of these are companies you've never heard of, in the business of collecting, using, and selling location data for various commercial purposes. You probably didn't read that fine print before downloading.

It's critical that people understand the impact that this constant and unconscious privacy sharing could have.

Data Control Versus Platform Access

As people become more educated about how organizations use their data, privacy will become a commodity. We'll have a transparent choice: withhold our information, or make it available to the most trustworthy bidder if the tradeoff is worth it. AI will have consumers wielding their privacy like never before, and future infrastructure will have to support a new level of transparency.

Social media platforms like Facebook and Instagram are a great case study for this struggle: wanting control of your personal data versus having full access to a platform you rely on to stay in touch. A survey from the Pew Research Center found that 69% of American adults use Facebook, with more than half visiting the site "several times a day." And over 1 billion people use Instagram every month. Many have found the trade-off worth it.

Transparency Is Key With Privacy as a Commodity

Complex user agreements won't survive much longer as the line between offline and online personas blurs. I anticipate control returning to the consumer, and a shift away from what has been a fast-and-loose approach to data collection.

As privacy laws become stricter and the demand for transparency grows, data exchange could be attached to each individual service we consume. I could share access to my photos to train a company's facial recognition system in exchange for premium access to a platform. And unlike a 20-page privacy agreement full of legalese, the exchange would clearly list every bit of personal information the company is collecting, how it will be used, and which third parties it can be shared with. This clear exchange creates a value for our privacy that can be traded and, eventually, normalized across applications.
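To make that exchange concrete, here is a minimal sketch of what such a per-service data exchange could look like as a machine-readable manifest. The shape and every field name below are illustrative assumptions on my part, not an existing standard or any platform's real API:

```typescript
// A hypothetical, machine-readable "data exchange" manifest.
// All names and fields are illustrative assumptions, not a real standard.

interface DataExchangeManifest {
  service: string;          // the product or feature receiving the data
  purpose: string;          // why the data is collected
  dataCollected: string[];  // every piece of personal information involved
  sharedWith: string[];     // third parties that may receive it
  retentionDays: number;    // how long the data is kept
  benefitToUser: string;    // what the consumer gets in return
  revocable: boolean;       // whether consent can be withdrawn later
}

// Example: trading photo access for premium features, as described above.
const photoTrainingExchange: DataExchangeManifest = {
  service: "facial-recognition-training",
  purpose: "Improve photo tagging accuracy",
  dataCollected: ["photo library", "face embeddings"],
  sharedWith: [],           // an empty list is itself a meaningful promise
  retentionDays: 365,
  benefitToUser: "Premium platform access at no charge",
  revocable: true,
};
```

The point isn't this particular format; it's that everything a company collects, and everyone it shares it with, is enumerated up front, simply enough that a consumer can actually read it before agreeing.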

Companies Are Forced to Walk a Thin Line

Once companies realized the incredible worth of personal user data, there was pressure to keep collecting it. Data collection has become a sort of arms race, where slowing down or taking your foot off the gas could mean falling behind, or failing to meet user expectations because you haven't collected the necessary information about their behavior. Forbes contributor Joe Toscano calls this kind of data collection at all costs "a race to the bottom."

Side note: if the world were full of people with good intentions, we could spend more time praising what this massive data collection has enabled: predicting the likelihood and impact of catastrophic weather so disaster teams can prepare, improvements in healthcare including the ability to provide aid remotely, detecting unusual activity that could signal credit card fraud, and more. Unfortunately, not everyone has good intentions.

The tension we feel on a personal level is being felt on a global scale. When more data means better AI to power better experiences, companies (and governments) could be compelled to collect more data, with or without regulation. Some say that China, with its state-run internet services, is so far ahead in data collection that it may hold the advantage in AI over the coming years.

Which leaves us asking: Do we have to trade our private data for more convenience? Do we deregulate data collection if the competitive advantage hangs in the balance? I'm of one mind with Toscano: "Our work must remain focused on human rights and democratic processes if we hope to win."

Choice for Peace of Mind

Data collection, privacy, and the artificial intelligence that powers a personal digital experience can successfully coexist. But things have to change. Consumers can't make well-informed choices about what privacy they're willing to give up without complete transparency about where their data is going and what it's being used for. Choice is the main thing lacking in data collection today. And it's likely that people will continue to opt in to sharing their data to keep access to a beloved platform, or simply to avoid spending time reviewing a complicated document that explains what they're agreeing to.

The peace of mind that comes with full transparency could even encourage people to share more of their personal information. That is, if the tradeoff is worth it.


About the Author

Will Hayes is the CEO of Lucidworks. He has over 20 years of experience in Silicon Valley leading product, marketing, and business development initiatives. Prior to Lucidworks, he was head of technical business development for Splunk, where he was responsible for defining the company's market category and key product feature sets. Hayes also created and led the company's global partner program, building an ecosystem of consultants, developers, resellers, system integrators, service providers, and technology partners to drive sales opportunities and increase partner profitability. Earlier in his career, Hayes served as a software engineer at Genentech, where he built solutions that supported the sales and drug development teams. He currently lives with his family in San Francisco, California, where Lucidworks is headquartered. Learn more at Lucidworks.com.

Published Wednesday, April 22, 2020 7:33 AM by David Marshall