VMblog recently spoke with Cindi Howson, Chief Data Strategy Officer at ThoughtSpot, to get her expertise on a topic that is gathering steam: the collection and use of people's data and information, now being gathered en masse, seemingly everywhere. And what should we do about it?
VMblog: Do you think the tech industry should self-impose data usage
standards on a case-by-case (e.g., business- or application-specific) basis? Or
should an industry standard evolve?
Cindi Howson: Every company should have its own data usage standards covering
what information it collects, how it uses that information, and
how transparent it is to the customers and consumers who provide this data.
Anonymized patient data will require a different set of guidelines than, say,
shopping data. Getting the overall data and analytics industry to converge on a
single standard seems like wishful thinking at this point. I can more readily
envision certain industry verticals, such as healthcare and telecom, developing
standards on a country-by-country basis initially.
VMblog: What role, if any, should the government play in regulating
data?
Howson: I have concerns about government regulation of data because
regulation can stifle innovation. We have uninformed politicians crafting
policy about technology that many of them do not understand. The Facebook
hearings were at times cringeworthy to say the least.
Despite that, it's clear that some technology companies are not
safeguarding citizens' data and, in some cases, are abusing it. The government's
role is to protect citizens and level the playing field, so ultimately,
government needs to play a role in forcing tech providers to be more transparent
about what they collect, how it is used and shared, and the safeguards in place
to protect it. GDPR in Europe is a good starting point, as are other initiatives
such as the CCPA (California Consumer Privacy Act).
VMblog: Can you speak to some of the 'data for good' initiatives you've
seen come to market?
Howson: Data for good shows some of the best of what our industry can do
to make the world a better place. When I first wrote about this idea in 2008
and began talking about it at conferences, people quietly chuckled, thinking I
had clearly drunk too much of the industry Kool-Aid (or champagne, if you
prefer).
One of the early examples that sparked my interest was a
project that started within Nationwide Insurance, which wanted to give educators
an early warning system for at-risk students in inner-city schools.
While I did not grow up in an inner city, I did grow up in a
difficult household, and my education was hard earned. I can think back to
fourth grade, with so many missed days of school, when I finally called our
principal to see if he could come get my brother and me and take us to school.
So Nationwide's vision and use case resonated on a personal level. Their measures
of success were tangible: improved reading and math scores and, eventually,
higher high school graduation rates. This ultimately led to the project forming
its own nonprofit, The Learning Circle, now used in multiple states in the
USA.
Other powerful examples include addressing homelessness, fighting
malaria, reducing human trafficking, rescuing wildlife, and finding cures for
cancer. I am increasingly seeing organizations aligning data for good
initiatives with the UN's sustainable development goals.
There is so much passion and excitement about the data for good
movement, which is wonderful. But to have a lasting impact on society, we need
to move beyond one-off projects to ongoing programs.
For these programs to have sustained impact, technology providers
and the private sector need to provide data and analytics expertise, with
ongoing mentorship, to organizations working toward a worthy social purpose.
Simply providing money or software is helpful, but few nonprofits have the
skills to leverage data and analytics technology on their own. From lack of
access to the talent shortage, they face the same analytics challenges the
private sector faces, perhaps more so.
VMblog: How can we restore trust among those consumers and businesses
that have seen data being used in less than ethical ways?
Howson: Trust has to be earned, and once lost, it is difficult - if not
impossible - to restore. Company ethics matter. Greater transparency will help,
but consumers will only trust businesses with their data based on the company's
actions across the entire value chain.
If you treat an individual poorly as a customer in other
interactions, like service or price negotiations, then why should they trust you
with their data?
Take medical data, for example. Sharing data from different
consumers between providers and healthcare professionals could certainly be
used to achieve better health outcomes, which is a worthy social purpose.
Consumers, however, see the lack of safeguards for this data and the potential
for abuse. These concerns are some of the reasons why England's CARE.DATA
program failed, even though the vision was good.
The benefits of sharing personal data have to outweigh the risks
of lost privacy, potential misuse, or a breach. Giving consumers better
control over their data, so they can see what they have provided and purge what
they choose, is another way to restore trust. But I think some of this is
overwhelming for most people. Few even know how to purge their browser history,
for example.
VMblog: Other than goodwill, are there incentives for businesses to
employ data for good?
Howson: Absolutely! A culture of philanthropy is a real differentiator
in a tight labor market and helps companies attract and retain talent.
Financial incentives include investments from portfolio funds that consider
social purpose; impact investing is on the rise globally. Data for good
programs also foster an ethical culture that can potentially counter greed and
corruption.
##