Every day, massive torrents of data are processed, shared, and sometimes exploited. Some of that data is yours.
But how can we know just what’s being done with our personal data, especially as AI takes over more and more decisions? And how many of us are even aware of our privacy rights, or of how transparent, responsible, and accountable organizations are with our data?
The Cisco 2023 Consumer Privacy Survey captures the attitudes and perceptions of 2,600 adult consumers in 12 countries. It’s the latest in a series of annual reports that also include Cisco’s Data Privacy Benchmark Study, which explores privacy from the perspective of security and privacy professionals around the world.
To gain a better understanding of what consumers are thinking, we caught up with Robert Waitman, director in Cisco’s Privacy Center of Excellence.
Thank you, Robert! One of the notable highlights from this year’s study was the generational focus, specifically on younger people. Are they getting savvier about actively managing and protecting their data?
They’re getting quite savvy, especially compared to older consumers. Not so long ago, the widespread belief was that consumers, younger generations in particular, weren’t very careful with their data, accepting that it was already readily available on the internet and that they had no idea what information companies held about them. But as we found in this year’s study, things have changed. More people are protecting their privacy, and younger people are leading the way.
In the study, you call this mostly younger segment “Privacy Actives.” How are they taking action?
One-third of consumers are proactively taking steps to protect their data, such as leaving providers they’re unhappy with over how their data is used. More of them are exercising their rights under local privacy laws. For example, Data Subject Access Rights give them an opportunity to find out what information companies have about them and to request changes to or deletion of that data. This empowers individuals to take the necessary actions to protect their data and themselves.
How do older consumers compare?
Being proactive about privacy declines with age. For instance, 42 percent of consumers between ages 18 and 24 have exercised their Data Subject Access Rights, whereas only 6 percent of consumers over the age of 75 have done so. That’s an enormous difference. We’ve also observed a general increase in consumers acting to protect their privacy over the past five years. Last year, 24 percent of consumers overall had exercised these data rights; this year it’s 28 percent.
At the same time, more governments are enacting laws around data privacy. How is that resonating with consumers?
It’s important to think about the roles of government, organizations, and individuals — each has to do its part to ensure data is properly protected. That said, half of our survey respondents want government to play the lead role on privacy, with 21 percent saying organizations should take the lead and 19 percent saying it should be individuals themselves. So, they’re looking for government to lead the way.
For this reason, it’s not surprising how receptive consumers have been to privacy laws. There are over 150 privacy laws around the world, and consumers overwhelmingly see them as having a positive impact. On average, 66 percent of consumers said that these laws have had a positive impact versus only 4 percent who said they’ve had a negative impact.
That would bode well for continuing government oversight and regulation.
Yes, we hope that all countries will enact privacy laws, including a national privacy law in the United States at some point. The U.S. does have privacy laws for many industry sectors, but we don’t yet have an overarching national law the way many other countries or regions do.
Data localization — the practice of storing and processing data only within a specific country or region — has received mixed reviews. What were some of the varying attitudes on that?
When first asked about data localization, 76 percent of respondents said it’s a good idea. But when we asked them to factor in the higher cost of products and services, that 76 percent dropped to 44 percent. And as we saw in our most recent Data Privacy Benchmark Study, 89 percent of organizations said that data localization adds significant cost to their operations. So this is not a theoretical question: data localization makes products and services more expensive.
For all its amazing benefits, AI raises concerns as it takes over more decisions and leverages our data to do so. What were some responses to AI and its impact on privacy in the survey?
Consumers are supportive of AI and recognize its potential. Forty-eight percent said they believe AI can be useful in improving their lives, and 54 percent said they would even be willing to share their personal data, in anonymized form, to help make AI products better.
But they worry about how AI uses their data, don’t they?
Yes, the challenges come when AI is used for automated decision-making that affects individuals, as those decisions may be hard for people to understand. Sixty-two percent of respondents are concerned about the current use of AI, and 60 percent said they’ve already lost trust in organizations over their AI use. That’s a call to action for organizations.
Fortunately, there are specific things organizations can do. Seventy-eight percent of respondents said they would be more comfortable with AI if the organization implemented an AI ethics management program, and similar shares said the same of auditing AI applications for bias (72 percent) and improving transparency (75 percent). These are concrete steps organizations can and should take right away to build and maintain trust with their customers.
Cisco has set some of the highest industry standards for privacy and responsible AI. How can Cisco help address some of the concerns revealed in the study?
Cisco has certainly taken these issues to heart. We work to educate organizations and individuals about proper data management, data privacy, and consumers’ rights. Our Privacy Data Sheets and Data Maps show how we manage data for our solutions, exemplifying our focus on transparency and accountability.
As part of our Responsible AI Framework, we’ve laid out actions and opportunities to help our customers be informed and comfortable with how their data is being used. And when it comes to Gen AI, we recommend not entering personal or confidential information into Gen AI tools, and checking that any output is accurate and makes sense.
See the closing section of the Survey for more recommendations organizations can apply to help protect their customers, and visit the Cisco Trust Center to learn more about how we protect privacy.
Any thoughts on the future? Or predictions for how these trends will continue or shift?
I’m excited to see the younger generation taking a leading role when it comes to protecting privacy. As they age, I expect we will move towards a more informed and privacy-active population that will help keep organizations transparent, fair, and accountable. We all want to live in a world where everyone can leverage technology with greater confidence and trust in their providers.