New regulations added to the California Consumer Privacy Act, including adoption of a Privacy Options icon, arrive just as the team of researchers that helped lawmakers design the blue button prepares to share the results of the research process that informed the icon's design.
Researchers from the University of Michigan and Carnegie Mellon University will present a paper at CHI 2021, the ACM Conference on Human Factors in Computing Systems (considered the premier peer-reviewed publication venue for human-computer interaction research), on their work to help consumers opt out of having their private information sold by merchants, social media companies and other businesses.
The California act was ratified in 2018 and went into effect in 2020, but several additional regulations were announced March 15. The CCPA is a comprehensive privacy law that shares similarities with Europe's General Data Protection Regulation. Among other things, it says companies and organizations that profit from personal information must provide a dedicated link on their websites that allows consumers to tell companies "do not sell my personal information."
The law allows a button to accompany the link for this opt-out purpose, so it became the job of the Office of the Attorney General in California to figure out what that could look like.
"This work in collaboration with the OAG demonstrates the importance of involving user testing in policymaking processes to ensure that resulting controls for consumers are actually useful and usable," said senior author Florian Schaub, assistant professor at the U-M School of Information. "I really commend the Office of the Attorney General for working with us and ultimately following our research-based recommendations."
Because the abstract concept of privacy is difficult to depict, the team developed 11 icons they thought best represented the concepts of choice, opting out and not selling personal information. They added an icon created by the Digital Advertising Alliance industry group and tested all 12 with participants recruited from Amazon's Mechanical Turk.
Initial results suggested accompanying text was needed to make sense of the icons, so in a second study the team developed and evaluated 16 phrases, including "Do Not Sell My Personal Information," the text required by the CCPA, and potential alternatives such as "Privacy Choices" and "Don't Sell."
Five leading text choices were combined with the three top icon picks and tested on a fictitious shoe-store website. In this study, a blue stylized toggle icon emerged as the preferred design for conveying choices. Paired with the link text "Privacy Options," this icon was highly effective at conveying to people the presence of privacy choices.
The Office of the Attorney General also proposed an icon and slogan of its own, which the team tested as well; that icon was found to cause too many misconceptions.
The bottom line from their user tests:
- Icons for privacy choices should be rooted in simple and familiar concepts rather than attempting to visualize abstract privacy concepts or data practices.
- Icons should be accompanied by link texts, at least initially, to aid comprehension and allow consumers to become familiar with them.
- The industry's DAA AdChoices icon remains unfamiliar to people and is frequently misunderstood, even though it has been in use for years.
"It is well known that privacy policies are lengthy and full of jargon," said Hana Habib, Ph.D. candidate at the Institute for Software Research at Carnegie Mellon. "Privacy choices are also hard to find on a website. Simple icons with the right labeling can communicate concepts quickly and concisely across languages and cultures, but it is important to get it right through user testing."
For now, the Privacy Options icon is optional; it is up to companies whether to adopt it.
"We hope that companies and also policymakers in other U.S. states and at the federal level will decide to adopt it, and thus give consumers an easily recognizable entry point to a company's privacy settings," said Yixin Zou, doctoral candidate at the U-M School of Information.
The researchers say their work demonstrates the importance of including user research and usability testing in policy and rulemaking processes, especially when laws and regulations require providing notices or controls to consumers. Otherwise, they say, there is a risk that intended consumer protections fail to materialize in practice or even mislead consumers, as has been the case with lengthy and vague privacy policies or difficult-to-find privacy settings and opt-outs.
"We hope that policymakers and researchers collaborating in this way can become a model approach toward more usable consumer notices and controls in privacy and other contexts," said Lorrie Faith Cranor, director of the CyLab Security and Privacy Institute at Carnegie Mellon.