Understanding consumer priorities and concerns about data privacy management is essential to the success of Permission Slip, the app the Innovation Lab launched last year to help consumers take back control of their data. While our team is always in conversation with consumers, we launched a study to investigate consumers’ expectations of privacy products more deeply and to inform the app’s continued user-centric design. The study examined the factors that influence consumer decisions around personal data management, asked which companies and data types trigger the highest levels of concern, and sought to discern consumers’ expectations for privacy tools. Whether you’re a policymaker or a privacy product manager, we hope our findings can help illuminate how consumers reason about privacy and what they expect when they begin to take action.
We started by recruiting 32 consumers who expressed interest in managing their data privacy and had already taken steps to safeguard their personal information. We chose not to include consumers who had already used the Permission Slip app, since we didn’t want our results to be influenced by the experience CR has already built.
We divided our research into three sections: pre-survey, card sorting, and post-survey. The pre-survey asked participants about the privacy actions they had taken and the motivations behind them, as well as their expectations for a privacy tool.
Next, we facilitated three rounds of card sorting in which we provided ~30 cards with different company logos and invited participants to sort the company cards into stacks. Then, in the post-survey, we probed participants’ rationales and concerns behind their company groupings and asked them to rank the significance of various factors and categories of data held by companies.
Finding #1: Companies that collect “sensitive” data, such as camera and sensor data, online tracking, and financial information, are top priorities for management
Consumers prioritize managing data with companies that handle sensitive data, that over-collect information that could jeopardize their security, or with whom they engage on a regular basis. The major concern lies with companies gathering what participants called “sensitive” or “private” data, such as financial, health, camera, and location-related information, because of the possible risks associated with sharing it. One participant described why they are concerned about companies that have access to their sensitive data:
“My health information and where I am traveling is very sensitive. If someone were to stalk me or track what I’m doing/where I’m going, the information these companies have could put me in danger – P31”
Furthermore, excessive data gathering, especially when unrelated to the service provided, raises questions about necessity. A participant aptly expressed this sentiment:
“Because I think that they can offer their services and products without the need of collecting this amount of data and maybe they are using my data or sharing it with other parties. – P10”
A subsequent closed-ended question added further color to the precise data categories that consumers want to manage. About 90% of surveyed consumers expressed worry about Cameras & Sensors, Online Tracking, and Financial Data. These rankings mirrored participants’ reasons for grouping companies, with sensitive categories like financial and camera data taking priority. The perceived over-collection also appears linked to worries about online tracking data.
Finding #2: Consumers’ privacy and security goals are intermingled
We found that consumers expect to feel protected, informed, and in control when using a tool that safeguards their personal data. Consumers’ privacy goals fall into five layers: prevent harm, feel more secure, be better informed, take extra control, and maintain data autonomy (Figure 1).
Figure 1: Expectations toward privacy tools for safeguarding personal data
Preventing harm and enhancing security emerged as top priorities. In addition, consumers want to learn from the tool, such as by understanding companies’ data collection practices. People also want to be able to take action, such as preventing their data from being sold. Finally, the tool must be secure and trustworthy itself. A participant expressed this worry about the security of privacy tools:
“I expect privacy tools to not take my data and make it less secure” – P24
Ultimately, customers desire a secure and user-friendly app that provides an analysis of organizations’ practices and assists them in taking real steps to restrict data collection.
Finding #3: Privacy personas elucidate consumers’ different knowledge levels and motivations for managing data
In shaping the information architecture of any tool, acknowledging diverse user personas is crucial for crafting user-centered products. We pinpointed four distinct privacy profiles based on three criteria: privacy knowledge, degree of limiting data sharing, and privacy actions taken. The profiles are Novice Learner, Amateur, Data Savvy, and Protectionist.
Figure 2: Distinct privacy profiles based on 32 privacy-conscious individuals
We asked participants, “What do you expect from using privacy tools?”
“I’m not totally sure, but I would like my information to be better protected…” – Novice Learner
“I expect that these tools would help keep my personal information safe and help me to share less with any pages or organizations I don’t want to share with” – Amateur
“A degree more protection from identity theft, as well as not making myself a product bartered between companies.” – Data Savvy
“I’d want a way to manage all of them in one place that still is very secure and helps me keep everything together. I expect them to not take my data and make it less secure” – Protectionist
We recognized that each profile had unique privacy goals. For instance, Data Savvy participants prioritized efficient tools, while Novice Learners sought curated guidance for understanding privacy policies. Recognizing these characteristics allows us to better tailor our solutions to the unique demands and preferences of each user type.
We’re looking forward to incorporating these insights in our product roadmap for Permission Slip, and continuing to engage in research that can inform the ways we support consumers’ privacy goals.
Study designed, conducted, and analyzed by Tuan-he Lee, with support from Ginny Fahs, Lauren Kiesel, and Daniella Raposo.