Privacy policies and regulations can seem like an inscrutable web of unmanageable obligations. However, a unified theory of privacy can help these seemingly disparate ideas fall into place and reveal their significance. In this powerful interview with privacy expert Cat Coode, she explains how discrete pieces of information matter more than we might expect; how we fundamentally misjudge the information we trade away; and how overbroad data collection practices put users and companies at risk. Once you see how all the pieces of the privacy puzzle come together, you’ll re-think your approach to data collection and its implications.
What is your role and how is it related to privacy law? What prompted your interest in privacy law?
I am an engineer by trade and a large part of my career was as a software developer and architect at BlackBerry. I spent over a decade embedded in a privacy- and security-first culture. When the iPhone was released, people went crazy over its fancy features, but no one seemed to care about the lack of security for their data. That’s when I realized that the bulk of the population had little to no understanding of how these applications and devices worked. They didn’t know where their data was going or what was being done with it.
I launched my own consulting company in 2013 called Binary Tattoo—Binary for the language of all things digital and Tattoo for the permanence of what is put online. My goal is to help individuals and corporations better understand cybersecurity and data privacy. As more regulations were launched, I became immersed in the nuances of the laws, running compliance gap assessments, and assisting companies with Privacy by Design implementation.
What’s the difference between privacy, confidentiality, and data security?
Privacy is an individual’s ability to control how their data is shared.
Confidentiality is the level of authorized access to a piece of data.
Data security is how you protect your data assets from unauthorized access.
To an individual, privacy is the most important because it is your ability to own and control your decision to share information, and in what context. It’s not about having something to hide. Privacy is critical because once information is no longer private, that can’t be undone; the harm is done. It’s like toothpaste squeezed from a tube: you can’t put it back.
Confidentiality answers who should have access; security answers how you enforce that access; privacy also answers what is collected, why it is required, how it is transferred and stored, and when it is destroyed.
What are some red flags in privacy policies of consumer apps?
Two stand out:
- It’s out-of-date. A tell-tale sign is any mention of Privacy Shield, which has now been invalidated.
- It includes generic blanket denials. If the policy says simply, “We don’t transmit your data,” without saying anything else affirmatively, then I’m concerned.
What surprising types of information qualify as confidential information or personally identifiable information?
Personally Identifiable Information (PII) isn’t just data unique to you—it includes any combination of data that can identify you. Information that might not be significant alone, like your birthday or your favorite food, can become significant when combined with other things. Then that information can be used to answer your security questions and gain access to additional higher-value information.
Even “de-identifying” information may not be enough to protect a person’s identity. Take gender, for example. When everyone was forced into only two gender options, de-identifying data by organizing it by gender was fairly anonymous, but with the multitude of genders that exist now, you could actually identify a specific person based on that information, especially if the town or organization has little diversity.
People don’t realize how little it takes to identify someone, if given the right context. In a small or homogeneous town or organization, one piece of information can quickly identify the outlier. When there is only one intern, or one med-tech client, or one woman working at a company, if you use those descriptors, it’s no longer anonymous. We must remember context to effectively protect privacy.
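The point above—that "de-identified" data can still single someone out in the right context—can be sketched with a toy example. The dataset and field names below are hypothetical, and the group-size check is a simplified version of what privacy engineers call k-anonymity: if the smallest group sharing a combination of quasi-identifiers has size 1, someone is uniquely identifiable.

```python
from collections import Counter

# Hypothetical "de-identified" dataset: no names are stored, yet
# combinations of quasi-identifiers can still pinpoint one person.
records = [
    {"role": "engineer", "gender": "F", "town": "Smallville"},
    {"role": "engineer", "gender": "M", "town": "Smallville"},
    {"role": "engineer", "gender": "M", "town": "Smallville"},
    {"role": "intern",   "gender": "M", "town": "Smallville"},  # the only intern
]

def k_anonymity(rows, quasi_identifiers):
    """Return the smallest group size over the given quasi-identifier
    combination. k == 1 means at least one person is re-identifiable."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return min(groups.values())

print(k_anonymity(records, ["town"]))    # 4 -> town alone is anonymous here
print(k_anonymity(records, ["role"]))    # 1 -> "the intern" stands out
print(k_anonymity(records, ["gender"]))  # 1 -> only one woman in the data
```

Notice that no single record contains a name, yet saying "the intern" or "the woman at the company" identifies exactly one person—which is the contextual risk described above.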
What are some unexpected issues with the freemium or “data-as-payment” model for apps?
We’ve talked about this so much that there’s now a saying, “Unless you are paying for the product, you are the product.” Yet many people still don’t understand that free is never really free.
We expect to exchange some value for the use of software. But now there’s a conceptual problem. While we used to pay with money, and therefore understood the value of what we were giving, we now pay with data, which we don’t fully comprehend. It’s hard to understand exactly what information is collected, how it will be used, or the value of what we’re trading away.
Another conceptual problem is that people don’t think about passively harvested data—they only think about the data they input or actively provide access to. But many apps and tools run “on” in the background, listening to you, watching you, reading what you write anywhere, ready to offer “help.” While this may seem good at first, this constant readiness to “help” comes at the cost of your privacy. It’s safest to assume all products are always on and listening; ask yourself whether you want that.
So even if you think the onus is on the user to decide whether to use an app in exchange for data, the user is unlikely to comprehend the value and scope of the exchange.
What are some dangers of using a freemium consumer app in legal practice?
You asked about lawyers, but I think about this with kids all the time: Was that information yours to give away? Did you have the authority to give it away or the obligation to protect it? What information are you giving away on their behalf and without their consent when you use these apps?
When you use always-on freemium software for word-processing and email, it’s running through everything that you write, often collecting, transmitting, and storing that information. Everything it reads, it owns as “user content” and may use. Ostensibly, it’s collecting that information to help you write, but the data collection is not for your benefit, it’s for future app development. In using an app that reads everything you write, you’re giving up a lot of control over data that didn’t belong to you, or was entrusted to you to keep confidential.
Another wrinkle is that, sometimes, even when you pay for a product, some will still collect sensitive data. So, if you’re using an app for confidential work, then you must seek specialized terms of service for your confidential purposes. For my med-tech clients, I know that they require software agreements that prohibit data collection so they can comply with HIPAA. I’m sure this exists for law, too, and you should definitely look into it.
What should organizations understand about their own approach to data collection? Should organizations collect data “just in case”?
My guiding principle is: The less data you collect, the easier it is to protect, and the safer your company is. The corollary is that the more data you collect, the more you must concern yourself with protecting it and building security policies and procedures to do so.
If you will collect data then you should know what data you’re collecting; why you’re collecting it; how you will use it; and how you’re keeping it safe. When you receive data, inventory and categorize it so the sensitivity of the data matches the context in which you’re collecting it.
Unfortunately, so many businesses are in the habit of collecting unnecessary data simply because they can. First, if you will not use the data, then you shouldn’t collect it. Second, extra data should benefit the customer, not just the company. If it improves your business prospects but not your customer’s experience or use, then the data-collection tool or question shouldn’t be in the product.
If it truly is optional, and it is solely for the customer’s benefit, then don’t make it a required question; find a way to offer ranges so that the data you receive is useful and predictive but won’t identify them. And, of course, tell the customer what benefit they’ll receive by giving you this additional information. Let them choose whether it’s worth it to them.
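The "offer ranges" advice above can be made concrete with a small sketch. The function below is a hypothetical illustration, not anything from a specific product: instead of storing an exact age (which, combined with other fields, can identify someone), the optional field stores only a coarse bucket that is still useful for analytics.

```python
def age_range(age: int, width: int = 10) -> str:
    """Map an exact age to a coarse range so the stored value stays
    useful for aggregate analysis but does not pinpoint an individual."""
    low = (age // width) * width
    return f"{low}-{low + width - 1}"

# Store the range, never the raw birthdate or exact age.
print(age_range(34))  # "30-39"
print(age_range(61))  # "60-69"
```

The same bucketing idea applies to income bands, company-size ranges, or region instead of postal code: the customer chooses whether to answer, and what they hand over is deliberately too coarse to identify them.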
How do you identify the software companies that care about privacy?
Right now, there is no industry-specific badge, rating, or listing of software companies with good privacy policies and data security protocols. Eventually, I think there will be a badge or some other indicator that a product is highly trustworthy. I anticipate that this system is at least five years away and would likely come from IAPP or another industry-specific group to unify and prioritize criteria and provide uniform, unbiased evaluations. Since every industry or sector (like law, accounting, medical technology) has its own concerns, I anticipate that each will want its own rating system that prioritizes specific factors.
There are also professional associations leading the way for privacy training, but they’re not helping corporations with selecting the right tools. Some standardization there would be great.
What resources do you recommend for laypeople wanting to learn more about privacy?
A few months ago, I asked 20 privacy professionals for recommendations. Here are the non-fiction books on privacy they recommended:
- The Age of Surveillance Capitalism by Shoshana Zuboff
- Race After Technology by Ruha Benjamin
- The Future of the Internet by Jonathan Zittrain
- Habeas Data: Privacy vs. the Rise of Surveillance Tech by Cyrus Farivar
- The Right to Privacy by Caroline Kennedy and Ellen Alderman
- Future Crimes by Marc Goodman
- When Companies Spy on You by Jeri Freedman
- Privacy’s Blueprint by Woodrow Hartzog
- The Known Citizen by Sarah E. Igo
I’d also suggest The Circle by Dave Eggers for fun fiction that will get you thinking about privacy. It exists as both a novel and a movie (the book is better, obviously) about a Google-like social network that tracks everything you do under the guise of being helpful to others. It highlights the invasiveness and abuse of data collection.
About Cat Coode
Cat Coode is the founder of Binary Tattoo, with a mission to help safeguard your data and protect your digital identity. Backed by two decades of experience in mobile development and software architecture, as well as a certification in data privacy law, Cat helps individuals and corporations better understand cybersecurity and data privacy. She specializes in Privacy Regulation Compliance and delivering privacy education seminars. Cat is an engineer, speaker, consultant, author, and, above all else, a parent. Her motivation to help others was born out of her concern for her kids and our ever-changing digital landscape.
About the Privacy and Security Interview Series
This interview is part of a collection of interviews about privacy and data security. By producing this series, we hope to prompt legal professionals to think about the privacy concerns that arise in everyday tasks like word processing and selection of document creation software.
WordRake is clear and concise editing software designed for people who work with confidential information. The software improves writing by simplifying and clarifying text, cutting legalese, and recommending plain English replacements. WordRake runs in Microsoft Word and Outlook, and its suggestions appear in the familiar track-changes style. Try WordRake for free for 7 days.