Congress is working on a federal privacy law. Here’s why we need one
The Ethical Tech Project's Jonathan Joseph calls for action on a federal privacy law this week in The Hill
Don’t look now, but Congress is preparing to pass a federal privacy law. The American Privacy Rights Act would swap America’s mishmash of state-level laws for a single federal statute offering broad new protections — from regulating “dark patterns” that manipulate consumers, to banning unapproved sharing of sensitive information.
That would be a big deal. The average American’s online activities are tracked and sold 747 times per day; in total, our data is tracked 178 trillion times per year. This isn’t the background noise of the internet — it is the internet, and we’re surveilled and monitored every time we log on.
But while better regulations are certainly needed, it’s consumers, not regulators, who are currently doing the most to force brands to reevaluate their data practices. Take GM’s recent decision to scrap a program that sold motorists’ data: the move came in response not to a regulatory crackdown, but rather to a consumer backlash that threatened GM’s bottom line.
To exercise their power effectively, though, consumers first need to understand how their data is being used. Unfortunately, it’s increasingly hard to keep track: Nobody has time to read through endless privacy policies and consent notices, so nobody really knows what they’re giving up. To that end, here’s a quick refresher on the ways businesses are currently profiting from your data:
1) Your browser history. OK, obviously companies are scrutinizing your online behavior: It’s the value exchange we enter into — or devil’s bargain we strike — whenever we Google something or use an ad-supported website. Few people realize, though, just how much personal information is up for grabs. We’ve seen Grindr allegedly sharing users’ HIV status with third parties; data brokers classifying people based on past sexual trauma; and Alzheimer’s patients flagged for targeting by scammers. On the modern Internet, the data brokers definitely know you’re a dog.
Consumer countermeasures: Don’t count on Incognito mode or VPNs to mask your online behavior. Be careful what you consent to — the accept-all-cookies button is not your friend — and consider using tools like Global Privacy Control to minimize the data you give up.
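For readers curious what that Global Privacy Control signal actually is: supporting browsers send a "Sec-GPC: 1" header with each web request and expose a boolean to page scripts. Honoring it is up to the site, though California regulators already treat the signal as a binding opt-out under the CCPA. Here’s a minimal sketch of how a cooperating site might check it; the tracker-suppression logic is illustrative, not any particular site’s implementation.

```typescript
// Minimal sketch: reading the Global Privacy Control (GPC) signal in the browser.
// GPC travels two ways: a "Sec-GPC: 1" HTTP request header, and a boolean
// exposed to page scripts as navigator.globalPrivacyControl.
// The suppression logic below is illustrative, not a real ad library.

function visitorHasOptedOut(): boolean {
  // Browsers without GPC support leave the property undefined,
  // so this returns true only on an explicit opt-out signal.
  const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
  return nav.globalPrivacyControl === true;
}

if (visitorHasOptedOut()) {
  // A cooperating site would skip injecting third-party ad and
  // analytics scripts for this visitor.
  console.log("GPC opt-out detected: suppressing third-party trackers");
} else {
  console.log("No GPC signal: applying the site's default consent flow");
}
```

The catch, of course, is that last comment: the signal only expresses a preference, and it’s the site that decides whether to respect it.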
2) Your location. For data brokers and the brands that buy from them, where you are is almost as important as what you’re doing. Besides its value to marketers, location data can also be easily weaponized: organizations can track people’s visits to Planned Parenthood, for instance, in order to create targeted anti-abortion campaigns. Regulators are clamping down, but many companies still color outside the lines. The Tim Hortons coffee chain was caught improperly tracking customers around the clock, including the times they left home, where they worked, and how often they visited rival coffee shops.
Consumer countermeasures: There’s no perfect solution, but digging through the preference settings to disable smartphone ad tracking is a good start. Extra-strength protections like Apple’s Lockdown Mode are available, but they limit the functionality of your device.
3) Your driving skills (and much, much more). Modern automobiles are smartphones on wheels, and all major car brands, not just GM, are collecting location and behavioral data on motorists. Some automakers have been caught reporting bad drivers to insurers, leading to rate increases or denials of coverage. Others collate driving data with additional details, including race, genetic information, sexual activity and more. Every time you get behind the wheel, in other words, you’re giving up a startling amount of data.
Consumer countermeasures: Keep an eye on the small print when signing an auto contract — including rentals — and visit Privacy4Cars.com to get a privacy report on your current ride, or tips on how to delete stored data.
4) Your kids. Children are supposed to be largely off-limits to data brokers, but the reality is much messier. Dozens of data brokers were recently found to have sold children’s location, health and other data; Google, meanwhile, reportedly allowed personalized ads to be served on YouTube videos made for children. Even apps specifically designed to keep kids safe sometimes leak GPS data, private messages and other personal information.
Consumer countermeasures: Federal legislation to strengthen privacy protections is in the pipeline, but courts have blocked state-level efforts to keep children safe. In the meantime, parental controls are about as good as it gets.
5) Your face, your fingerprints — and your thoughts? Biometric data — your features, fingerprints, DNA, retinal patterns and more — is a data goldmine. Retailers have been caught using facial recognition technology to monitor shoppers, while Meta got dinged for collecting biometrics on 60 million Facebook users. As technology advances, marketers will also use gaze-tracking, physiological markers, and even neural monitoring to figure out what you’re thinking from one moment to the next.
Consumer countermeasures: There isn’t a ton you can do, but be careful what you consent to and keep track of local biometric privacy laws: if you live in Illinois, say, you’re better protected than someone in Idaho or Indiana.
This just scratches the surface of the data swirling around the digital economy, and the “countermeasures” I’ve described are pretty weak sauce: They mostly boil down to expressing a preference and hoping companies will play by the rules. With two-thirds of Americans believing that companies will abuse their data no matter what they do, there’s a risk of slipping into privacy nihilism and giving up our data because we don’t see an alternative.
Complex privacy regulations make that worse, not better, by pushing consumers into confusion and apathy. Over half of data deletion requests, for instance, come from states in which consumers aren’t empowered to demand that their data be deleted, a sign that even people who care about data privacy don’t currently understand their legal rights.
A federal law, by offering a single national rulebook, might help with that. But the real power will remain in the hands of consumers. The more we pay attention, get angry, and refuse to buy from brands that play fast-and-loose with our data, the more those brands will be forced to bring their data practices in line with our expectations.
We’ve already seen consumer pressure push companies beyond environmental regulations to embrace sustainable business practices, and force better working conditions across global supply chains.
Now, consumer pressure is driving data practices forward, too. The speed with which GM pulled the plug on its data-sharing program shows who’s really in the driver’s seat; it’s up to all of us to keep our eyes on the road. As consumers, we have the power to force companies to respect our data preferences, so long as we stay angry and keep paying attention to how our data is used and abused.
This opinion piece was published in The Hill on May 10, 2024.
Upcoming Ethical Tech Project Events
Located in the Bay Area? Join Us at Our Inaugural SF Privacy Technologists Meetup on Tuesday, 5/21
Join Nandita Rao Narla, Head of Technical Privacy & Governance @ DoorDash and Ethical Tech Project Board Member; Chitra Dharmarajan, Vice President, Security & Privacy Engineering @ Okta; and Sam Alexander, Data Privacy Engineer @ Ketch and all-around Data Ethics Nerd, for a casual conversation on how to set up a privacy program and privacy technology from scratch.
Whether you're already running a major privacy governance program, looking to get started, or just have a casual interest in creating a world where data ethics and privacy are a priority rather than an afterthought, stop on by for food, drinks, networking, and casual conversation.
Register Here: https://lu.ma/sf-privacy-technologist-meetup
In NYC For #TechWeek? Join Us At Our Publisher’s Breakfast on Wednesday, 6/5
This exclusive event, designed for technologists, marketers, and legal professionals from publishing and media organizations, aims to spotlight the importance of responsible data practices and how they pave the way for meaningful data monetization and advertiser trust.
Prioritizing responsible data practices enables publishers to build deeper relationships with advertising partners that want to target permissioned audiences, in addition to enabling key direct-to-consumer (DTC) use cases like personalization, analytics, and retargeting. In an era where publishers are navigating the complex landscape of data mobilization and monetization, this timely event provides a forum to discuss this challenge and opportunity with industry experts, peers, and new colleagues.