Data Dignity: The Bedrock of an Ethical Internet
Why Data Dignity matters, and isn't the radical idea some make it out to be
I asked Maritza Johnson, a data privacy expert and the founding director of the University of San Diego’s Center For Digital Society, what she thinks government could do to steer the current regime of digital rights in this direction if “data dignity” really does become a broader political cause. She made the case that it can use the same regulatory tools for accountability that already exist for other industries.
“We need to move away from this fairy tale that [data rights are] up to the individual and recognize this is a collective problem,” Johnson said. She cited the examples of Facebook and Twitter being caught using phone numbers collected ostensibly for two-factor authentication for advertising, saying that regulators should be empowered to set explicit rules for how data is used and presented to users, and to punish companies for not following them.
Johnson also rightly identified, as discussed in our kick-off newsletter, that AI is accelerating the need to act now on ethical data use.
Johnson argues this is why regulators need to act now, before the AI business model runs roughshod over privacy.
“Privacy and security are really hard to retrofit onto a system, which is why you have a lot of talk about ‘privacy by design,’” she said. “It would take a really big regulatory stick to be sure that companies actually do it, but it’s extremely important.”
But what is Data Dignity, and why is it the bedrock of an ethical internet?
Data Dignity, Defined
At The Ethical Tech Project, we have a fundamental belief in data dignity, which we define as:
Data Dignity: When companies use people's data, they are obligated to use it responsibly. Responsible use means people can exercise negotiating power over the use of their data and the value exchanged for it: when it’s used, how it’s used, and where it’s used.
At The Ethical Tech Project, we believe people deserve data dignity and control over the data that originates with them.
Of course, we didn’t coin the term. Jaron Lanier and Glen Weyl have been pioneering the concept for over a decade, and their joint 2018 Harvard Business Review article “A Blueprint for a Better Digital Society” summarizes Data Dignity as the paradigm in which:
A coherent marketplace is a true market economy coupled with a diverse, open society online. People will be paid for their data and will pay for services that require data from others. Individuals’ attention will be guided by their self-defined interests rather than by manipulative platforms beholden to advertisers or other third parties. Platforms will receive higher-quality data with which to train their machine learning systems and thus will be able to earn greater revenue selling higher-quality services to businesses and individuals to boost their productivity. The quality of services will be judged and valued by users in a marketplace instead of by third parties who wish to influence users. An open market will become more aligned with an open society when the customer and the user are the same person.
Weyl’s organization RadicalxChange has this pithier definition:
People should be able to exert democratic collective bargaining power over their data, in order to make joint decisions controlling its use, and negotiate appropriate compensation.
All three of these definitions (ours at The Ethical Tech Project, and those put forth by the originators of the concept) share the same attributes:
Individuals are exchanging personal data when they use tech products
That data inherently has value
The status quo over the exchange of data and its value is out of whack, and individuals have a right to change it and control their data
Individuals and governments are waking up to the concept of Data Dignity - and that will mean deep consequences for firms that don’t respect it.
Data Dignity Isn’t A Radical Concept
Arguments Against Data Dignity
The typical argument against data dignity is something like this:
“C’mon! Individuals aren’t really exchanging personal data!”
“OK, even if individuals are exchanging data, that data only has value in the context of the platform to which it’s given and seems like a fair exchange of value - we’re all consenting adults here, right?”
“Fine… even if individuals are exchanging data, and it’s valuable, most people don’t care anyways - whatever!”
We see all three of these lame counterarguments in James Pethokoukis’ recent Substack post criticizing FTC Commissioner Lina Khan’s New York Times op-ed calling for AI regulation:
Individuals aren’t really exchanging personal data: James calls it “a fundamental misunderstanding of the online ad business” and highlights that the data is “a series of vector equations that are inscrutable to anyone.”
It’s a fair exchange of value for the platform: James pulls out a study examining the price a consumer would accept to give up a service, identifying a $40-$50/month figure for Facebook and writing, “many of us would need substantial compensation to give up our favorite online services.”
Most people don’t care: James writes, “All those “invasive” business models… seem to provide an acceptable tradeoff to most of us.”
This seems like a convincing argument against data dignity! Surely, data dignity itself is absolutely radical - negotiating power for personal data? Individuals banding together to acknowledge their data has value and taking on the platforms collecting that data? Preposterous!
However, this is exactly what’s happening: consumers are awakening to the need for ethical data use and their personal data dignity, and democratic processes are, in turn, instigating regulators to action.
Let’s Debunk Arguments Against Data Dignity
Here’s why the arguments against data dignity are misguided:
Data leak after data hack shows data isn’t as secure as the platforms want us to believe - and AI accelerates the security risk. As generative AI is deployed across both personal and enterprise data sets, expect more leaks like that experienced by Samsung, where sensitive data appeared in ChatGPT results.
The value of data isn’t static, and the lack of agency and transparency over how that data’s value is realized into the future is a mess of consent. Perhaps the exchange of value today is a service like Instagram or Gmail in exchange for ads promoting mattresses-in-a-box, and to many of us that seems fair. But tomorrow, that exchange of value is extended into a plethora of AI applications - and there’s no audit trail or chain of custody preventing your photo of grandma from showing up in an AI-generated image.
Study after study shows people do care about their privacy and personal data - and the issues of fairness and accountability that come with it - and that’s shaping their perceptions of brands. It’s 2023 - perhaps people didn’t care in 2003, but today the internet is everywhere, and the privacy risks are well known from the incessant scandals that have permeated their way into pop culture hits like Kiefer Sutherland’s “Rabbit Hole.” Here’s just one recent study: the University of Pennsylvania’s Annenberg School for Communication recently released the report “Americans Can’t Consent to Companies’ Use of Their Data,” in which 91% of respondents agreed with “I want to have control over what marketers can learn about me online” and 72% disagreed with “I trust companies I visit online to handle my data the way I would want the data handled.” That’s a trust gap no amount of anecdotal evidence can argue around.
Data Dignity, when painted as a utopian paradise promoted by the technogeeks at Burning Man, can seem radical at first glance. But when you delve into what Data Dignity is actually about, it starts to look like a common-sense philosophy rooted in a shared sense of what is appropriate when it comes to values such as privacy, agency, transparency, fairness, and accountability - values that we at The Ethical Tech Project call Ethical Data Principles.
Towards an Ethical Data Ecosystem
When companies use people's data, they are obligated to use it responsibly. Responsible use means people can exercise negotiating power over the use of their data and the value exchanged for it: when it’s used, how it’s used, and where it’s used. That’s data dignity.
As individuals, we all have this fundamental right. And businesses should respect these rights - both because the regulators tell them so, and because consumers will reward them. In our next post, we’ll dig into how data dignity materializes at the firm level through data stewardship, ultimately supporting the ethical data ecosystem.
Thanks for reading The Ethical Tech Project! Subscribe for free to receive new posts: