Data Shadows in Government Erase Human Worth

The authorities hold heaps of information about you as a citizen, and you don't know what that information is, whether it is correct, or what it is being used for. Do the authorities themselves know?

You have many data shadows today. These are vague summaries of you as a person that authorities, banks and insurance companies use to assess your suitability for things like employment, benefits, sick pay, discounts and credit. In many contexts you are merely a risk; the only question is how big, and a 'data-driven' organization relies heavily on your shadow to answer it.

Unfair treatment by government is certainly not new, or a result of modern algorithms and AI. It is a result of how people are viewed. And of how ideas about efficiency, and reliance on data, affect that view.

Sweden’s daily Svenska Dagbladet, together with Lighthouse Reports, recently revealed that an automated system used by Sweden’s social insurance agency, supposedly predicting fraudulent behavior, showed clear signs of discrimination. People seeking temporary child support for sick children were disproportionately held up for scrutiny if they happened to be women, migrants, low-income earners or people without a university education.

But we should not be surprised by a phenomenon that authorities have been moving towards for decades. It is today the norm in both the public and private sectors to manage data shadows, diluted abstractions of each of us, as if they were a reasonable and accurate representation of our human worth and capacity in some form of presumed hierarchy.

"Payment difficulties, driver's license revocations, hospital stays, conflicts with the child welfare board, school grades, letters of recommendation, medical certificates, which books you've checked out, which newspapers you subscribe to, whether a brother has been to prison, whether a brother-in-law is an organized communist. Soon, perhaps, most of your payments and loans, if the banks are linked via the SIBOL system. Incomes are already there, in the tax agency skyscraper."

This is what human rights activist and member of parliament Kerstin Anér wrote back in 1972, more than half a century ago, when she warned about the data collection that was growing along with computerization.

In 1972, Sweden’s employment agency had long been collecting data on all job seekers. Agency workers reportedly protested against what they considered inhumane treatment – using people’s own information against them was immediately perceived as unfair. Their sister agency, the National Board of Health and Welfare, likewise sorted people into predefined boxes: "alcoholic, slacker, family issues...".

For every citizen, shadow after shadow loomed into existence.

Reports came from the United States about computer software being used to assess people’s suitability for different jobs. The software kept being used even when there were no programmers left who knew how the system worked. This carelessness seemed inconceivable.

But now, fifty years later, the "AI" industry describes the inner workings of its systems as black boxes. Nobody can see or understand how they really function, and decision-makers request no explanations for their outputs and conclusions. A baffling number of managers and leaders say that this is fine.

Reducing humans to data points immediately feels scientific and awe-inspiring to people in power. Infallible. Human complexity becomes neat, binary and void of gray areas. Information is encoded in templates known as profiles. Whether those profiles are correct, fair or relevant becomes secondary, as long as a number can be produced through some formulaic assumption. Numbers are so much easier to read than people. And difficult to dispute. Especially when you are kept from tracing their origins.
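To make the mechanism concrete, here is a minimal sketch in Python of how such a profile might collapse a person into a single number. Every field, weight and threshold below is hypothetical, invented purely for illustration; it mirrors no real agency's formula.

```python
from dataclasses import dataclass


@dataclass
class Profile:
    # Each field is a decontextualized data point: a conclusion reached
    # somewhere else, stored here as if it were raw fact.
    late_payments: int        # hypothetical field
    sick_days_last_year: int  # hypothetical field
    university_degree: bool   # hypothetical field


def risk_score(profile: Profile) -> float:
    """Collapse a whole life into one number using arbitrary weights."""
    score = 0.0
    score += 1.5 * profile.late_payments        # weight chosen by someone, sometime
    score += 0.2 * profile.sick_days_last_year  # illness read as unreliability
    if not profile.university_degree:
        score += 2.0                            # a missing credential read as risk
    return score


FLAG_THRESHOLD = 3.0  # hypothetical cutoff: above this, the shadow is flagged

shadow = Profile(late_payments=1, sick_days_last_year=8, university_degree=False)
print(risk_score(shadow))                   # 5.1
print(risk_score(shadow) > FLAG_THRESHOLD)  # True: the shadow is flagged, the person never consulted
```

Every coefficient in a sketch like this is an opinion dressed up as arithmetic: change one weight and the same life yields a different verdict, yet the output still reads as objective fact.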

Anér’s essay continues:

"You hardly notice that it’s only a few single points, taken out of context, that make up the picture. You forget that much of the information in the data image has already been interpreted, and inserted into a context that may not be the original one at all. You have conclusions in front of you — but you think they are raw facts. And so you appoint, dismiss, judge and evaluate a data shadow, but the one who has to bear the consequences is a human being of flesh and blood."

I'm amazed by how accurate she was this early on. How few understood, or even understand today. That we've let it come this far.

Contrary to the promise of empowering humans, computers are also being used to diminish us. Trust can only be restored when these pseudo-calculations cease and organizations find their way back to treating citizens and customers with dignity. Automated evaluation often means automated discrimination, and more efficient discrimination is a bad look for the idea of progress.

It is impossible to trust authorities that cannot explain the basis on which their automated assessments are made. Whose managers are unable to account for how the algorithms supporting their core operations work. That this governance through unawareness is even a viable approach in a democracy is mind-boggling.

I’m especially stirred when Anér, in the same essay, questions what happens over time if the purpose of the machine is seen as handling and judging more and more people without exchanging a single word with them. Without listening. Without considering any aspect of their physical being. Her words slap me to alertness, as if I’ve failed to pay attention in class:

"Does this serve humanity, or abolish it?"

In many regards, how your many different data shadows are interpreted by machines and strangers is now more important than how you yourself actually feel.

_____

Per Axbom is a teacher, speaker and advisor in digital ethics, author of the handbook Digital Compassion and co-founder of the think-tank Dataskuggan.

Kerstin Anér's quotes are from the essay "Dataskuggan", published in the Christian cultural magazine "Vår lösen" in 1972. It was in this essay that the term "dataskugga" was first coined, a term that has since been adopted internationally in the English form "data shadow".

References

In Swedish

Original article: Dataskuggor hos myndigheter suddar ut människovärden (December 15, 2024).