The Real Life Black Mirror?

By Laura Smith

Recently, I was in a Lyft in Los Angeles discussing the British dystopian television show Black Mirror with my driver. I told him about the episode in which every person you interact with can rate you: coworkers, friends, and baristas all have the power to determine your social capital. “Oh!” the driver interrupted, “they already have this in China!”

Ah, China, America’s favorite whipping boy. They’re stealing our jobs, they’re “hostile to the United States and our values,” they’re trying to sink our economy with their trade war. The president’s disgusted pronunciation of the country’s name is its own meme.

As it so happens, I had recently interviewed Shazeda Ahmed, a PhD student at UC Berkeley’s School of Information who studies cybersecurity and internet policy in China, about the very program my Lyft driver was referring to. The Chinese Social Credit System is indeed a rating system, intended to encourage compliance with the law. Since its inception, it has captured the West’s imagination, but as Ahmed explained, it doesn’t seem to be causing quite the same stir at home, and it isn’t what we think it is. I spoke to Ahmed to find out what the Chinese Social Credit System is, what it is not, and whether it really is the stuff of British dystopian television. The following interview has been edited for length and clarity.

There’s been a lot of hysteria surrounding the Chinese Social Credit System. What is it exactly?

The Chinese government wanted to devise a solution to break a pattern in which people break laws and evade punishment. By sharing mostly government-generated or public data across different bureaus of government, the state is trying to create reward and punishment incentives to ensure that people comply with the law.
Four groups are being assessed: individuals, companies, the government itself, and legal organizations like courts. One of the core problems is that it’s actually hard to get a lot of people to comply with the law. In a country of 1.3 billion people, if you have a few million people breaking the law, it’s kind of a big problem.

Take the example of how they are dealing with people who aren’t paying their taxes. The thinking was, if this person hasn’t paid their taxes, they shouldn’t be allowed to spend money on things that the government has uniformly decided fall into this category of luxury consumption like private school education for their children, or high speed trains rather than slow trains. In the past, it would have been hard to implement that because you need to coordinate between the tax bureau, the railway administration, and the education bureau. Now they have mechanisms for sharing this information with each other.

Under the system, one of the big goals to be fulfilled by next year, is for all those bureaus to have access to each other’s data in order to effectively implement punitive measures against law-breakers.

What kind of data is the government using to “nudge” or punish people who have broken the law?

They can only really look at government data, and data from the private sector only where these data pertain to illegal activity. So, if you were selling counterfeit goods on China’s version of Amazon Marketplace they’d share that kind of data. But companies aren’t sharing things like where you shop or who you’re talking to with the government. It’s not a dragnet that spits out a score about how much of a moral person you are. And while there’s a lot of propaganda and language around morality in publicizing this, it’s not really about that.

People kind of misunderstand what this looks like. Going back to the tax example, it’s like, you can still send your children to public school, and you’re not trapped with no options. You just have to take a slow train instead of high speed rail. It’s creating inconveniences that would frustrate people. And the idea is to incentivize them to go pay the court fine or act on the court order that gets them off a blacklist and never commit the transgression that got them on that blacklist ever again.

What’s the deal with the blacklists?

I think a lot of people misunderstand [the blacklisting system]. When it comes to regulation of companies, there are actually several different kinds [of blacklists] that are sector specific. And then there are rules governing what gets you on one if you’re a company. Then within a particular province or on a city level, there are different systems, some of which are experimenting with scores that are for the most part ignored. A couple dozen cities have been given permission to try out their own local systems, though not all of them have scoring. [The Chinese] do have a financial credit score, like a consumer credit score through the People’s Bank of China that’s totally separate from these city level scores, which are more of a measure of how much you’ve complied with the law.

So what are people upset about? Does anti-Chinese sentiment have anything to do with this?

Obviously things have gotten worse politically with China, so that’s certainly a part of it. I just wonder if it’s a broader inability to think about technology in a more nuanced way where it’s not just extreme good or extreme bad. Shows like Black Mirror make it worse because they kind of hit you on the head with the worst case scenario.

Are any of the fears founded? How could this be abused?

There are some examples where people co-signed loans with other people who didn’t pay them back, and now a court has ordered them to pay back the loan, but they can’t, so they’re on the blacklist. And again, I don’t know how widespread this is, but it’s unfortunate that it happens at all.
One thing to keep an eye on is whether companies might be partnering with the government to share data and create incentives for people to change their behavior. I’m worried about this since it’s a common problem here in the U.S., too, with a lot of smart city developments, where certain cities get locked into contracts with one major vendor. That hasn’t happened in China yet to my knowledge, but some of what I learned this past year was how hard it is to begin to trace that kind of cooperation.

Also, is there a way to find out whether people of a particular ethnic minority are more frequently blacklisted, or more frequently struggle to get off these lists once they’re being monitored in this way? Those questions are actually easier to answer in, for example, New York, because we have more publicly accessible data.

Is there any way people might benefit from this technology?

Viral stories go around about people who’ve broken the law, caused a lot of pain for people, and they get away with it, not being called into court, or not being held to account for their misdeeds even when they are; under this system, the thinking goes, those people will finally be punished. And it shows that the government is finally taking this seriously.

I’ve heard stories from people who are looking at the corporate side of this. Now that the system is being enforced, they can’t really get away with bad practices they conducted in the past.

Before we go pointing fingers at the Chinese, aren’t a lot of these technologies also being used by American companies? I’m thinking of digital profiling, financial credit scores, loyalty programs, sex offender registries, etc. How is this different?

Yes, I think what should be emphasized here is that all of the technologies you’ve listed are being used in both China and the U.S., but in China none of the tech you listed is part of the Social Credit System. It’s completely separate, but mistakenly presumed to all feed into a central surveillance apparatus. To my knowledge, that isn’t happening—it’s a projection of a potential future that people outside of China mistakenly treat as though it is part of the present.