Is China’s social credit system as bad as we think?

20 September 2019

In 2014, China announced its plans for a Social Credit System. China aimed to fully implement it before the end of 2020, but ran early 'pilots' in several areas and cities (the 13th Five-Year Plan of China, 2014). Ever since the announcement, references to Black Mirror and a totalitarian government haven't stopped appearing in the news. But how does the West fare in contrast, and how does the Social Credit System even work?

One pilot of the Social Credit System works as follows in the city of Rongcheng. People in the city start out with a score of 1000. Points are awarded for socially benevolent behaviour such as volunteering and returning lost wallets, while points are deducted when socially bad behaviour is noticed. Although many of us think this happens through AI-enforced facial recognition, that is mostly not the case: most areas simply have a person who 'snoops around' and writes things down. Good scores can result in social benefits such as a discount on public transport or a lower interest rate when borrowing money from the government. On the other hand, lower scores can have negative effects, such as denial of access to train tickets, plane tickets or high-visibility jobs (WIRED, 2018).
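To make the mechanics concrete, here is a minimal sketch of the scoring scheme described above: everyone starts at 1000 points, recorded deeds add or deduct points, and thresholds gate benefits or penalties. Only the starting score of 1000 comes from the reporting on the pilot; the individual point values, threshold numbers and deed names are invented for illustration.

```python
STARTING_SCORE = 1000

# Illustrative deed catalogue; the point values are assumptions,
# not the pilot's actual tariffs.
DEEDS = {
    "volunteering": +5,
    "returned_lost_wallet": +10,
    "jaywalking": -5,
    "late_debt_repayment": -50,
}

def apply_deeds(score, deeds):
    """Return the score after applying a sequence of recorded deeds."""
    for deed in deeds:
        score += DEEDS[deed]
    return score

def benefits(score):
    """Map a score to illustrative privileges or penalties (thresholds invented)."""
    if score >= 1050:
        return "discounted public transport, lower interest rates"
    if score >= 950:
        return "no change"
    return "restricted access to train and plane tickets"

resident = apply_deeds(STARTING_SCORE, ["volunteering", "late_debt_repayment"])
print(resident, "->", benefits(resident))  # 955 -> no change
```

Note how asymmetric the invented tariffs are: one late debt repayment wipes out ten acts of volunteering, which mirrors the reported emphasis on debt repayment in the pilot.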

At first glance, Rongcheng seems like a big success. The city, once a mess, is now spotless; car drivers slow down for pedestrian crossings (which rarely happens in China (own experience, 2018)); and many of its inhabitants volunteer at elderly homes. Many more people also pay back their debts on time, for fear of losing credit points. One could say, "this seems to work, and all you need to do is not break the law, which isn't so hard, right?". Many Chinese see it this way as well: they believe it pushes citizens to be better people and do what's right (France 24, 2019).

Looking at our own countries, we find similar 'credit scores'. When someone defaults on a bank loan, you can bet that person will not receive a loan again. Contact with the police is likewise registered on our 'rap sheet'. However, there are some significant differences between our system and the SCS that is going to be implemented in China.

The scope of the SCS is the main difference. Because all of our misdoings are punished within the legal system, the punished can appeal and fight the punishment. This is not the case for the SCS: it lacks a presumption of innocence until proven otherwise, one of the core foundations of the Western judicial system.

But we shouldn't pat ourselves on the back too hard yet, for we can see similar, albeit less invasive, trends appearing. Privileges related to communication, accommodation and transportation are being affected by our behaviour as well. Uber drivers have lost their jobs for having ratings lower than 4.6 (Forbes, 2014). Insurance companies pull more and more data from your Facebook page and online behaviour. At this moment, this isn't the end of the world, because there are multiple competitors in these sectors. However, if this trend continues and huge companies like Uber, Amazon and Facebook gain market share in other markets, we could see socially unwanted behaviour punished by Silicon Valley rather than Washington, D.C.

This brings me to my conclusion about why any social rating system should have limits. Once a social rating system becomes powerful enough to starve its users of certain services, products or needs because they aren't behaving according to certain guidelines, the owner (or owners) of that system gets to dictate what good or bad behaviour is. This could mean that the rules we live by are dictated not by law or the Constitution, but by the terms and conditions which (let's be honest) we all thoughtlessly accept.

Let me know if you agree or if you have a different vision on the case! 🙂

The 13th Five-Year Plan of the People's Republic of China: http://en.ndrc.gov.cn/newsrelease/201612/P020161207645765233498.pdf (accessed 17 September 2019)

The complicated truth about China's social credit system: https://www.wired.co.uk/article/china-social-credit-system-explained (accessed 17 September 2019)

How Uber's Driver Policy Could Backfire on the Company: https://www.forbes.com/sites/ellenhuet/2014/10/30/uber-driver-firing-policy/#3759bd571527 (accessed 17 September 2019)

China ranks 'good' and 'bad' citizens with social credit system: https://www.youtube.com/watch?v=NXyzpMDtpSE&t=183s (accessed 18 September 2019)


3 thoughts on “Is China’s social credit system as bad as we think?”

  1. Hi Willem,

    Very interesting writing! As I just recently returned from China, this piece is quite close to me on a personal level, and I have to say you introduced a new point of view on this topic. My comment is focused on China. First of all, let me say that never have I felt safer anywhere in the world than in China, and that is because there are cameras everywhere and people obey the law.

    One paragraph really caught my attention: that screening is not done by facial recognition but by people snooping around. Don’t you think this is quite unfair after all? What if people are doing good deeds to increase their scores but they are not seen by the screening person? Or if one area is more supervised than others so that the people who are there frequently will essentially have a higher score? Don’t you think facial recognition would indeed be the key then?

    Furthermore, from what I saw (and I would like to be as objective as possible), the government really "controls" the majority of the people in China by blocking international sites from their Internet and by broadcasting strong government propaganda. The social rating is also defined by them. However, this is not the case in most democratic countries, or at least not to this extent. In democratic countries big companies can indeed dictate the rules, as you mentioned in the case of Silicon Valley, but what if these governments also incorporated such a system? Do you think it would breach the privacy of the average democratic citizen, even if greater safety and obedience of the law were almost guaranteed?

  2. Hi Willem,

    Thank you for posting and writing about this interesting topic.
    I agree with you that it is dangerous to let the system decide what good and bad behavior is. I think a social rating system is never the answer, and I agree with you that giving much power to governments or institutions is not always good. I do, however, agree with Franciska that facial recognition and AI could make some decisions more 'fair'. I also believe in safety, and I believe Europe is more democratic in that regard. AI and facial recognition will likely bring many opportunities that can help us significantly, such as traffic reduction or safer passport controls.

    Nevertheless, privacy should always be considered. Europe is democratic now, and it is difficult for us to imagine having such a controlled life as in China. If things change, however, and other politicians take power, this might no longer be the case. I therefore believe that we should be mindful of our privacy. The Chinese government controls all the data of its population through apps: points are added if you say something good about the government and deducted when you say something it does not like. Safeguarding ourselves from this kind of behavior by our governments or institutions is right. As has been shown with Alexa and other voice-recognition products/services from Google, Amazon and Apple, big tech companies could also benefit from some reminders about privacy. I am very curious to see what opportunities AI may bring, but we should be careful not to let our personal information end up in the wrong hands.

    Best,
    Tessa

  3. Hi Willem,
    This is a very thought-provoking topic! I believe it relates tightly to the panopticon concept developed by Bentham, which revolves around the idea that when people know they are being watched at all times, they are motivated to behave at their best. Although it has some considerable advantages, such as personal safety and well-being, it can most certainly have disastrous effects.

    In the case of the Chinese SCS, it really raises questions of implementation and further societal impact. Who defines what constitutes good or bad behaviour? What happens if the system wrongly identifies a person and his/her misdeeds? If a person reaches 0 points, can they ever manage to improve their score? This system might well lead to greater social inequality, resulting in something like a caste system.

    I certainly agree with you that these kinds of systems need some sort of limits, because they can easily spiral into a dystopian society. But this raises a question: what level of transparency should be in place? No transparency at all also does not sound nice.
