Fueled by pervasive facial recognition and artificial intelligence, the social credit system is spreading unevenly across China. The program, reminiscent of the Black Mirror series, covers not only ordinary citizens but also companies and officials.
By RFI China correspondent
According to Lian Weiliang, deputy director of the National Development and Reform Commission, there are 62 pilot cities with a citizen trust rating system.
This scoring system rewards or penalizes citizens according to whether or not they comply with the law.
Running a red light, smoking in prohibited places, carrying unpaid debts or committing fraud carries, on top of the corresponding administrative sanction, restrictions such as a ban on traveling by plane or high-speed train and on purchasing luxury items. In some cities, information about offenders is displayed on LED screens in shopping malls, on trucks or at bus stops, exposing their personal data and subjecting the affected person and their family to public ridicule.
According to reports, more than 20 million people have been placed on the "blacklist" of untrustworthy people. Some resort to publicly posting apologies and warnings on social networks such as Douyin, the Chinese TikTok.
The penalty in this system amounts to a loss of trust, a stain on one's record of "good citizenship." With negative social credit, you cannot apply for grants, access loans or enroll in the best public schools or universities.
However, the social credit system does not specify which behaviors are penalized. In this sense, Xinkai, a resident of Beijing, says that "there is no behavioral social credit system, but the term 'social credit deficit' is used when the law is broken."
The law prohibits discrimination against people who have been in prison, have a criminal record or suffer from an illness. However, Liang, a young woman from Guangzhou, explains that "there are unwritten rules. And if you made a mistake in your past, it can haunt you all your life. For example, many singers who had problems with drugs, even though they stopped, later had many problems and never gave concerts or appeared on screen again."
Rewards and punishments
Although the social credit system does not explicitly provide for it, certain behaviors, such as taking part in demonstrations or spreading content critical of the Chinese government on social networks, could also end up on the record of antecedents that makes up one's social credit.
Citizens not only lose points in this system; they can also recover them by making financial donations to district community services, donating blood or displaying exemplary behavior (a criterion that is not spelled out and remains highly ambiguous).
WEF Pictures presents, a work based on a true story.
In China, donating 400 ml of blood earns you 10 social credit points.
Many Chinese parents choose to donate blood so that their children can get into the best schools. pic.twitter.com/ZpSwUhbVNf
— Squall Leonhart (@02sUj_8888) April 29, 2023
Meanwhile, the electronic payment application Alipay (owned by Alibaba) has had a scoring system for years, known as Zhima Credit. The score (between 350 and 950 points) is based on the user's purchase history.
Zhima Credit penalizes users who fail to pay for a service (it can even block the payment account until the debt is settled) and rewards those with a strong purchase record, offering payment facilities, granting loans or giving away free trips.
Zhima Credit works like a loyalty program in which the score is based solely on economic data.
China's social credit system is not yet a reality, at least not as a comprehensive, nationwide scheme, even though the Asian giant appears to be taking giant steps toward building a dystopian unified control system, with millions of surveillance cameras, facial recognition systems, mobile applications that intrude on citizens' privacy, and advances in artificial intelligence.
In any case, it should not be forgotten that a control system already operates through applications such as Zhima Credit, without most of the Chinese population realizing the risk of letting technology giants handle such a vast amount of data about their lives.