China’s surveillance state: Using technology to shape individual behavior

ucanews.com 6 February 2019

As the Chinese Communist Party wants to export its new social control technologies globally, that should worry us

The Chinese national flag flies behind security cameras on Tiananmen Square in this file image taken on June 4, 2012, on the 23rd anniversary of communist China’s crackdown on democracy protests in Beijing. (Photo by Ed Jones/AFP)

by Anders Corr

The cutting edge of applied technology for social control is in China, where the human right to privacy of its 1.4 billion people is violated daily.

Artificial intelligence (AI), big data analytics and biometrics are all leveraged by the Chinese Communist Party (CCP) in its quest for increasing control over the individual.

The CCP requires purveyors of blockchain technology, as with digital technologies more generally, to provide the government with a perfectly transparent window into the data, including personal identifiers (PID), of users.

The government accesses this data at blazingly high 5G speeds of up to 10 gigabits per second, provided by Chinese companies like Huawei, which is under U.S. federal indictment. The CCP supports Huawei in part because it seeks to export China’s communications standards, allegedly along with back doors accessible to the party, worldwide.

The CCP’s ambitious plans to fuse multiple personal data streams for use in finding, quantifying, and predicting what it deems to be negative individual behavior are a direct violation of the right to privacy. The government plans not only to gather and analyze personal data in technologically unprecedented ways, but also to punish, deflect, or obstruct such behavior in an arbitrary manner, unbeholden to a democratic electorate.

Some U.S. intelligence programs, for example the 2003 Total Information Awareness (TIA) program of the Defense Advanced Research Projects Agency (DARPA), also used biometric and big data analytics to predict and intercept terrorism.

Privacy advocates in the U.S. Senate, however, closed it down, or thought they did. In fact, it secretly moved to the National Security Agency (NSA), where it dropped some of its proposed privacy strictures. Until at least 2012, it continued to develop there and at a one-million-square-foot facility in Utah for data storage and processing. By then, Americans had become used to the idea of their own government sifting their data to catch terrorists.

Meanwhile, the idea proliferated to Singapore, where it rolled out as Risk Assessment and Horizon Scanning (RAHS). The notorious Admiral John Poindexter, who got the idea for TIA after the 2001 attacks and led its development at DARPA in 2003, was on hand to give a speech at the Singapore unveiling in 2007.

The CCP has built upon ideas from the U.S. and Singapore programs, using sophisticated biometrics and surveillance such as facial recognition, vehicle recognition, geo-tagging, and massive DNA databases to match an individual’s location and activities to patterns that might indicate not only terrorism, but any type of threat to the party’s power. It does this with no democratic mandate, and with what it calls a “social credit score,” which, along with location patterns, is analyzed using big data and AI techniques to find and neutralize threats. Each individual in society, believing they are tracked and quantified as to their value to the party, is controlled like never before.

Where the law, religion and ethics typically cannot reach, the CCP’s social credit score and AI will subtly shape the individual’s behavior, at times without the individual even knowing that their very identity is being influenced to fit the purposes of the state. The new surveillance state is marketed to average citizens as being for their own benefit, as they will live in the resulting “smart cities” of data transparency and benign governance.

While the social credit score is currently an overarching concept for a pastiche of overlapping and sometimes geographically discontinuous local pilot programs in China, these programs are starting to link together. Successful programs will be scaled to the entire nation, and if China has its way, some of its new social control technologies will be exported globally. That should worry us.

Those Chinese citizens who prefer the freedoms of the past, and act on such preferences, are starting to feel the deleterious effects of China’s surveillance state. People with low social credit scores in China are today barred from access to travel, education, jobs, loans and even personal freedoms. Those who jaywalk see their faces on digital billboards in acts of public shaming.

This photo taken on Feb. 5, 2018, shows a police officer speaking as she wears a pair of smart glasses with a facial recognition system at Zhengzhou East Railway Station in Zhengzhou in China’s central Henan province. (Photo by AFP)

Techno-fascism

Ethnic cleansing is the scariest part of what should be called China’s emergent techno-fascism. In the Xinjiang region, approximately 1-2 million people, about 10 percent of all Turkic Muslims, are now detained against their will in what China calls “vocational training centers.” At these centers, Muslims — most of whom are ethnic Uyghurs and Kazakhs — are tortured and beaten. They are also forced to learn Mandarin, Confucian philosophy, and communist propaganda. The children of detainees are housed in newly-built orphanages, where they are subjected to similar forms of propaganda and forbidden to speak Uyghur or practice their religion.

The list of other abuses that are part of this campaign against Muslims in the region is long and makes for hard reading. While academic researchers hope for the best, they are also preparing world public opinion for the worst: that the vocational training centers could eventually be part of the depopulation of China’s Turkic Muslim population. There are some solid indicators of this risk. Detainees are mostly in their 20s and 30s, so the detentions impinge on the ability of Turkic Muslims to procreate.

Those Xinjiang Muslims not in camps are frequently checked by guards when they walk to and from work or social events. Police decide at the checks to send most Han through a faster green lane. If sent through the red lane, as nearly all Muslims are, the individual is X-rayed and their phones are typically checked to ensure that they contain government spyware that tracks the individual’s location, browsing and communications.

Surveillance cameras track the movements of Turkic Muslims through these checkpoints via facial recognition and license plates. Police substations, “convenience” stations, watch towers, surveillance vans, and armored vehicle convoys dot cities and townships, and are found on almost every block of Urumqi, the capital of Xinjiang. They surveil the population further through a set of human eyes, and are empowered to track and record mobile communications. 

Unlike in past cases of ethnic cleansing, China’s open-air prison in Xinjiang is aided by technology that selects, using big data analytics and sophisticated algorithms, those within the target population who are seen by the state as most ethnic, and most likely to rebel.

If that sounds like an Orwellian dystopia confined to the pages of 1984, think again. This system is operating in China right now. From Xinjiang’s “vocational” detention camps to the “smart cities” of the coast, China’s population is living in various forms of what some researchers call a high-technology totalitarian “nightmare” that is getting worse by the day.

Xinjiang is a test bed for how far a state can control a population in the age of high technology. It is a panopticon, in which no Chinese citizen is quite sure whether a particular communication, transit, or purchase has been tracked, or whether it would really matter to an official who did track it. But individuals do know that data is tracked at least sometimes by a human behind the screen, and that such tracking sometimes results in detention, torture or worse. So the risk-averse individual molds his behavior in anticipation of what he thinks the CCP might like were it watching.

China’s cutting-edge technologies against the right to privacy are due to its need for coercive social control, rather than the social and political participation that one finds in democracies. China’s harmful policy decisions, from its planned economy to costly ventures abroad, lead to the popular discontent in China that threatens stability and the continued tenure of the CCP. This threat to the party’s power is what in turn causes it to use every technology available in this brave new world of surveillance and social engineering. There is no realized right to privacy in China. There is just the mass of isolated people in service to the ever more technological expansion of the state.

A file image of security cameras on a street in Urumqi, capital of China’s Xinjiang region. Xinjiang is a test bed for how far a state can control a population in the age of high technology. (Photo by Peter Parks/AFP)

Changing international privacy norms

Unfortunately, some elites who are influenced by China’s growing economic and military power are buying into China’s novel invasions of privacy as some positive global trend. Brookings collaborated with Huawei on a 2017 paper that promoted “safe cities”, a marketing term for smart cities controlled by a police brain fed by a nervous system of networked intelligence feeds.

In a 2017 paper, researchers from Yale, UC Berkeley, and the Wikimedia Foundation wrote: “On an international scale, there is great potential for China to set the next generation’s standards for privacy. With China’s market power and increasing relevance in geopolitical discussions of technology, China’s influence cannot be underestimated.”

The authors see China’s economic and political power as giving it the potential to use long-arm jurisdiction to regulate company privacy policies internationally, even when those companies have only weak ties to China. China could do this, according to the authors, in such a way as to contradict EU privacy laws or replace the U.S. in APEC privacy discussions, leading to “intriguing” trade results. “Simply put, China can use privacy laws to cement its place as a global leader, not only in privacy, but also technology as a whole.”

According to the authors, “A united pan-Asian bloc could definitely create some interesting international privacy norms, and the clearest choice for leading that kind of endeavor would be China.”

China’s techno-panopticon influences people from the lowliest Muslim in its detention camps to privacy law researchers at elite think tanks and universities in the United States. This increasing control of the global population is well within a trend identified by Michel Foucault, who traced the history of discipline and punishment starting in Europe’s medieval period. He finds that while modern punishment has become less a spectacle of brutality (e.g., the state no longer tears criminals limb from limb in the town square), it has become more pervasive and internalized.

The increasingly all-seeing state and ever more variegated punishments fitting the “crimes” defined by the state have, since the Middle Ages, boxed human freedoms ever more tightly. That trend leaves us today with China’s new technologies, the most dangerous threats to privacy in history, according to some researchers. The trend will only be reversed if people mobilize sufficiently to defend their human rights to privacy and freedom, and if democratic governments can overcome the internal resistance of elites to squarely oppose China’s growing power.

Anders Corr holds a Ph.D. in Government from Harvard University and has worked for U.S. military intelligence as a civilian. He frequently appears in the media, including Bloomberg, ‘Financial Times,’ ‘New York Times’ and ‘Forbes’.