
Posted: September 3, 2022

In the alley by the farmhouse was what looked like a bizarre outdoor photography studio. Three tall, black tripods, each with a smartphone attached with zip ties, stood facing the blank wall of a neighbouring farmyard. Trailing behind each tripod were wires leading to a large, expensive-looking computer setup perched on a rickety desk.

The man with the squeaky red microphone yelled “NEXT!” A woman in her early 70s at the front of the line nervously shuffled in front of the first tripod, where she was handed a comically large set of sunglasses. She put them on, covering half her face in a way that made her look vaguely beetle-like. As she fidgeted, the smartphone began filming, and her face appeared on the computer screen.

“Look left!” barked a man from behind the computer. The woman snapped her head to one side. “A bit slower. Now right!” The woman’s gaze panned slowly across the alley to the right. She repeated the movements on command: right and left, up and down, before finishing with a wild, flourishing circular neck movement. “Good. Next one,” the man finally said, as the woman gingerly handed back the sunglasses. She was then bustled over to the stack of prizes, where she exchanged a ticket for a bottle of cooking oil.

At the second tripod, another elderly woman was handed an eerie printed image of a stranger’s face that had the eyes and nose cut out. As the smartphone camera rolled, she held the unsettling mask up and poked her nose through it. On command, she darted her eyes rapidly back and forth, remaining otherwise perfectly still. “Next …NEXT!” blared the squeaky red microphone. The troupe of villagers shuffled up. Just down the road, a pig grunted in an indoor sty.

To an outsider, the scene would’ve been entirely baffling. To me, it was a perfect example of what made China equal parts fascinating and deeply disturbing. The villagers were a small but key part of the biggest human surveillance experiment on the planet: they were selling their faces to train facial recognition algorithms in return for cooking oil.

“The largest projects have tens of thousands of people, all of whom live in this area,” said Liu Yangfeng, CEO at Qianji Data Co Ltd, the local company based in Pingdingshan that was running the project.

Dressed smartly in suit pants and a white collared shirt that was one size too small, Liu hovered nervously around the cameras, checking the photos and periodically wandering over into a nearby vegetable patch to take a seemingly never-ending stream of phone calls. In the rural setting, he looked almost as out of place as I did.

He was one of many who capitalised on the fast-growing demand for data sets used to train AI algorithms, including facial recognition. The booming industry had sprouted almost overnight. He steered us into a nearby farmhouse, its walls bare save for a large, tattered poster of Mao Zedong. The floor space was cluttered with rows of tables with blank monitors. “Here is one of the places we train people to do the data labelling,” he said. He was coy about whom he collected data for, but said his clients included the government, as well as some of the country’s largest tech companies.

I started chatting to the old folk in line. My years of studying Mandarin were no match for the local dialect, but I managed to find one woman two weeks out from her hundredth birthday. China is one of very few places on the planet where a 100-year-old woman would – unfazed – find herself participating in an AI data collection project in a rural village square on a Wednesday afternoon. Most of the crowd were born before Communist China existed and had lived through the bloodshed of the Cultural Revolution. Virtually none owned a smartphone, but gifting their faces to artificial intelligence seemed mundane to them. Apparently, Irene and I were the only ones there who thought it was newsworthy.

A woman in her 50s, one of the few who spoke standard Mandarin, stopped to talk to us after her turn wearing a pair of shiny black glasses for the camera. “I just saw many people come here, and it seemed like fun, so I decided to join,” she said. She was less clear about what she thought they would use her face for. She made a long, low humming noise and looked briefly back toward the cameras. “I know it’s about artificial intelligence,” she said, pausing for a moment. “Other than that, I don’t know.”

Wandering past the villagers, I saw paper notices stuck to walls and posts advertising the project with a Chinese phrase that has the same ring as “come one, come all!” I thought about my own granny back home, and was struck again by the utter strangeness of my job and the country I found myself in. I tried to imagine what granny would say if a man in a tight white collared shirt came to our village and asked to take photos of her in sunglasses to train facial recognition algorithms. Nothing kind, probably.

Foreign correspondent Cate Cadell.

Almost exactly a year before I arrived in Beijing in 2014, Xi Jinping came to power. A shrewd technocrat, Xi was quick to understand the role China’s technological prowess could play in cementing party authority. He swiftly created – and headed – the Cyberspace Administration of China, the body responsible for tightly censoring China’s internet.

By 2016, China’s largest tech companies were facing mass penalties and enforced service outages for failing to follow the rules. Social media companies I covered went from hiring dozens of in-house censors to hiring thousands at a time, filling office buildings with entry-level workers whose only role was to sift through the masses of potentially sensitive online content.

The remaining foreign news outlets and social media sites visible in China went dark. Xi fortified China’s real-name identification system, linking SIM cards to national ID numbers, erasing anonymity from the internet. Unseen to most internet users, a mass industry of digital “public opinion analysts” came online, aided by advances in big data and machine learning, and began trawling the internet on behalf of police and other government bodies to monitor and snuff out dissent before it began. Within a few short years, Xi had transformed China’s internet into the world’s biggest digital laboratory for authoritarian governance.

It was in this techno-authoritarian wave that a facial recognition mania costing tens of billions of dollars began. Government policies with sci-fi names like SkyNet and Sharp Eyes laid out ambitious plans to blanket the country with cameras linked to police stations sharing data nationwide. The vision was clear: just like on the internet, anonymity could be erased in real life. With accurate facial recognition, police could identify, categorise and follow a single person among 1.4 billion Chinese citizens.

As the projects rolled out, lauded by state media, those of us living and working in China had little insight into just how effective these systems were. Years of reporting on technology in China had taught me that if something sounds too much like science fiction, it is often just that. And the country’s famously opaque legal system offered few insights into how SkyNet and Sharp Eyes were working on the ground. But the projects fascinated me. Over the years I delved into them any way I could, including collecting a cache of thousands of purchase orders from police that laid out requirements for surveillance systems.

These documents brought into horrifying clarity the intended use of facial recognition in China. Local police describe vast, automated networks of hundreds or even thousands of cameras in their areas alone that not only scan the identities of passersby and identify fugitives, but also create automated alarm systems giving authorities the location of people based on a vast array of “criminal type” blacklists, including ethnic Uighurs and Tibetans, former drug users, people with mental health issues and known protesters.

One 2018 purchase order from Beijing reads: “The real-time video image information can be used to classify outsiders and criminals with specific behaviour characteristics, such as Uighurs; it can identify the characteristics of outsiders and criminals, including age, gender, whether they have a fringe or whether they wear glasses.”

The police description of Xinjiang’s Uighurs, Tibetans and people with mental illnesses as inherently criminal is near universal, as in this 2018 purchase request for a facial recognition system from police in China’s southern Guangxi province: “Early warnings for specific personnel can be set, such as automatic warnings for Xinjiang Uighur people. It must be able to establish key personnel databases [including … terrorist-related people, Xinjiang people and mentally ill people].”

An order for a system supporting 1500 facial recognition cameras in Beijing’s Chaoyang District lays out parameters for what must be identifiable for each person captured on camera, including “whether people are wearing masks, glasses, the style of glasses as well as types of face structure information, including whether they have a beard … It must recall ID number, name, alias, organisation, gender, age, nationality [including Uighur], date of birth and household address.”

Through Her Eyes.

Some systems describe extra functions to use facial recognition as a form of predictive policing. This 2018 purchase request from police in the Xiqing District of China’s Tianjin city requests a system worth $4 million that sounds alarms to local police when people are captured behaving “suspiciously”, including returning to a public space repeatedly or being out late at night: “People who have high-risk behaviours are regarded as key personnel. When their number of high-risk behaviours is higher than the system preset value, the system will generate an alarm … When a person repeatedly appears in a certain key area, they are regarded as a suspect. Similarly, there are high-risk time periods, like between 12 midnight and 5am. When people appear in key areas repeatedly during those times, their face information will be saved to the blacklist database and an alarm will be generated.”

The technology has become standard in government universities, prisons and hospitals. These systems often include tailored requirements set by the purchaser. One $124,000 facial recognition system purchased in 2019 by a Chongqing vocational school monitors when its students are sleeping: “Student information must include, but is not limited to, the grade, name, student number, major, class number, mobile phone number, ethnicity, gender, ID number, class counsellor, dormitory bed number … It records the individual ID information of students and keeps record of when they go to bed and wake up.”

Estimates of the number of surveillance cameras in China range from 200 million to more than 600 million as of 2020, but it’s not clear how many have advanced facial recognition capabilities. At times, experiences like the one in Jia County made China’s surveillance feel novel, but that faded as I gradually began to think of China as my second home. More and more, it felt oppressive.

In the lobby of a dingy hotel in west Beijing, another Chinese entrepreneur once showed me footage of a very different data collecting project. On a cell phone with a long crack across the screen, he showed me a video of two men walking briskly toward each other and hugging.

A second video showed the same two men having what appeared to be a very real fight. Like Liu, the man was creating content to train AI image recognition. His only client at the time was the police. The image recognition project was designed to automatically identify the different ways that people touched each other in public and raise alarms about problematic behaviour. The idea that hugging someone incorrectly could potentially alert police left a pit in my stomach.

I never returned to Jia County, but out of curiosity, I searched my cache of documents in the weeks after I left China. I found the county had purchased a $370,000 facial recognition surveillance system 18 months before our trip. Like other similar documents, it described technology to alert police to Uighurs in the area. It also called for a function to “label the attributes of the captured person, including age, gender, ethnicity, whether or not they’re wearing sunglasses.”

Cate Cadell was based in Beijing between 2014 and 2021, where she covered technology and, later, politics as a correspondent for Reuters. She is currently covering China on the national security desk at The Washington Post, based in Washington DC.

This is an edited extract of Through Her Eyes: Australia’s Women Correspondents from Hiroshima to Ukraine, edited by Melissa Roberts and Trevor Watson, published by Hardie Grant Books. In stores and online from September 6.
