China’s fear of losing control

This isn’t quite the brave new world we were hoping these new technologies would enable.

Davos: George Soros calls Xi Jinping a “dangerous opponent” of open societies
Soros said he wanted to “call attention to the mortal danger facing open societies from the instruments of control that machine learning and artificial intelligence can put in the hands of repressive regimes.” Echoing recent concerns raised about China’s use of facial-recognition technology, Soros asked: “How can open societies be protected if these new technologies give authoritarian regimes a built-in advantage? That’s the question that preoccupies me. And it should also preoccupy all those who prefer to live in an open society.”

Tracing his critique of authoritarian governments to his own childhood under Nazi occupation in Hungary, Soros, who is now 88, urged the Trump administration to take a harder stance on China. “My present view is that instead of waging a trade war with practically the whole world, the US should focus on China,” he said.

The complicated truth about China’s social credit system
What’s troubling is when those private systems link up to the government rankings — which is already happening with some pilots, she says. “You’ll have sort of memorandum of understanding like arrangements between the city and, say, Alibaba and Tencent about data exchanges and including that in assessments of citizens,” Ohlberg adds. That’s a lot of data being collected with little protection, and no algorithmic transparency about how it’s analysed to spit out a score or ranking[.]

[…]

The criteria that go into a social credit ranking depends on where you are, notes Ohlberg. “It’s according to which place you’re in, because they have their own catalogs,” she says. It can range from not paying fines when you’re deemed fully able to, misbehaving on a train, standing up a taxi, or driving through a red light. One city, Rongcheng, gives all residents 1,000 points to start. Authorities make deductions for bad behaviour like traffic violations, and add points for good behaviour such as donating to charity.
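The Rongcheng mechanism, at least as described there, is simple enough to sketch. Here’s a minimal illustration in Python of a start-at-1,000 points system; the specific behaviours and point values are my own placeholders, not the city’s actual catalogue:

```python
# Illustrative sketch of a Rongcheng-style points system.
# The starting score comes from the article; the catalogue of
# behaviours and the point values below are made up for illustration.

START_SCORE = 1000

# Hypothetical catalogue: recorded behaviour -> points adjustment
ADJUSTMENTS = {
    "traffic_violation": -5,
    "unpaid_fine": -50,
    "misbehaving_on_train": -10,
    "charity_donation": +5,
}

def social_score(behaviours):
    """Apply each recorded behaviour to the starting score."""
    total = START_SCORE
    for b in behaviours:
        total += ADJUSTMENTS.get(b, 0)
    return total

print(social_score(["traffic_violation", "charity_donation"]))  # 1000 - 5 + 5 = 1000
```

The unsettling part isn’t the arithmetic, of course, but who fills in that catalogue and what a low number locks you out of.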

Running a red light is one thing, but what if you’re a journalist investigating corruption and misconduct?

Chinese blacklist an early glimpse of sweeping new social-credit control
What it meant for Mr. Liu is that when he tried to buy a plane ticket, the booking system refused his purchase, saying he was “not qualified.” Other restrictions soon became apparent: He has been barred from buying property, taking out a loan or travelling on the country’s top-tier trains.

“There was no file, no police warrant, no official advance notification. They just cut me off from the things I was once entitled to,” he said. “What’s really scary is there’s nothing you can do about it. You can report to no one. You are stuck in the middle of nowhere.”

In China, facial recognition tech is watching you
Megvii, meanwhile, supports the state’s nationwide surveillance program, which China, with troubling inferences, calls Skynet. Launched in 2005, Skynet aims to create a nationwide panopticon by blanketing the country with CCTV. Thanks to Face++, it now incorporates millions of A.I.-enhanced cameras that have been used to apprehend some 2,000 suspects since 2016, according to a Workers’ Daily report.

[…]

Jeffrey Ding, an Oxford University researcher focused on Chinese A.I., believes there is more pushback in the West against deploying facial recognition technology for security purposes. “There’s more willingness in China to adopt it,” he says, “or at least to trial it.”

But there’s also less freedom to oppose the onslaught. “The intention of these systems is to weave a tighter net of social control that makes it harder for people to plan action or push the government to reform,” explains Maya Wang, senior China researcher at Human Rights Watch.

The line from Soros about the danger from “the instruments of control that machine learning and artificial intelligence can put in the hands of repressive regimes” chimes with what I’m reading in James Bridle’s new book, New Dark Age.

May we live in interesting times.

There are cameras everywhere

There is such a high level of surveillance in our society. There are cameras everywhere, but they’re our cameras.

The ubiquity of smartphones, as captured by photographers
With so many devices in so many hands now, the visual landscape has changed greatly, making it a rare event to find oneself in a group of people anywhere in the world and not see at least one of them using a phone. Collected here: a look at that smartphone landscape, and some of the stories of the phones’ owners.

cameras-everywhere-1

What an odd world we live in.

cameras-everywhere-2

The real world isn’t really real unless there’s a screen between us and it.

cameras-everywhere-4

But then you come across a photo like this.

cameras-everywhere-3

Such a strong image. The difference between surveillance and sousveillance.

Sousveillance
Sousveillance is the recording of an activity by a participant in the activity, typically by way of small wearable or portable personal technologies. The term “sousveillance”, coined by Steve Mann, stems from the contrasting French words sur, meaning “above”, and sous, meaning “below”, i.e. “surveillance” denotes the “eye-in-the-sky” watching from above, whereas “sousveillance” denotes bringing the camera or other means of observation down to human level, either physically (mounting cameras on people rather than on buildings), or hierarchically (ordinary people doing the watching, rather than higher authorities or architectures doing the watching).

Eye to eye

Chris Eckert, a maker of “little art machines”, has made what could be described as a surveillance sculpture.

A disconcerting installation featuring 20 blinking kinetic eyeballs that track a person around a room
San Francisco artist Chris Eckert, who wanted to make a point about the increasing surveillance and decreasing privacy of the population, has created “Blink”, an absolutely incredible series of mechanically operated blinking single eyeballs in a row, each equipped with face tracking software to deliberately evoke a disconcerting sense of privacy violation.

Yes, that combination of a very realistic eyeball looking directly at you from a very unreal, robotic face is unsettling, but I think what really works well is what happens after, as Chris explains in this video.

“You would have these interactions with them, and hopefully enjoy yourself playing with these eyeballs, only to go around the corner to discover that you’ve been recorded and observed the entire time by multiple eyeballs. And that anyone in that room was watching those video feeds and observing you doing that. And it kind of shifted your view of what was happening there. It changed it from being fun to being kind of invasive.”

Look Out! Chris Eckert’s Machines Are Watching You | KQED Arts
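What’s striking is how accessible the face-tracking side of a piece like Blink has become. Purely as an illustration (this has nothing to do with Eckert’s actual software), here’s a minimal Python/OpenCV sketch that detects a face with a stock Haar cascade and prints the normalised position you might feed to a servo-driven eyeball:

```python
# Minimal face-tracking sketch: detect a face in the webcam feed and
# print its position, normalised to [-1, 1] across the frame, the kind
# of value a servo-driven eyeball could use to "look at" a visitor.
import cv2

# OpenCV ships with pre-trained Haar cascade face detectors.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Centre of the detected face, normalised across the frame.
        cx = (x + w / 2) / frame.shape[1] * 2 - 1
        cy = (y + h / 2) / frame.shape[0] * 2 - 1
        print(f"face at ({cx:+.2f}, {cy:+.2f})")
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("blink", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

A few dozen lines and an off-the-shelf library: the watching part is trivial now, which is rather the point of the exhibition.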

There are many more remarkable art machines on his website, and here is a piece on his Privacy Not Included exhibition, which Blink formed part of.

Chris Eckert: Privacy Not Included exhibition at San Jose Institute of Contemporary Art
Privacy Not Included presents a sensorial experience that considers how we are viewed, followed, and tracked. As technology progresses and, in conjunction, as we continue to share personal data and information, what will be the consequences of our own privacy, our right to personal space, and ultimately, our freedom? In George Orwell’s dystopian novel 1984 about omnipresent surveillance and public manipulation, the author writes, “The choice for mankind lies between freedom and happiness and for the great bulk of mankind, happiness is better.” The works in the exhibition challenge us to contemplate this dilemma. They reflect the conundrum we face in our contemporary society: appreciating the joy of convenience and technology’s ability to provide us security and safety, versus negotiating between the disconcerting, constant surveillance and intrusion in our lives.

And to really bring that point home, here’s an article from Aeon on the same theme, convenience versus surveillance.

Orwell knew: we willingly buy the screens that are used against us
One can easily imagine choosing to buy a telescreen – indeed, many of us already have. And one can also imagine needing one, or finding them so convenient that they feel compulsory. The big step is when convenience becomes compulsory: when we can’t file our taxes, complete the census or contest a claim without a telescreen.

Why Groklaw shut down

Groklaw, Pamela Jones’s website reporting on legal issues around the Free and Open Source Software community, has closed down, and she herself wants to “get off of the Internet to the degree it’s possible.” Loss of privacy, forced exposure, the dehumanising nature of total surveillance: issues I’ve been vaguely aware of recently, but never really thought about seriously. Her post explaining why she’s shutting the site down is the first thing I’ve read that has made me, I think, properly understand all this.

“Anyway, one resource was excerpts from a book by Janna Malamud Smith, ‘Private Matters: In Defense of the Personal Life’, and I encourage you to read it. I encourage the President and the NSA to read it too. I know. They aren’t listening to me. Not that way, anyhow. But it’s important, because the point of the book is that privacy is vital to being human, which is why one of the worst punishments there is is total surveillance.”

http://www.groklaw.net/article.php?story=20130818120421175