All as bad as each other?

Rhett Jones from Gizmodo strikes a cautionary note about Apple’s positioning following Facebook’s recent data sharing controversies.

Apple isn’t your friend
In its own deliberate fashion, Apple appears to see a market opportunity in the privacy debate that goes beyond polishing its own image. As headlines blared about Facebook’s latest data-sharing turmoil, the Wall Street Journal reported that Apple has been quietly planning to launch a new advertising network for the past year. It’s said to be a re-imagining of its failed iAd network that was shuttered in 2016.

[…]

Generally, more competition is welcome. If Apple is giving Facebook and Google headaches, we say that’s great. But it’s a thorny issue when we’re talking about a few billion-dollar companies exchanging places on the ladder as they strive to be trillion-dollar companies. It’s just not enough for the least bad megacorp to keep the evil ones in check.

Dumbing down the chatbots

A quite different take on Google’s AI demo from the other day. Rather than be impressed at how clever the bots appear, because they sound like us, we should be sad at how inefficient we’ve made them, because they sound like us.

Chatbots are saints
Pichai played a recording of Duplex calling a salon to schedule a haircut. This is an informational transaction that a couple of computers could accomplish in a trivial number of microseconds — bip! bap! done! — but with a human on one end of the messaging bus, it turned into a slow-motion train wreck. Completing the transaction required 17 separate data transmissions over the course of an entire minute — an eternity in the machine world. And the human in this case was operating at pretty much peak efficiency. I won’t even tell you what happened when Duplex called a restaurant to reserve a table. You could almost hear the steam coming out of the computer’s ears.

In our arrogance, we humans like to think of natural language processing as a technique aimed at raising the intelligence of machines to the point where they’re able to converse with us. Pichai’s demo suggests the reverse is true. Natural language processing is actually a technique aimed at dumbing down computers to the point where they’re able to converse with us. Google’s great breakthrough with Duplex came in its realization that by sprinkling a few monosyllabic grunts into computer-generated speech — um, ah, mmm — you could trick a human into feeling kinship with the machine. You ace the Turing test by getting machines to speak baby-talk.

Blogger’s still here?

TechCrunch has news of an update to Blogger. Nothing newsworthy about the update, really. What’s catching our eye is that Blogger still exists at all.

Blogger gets a spring cleaning
It’s surprising that Blogger is still around. I can’t remember the last time I saw a Blogger site in my searches, and it sure doesn’t have a lot of mindshare. Google also has let the platform linger and hasn’t integrated it with any of its newer services. The same thing could be said for Google+, too, of course. Google cuts some services because they have no users and no traction. That could surely be said for Blogger and Google+, but here they are, still getting periodic updates.

I used to have a blog on Blogger, and prompted by this article I’ve just had a very strange stroll down memory lane to visit it, via the Internet Archive’s marvellous Wayback Machine.

[Screenshot of the old blog: more-coffee-less-dukkha]

I really liked the look of that old blog. Very mid-2000s. Are there no blogs that look like this anymore?

Google’s creeping us out again

But it only wants to help; it’s for our own good.

Google wants to cure our phone addiction. How about that for irony?
This is Google doing what it always does. It is trying to be the solution to every aspect of our lives. It already wants to be our librarian, our encyclopedia, our dictionary, our map, our navigator, our wallet, our postman, our calendar, our newsagent, and now it wants to be our therapist. It wants us to believe it’s on our side.

There is something suspect about deploying more technology to use less technology. And something ironic about a company that fuels our tech addiction telling us that it holds the key to weaning us off it. It doubles as good PR, and pre-empts any future criticism about corporate irresponsibility.

And then there’s this. How many times have we had cause to say, ‘just because we can, doesn’t mean we should’?

Google’s new voice bot sounds, um, maybe too real
“Google Assistant making calls pretending to be human not only without disclosing that it’s a bot, but adding ‘ummm’ and ‘aaah’ to deceive the human on the other end with the room cheering it… horrifying. Silicon Valley is ethically lost, rudderless and has not learned a thing,” tweeted Zeynep Tufekci, a professor at the University of North Carolina at Chapel Hill who studies the social impacts of technology.

“As digital technologies become better at doing human things, the focus has to be on how to protect humans, how to delineate humans and machines, and how to create reliable signals of each—see 2016. This is straight up, deliberate deception. Not okay,” she added.

They know everything about us, and that’s ok?

I really need to stop reading articles about how our personal data is being used and abused by seemingly everyone on the internet. Nothing good can come from going over the same bad news. These from The Guardian are the last ones, I promise.

Why have we given up our privacy to Facebook and other sites so willingly?
If you think you’re a passive user of Facebook, minimising the data you provide to the site or refraining from oversharing details of your life, you have probably underestimated the scope of its reach. Facebook doesn’t just learn from the pictures you post, and the comments you leave: the site learns from which posts you read and which you don’t; it learns from when you stop scrolling down your feed and how long it takes you to restart; it learns from your browsing on other websites that have nothing to do with Facebook itself; and it even learns from the messages you type out then delete before sending (the company published an academic paper on this “self-censorship” back in 2013).

[…]

Lukasz Olejnik, an independent security and privacy researcher, agrees: “Years ago, people and organisations used to shift the blame on the users, even in public. This blaming is unfortunate, because expecting users to be subject-matter experts and versed in the obscure technical aspects is misguided.

“Blaming users is an oversimplification, as most do not understand the true implications when data are shared – they cannot. You can’t expect people to fully appreciate the amount of information extracted from aggregated datasets. That said, you can’t expect users to know what is really happening with their data if it’s not clearly communicated in an informed consent prompt, which should in some cases include also the consequences of hitting ‘I agree’.”

So what kind of data are we talking about? What are we sharing? Everything from where we’ve been, what we’ve ever watched or searched for, to even what we’ve deleted.

Are you ready? This is all the data Facebook and Google have on you
This information has millions of nefarious uses. You say you’re not a terrorist. Then how come you were googling Isis? Work at Google and you’re suspicious of your wife? Perfect, just look up her location and search history for the last 10 years. Manage to gain access to someone’s Google account? Perfect, you have a chronological diary of everything that person has done for the last 10 years.

This is one of the craziest things about the modern age. We would never let the government or a corporation put cameras/microphones in our homes or location trackers on us. But we just went ahead and did it ourselves because – to hell with it! – I want to watch cute dog videos.

And texts and calls too.

Facebook logs SMS texts and calls, users find as they delete accounts
Facebook makes it hard for users to delete their accounts, instead pushing them towards “deactivation”, which leaves all personal data on the company’s servers. When users ask to permanently delete their accounts, the company suggests: “You may want to download a copy of your info from Facebook.” It is this data dump that reveals the extent of Facebook’s data harvesting – surprising even for a company known to gather huge quantities of personal information.

So what can be done?

Beware the smart toaster: 18 tips for surviving the surveillance age
Just over a week ago, the Observer broke a story about how Facebook had failed to protect the personal information of tens of millions of its users. The revelations sparked a #DeleteFacebook movement and some people downloaded their Facebook data before removing themselves from the social network. During this process, many of these users were shocked to see just how much intel about them the internet behemoth had accumulated. If you use Facebook apps on Android, for example – and, even inadvertently, gave it permission – it seems the company has been collecting your call and text data for years.

It’s not me, it’s you! So Facebook protested, in the wake of widespread anger about its data-collection practices. You acquiesced to our opaque privacy policies. You agreed to let us mine and monetise the minutiae of your existence. Why are you so upset?

Most of the tips the article lists fail to really address the issues above: they’re more about securing your accounts against hackers than about dealing with Facebook and Google’s intrusions and opaque consent agreements. But a couple are worth highlighting.

12. Sometimes it’s worth just wiping everything and starting over
Your phone, your tweets, your Facebook account: all of these things are temporary. They will pass. Free yourself from an obsession with digital hoarding. If you wipe your phone every year, you learn which apps you need and which are just sitting in the background hoovering up data. If you wipe your Facebook account every year, you learn which friends you actually like and which are just hanging on to your social life like a barnacle.

[…]

18. Finally, remember your privacy is worth protecting
You might not have anything to hide (except your embarrassing Netflix history) but that doesn’t mean you should be blase about your privacy. Increasingly, our inner lives are being reduced to a series of data points; every little thing we do is for sale. As we’re starting to see, this nonstop surveillance changes us. It influences the things we buy and the ideas we buy into. Being more mindful of our online behaviour, then, isn’t just important when it comes to protecting our information, it’s essential to protecting our individuality.

Big bad numbers

TechCrunch has a summary of the latest report from Google on its attempts to clear up its mess. Some of the numbers are incredible.

In 2017, Google removed 3.2B ‘bad ads’ and blocked 320K publishers, 90K sites, 700K mobile apps
Google also removed 130 million ads for malicious activity abuses, such as trying to get around Google’s ad review. And 79 million ads were blocked because clicking on them led to sites with malware, while 400,000 sites containing malware were also removed as part of that process. Google also identified and blocked 66 million “trick to click” ads and 48 million ads that tricked you into downloading software.

Sounds impressive, but that’s not all they’re trying to tackle currently.

The bad ads report publication comes in the wake of Google taking a much more proactive stance tackling harmful content on one of its most popular platforms, YouTube. In February, the company announced that it would be getting more serious about how it evaluated videos posted to the site, and penalising creators through a series of “strikes” if they were found to be running afoul of Google’s policies.

The strikes have been intended to hit creators where it hurts them most: by curtailing the monetisation and discoverability of their videos.

This week, Google started to propose a second line of attack to try to raise the level of conversation around questionable content: it plans to post alternative facts from Wikipedia alongside videos that carry conspiracy theories (although it’s not clear how Google will determine which videos are conspiracies, and which are not).

That sounds quite intractable. It will be interesting to see how that plays out.

A self-spamming university?

Google blocks U. of Illinois at Chicago from emailing its own students
The University of Illinois at Chicago recently found itself living a modern nightmare: Google’s automated cybersecurity regime mistook the university as the culprit in a spam attack on the university’s students and began blocking university email accounts from sending messages to Gmail users.

The blocking went on for more than two weeks, and the affected Gmail users included 13,000 of the university’s own students. University officials describe those two weeks as a Kafkaesque state of limbo.

Gmail’s beginnings and consequences

How Gmail happened: the inside story of its launch 10 years ago
But serious search practically begged for serious storage: It opened up the possibility of keeping all of your email, forever, rather than deleting it frantically to stay under your limit. That led to the eventual decision to give each user 1GB of space, a figure Google settled on after considering capacities that were generous but not preposterous, such as 100MB.

An interesting read about the cautious beginnings of what now seems like such a no brainer. But consider that passage above with this one from Barclay T Blair, information governance expert, in a post entitled “There is no harm in keeping tiny emails”. He had found an article that he thought…

“There is no harm in keeping tiny emails”
… nicely summed up the attitude I encounter from IT and others in our information governance engagements. Ask an attorney sometime if there really is “no harm in keeping tiny emails around in this age of ever-expanding storage space.” The drug dealers of the IG world have really done an incredible job convincing the addicts that the drug has no downside.

On owning your own data

On owning your own data
The problem, of course, is this wretched business model that has your landlord snooping on you and keeping all that information in the first place. If they didn’t have that information — or if that information was encrypted in a manner that only you could access it — they couldn’t share your information even if they wanted to.