Signal, an app for encrypted messaging and calls, is being blocked in Egypt and the United Arab Emirates.

The team behind the app has found a way around the block, so that messages still get sent and received:

Signal’s new anti-censorship feature uses a trick called “domain fronting,” Marlinspike explains. A country like Egypt, with only a few small internet service providers tightly controlled by the government, can block any direct request to a service on its blacklist. But clever services can circumvent that censorship by hiding their traffic inside of encrypted connections to a major internet service, like the content delivery networks (CDNs) that host content closer to users to speed up their online experience — or in Signal’s case, Google’s App Engine platform, designed to host apps on Google’s servers.

“Now when people in Egypt or the United Arab Emirates send a Signal message, it’ll look identical to something like a Google search,” Marlinspike says. “The idea is that using Signal will look like using Google; if you want to block Signal you’ll have to block Google.”

The trick works because Google’s App Engine allows developers to redirect traffic from Google.com to their own domain. Google’s use of TLS encryption means that contents of the traffic, including that redirect request, are hidden, and the internet service provider can see only that someone has connected to Google.com. That essentially turns Google into a proxy for Signal, bouncing its traffic and fooling the censors.
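To make the trick more concrete, here is a minimal sketch of what domain fronting looks like at the HTTP level. The hostnames are hypothetical (signal-reflector.appspot.com is made up, not Signal’s actual backend): the TLS connection and SNI point at google.com, which is all the censor can see, while the Host header naming the real destination travels inside the encrypted tunnel.

```python
# Sketch of domain fronting (hypothetical hostnames, not Signal's real setup).
# The censor only observes a TLS connection to www.google.com; the Host header,
# which names the actual App Engine backend, is hidden inside the encryption.
import requests

FRONT_DOMAIN = "www.google.com"                  # what the censor sees (SNI / IP)
HIDDEN_BACKEND = "signal-reflector.appspot.com"  # hypothetical App Engine app

response = requests.get(
    f"https://{FRONT_DOMAIN}/",
    headers={"Host": HIDDEN_BACKEND},  # routed to the hidden app by Google's front end
    timeout=10,
)
print(response.status_code)
```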

(Via Bruce Schneier)

The Electronic Frontier Foundation bought a page in Wired to urge tech companies to prepare for Trump

If you want to give yourself a geeky Christmas present, sign up for a monthly donation to the EFF. I did it exactly a year ago, at $10 a month: they even send you a lovely t-shirt as a thank-you.

(via Boing Boing)

Cory Doctorow, on Boing Boing:

To do this, they made tiny alterations to the transparency values of the individual pixels of the accompanying banner ads, which were in the PNG format, which allows for pixel-level gradations in transparency. The javascript sent by the attackers would run through the pixels in the banners, looking for ones with the telltale alterations, then it would turn that tweaked transparency value into a character. By stringing all these characters together, the javascript would assemble a new program, which it would then execute on the target’s computer.
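The excerpt doesn’t specify the exact encoding the attackers used, but the idea is easy to illustrate. Below is a hedged sketch that assumes a hypothetical scheme in which each payload byte is stored as a small offset from full opacity in a pixel’s alpha channel; the real campaign’s encoding differs, this only shows the general technique of reassembling code from per-pixel transparency tweaks.

```python
# Sketch of alpha-channel steganography extraction (hypothetical scheme:
# each payload byte is stored as 255 - alpha; untouched pixels stay at 255).
# The real exploit kit's encoding is different; this only illustrates the idea.
from PIL import Image

def extract_hidden_script(png_path: str) -> str:
    img = Image.open(png_path).convert("RGBA")
    payload = []
    for _r, _g, _b, alpha in img.getdata():
        offset = 255 - alpha          # tweaked pixels carry a non-zero offset
        if 0 < offset < 128:          # plausible ASCII range for the payload
            payload.append(chr(offset))
    return "".join(payload)           # the reassembled program the attackers would run
```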

Ads should be blocked not because they are ugly to look at, but because blocking them is the only way to browse the web without compromising your own security and privacy.

The United Kingdom will keep its citizens’ browsing history for 12 months

How nice, at last we are all a little safer (apart from those few assholes who have something to hide from us). Things are blowing up at every turn these days, so it is only right, in our anguish, to entrust ourselves to strong manly men and pass laws that slowly strip citizens of their freedom, while having no effect whatsoever on national security.

The Guardian:

The new surveillance law requires web and phone companies to store everyone’s web browsing histories for 12 months and give the police, security services and official agencies unprecedented access to the data.

It also provides the security services and police with new powers to hack into computers and phones and to collect communications data in bulk. The law requires judges to sign off police requests to view journalists’ call and web records, but the measure has been described as “a death sentence for investigative journalism” in the UK.

Electronic Frontier Foundation:

If Mr. Trump carries out these plans, they will likely be accompanied by unprecedented demands on tech companies to hand over private data on people who use their services. This includes the conversations, thoughts, experiences, locations, photos, and more that people have entrusted platforms and service providers with. Any of these might be turned against users under a hostile administration.

The suggestions range from allowing anonymous access to a site (without forcing people to sign up with their first and last name) to deleting the data collected in the background while they browse, if that data really must be collected in the first place.

Maciej Cegłowski offered similar suggestions in one of his talks.

Mozilla has updated Firefox Focus, previously just a content blocker, turning it into a fully private browser for iOS:

Firefox Focus is set by default to block many of the trackers that follow you around the Web. You don’t need to change privacy or cookie settings. You can browse with peace of mind, feeling confident in the knowledge that you can instantly erase your sessions with a single tap – no menus needed.

When Google bought the advertising network DoubleClick in 2007, it promised that the data collected through ad tracking would not be mixed with the other data Google already holds on its users through the services it offers. In other words, Google promised it would not attach a first and last name to the data it collects online: the sites we visit, or how we behave and where we click on those sites.

Over the summer, that promise disappeared from the privacy policy:

But this summer, Google quietly erased that last privacy line in the sand – literally crossing out the lines in its privacy policy that promised to keep the two pots of data separate by default. In its place, Google substituted new language that says browsing habits “may be” combined with what the company learns from the use of Gmail and other tools. […]

The move is a sea change for Google and a further blow to the online ad industry’s longstanding contention that web tracking is mostly anonymous. In recent years, Facebook, offline data brokers and others have increasingly sought to combine their troves of web tracking data with people’s real names. But until this summer, Google held the line.

The consequence is that now, thanks to the DoubleClick ads scattered across the web, and thanks to Gmail and the other services the company offers, Google can serve even more targeted advertising and build detailed profiles of its users, based on what we write in our emails, on the searches we run and, now, also on the sites we visit.

In early June, the data (including passwords and email addresses) of 117 million LinkedIn accounts appeared on the dark web, obtained in the breach LinkedIn suffered back in 2012 (you can check whether your account was among those compromised at haveibeenpwned.com).

As Ars Technica explains, every time there is a leak of this size and significance (recently: Ashley Madison), hackers get a little better at guessing our passwords on other sites, since they can draw on the data already collected (both about us, our preferences and personal details, and about the passwords themselves), compiling ever longer lists of potential combinations and passwords:

Back in the early days of password cracking, we didn’t have much insight into the way people created passwords on a macro scale. Sure, we knew about passwords like 123456, password, secret, letmein, monkey, etc., but for the most part we were attacking password hashes with rather barbaric techniques—using literal dictionaries and stupid wordlists like klingon_words.txt. Our knowledge of the top 1,000 passwords was at least two decades old. We were damn lucky to find a password database with only a few thousand users, and when you consider the billions of accounts in existence even back then, our window into the way users created passwords was little more than a pinhole. […]

When you take both RockYou and LinkedIn and combine them with eHarmony, Stratfor, Gawker, Gamigo, Ashley Madison, and dozens of other smaller public password breaches, hackers will simply be more prepared than ever for the next big breach.
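To see why merged breach corpora matter, here is a minimal sketch of the basic dictionary attack they feed. It assumes unsalted SHA-1 hashes (the format of the 2012 LinkedIn dump); the file names are hypothetical, and real crackers layer mangling rules and GPU hashing on top of this.

```python
# Sketch: checking unsalted SHA-1 hashes (the 2012 LinkedIn format) against
# candidate passwords merged from earlier public breaches.
# File names are hypothetical; real tools add mangling rules and run on GPUs.
import hashlib

def load_candidates(*wordlist_paths):
    """Merge several leaked-password wordlists into one deduplicated set."""
    seen = set()
    for path in wordlist_paths:
        with open(path, encoding="utf-8", errors="ignore") as f:
            for line in f:
                seen.add(line.rstrip("\n"))
    return seen

def crack(hash_file, candidates):
    """Return {hash: password} for every candidate that matches a target hash."""
    targets = {line.strip().lower() for line in open(hash_file)}
    hits = {}
    for pw in candidates:
        digest = hashlib.sha1(pw.encode()).hexdigest()
        if digest in targets:
            hits[digest] = pw
    return hits

candidates = load_candidates("rockyou.txt", "gawker.txt", "stratfor.txt")
print(len(crack("linkedin_sha1_hashes.txt", candidates)), "passwords recovered")
```

Every new breach makes the candidate set (and the rules derived from it) better, which is exactly the point Ars is making.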

Maciej Ceglowski on why the tech industry should stop collecting as much data as possible on its users, thereby preventing a future government, acting out of fear and terror, from demanding that the data be put to terrible use.

We tend to imagine dystopian scenarios as one where a repressive government uses technology against its people. But what scares me in these scenarios is that each one would have broad social support, possibly majority support. Democratic societies sometimes adopt terrible policies.

When we talk about the moral economy of tech, we must confront the fact that we have created a powerful tool of social control. Those who run the surveillance apparatus understand its capabilities in a way the average citizen does not. My greatest fear is seeing the full might of the surveillance apparatus unleashed against a despised minority, in a democratic country. […]

We have to stop treating computer technology as something unprecedented in human history. Not every year is Year Zero. This is not the first time an enthusiastic group of nerds has decided to treat the rest of the world as a science experiment. Earlier attempts to create a rationalist Utopia failed for interesting reasons, and since we bought those lessons at a great price, it would be a shame not to learn them.

There is also prior art in attempts at achieving immortality, limitless wealth, and Galactic domination. We even know what happens if you try to keep dossiers on an entire country.

At yesterday’s keynote, Craig Federighi explained that Apple will use “differential privacy” techniques to keep private and secure the user data it needs in order to improve and deliver its services. That way, if a government or any third party were to get hold of this data, it should not be able to extract any reliable information about a specific individual:

We believe you should have great features and great privacy. Differential privacy is a research topic in the areas of statistics and data analytics that uses hashing, subsampling and noise injection to enable…crowdsourced learning while keeping the data of individual users completely private. Apple has been doing some super-important work in this area to enable differential privacy to be deployed at scale.

In very simple (and vague) terms: instead of anonymizing a dataset (which doesn’t work, since such datasets are often de-anonymized without much trouble), Apple injects, for example, fake data into it, making the answers of individual users unreliable. Overall usage patterns still emerge, but the behaviors attributed to a specific user may turn out to be bogus.

Wired explains:

Differential privacy, translated from Apple-speak, is the statistical science of trying to learn as much as possible about a group while learning as little as possible about any individual in it. With differential privacy, Apple can collect and store its users’ data in a format that lets it glean useful notions about what people do, say, like and want. But it can’t extract anything about a single, specific one of those people that might represent a privacy violation. And neither, in theory, could hackers or intelligence agencies. […]

As an example of that last method [noise injection], Microsoft’s Dwork points to the technique in which a survey asks if the respondent has ever, say, broken a law. But first, the survey asks them to flip a coin. If the result is tails, they should answer honestly. If the result is heads, they’re instructed to flip the coin again and then answer “yes” for heads or “no” for tails. The resulting random noise can be subtracted from the results with a bit of algebra, and every respondent is protected from punishment if they admitted to lawbreaking.
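A small simulation makes the coin-flip survey concrete: since P(yes) = 0.5·p + 0.25, the true rate p can be recovered from the noisy answers as roughly 2·(observed yes rate) − 0.5, while no single answer incriminates anyone. The sketch below is only an illustration of the randomized-response idea Dwork describes, not Apple’s actual mechanism.

```python
# Simulation of the coin-flip randomized response described above.
# Each respondent: tails -> answer honestly; heads -> flip again and answer
# "yes" on heads, "no" on tails. Individually deniable, collectively accurate.
import random

def randomized_response(truth: bool) -> bool:
    if random.random() < 0.5:          # first flip came up tails: answer honestly
        return truth
    return random.random() < 0.5       # heads: second flip decides the answer

def estimate_true_rate(answers):
    observed_yes = sum(answers) / len(answers)
    # P(yes) = 0.5 * p + 0.25, so solve for p:
    return 2 * observed_yes - 0.5

population = [random.random() < 0.3 for _ in range(100_000)]  # 30% true "yes"
answers = [randomized_response(t) for t in population]
print(round(estimate_true_rate(answers), 3))  # close to 0.30, without trusting any one answer
```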

The documentary Vice made with Edward Snowden, aired on HBO.

The new Kurzgesagt video reminds us of the consequences for our privacy, our individual freedom, and our rights when, gripped by panic and fear, we turn to mass surveillance to confront terrorism, to very little effect.

The Snoopers’ Charter is a bill from Home Secretary Theresa May that would force phone operators and smartphone manufacturers operating in the UK to keep, for 12 months, a record of users’ data traffic, of the emails they send, of their communications on social networks, and of the calls they make and receive.

Good for them. Wired has a long article on it:

Encryption is widely available to anyone motivated to use it, but WhatsApp is pushing it much farther into the mainstream than anyone else. Apple, for instance, encrypts the data sitting on an iPhone, and it uses end-to-end encryption to hide the messages that travel over its own iMessage texting service. But iMessage is only available on iPhones. Over the years, Apple has sold about 800 million iPhones. But it’s hard to know how many are still in use, or how many people who have them are communicating via iMessage anyway. WhatsApp runs on just about every kind of phone. Plus, Apple’s techniques have some gaping holes. Most notably, many users back up their iMessages to Apple’s iCloud service, which negates the end-to-end encryption. WhatsApp, meanwhile, has a billion users on its service right now.

Farhad Manjoo:

Consider all the technologies we think we want — not just better and more useful phones, but cars that drive themselves, smart assistants you control through voice, or household appliances that you can monitor and manage from afar. Many will have cameras, microphones and sensors gathering more data, and an ever-more-sophisticated mining effort to make sense of it all. Everyday devices will be recording and analyzing your every utterance and action. […]

But if Apple is forced to break its own security to get inside a phone that it had promised users was inviolable, the supposed safety of the always-watching future starts to fall apart. If every device can monitor you, and if they can all be tapped by law enforcement officials under court order, can anyone ever have a truly private conversation? Are we building a world in which there’s no longer any room for keeping secrets?

“This case can’t be a one-time deal,” said Neil Richards, a professor at the Washington University School of Law. “This is about the future.”