“Anonymity is like virginity. You don’t get it back once you’ve lost it,” writes one Register reader on Microsoft’s latest raid on your privacy.
Microsoft pulled a major update for Windows after it blew away the user’s privacy settings, allowing app developers and advertisers to glean the user’s identity.
But that’s only part of the story, which gets murkier by the day.
We already knew Windows 10 Threshold deleted third-party data monitoring tools and cleanup tools, including stalwarts like Spybot and CCleaner. It even disabled Cisco’s VPN software. Just a bug, said Microsoft.
Two bugs would be a puzzling coincidence – but something else makes it altogether more troubling.
This year Microsoft introduced a background tracking service called DiagTrack, or the Diagnostics Tracking Service. It was added to Windows 8.1 installations as well as betas of Windows 10. It arrived without much fanfare on May 14, in the shape of a patch, KB3022345.
It was just one of several slurping enhancements added via the back door.
The data that DiagTrack collected was typical of a spyware programme. The only way you knew you were being monitored was by eyeballing the list of running processes in Task Manager. As Microsoft explained:
Examples of data we collect include your name, email address, preferences and interests; browsing, search and file history; phone call and SMS data; device configuration and sensor data; and application usage.
Users thought it had disappeared in recent Windows 10 builds – but it hadn’t. Microsoft had simply renamed it.
The sinister-sounding tracking app was now the beatific and caring “Connected User Experiences and Telemetry Service”. Once again, it needs to be disabled manually (this time through the Services control panel).
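For those who prefer the command line, the renamed service can also be stopped and disabled from an elevated prompt rather than through the Services panel – internally it is still registered under its old DiagTrack service name. A sketch only; verify the service name on your particular build before relying on it:

```
:: Run from an elevated command prompt.
:: "DiagTrack" remains the internal name behind the
:: "Connected User Experiences and Telemetry" display name.
sc stop DiagTrack
sc config DiagTrack start= disabled
```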
“It is this kind of overriding desire for control and a disregard for user choices which is harming Windows 10,” says Forbes journo Gordon Kelly, and he’s right.
Microsoft spent millions portraying Google as a greedy and amoral data marauder. Redmond doesn’t need to read your email, it told everyone. The Scroogled campaign positioned Microsoft itself as the ethical alternative; the occupier of the moral high ground.
For a while, it was. But Windows 10 is bad for your privacy, and it is damaging Microsoft’s reputation as a trusted consumer brand.
November 19, 2015—The Nmap Project is pleased to announce the immediate, free availability of the Nmap Security Scanner version 7.00 from https://nmap.org/. It is the product of three and a half years of work, nearly 3200 code commits, and more than a dozen point releases since the big Nmap 6 release in May 2012. Nmap turned 18 years old in September this year and celebrates its birthday with 171 new NSE scripts, expanded IPv6 support, world-class SSL/TLS analysis, and more user-requested features than ever. We recommend that all current users upgrade.
If you need to visualise how Cisco routers work, this video is the fastest way to see routers and switches in action.
When you watch your Smart TV, it could also be watching you.
A new report from Julia Angwin at ProPublica reveals that Vizio, a top television maker, automatically tracks the viewing habits of Smart TV owners and shares that information with advertisers in a way that could connect those preferences to what those customers do on their phones or other mobile devices.
Vizio’s “Smart Interactivity Program” is turned on by default for its 10 million Smart TV customers, ProPublica reported, and works like this: The company analyzes snippets of what you watch, be it on Netflix or traditional television, and connects patterns in your viewing behavior with your Internet Protocol address — an online identifier that can be used to pinpoint every device connected from your home. That includes everything from your laptop and phone to your smart thermostat. That information is then shared with Vizio’s partners, who in turn could use that data to help to target advertisements.
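The linking step itself is trivial once every device's traffic is keyed by the household's IP address. A minimal sketch of the idea (the IP addresses, device names and activities here are invented for illustration; a real system works from automatic content recognition fingerprints):

```python
# Sketch of IP-based cross-device linking: every event seen from the same
# public IP address is attributed to one household profile.
from collections import defaultdict

def link_by_ip(events):
    """Group (ip, device, activity) events into per-household profiles."""
    households = defaultdict(lambda: {"devices": set(), "activity": []})
    for ip, device, activity in events:
        households[ip]["devices"].add(device)
        households[ip]["activity"].append(activity)
    return dict(households)

# Invented example data: one household's Smart TV and phone share an IP.
events = [
    ("203.0.113.7", "smart-tv", "watched: cooking show"),
    ("203.0.113.7", "phone",    "searched: recipe kits"),
    ("198.51.100.2", "laptop",  "watched: news"),
]
profiles = link_by_ip(events)
# Everything seen from 203.0.113.7 is now one linked profile.
```

Once the TV's viewing snippets and the phone's ad clicks land in the same bucket, an advertiser can close the loop between what was watched and what was bought.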
In an e-mailed response to a Washington Post inquiry, a Vizio spokesperson said the company’s data mining programs are part of a “revolutionary shift across all screens that brings measurability, relevancy and personalization to the consumer like never before.” Vizio said it shares “aggregate, anonymized data” with media and data companies so they can “make better-informed decisions” about content and advertising strategies.
There are laws that limit how companies share information about video watching habits, including the Video Privacy Protection Act (VPPA). However, Vizio says that those laws do not apply to its tracking service because the company associates IP addresses with the data rather than a person’s name or other “personally identifiable information.”
It’s true that some U.S. courts have held that IP addresses do not constitute personally identifiable information. However, privacy regulators in the European Union disagree. And IP addresses are increasingly used by data brokers to paint detailed portraits of who people are.
Too right – your IP identifies you.
Samsung and fellow Vizio rival LG Electronics have similar programs that track Smart TV customers’ viewing habits, but only if customers turn on the feature, according to ProPublica.
Don’t enable the feature, for heaven’s sake. Trust me, it ain’t worth it in the long term. If you find yourself blacklisted from work, life insurance or medical insurance, then it’s those “data brokers” who sell your data based on IPs that are to blame. Follow the money… who’s paying the data brokers? They are the guys you’re running from. Ditch Vizio, as they are selling out their own customer base.
Today, TLS connections predominantly use one of two families of cipher suites: RC4 based or AES-CBC based. However, in recent years both of these families of cipher suites have suffered major problems. TLS’s CBC construction was the subject of the BEAST attack (fixed with 1/n-1 record splitting or TLS 1.1) and Lucky13 (fixed with complex implementation tricks). RC4 was found to have key-stream biases that cannot effectively be fixed.
Although AES-CBC is currently believed to be secure when correctly implemented, a correct implementation is so complex that there remains strong motivation to replace it.
Clearly we need something better. An obvious alternative is AES-GCM (AES-GCM is AES in counter mode with a polynomial authenticator over GF(2^128)), which is already specified for TLS and has some implementations. Support for it is in the latest versions of OpenSSL and it has been the top preference cipher of Google servers for some time now. Chrome support should be coming in Chrome 31, which is expected in November. (Although we’re still fighting to get TLS 1.2 support deployed in Chrome 30 due to buggy servers.)
AES-GCM isn’t perfect, however. Firstly, implementing AES and GHASH (the authenticator part of GCM) in software in a way which is fast, secure and has good key agility is very difficult. Both primitives are suited to hardware implementations and good software implementations are worthy of conference papers. The fact that a naive implementation (which is also what’s recommended in the standard for GHASH!) leaks timing information is a problem.
AES-GCM also isn’t very quick on lower-powered devices such as phones, and phones are now a very important class of device. A standard phone (which is always defined by whatever I happen to have in my pocket; a Galaxy Nexus at the moment) can do AES-128-GCM at only 25MB/s and AES-256-GCM at 20MB/s (both measured with an 8KB block size).
Lastly, if we left things as they are, AES-GCM would be the only good cipher suite in TLS. While there are specifications for AES-CCM and for fixing the AES-CBC construction, they are all AES based and, in the past, having some diversity in cipher suites has proven useful. So we would be looking for an alternative even if AES-GCM were perfect.
In light of this, Google servers and Chrome will soon be supporting cipher suites based around ChaCha20 and Poly1305. These are primitives developed by Dan Bernstein and are fast, secure, have high quality, public domain implementations, are naturally constant time and have nearly perfect key agility.
On the same phone as the AES-GCM speeds were measured, the ChaCha20+Poly1305 cipher suite runs at 92MB/s (which should be compared against the AES-256-GCM speed as ChaCha20 is a 256-bit cipher).
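To make the "naturally constant time" claim concrete, here is a minimal pure-Python sketch of the ChaCha20 block function, in the IETF 96-bit-nonce form later standardised in RFC 7539: nothing but 32-bit additions, XORs and fixed rotations, with no secret-dependent table lookups or branches of the kind that make AES and GHASH hard to implement safely in software.

```python
# Pure-Python sketch of the ChaCha20 block function (RFC 7539 layout):
# a 4x4 state of 32-bit words, mixed by 20 rounds of add/rotate/XOR.
import struct

def _rotl32(v, n):
    return ((v << n) | (v >> (32 - n))) & 0xFFFFFFFF

def _quarter_round(s, a, b, c, d):
    s[a] = (s[a] + s[b]) & 0xFFFFFFFF; s[d] = _rotl32(s[d] ^ s[a], 16)
    s[c] = (s[c] + s[d]) & 0xFFFFFFFF; s[b] = _rotl32(s[b] ^ s[c], 12)
    s[a] = (s[a] + s[b]) & 0xFFFFFFFF; s[d] = _rotl32(s[d] ^ s[a], 8)
    s[c] = (s[c] + s[d]) & 0xFFFFFFFF; s[b] = _rotl32(s[b] ^ s[c], 7)

def chacha20_block(key, counter, nonce):
    """One 64-byte keystream block from a 32-byte key and 12-byte nonce."""
    state = list(struct.unpack("<4I", b"expand 32-byte k"))  # constants
    state += struct.unpack("<8I", key)                        # 256-bit key
    state += [counter] + list(struct.unpack("<3I", nonce))    # counter+nonce
    working = state[:]
    for _ in range(10):  # 10 double rounds = 20 rounds
        _quarter_round(working, 0, 4, 8, 12)   # columns
        _quarter_round(working, 1, 5, 9, 13)
        _quarter_round(working, 2, 6, 10, 14)
        _quarter_round(working, 3, 7, 11, 15)
        _quarter_round(working, 0, 5, 10, 15)  # diagonals
        _quarter_round(working, 1, 6, 11, 12)
        _quarter_round(working, 2, 7, 8, 13)
        _quarter_round(working, 3, 4, 9, 14)
    out = [(w + s) & 0xFFFFFFFF for w, s in zip(working, state)]
    return struct.pack("<16I", *out)

# RFC 7539 section 2.3.2 test vector: key 00..1f, counter 1.
key = bytes(range(32))
nonce = bytes.fromhex("000000090000004a00000000")
block = chacha20_block(key, 1, nonce)
```

The keystream is then XORed with the plaintext; Poly1305 supplies the authenticator. A tuned implementation of exactly these operations is what reaches the 92MB/s figure above.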
In addition to support in Chrome and on Google’s servers, my colleague Elie Bursztein and I are working on patches for NSS and OpenSSL to support this cipher suite. (And I should thank Dan Bernstein, Andrew M, Ted Krovetz and Peter Schwabe for their excellent, public domain implementations of these algorithms. Also Ben Laurie and Wan-Teh Chang for code reviews, suggestions etc.)
But while AES-GCM’s hardware orientation is troublesome for software implementations, it’s obviously good news for hardware implementations and some systems do have hardware AES-GCM support. Most notably, Intel chips have had such support (which they call AES-NI) since Westmere. Where such support exists, it would be a shame not to use it because it’s constant time and very fast (see slide 17). So, once ChaCha20+Poly1305 is running, I hope to have clients change their cipher suite preferences depending on the hardware that they’re running on, so that, in cases where both client and server support AES-GCM in hardware, it’ll be used.
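The preference logic amounts to a toy sketch like the following (the suite names are illustrative labels, not real TLS cipher suite identifiers, and real negotiation involves far more than a preference list):

```python
# Sketch: order the client's cipher suite preferences by hardware support,
# then take the first client preference the server also supports.
def order_cipher_suites(has_aes_hardware):
    if has_aes_hardware:  # e.g. AES-NI present: hardware AES-GCM wins
        return ["AES-128-GCM", "CHACHA20-POLY1305"]
    return ["CHACHA20-POLY1305", "AES-128-GCM"]  # phones, older CPUs

def negotiated_suite(client_prefs, server_supported):
    for suite in client_prefs:
        if suite in server_supported:
            return suite
    return None
```

So a desktop with AES-NI talking to a server that supports both ends up on AES-GCM, while a phone talking to the same server ends up on ChaCha20+Poly1305.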
To wrap all this up, we need to solve a long-standing browser TLS problem: in order to deal with buggy HTTPS servers on the Internet (of which there are many, sadly), browsers will retry failed HTTPS connections with lower TLS version numbers in order to try and find a version that doesn’t trigger the problem. As a last attempt, they’ll try an SSLv3 connection with no extensions.
Several useful features get jettisoned when this occurs but the important one for security, up until now, has been that elliptic curve support is disabled in SSLv3. For servers that support ECDHE but not DHE that means that a network attacker can trigger version downgrades and remove forward security from a connection. Now that AES-GCM and ChaCha20+Poly1305 are important we have to worry about them too as these cipher suites are only defined for TLS 1.2.
Something needs to be done to fix this so, with Chrome 31, Chrome will no longer downgrade to SSLv3 for Google servers. In this experiment, Google servers are being used as an example of non-buggy servers. The experiment hopes to show that networks are sufficiently transparent that they’ll let at least TLS 1.0 through. We know from Chrome’s statistics that connections to Google servers do end up getting downgraded to SSLv3 sometimes, but all sorts of random network events can trigger a downgrade. The fear is that there’s some common network element that blocks TLS connections by deep-packet inspection, which we’ll measure by breaking them and seeing how many bug reports we get.
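The fallback policy can be sketched as follows, with the known-good servers represented as a plain set (an illustrative simplification of what Chrome actually does):

```python
# Sketch: the ordered list of protocol versions a browser will retry with.
# Hosts known not to be buggy (here, a stand-in for Google servers) never
# get the final SSLv3 fallback, so a network attacker cannot force them
# below TLS 1.0.
FALLBACK_ORDER = ["TLS 1.2", "TLS 1.1", "TLS 1.0", "SSLv3"]

def fallback_versions(host, no_sslv3_hosts):
    versions = list(FALLBACK_ORDER)
    if host in no_sslv3_hosts:
        versions.remove("SSLv3")
    return versions
```

Because AES-GCM and ChaCha20+Poly1305 only exist in TLS 1.2, anything that can push a connection down this list also strips the good cipher suites; capping the fallback is what protects them.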
ImperialViolet is a great blog for those interested in the latest in crypto developments.
Privacy advocates are warning federal authorities of a new threat that uses inaudible, high-frequency sounds to surreptitiously track a person’s online behavior across a range of devices, including phones, TVs, tablets, and computers.
The ultrasonic pitches are embedded into TV commercials or are played when a user encounters an ad displayed in a computer browser. While the sound can’t be heard by the human ear, nearby tablets and smartphones can detect it. When they do, browser cookies can now pair a single user to multiple devices and keep track of what TV commercials the person sees, how long the person watches the ads, and whether the person acts on the ads by doing a Web search or buying a product.
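Detecting such a beacon is cheap: an app only needs to measure the energy at a handful of candidate frequencies, which the Goertzel algorithm does in a few lines. The sketch below synthesises an 18 kHz tone (inaudible to most adults yet within a phone microphone's range); the 18 kHz figure is an assumption for illustration, as SilverPush has not published its actual signalling scheme.

```python
# Sketch: detect energy at an assumed ultrasonic beacon frequency using
# the Goertzel algorithm (a single-bin DFT).
import math

def goertzel_power(samples, sample_rate, freq):
    """Relative power of `freq` in `samples` via the Goertzel recurrence."""
    coeff = 2.0 * math.cos(2.0 * math.pi * freq / sample_rate)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

RATE = 44100  # standard audio sample rate
# 0.1 s of a pure 18 kHz tone, standing in for a captured beacon.
tone = [math.sin(2 * math.pi * 18000 * n / RATE) for n in range(4410)]

beacon_power = goertzel_power(tone, RATE, 18000)
other_power = goertzel_power(tone, RATE, 15000)
# Energy at the beacon frequency dwarfs energy elsewhere in the spectrum.
```

An SDK embedded in an app can run exactly this kind of check continuously on microphone input, which is why "not listening to all nearby noise" is a weaker reassurance than it sounds.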
Cross-device tracking raises important privacy concerns, the Center for Democracy and Technology wrote in recently filed comments to the Federal Trade Commission. The FTC has scheduled a workshop on Monday to discuss the technology. Often, people use as many as five connected devices throughout a given day—a phone, computer, tablet, wearable health device, and an RFID-enabled access fob. Until now, there hasn’t been an easy way to track activity on one and tie it to another.
“As a person goes about her business, her activity on each device generates different data streams about her preferences and behavior that are siloed in these devices and services that mediate them,” CDT officials wrote. “Cross-device tracking allows marketers to combine these streams by linking them to the same individual, enhancing the granularity of what they know about that person.”
The officials said that companies with names including SilverPush, Drawbridge, and Flurry are working on ways to pair a given user to specific devices. Adobe is developing similar technologies. Without a doubt, the most concerning of the companies the CDT mentioned is San Francisco-based SilverPush.
CDT officials wrote:
Cross-device tracking can also be performed through the use of ultrasonic inaudible sound beacons. Compared to probabilistic tracking through browser fingerprinting, the use of audio beacons is a more accurate way to track users across devices. The industry leader of cross-device tracking using audio beacons is SilverPush. When a user encounters a SilverPush advertiser on the web, the advertiser drops a cookie on the computer while also playing an ultrasonic audio through the use of the speakers on the computer or device. The inaudible code is recognized and received on the other smart device by the software development kit installed on it. SilverPush also embeds audio beacon signals into TV commercials which are “picked up silently by an app installed on a [device] (unknown to the user).” The audio beacon enables companies like SilverPush to know which ads the user saw, how long the user watched the ad before changing the channel, which kind of smart devices the individual uses, along with other information that adds to the profile of each user that is linked across devices.
The user is unaware of the audio beacon, but if a smart device has an app on it that uses the SilverPush software development kit, the software on the app will be listening for the audio beacon and once the beacon is detected, devices are immediately recognized as being used by the same individual. SilverPush states that the company is not listening in the background to all of the noises occurring in proximity to the device. The only factor that hinders the receipt of an audio beacon by a device is distance and there is no way for the user to opt-out of this form of cross-device tracking. SilverPush’s company policy is to not “divulge the names of the apps the technology is embedded,” meaning that users have no knowledge of which apps are using this technology and no way to opt-out of this practice. As of April of 2015, SilverPush’s software is used by 67 apps and the company monitors 18 million smartphones.
SilverPush’s ultrasonic cross-device tracking was publicly reported as long ago as July 2014. More recently, the company received a new round of publicity when it obtained $1.25 million in venture capital. The CDT letter appears to be the first time the privacy-invading potential of the company’s product has been discussed in detail. SilverPush officials didn’t respond to e-mail seeking comment for this article.
Cross-device tracking already in use
The CDT letter went on to cite articles reporting that cross-device tracking has been put to use by more than a dozen marketing companies. The technology, which is typically not disclosed and can’t be opted out of, makes it possible for marketers to assemble a shockingly detailed snapshot of the person being tracked.
“For example, a company could see that a user searched for sexually transmitted disease (STD) symptoms on her personal computer, looked up directions to a Planned Parenthood on her phone, visits a pharmacy, then returned to her apartment,” the letter stated. “While previously the various components of this journey would be scattered among several services, cross-device tracking allows companies to infer that the user received treatment for an STD. The combination of information across devices not only creates serious privacy concerns, but also allows for companies to make incorrect and possibly harmful assumptions about individuals.”
Use of ultrasonic sounds to track users has some resemblance to badBIOS, a piece of malware that a security researcher said used inaudible sounds to bridge air-gapped computers. No one has ever proven badBIOS exists, but the use of the high-frequency sounds to track users underscores the viability of the concept.
Now that SilverPush and others are using the technology, it’s probably inevitable that it will remain in use in some form. But right now, there are no easy ways for average people to know if they’re being tracked by it and to opt out if they object. Federal officials should strongly consider changing that.
Your economic status, your prospects for buying a house, your pregnancy, your illness…your rape; it’s all for sale in one of the biggest and most clandestine global marketplaces ever to lobby and influence government. But the data brokers let the NSA take the public punches while they just take – and sell – your most intimate information…
For nearly two years, media coverage of the NSA has been near-constant, over concerns about the extent of their data collection on people around the world. But there’s an even larger behemoth in the shadows gathering information about you. Unlike the NSA, they are subject to few laws, very little accountability, and no oversight, laughing off investigative inquiries at even the highest levels of government. This is a massive ecosystem, with an insatiable desire to learn every detail of your life and then sell it to those who would use it to persuade you. In effect, it’s a sprawling black market – and as one would expect with a black market, many of the purchasers of this information are criminals who are using it to steal the identities and valuables of many. We can only hope that they’re the worst of the buyers.
Who are the data brokers and where do they get your data?
According to the U.S. Federal Trade Commission’s report on the industry in 2014, these private companies, called data brokers, buy and sell data about individuals obtained from myriad sources including government records, financial transactions/purchases, online activities, some medical records, phone records, etc. This information includes address histories, criminal records, financial history, family ties, and religious/political identifications. Using this information, brokers make inferences about individuals and divide them into segments like ‘Expectant Parent’, ‘Bible Lifestyle’, ‘Financially Challenged’, ‘Allergy Sufferer’, ‘Discount Shopper’, ‘Diabetes Interest’, and ‘Thrifty Elders’.
Some extremely sensitive information can be sold very cheaply. World Privacy Forum Executive Director Pam Dixon’s testimony before the U.S. Senate included a screen cap showing that MEDbase 200 was selling lists of rape victims for 7.9 cents per name, as well as similarly-priced lists of those suffering from HIV/AIDs, genetic diseases, addictive behavior (conveniently broken down into sub-categories like gambling, sex, alcohol, and drugs) and dementia. The listings were taken down soon after Dixon’s testimony.
One might imagine that information like that could be easily abused. Indeed, as early as 2007, the New York Times reported that the data broker infoUSA was selling lists of 3.3 million ‘Elderly Opportunity Seekers’ (older people ‘looking for ways to make money’), 4.7 million ‘Suffering Seniors’ dealing with cancer or Alzheimer’s, and 500,000 ‘Oldies but Goodies’ (gamblers over 55). One list specifically noted that ‘These people are gullible. They want to believe that their luck can change.’ Naturally, telemarketing fraudsters snapped up these lists like maps to gold mines. The New York Times reported that thousands of banking documents and court filings show the companies selling this information are constantly confronted by evidence that it is being used in fraud, yet even after U.S. government investigators warned infoUSA’s executives of the matter, the company did not stop dealing with criminals.
This is definitely not an isolated incident of criminals buying in bulk from data brokers, either. According to prosecution by the Federal Trade Commission, the company Ideal Financial Solutions purchased data on 2.2 million consumers from brokers and fraudulently siphoned millions of dollars directly from the victims’ bank accounts from 2009 to 2013. At least 16% of the records used in the scam were provided by broker LeapLab, who obtained victims’ bank information from loan applications, and resold 95% of these applications at $.50 each to ‘third parties who were not online lenders and had no legitimate need for this financial information’, including other brokers who then aggregated this information with other information and resold it yet again. This constant selling and reselling of data ensures that once information gets into the market, it can end up in anyone’s hands.
Acxiom: ‘Over 3,000 propensities for nearly every U.S. consumer’
The amount of information collected by the data brokers is truly massive. Data broker Acxiom’s 2014 Annual Report and its 2013 Annual Report (the sections are identical) state that its databases hold ‘Over 3,000 propensities for nearly every U.S. consumer’ and ‘Multi-sourced insight into approximately 700 million consumers worldwide.’ Individual dossiers are kept current through ‘nearly 11 trillion consumer record updates per year.’
While Acxiom is perhaps the most visible of the classic data brokers (there’s even a creepy song comparing them to an omniscient God), others are not far off in database size. An executive at rival data broker Experian told me, under conditions of anonymity, that their own databases contained over 1,000 propensities on nearly every U.S. consumer.
Even these databases are eclipsed by the data collected and stored about individuals on social media sites. European requests for information disclosures suggest that Facebook permanently stores all activity on the site, no matter how minor. This includes indirect information like one’s physical location (apparently latitude and longitude down to the 8th decimal place) at the time of login.
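For a sense of scale, a quick back-of-the-envelope calculation of what an 8th-decimal-place coordinate implies, using the rough figure of ~111 km per degree of latitude:

```python
# Rough arithmetic: one degree of latitude is about 111.32 km (this varies
# slightly with latitude), so the 8th decimal place resolves to roughly a
# millimetre - far finer than any phone's GPS can actually measure.
METERS_PER_DEGREE_LAT = 111_320  # approximate

resolution_m = METERS_PER_DEGREE_LAT * 1e-8  # about 0.0011 m, i.e. ~1.1 mm
```

In other words, the stored precision vastly exceeds the precision of the measurement; the extra digits are noise, but the intent to record location as exactly as possible is clear.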
Obtaining someone’s location at any given point in time is fairly easy, by the way. A typical smartphone has four ways that it can be physically tracked: triangulation from cell towers, GPS, Bluetooth, and Wi-Fi signal – each of which is individually identifiable with serial codes that are globally unique to each phone’s hardware. According to former Wall Street Journal technical consultant Ashkan Soltani, the network provider Verizon leverages this data to monitor the physical behavior of its users and provides these insights in aggregate to advertising clients. Retailers (as well as airports and other locations where lots of people congregate) are now collecting this information at their stores in order to better monitor customer behavior.
It’s important to understand that gathering the raw data is only the first step in peering into individuals’ lives. By digging into the huge reams of data they’ve amassed, analysts can create models that extrapolate even more. As a New York Times article made infamous, retailer Target’s algorithms could use the purchases of key products to not only determine if a customer is pregnant, but accurately estimate when she is due so that they could send coupons corresponding to each stage of pregnancy.
Data brokers know about your drug use, your personality and your pregnancy
It doesn’t even take a large variety of data to reliably extrapolate a great deal of personal information. A well-known Cambridge-Microsoft study showed that using just the Facebook Likes of 58,000 volunteers, researchers could infer traits like a user’s sexual orientation, political/religious beliefs, drug use, IQ, and personality with remarkable accuracy. Indeed, a follow-up study showed that when these models were applied to the typical Facebook user they were more accurate in gauging the user’s personality than even friends and family members. Just as importantly, this model was also reapplied to identify Bing search users’ political and religious beliefs with only a small loss in accuracy, suggesting that models like these are not tethered to their sources. Many other studies have shown countless other ways that seemingly private information can be extrapolated from apparently innocuous publicly available data. Data brokers have far more than just our Likes recorded, so their models could be vastly more precise.
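The modelling involved is not exotic. Below is a toy sketch of the general approach – a logistic regression over binary Like indicators predicting a binary trait – on entirely synthetic data (the study itself applied regression to dimensionality-reduced Like data from its 58,000 volunteers; nothing here reproduces its actual features or results):

```python
# Toy sketch: logistic regression trained by stochastic gradient descent,
# predicting a synthetic binary trait from binary "Like" indicators.
import math

def train_logreg(X, y, epochs=500, lr=0.5):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi  # gradient of the log loss w.r.t. z
            b -= lr * g
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
    return w, b

def predict(w, b, xi):
    z = b + sum(wj * xj for wj, xj in zip(w, xi))
    return 1 if z > 0 else 0

# Columns: Likes of [page A, page B, page C]; label: some binary trait.
# In this invented data the trait correlates perfectly with liking page A.
X = [[1, 0, 0], [1, 1, 0], [0, 0, 1], [0, 1, 1], [1, 0, 1], [0, 0, 0]]
y = [1, 1, 0, 0, 1, 0]
w, b = train_logreg(X, y)
```

With millions of users and thousands of pages instead of six rows and three columns, the same mathematics yields the accuracy figures the study reports.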
As for how much information these brokers have collectively, that’s anyone’s guess. The head of the Federal Trade Commission, the U.S. agency that supposedly regulates the industry, has admitted that her agency doesn’t even know how many data brokers exist, let alone what they’re all doing. The industry is very intent on keeping its dealings in the shadows, defying even senior U.S. government investigations.
Even in the face of a Senate committee investigation the brokers refused to reveal their data sources and clients. The 2013 report notes ‘Three of the largest companies – Acxiom, Experian, and Epsilon – to date have been similarly secretive with the Committee with respect to their practices, refusing to identify the specific sources of their data or the customers who purchase it.’
The policy influence of the data brokers
A request by Senator Edward Markey for more specifics on the brokers’ clients was met with a general reply from Acxiom noting that though they would not divulge their clients’ identities (valuing their privacy very highly), their clients include ‘47 Fortune 100 clients’, ‘5 of the 13 largest U.S. federal government agencies’ and ‘Both major national political parties’. The reply also noted Acxiom’s ‘long history of proactively engaging with policy makers’, adding ‘We are in numerous policy groups, and in some cases, have been a driving force in their creation’.
The letter (no longer available on Markey’s site) hints at just how much power these companies wield over lawmakers. According to a 2013 report by the Interactive Advertising Bureau, U.S. politicians use the information from the brokers to micro-target their political campaigns, despite the fact (as a 2012 study showed) that the vast majority of Americans are very uncomfortable with politicians gathering information on them and using said information to tailor ads to them.
This may partly explain why there’s no real regulatory drive to bring the brokers’ activities into the light. The Data Broker Accountability and Transparency Act of 2015, for example, was proposed and then died in committee – as did its 2014 predecessor. It follows the noble tradition of the Data Security and Breach Notification Act (which would have been the first U.S. federal law requiring data brokers to inform consumers when hackers have stolen their data), which has been proposed and killed in committee every year from 2009 to 2015.
Lawmakers’ complicity in shielding data brokers from scrutiny does not protect them from the system’s horrific security issues.
In an experiment, Brian Krebs of Krebs on Security sought to see how easy it would be to obtain the full address histories and social security numbers of all 13 members of the U.S. Senate Commerce Committee Subcommittee on Consumer Protection, Product Safety and Insurance. He only needed to visit two identity theft service sites to obtain them all, as well as the same information about the heads of the Federal Trade Commission and the Consumer Financial Protection Bureau. If their information is up for purchase by criminals, how could we possibly imagine that any of the rest of ours is safe? Despite issues like this, the industry has been consistently adamant that it doesn’t need any further oversight or transparency.
When the Federal Trade Commission proposed a centralized list of data brokers that consumers might look over to better understand the market, Experian replied that this would ‘have the unintended effect of confusing consumers and eroding trust in e-commerce.’ One has to wonder why Experian believes that giving consumers more information about the industry’s practices would harm public trust of the means of data collection. Personally, I can’t ever recall losing trust in someone when I found out that they were acting honestly and benevolently. Indeed, the company even acknowledges in the same statement that there are ‘literally dozens and dozens of smaller data providers with long histories of questionable practices’, which is where Experian says the FTC needs to focus its efforts.
Experian’s purchase of Court Ventures
Experian is very well versed in smaller data providers with questionable practices. After all, they bought one.
Specifically, they had acquired Court Ventures, a court records broker that had been providing direct access to sensitive personal data (including social security numbers and bank information) on over 200 million Americans to the identity theft marketplace Superget.info. Experian VP of Government Affairs and Public Policy Tony Hadley testified to Congress in 2014 that the Secret Service notified the company of its subsidiary’s criminal activities nine months after Experian acquired it, and apparently Experian’s due diligence had failed to scrutinize the monthly wire transfers Court Ventures was receiving from Singapore. Regarding Court Ventures, Hadley testified ‘We were a victim, and scammed by this person.’
Just a few months earlier Hadley had testified before the same committee that ‘Experian shares data responsibly—by carefully safeguarding compliance with all privacy and consumer protection laws and industry self-regulatory standards, advancing and observing industry best practices, and establishing and monitoring adherence to our own corporate policies and practices.’
Despite the clear failure of these safeguards, Experian and the rest of the data broker industry continue to argue that they are sufficient, resisting all calls for further transparency and accountability.
The national security implications of the information that data brokers retain
The current lack of oversight not only allows criminals and arguably private companies to abuse personal data, but it may pose a national security threat. The recently revealed Chinese breach of a U.S. government database of 4.1 million employees is just part of a series of cyber-breaches (originating in China) of personal information databases, including the theft of 80 million Social Security records and the breach of United Airlines’ travel records database. U.S. officials and analysts conclude that the breaches are part of a Chinese campaign to build a massive database on U.S. citizens, the most obvious application being for espionage.
The current opaque data broker market could allow China and other governments to simply buy the information they want without having to steal it, particularly as the models for extrapolating data become ever more accurate.
Before we start rounding up angry mobs to storm the databases where our records are kept, however, let’s acknowledge that data brokers provide tangible benefits to all of us.
While I might find it slightly unsettling that Google Maps tells me the locations of where I live, work, and am about to go to a party, without my having to ask, it is convenient. While tailored advertising might sound manipulative, Amazon’s ‘You May Also Like’ has led me to many great books that I would have overlooked. Beyond clear examples of personal data analysis providing conveniences like this, we all indirectly benefit from it making institutions more efficient and keeping costs down.
Unfortunately, data brokers have learned that becoming more transparent opens them up to backlash from a public that still isn’t particularly aware of how much data is available about them and what’s being done with it, and so reacts to any such revelations with shock, confusion, and panic. Fear of such a backlash encourages these companies to remain in the shadows. Naturally, this only further fosters public ignorance and mistrust, which only makes it worse when the public gets a glimpse of what’s going on.
The cycle of mistrust keeps brokers operating what amounts to a black market for personal data. Black markets aren’t known for the integrity of their participants. They aren’t known for their efficiency or reliability either, so even the data brokers are suffering from the current state of affairs.
If we are to break the cycle, we need an open dialogue about how data is to be used.
Apple’s chief executive has sharply criticised surveillance powers proposed by the British government, warning that allowing spies a backdoor route into citizens’ communications could have “very dire consequences”.
Questioning a key element of the draft investigatory powers bill, which places a new legal obligation on companies to assist in these operations to bypass encryption, Tim Cook insisted that companies had to be able to encrypt in order to protect people.
Speaking during a visit to the UK, he said that halting or weakening encryption would hurt “the good people” rather than those who want to do bad things, who “know where to go”.
“You can just look around and see all the data breaches that are going on. These things are becoming more frequent,” Cook told the Daily Telegraph. “They can not only result in privacy breaches but also security issues. We believe very strongly in end-to-end encryption and no back doors. We don’t think people want us to read their messages. We don’t feel we have the right to read their emails.
“Any back door is a back door for everyone. Everybody wants to crack down on terrorists. Everybody wants to be secure. The question is how. Opening a back door can have very dire consequences.”
The draft investigatory powers bill, unveiled by the home secretary, Theresa May, on Wednesday, makes explicit in law for the first time the powers for security services and police to hack into and bug computers and phones.
These authorities would be allowed to access records tracking every UK citizen’s use of the internet without any judicial check. It includes new powers requiring internet and phone companies to keep “internet connection records” – which track every website visited but not every page – for a maximum of 12 months but will not require a warrant for the police, security services or other bodies to access the data.
Cook signalled, however, that there would be an outcome which he and others would find favourable, predicting: “When the public gets engaged, the press gets engaged deeply, it will become clear to people what needs to occur. You can’t weaken cryptography. You need to strengthen it. You need to stay ahead of the folks that want to break it.”
Cook’s comments echoed his remarks earlier this year where he warned against eroding the right to privacy and said that technology companies had a duty to protect their customers.
That speech, which was made in February, followed US government attempts to weaken encryption. In the wake of revelations from the National Security Agency whistleblower Edward Snowden, Apple and other tech firms have moved to strengthen encryption and faced a backlash from government officials.
The constant pushing of the Snoopers’ Charter suggests the government has missed recent data breaches such as Ashley Madison or TalkTalk…
Seriously, this data retention will be a massive honeypot. It will be hacked. People’s data will be stolen, and ID theft will occur.
This is a toxic situation for tech companies. My money is on these massive companies telling the British Government to take a hike (in loving terms).
Theresa May, with the general air of a hawk that had a This Morning makeover, has launched the new investigatory powers bill. No more drunken Googling: all it takes is a misspelled search for “bong-making” and suddenly you’ll be in an orange jumpsuit getting beaten with a pillowcase full of bibles. Also, pay attention when searching for a child’s prom.
This law will create lots of new jobs, as the person charged with reading all our communications (who will see more unsolicited erections than customer services at Skype) will regularly feed their screaming face into a meatgrinder.
The government insists, as it tries to scrap the Freedom of Information Act, that only people who have something to hide should worry. People who run for public office will be afforded privacy, while our private lives will become public property. Having our privacy exposed is particularly crushing for the British – a nation for whom the phrase: “How are you?” really means: “Please say one word, then leave me alone.” So why have they just accepted this? Well, for a lot of people it’s the only hope that anyone will ever read their tweets.
The PR push for this bill’s launch has shown how similar the legislative process has become to the lobbying one. In fact, it’s not even lobbying, really, as most things that get lobbied for at least have some notional utility. This is more like phishing, asking us to sign up to something that looks helpful, but is actually a data breach.
Of course, the government has been reading our social media messages for 15 years. Imagine the celebrations at GCHQ when they see a young couple they used to watch sexting, then wanking on Skype, post their first kid’s school uniform pics on Facebook. In its own way, it must be quite touching.
The UK government has always stored intelligence, so don’t worry – they knew about Assad for years and didn’t act; they know about Saudi Arabia’s human rights abuses and do nothing; they knew about Hitler and didn’t act. What makes you think that a Facebook post you’re writing about a riot would … oh … where did you go?
One of the key areas of concern about the bill is the lack of any real judicial check. Theresa May can bypass judicial permission if surveillance is deemed “urgent” – like, if Belmarsh’s kitchen rota has a gap for someone who can cook halal. Nonetheless, Andy Burnham was quick to offer his support.
Many of Labour’s frontbench seem not to have been caught up in the joyous new Corbyn era, and look like people who have been forced by their work to go on some kind of awareness course. Currently, much of the Labour party’s policy seems to be formulated by encouraging them to just go on camera and say how they feel. Burnham has since backtracked, telling May: “On closer inspection of the wording of the bill, it would seem that it does not deliver the strong safeguard that you appeared to be accepting.”
Which translates as: he didn’t read the bill very carefully because the government had told him it would have judicial safeguards and he believed them. Poor Andy. It’s as if you can’t even trust people who are trying to give a load of authoritarian powers to their secret police any more.
This bill will lead to every person of colour’s worst fear: more concerned white people. Because white people will soon be the only ones who can Google the history of Islamophobia without ending up spending a decade watching children’s TV at full volume in a variety of stress positions. Ideological crime will be prioritised while actual crime is ignored, and we’ll adapt. Eventually, when you see a mugging, you’ll just start WhatsApping emojis of bombs until you hear sirens.
We will acquiesce to the scanning of Facebook posts to fight terrorism, which has killed 56 people in the UK in 10 years, but will still regard the killing of two women a week by their partners as a private domestic matter. God knows what this whole shambles says about us all psychologically. May herself gives the impression that the only childhood affection she got was the time a horse mistook her knuckles for a corn cob. At least this bill has allowed me to decode her permanently appalled expression: she looks as if she’s just seen my internet history.
I suppose that we need to consider what our internet history is. The legislation seems to view it as a list of actions, but it’s not. It’s a document that shows what we’re thinking about. The government wants to know what we’ve been thinking about, and what could be more sinister than that?