Facebook will pay $550 million to Illinois users to settle allegations that its facial tagging feature violated their privacy rights.
The settlement — which could amount to a couple of hundred dollars for each user who is part of the class-action settlement — stems from a federal lawsuit filed in Illinois nearly five years ago that alleges the social media giant violated a state law protecting residents’ biometric information. Biometric information can include data from facial, fingerprint and iris scans.
Illinois has one of the strictest biometric privacy laws in the nation. The 2008 law mandates that companies collecting such information obtain prior consent from consumers, detail how they’ll use it and specify how long the information will be kept. The law also allows private citizens, rather than just governmental entities, to file lawsuits over the issue.
In 2018, a judge defined the class as Facebook users in Illinois from whom the Menlo Park, California-based company created a stored face template after June 7, 2011, the date Facebook said its tag suggestion feature was available in most countries. The feature uses facial recognition software to match users’ new photos with other photos they’re tagged in. It groups similar photos together and suggests the names of friends in the photos.
The settlement is a win for privacy advocates who say that protecting biometric information is critical because, unlike a credit card number, it can’t be changed if it’s stolen. “This pretty firmly establishes the fact that those harms are real and consumers deserve restitution when their rights have been violated,” said Abe Scarr, director of the Illinois Public Interest Research Group, a consumer advocacy organization.
I assume Facebook will find a way to weasel out of including everyone in Illinois in this class. I resided solely in Illinois during the period the class action covers, and was probably tagged in a photo, but am not sure. I also don’t have my proper residence listed (I’ve varied it a bit, from Frostpocket to Guam to Upper Yurtistan and elsewhere, as the mood strikes), but Facebook of course knows where I’m logging into their servers from, down to the individual block group I imagine.
The New York Times reports on the latest slap on the wrist regarding corporate malfeasance and indifference:
The credit bureau Equifax will pay at least $650 million … to end an array of state, federal and consumer claims over a 2017 data breach that exposed the sensitive information of more than 147 million people. The breach was one of the most potentially damaging in an ever-growing list of digital thefts.
The settlement, which was announced on Monday and still needs court approval, would be the largest ever paid by a company over a data breach. The deal requires Equifax to put a minimum of $380.5 million into a restitution fund for American consumers who file claims showing that they were financially harmed.
A portion of that money will pay for lawyers’ fees, but at least $300 million must go to victims, according to settlement documents filed in federal court in Atlanta. If the initial cash is depleted, the company will add up to $125 million more to settle consumers’ claims, bringing the total fund size to more than $500 million.
Equifax will pay an additional $175 million in fines to end investigations by 50 attorneys general. Forty-eight states — all except Indiana and Massachusetts, which separately filed their own lawsuits against Equifax — are part of the deal, along with the District of Columbia and Puerto Rico.
So the government gets a ‘taste’, but individual consumers get spit in their eye. $300,000,000 to be distributed to a portion of 147,000,000 people who Equifax screwed. $2 each. Whooo hooo! Lawyers get plenty of money, average people, not so much.
The fine print is that you have to prove that Equifax harmed you by giving away your Social Security number, bank info, driver’s license, date of birth and whatever else.
Equifax will also pay $20,000 to consumers who can prove they suffered “fraud, identity theft, or other misuse” because of the data breach, plus $25 per hour for up to 20 hours of time spent trying to safeguard their data. It will also reimburse them for out-of-pocket losses and up to 25% of the cost of Equifax credit or identity monitoring. Exactly how Equifax will require consumers to verify their costs is unknown.
So Tim Cook called for better privacy regulation in the US. Maybe he reads this humble blog.1
In 2019, it’s time to stand up for the right to privacy—yours, mine, all of ours. Consumers shouldn’t have to tolerate another year of companies irresponsibly amassing huge user profiles, data breaches that seem out of control and the vanishing ability to control our own digital lives.
This problem is solvable—it isn’t too big, too challenging or too late. Innovation, breakthrough ideas and great features can go hand in hand with user privacy—and they must. Realizing technology’s potential depends on it.
That’s why I and others are calling on the U.S. Congress to pass comprehensive federal privacy legislation—a landmark package of reforms that protect and empower the consumer. Last year, before a global body of privacy regulators, I laid out four principles that I believe should guide legislation:
“Acxiom, like Mr. Cook, also supports a national privacy law for the U.S., such as GDPR provides for the European Union. Acxiom is actively participating in discussions with U.S. lawmakers as well as industry trade groups to help ensure U.S. consumers receive the kind of transparency, access, and control Acxiom has been providing voluntarily for years,” the company said. “We believe it would be universally beneficial if we were able to work with Apple and other industry leaders to define the best set of laws that maintain the benefits of data in our economy while giving the necessary protections and rights to all people.”
In its statement, Acxiom said it is working with lawmakers to build a “singular, united set of policies across the U.S.” What it does not want, according to the statement, are “multiple and independent state laws” making it onerous to comply.
Of course, it behooves Acxiom to seem amenable to such legislative moves. It’s becoming increasingly clear that the tide is shifting in the U.S., and more people want better safeguards over their data. Cook called for not just stricter data regulations, but a federally controlled data broker database that would make it possible for citizens to know exactly what information the companies have on them and which companies transacted with these data firms. While Acxiom is saying it’s open to new regulation, it’s unclear what exactly the firm will agree to.
One day after Google CEO Sundar Pichai was questioned on data privacy during a House hearing, a group of 15 Democratic senators has proposed a new bill for protecting personal information online.
The Data Care Act, proposed by Sen. Brian Schatz (D-HI) and more than a dozen co-sponsors, including Amy Klobuchar (D-MN) and Cory Booker (D-NJ), would create new rules around how companies that collect user data can handle that information.
Under the act, data collectors would be required to “reasonably secure” identifying information, to not use that information in a harmful way, and to give notice to consumers about breaches of sensitive information. The requirement extends to third parties, if the data collectors share or sell that data with another entity, and the plan would also give the FTC new authority to fine companies that act deceptively with users’ data.
Apple chief executive Tim Cook has demanded a tough new US data protection law, in an unusual speech in Europe.
Referring to the misuse of “deeply personal” data, he said it was being “weaponised against us with military efficiency”.
“We shouldn’t sugar-coat the consequences,” he added. “This is surveillance.”
The strongly-worded speech presented a striking defence of user privacy rights from a tech firm’s chief executive.
Mr Cook also praised the EU’s new data protection regulation, the General Data Protection Regulation (GDPR).
The Apple boss described in some detail what he called the “data industrial complex”, noting that billions of dollars were traded on the basis of people’s “likes and dislikes”, “wishes and fears” or “hopes and dreams” – the kind of data points tracked by tech firms and advertisers.
He warned that the situation “should make us very uncomfortable, it should unsettle us”.
Google exposed the private data of hundreds of thousands of users of the Google+ social network and then opted not to disclose the issue this past spring, in part because of fears that doing so would draw regulatory scrutiny and cause reputational damage.
A software glitch in the social site gave outside developers potential access to private Google+ profile data between 2015 and March 2018, when internal investigators discovered and fixed the issue, according to the documents and people briefed on the incident. A memo reviewed by the Journal prepared by Google’s legal and policy staff and shared with senior executives warned that disclosing the incident would likely trigger “immediate regulatory interest” and invite comparisons to Facebook’s leak of user information to data firm Cambridge Analytica.
For the past year, select Google advertisers have had access to a potent new tool to track whether the ads they ran online led to a sale at a physical store in the U.S. That insight came thanks in part to a stockpile of Mastercard transactions that Google paid for.
But most of the two billion Mastercard holders aren’t aware of this behind-the-scenes tracking. That’s because the companies never told the public about the arrangement.
Alphabet Inc.’s Google and Mastercard Inc. brokered a business partnership during about four years of negotiations, according to four people with knowledge of the deal, three of whom worked on it directly. The alliance gave Google an unprecedented asset for measuring retail spending, part of the search giant’s strategy to fortify its primary business against onslaughts from Amazon.com Inc. and others.
Nick Heer writes about a topic near and dear to our brains, albeit from the web developer side: why do websites load so slowly? And why is our personal data being sold without our informed consent?
The average internet connection in the United States is about six times as fast as it was just ten years ago, but instead of making it faster to browse the same types of websites, we’re simply occupying that extra bandwidth with more stuff. Some of this stuff is amazing: in 2006, Apple added movies to the iTunes Store that were 640 × 480 pixels, but you can now stream movies in HD resolution and (pretend) 4K. These much higher speeds also allow us to see more detailed photos, and that’s very nice.
But a lot of the stuff we’re seeing is a pile-up of garbage on seemingly every major website that does nothing to make visitors happier — if anything, much of this stuff is deeply irritating and morally indefensible.
Take that CNN article, for example. Here’s what it contained when I loaded it:
Eleven web fonts, totalling 414 KB
Four stylesheets, totalling 315 KB
Twenty-nine XML HTTP requests, totalling about 500 KB
Approximately one hundred scripts, totalling several megabytes — though it’s hard to pin down the number and actual size because some of the scripts are “beacons” that load after the page is technically finished downloading.
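You can reproduce this kind of tally yourself: every major browser’s developer tools will export a loaded page’s network activity as a HAR (“HTTP Archive”) file, and a few lines of Python can bucket the requests by type. The sketch below is my own rough heuristic for mapping MIME types to Heer’s categories, not anything from his article; the synthetic `har` dict stands in for a real exported file.

```python
import json  # a real capture would be loaded with json.load(open("page.har"))

# Rough mapping from MIME-type substrings to the buckets Heer counts above.
BUCKETS = {
    "font": "web fonts",
    "css": "stylesheets",
    "javascript": "scripts",
    "image": "images",
}

def bucket_for(mime_type):
    """Map a MIME type like 'text/css' to a human-readable bucket."""
    for key, label in BUCKETS.items():
        if key in mime_type:
            return label
    return "other"

def summarize_har(har):
    """Return {bucket: (request_count, total_bytes)} for a parsed HAR dict."""
    totals = {}
    for entry in har["log"]["entries"]:
        content = entry["response"].get("content", {})
        label = bucket_for(content.get("mimeType", ""))
        size = content.get("size", 0) or 0
        count, total = totals.get(label, (0, 0))
        totals[label] = (count + 1, total + size)
    return totals

# Tiny synthetic example standing in for a real exported HAR file:
har = {"log": {"entries": [
    {"response": {"content": {"mimeType": "text/css", "size": 315_000}}},
    {"response": {"content": {"mimeType": "application/javascript", "size": 2_000_000}}},
    {"response": {"content": {"mimeType": "font/woff2", "size": 414_000}}},
]}}

for label, (count, total) in sorted(summarize_har(har).items()):
    print(f"{label}: {count} requests, {total / 1000:.0f} KB")
```

Run it against a HAR export of any news article and the “scripts” bucket tends to dwarf everything that actually renders the text you came to read.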
The vast majority of these resources are not directly related to the information on the page, and I’m including advertising. Many of the scripts that were loaded are purely for surveillance purposes: self-hosted analytics, of which there are several examples; various third-party analytics firms like Salesforce, Chartbeat, and Optimizely; and social network sharing widgets. They churn through CPU cycles and cause my six-year-old computer to cry out in pain and fury. I’m not asking much of it; I have opened a text-based document on the web.
An actual solution recognizes that this bullshit is inexcusable. It is making the web a cumulatively awful place to be. Behind closed doors, those in the advertising and marketing industry can be pretty lucid about how much they also hate surveillance scripts and how awful they find these methods, while simultaneously encouraging their use. Meanwhile, users are increasingly taking matters into their own hands — the use of ad blockers is rising across the board, many of which also block tracking scripts and other disrespectful behaviours. Users are making that choice.
They shouldn’t have to. Better choices should be made by web developers to not ship this bullshit in the first place. We wouldn’t tolerate such intrusive behaviour more generally; why are we expected to find it acceptable on the web?
An honest web is one in which the overwhelming majority of the code and assets downloaded to a user’s computer are used in a page’s visual presentation, with nearly all the remainder used to define the semantic structure and associated metadata on the page. Bullshit — in the form of CPU-sucking surveillance, unnecessarily-interruptive elements, and behaviours that nobody responsible for a website would themselves find appealing as a visitor — is unwelcome and intolerable.
All that “surveillance” stuff and the related files are an abomination, and please no one. I’ve heard anecdotal reports that even marketing-savvy companies don’t frequently use all the data that is collected on their behalf. So who wants it? Unclear to me. I guess the third-party data collection industry is happy to vacuum up this data because they can subsequently re-sell our information to the highest bidder, but that’s not a good enough reason to continue making web pages cumbersome.
Surveillance Society – Halsted and Division Edition
The Guardian reports:
Facebook used its apps to gather information about users and their friends, including some who had not signed up to the social network, reading their text messages, tracking their locations and accessing photos on their phones, a court case in California alleges.
The claims of what would amount to mass surveillance are part of a lawsuit brought against the company by the former startup Six4Three, listed in legal documents filed at the superior court in San Mateo as part of a court case that has been ongoing for more than two years.
A Facebook spokesperson said that Six4Three’s “claims have no merit, and we will continue to defend ourselves vigorously”.
The allegations about surveillance appear in a January filing, the fifth amended complaint made by Six4Three. It alleges that Facebook used a range of methods, some adapted to the different phones that users carried, to collect information it could use for commercial purposes.
“Facebook continued to explore and implement ways to track users’ location, to track and read their texts, to access and record their microphones on their phones, to track and monitor their usage of competitive apps on their phones, and to track and monitor their calls,” one court document says.
This is Facebook’s business model, though, so what exactly are they going to argue? “No, we don’t collect data on our users and then use this information to sell advertising to corporations”?
The one detail that is the most disturbing1 is that Facebook did this for people who weren’t Facebook users. How did these people consent? How do they request their data? How do they update their privacy settings?
While Facebook and Cambridge Analytica are hogging the spotlight, data brokers that collect your information from hundreds of sources and sell it wholesale are laughing all the way to the bank. But they’re not laughing in Vermont, where a first-of-its-kind law hems in these dangerous data mongers and gives the state’s citizens much-needed protections.
Data brokers in Vermont will now have to register as such with the state; they must take standard security measures and notify authorities of security breaches (no, they weren’t before); and using their data for criminal purposes like fraud is now its own actionable offense.
If you’re not familiar with data brokers, well, that’s the idea. These companies don’t really have a consumer-facing side, instead opting to collect information on people from as many sources as possible, buying and selling it amongst themselves like the commodity it has become.
This data exists in a regulatory near-vacuum. As long as they step carefully, data brokers can maintain what amounts to a shadow profile on consumers. I talked with director of the World Privacy Forum, Pam Dixon, about this practice.
“If you use an actual credit score, it’s regulated under the Fair Credit Reporting Act,” she told me. “But if you take a thousand points like shopping habits, zip code, housing status, you can create a new credit score; you can use that and it’s not discrimination.”
And while medical data like blood tests are protected from snooping, it’s not against the law for a company to make an educated guess about your condition from the medicine you pay for at the local pharmacy. Now you’re on a secret list of “inferred” diabetics, and that data gets sold to, for example, Facebook, which combines it with its own metrics and allows advertisers to target it.
Exactly why I wish the US would implement its own version of the GDPR that we’ve discussed. Corporations that mine our digital data, then sell and resell it, without oversight and without giving “a taste” to the consumer, need to be regulated and watched by a consumer protection agency of some kind. Not every consumer is savvy enough to obfuscate their tracks, and honestly, even somewhat savvy consumers are no doubt caught up in these nameless corporations’ databases. Corporations like Equifax, Quotient and Catalina Marketing, and a few thousand others, don’t really need browser cookies anymore; they also use the unique IDs of your devices, track your IP address down to your block group, and can track you at home, at the office, via phone, via credit card, via geolocation and via other means. I find it Orwellian and creepy.
My sincere wish is that Vermont continues on this path of regulating the wild, wild web of data brokers, and that other states and the entire country follow suit.
Europe’s new privacy law took effect Friday, causing major U.S. news websites to suspend access across the region as data-protection regulators prepare to brandish their new enforcement powers.
Tronc Inc., publisher of the Los Angeles Times, New York Daily News and other U.S. newspapers [Chicago Tribune], was among those that blocked readers in the European Union from accessing sites, as they scrambled to comply with the sweeping regulation.
“We are engaged on the issue and committed to looking at options that support our full range of digital offerings to the EU market,” the company said in notices it displayed when users attempted to access its news sites from the EU on Friday morning.
Other U.S. regional newspapers owned by Lee Enterprises Inc., as well as the bookmarking app Instapaper, owned by Pinterest Inc., were also blocking access in the EU.
The EU’s General Data Protection Regulation foresees steep fines for companies that don’t comply with the new rules, aimed at giving Europe-based users more control over the data companies hold on them.
Tronc and many other digital news organizations are among the worst offenders when it comes to collecting information on consumers. Using this article at the WSJ as an example, Ghostery reports 24 different cookies/trackers being served to a reader, from Facebook, Google, DoubleClick, and so on. I’m a subscriber, and WSJ still allows companies like Bombora to shovel my information into their corporate maws.
So, I’m not surprised that many news organizations are not in compliance with the new GDPR regulations, I’m only saddened that the US doesn’t have a similar protection for consumers. Savvier consumers can install anti-tracking services, like Ghostery, but what about everyone else?
New European privacy regulations went into effect on Friday that will force companies to be more attentive to how they handle customer data.
The ramifications were visible from day one, with major U.S. media outlets, including the LA Times and Chicago Tribune, forced to shutter their websites in parts of Europe.
People in the bloc have been bombarded with dozens of emails asking for their consent to keep processing their data, and a privacy activist wasted no time in taking action against U.S. tech giants for allegedly acting illegally by forcing users to accept intrusive terms of service or lose access.
Amazing really the number of these emails I’ve received. Several are worded in such a way that I did not accept their terms, and assume my account will become dormant. If it was a company I cared to still do business with, I might look a little deeper, but mostly I just shrug and delete.
We first heard about GDPR late last year and only wish the US took consumer privacy as seriously as the EU does.
Dreaming Has A Low
From December, 2017:
Almost a fifth of companies in the marketing and advertising sector would go out of business if they were to be hit by a fine for non-compliance of the new GDPR legislation.
The General Data Protection Regulation (GDPR) comes into force in less than one year and covers everything from a consumer’s ‘right to be forgotten’ to data breach notification and accountability. At the heart of the reform in how companies must handle customer data is a fine, standing at €20m or 4% of a company’s global revenue, if they are found to be falling foul.
But, in a survey of 187 marketing and advertising companies conducted by YouGov on behalf of law firm Irwin Mitchel, 70% said they wouldn’t be certain of their ability to detect a data breach. Meanwhile, just 37% said they would be equipped to deal with it in the required timescale of three days.
With 200-plus pages of regulation set to come into force in May 2018, it formalizes concepts like the “right to be forgotten,” data breach accountability, data portability and more — and is arguably the biggest disruption in the digital space in recent years.
Simply put, the regulations are being put into place to give individuals more rights to their data, but brands and marketers need to get on board beforehand in order to avoid hefty potential fines – up to $24m, or 4% of annual turnover (whichever is the greater sum). Some of the requirements include:
Requiring consent for data processing
Anonymizing collected data to protect privacy
Providing data breach notifications
Safely handling the transfer of data across borders
Requiring certain companies to have a data protection officer to oversee GDPR compliance
Cell phones are useful for a lot of things, but owning one does have consequences, like the ability for 3rd-party organizations or government entities to track your location to within 25-50 feet any time your phone is connected to a cell tower.
The NYT reports:
Senator Ron Wyden, Democrat of Oregon, wrote in a letter this week to the Federal Communications Commission that Securus confirmed that it did not “conduct any review of surveillance requests.” The senator said relying on customers to provide documentation was inadequate. “Wireless carriers have an obligation to take affirmative steps to verify law enforcement requests,” he wrote, adding that Securus did not follow those procedures.
The service provided by Securus reveals a potential weakness in a system that is supposed to protect the private information of millions of cellphone users. With customers’ consent, carriers sell the ability to acquire location data for marketing purposes like providing coupons when someone is near a business, or services like roadside assistance or bank fraud protection. Companies that use the data generally sign contracts pledging to get people’s approval — through a response to a text message, for example, or the push of a button on a menu — or to otherwise use the data legally.
But the contracts between the companies, including Securus, are “the legal equivalent of a pinky promise,” Mr. Wyden wrote. The F.C.C. said it was reviewing the letter.
Courts are split on whether investigators need a warrant based on probable cause to acquire location data. In some states, a warrant is required for any sort of cellphone tracking. In other states, it is needed only if an investigator wants the data in real time. And in others no warrant is needed at all.
Other experts said the law should apply for any communications on a network, not just phone calls. “If the phone companies are giving someone a direct portal into the real-time location data on all of their customers, they should be policing it,” said Laura Moy, the deputy director of the Georgetown Law Center on Privacy & Technology.
Mr. Wyden, in his letter to the F.C.C., also said that carriers had an obligation to verify whether law enforcement requests were legal. But Securus cuts the carriers out of the review process, because the carriers do not receive the legal documents.
The letter called for an F.C.C. investigation into Securus, as well as the phone companies and their protections of user data. Mr. Wyden also sent letters to the major carriers, seeking audits of their relationships with companies that buy consumer data. Representatives for AT&T, Sprint, T-Mobile and Verizon said the companies had received the letters and were investigating.
In this particular instance, the 3rd parties selling your location data are called 3Cinteractive and LocationSmart, but there are hundreds more such companies that have built their businesses on turning your location into sellable data, most of them relatively obscure.
Securus received the data from a mobile marketing company called 3Cinteractive, according to 2013 documents from the Florida Department of Corrections. Securus said that for confidentiality reasons it could not confirm whether that deal was still in place, but a spokesman for Mr. Wyden said the company told the senator’s office it was. In turn, 3Cinteractive got its data from LocationSmart, a firm known as a location aggregator, according to documents from those companies. LocationSmart buys access to the data from all the major American carriers, it says.
How does it work?
“Envision a cell site,” says Allen (a typical tower appears in the photo above). “They’re triangular, and each side has about 120 degrees of sweep.” Every time a signal is transmitted to a nearby phone, says Allen, there is a round-trip delay to the mobile device and back. By using all three sides of the triangle to “talk” to the mobile device, the tower can triangulate which edge of the base station is closest to the device. “Typically the accuracy return varies,” says Allen. “In urban settings, it can be accurate down to several blocks; in suburban settings, several hundred meters.”
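The arithmetic behind Allen’s description is simple enough to sketch: radio travels at the speed of light, so half the round-trip delay (minus the handset’s own processing time) gives the distance from the tower, and knowing which of the three 120-degree faces the phone answered on narrows it to an arc. This is a back-of-the-envelope illustration, not any carrier’s actual implementation; the processing-delay parameter is my own simplification of what real systems calibrate.

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second

def distance_from_rtt(rtt_seconds, processing_delay_s=0.0):
    """Estimate tower-to-phone distance from a measured round-trip time.

    The handset's processing delay must be subtracted before halving;
    real systems calibrate this, here it's just a parameter.
    """
    one_way = (rtt_seconds - processing_delay_s) / 2
    return SPEED_OF_LIGHT_M_S * one_way

def sector_for_azimuth(azimuth_deg):
    """Which of the three 120-degree faces covers a given bearing (0-359)."""
    return int(azimuth_deg % 360) // 120

# A 10-microsecond round trip puts the phone roughly 1.5 km out,
# on whichever face of the triangle answered:
print(round(distance_from_rtt(10e-6)))  # about 1499 meters
print(sector_for_azimuth(200))          # face 1 covers bearings 120-239
```

Timing resolution, multipath reflections in cities, and handset delay variance are what make the real-world accuracy “several blocks” rather than several meters.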
“We can locate any subscriber,” says Allen, “and companies want all those subscribers to be addressable,” or discoverable. Normally, this requires passing through some privacy gateways, says Allen. “The end user must opt in through a Web portal or SMS, or an app like Foursquare,” he says, per “universal” CTIA and MMA guidelines, and carriers’ own privacy protocol.
But with enterprise services, there’s a catch. “In a workplace scenario, the corporate entity has the right to opt-in those devices,” says Allen. “The [employee] is typically notified, but the opt-in is up to the employer.”
In other words: if your employer owns your phone, tablet or 3G-enabled computer, they’re entitled to own your location, too.
Even Apple, a corporation that prides itself on not selling users’ data the way its competitors do, has acknowledged that users’ data has sometimes been sold.
9To5 Mac reports:
Over the last few days, Apple has seemingly started cracking down on applications that share location data with third-parties. In such cases, Apple has been removing the application in question and informing developers that their app violates two parts of the App Store Review Guidelines…
Thus far, we’ve seen several cases of Apple cracking down on these types of applications. The company informs developers via email that “upon re-evaluation,” their application is in violation of sections 5.1.1 and 5.1.2 of the App Store Review Guidelines, which pertain to transmitting user location data and user awareness of data collection.
Legal – 5.1.1 and Legal 5.1.2
The app transmits user location data to third parties without explicit consent from the user and for unapproved purposes.
Apple explains that developers must remove any code, frameworks, or SDKs that relate to the violation before their app can be resubmitted to the App Store.
In the context of describing yet another social network aimed at taking on Facebook, albeit one that allegedly will pay you for your content,1 Wired reports:
During Mark Zuckerberg’s over 10 hours of Congressional testimony last week, lawmakers repeatedly asked how Facebook makes money. The simple answer, which Zuckerberg dodged, is the contributions and online activities of its over two billion users, which allow marketers to target ads with razor precision. In which case, asked representative Paul Tonko (D – New York), “why doesn’t Facebook pay its users for their incredibly valuable data?”
Yeah, Facebook doesn’t really want to discuss this key aspect of their business in public: all their wealth is based on the mining and reselling of their users’ data. It was never a hidden fact, it was always known to anyone who bothered to ask, but Facebook doesn’t really like to explain it, lest the majority realize they are the product being sold.
So let’s be clear: Facebook, Snapchat, Instagram, and even Twitter2 only exist to collect data about their users, and to sell information gleaned from their users to corporations, or governments, etc. That is the model. If everyone, including your grandmother and my 14-year-old nephew, understands this basic fact, we’ll all benefit as a society.