Archive for the ‘iPhone’ tag
Apple CEO Tim Cook has spent a lot of effort keeping this case in the public eye, even giving an interview to Time Magazine’s Lev Grossman, which includes statements like:
Inside Apple this idea is nicknamed, not affectionately, GovtOS. “We had long discussions about that internally, when they asked us,” Cook says. “Lots of people were involved. It wasn’t just me sitting in a room somewhere deciding that way, it was a labored decision. We thought about all the things you would think we would think about.” The decision, when it came, was no.
Cook actually thought that might be the end of it. It wasn’t: on Feb. 16 the FBI both escalated and went public, obtaining a court order from a federal judge that required Apple to create GovtOS under something called the All Writs Act. Cook took deep, Alabaman umbrage at the manner in which he learned about the court order, which was in the press: “If I’m working with you for several months on things, if I have a relationship with you, and I decide one day I’m going to sue you, I’m a country boy at the end of the day: I’m going to pick up the phone and tell you I’m going to sue you.”
It also wasn’t lost on Cook that the FBI chose not to file the order under seal: if Apple wasn’t going to help with a case of domestic terrorism, the FBI wanted Apple to do it under the full glare of public opinion.
The spectacle of Apple, the most admired company in the world, refusing to aid the FBI in a domestic-terrorism investigation has inflamed public passions in a way that, it’s safe to say, nothing involving encryption algorithms and the All Writs Act ever has before. Donald Trump asked, “Who do they think they are?” and called for a boycott of Apple. A Florida sheriff said he would “lock the rascal up,” the rascal meaning Cook. Even President Obama, whose relations with the technorati of Silicon Valley have historically been warm, spoke out about the issue at South by Southwest: “It’s fetishizing our phones above every other value. And that can’t be the right answer.”
As against that, Apple has been smothered in amicus briefs from technology firms supporting its position, including AT&T, Airbnb, eBay, Kickstarter, LinkedIn, Reddit, Square, Twitter, Cisco, Snapchat, WhatsApp and every one of its biggest, bitterest rivals: Amazon, Facebook, Google and Microsoft. Zeid Ra’ad al-Hussein, the U.N. High Commissioner for Human Rights, spoke out in Apple’s defense. So did retired general Michael Hayden, former head of both the NSA and the CIA. The notoriously hawkish Senator Lindsey Graham, who started out lambasting Apple, switched sides after a briefing on the matter. Steve Dowling, Apple’s vice president of communications, showed me a check for $100 that somebody sent to support the world’s most valuable technology company in its legal fight. (Apple didn’t cash it.)
(click here to continue reading Inside Apple CEO Tim Cook’s Fight With the FBI | TIME.)
The case seems weak for a number of reasons (encryption is not bound by political borders; Apple shouldn’t be compelled to work for the government, especially when it has done nothing wrong; the law known as CALEA would seem to forbid the FBI’s approach; we don’t live in a police state; and so on), but there’s no guarantee the judge in the case will be swayed by logic. I’d rather Tim Cook and Apple’s engineers spent their time improving iTunes and fixing bugs in Mac OS X El Capitan instead of fighting government overreach, but you can’t control the universe, only react to its whims.
I want to note another point, discussed extensively by Jonathan Zdziarski: the idea of a warrant-proof zone. Doctor-patient privilege, diplomatic pouches, spousal privilege, journalistic sources: these and other areas are also “dark,” in FBI parlance. Even in court, even in cases that inflame the public’s interest, even then, a lawyer cannot be compelled to reveal what their client told them.
There are other examples that could be mentioned, but the point is that our country recognizes many laws and international treaties that support the concept of warrant proof as a valid concept. It is not only well within Apple’s rights to produce a product that happens to be warrant-proof, but it’s actually Apple’s responsibility to create a product that’s capable of enforcing the highest level of security permitted by our country’s laws… not the lowest. Apple is well within not only their rights, but in practices that support and place appropriate locks consistent with the levels of privacy our country recognizes. These products protect everyone – diplomats, doctors, journalists, as well as all of us. Of course they should be this secure. If our own country recognizes warrant proof as a thing, of course our technology should too.
We, as everyday Americans, should also encourage the idea of warrant proof places. The DOJ believes, quite erroneously, that the Fourth Amendment gives them the right to any evidence or information they desire with a warrant. The Bill of Rights did not grant rights to the government; it protected the rights of Americans from the overreach that was expected to come from government. Our most intimate thoughts, our private conversations, our ideas, our -intent- are all things our phone tracks. These are concepts that must remain private (if we choose to protect them) for any functioning free society. In today’s technological landscape, we are no longer giving up just our current or future activity under warrant, but for the first time in history, making potentially years of our life retroactively searchable by law enforcement. Things are recorded in ways today that no one would have imagined, even when CALEA was passed. The capability that DOJ is asserting is that our very lives and identities – going back across years – are subject to search. The Constitution never permitted this.
The bottom line is this: Our country actually recognizes warrant proof data, and Apple has every right and ethical obligation to recognize it in the design of their products. As Americans, we should be demanding our thoughts, conversations, and identities be protected with the highest level of security. This isn’t just about credit cards.
(click here to continue reading Apple Should Own The Term “Warrant Proof” | Zdziarski’s Blog of Things.)
I’m on Apple’s side in this, 1,000%: the government should not be allowed such latitude. Apple currently has the full letter on its website; some excerpts below.
The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.
This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
(click here to continue reading Customer Letter – Apple.)
A new version of iOS, created just so the government can inspect our private communications? That doesn’t sound good; in fact, it sets a horrible precedent for private industry. I assume this case will be appealed all the way to the Supreme Court, all the more reason to have a full nine Justices sitting on the court.
Tim Cook continues:
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.
The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.
We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.
While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.
(click here to continue reading Customer Letter – Apple.)
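Cook’s “brute force” point is easier to feel with a little arithmetic. Here is a back-of-envelope sketch of how fast an exhaustive passcode sweep becomes once the software rate limits are stripped, assuming roughly 80 ms per attempt — the hardware-enforced delay widely reported for iPhones of this era. That figure is my assumption, not something from the letter:

```python
# Back-of-envelope brute-force timing for numeric iPhone passcodes.
# The ~80 ms per attempt figure is an assumption, based on the
# hardware-enforced delay widely reported for iPhones of this era,
# once the software rate limits the FBI wants removed are gone.

SECONDS_PER_ATTEMPT = 0.08  # assumed hardware floor per passcode try

def worst_case_seconds(digits):
    """Seconds needed to try every numeric passcode of this length."""
    return 10 ** digits * SECONDS_PER_ATTEMPT

for digits in (4, 6):
    hours = worst_case_seconds(digits) / 3600
    print(f"{digits}-digit passcode: {hours:.1f} hours worst case")
```

Under these assumptions a 4-digit passcode falls in well under an hour, and even a 6-digit one in about a day — which is exactly why the auto-erase and escalating-delay features the FBI wants disabled matter so much.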
The All Writs Act is a United States federal statute, codified at 28 U.S.C. § 1651, which authorizes the United States federal courts to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”
(click here to continue reading All Writs Act – Wikipedia, the free encyclopedia.)
The NYT gives a little context:
Apple said on Wednesday that it would oppose and challenge a federal court order to help the F.B.I. unlock an iPhone used by one of the two attackers who killed 14 people in San Bernardino, Calif., in December.
On Tuesday, in a significant victory for the government, Magistrate Judge Sheri Pym of the Federal District Court for the District of Central California ordered Apple to bypass security functions on an iPhone 5c used by Syed Rizwan Farook, who was killed by the police along with his wife, Tashfeen Malik, after they attacked Mr. Farook’s co-workers at a holiday gathering.
Judge Pym ordered Apple to build special software that would essentially act as a skeleton key capable of unlocking the phone.
But hours later, in a statement by its chief executive, Timothy D. Cook, Apple announced its refusal to comply. The move sets up a legal showdown between the company, which says it is eager to protect the privacy of its customers, and the law enforcement authorities, who say that new encryption technologies hamper their ability to prevent and solve crime.
(click here to continue reading Tim Cook Opposes Order for Apple to Unlock iPhone, Setting Up Showdown – The New York Times.)
The WSJ adds:
Apple Inc. Chief Executive Tim Cook said the company will oppose a federal judge’s order to help the Justice Department unlock a phone used by a suspect in the San Bernardino, Calif., attack.
In a strongly worded letter to customers posted on Apple’s website early Wednesday, Mr. Cook called the order an “unprecedented step which threatens the security of our customers” with “implications far beyond the legal case at hand.”
The order, reflected in legal filings unsealed Tuesday, marks a watershed moment in the long-running argument between Washington and Silicon Valley over privacy and security.
In the order, U.S. Magistrate Judge Sheri Pym agreed with a Justice Department request that Apple help unlock an iPhone 5C once used by Syed Rizwan Farook. The order calls on Apple to disable certain security measures on the phone, including a feature that permanently disables the phone after 10 unsuccessful tries at the password. Such measures have kept agents from reviewing the contents of the phone, according to the filing. When the phone is locked, the data is encrypted.
Apple said it isn’t opposing the order lightly nor does it question the FBI’s intentions, but it feels that the government has overreached.
In her order, Judge Pym gave Apple five days to appeal.
(click here to continue reading Apple Opposes Judge’s Order to Help Unlock Phone Linked to San Bernardino Attack – WSJ.)
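The 10-try auto-erase feature the order asks Apple to disable is conceptually simple. Here is a toy sketch of such an attempt counter — hypothetical names, and nothing like Apple’s actual implementation, which destroys hardware-held encryption keys rather than flipping a flag:

```python
# Toy model of the auto-erase behavior: ten failed passcode attempts
# render the data unrecoverable. Purely illustrative, hypothetical names.

MAX_ATTEMPTS = 10

class Device:
    def __init__(self, passcode):
        self._passcode = passcode
        self.failed = 0
        self.wiped = False

    def try_passcode(self, guess):
        """Return True on success; wipe the device on the tenth failure."""
        if self.wiped:
            raise RuntimeError("device erased")
        if guess == self._passcode:
            self.failed = 0  # a correct entry resets the counter
            return True
        self.failed += 1
        if self.failed >= MAX_ATTEMPTS:
            self.wiped = True  # on a real phone: encryption keys destroyed
        return False

d = Device("1234")
for _ in range(MAX_ATTEMPTS):
    d.try_passcode("0000")
print(d.wiped)  # → True: data now unrecoverable
```

With this feature active, brute force is hopeless; with it removed, the arithmetic above takes over — which is the whole fight.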
I took Rye on June 07, 2015 at 10:15AM
and processed it in my digital darkroom on June 07, 2015 at 05:20PM
Kirk McElhearn, a long-time Mac columnist, adds his voice to the chorus of iPhone owners dismayed with iTunes 12 and iOS 8.
Now, syncing an iOS device—iPhone, iPad, or iPod—is too often an ordeal. And that’s because it’s become untrustworthy. Will the sync work at all, or will your content disappear and be transformed into something that fills the amorphous “Other” category in iTunes’ capacity bar? Will all of your content sync, or just your music, or movies, or apps?
Sync problems between iTunes and iOS devices are all too common. (See the last thirty days of posts in Apple’s support forums about iTunes sync issues.) In a way, this may be a predictable side effect of Apple’s push to online services. The company wants everything to be in the cloud, and it would prefer that you buy all your music and movies from there as well. Local syncing isn’t really a part of that plan and so may be treated as an afterthought. The difficulty is that not all users are right for the cloud model. For those with large iTunes libraries, or with limited broadband bandwidth, cloud storage simply isn’t usable.
Given that, it’s time to revisit local syncing. In its current state, iTunes syncing is broken and it can only be fixed by Apple.
Apple needs to fix syncing. While users who don’t sync their iOS devices in this way aren’t affected by these issues, those people with small and large iTunes libraries alike report syncing problems. It’s frustrating, and the fact that there’s no way to find out what’s wrong makes it even more so. In an ideal world iTunes would have some kind of sync log or sync diagnostic tool, akin to the Network Diagnostics utility, that would help ferret out problems and let people get on with enjoying their media.
(click here to continue reading iTunes syncing is broken: Apple, please fix it | Macworld.)
I’ve written at least once about my frustrations with syncing, and by my count, I’ve had to restore my iPhone 6-minus at least ten times since I got it last fall. Ten times! New Year’s Eve[1] was number eleven, and for some reason[2] the PIN I used yesterday would not unlock my iPhone today. Since I have Find My iPhone turned on, I was unable to restore directly via my Mac, and had to log on to https://www.icloud.com/#find and remotely wipe the iPhone.
Restore Number 12 finally began, and because I use my iPhone for more than just a phone, the syncing takes for-freaking-ever[3], and I probably won’t have use of a phone for several hours.
Sure, there are much worse problems in the world, but iPhone owners want the devices we spend thousands of dollars annually[4] on to actually work. Currently, the iTunes 12/iOS 8 platform is not up to the usual Apple standards. Constantly having to reinstall the software is not customer-friendly.
For perhaps the five hundredth time this decade,[1] I spent a long time trying to log in to YouTube to upload a video, and my password was not accepted, even though I’d copied it right out of 1Password. After wasting about ten minutes trying to figure it out, I remembered that because I have set up 2-Step Verification for my Google account, I have to generate an app-specific password for logging in to YouTube. I’m not sure why YouTube is different from other 2-Step Verification services,[2] but at least the solution is easy enough, once you remember that this is why your password keeps failing. You’d think Google could update YouTube to at least give a hint that enabling 2-Step Verification means a user can’t log in simply with email and password. I mean, would it be that hard for the YouTube iOS app to add a footer to the login page? Or at least a suggestion to look at the App passwords page if a password fails a few times?
Anyway, after I did the proper Google search, I ended up here, with these instructions.
Sign in using App Passwords
An App password is a 16-digit passcode that gives an app or device permission to access your Google Account. If you use 2-Step-Verification and are seeing a “password incorrect” error when trying to access your Google Account, an App password may solve the problem. Most of the time, you’ll only have to enter an App password once per app or device, so don’t worry about memorizing it.
- Visit your App passwords page. You may be asked to sign in to your Google Account.
- At the bottom, click Select app and choose the app you’re using.
- Click Select device and choose the device you’re using.
- Click Generate.
- Follow the instructions to enter the App password (the 16 character code in the yellow bar) on your device.
(click here to continue reading Sign in using App Passwords – Accounts Help.)
That’s pretty clear and simple, once you know that’s what you’re required to do.
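Once generated, an App password is used exactly like a normal password wherever an app asks for one. For example, a script talking to Gmail over IMAP would pass the 16-character code in place of the account password. Google displays the code in four groups of four for readability; the spaces can be stripped before use. The address and code below are placeholders, not real credentials:

```python
# Using a Google App password in place of the account password.
# Google shows the 16 characters in four groups ("abcd efgh ijkl mnop");
# the spaces are display-only. Address and code here are placeholders.
import imaplib

def normalize(app_password):
    """Strip the display spacing from a 16-character App password."""
    return app_password.replace(" ", "")

def gmail_login(address, app_password):
    """Log in to Gmail over IMAP, passing the App password where the
    account password would normally go (2-Step Verification rejects
    the real one for apps that don't support it)."""
    mail = imaplib.IMAP4_SSL("imap.gmail.com")
    mail.login(address, normalize(app_password))
    return mail

# gmail_login("you@example.com", "abcd efgh ijkl mnop")  # network call, not run here
print(len(normalize("abcd efgh ijkl mnop")))  # → 16
```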
Perhaps since I’m writing a post about this procedure, I’ll remember next time I’m uploading a video from a new iOS device, or a new app that uses YouTube.
Also, the video was pretty dark; I’ll have to retry with better lighting next time I have a can of Nuclear Winter beer by Finch’s Beer…
My app-specific password list looks like this:[3]
Google App specific passwords, a partial list
With a name like Nuclear Winter, what else could I do?
Update: damn, this post became a spam-comment magnet, so we’re disabling comments for a while. Sorry.
[1] every time I get a new iPhone or iPad, or Apple TV, basically. Though some apps use YouTube as well, I’m guessing this has happened more than three million times since I’ve enabled 2-Step Verification [↩]
[2] for instance, I use 2-Step Verification for Tumblr, for Twitter, for Buffer, and probably some others too [↩]
[3] not all shown [↩]
I took Lean on The Wind on December 08, 2013 at 12:27PM
and processed it in my digital darkroom on November 22, 2014 at 03:09PM
FBI Director James Comey continues his public obfuscation tour, blaming the upcoming Joker and Riddler crime spree in Gotham on the fairly new ability of consumers to encrypt the data on their own phones against unwanted intrusions by governments and other entities.
The director of the F.B.I., James B. Comey, said on Thursday that the “post-Snowden pendulum” that has driven Apple and Google to offer fully encrypted cellphones had “gone too far.” He hinted that as a result, the administration might seek regulations and laws forcing companies to create a way for the government to unlock the photos, emails and contacts stored on the phones.
But Mr. Comey appeared to have few answers for critics who have argued that any portal created for the F.B.I. and the police could be exploited by the National Security Agency, or even Russian and Chinese intelligence agencies or criminals. And his position seemed to put him at odds with a White House advisory committee that recommended against any effort to weaken commercial encryption.
Apple and Google have announced new software that would automatically encrypt the contents of cellphones, using codes that even the companies could not crack. Their announcement followed a year of disclosures from Edward J. Snowden, the former government contractor who revealed many government programs that collect electronic data, including information on Americans.
The new encryption would hinder investigations involving phones taken from suspects, recovered at crime scenes or discovered on battlefields. But it would not affect information obtained by real-time wiretaps, such as phone conversations, emails or text messages. And the government could still get information that is stored elsewhere, including emails, call logs and, in some cases, old text messages.
(click here to continue reading James Comey, F.B.I. Director, Hints at Action as Cellphone Data Is Locked – NYTimes.com.)
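The “codes that even the companies could not crack” work roughly like this: the encryption key is derived from the user’s passcode entangled with a secret unique to the device, so neither Apple nor Google holds anything that can decrypt the data. A simplified sketch — real phones do this inside dedicated hardware with device-bound key derivation, not plain PBKDF2, and the names and parameters here are illustrative assumptions:

```python
# Simplified sketch of passcode-entangled key derivation. Real devices
# perform this inside a secure element with hardware-bound KDFs; the
# names and iteration count here are illustrative assumptions.
import hashlib
import os

DEVICE_SECRET = os.urandom(32)  # stand-in for a per-device key fused into silicon

def derive_key(passcode: str) -> bytes:
    """Mix the passcode with the device secret. Since DEVICE_SECRET never
    leaves the device, guessing must happen on the phone itself, and the
    vendor holds nothing that can reproduce the key."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_SECRET, 100_000)

assert derive_key("1234") == derive_key("1234")   # deterministic on one device
assert derive_key("1234") != derive_key("0000")   # different passcode, different key
print(len(derive_key("1234")))  # → 32 bytes of key material
```

This is why Comey’s complaint is really with the design itself: there is no vendor-side master copy to subpoena.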
You know what isn’t mentioned in this long article? Warrants. I wonder why that is? Could it be that most criminal masterminds do not store their plans to rob Gotham National Bank solely upon their encrypted cellphones, leaving law enforcement completely in the dark? Possibly The Joker leaves other traces of his plan elsewhere? Or discusses his machinations with co-conspirators? According to Mr. Comey, without the government retaining the ability to tap into each and every one of our cellphones at any time, The Joker will win. He’ll win! He’ll win, Batman!
Or, as Marcy Wheeler rightly notes, this seems to really be about warrantless searching, especially at the U.S. border:
Encrypting iPhones might have the biggest impact on law enforcement searches that don’t involve warrants, contrary to law enforcement claims this is about warranted searches. As early as 2010, Customs and Border Patrol was searching around 4,600 devices a year and seizing up to 300 using what is called a “border exception.” That is when CBP takes and searches devices from people it is questioning at the border. Just searching such devices does not even require probable cause (though seizing them requires some rationale). These searches increasingly involve smart phones like the iPhone.
These numbers suggest border searches of iPhones may be as common as warranted searches of the devices. Apple provided account content to U.S. law enforcement 155 times last year. It responded to 3,431 device requests, but the “vast majority” of those device requests involved customers seeking help with a lost or stolen phone, not law enforcement trying to get contents off a cell phone (Consumer Reports estimates that 3.1 million Americans will have their smart phones stolen this year). Given that Apple has by far the largest share of the smart phone market in the U.S., a significant number of border device searches involving a smart phone will be an iPhone. Apple’s default encryption will make it far harder for the government to do such searches without obtaining a warrant, which they often don’t have evidence to get.
If law enforcement wants to retain this access, they should be honest about what they might lose and why every iPhone user should be asked to carry a phone that is susceptible to criminal targeting as a result. Trading default encryption for a limited law enforcement purpose is just that — a trade-off — and officials should be prepared to discuss it as such. And, as forensics expert Jonathan Zdziarski explains, there’s a mountain of other data still available to help law enforcement solve crimes. “There is such an amount of peripheral evidence out there that only a small handful of cases are even likely to have the iPhone be the sole smoking gun to begin with,” he explained. “Cops have iCloud data, iCloud backups, call records, voicemail records, text messages from the carrier (if obtained within a certain retention period), gmail, email, web logs, trap and trace, proxy logs, not to mention copies of data from other people involved or from the victims themselves, desktop backups (if available), sometimes even a desktop (as many criminals don’t use encryption at all). Add to that they’re eavesdropping on the whole damn Internet.”
(click here to continue reading America’s huge iPhone lie: Why Apple is being accused of coddling child molesters – Salon.com.)
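Wheeler’s comparison can be checked with back-of-envelope arithmetic, using the figures quoted above. The 40% U.S. market-share number below is my assumption, standing in for her “by far the largest share of the smart phone market”:

```python
# Rough check of Wheeler's comparison, using figures quoted above.
# The 40% U.S. market-share number is an assumption.
BORDER_DEVICE_SEARCHES = 4600    # CBP device searches per year (from the article)
APPLE_US_SHARE_PCT = 40          # assumed U.S. iPhone share, percent
ACCOUNT_CONTENT_REQUESTS = 155   # Apple content disclosures to U.S. law enforcement

iphone_border_searches = BORDER_DEVICE_SEARCHES * APPLE_US_SHARE_PCT // 100
print(iphone_border_searches)    # → 1840

# Even this rough estimate dwarfs the 155 warranted content requests;
# recall that the "vast majority" of the 3,431 device requests were
# lost-or-stolen-phone help, not law enforcement pulling contents.
print(iphone_border_searches > ACCOUNT_CONTENT_REQUESTS)  # → True
```

However you slice the assumptions, warrantless border searches of iPhones plausibly outnumber warranted content requests, which is Wheeler’s point.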
For the third time in the last two months, I’ve had to restore my iPhone to factory settings (a long, laborious process) because a sync failed and left “Other” data behind. This “Other” data is music, but iOS cannot make sense of it and just ignores it, except that I can never sync the iPhone again because there isn’t enough room. There is no way to get at the file system to delete this crud, other than resetting the iPhone to factory state, as if I had just taken it out of its box.
A real PITA, in other words, that takes several hours from start to finish.
iPhone Data Other.PNG
See, the “Other” data is so large that the iPhone sync process fails. Grrr…
The workaround, step by step:
- Back up the iPhone.
- Turn Sync Music off (click the toggle button) and re-sync. I’ve found that subsequently turning on iTunes Match helps this process actually complete without failing.
- Back up again.
- Restore the iPhone to factory settings; wait the 90 minutes or so for this to finish.
- Enable Location Services, log in to iCloud, etc., and sync. Wait until all the apps, photos, books, etc. sync.
- Restore Hipstamatic lens/film combos.
- Toggle Sync Music back on and sync again, hopefully for the last time.

All told, I started around 4:30 PM, and now it is nearly 11 PM, and the final sync isn’t completed yet (though I had dinner, drank some wine, watched a little television, and so forth; these times might have been slightly less had I sat in front of my computer all night waiting for the various processes to finish): it has only synced about 10% of my music so far. At least there is an end in sight.[1]
Oh, and there’s also recreating the Touch ID fingerprint scan, another few minutes of time. That was interesting to do the first time, and not that cumbersome the second time, but now, the third time in 60 days? Not ideal…

Footnotes:
[1] 12:33 AM when the phone is finally usable again. Sheesh [↩]
Ricky Gervais said it best:
The dictionary definition of God is “a supernatural creator and overseer of the universe.” Included in this definition are all deities, goddesses and supernatural beings. Since the beginning of recorded history, which is defined by the invention of writing by the Sumerians around 6,000 years ago, historians have cataloged over 3,700 supernatural beings, of which 2,870 can be considered deities.
So next time someone tells me they believe in God, I’ll say “Oh which one? Zeus? Hades? Jupiter? Mars? Odin? Thor? Krishna? Vishnu? Ra?…” If they say “Just God. I only believe in the one God,” I’ll point out that they are nearly as atheistic as me. I don’t believe in 2,870 gods, and they don’t believe in 2,869.
I took An Atheist To Your Religion Too on September 08, 2012 at 01:51PM
and processed it in my digital darkroom on August 09, 2014 at 03:39PM
I took Seem Never Satisfied on July 05, 2014 at 04:57PM
and processed it in my digital darkroom on July 05, 2014 at 09:57PM
I took Caught in Your Symmetries on July 01, 2014 at 01:45PM
and processed it in my digital darkroom on July 05, 2014 at 09:58PM
I took Templeton Rye Old Fashioned with mashed cherries on June 22, 2014 at 02:43PM
and processed it in my digital darkroom on June 22, 2014 at 09:26PM
I took Self Portrait with a Hat, Redux on June 06, 2014 at 01:35PM
and processed it in my digital darkroom on June 06, 2014 at 11:16PM
I took You just want to be on the side that’s winning on April 19, 2014 at 01:33PM
and processed it in my digital darkroom on April 19, 2014 at 06:35PM
I took Diversey Harbor Panorama on March 13, 2014 at 05:05PM
and processed it in my digital darkroom on March 13, 2014 at 11:31PM