FBI vs. Apple Continued – Apple ID Changed While iPhone In Government Hands

Restoring iPhone From Backup

The unnamed FBI official who boasted to WSJ journalists that the Farook case was a “nearly perfect” test probably wishes that quote back in light of this development:

[Apple said that it] had been in regular discussions with the government since early January, and that it proposed four different ways to recover the information the government is interested in without building a backdoor. One of those methods would have involved connecting the iPhone to a known Wi-Fi network and triggering an iCloud backup that might provide the FBI with information stored on the device between October 19th and the date of the incident.

Apple sent trusted engineers to try that method, the executives said, but they were unable to do it. It was then that they discovered that the Apple ID password associated with the iPhone had been changed. (The FBI claims this was done by someone at the San Bernardino Health Department.) Had that password not been changed, the executives said, the government would not need to demand the company create a “backdoor” to access the iPhone used by Syed Rizwan Farook.

(click here to continue reading Apple: Terrorist’s Apple ID Password Changed In Government Custody, Blocking Access – BuzzFeed News.)

Did you notice? The FBI had possession of Farook’s iPhone for over 24 hours before some agent or other employee changed the Apple ID password. (!!!???!!!)

Changing the Apple ID password isn’t hard, but it isn’t something you do without meaning to. You’d have to log in, give the old password, then create the new password, entering it twice. Presumably, you’d either commit the new password to memory or WRITE IT DOWN.
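To make the point concrete, here is a minimal sketch of the kind of deliberate, multi-step flow described above. None of these type or function names are Apple’s real account APIs; they are hypothetical stand-ins, and the point is only that every step requires intentional input from whoever is doing it.

```swift
import Foundation

// Hypothetical sketch of a password-change flow. The names here are
// invented for illustration, not Apple's actual account API.
enum PasswordChangeError: Error {
    case wrongOldPassword
    case confirmationMismatch
    case tooWeak
}

struct AccountStore {
    private var currentPassword: String

    init(currentPassword: String) {
        self.currentPassword = currentPassword
    }

    mutating func changePassword(old: String,
                                 new: String,
                                 confirmation: String) throws {
        // Step 1: the old password must be supplied, and must match.
        guard old == currentPassword else {
            throw PasswordChangeError.wrongOldPassword
        }
        // Step 2: the new password must be typed twice, identically.
        guard new == confirmation else {
            throw PasswordChangeError.confirmationMismatch
        }
        // Step 3: a basic strength check before anything is committed.
        guard new.count >= 8 else {
            throw PasswordChangeError.tooWeak
        }
        currentPassword = new
    }
}

var account = AccountStore(currentPassword: "old-password-123")
do {
    try account.changePassword(old: "old-password-123",
                               new: "correct-horse-battery",
                               confirmation: "correct-horse-battery")
    print("Changed — and someone now has to remember or write down the new one.")
} catch {
    print("Change rejected: \(error)")
}
```

Three explicit checks, none of which happen by accident.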

Hmmm, “nearly perfect test case” indeed. 

Terrorism theatre, part the 234,323rd.

After the FBI sneeringly complained that encryption, privacy and security were merely marketing phrases to Apple, Apple responded with an eyeroll…

Creating the backdoor access, the executives said, would put at risk the privacy of millions of users. It would not only serve to unlock one specific phone, they said, but create a sort of master key that could be used to access any number of devices. The government says the access being sought could only be used on this one phone, but Apple’s executives noted that there is widespread interest in an iPhone backdoor, noting that Manhattan District Attorney Cyrus Vance said Thursday that his office has 175 Apple devices he’d like cracked. They also claimed that no other government in the world has ever asked Apple for the sort of FBiOS the government is demanding that it build now.

Asked why the company is pushing back so hard against this particular FBI request when it has assisted the agency in the past, Apple executives noted that the San Bernardino case is fundamentally different from others in which it was involved. Apple has never before been asked to build an entirely new version of its iOS operating system designed to disable iPhone security measures.

The Apple senior executives also pushed back on the government’s arguments that Apple’s actions were a marketing ploy, saying they were instead based on their love for the country and desire not to see civil liberties tossed aside.

(click here to continue reading Apple: Terrorist’s Apple ID Password Changed In Government Custody, Blocking Access – BuzzFeed News.)
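The “master key” framing is worth unpacking. A toy sketch, under loud assumptions: this is not how iOS actually boots, and the ECID-style identifier check shown is purely illustrative. But it captures why pinning a custom OS build to one phone is a weak guarantee: the pin is just data inside the image, so whoever can modify and re-sign the image can re-aim it at any other device.

```swift
import Foundation

// Hypothetical illustration of why "it only works on this one phone"
// is a weak guarantee. Not Apple's real boot process.
struct CustomBootImage {
    let pinnedDeviceID: String   // the one device this build claims to accept
    let disablesWipeOnTenFailures: Bool
}

func shouldBoot(_ image: CustomBootImage, onDeviceWithID deviceID: String) -> Bool {
    // The entire "one phone only" restriction is this single comparison.
    return image.pinnedDeviceID == deviceID
}

let fbios = CustomBootImage(pinnedDeviceID: "ECID-AAAA-1111",
                            disablesWipeOnTenFailures: true)

print(shouldBoot(fbios, onDeviceWithID: "ECID-AAAA-1111")) // true: the target phone
print(shouldBoot(fbios, onDeviceWithID: "ECID-BBBB-2222")) // false: any other phone

// A re-pinned copy is the same technique aimed at a new device:
let repinned = CustomBootImage(pinnedDeviceID: "ECID-BBBB-2222",
                               disablesWipeOnTenFailures: true)
print(shouldBoot(repinned, onDeviceWithID: "ECID-BBBB-2222")) // true again
```

Once the technique exists and can be re-signed, the restriction to one phone is a policy choice, not a technical one — which is Apple’s point.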

Booting Up

If you haven’t read digital forensics expert Jonathan Zdziarski’s blog post entitled “Apple, FBI, and the Burden of Forensic Methodology”, you should click through and read it right away (well, within 5 seconds). The FBI’s request is a big ask: not a last-minute improvisation, but something planned carefully for maximum impact. Director Comey has been pushing for backdoors to Apple and Google smartphones for a long time.

Apple must be prepared to defend their tool and methodology in court; no really, the defense / judge / even juries in CA will ask stupid questions such as, “why didn’t you do it this way”, or “is this jailbreaking”, or “couldn’t you just jailbreak the phone?” (I was actually asked that by a juror in CA’s broken legal system that lets the jury ask questions). Apple has to invest resources in engineers who are intimately familiar with not only their code, but also why they chose the methodology they did as their best practices. If certain challenges don’t end well, future versions of the instrument may end up needing to incorporate changes at the request of FBI.

If evidence from a device ever leads to a case in a court room, the defense attorney will (and should) request a copy of the tool to have independent third party verification performed, at which point the software will need to be made to work on another set of test devices. Apple will need to work with defense experts to instruct them on how to use the tool to provide predictable and consistent results.

In the likely event that FBI compels the use of the tool for other devices, Apple will need to maintain engineering and legal staff to keep up to date on their knowledge of the tool, maintain the tool, and provide testimony as needed.

In other words, developing an instrument is far more involved than simply dumping a phone for FBI. Any such instrument would have to be:

  • Developed to forensically sound standards 
  • Validated and peer-reviewed 
  • Tested and run on numerous test devices 
  • Accepted in court 
  • Given to third party forensics experts (testing) 
  • Given to defense experts (defense) 
  • Able to stand up to challenges 
  • Explained on the stand 

And Apple would have to:

  • Possibly hand over source code if ordered 
  • Maintain the tool and report on issues 
  • Defend lawsuits from those convicted 
  • Legally pursue any agencies, forensics companies, or hackers that steal parts of the code 
  • Maintain legal and engineering staff to support it 
  • On appeals, go through much of the process all over again

The risks are significant too:

  • Ingested by an agency, reverse engineered, then combined with in-house or purchased exploits to fill in the gap of code signing.
  • Ingested by private forensics companies, combined with other tools / exploits, then sold as a commercial product.
  • Leaked to criminal hackers, who reverse engineer and find ways to further exploit devices, steal personal data, or use it as an injection point for other ways to weaken the security of the device.
  • The PR nightmare from demonstrating in a very public venue how the company’s own products can be backdoored.
  • The judicial precedents set to now allow virtually any agency to compel the software be used on any other device.
  • The international ramifications of other countries following in our footsteps, many of which have governments that oppress civil rights.

This far exceeds the realm of “reasonable assistance”, especially considering that Apple is not a professional forensics company and has no experience in designing forensic methodology, tools, or forensic validation. FBI could attempt to circumvent proper validation by issuing a deviation (as they had at one point with my own tools); however, this runs the risk of causing the house of cards to collapse if challenged by a defense attorney.

(click here to continue reading Apple, FBI, and the Burden of Forensic Methodology | Zdziarski’s Blog of Things.)
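One concrete slice of what “validated” and “predictable and consistent results” means in practice: independent runs of a forensic tool over the same evidence should be bit-for-bit identical, which examiners check by comparing cryptographic digests. A minimal sketch, assuming CryptoKit for the hashing; the extraction itself is simulated here, and only the verification step is real.

```swift
import CryptoKit
import Foundation

// Forensic reproducibility check: two independent runs of an extraction
// tool over the same evidence should produce identical bytes, verified
// by comparing SHA-256 digests.
func digest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Stand-in for a real acquisition; a deterministic tool must return the
// same bytes every time it is run against the same device image.
func simulatedExtraction() -> Data {
    Data("deterministic dump of the evidence partition".utf8)
}

let runByApple = simulatedExtraction()
let runByDefenseExpert = simulatedExtraction()

let appleHash = digest(of: runByApple)
let defenseHash = digest(of: runByDefenseExpert)

print("Apple's run:   \(appleHash)")
print("Defense's run: \(defenseHash)")
print(appleHash == defenseHash
      ? "Consistent: the output can be independently verified."
      : "Mismatch: the methodology fails validation.")
```

Building, documenting, and defending that kind of determinism across device models and OS versions is exactly the sustained engineering burden Zdziarski describes.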

Not something an Apple intern can do in an afternoon, in other words, but a significant task imposed on a private corporation by a government agency, in support of “what some law-enforcement officials privately describe as a nearly perfect test case.” 
