The encryption and security debate escalates, this time involving a court order against Apple

February 18, 2016

Internet service providers, technology companies and businesses that rely on secure online communication are protective of the integrity of their encryption and security features. There was more than enough reputational blowback to go around when the details of the National Security Agency’s PRISM program came to light courtesy of the Snowden revelations.

Law enforcement and security agencies have lobbied long and loudly to either obtain encryption keys or have back doors built into encryption software. Providing encryption keys to third parties, usually government agencies, and installing back doors are both fraught practices. A new approach is being adopted in the latest chapter of the debate, playing out in the United States with a court order to unlock a phone used by one of the San Bernardino, Calif., terrorists. Apple’s iOS has security features that prevent brute-force attacks on the phone: after 10 wrong guesses, the phone is wiped.

On an application by the United States Attorney, the US District Court ordered Apple to provide assistance. The specific terms of the order are:

  1. Apple shall assist in enabling the search of a cellular telephone, Apple make: iPhone 5C, Model: A1532, P/N:MGFG2LL/A, S/N:FFMNQ3MTG2DJ, IMEI:358820052301412, on the Verizon Network, (the “SUBJECT DEVICE”) pursuant to a warrant of this Court by providing reasonable technical assistance to assist law enforcement agents in obtaining access to the data on the SUBJECT DEVICE.
  2. Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE; and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.  
  3. Apple’s reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File (“SIF”) that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory (“RAM”) and will not modify the iOS on the actual phone, the user data partition or system partition on the device’s flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE. The SIF will be loaded via Device Firmware Upgrade (“DFU”) mode, recovery mode, or other applicable mode available to the FBI. Once active on the SUBJECT DEVICE, the SIF will accomplish the three functions specified in paragraph 2. The SIF will be loaded on the SUBJECT DEVICE at either a government facility, or alternatively, at an Apple facility; if the latter, Apple shall provide the government with remote access to the SUBJECT DEVICE through a computer allowing the government to conduct passcode recovery.
  4. If Apple determines that it can achieve the three functions stated above in paragraph 2, as well as the functionality set forth in paragraph 3, using an alternate technological means from that recommended by the government, and the government concurs, Apple may comply with this Order in that way.
  5. Apple shall advise the government of the reasonable cost of providing this service.
  6. Although Apple shall make reasonable efforts to maintain the integrity of data on the SUBJECT DEVICE, Apple shall not be required to maintain copies of any user data as a result of the assistance ordered. All evidence preservation shall remain the responsibility of law enforcement.
  7. To the extent that Apple believes that compliance with this Order would be unreasonably burdensome, it may make an application to this Court for relief within five business days of receipt of the Order.

What the FBI is demanding, in effect, is that Apple write a new version of iOS to install onto the phone to get around these features. The BBC has an excellent piece on the issues, Apple vs the FBI – a plain English guide, which provides:

Apple chief executive Tim Cook says the FBI’s court order to access the mobile phone of San Bernardino killer Syed Farook is “dangerous”, “chilling” and “unprecedented”.

The FBI says Apple’s lack of co-operation is hindering its investigation.

Here’s a plain English guide to the debate, and an explanation of what may happen next.

Before we begin, let’s establish what the FBI isn’t asking for: it doesn’t want Apple to break the encryption on the device. Why? Because it can’t.

Apple made the conscious choice in 2014 to remove itself from being able to access encrypted devices, mainly to avoid ethical dilemmas like this. So…

What exactly does the FBI want?

The FBI wants Apple to alter what is known as a SIF – a Software Image File, as the court order defines it. In this context, the FBI is basically referring to the software that runs on the device. The FBI wants Apple to create a new SIF to place on Farook’s iPhone that will allow it to carry out several functions normal iPhones do not allow.

The FBI wants to be able to:

  1. Prevent the phone from erasing itself. If certain security settings are enabled, after 10 failed attempts at entering a passcode, an iPhone can erase the personal data on the device. The FBI doesn’t want this to happen on Farook’s phone.
  2. Automate the process for trying out passcode combinations. Farook used a four-digit passcode, for which there are 10,000 possible combinations. The FBI doesn’t want to have to guess them all manually, and so it wants Apple to allow the passcode to be tried electronically. This means the FBI could simply instruct a computer to try every passcode, something that would take just minutes, possibly seconds…
  3. …and without unnecessary delay. The iPhone prevents you from entering a passcode for longer and longer periods of time each time you get it wrong. The FBI wants this barrier removed.
  4. Control the process, but not know how it’s done. This is an interesting line, as it suggests the FBI is willing to allow Apple to work on the phone at its own HQ, and in a way that doesn’t risk the encryption software being released into the world.

As this row goes through the courts, expect that final element to be a key point the FBI makes – it will argue that the SIF will only work on Farook’s phone, and will be known only by Apple, who could choose to destroy it.

Why is Apple refusing to comply?

In a letter to customers, Apple boss Tim Cook said he did not want to introduce what is known in IT security as a “back door”. Like a literal back door, it’s simply a different way in. In this case, a different way to get into the phone other than by using the passcode, i.e. the front door.

Back doors are a big deal in security. Hackers make their money from finding them – a back door into a major piece of software or popular device can be highly lucrative. Buyers range from criminals to governments looking to spy or obtain data they otherwise wouldn’t be able to reach.

Apple says introducing a back door into the iPhone wouldn’t just make Farook’s phone insecure and accessible to the US government – it would make every iPhone inherently weaker.

“You can’t have a back door that’s only for the good guys,” Mr Cook said in an interview in 2015.

“Any back door is something that bad guys can exploit.”

Can it even be done?

Most experts the BBC has spoken to think it is possible to access Farook’s phone without harming the data. And significantly, Apple hasn’t denied it’s possible either, instead choosing to discuss the merits of why it thinks it shouldn’t.

An in-depth explanation of how it could be done was posted by security research firm Trail of Bits.

By using the same technique that enables “jailbreaking” – the practice of forcibly removing restrictions and security measures within the iPhone’s software – you could force new software onto the iPhone, researcher Dan Guido wrote.

He said that by using security signatures that only it possesses, Apple is capable of creating modified software that would work just on Farook’s iPhone.

“This customized version of iOS (*ahem* FBiOS) will ignore passcode entry delays, will not erase the device after any number of incorrect attempts, and will allow the FBI to hook up an external device to facilitate guessing the passcode,” he wrote.

“The FBI will send Apple the recovered iPhone so that this customized version of iOS never physically leaves the Apple campus.”

Who is supporting Apple?

On Wednesday, Apple’s peers in the technology industry – also eager to keep reputations over security intact – gave their backing to the iPhone maker.

Jan Koum, the creator of Whatsapp, which is owned by Facebook, wrote: “We must not allow this dangerous precedent to be set. Today our freedom and our liberty is at stake.”

The Information Technology Industry Council, a lobbying group that represents Google, Facebook, Microsoft, Samsung, Blackberry and a host of others, put out this statement: “Our fight against terrorism is actually strengthened by the security tools and technologies created by the technology sector, so we must tread carefully given our shared goals of improving security, instead of creating insecurity.”

Google chief executive Sundar Pichai said: “Forcing companies to enable hacking could compromise users’ privacy.”

Edward Snowden, whose revelations about US government spying provoked Apple’s stance on passcode-protected data, said the FBI was “creating a world where citizens rely on Apple to defend their rights, rather than the other way around”.

Who is backing the FBI?

White House press secretary Josh Earnest told reporters on Wednesday that the FBI was “simply asking for something that would have an impact on this one device”.

The FBI, he said, had the full support of the White House in the matter.

Potential Republican Presidential candidate Donald Trump has said he agreed “100% with the courts”.

“We should open it up,” he told Fox News.

While much of the technology community has backed Apple’s stance, some commentators say the company is framing the debate poorly.

On mic.com, writer Jack Smith argued: “The truth is that there is a protection in place: a warrant.

“We should fight to make warrants difficult to obtain. But the real unprecedented feat is the idea that a corporation like Apple should be able to prevent our law enforcement from carrying out a lawfully obtained warrant.”

In the UK, the family of murdered Fusilier Lee Rigby told the BBC Apple was “protecting a murderer’s privacy at the cost of public safety”.

What happens next?

Apple has a few more days to file its formal response to the court, which can be summed up as: “No.”

After a series of briefings at this local level, if neither side is happy, the case will be passed on to the District Court.

Still no solution?

The case would then be escalated to the Court of Appeals for the Ninth Circuit, the court which handles these sorts of issues on the US West Coast.

If that court backs the FBI, and Apple again refuses, it could eventually reach the US Supreme Court, whose decision will ultimately be final, and in this utterly fascinating case, precedent setting.

That could take several years.

The always excellent Wired has an incisive analysis in Apple’s FBI Battle Is Complicated. Here’s What’s Really Going On, which provides:

The news this week that a magistrate ordered Apple to help the FBI hack an iPhone used by one of the San Bernardino shooter suspects has polarized the nation—and also generated some misinformation.

Those who support the government say Apple has cooperated in the past to unlock dozens of phones in other cases—so why can’t it help the FBI unlock this one?

But this isn’t about unlocking a phone; rather, it’s about ordering Apple to create a new software tool to eliminate specific security protections the company built into its phone software to protect customer data. Opponents of the court’s decision say this is no different than the controversial backdoor the FBI has been trying to force Apple and other companies to build into their software—except in this case, it’s an after-market backdoor to be used selectively on phones the government is investigating.

The stakes in the case are high because it draws a target on Apple and other companies embroiled in the ongoing encryption/backdoor debate that has been swirling in Silicon Valley and on Capitol Hill for the last two years. Briefly, the government wants a way to access data on gadgets, even when those devices use secure encryption to keep it private.

Apple specifically altered its software in 2014 to ensure that it would not be able to unlock customer phones and decrypt any of the most important data on them; but it turns out it overlooked a loophole in doing this that the government is now trying to exploit. The loophole is not about Apple unlocking the phone but about making it easier for the FBI to attempt to unlock it on its own. If the controversy over the San Bernardino phone causes Apple to take further steps to close that loophole so that it can’t assist the FBI in this way in the future, it could be seen as excessive obstinance and obstruction by Capitol Hill. And that could be the thing that causes lawmakers to finally step in with federal legislation that prevents Apple and other companies from locking the government out of devices.

If the FBI is successful in forcing Apple to comply with its request, it would also set a precedent for other countries to follow and ask Apple to provide their authorities with the same software tool.

In the interest of clarifying the facts and correcting some misinformation, we’ve pulled together a summary of the issues at hand.

What Kind of Phone Are We Talking About?

The phone in question is an iPhone 5c running the iOS9 version of Apple’s software. The phone is owned by the San Bernardino Department of Public Health, which gave it to Syed Rizwan Farook, the shooter suspect, to use for work.

What Is the Issue?

Farook created a password to lock his phone, and due to security features built into the software on his device, the FBI can’t unlock the phone and access the data on it using the method it wants to use—a bruteforce password-guessing technique wherein they enter different passcodes repeatedly until they guess the right one—without running the risk that the device will lock them out permanently.

How Would It Do That?

Apple’s operating system uses two factors to secure and decrypt data on the phone–the password the user chooses and a unique 256-bit AES secret key that’s embedded in the phone when it’s manufactured. As cryptographer Matthew Green explains in a blog post, the user’s password gets “tangled” with the secret key to create a passcode key that both secures and unlocks data on the device. When the user enters the correct password, the phone performs a calculation that combines these two codes and if the result is the correct passcode, the device and data are unlocked.

To prevent someone from brute-forcing the password, the device has a user-enabled function that limits the number of guesses someone can try before the passcode key gets erased. Although the data remains on the device, it cannot be decrypted and therefore becomes permanently inaccessible. The government’s motion to the court (.pdf) notes that this happens after 10 failed guesses when the auto-erase feature is enabled by the user.
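A toy model makes this design concrete. Here PBKDF2 stands in for Apple’s actual hardware key-tangling (the real derivation happens inside the crypto engine, using a UID key that never leaves the chip), and the counter is a simplification of the auto-erase feature:

```python
# Toy model only: PBKDF2 stands in for Apple's key-tangling, and the
# counter logic is a simplification of the user-enabled auto-erase feature.
import hashlib
import secrets

DEVICE_UID_KEY = secrets.token_bytes(32)   # per-device 256-bit secret, set at manufacture

def passcode_key(passcode: str) -> bytes:
    """'Tangle' the user passcode with the device secret to derive the data key."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID_KEY, 10_000)

class Device:
    MAX_ATTEMPTS = 10

    def __init__(self, passcode: str):
        self._key = passcode_key(passcode)  # derived key guarding the data
        self.failed = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False                     # key erased: data permanently inaccessible
        if passcode_key(guess) == self._key:
            self.failed = 0
            return True
        self.failed += 1
        if self.failed >= self.MAX_ATTEMPTS:
            self._key = None                 # auto-erase: discard the passcode key
            self.wiped = True
        return False

d = Device("1234")
print(d.try_unlock("1234"))  # True: correct passcode derives the matching key
for _ in range(10):
    d.try_unlock("0000")
print(d.wiped)               # True: ten failures erased the key
```

Note that, as in the real design, nothing on the device is deleted except the key: the encrypted data remains, but without the passcode key it can no longer be decrypted.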

The government says it does not know for certain if Farook’s device has the auto-erase feature enabled, but notes in its motion that San Bernardino County gave the device to Farook with it enabled, and the most recent backup of data from his phone to iCloud “showed the function turned on.”

A reasonable person might ask why, if the phone was backing data up to iCloud, the government can’t just get everything it needs from iCloud instead of breaking into the phone. The government did obtain some data backed up to iCloud from the phone, but authorities allege in their court document that he may have disabled iCloud backups at some point. They obtained data backed up to iCloud a month before the shootings, but none closer to the date of the shooting, when they say he is most likely to have used the phone to coordinate the attack.

Is This Auto-Erase the Only Security Protection Apple Has in Place?

No. In addition to the auto-erase function, there’s another protection against brute force attacks: time delays. Each time a password is entered on the phone, it takes about 80 milliseconds for the system to process that password and determine if it’s correct. This helps prevent someone from quickly entering a new password to try again, because they can only guess a password every 80 milliseconds. This might not seem like a lot of time, but according to Dan Guido, CEO of Trail of Bits, a company that does extensive consulting on iOS security, it can be prohibitively long depending on the length of the password.

“In terms of cracking passwords, you usually want to crack or attempt to crack hundreds or thousands of them per second. And with 80 milliseconds, you really can only crack eight or nine per second. That’s incredibly slow,” he said in a call to reporters this week.

With a four-digit passcode, he says, there are only about 10,000 different combinations a password-cracker has to try. But with a six-digit passcode, there are about one million different combinations a password cracker would have to try to guess the correct one—a simple six-digit passcode composed of just numbers would take a couple of days to crack, Guido says; but a more complex six-character password composed of letters and numbers could take more than five-and-a-half-years, according to Apple. The iOS9 software, which appears to be the software on the San Bernardino phone, asks you to create a six-digit password by default, though you can change this requirement to four digits if you want a shorter one.
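These figures are straightforward to sanity-check: at one guess per 80 ms, the worst-case exhaustion time is just keyspace × delay. A quick check in Python (a simple model; Guido’s and Apple’s estimates include overheads beyond the raw delay):

```python
# Back-of-envelope brute-force times at 80 ms per guess (worst case).
DELAY_S = 0.080  # seconds per attempt

def worst_case_seconds(keyspace: int) -> float:
    """Time to try every combination at one guess per DELAY_S seconds."""
    return keyspace * DELAY_S

print(worst_case_seconds(10**4) / 60)              # 4-digit PIN: ~13 minutes
print(worst_case_seconds(10**6) / 3600)            # 6-digit PIN: ~22 hours
print(worst_case_seconds(36**6) / (3600*24*365))   # 6 chars of a-z + 0-9: ~5.5 years
```

The last line, assuming a 36-symbol alphabet of lowercase letters and digits, lands almost exactly on Apple’s “more than five-and-a-half years” figure for a six-character alphanumeric password.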

Later models of phones use a different chip than the iPhone 5c and have what’s called a “secure enclave” that adds even more time delays to the password-guessing process. Guido describes the secure enclave as a “separate computer inside the iPhone that brokers access to encryption keys” increasing the security of those keys.

With the secure enclave, after each wrong password guess, the amount of time you have to wait before trying another password grows with each try; by the ninth failed password you have to wait an hour before you can enter a tenth password. The government mentioned this in its motion to the court, as if the San Bernardino phone has this added delay. But the iPhone 5c does not have secure enclave on it, so the delay would really only be the usual 80 milliseconds in this case.
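The escalation can be modelled as a simple lookup. The schedule below follows the one Apple published in its iOS Security Guide of that era (attempts 1–4 free, then 1 minute, 5 minutes, 15 minutes twice, and an hour from the ninth failure on); take the exact figures as illustrative:

```python
# Retry delays the secure enclave imposes after repeated failures.
# Schedule per Apple's iOS Security Guide of the period; illustrative only.
ESCALATION = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}

def delay_after(failed_attempts: int) -> int:
    """Seconds to wait before the next passcode attempt is allowed."""
    if failed_attempts < 5:
        return 0                       # first four misses cost nothing
    return ESCALATION.get(failed_attempts, 60 * 60)  # an hour from the 9th on

print(delay_after(4))  # 0
print(delay_after(9))  # 3600: an hour before the tenth try
```

On the iPhone 5c, which lacks the enclave, none of this applies and the attacker faces only the fixed 80-millisecond processing time per guess.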

Why None of This Is an Issue With Older iPhones

With older versions of Apple’s phone operating system—that is, phones using software prior to iOS8—Apple has the ability to bypass the user’s passcode to essentially unlock the device and access data on the phone. It has done so in dozens of cases over the years, pursuant to a court order. But beginning with iOS8, Apple changed this so that it securely encrypts all of the most important data on your phone by default—photos, messages, contacts, call history—using the password you choose. And Apple cannot bypass your password to obtain that data.

According to the motion filed by the government in the San Bernardino case, the phone in question is using a later version of Apple’s operating system—which appears to be iOS9. We’re basing this on a statement in the motion that reads: “While Apple has publicized that it has written the software differently with respect to iPhones such as the SUBJECT DEVICE with operating system (“iOS”)9, Apple yet retains the capacity to provide the assistance sought herein that may enable the government to access the SUBJECT DEVICE pursuant to the search warrant.”

The government is referring to the changes that Apple made with iOS8 that exist in iOS9 as well. Apple released iOS9 in September 2015, three months before the San Bernardino attacks occurred, so it’s very possible this is indeed the version installed on the San Bernardino phone.


What Does the Government Want?

A lot of people have misconstrued the government’s request and believe it asked the court to order Apple to unlock the phone, as Apple has done in many cases before. But as noted, the particular operating system installed on this phone does not allow Apple to bypass the passcode and decrypt the data. So the government wants to try bruteforcing the password without having the system auto-erase the decryption key and without additional time delays. To do this, it wants Apple to create a special version of its operating system, a crippled version of the firmware that essentially eliminates the bruteforcing protections, and install it on the San Bernardino phone. It also wants Apple to make it possible to enter password guesses electronically rather than through the touchscreen so that the FBI can run a password-cracking script that races through the password guesses automatically. It wants Apple to design this crippled software to be loaded into memory instead of on disk so that the data on the phone remains forensically sound and won’t be altered.

Note that even after Apple does all of this, the phone will still be locked, unless the government’s bruteforcing operation works to guess the password. And if Farook kept the iOS9 default requirement for a six-digit password, and chose a complex alpha-numeric combination for his password, the FBI might never be able to crack it even with everything it has asked Apple to do.

Apple CEO Tim Cook described the government’s request as “asking Apple to hack our own users and undermine decades of security advancements that protect our customers—including tens of millions of American citizens—from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.”

What Exactly Is the Loophole You Said the Government Is Exploiting?

The loophole is the fact that Apple still retains the ability to run crippled firmware on a device like this without requiring the user to approve it, the way software updates usually work. If this required user approval, Apple would not be able to do what the government is requesting.

How Doable Is All of This?

Guido says the government’s request is completely doable and reasonable.

“They have to make a couple of modifications. They have to make it so that the operating system boots inside of a RAM disk…[and] they need to delete a bunch of code—there’s a lot of code that protects the passcode that they just need to trash,” he said.

Making it possible for the government to test passwords with a script instead of typing them in would take a little more effort he says. “[T]hat would require a little bit of extra development time, but again totally possible. Apple can load a new kernel driver that allows you to plug something in over the [Lightning] port… It wouldn’t be trivial but it wouldn’t be massive.”

Could This Same Technique Be Used to Undermine Newer, More Secure Phones?

There has been some debate online about whether Apple would be able to do this for later phones that have newer chips and the secure enclave. It’s an important question because these are the phones that most users will have in the next one or two years as they replace their old phones. Though the secure enclave has additional security features, Guido says that Apple could indeed also write crippled firmware for the secure enclave that achieves exactly what the FBI is asking for in the San Bernardino case.

“It is absolutely within the realm of possibility for Apple themselves to tamper with a lot of the functionality of the secure enclave. They can’t read the secure private keys out of it, but they can eliminate things like the passcode delay,” he said. “That means the solution that they might implement for the 5c would not port over directly to the 5s, the 6 or the 6s, but they could create a separate solution for [these] that includes basically crippled firmware for the secure enclave.”

If Apple eliminates the added time delays that the secure enclave introduces, then such phones would only have the standard 80-millisecond delay that older phones have.

“It requires more work to do so with the secure enclave. You have to develop more software; you have to test it a lot better,” he said. “There may be some other considerations that Apple has to work around. [But] as far as I can tell, if you issue a software update to the secure enclave, you can eliminate the passcode delay and you can eliminate the other device-erase [security feature]. And once both of those are gone, you can query for passcodes as fast as 80 milliseconds per request.”

What Hope Is There for Your Privacy?

You can create a strong alpha-numeric password for your device that would make bruteforcing it essentially infeasible for the FBI or anyone else. “If you have letters and numbers and it’s six, seven or eight digits long, then the potential combinations there are really too large for anyone to bruteforce,” Guido said.
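Guido’s point is easy to quantify: even at the best case of one guess every 80 ms, the keyspace of a full alphanumeric password grows out of reach quickly. A quick check, assuming a 62-symbol alphabet of upper- and lowercase letters plus digits (assumed parameters, not a measured attack):

```python
# Keyspace growth for alphanumeric passwords (a-z, A-Z, 0-9 = 62 symbols),
# exhausted at the hardware-limited 12.5 guesses/second (one per 80 ms).
ALPHABET = 62
RATE = 12.5  # guesses per second, assumed best case

def years_to_exhaust(length: int) -> float:
    """Worst-case years to try every password of the given length."""
    return ALPHABET ** length / RATE / (3600 * 24 * 365)

for n in (6, 7, 8):
    print(n, round(years_to_exhaust(n)))  # ~centuries at 6 chars, far worse beyond
```

At six characters the worst case is already on the order of a century and a half; each extra character multiplies that by 62.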

And What Can Apple Do Going Forward?

Guido says Apple could and should make changes to its system so that what the FBI is asking it to do can’t be done in future models. “There are changes that Apple can make to the secure enclave to further secure their phones,” he said. “For instance, they may be able to require some kind of user confirmation, before that firmware gets updated, by entering their PIN code … or they could burn the secure enclave into the chip as read-only memory and lose the ability to update it [entirely].”

These would prevent Apple in the future from having the ability to either upload crippled firmware to the device without the phone owner’s approval or from uploading new firmware to the secure enclave at all.

“There’s a couple of different options that they have; I think all of them, though, are going to require either a new major version of iOS or new chips on the actual phones,” Guido said. “But for the moment, what you have to fall back on is that it takes 80 milliseconds to try every single password guess. And if you have a complex enough password then you’re safe.”

Is the Ability to Upload Crippled Firmware a Vulnerability Apple Should Have Foreseen?

Guido says no.

“It wasn’t until very recently that companies had to consider: What does it look like if we attack our own customers? What does it look like if we strip out and remove the security mitigations we put in specifically to protect customers?”

He adds: “Apple did all the right things to make sure the iPhone is safe from remote intruders, or people trying to break into the iPhone.… But certainly after today, technology vendors need to consider that they might be the adversary they’re trying to protect their customers from. And that’s quite a big shift.”

Wired has, not surprisingly, been following the story closely with Magistrate Orders Apple to Help FBI Hack San Bernardino Shooter’s Phone, Tim Cook Says Apple Will Fight Order to Help Unlock iPhone and Apple’s Noble Stand Against the FBI Is Also Great Business. Forbes has also undertaken a useful analysis in The U.S. vs. Apple: Does the FBI Have a Case? The Guardian reports on the developing story in Apple challenges ‘chilling’ demand to decrypt San Bernardino shooter’s iPhone and Facebook and Twitter back Apple in phone encryption battle with FBI, while the Wall Street Journal covers the story in U.S. and Apple Dig In for Court Fight Over Encryption.

The terrorist act by the perpetrators has struck a chord in the United States, and in that way the request attracts strong moral support. The problem is that hard cases make bad law, and forcing Apple to unlock its phones is very bad law.

The integrity of operating systems is fundamental to data security and privacy protection on communications devices, portable devices, the internet of things and in online communications generally. Framing the debate as a law and order issue is short-sighted and compromises a key element of how data is stored, transferred and transacted securely.

The World Today covered the story in Apple rejects FBI request to unlock shooter’s iPhone.

It provides:

ELEANOR HALL: Australia’s Attorney-General George Brandis has joined the calls for the tech company Apple to comply with an order from law enforcement authorities in the US.

Apple is under fire today from the New York police chief and a presidential candidate. The company’s chief executive is refusing an FBI request to unlock an iPhone used by one of the shooters in the San Bernardino massacre, arguing that the demand threatens the security of Apple customers.
………
BRENDAN TREMBATH: Fourteen people were killed in the San Bernardino massacre, which ended in a shootout as police fired on a black sport utility.

REPORTER: Oh my gosh. The left side driver’s side completely blown out. This was a very, very graphic shoot out.

BRENDAN TREMBATH: An FBI investigation into the December attack has hit a brick wall, with Apple declining to help break into an iPhone used by one of the shooters.

A California magistrate has ordered Apple to provide “reasonable technical assistance” to the bureau.

The company’s refusal to cooperate with the FBI has been criticised by a cast of characters including presidential candidate Donald Trump who suggests unlocking the iPhone is “common sense”.

DONALD TRUMP: We have to be very careful, we have to be very vigilant but to think that Apple won’t allow us to get into her cell phone – who do they think they are? Now we have to open it up.

BRENDAN TREMBATH: New York police commissioner Bill Bratton says the issue needs to be resolved.

BILL BRATTON: We are increasingly blind for terrorism purposes and for general law enforcement purposes with the new devices and the continuing effort to make them even more secure against even court orders authorising law enforcement to have access.

BRENDAN TREMBATH: Apple chief executive officer, Tim Cook, in a letter to customers, claimed the demand threatened their security.

He said the Government suggests this tool could only be used once, on one phone. “But that’s simply not true,” said Mr Cook. He argued that once created, the technique could be used over and over again on any number of devices.

Tim Cook called it “the equivalent of a master key, capable of opening hundreds of millions of locks, from restaurants and banks to stores and homes.” He said, “No reasonable person would find that acceptable.”

Nate Cardozo, a lawyer at the Electronic Frontier Foundation, has told CNN that what the FBI is asking for puts ordinary Americans at risk.

NATE CARDOZO: If Apple were to back door this one phone, and make no mistake the FBI is asking for a back door regardless of whatever words they use, if Apple were to backdoor this phone, it would open the floodgates. It would give the FBI unprecedented power to compromise our security in the name of surveillance.

BRENDAN TREMBATH: Australia’s Attorney-General George Brandis has also urged Apple to assist the FBI.

He spoke in an interview with ABC correspondent, Michael Vincent.

GEORGE BRANDIS: We would expect, as in Australia, that all orders of courts should be obeyed by any party which is the subject of a lawful order by a court.

The particular facts of this case are not facts that would arise in Australia but what I think it does illustrate is at a time when encryption of data is becoming almost ubiquitous and vast quantities of data which would previously have been accessible by warrant to law enforcement agencies are inaccessible, I think it shows how important it is that ISPs (internet service providers) do cooperate with law enforcement agencies in facilitating and cooperating with proper investigations into serious crime.

MICHAEL VINCENT: You said ISPs, would you also include companies like Apple, companies that do make this software on their products because, as has been raised in the United States in the Congress in front of hearings, if it’s not an American company that’s going to be encrypting, it’ll be a Russian one, it’ll be a Chinese one?

GEORGE BRANDIS: Well, this particular case involved Apple and I think that the observation applies generally to all companies in the tech sector.

MICHAEL VINCENT: How important do you think it is that this sort of compliance occurs in Australia? Do you have any concerns in Australia that a citizen’s right to privacy would be impeded via some sort of backdoor access to personal devices for intelligence purposes or otherwise?

GEORGE BRANDIS: Well, we’re not proposing that and this is not a problem that has arisen in Australia.

MICHAEL VINCENT: Yet.

GEORGE BRANDIS: My department has established very cooperative and collaborative relationships with companies in the tech sector and we’re happy with the level of cooperation we are receiving but nevertheless, there is a broader problem for law enforcement in all jurisdictions frankly.

If data is encrypted in a way that is entirely inaccessible without the cooperation of the ISP or the maker of the device, then that makes inaccessible relevant investigative information that would hitherto have been accessible and that’s a problem for law enforcement.

ELEANOR HALL: That’s Attorney-General George Brandis with our North America correspondent Michael Vincent. Brendan Trembath compiled the report.

Those who support the order say, as the Attorney-General does, that the order is valid and that all that is requested is cooperation by Apple as part of a legitimate investigation.  Fine, as far as it goes. But orders of the type made by the US District Court have ramifications beyond this case.  They go to the integrity and operation of the system itself.  There is also the question of Government overreach in the nature of the orders and the compulsion placed upon Apple.  Then there are the semantic issues.  As the eSafety Commissioner notes in Apple is not being asked to backdoor iPhones: Australian Children’s eSafety Commissioner, what is being requested is neither a backdoor nor the handover of an encryption key.  But so what?  That is not where the damage may be done.

The article provides:

Australian Children’s eSafety Commissioner Alistair MacGibbon has dismissed concerns that Apple will have to backdoor one of its own products to comply with a US Federal Court order that demanded the phone maker help the FBI access data on an iPhone used by one of the San Bernardino shooters.

Speaking to ZDNet, MacGibbon said the court request was reasonable.

“What we are talking about here is a criminal court in the United States making a judgment to compel Apple to do certain things,” he said. “The thing that irks me is the use of the word ‘backdoor’ — because it implies things that this court order does not.

“It is not compelling Apple to hand over its cryptokeys, and it is definitely not compelling Apple to build a structural fault in every single device … this is a court order compelling the company to assist the FBI in a single serious criminal investigation.

“This is not the backdoor that people are talking about … they are not going to be building a flaw into the products that we know and love.”

MacGibbon was appointed to the role of eSafety Commissioner in March 2015, after spending 15 years in the Australian Federal Police, including working as the director of the Australian High Tech Crime Centre. He also worked previously for Dimension Data, eBay, and the University of Canberra.

According to MacGibbon, what the US court has demanded of Apple is not extraordinary.

“That’s no different to police using forensic technologies to break encryption and get access to devices today, which they do; it’s just clearly, they are finding it more difficult with this model of phone,” he said. “It requires the government to have seized the item, it requires them to be in physical possession of the item, it requires a court to order the company to help them. Those are all checks and balances that mean I’ll sleep very comfortably at night, assuming the government doesn’t have its hands on my phone.

“For the average person on the street, they can sleep very soundly at night knowing their device is not being tampered with. The fact that the government is going to this extent to try to gain access should in and of itself give people some comfort that the US government, or our government or others, are not doing this on a wholesale basis.”

The commissioner said the US authorities had very good reasons to be pursuing the matter, and that in the “real world”, law enforcement does need access to devices from time to time, as long as there is no extrajudicial pressure applied.

“Clearly, I’m a pro law-enforcement type of guy, but not blindly so. I don’t agree with building in backdoors, and governments forcing companies to put in backdoors, because I do think it would expose consumers who actually have to trust those devices.

“Having lived and worked in America as an Australian government official, I can say that the Australian and US judicial systems will tread very carefully on these matters, as they should, as will their parliaments, because they will try to balance that concept of privacy and the integrity of the overall marketplace with what they would consider to be extreme edge cases.”

MacGibbon said the issue surrounding Apple is not dissimilar to the metadata-retention scheme recently introduced in Australia, and that “sensible discussions” are needed.

“It’s fine if we disagree; I actually relish the thought we disagree. What I don’t like is a mindset that says there is absolutely no circumstance under which governments should ever gain access to anyone’s communications data, or data that lays at rest on the device.

“Because frankly, that’s just not the real world.”

Earlier today, Australian Attorney-General George Brandis told the ABC that Apple should comply with the court order.

“We would expect, as in Australia, that all orders of courts should be obeyed by any party which is the subject of a lawful order by a court,” he said.

Brandis said that if data is encrypted in a way that law-enforcement agencies are not able to decrypt, then it presents a problem.

In November last year, Brandis said that Australian attitudes to privacy will need to be adjusted, citing attacks by ISIS as his rationale.

“There will be occasions in which we will have to accept greater limitations, greater impediments to personal privacy,” he said.

Under Brandis’ tenure as attorney-general, the Australian government has passed legislation that mandates the collection and storage of call records, assigned IP addresses, location information, billing information, and other customer data for two years for warrantless access by law enforcement.

The attorney-general’s comments reflect similar thoughts expressed by Republican presidential nominee frontrunner Donald Trump, who called for “common sense” to prevail.

“I agree 100 percent with the courts,” Trump said, according to CNN. “But to think that Apple won’t allow us to get into her cellphone, who do they think they are? No, we have to open it up.”

On the other side of the ledger, the American Civil Liberties Union (ACLU) has thrown its support behind Apple.

“This is an unprecedented, unwise, and unlawful move by the government. The Constitution does not permit the government to force companies to hack into their customers’ devices,” Alex Abdo, ACLU Speech, Privacy, and Technology Project staff attorney, said.

“Apple is free to offer a phone that stores information securely, and it must remain so if consumers are to retain any control over their private data.”

Abdo warned that the court order could set a dangerous precedent.

“If the FBI can force Apple to hack into its customers’ devices, then so too can every repressive regime in the rest of the world.”

The Electronic Frontier Foundation (EFF) has also backed Apple, saying it would file an amicus brief in support of Cupertino.

“Essentially, the government is asking Apple to create a master key so that it can open a single phone. And once that master key is created, we’re certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security,” the EFF said.

“The US government wants us to trust that it won’t misuse this power. But we can all imagine the myriad ways this new authority could be abused. Even if you trust the US government, once this master key is created, governments around the world will surely demand that Apple undermine the security of their citizens as well.”

Earlier in the day, WhatsApp CEO Jan Koum joined the chorus of support for Apple.

“Today, our freedom and our liberty is at stake.”

Google, in somewhat less emphatic fashion, also weighed in on the issue, warning that compliance with the court order could compromise a user’s privacy.

“We know that law-enforcement and intelligence agencies face significant challenges in protecting the public against crime and terrorism. We build secure products to keep your information safe, and we give law enforcement access to data based on valid legal orders,” Google CEO Sundar Pichai said on Twitter.

“But that’s wholly different than requiring companies to enable hacking of customer devices and data. Could be a troubling precedent.”

Mr MacGibbon’s argument tends to the glib.  Yes, a back door is not being sought, nor are the encryption keys.  It is much worse.  What is being sought would have a significant impact on the integrity of an operating system.  MacGibbon’s approach is bureaucratic and not particularly alive to the commercial realities of the information industry.  That is why Facebook and Twitter are backing Apple in its fight against the FBI.  They realise that weakening a security structure, even temporarily and on a case-by-case basis, causes huge damage to an organisation’s reputation. It would be a disastrous precedent.  And no commercial operation finds the fact that a court is making the order at all comforting. Mr MacGibbon’s confidence in the wariness of the courts in “treading carefully” is not borne out by statistics.  Obtaining a warrant is not as difficult as he seems to suggest.  And reliance on the metadata retention scheme as some form of example that all will be OK is just silly.  Many ISPs are not compliant with the law as it stands and have been given extensions.  And as the iTnews story TIO reveals mandatory data retention complaints shows, the data retention system is not running as smoothly as he might have hoped. It is a scheme that has been abandoned by other jurisdictions, such as Germany.
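The mechanics are worth spelling out. The court order set out above asks Apple to do three things: disable the auto-erase function, allow passcodes to be submitted electronically, and remove software-imposed delays between attempts. A minimal sketch (the figures are illustrative assumptions, not Apple’s published numbers) shows why those protections are all that stands between a short numeric passcode and an exhaustive brute-force search:

```python
# Illustrative sketch only: the per-guess hardware cost is an assumed figure,
# not Apple's published number. A 4-digit passcode has only 10,000 combinations,
# so the software protections, not the passcode itself, do the real work.

ATTEMPTS_BEFORE_WIPE = 10     # iOS "Erase Data" option: wipe after 10 failures
HARDWARE_DELAY_SECS = 0.08    # assumed per-guess cost imposed in hardware

def brute_force_seconds(passcode_space: int, protections_enabled: bool) -> float:
    """Worst-case time to exhaust the passcode space by trying every code."""
    if protections_enabled and passcode_space > ATTEMPTS_BEFORE_WIPE:
        # The device wipes itself long before the space is covered,
        # so an exhaustive search can never finish.
        return float("inf")
    # With auto-erase and software delays stripped away, only the
    # hardware-imposed per-attempt cost remains.
    return passcode_space * HARDWARE_DELAY_SECS

print(brute_force_seconds(10_000, protections_enabled=True))   # inf
print(brute_force_seconds(10_000, protections_enabled=False))  # 800.0 (about 13 minutes)
```

Once software built to strip those protections exists and is signed, it is not a single-use tool, which is precisely Tim Cook’s “master key” point.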

The debate will go on.  The stakes are very high.  A system without integrity will be abandoned by its users.
