
What's really at stake in the Apple vs. FBI case?

IU cybersecurity expert Von Welch takes a closer look at the two sides in this landmark encryption battle

Criminals stealing private data off our phones is something none of us is likely to support, but how do we feel about law enforcement officials getting access to criminals’ phones? Or even terrorists’? If given the chance, you’d think Apple—one of the world’s leading smartphone manufacturers—would be happy to help the FBI gather information from a San Bernardino shooter’s iPhone, right? Wrong.

As noted in an article in The Verge:

"A judge has ordered Apple to comply with an FBI demand to help unlock a phone used by one of the San Bernardino shooters, but the company has vowed to fight the decision, with Tim Cook calling the order ‘an unprecedented step which threatens the security of our customers.’ The battle could determine the future of encryption."

What’s a smartphone user to make of this situation? Von Welch, director of Indiana University’s Center for Applied Cybersecurity Research (CACR), is following this case closely. CACR has been operating at Indiana University since 2003 and is renowned for its leadership in applied cybersecurity technology, education, and policy. Welch sheds some light on who is saying what, and why, in the Apple vs. FBI case.

What’s your take on the FBI vs. Apple clash?

Welch: On the surface, this seems like a simple case—law enforcement has a phone in its possession and wants to access the data on it. But, ironically, advances in cybersecurity have turned this into a complicated situation.

We’ve all heard of big breaches such as those at Target, Anthem, and the federal Office of Personnel Management. After each breach, the same question comes up: Why wasn’t the data encrypted?

Apple has been building data encryption into both its phones and its backup service, iCloud, to protect customers’ private data. With this encryption, even if a phone is stolen or Apple’s servers are breached, private data stays protected. With iOS 8, the encryption became strong enough that even someone with physical access to the phone couldn’t get to the data without the PIN. Apple even updated its law enforcement guidelines to say that, starting with iOS 8, it could no longer pull data off these phones because of this encryption.

To access the data on a phone, the owner typically enters a PIN, a four- or six-digit number, which unlocks the encryption. To break that encryption, Apple or anyone else would have to “brute-force” the PIN; that is, try all possible combinations. A four-digit PIN has only 10,000 possible combinations, so to protect against this, Apple built into its iOS operating system a limit on the number of guesses: after too many wrong guesses, the phone locks and wipes itself.
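To make that mechanism concrete, here is a minimal sketch of a guess limiter in Python. It is illustrative only, not Apple’s actual implementation; the PIN value, the limit of 10, and the Phone class are all made up for the example.

```python
MAX_ATTEMPTS = 10  # hypothetical limit; the real policy also adds escalating delays

class Phone:
    """Toy model of a PIN-locked device that wipes itself after too many failures."""

    def __init__(self, pin: str):
        self._pin = pin          # stands in for the encryption key derived from the PIN
        self.failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("Device wiped: data is unrecoverable")
        if guess == self._pin:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_ATTEMPTS:
            self.wiped = True    # destroy the data, defeating further guessing
        return False

phone = Phone(pin="4821")
print(phone.try_unlock("0000"))  # False; nine attempts remain before a wipe
```

The point is that the limiter, not the PIN itself, is what defeats brute force: an attacker gets at most ten tries against 10,000 possibilities.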

So when the FBI came into possession of the locked iPhone, they didn’t have an obvious way to access the data. However, they realized that if Apple created a new version of its operating system without this guessing limit and loaded it onto the phone, the FBI could then brute-force the PIN and break the encryption on the San Bernardino shooter’s iPhone. So the FBI obtained a court order under the All Writs Act, a mechanism usually used to gather evidence, to compel Apple to write such a version of its operating system.
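To see why removing that limiter matters, here is a sketch of the exhaustive search the FBI would then be able to run. The try_unlock stand-in and the PIN are hypothetical; the arithmetic is the point.

```python
from itertools import product

SECRET_PIN = "4821"  # illustrative; the attacker does not know this

def try_unlock(guess: str) -> bool:
    # Stand-in for the phone's unlock check once the guess limit is removed.
    return guess == SECRET_PIN

def brute_force():
    # Only 10**4 = 10,000 four-digit PINs exist, so exhaustion is trivial.
    for digits in product("0123456789", repeat=4):
        pin = "".join(digits)
        if try_unlock(pin):
            return pin
    return None

print(brute_force())  # finds "4821" after at most 10,000 attempts
```

Even at a modest 100 guesses per second, 10,000 PINs fall in under two minutes, and a six-digit PIN (1,000,000 possibilities) in under three hours.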

Why is Apple pushing back on this court order?

Welch: Apple’s use of encryption in the pursuit of security is creating a challenge here for law enforcement. The encryption on the phone can’t tell the difference between a criminal with a stolen phone and the FBI with a criminal’s phone; all it knows is whether someone has the right PIN. To disable that protection, the All Writs Act order obtained by the FBI is telling Apple to do something that, as far as we know, is unprecedented: create a new version of its operating system with intentionally weakened cybersecurity. At a time when everyone is struggling with cybersecurity, Apple and many in the community consider this to be undermining efforts toward the greater good.

And the question is: Is this something that the All Writs Act of 1789 can compel Apple to do? From talking to various legal scholars, I think it’s a long shot, because the FBI is essentially asking the court to make Apple write a piece of software that it was never going to write in the first place, software that Apple argues weakens the iPhone’s security and runs against its customers’ interests. In essence, on one side we have the FBI, saying this needs to be done to gather more evidence in this case of domestic terrorism; on the other, we have Apple, saying this is inherently against its corporate interests.

How will the outcome of this case affect the public or the IU community?

Welch: It all goes back to data encryption. To provide better cybersecurity and protection against data breaches, companies and organizations are employing encryption more and more to protect data. IU is certainly no exception: CACR staff, for example, use encryption to manage risks to the personal health information used in research at IU.

However, encrypting data so that it’s protected from criminals while still giving access to law enforcement brings some real challenges. There are policy issues to contend with, involving government and law enforcement, but let me focus on the technical ones. Allowing someone other than the owner to access encrypted data means creating what is often referred to as a “backdoor.” I don’t think that’s a great analogy. The one I prefer is a master key. Imagine we all had front doors so strong that no one could get into our houses, even with a search warrant. One way to allow search warrants to be executed would be to require all front doors to accept a master key.
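In software terms, that master key idea is usually called key escrow. Here is a minimal sketch using the Python cryptography library; the scheme and every key name are illustrative assumptions, not any actual Apple design or government proposal.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Key escrow sketch: data is encrypted under a per-device data key, and that
# data key is wrapped (encrypted) under BOTH the owner's key and an escrowed
# master key held by some authority.

data_key = Fernet.generate_key()
owner_key = Fernet.generate_key()
master_key = Fernet.generate_key()   # the escrowed "master key"

ciphertext = Fernet(data_key).encrypt(b"private message")

wrapped_for_owner = Fernet(owner_key).encrypt(data_key)
wrapped_for_master = Fernet(master_key).encrypt(data_key)

# Either key holder can unwrap the data key and read the data.
unwrapped = Fernet(master_key).decrypt(wrapped_for_master)
print(Fernet(unwrapped).decrypt(ciphertext))  # b'private message'

# The risk: master_key is a single secret whose compromise exposes
# every user's data at once, just like a leaked physical master key.
```

The final comment is the crux: whoever copies the escrowed key, legitimately or not, can open everything it protects.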

That sounds reasonable, but how do we secure those master keys? What happens if one gets lost or copied? This actually happened recently, when a photo of a Transportation Security Administration master key was published in The Washington Post. Once that photo was posted, even just once, anyone could make a copy of the key.

So while we’re not talking about a physical master key here, analogous problems exist: one mistake that exposes whatever mechanism a company or organization puts in place to give law enforcement access to data, and the cat is out of the bag. For example, if Apple were to create the weakened version of iOS and it were leaked, it could be used by criminals or other governments, over and over.

What does all this mean for people who have iPhones or any other smartphone?

Welch: Today the question is focused on iPhones, but it’s really a broader question about any encrypted data. A number of other technology companies have come out in support of Apple, presumably because they understand the outcome will affect them as they implement more cybersecurity with encryption. This case is the first to get at the bigger question: Are we willing to take this risk to our cybersecurity in order to allow access to our data in cases such as this one? We’re going to have to make that choice as a society. That is what’s really at stake here.

We’re seeing high-profile data breaches almost daily, and IT security experts are trying to tighten security to prevent those attacks. But one of the problems with tightening down security is that you tighten it for everybody, criminal and law enforcement alike. Are we going to be allowed to tighten that security to the point where nobody can get in? Or do we have to leave a crack in the door for law enforcement?

As soon as you leave that crack in the door, you have to decide who gets through it, and you have the very hard job of making sure it’s only the people you want. The Apple vs. FBI case is really a fight over who’s going to have access to our data.