Privacy and Encryption Lies You've Been Told

Right now this page is barely a rough draft.

Not all information has been verified and many parts are incomplete.

Do you use your phone to text loved ones? Do you use it for banking? Do you use it for email?

If you don't care whether criminals have access to everything you do, then this article doesn't apply to you. Of course, if you've read all the articles about the FBI vs Apple, then you're probably misinformed; in that case this article only applies to you if you care to learn the truth nobody else will tell you.

Don't bother reading this if:

When the argument is difficult, switch arguments!

I'm only addressing the issue of access to data stored on the phone and encrypted to prevent unauthorized access. Most things you might read on the topic, particularly those against government decryption, tend to conflate the issues of encrypted data in transit with those of encrypted data at rest.

People who oppose government mandated decryption of phones bring up encrypted communications as soon as they realize the weakness of their arguments. It's not a bad tactic for arguing, and you may have noticed politicians taking advantage of it. When people hear an irrefutable point made in an argument, they tend to remember that rather than how unrelated it was to the original point of the argument.

Whenever you start to think "but what about when...", you're probably thinking about the data being transmitted. Feel free to make detailed notes in case I ever write an article on that, but this is not that article.

The next red herring is the "if you have nothing to hide" gambit. The idea that you might want privacy from the government can be used to imply that you've done something wrong. That's an obvious fallacy when you realize that the people making that argument wouldn't want their banking and credit card details on a billboard. The question isn't who you can trust with your personal details, the question is whether you have a right to control who can access your personal information when you prefer to keep it private.

A few shallow thinkers and a few deep ones will reason that if the government, with all its resources, wants to find proof that you've done something wrong, it probably can. Whether that means there is no point in preventing them from legislating encryption, or whether it means we should fight for the right to encryption secure from even government decryption, depends on how you answer the next question.

Can't the government decrypt nearly anything?

No.

More than a few people believe that agencies of the US government have enough computing power to decrypt anything they want badly enough. This belief stems from confidence in the resources of a powerful entity like the US government, paired with a misjudgment of just how difficult encryption is to break.

We know the iPhone uses 256-bit AES encryption, so we can estimate what it would take to crack its encryption.

The US government collects over $3,000,000,000,000 in federal taxes every year. That sounds like an awful lot of money, especially when you consider the most powerful computer in the world costs only $390,000,000. (In this context "only" is relative.)

"Actual actual reality: nobody cares about his secrets. (Also, I would be hard-pressed to find that wrench for $5.)" - xkcd

Cracking 128 bit AES

If you had one Earth for every star in the Milky Way and dedicated the entire electricity production of every one of them to cracking AES-128, it would still take longer than the universe has existed to the present.
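
You don't need a galaxy of Earths to make the point. Here's a minimal back-of-the-envelope sketch in Python for a single, wildly optimistic planet-wide effort; the 10**18 guesses per second is my own assumption, chosen to be far beyond anything that actually exists.

    # Rough arithmetic only. The guess rate is an assumed, absurdly generous number.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600
    AGE_OF_UNIVERSE_YEARS = 13.8e9

    keyspace_128 = 2 ** 128               # ~3.4e38 possible AES-128 keys
    average_guesses = keyspace_128 // 2   # on average the key turns up halfway through
    assumed_rate = 10 ** 18               # keys checked per second, planet-wide (assumption)

    years = average_guesses / assumed_rate / SECONDS_PER_YEAR
    print(f"{years:.2e} years")                                   # about 5.4e12 years
    print(f"{years / AGE_OF_UNIVERSE_YEARS:.0f}x the age of the universe")  # ~390x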

Never underestimate the opponents' capabilities

How much computing power the NSA has is a question with no easy answer.

In 2008, the NSA told Congress it would need a computer capable of 1,000 PFLOPS by 2018. That means building a system roughly 30 times as powerful as the most powerful computer in the world today.

If you're worried that we're still underestimating the capabilities of the NSA, please stop. We're going to go a lot bigger for this discussion.

We'll start by assuming the NSA has 600 computers like the most powerful one in the world.


The Tianhe-2 is the largest known supercomputer, and it's what we're basing our calculations on. It is capable of 33.86 PFLOPS. We're going to start by assuming the NSA could have 600 of those, so for our calculations it would have 20,316 PFLOPS of computing power at its disposal.

For comparison, a single node of IBM's Blue Gene/P supercomputer was capable of 0.0000136 PFLOPS (13.6 GFLOPS).

But one organization can only have so much of the world's computing power at its disposal. Consider the Bitcoin network. It is the largest virtual supercomputer in the world, and because the computers don't have to be dedicated to the task and anyone in the world can contribute one, it is capable of a staggering 4,873,841.62 PFLOPS.

 - NSA computing power (assumed): 20,316 PFLOPS
 - Bitcoin virtual computer (April 2016): 4,873,841 PFLOPS
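
The inflated NSA figure is just multiplication, so the comparison is easy to sanity-check:

    # Pure arithmetic, using the figures quoted above.
    TIANHE_2_PFLOPS = 33.86
    assumed_nsa_pflops = 600 * TIANHE_2_PFLOPS      # 600 Tianhe-2s
    bitcoin_pflops_april_2016 = 4_873_841.62

    print(assumed_nsa_pflops)                              # 20316.0
    print(bitcoin_pflops_april_2016 / assumed_nsa_pflops)  # ~240x: Bitcoin dwarfs even our inflated NSA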

What this tells us:

Assume you use all 15 billion computers connected to the internet to attack a single AES-256 key, and assume they are all more powerful than the computers in use today. How long would it take to guess every key that might have been used?

It would take 1,254,856,009,386,230,000,000,000,000,000,000,000,000,000,000,000,000 years.

For comparison:

So, no. Modern encryption, like that used in the iPhone, is not something that can be cracked by brute force.

Math may be unreliable

There was a computer cluster designed to crack passwords at the rate of 350 billion guesses per second. If all 15 billion computers were capable of that speed, and assuming I didn't lose track of any zeros when I worked it out, and that my math was correct (hah!) then it would take only 3,585,314,300,000,000,000,000,000,000,000,000,000,000,000,000 years.
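
If you want to check my zeros, here's the same arithmetic as a short Python sketch, using the 15 billion machines and the 350-billion-guesses-per-second cluster speed as the assumptions. However you slice it, the answer lands absurdly far beyond any meaningful timescale.

    # The machine count and per-machine guess rate are the assumptions stated above.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    keyspace_256 = 2 ** 256                     # ~1.16e77 possible AES-256 keys
    machines = 15_000_000_000                   # every computer on the internet
    guesses_per_second_each = 350_000_000_000   # the password-cracking cluster's rate

    total_rate = machines * guesses_per_second_each        # ~5.25e21 guesses per second
    years = keyspace_256 / total_rate / SECONDS_PER_YEAR
    print(f"{years:.2e} years")                             # roughly 7e47 years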

The Golden Key Fallacy

"Once the key is in the FBI’s possession, the FBI computers can be hacked and the key stolen." - John Kirk

"Technically, they want Apple to build in a backdoor route into that encryption for use by law enforcement agencies, but that’s the same thing: strong encryption with a built-in flaw is not strong encryption. It’s only a matter of time before hackers find and exploit it." - Ben Lovejoy

"...any universal key creates a new backdoor that becomes a target for criminals..." - EFF

Some high ranking government officials have stated that law enforcement needs to have the ability to gain access to data stored on phones. Currently, smart phone manufacturers are moving toward making it impossible even for the manufacturers themselves to decrypt the data on their customers' phones. This conflict between what customers and smart phone manufacturers want and what government and law enforcement want has led to attempts to make laws that would force smart phone manufacturers to provide a way to decrypt the data on phones.

If a law required smart phone manufacturers to be able to decrypt phones, one way to comply would be to write the software so that anyone holding a single secret key could decrypt any phone. This is what is referred to as a "golden key," because it is one key that can access everything.
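
To make that concrete, here's a minimal sketch (in Python, using the cryptography package) of what a key-escrow design could look like: each phone's data key gets wrapped under one manufacturer-held escrow key. Every name here is made up, and no real phone works exactly this way; it only illustrates the concept.

    # A hypothetical key-escrow ("golden key") sketch. Illustration only.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # The manufacturer's escrow key pair. The private half is the "golden key":
    # whoever holds it can unlock every device that follows this scheme.
    escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    escrow_public = escrow_private.public_key()

    # Each phone gets its own random data key that encrypts the user's files.
    device_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(device_key).encrypt(nonce, b"the user's private data", None)

    # The device key is also wrapped under the escrow public key and stored
    # alongside the data, so the manufacturer can always recover it on demand.
    wrapped_device_key = escrow_public.encrypt(device_key, OAEP)

    # Anyone who obtains the escrow private key -- lawfully or not -- can now
    # unwrap the device key and read everything on the phone.
    recovered = escrow_private.decrypt(wrapped_device_key, OAEP)
    assert AESGCM(recovered).decrypt(nonce, ciphertext, None) == b"the user's private data"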

"How about gold colored instead?"

The fallacy is the assumption that a golden key is the only way, or even the likely way, phone manufacturers could do what such a law would require. There are actually multiple ways that manufacturers could ensure they can access the data on an encrypted phone:

"This is the same picture as the one above right?"

“It is fully possible to permit law enforcement to do its job while still adequately protecting personal privacy. When a child is in danger, law enforcement needs to be able to take every legally available step to quickly find and protect the child and to stop those that abuse children. It is worrisome to see companies thwarting our ability to do so.” - US Attorney General

Notice how the Attorney General doesn't say "backdoor" or "golden key"? That's because he understands that the best way for a technology company to grant access to encrypted data is not with a golden key or backdoor. He's not being deliberately obtuse about the dangers of building backdoors into software; he's astutely dodging the argument over which method they should use.

The Electronic Frontier Foundation is one of the most widely recognized and well respected advocates for individual rights when it comes to the internet. Yet, even they present a straw man argument.

Electronic Frontier Foundation

"Simply put, there is no such thing as a key that only law enforcement can use - any universal key creates a new backdoor that becomes a target for criminals, industrial spies, or foreign adversaries. Since everyone from the Post’s Editorial Board to the current Attorney General seems not to understand this basic technical fact, let’s emphasize it again:

There is no way to put in a backdoor or magic key for law enforcement that malevolent actors won’t also be able to abuse.

But the Attorney General didn't suggest "a backdoor or magic key."

When you see this type of logical fallacy presented as an argument by an organization so talented and well informed, it raises the question of whether they're avoiding the real argument intentionally. (They do great work that I need more excuses to link to.)

Apple has a signing key they must keep secure because failure to do so compromises the security of every iPhone sold. They use it to sign the updates that iPhones will accept. If it were to fall into the wrong hands, malicious software could be loaded on iPhones to make them easier to break into or even to make them spy on their owners and report back to a bad guy.

That's not what "could happen" if some law gets passed. That's what already exists now.

Why insist that a golden key is impossible to keep safe when we already depend on phone manufacturers to handle just that?

"63% of voters say they prefer him to their party's current candidate"

Ars Technica explains that most software already has a golden key. But Ars could have gone further. The chip that boots an iPhone has a hard-coded key it uses to check whether the next piece of software to be loaded was signed by Apple. That means that if Apple ever lost control of its signing key, it would be impossible to patch phones to reject software that shouldn't be trusted anymore.
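
Here's a rough sketch of that boot-time check, just to illustrate the general technique of a signed boot chain. It is not Apple's actual code, and the key names are made up.

    # A hypothetical secure-boot sketch: a public key fixed in read-only boot
    # hardware decides whether the next stage of software is trusted.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Stand-in for the manufacturer's signing key, kept secret at the factory.
    vendor_signing_key = Ed25519PrivateKey.generate()

    # The matching public key is burned into the boot ROM when the chip is made
    # and can never be changed afterward.
    BOOT_ROM_PUBLIC_KEY = vendor_signing_key.public_key()

    def boot(next_stage_image: bytes, signature: bytes) -> None:
        """Load the next stage only if the vendor signed exactly these bytes."""
        try:
            BOOT_ROM_PUBLIC_KEY.verify(signature, next_stage_image)
        except InvalidSignature:
            raise SystemExit("refusing to boot unsigned or modified software")
        print("signature OK, booting next stage")

    firmware = b"operating system image (stand-in bytes)"
    boot(firmware, vendor_signing_key.sign(firmware))      # boots normally
    # boot(firmware + b"malware", vendor_signing_key.sign(firmware))  # would refuse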

When lawmakers consult sympathetic technology and security experts, guess what they'll hear:

How much time has been wasted by people arguing against a golden key? Is it any wonder that legislators would consider creating laws to force phone manufacturers to decrypt phones when required by law enforcement?

Naturally such laws raise the ire of organizations like the EFF, which rightly point out that not everyone in government and law enforcement can be assumed to be trustworthy forever.

Naturally such laws make phone manufacturers nervous since they already have a heavy burden to assist law enforcement and that kind of law would doubtless increase the burden dramatically.

Privacy advocates and security geeks all across the internet rise up to fight against the idea that government should be allowed to access anything just because government says so. They spend countless hours debating all the ways it could go wrong. They pontificate at length about how bad an idea such a law would be. They write letters, send faxes and emails, and sometimes run internet campaigns to prevent such a bad law from being written.

But when you know the dirty little secret, that there are already golden keys and golden keys aren't even necessary, all the popular arguments no longer hold sway over legislators.

The question of whether anybody would be willing to make such a law has already been answered. Only popular opinion has prevented such bad bills (and they are bad, and these two are examples of whole new levels of bad) from being signed into law.

Some bills are better than others

Even if legislators or phone manufacturers settled on a golden key scenario, such keys are routinely kept safe. Apple and Google aren't alone in this, but they've publicly fought the idea of letting law enforcement take advantage of their keys. Every banking transaction you complete, every email you hope is private, and every website you sign into relies on authorities keeping their secret keys safe.

Take Blackberry for example. When a country demands Blackberry share the keys or get out, they get out!

Well, maybe not every time. (Hint: Rhymes with Banada)

The US government cannot force decryption

Most people recognize the benefits of living in a country with law enforcement dedicated to justice.

Part of the job of law enforcement is to track down and document crimes. This evidence is vital to a justice system that depends on the courts to convict those who break the law. This is especially important for protection against criminals who break laws in ways that harm other people.

We, as a society of law, want law enforcement to be able to prevent crime and gather evidence of crimes. It seems logical to want to enable law enforcement to access the data that criminals might hide on their computers and smart phones.

Why then, do so many well educated people with respect for the law still argue against laws that force smart phone manufacturers to decrypt phones, even when there is a lawful request?

It wouldn't accomplish the goal.

Even if phone manufacturers build in a decryption method and keep it absolutely secure, that doesn't mean criminals won't have unbreakable encryption. Phones run programs that the manufacturer doesn't create, which means that software running on the phone can use unbreakable encryption. Phones are routinely altered, sometimes just to add abilities unsupported by the manufacturer, but sometimes even to run completely different operating systems.

Encryption doesn't even depend on software. It is possible for people to encrypt their messages or data outside the phone, whether with pen and paper or with other computers so that what is stored on the phone has never been decryptable by the manufacturer, and by extension by law enforcement.

It's not unusual for people who run Windows to encrypt their data, or even the entire operating system, with software that Microsoft has no control over.

It's not unusual for people to add encryption tools to their email so that the email program doesn't have a copy of unencrypted email.

Extending those tools to phones is trivial, and criminals won't find it hard to do.
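
To show how little effort that takes, here's a minimal sketch: derive a key from a passphrase the two parties share outside the phone, and encrypt the message before any app ever sees it. The names and parameters are illustrative only.

    # A hypothetical sketch of encrypting a message before it ever reaches the
    # phone's software, using the Python "cryptography" package.
    import base64
    import os
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    def key_from_passphrase(passphrase: bytes, salt: bytes) -> bytes:
        """Derive a Fernet key from a passphrase only the two parties know."""
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                         iterations=600_000)
        return base64.urlsafe_b64encode(kdf.derive(passphrase))

    salt = os.urandom(16)
    key = key_from_passphrase(b"a passphrase shared outside the phone", salt)

    # This ciphertext is all the phone (or the manufacturer) ever sees.
    ciphertext = Fernet(key).encrypt(b"the actual message")

    # Only someone holding the passphrase and salt can reverse it.
    assert Fernet(key).decrypt(ciphertext) == b"the actual message"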

It would give bad people power.

"I'll only use my power for good, I swear." - morgueFile

Good people do bad things sometimes. Bad people get elected sometimes. (Citation: check the news.)

There are countless examples of people in authority doing things that harm innocent people. Those examples are worse when the people in authority can use the legal system to gain access to private information. Even the best of people can be coerced into helping a criminal and the criminal has much greater potential for harm if the private information of their victim has no protection from the coerced law enforcer. 

Eventually bad things happen, with or without access to private data. Bad things are worse with greater access to private data.

It is inevitable that somebody in authority, even with the best of intentions, will use their authority to do something bad. The harm they do will be increased by the amount of access they have to private data.

It is inevitable that someone with bad intent will be put in authority and the harm they do is increased by the amount of access they have to private data.

It is inevitable that someone who is committed to doing only the right things will be coerced into doing something bad. Their life or their family or their friends may be threatened, and the bad things they are forced to do are worsened depending on the amount of access they have to private data.

So legislating access to private data harms good people and doesn't prevent criminals from using unbreakable encryption. It is hardly surprising that those who seek to protect liberty and justice oppose legislated decryption.

INCOMPLETE: It would hurt the US.

Look what you've done

Today I heard someone attribute the end of Blackberry to the fact that Blackberry shared the ability to access otherwise encrypted information with (spoiler) Canadian law enforcement.

When Snowden revealed what the NSA had been up to, other countries started buying network hardware from suppliers outside the USA.

IBM is spending $1.2 billion to build more secure cloud data centers abroad in an attempt to placate nervous foreign customers. - infoworld.com

When export of encryption technology was limited by the US, people and organizations outside the US developed their own. This resulted in the US being unable to prevent encryption development outside the US and in a proliferation of non-US encryption tools.

In the world we live in, a company that introduces back doors into its systems cannot compete with those that don't. The US can't force other countries to backdoor their systems, certainly not in a way that gives US law enforcement access, so anyone interested in privacy (a very large number of people) will start avoiding products from the US. That kinda sucks for our economy. It means fewer jobs, less tax paid by those who would otherwise be making profits, and a little less control over our own destiny.

INCOMPLETE: What legislated decryption actually accomplishes

Crimes tend to be committed by criminals in only a few circumstances:

Smart criminals will use encryption no matter what law is passed. Smart criminals will not store data that can be used to incriminate them on anything that isn't reliably secure from government. People who plan their crimes, rather than acting on passion or temptation, won't be caught by decrypting data no matter how draconian the laws we pass are.

This means that terrorism and other murders won't be reduced by legislation that would force decryption. But the thing neither government nor privacy advocates want you to think about is that most crime is committed by people who are angry, desperate, or depressed. Those people are unlikely to think through the steps and potential consequences of their crimes before committing them. Many of them would leave evidence on their phones which might be useful to law enforcement.

The casual criminal who steals something in order to get enough money to satisfy a drug craving isn't planning the crime. The angry fired employee, ex-spouse, sibling or driver who commits a crime isn't usually planning to commit the crime. The kid who can't see any way out of his situation and decides the world owes him a break isn't planning for the cops to investigate. These are all crimes where planning to escape consequences doesn't happen until after the crime is committed, which makes it very likely a smart phone might contain evidence of interest to law enforcement. Further, these are the people least likely to realize they could protect their personal data with encryption outside the jurisdiction of the US.

This means that being able to ensure most phones sold in the US are decryptable would be a boon to law enforcement, but one that disproportionately affects the poor. If the FBI just came straight out and said "we want to catch more poor criminals," then they would have a difficult time gathering support.

Likewise if opponents said "decrypting phones is about catching poor criminals" then the political spin is that opponents of phone decryption are assuming the poor are more likely to be criminals, even if that's the truth. (The poor are the most likely to be the angry, the desperate and the depressed people I spoke of earlier.)

The criminals society most fears are the planners. The planners are people like terrorists and mass murderers. The people that are most frightening to us are the ones that are thinking about the consequences of their actions and taking action to minimize what law enforcement can do about it.

Terrorists and mass murderers plan ahead, so they're going to plan to ensure the data they care about isn't available to law enforcement. No law can fix that. Everyone who brings up terrorists and mass murderers as a reason for decryptable phones has missed the obvious or is intentionally misleading you.

What this debate is really about, when you get right down to the truth of the matter, is that those of us who know enough about government and history to distrust it don't want to give up our essential privacy just so cops can send a few more stupid criminals to jail.