Beware the Hardware that Implements Cryptography

Testing hundreds of millions of lines of code is a daunting task in and of itself, but it’s made even more difficult when you’re trying to implement cryptography.

Cryptography is not easy to comprehend. It often performs extremely complicated calculations on values that are thousands of bits long and uses random numbers for some of those calculations.

When generating a cryptographic key, for example, you need to generate lots of random (actually pseudorandom) bits. If those bits are predictable rather than truly random, a bad actor can guess the key more easily. And if that's the case, your security efforts via cryptography are essentially moot.
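
As a rough software illustration in Python (a sketch only, not how the hardware itself does it): key material has to come from a cryptographically secure random source. The standard library's secrets module is such a source, while the general-purpose random module is predictable and unsuitable for keys.

    import secrets

    # 32 bytes = 256 bits of unpredictable key material from the OS CSPRNG.
    key = secrets.token_bytes(32)
    print(key.hex())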

Here’s why it’s so complicated, plus some tips to simplify testing cryptography-enabled software.

Backdoors

When implementing cryptography in hardware, using random values makes testing that much more difficult. This is because testing something complicated is challenging, and it's made even more challenging if you can't test it with lots of known input-output pairs. If your inputs change every time you run a test, as they would if you were using random numbers, testing may be close to impossible.

To solve this challenge, most, if not all, hardware that implements cryptography employs methods to bypass the output of its random number generator and use known values in testing.

Such undocumented features, or “backdoors,” are present in virtually all hardware that implements cryptography and uses random numbers … essentially in all of it.
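
To make the pattern concrete, here is a minimal software sketch of the same idea (the names make_token and fixed_rng are illustrative, not from any real product): the random source is a replaceable input, so a test can substitute fixed bytes and compare the result against a known answer, while production code uses the real CSPRNG.

    import hashlib
    import hmac
    import secrets

    def make_token(key: bytes, rng=secrets.token_bytes) -> bytes:
        # In production the nonce comes from the CSPRNG, so every token differs.
        nonce = rng(16)
        tag = hmac.new(key, nonce, hashlib.sha256).digest()
        return nonce + tag

    # Test mode: inject a fixed "random" source so the output is repeatable
    # and can be checked against a precomputed known-answer value.
    fixed_rng = lambda n: b"\x00" * n
    token_a = make_token(b"example key", rng=fixed_rng)
    token_b = make_token(b"example key", rng=fixed_rng)
    assert token_a == token_b

This is exactly the kind of hook the hardware needs for testability, and exactly the kind of hook an attacker would love to find.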

It is now possible to make hardware in which the mechanism that bypasses the random number generator can be disconnected or disabled, but a clever hacker might be able to find a way to re-enable the backdoor, and the backdoor might never get discovered or disabled before the product is shipped to customers.

Unfortunately, this is a trade-off that may be impossible to avoid. If you want to test your hardware, you may also need to introduce a backdoor in it, and that means that it might not be as secure as you’d like it to be.

Magic or Code?

Today, an application with a million lines of code really isn’t that exciting, and examples are easy to find. At CES 2016, Ford Motor Co. noted that more than 150 million lines of code now go into its F-150 pickup truck. Higher-end cars probably have even more, almost certainly more than 200 million lines. It’s just not a big deal to have millions of lines of code these days.

Hardware is getting just as complex. Before the dot-com boom, I recall being impressed with how Intel was able to manufacture a Pentium processor comprising 3.1 million transistors.

Today, integrated circuits comprising tens of billions of transistors are made. If you think about that, it’s quite amazing: tens of billions of components, all of which are working roughly as they should. It’s the same sort of sufficiently advanced technology that science fiction author Arthur C. Clarke told us could be indistinguishable from magic.

Testing Software vs. Hardware

In some ways, software is easier to test than hardware is. Compiling code in debug mode makes it much easier to test, and it’s relatively easy to compile in release mode once you’re done debugging.

However, it’s still very hard to ensure that the encryption is implemented correctly. Intermediate results and outputs of cryptographic algorithms often look like hundreds, or even thousands, of random bits, which makes debugging code that implements them tricky.

For example, you never get something like “this string is broken” as an intermediate result in cryptographic algorithms; instead, you get something more like:

“0x7649abac8119b246cee98e9b12e9197d8964e0b149c10b7b682e6e39aaeb731c.”

Ugh.
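
In practice, the only way to make sense of output like that is a known-answer test: run the algorithm on a published test vector and compare the opaque hex against the expected value. Here is a minimal Python sketch using the well-known FIPS 180 test vector for SHA-256 of the message “abc”:

    import hashlib

    # FIPS 180 known-answer test: SHA-256("abc")
    expected = "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"
    actual = hashlib.sha256(b"abc").hexdigest()
    assert actual == expected, f"mismatch: {actual}"

If the assertion fails, you still only know that something is wrong somewhere, which is exactly what makes debugging these implementations so tricky.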

How to Simplify Testing

Fortunately, there’s a simple solution to this problem: Don’t write your own cryptographic software!

If you take something like a college Crypto 101 class, you’ll be taught how cryptographic algorithms work and why they’re secure. And then you’ll be told to never try this on your own because the chances of doing something wrong are simply too great.

Instead, use an existing cryptographic library that’s been validated to a standard such as the U.S. government’s FIPS 140-2, “Security Requirements for Cryptographic Modules.” This doesn’t have to be difficult or expensive; for example, a version of OpenSSL has already received FIPS 140-2 validation.
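
As a rough illustration of what that looks like in practice, here is a sketch using the high-level Fernet interface from the third-party Python cryptography package (shown purely as an example of reaching for an existing library; whether a given deployment meets FIPS 140-2 depends on the validated module underneath):

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()              # fresh random key
    f = Fernet(key)
    token = f.encrypt(b"attack at dawn")     # authenticated encryption
    assert f.decrypt(token) == b"attack at dawn"

A few lines that lean on code other people have already tested and vetted will beat a homegrown implementation every time.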

Unfortunately, lots of people don’t get to that Crypto 101 class and end up trying to implement their own version of cryptographic algorithms. This is extremely difficult to execute successfully and trying to do it is not a good idea.

In conclusion, beware hardware that implements cryptography, because testability might have put a backdoor in it. And never, never, never, emphasis on never, implement cryptography yourself when there’s a known-good solution that can be used in its place.
