Here’s a really intriguing list of electronic and computer hacks that combine physical, electronic, and software approaches to compromise data and control systems. We’ve already mentioned a few of these in this blog, specifically the ones related to cars, medical equipment, and smartphones. Some are easily preventable, such as by requiring that medical device manufacturers not use hard-coded passwords (duh!). Others require great ingenuity to perpetrate, such as the GPS spoofing attack some Texas students performed on unsuspecting yacht owners. Imagine all the spilled wine and uneaten brie after that incident! It’s those really-difficult-to-pull-off attacks that usually require the most effort and skill to prevent.
Let’s consider another attack from this article that relates to a previous blog post. On July 29, I wrote a post entitled “Embedded malware: the next frontier of bad” about the presence of faulty chips in industrial control devices. There is evidence that device manufacturers, particularly those based in China, are producing chips that purposely contain flaws that emerge as security vulnerabilities. Such intentional flaws are called back doors. A back door provides secret entry to a system for someone who otherwise would not have such access. This is a serious concern, because so much of our electronics is made overseas by nations with which we’re competing and, in worse cases, openly at odds.
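To make the idea concrete, here’s a deliberately simplified Python sketch of the hard-coded-credential style of back door mentioned earlier. Every name and string here is invented for illustration; real back doors are, of course, far better hidden than this:

```python
# Hypothetical sketch of a hard-coded credential acting as a back door.
# The "factory" string below is baked into the product and unknown to the owner.
FACTORY_SECRET = "m@int3nance"

def check_login(attempted_password: str, owner_password: str) -> bool:
    """Accept the owner's password -- but also, quietly, the factory's.

    The owner believes only owner_password unlocks the device; the hidden
    second comparison gives the manufacturer (or whoever it serves)
    permanent secret access.
    """
    return attempted_password == owner_password or attempted_password == FACTORY_SECRET
```

The danger is exactly that second comparison: it works on every unit shipped, it survives the owner changing their password, and nothing in normal use ever reveals it is there.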
This particular article describes a suspected back door in a specific field-programmable gate array, or FPGA. An FPGA is an integrated circuit that starts out almost like a blank piece of paper. Just as you’d write or draw on a blank piece of paper to give it meaning, you can program an FPGA to give it a particular purpose, transforming it from a task-agnostic wafer of weirdly etched silicon into something that performs a particular, highly specialized task. For example, one of the alumni of our Computer Science department works in a lofty position at a high-frequency trading firm, where they have built FPGAs that implement their trading algorithms directly in hardware, executing them thousands of times per second. The advantages of FPGAs are that they can be much faster than less customizable chips, and they make it easier to realize unique functionality. If you’re curious and dig Australian accents, you can get a nice introduction to FPGAs here.
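If the blank-paper analogy feels too abstract, here’s a toy Python model of the lookup tables (LUTs) that make up an FPGA’s programmable fabric. This is just an analogy, not real FPGA tooling: the point is that the configuration bits you load are what turn identical blank hardware into one specific circuit or another:

```python
class LUT2:
    """Toy model of a 2-input FPGA lookup table.

    The 4-entry truth table *is* the configuration: loading different
    bits turns the same blank hardware into different logic gates.
    """

    def __init__(self, truth_table):
        # truth_table[a * 2 + b] holds the output for inputs (a, b)
        self.table = truth_table

    def __call__(self, a: int, b: int) -> int:
        return self.table[a * 2 + b]

# "Program" the same blank LUT two different ways:
and_gate = LUT2([0, 0, 0, 1])  # configured as AND
xor_gate = LUT2([0, 1, 1, 0])  # same hardware, reconfigured as XOR
```

A real FPGA wires hundreds of thousands of such tables together, which is why one chip can become a network router today and a trading engine tomorrow.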
The FPGAs described in this article are used as part of a device deployed for military purposes. Using two cool cutting-edge hacking techniques, fuzzing and pipeline emission analysis (PEA), researchers discovered that the FPGA in question sometimes requests and uses an encryption key that is different from the one programmed by the user. They speculate that this key may be a manufacturer-supplied one that unlocks a back door for the manufacturer, or some entity the manufacturer serves, to gain access to the information stored and processed by the FPGA. Clearly, that can’t be good for a tank rolling through a minefield.
Fuzzing and PEA are not your run-of-the-mill hacking tools. Fuzzing is like throwing wads of sticky-tack at a wall to see which ones stick: it entails feeding lots of random or specially crafted data into a piece of software to see what makes it break. Pipeline emission analysis, the other tool used here, is a form of differential power analysis. Basically, it tries to discern a relationship between the amount of power consumed by the FPGA and the nature of the data it is being asked to process. These are sophisticated kinds of attacks, as you’d expect if you agree with my earlier assertion: the more sophisticated the attack (and, certainly, embedded malware is a sophisticated attack), the greater the craftiness required to thwart it.
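To get a feel for the sticky-tack approach, here’s a minimal Python sketch of a fuzzer. The target function and its planted bug are invented for illustration; real fuzzers like those used by the researchers are vastly more sophisticated, but the loop is the same: generate input, run target, record what breaks it:

```python
import random

def fragile_parse(data: bytes) -> int:
    # Hypothetical target with a planted bug: it crashes whenever
    # the input begins with the byte 0xFF.
    if data[:1] == b"\xff":
        raise ValueError("crashed on malformed header")
    return len(data)

def fuzz(target, trials=5000, seed=1234):
    """Throw random byte strings at `target`; record inputs that crash it."""
    rng = random.Random(seed)
    crashing_inputs = []
    for _ in range(trials):
        blob = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 16)))
        try:
            target(blob)
        except Exception:
            # A crash means we found an input the developers never anticipated.
            crashing_inputs.append(blob)
    return crashing_inputs
```

Run against `fragile_parse`, the fuzzer stumbles onto inputs starting with 0xFF purely by volume, with no knowledge of the target’s internals. That blindness is exactly the appeal: fuzzing finds the inputs nobody thought to test.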
I enthusiastically recommend checking out some of the links in this great list. It gives you a very practical perspective on why cyber security is so difficult to do well. It also demonstrates why cyber security is such an appealing domain for students of Computer Science.