Alex Halderman will present his preFPO on Thursday, May 17 at 10 A.M. in CS 302.  The members of his committee are: Ed Felten, advisor; Andrew Appel and Avi Rubin (Johns Hopkins), readers; Adam Finkelstein and Brian Kernighan, non-readers.  Everyone is invited to attend his talk.  The abstract follows below.

-------------

Learning from Security Failures in Non-Traditional Computing Environments  
 
Decades of bitter experience with security failures in desktop computers and network environments have led to intuitions and engineering practices that help us build more robust systems.  However, as computers are becoming smaller and cheaper, they are taking on new forms that challenge our intuitions.  From RFID tags embedded in sneakers and passports, to "smart" cell phones and car navigation systems, many new applications occupy environments that look little like those with which researchers are familiar, and the old rules increasingly do not apply.  When security fails in such environments, it often fails spectacularly, with layers of vulnerabilities that, compounded, are costly or impossible to repair.  By studying such security disasters, and asking why they are especially severe, we hope to develop new security techniques suitable for a world of ubiquitous computing.

One traditional security intuition that is now being challenged is the distinction between data and software.  When received from a trusted source, data files are considered a lower risk than software, since only software can contain security bugs that weaken the computer's defenses.  Recent events cause us to question the value of this distinction.  In a widely publicized incident, the record label Sony-BMG sold several million music CDs (normally considered a data-only medium) that contained undisclosed software intended to thwart copying.  We studied this software using a variety of analytical techniques and discovered that it contained serious defects that threatened consumers' security and privacy.  Merely playing the CDs caused the installation of dangerous software that provided several routes for attackers to subvert the computer's security mechanisms.  These problems were exacerbated by the non-traditional environment of a hybrid music and software CD, which allowed the discs to avoid scrutiny by the ecosystem of security software vendors that monitor for deviant behavior.

Other security intuitions are being challenged in the realm of embedded computers.  Once, such systems were thought of as "dumb" appliances, protected from most security risks by their simple designs and lack of network connectivity.  As they have grown increasingly sophisticated, however, they have come to resemble full-blown computers, with the attendant security problems.  We found evidence of this trend in the United States' most widely used electronic voting machine, the Diebold AccuVote-TS.  Using reverse engineering and novel software analysis methods, we discovered that the machine suffers from many of the same security problems as desktop PCs, including the potential for attackers to install malicious software and spread viruses from machine to machine.  Because of its embedded nature, the machine has none of the protections that are now standard on desktop computers.  As a result, an attacker could exploit these problems to steal votes undetectably on a wide scale.

In many ways, our approach to these and other examples can be compared to the systematic disaster investigations conducted after major transportation accidents or structural failures.  We seek to learn what went wrong not only in each particular case but also in the broader context that allowed the problems to occur.  Our findings highlight the need to extend existing protections to non-traditional environments, as well as to develop new security techniques suited to these applications.  We propose specific remedies in each case, such as cryptographic privacy protection protocols for camera phones and efficient machine-assisted auditing techniques for voting.  We also apply our theories to predict problems that may be discovered in the future: for instance, Internet worms that spread by exploiting weaknesses in the new generation of networked videogame consoles.  Finally, we consider legislative and regulatory remedies, including policy measures that have already gained momentum in the wake of our findings concerning the Sony-BMG and Diebold systems.