Ben Burgess will present his Pre-FPO, "Reverse Engineering Real World Trusted Systems," on Thursday, January 28, 2021 at 4pm via Zoom.

Zoom Information:
https://princeton.zoom.us/j/98643121461
Meeting ID: 986 4312 1461

Examiners: Ed Felten (adviser), Arvind Narayanan, Nick Feamster (University of Chicago)

Readers: Ed Felten (adviser), Prateek Mittal, Danny Huang (New York University)

All are welcome to attend.

Abstract:

Overview
In this talk, I will present two studies that evaluate the security and privacy guarantees of real-world systems with the potential to affect large groups of people. The first study evaluates the security of trunking radio systems, which serve as the primary communication system for almost all public safety agencies. These radio systems provide higher availability guarantees than cellular phones or conventional two-way radios while also promising completely private communications. I will discuss the vulnerabilities these systems suffer, due either to weak encryption ciphers or to flawed protocol design, which can allow an adversary to recover sensitive voice or data traffic. The second study examines the exam proctoring software solutions that institutions have contracted to proctor their exams during COVID. I will provide a detailed analysis of the impact of these solutions on student privacy, along with a look at the security guarantees they provide to the institution.

Police Radio
For this part of the talk, I will focus on trunking radio systems designed by Motorola and Harris, as they are the largest vendors in the public safety communication market. To protect voice communications, Motorola offers several encryption options, including common ciphers such as DES and AES along with proprietary options such as its Advanced Digital Privacy (ADP) cipher. An earlier study by Glass et al. reverse engineered ADP and demonstrated that it is simply a wrapper around RC4, a cipher with many known vulnerabilities. We demonstrate that the underlying voice traffic can be easily recovered from real-world deployments using low-cost digital TV receivers, despite multipath interference challenges. We then demonstrate that both ADP- and DES-encrypted traffic can be recovered from these captures at low cost using either desktop GPU hardware or rentable FPGA clusters. Finally, we look at the over-the-air rekeying (OTAR) mechanism Motorola implements, which allows the operator to rekey certain voice channels when a radio is lost or stolen. We find serious issues with the OTAR protocol that allow an adversary to compromise this mechanism once they are able to decrypt voice traffic. By compromising the OTAR protocol, an adversary can maintain persistent access to the system's encrypted communications with no further work until all of the radios are recalled and manually rekeyed using a hardware key loader.
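
To give a sense of why commodity GPU or FPGA hardware is enough to recover this traffic, the sketch below shows the shape of a known-plaintext brute-force search against a single DES-encrypted block in Python (using the pycryptodome library). The key, plaintext, and toy-sized keyspace are illustrative placeholders, not values from the study; a real search must cover the full 56-bit keyspace, which is exactly what the GPU and FPGA hardware is for.

    from itertools import product
    from Crypto.Cipher import DES  # pycryptodome

    # Illustrative stand-ins: in practice the known plaintext would be a
    # predictable field of a captured frame, the ciphertext would come from
    # an off-air recording, and the key would be unknown to the attacker.
    DEMO_KEY        = bytes.fromhex("133457799bbcdff1")
    KNOWN_PLAINTEXT = b"FRAMEHDR"  # one 8-byte block the attacker can predict
    CIPHERTEXT      = DES.new(DEMO_KEY, DES.MODE_ECB).encrypt(KNOWN_PLAINTEXT)

    def brute_force(prefix):
        """Try every 8-byte DES key beginning with `prefix` (a toy keyspace)."""
        for tail in product(range(256), repeat=8 - len(prefix)):
            key = prefix + bytes(tail)
            if DES.new(key, DES.MODE_ECB).decrypt(CIPHERTEXT) == KNOWN_PLAINTEXT:
                return key
        return None

    if __name__ == "__main__":
        # Only the last two key bytes are unknown here; the real attack walks
        # all 2^56 keys, which is why it needs GPUs or rentable FPGA clusters.
        recovered = brute_force(DEMO_KEY[:6])
        print("recovered key:", recovered.hex() if recovered else "not found")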

We extend this study to the other major offering in the public safety market, OpenSky by Harris. This offering has seen widespread adoption across the public safety market due to its perceived security, despite lacking interoperability with the Motorola standard. We reverse engineer the protocol specification of the Harris solution to provide the first publicly available look at its underlying design. We demonstrate that, contrary to the widespread belief among the agencies deploying the system, traffic is not always encrypted. Additionally, we design an attack against this system that allows an adversary to both intercept and forge traffic between the subscriber unit and the base station.

Exam Proctoring Software Study

For this part of the talk, I will provide a detailed analysis of the impact of the major exam proctoring software offerings on student privacy, along with a look at the security guarantees they provide. I will focus primarily on one of the most popular offerings, Examplify, which is being used to proctor the bar exam in many states. I will discuss several exploits for it that allow a user to run the software in a virtual machine without detection, extract the exam contents without being logged, and, in certain cases, extract the answers to the exam. I will then discuss the facial recognition models these solutions use to detect cheating and the potential racial biases of these models. Finally, I will discuss the impact on student privacy by examining what the system service logs and shares with the developer.
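
Examplify's actual virtual machine checks are proprietary and are not detailed here; as a rough, hypothetical illustration of the kind of environment fingerprinting this class of software performs (and which a virtual machine exploit has to defeat), the sketch below looks for a few well-known hypervisor artifacts. Every string, path, and command in it is a generic assumption, not something taken from Examplify.

    import platform
    import subprocess

    # Generic hypervisor fingerprints; assumptions for illustration only.
    VM_MARKERS = ("vmware", "virtualbox", "vbox", "qemu", "kvm",
                  "hyper-v", "xen", "parallels")

    def hardware_strings():
        """Yield hardware identification strings that hypervisors typically brand."""
        system = platform.system()
        if system == "Linux":
            for path in ("/sys/class/dmi/id/sys_vendor",
                         "/sys/class/dmi/id/product_name"):
                try:
                    with open(path) as f:
                        yield f.read().strip().lower()
                except OSError:
                    pass
        elif system == "Windows":
            try:
                result = subprocess.run(
                    ["wmic", "computersystem", "get", "manufacturer,model"],
                    capture_output=True, text=True, timeout=5,
                )
                yield result.stdout.lower()
            except (OSError, subprocess.SubprocessError):
                pass

    def looks_like_vm():
        """Return True if any collected string names a common hypervisor."""
        return any(marker in s for s in hardware_strings() for marker in VM_MARKERS)

    if __name__ == "__main__":
        print("virtual machine suspected:", looks_like_vm())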