Wednesday, November 11, 2015 - 11:00am
Location: 8102 Gates & Hillman Centers
Speaker: DAVID NAYLOR, Ph.D. Student (http://www.cs.cmu.edu/~dnaylor/)
For More Information, Contact: deb@cs.cmu.edu
Using a communication network poses an inherent privacy risk: when personal data enters the network, it is processed and observed by third parties (i.e., parties other than the client and the server) whom the user may not trust. Techniques exist to protect user privacy while using networks (e.g., encryption, onion routing), but these techniques nearly always come at a cost (e.g., decreased accountability, performance, or functionality).
Our primary contribution in this thesis is a set of techniques for balancing these "privacy vs. X" tradeoffs. First, we present the design of a network architecture that balances privacy and accountability by giving each packet two "source" addresses: one for return traffic and one for reporting the sender as malicious. The second, "accountability" address can point to a third-party delegate (so it does not reveal the packet's sender), and the mere existence of a dedicated accountability address means the return address can be hidden. Second, we present a protocol for including middleboxes in encrypted sessions, but with potentially restricted access, balancing privacy with functionality and performance.
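The dual-address idea described above can be sketched as follows. This is a minimal illustration under assumed names (`Packet`, `report_malicious`, and the delegate hostname are all hypothetical), not the actual packet format from the thesis:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Packet:
    # The return address may be hidden (None): accountability is
    # handled entirely by the separate accountability address.
    return_addr: Optional[str]
    # Points either to the sender or to a third-party delegate, so a
    # receiver can report abuse without learning who sent the packet.
    accountability_addr: str
    payload: bytes

def report_malicious(pkt: Packet) -> str:
    """A receiver reports a packet as malicious to its accountability
    address, which may be a delegate rather than the sender itself."""
    return f"abuse report sent to {pkt.accountability_addr}"

# A sender hides its return address and names a delegate instead:
pkt = Packet(return_addr=None,
             accountability_addr="delegate.example.net",
             payload=b"data")
```

Because reporting targets the accountability address, the receiver never needs the real return address, which is what lets the sender withhold it.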
Our second contribution is a general methodology for measuring network privacy. Since the techniques we develop are designed to strike different balances between privacy and X, some variants may provide less privacy than others (and do so at different costs). Unfortunately, no general technique exists for quantifying "how private" a tool or network architecture is, so we develop one, designed to be compatible with arbitrary header formats and network device behavior.
Finally, we propose a new approach to session setup and management that balances power among users, administrators, and applications. Privacy tools (both existing ones and the ones we propose) add complexity to configuring a network connection. Today, enabling and configuring privacy tools is predominantly left to applications, which may lack the incentives or the contextual information to do so properly. Users and administrators have only whatever control each application decides to expose; there are generally no consistent, system-wide control knobs.
Thesis Committee:
Peter Steenkiste (Chair)
Vyas Sekar
Srini Seshan
Adrian Perrig (ETH Zurich)
Dave Oran (Cisco)
Copy of Thesis Summary