Poorvi Vora - Research: Privacy

My privacy research combines three threads. The most recent strand, listed first, is a game-theoretic study of privacy in auctions; the second is an information-theoretic study of inference attacks; the third is work I did while at Hewlett-Packard in the early 2000s, when DRM and related privacy issues were just beginning to be noticed. The publications in the inference-attacks thread are also listed in my research on information-theoretic approaches to security.
Game-Theoretic Approaches to Privacy
This research was funded in part by an HP Research Gift (2004) and a GW Dilthey Award (2004), and is joint work with graduated doctoral student Yu-An Sun (now at Xerox Research) and Economics faculty member Sumit Joshi.

In this research, we examine various instances of the problem where a seller changes the rules of an auction after bidders submit their bids (eBay's second-chance offer is an example). We view this as a privacy problem because a bidder's information (the bid) is used against him, to the seller's benefit, when the rules are changed. We observe that, in certain types of situations, the bidder can create a disincentive for the seller by introducing uncertainty into the bid. In [1], we find that the bidder will quantize his bid, with the quantization intervals depending on the probability of the seller changing the rules. We have derived detailed seller and bidder strategies for versions of the second-chance offer; in several cases the strategies are randomized [2-4]. For simple versions of the game, we have examined differences in bidder payoff between strategic bidding and cryptographic protection, and find that strategic bidding provides better protection [5].
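The quantization idea can be illustrated with a minimal sketch. The function below is a hypothetical model, not the strategy derived in [1]: it simply reports the lower endpoint of a quantization interval containing the bidder's valuation, with coarser intervals when the probability of the seller changing the rules is higher.

```python
# Hypothetical sketch of bid quantization: the bidder reports a coarsened
# bid, with interval width growing as the probability p of a rule change
# grows. The mapping from p to interval width is an illustrative assumption.

def quantize_bid(valuation, p_rule_change, max_bid=100.0):
    """Report the lower endpoint of the quantization interval containing
    the bidder's valuation; intervals coarsen with p_rule_change."""
    # Illustrative choice: fewer intervals when a rule change is more likely.
    num_intervals = max(1, int((1.0 - p_rule_change) * 10))
    width = max_bid / num_intervals
    return (valuation // width) * width

# A trustworthy seller (p near 0) sees a finely quantized bid; an
# untrustworthy one (p near 1) learns only a coarse range.
print(quantize_bid(73.0, 0.0))  # fine quantization
print(quantize_bid(73.0, 0.9))  # coarse quantization
```

The point of the sketch is only the direction of the effect: as the rule-change probability rises, the bid reveals less, reducing what the seller gains by changing the rules.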
Inference Attacks

In this research, we point out that, when information is perturbed before revelation, efficient inference attacks on statistical databases are simply error-correcting codes. This allows us to use results such as the channel coding theorem to bound attack efficiency. While this has been viewed by some as an impossibility result (that perturbation is not enough to protect against inference attacks), our view is that it examines the cost of inference attacks, and hence provides a related privacy measure. This motivated a view of the privacy problem as one of trading secrecy for benefit: a theory of "variable privacy". This is single-author work.
Software Architecture
In this work, done while I was at HP, we addressed privacy issues in DRM protocols and proposed privacy architectures. This work was joint with several outstanding co-workers at HP.

Email: my first name at gwu dot edu
Last modified: 15:18:19, Tuesday, 26 March, 2013 local time