09-02-2013, 04:52 PM
Incentive Compatible Privacy-Preserving Distributed Classification
Incentive Compatible Privacy.pdf
INTRODUCTION
INFORMATION has become a powerful currency in our society.
As such, people treat information with care and secrecy.
There are times, however, when information needs to be
shared among owners for the betterment of society, or
simply for their own profit. Data mining seeks to take
information and aggregate it into models that are more
useful than the original information. Since people are
cautious and do not wish to give up their private information,
the need for privacy-preserving data mining has arisen. In
addition to the simple desire for privacy, certain government
regulations, such as the Health Insurance Portability and
Accountability Act (HIPAA) [3] require that certain data be
kept private.
There are many techniques for privacy-preserving data
mining. They include data anonymization [35], [25],
[38], noise-addition techniques [15], [9], and cryptographic
techniques [31], [7], among countless others. The
cryptographic techniques are distinctive in that they can
compute models from unperturbed data, since the
cryptography ensures that the data will not be revealed.
However, they make no guarantee that participants will
not supply false data in the first place.
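As a concrete illustration of the noise-addition family of techniques mentioned above (not a method from this paper), the sketch below perturbs numeric records with Laplace noise before release; the function name and parameters are hypothetical, and the `sensitivity`/`epsilon` scale follows the common differential-privacy convention.

```python
import math
import random

def add_laplace_noise(values, sensitivity, epsilon, rng=None):
    """Perturb each value with zero-mean Laplace noise.

    The noise scale is sensitivity / epsilon: a smaller epsilon
    means stronger privacy but noisier released data.
    """
    rng = rng or random.Random()
    scale = sensitivity / epsilon
    noisy = []
    for v in values:
        # Sample Laplace(0, scale) by inverse-CDF from a uniform draw.
        u = rng.random() - 0.5
        sign = 1.0 if u >= 0 else -1.0
        noise = -scale * sign * math.log(1.0 - 2.0 * abs(u))
        noisy.append(v + noise)
    return noisy
```

A miner then builds its model from the perturbed values, trading some accuracy for the privacy of the individual records.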
RELATED WORK
Cryptography and game theory have a great deal in
common in terms of the goals they try to achieve.
Cryptographic protocols generally seek to prevent
participants from deviating profitably from the
prescribed protocol by rendering such deviations
detectable, impossible, or computationally infeasible.
Mechanism design likewise seeks to prevent deviations,
but it does so by rendering them unprofitable. It is
understandable, therefore, that a fair amount of work has
been done using the techniques of one field to solve the
problems of the other. Most of this work is not directly
related to ours, since much of the game-theoretic security
work deals with specific functions and with the individual
steps of computing those functions.
Shoham and Tennenholtz [34] define the class of NCC,
or noncooperatively computable, functions and characterize
specifically the Boolean functions that are NCC. In addition,
the paper defines two further classes, p-NCC and s-NCC,
which stand for probabilistic-
GAME THEORETIC BACKGROUND
Game theory is the study of competitive behavior among
multiple parties. A game contains four basic elements:
players, actions, payoffs, and information [32]. Players have
actions that they can perform at designated times in the game,
and as a result of those actions they receive payoffs. The
players hold different pieces of information, on which the
payoffs may depend, and it is each player's responsibility to
choose a strategy that increases his or her payoff. A player
who acts so as to maximize his or her payoff is termed
rational. Games take many forms and vary in the four
attributes mentioned above, but all games deal with them. The
specific game we describe in this paper is a finite-player,
single-round, simultaneous-action, incomplete-information
game, with payoffs based on the final result of the players'
actions.
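The game structure just described can be sketched in code: a finite set of players each acts once, simultaneously, seeing only its own private information, and payoffs are computed from the joint outcome. The strategies and payoff function below are illustrative inventions, not the paper's actual game.

```python
def play_round(strategies, private_info, payoff):
    """One simultaneous round: each player i picks an action from its
    own private information only, then all payoffs are computed from
    the joint action profile."""
    actions = [s(info) for s, info in zip(strategies, private_info)]
    return actions, [payoff(i, actions) for i in range(len(strategies))]

# Hypothetical example: each player decides whether to contribute
# truthful data. Every truthful contribution benefits everyone by 2,
# but being truthful costs the contributor 1.
def payoff(i, actions):
    return 2 * sum(actions) - (1 if actions[i] else 0)

truthful = lambda info: True
free_ride = lambda info: False

actions, payoffs = play_round([truthful, free_ride], [None, None], payoff)
```

In this toy payoff structure, free-riding on the other player's contribution pays better than contributing, which is exactly the kind of incentive problem the paper's mechanism aims to eliminate.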