
A differential privacy mechanism that accounts for network effects for crowdsourcing systems

Journal contribution
Posted on 2021-11-04, 11:37, authored by Y. Luo and Nick Jennings
In crowdsourcing systems, it is important for the crowdsourcing campaign initiator to incentivize users to share their data to produce results of the desired computational accuracy. This problem becomes especially challenging when users are concerned about the privacy of their data. To overcome this challenge, existing work often aims to provide users with differential privacy guarantees to incentivize privacy-sensitive users to share their data. However, this work neglects the network effect that a user enjoys greater privacy protection when he aligns his participation behaviour with that of other users. To explore this network effect, we formulate the interaction among users regarding their participation decisions as a population game, because a user's welfare from the interaction depends not only on his own participation decision but also on the distribution of others' decisions. We show that the Nash equilibrium of this game consists of a threshold strategy, where all users whose privacy sensitivity is below a certain threshold participate and the remaining users do not. We characterize the existence and uniqueness of this equilibrium, which depends on the privacy guarantee, the reward provided by the initiator and the population size. Based on this equilibrium analysis, we design the PINE (Privacy Incentivization with Network Effects) mechanism and prove that it maximizes the initiator's payoff while providing participating users with a guaranteed degree of privacy protection. Numerical simulations, on both real and synthetic data, show that (i) PINE improves the initiator's expected payoff by up to 75%, compared to state-of-the-art mechanisms that do not consider this effect; and (ii) the performance gain from exploiting the network effect is particularly large when the majority of users are flexible over their privacy attitudes and when there are a large number of low-quality task performers.
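
To illustrate the threshold-strategy equilibrium described in the abstract, the following Python sketch computes a participation threshold under a simple, purely illustrative utility model (not the paper's actual formulation): each user's perceived privacy loss shrinks as more users participate, capturing the network effect, and a user joins only if the reward covers that loss.

# Minimal sketch of a threshold-strategy participation equilibrium.
# The utility model below is an illustrative assumption, not the PINE model.
import numpy as np

def equilibrium_threshold(sensitivities, reward, epsilon, iters=100):
    """Fixed-point iteration for the participation threshold.

    sensitivities : per-user privacy sensitivities
    reward        : payment offered by the campaign initiator
    epsilon       : differential privacy parameter offered to participants
    Assumed utility: u_i = reward - s_i * epsilon / sqrt(n_participants),
    i.e. perceived privacy loss shrinks as participation grows.
    """
    s = np.sort(np.asarray(sensitivities, dtype=float))
    n_part = len(s)  # start by assuming everyone participates
    for _ in range(iters):
        # effective privacy loss given the current participation level
        loss = epsilon / np.sqrt(max(n_part, 1))
        # a user participates iff the reward covers the perceived privacy cost
        participate = reward >= s * loss
        new_n = int(participate.sum())
        if new_n == n_part:
            break
        n_part = new_n
    threshold = s[n_part - 1] if n_part > 0 else None
    return threshold, n_part

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sens = rng.exponential(scale=1.0, size=1000)  # heterogeneous privacy attitudes
    thr, k = equilibrium_threshold(sens, reward=0.5, epsilon=2.0)
    print(f"{k} of {len(sens)} users participate; sensitivity threshold ~ {thr:.3f}")

Starting from full participation, each iteration can only shrink the participant set, so the loop converges to a fixed point in which exactly the users with sensitivity below the returned threshold participate, mirroring the threshold structure of the equilibrium characterized in the paper.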

History

Published in

Journal of Artificial Intelligence Research

Volume

69

Pages

1127 - 1164

Publisher

AI Access Foundation

Version

  • VoR (Version of Record)

Rights holder

© AI Access Foundation

Publisher statement

Individual users may read, download, copy, distribute, print, search, or link to the full texts of individual articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose as specified in our licensing terms at https://jair.org/index.php/jair/about#jair-license

Acceptance date

2020-12-01

Publication date

2020-12-03

Copyright date

2020

ISSN

1076-9757

eISSN

1943-5037

Language

  • en
