DOI | https://doi.org/10.1007/978-981-99-9785-5_17 |
---|
Author | Wang, Le (ORCID: https://orcid.org/0000-0002-4939-1642); Yan, Haonan (ORCID: https://orcid.org/0000-0002-1784-6091); Lin, Xiaodong (ORCID: https://orcid.org/0000-0001-8916-6645); Xiong, Pulei (ORCID: https://orcid.org/0000-0002-3460-6946) |
---|
Affiliation | National Research Council of Canada, Digital Technologies |
---|
Format | Text, Book Chapter |
---|
Conference | AIS&P, December 3-5, 2023, Guangzhou, China |
---|
Subject | Machine Learning as a Service; bilateral privacy; privacy leakage; model extraction; differential privacy |
---|
Abstract | As Machine Learning-as-a-Service (MLaaS) is promoted and applied ever more deeply across societal domains, its privacy problems arise frequently and draw growing attention from researchers. However, existing research addresses either the client-side query privacy problem or the server-side model privacy problem, and lacks defense schemes that protect both sides simultaneously. In this paper, we are the first to design privacy-preserving mechanisms based on differential privacy for the client side and the server side respectively. By injecting noise into query requests and model responses, both the clients and the servers in MLaaS are privacy-protected. Experimental results demonstrate that the proposed solution preserves accuracy while providing privacy protection for both clients and servers in MLaaS. |
---|
Publication date | 2024-02-04 |
---|
Publisher | Springer Nature |
---|
Language | English |
---|
Peer reviewed | Yes |
---|
Record identifier | 02579be0-382c-454c-9051-d2a4345919fb |
---|
Record created | 2024-02-27 |
---|
Record modified | 2024-02-28 |
---|
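The abstract describes injecting differential-privacy noise into query requests (client side) and model responses (server side). As a minimal illustration of that idea, not the chapter's actual mechanism, here is a sketch of the standard Laplace mechanism applied to both directions; the function names, sensitivity, and epsilon values are assumptions for the example.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) via the inverse-CDF transform."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    sign = -1.0 if u < 0 else 1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def privatize(values, sensitivity: float, epsilon: float):
    """Add Laplace noise calibrated to sensitivity/epsilon (epsilon-DP)."""
    scale = sensitivity / epsilon
    return [v + laplace_noise(scale) for v in values]

# Client side: perturb a query feature vector before sending it to the server.
query = [0.2, 0.8, 0.5]
noisy_query = privatize(query, sensitivity=1.0, epsilon=1.0)

# Server side: perturb the model's confidence scores before returning them.
response = [0.1, 0.7, 0.2]
noisy_response = privatize(response, sensitivity=1.0, epsilon=2.0)
```

A larger epsilon means less noise and weaker privacy; the bilateral setting lets the client and server choose their budgets independently.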