privacy leakage
Recently Published Documents

TOTAL DOCUMENTS: 242 (FIVE YEARS: 142)
H-INDEX: 13 (FIVE YEARS: 4)

2022, Vol 16 (4), pp. 1-21
Author(s): Honghui Xu, Zhipeng Cai, Wei Li

Multi-label image recognition has become an indispensable component of many real-world computer vision applications. However, a severe threat of privacy leakage in multi-label image recognition has been overlooked by existing studies. To fill this gap, two privacy-preserving models, Privacy-Preserving Multi-label Graph Convolutional Networks (P2-ML-GCN) and Robust P2-ML-GCN (RP2-ML-GCN), are developed in this article, where a differential privacy mechanism is applied to the models' outputs so as to defend against black-box attacks while avoiding large aggregated noise. In particular, a regularization term is added to the loss function of RP2-ML-GCN to increase prediction accuracy and robustness. A suitable differential privacy mechanism is then designed to decrease the bias of the loss function in P2-ML-GCN and increase prediction accuracy. We further show that a bounded global sensitivity mitigates the side effects of excessive noise and improves multi-label image recognition performance in our models. Theoretical proofs show that both models guarantee differential privacy for the model's outputs, weights, and input features while preserving robustness. Finally, comprehensive experiments validate the advantages of the proposed models, including the implementation of differential privacy on the model's outputs, the incorporation of a regularization term into the loss function, and the adoption of a bounded global sensitivity for multi-label image recognition.
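The abstract does not spell out the concrete mechanism, so the following is only a minimal sketch of the general idea of perturbing model outputs under a bounded global sensitivity: clip the per-label scores so the sensitivity is bounded, then add Laplace noise calibrated to that bound. The function name, the clipping bound, and the choice of the Laplace mechanism are illustrative assumptions, not the construction used in P2-ML-GCN.

import numpy as np

def dp_perturb_outputs(logits, clip_bound=1.0, epsilon=1.0, rng=None):
    # Illustrative output perturbation: clipping each score to
    # [-clip_bound, clip_bound] bounds how much any single score can vary,
    # and Laplace noise with scale 2 * clip_bound / epsilon perturbs the
    # released outputs. Generic sketch, not the exact P2-ML-GCN mechanism.
    rng = np.random.default_rng() if rng is None else rng
    clipped = np.clip(logits, -clip_bound, clip_bound)
    scale = 2.0 * clip_bound / epsilon
    return clipped + rng.laplace(loc=0.0, scale=scale, size=clipped.shape)

# Example: perturb the multi-label scores of one image before releasing them.
noisy_scores = dp_perturb_outputs(np.array([0.7, -2.3, 1.4]), epsilon=0.5)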


2022, Vol 12 (2), pp. 842
Author(s): Junxin Huang, Yuchuan Luo, Ming Xu, Bowen Hu, Jian Long

Online ride-hailing (ORH) services allow people to enjoy on-demand transportation through their mobile devices with a short response time. Despite the great convenience, users need to submit their location information to the ORH service provider, which may incur unexpected privacy problems. In this paper, we mainly study the privacy and utility of ride-sharing systems, which enable multiple riders to share one driver. To solve the privacy problem and reduce the detour waste of ride-sharing, we propose a privacy-preserving ride-sharing system named pShare. To hide users' precise locations from the service provider, we apply a zone-based travel time estimation approach to compute privately over sensitive data while cloaking each rider's location within a zone area. To compute the matching results along with the least-detouring route, the service provider first computes the shortest path for each eligible rider combination, then compares the additional traveling time (ATT) of all combinations, and finally selects the combination with the minimum ATT. We design a secure comparison protocol based on garbled circuits, which enables the ORH server to execute the protocol with a crypto server without privacy leakage. Moreover, we apply a data packing technique, by which multiple data items are packed into one to reduce communication and computation overhead. Through theoretical analysis and evaluation, we show that pShare is a practical ride-sharing scheme that finds sharing riders with minimum ATT at acceptable accuracy while protecting users' privacy.
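For intuition, the ATT selection step can be written down in the clear, ignoring the zone cloaking and the garbled-circuit execution described above. The pairwise matching, the travel_time helper, and the way ATT is derived from solo trip times below are assumptions made only for illustration, not pShare's actual definitions.

from itertools import combinations

def min_att_combination(riders, travel_time, solo_times):
    # Plaintext sketch of the selection step: for every candidate pair of
    # riders, take the shared-route travel time minus the longer solo trip
    # as the additional traveling time (ATT), and keep the pair with the
    # minimum ATT. pShare performs this comparison obliviously via garbled
    # circuits between the ORH server and a crypto server.
    best_pair, best_att = None, float("inf")
    for a, b in combinations(riders, 2):
        att = travel_time(a, b) - max(solo_times[a], solo_times[b])
        if att < best_att:
            best_pair, best_att = (a, b), att
    return best_pair, best_att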


Entropy, 2022, Vol 24 (1), pp. 110
Author(s): Onur Günlü

The problem of reliable function computation is extended by imposing privacy, secrecy, and storage constraints on a remote source whose noisy measurements are observed by multiple parties. The main additions to the classic function computation problem include (1) privacy leakage to an eavesdropper is measured with respect to the remote source rather than the transmitting terminals’ observed sequences; (2) the information leakage to a fusion center with respect to the remote source is considered a new privacy leakage metric; (3) the function computed is allowed to be a distorted version of the target function, which allows the storage rate to be reduced compared to a reliable function computation scenario, in addition to reducing secrecy and privacy leakages; (4) two transmitting node observations are used to compute a function. Inner and outer bounds on the rate regions are derived for lossless and lossy single-function computation with two transmitting nodes, which recover previous results in the literature. For special cases, including invertible and partially invertible functions, and degraded measurement channels, exact lossless and lossy rate regions are characterized, and one exact region is evaluated as an example scenario.
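The distinction in item (1) is that leakage is measured against the remote source rather than against a terminal's observation; in this literature such constraints are commonly written as mutual information rates. The notation below (remote source X^n, observed sequence \widetilde{X}_j^n, message W_j, eavesdropper observation Z^n, threshold \varepsilon) is assumed for illustration and is not taken from the paper:

\[
\frac{1}{n} I\bigl(X^{n}; W_{j} \mid Z^{n}\bigr) \le \varepsilon
\quad \text{(leakage w.r.t. the remote source)}
\qquad \text{vs.} \qquad
\frac{1}{n} I\bigl(\widetilde{X}_{j}^{n}; W_{j} \mid Z^{n}\bigr) \le \varepsilon
\quad \text{(leakage w.r.t. the observed sequence)}
\]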


2022, Vol 2022, pp. 1-18
Author(s): Zhejian Zhang

As one of the cores of data analysis in large social networks, community detection has become a hot research topic in recent years. However, users' real social relationships may be at risk of privacy leakage and threatened by inference attacks because the server is only semi-trusted. As a result, community detection in social graphs under local differential privacy has gradually attracted interest from industry and academia. On the one hand, the distortion of users' real data caused by existing privacy-preserving mechanisms can seriously affect the mining of densely connected local graph structures, resulting in low utility of the final community division. On the other hand, private community detection requires multiple rounds of user-server interaction to adjust each user's partition, which inevitably leads to excessive allocation of the privacy budget and large errors in the perturbed data. For these reasons, a new community detection method based on the local differential privacy model (named LDPCD) is proposed in this paper. The introduction of a truncated Laplace mechanism improves the accuracy of the perturbed user data. In addition, the divisive community detection algorithm based on extremal optimization (EO) is refined to reduce the number of interactions between users and the server. Thus, the total privacy overhead is reduced and strong privacy protection is guaranteed. Finally, LDPCD is applied to two commonly used real-world datasets, and its advantages are experimentally validated against two state-of-the-art methods.
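The abstract names a truncated Laplace mechanism; one generic way to realize truncated Laplace noise is rejection sampling on a bounded interval, sketched below. The scale, truncation bound, and privacy-budget allocation used by LDPCD are the paper's own and are not reproduced here; all parameter values in this sketch are placeholders.

import numpy as np

def truncated_laplace_noise(scale, bound, size=1, rng=None):
    # Draw Laplace(0, scale) samples and keep only those inside
    # [-bound, bound], i.e. sample from the (renormalized) truncated
    # Laplace distribution. Generic sketch, not LDPCD's exact mechanism.
    rng = np.random.default_rng() if rng is None else rng
    out = np.empty(size)
    filled = 0
    while filled < size:
        draw = rng.laplace(0.0, scale, size - filled)
        keep = draw[np.abs(draw) <= bound]
        out[filled:filled + keep.size] = keep
        filled += keep.size
    return out

# Each user perturbs a locally held value before reporting it to the server.
noisy_value = 12 + truncated_laplace_noise(scale=2.0, bound=6.0)[0]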


Author(s): Chengwen Luo, Zhongru Yang, Xingyu Feng, Jin Zhang, Hong Jia, ...

Face recognition (FR) is widely used in many areas nowadays. However, existing mainstream vision-based facial recognition has limitations such as vulnerability to spoofing attacks, sensitivity to lighting conditions, and a high risk of privacy leakage. To address these problems, in this paper we take a radically different approach and propose RFaceID, a novel RFID-based face recognition system. RFaceID only requires users to shake their faces in front of an RFID tag matrix for a few seconds to have their faces recognized. Through theoretical analysis and experimental validation, we study the feasibility of RFID-based face recognition. Multiple data processing and data augmentation techniques are proposed to minimize the negative impact of environmental noise and user dynamics. A deep neural network (DNN) model is designed to characterize both the spatial and temporal features of face-shaking events. We implement the system, and extensive evaluation shows that RFaceID achieves a face recognition accuracy of 93.1% for 100 users, demonstrating its potential for future facial recognition applications.
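The abstract only states that the DNN captures both spatial (tag-matrix) and temporal (shaking-sequence) structure; a common pattern for that combination is a small CNN followed by an LSTM, sketched below in PyTorch. The layer sizes, the 5x5 tag-matrix shape, and the 100-class head are placeholder assumptions, not the architecture reported in the paper.

import torch
import torch.nn as nn

class SpatioTemporalNet(nn.Module):
    # Generic CNN + LSTM sketch for sequences of tag-matrix readings of
    # shape (batch, time, rows, cols); not the exact RFaceID model.
    def __init__(self, rows=5, cols=5, num_users=100):
        super().__init__()
        self.cnn = nn.Sequential(                      # spatial features per frame
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        self.lstm = nn.LSTM(32 * rows * cols, 64, batch_first=True)   # temporal features
        self.head = nn.Linear(64, num_users)           # per-user logits

    def forward(self, x):                              # x: (B, T, rows, cols)
        b, t, r, c = x.shape
        frames = self.cnn(x.reshape(b * t, 1, r, c)).reshape(b, t, -1)
        _, (h, _) = self.lstm(frames)
        return self.head(h[-1])

logits = SpatioTemporalNet()(torch.randn(2, 40, 5, 5))  # 2 samples, 40 time steps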


Electronics, 2021, Vol 11 (1), pp. 25
Author(s): Jaehun Park, Kwangsu Kim

Face recognition, including emotion classification and face attribute classification, has seen tremendous progress during the last decade owing to the use of deep learning. Large-scale data collected from numerous users have been the driving force behind this growth. However, face images reveal their owners' identities and can cause severe privacy leakage if linked to other sensitive biometric information. The discrete cosine transform (DCT) coefficient cutting method (DCC) proposed in this study combines DCT and pixelization to protect image privacy. However, privacy is subjective, and it is not guaranteed that the transformed image preserves it. To verify this, a user study was conducted on whether DCC really preserves privacy, and convolutional neural networks were trained for face recognition and face attribute classification tasks. Our survey and experiments demonstrate that a face recognition deep learning model can be trained with images that most people consider privacy-preserving, at a manageable cost in classification accuracy.
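The two building blocks named in the abstract, DCT coefficient cutting and pixelization, can be illustrated roughly as follows: keep only a low-frequency block of the image's 2-D DCT coefficients, reconstruct, then block-average the result. The keep and block parameters and the use of scipy.fft are assumptions for illustration; the cut-off strategy in the DCC method itself may differ.

import numpy as np
from scipy.fft import dctn, idctn

def dcc_like_transform(img, keep=16, block=8):
    # Rough sketch: zero all but the top-left keep x keep (low-frequency)
    # DCT coefficients ("cutting"), invert the transform, then pixelize by
    # averaging over block x block tiles. Illustrative, not the exact DCC.
    coeffs = dctn(img, norm="ortho")
    cut = np.zeros_like(coeffs)
    cut[:keep, :keep] = coeffs[:keep, :keep]
    recon = idctn(cut, norm="ortho")
    h = (recon.shape[0] // block) * block
    w = (recon.shape[1] // block) * block
    tiles = recon[:h, :w].reshape(h // block, block, w // block, block)
    return tiles.mean(axis=(1, 3)).repeat(block, axis=0).repeat(block, axis=1)

protected = dcc_like_transform(np.random.rand(64, 64))  # stand-in for a face image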


2021
Author(s): Pranav Kotak, Shweta Bhandari, Akka Zemmari, Jaykrishna Joshi

2021, Vol 13 (24), pp. 13713
Author(s): Xuesong Gao, Hui Wang, Lun Liu

People's movement traces harvested from mobile phone signals have become an important new data source for studying human behavior and related socioeconomic topics in social science. With growing concern about privacy leakage in big data, mobile phone data holders now tend to provide aggregate-level mobility data instead of individual-level data. However, most algorithms for measuring mobility are based on individual-level data, and how existing mobility algorithms can be properly transformed to apply to aggregate-level data remains largely unexplored. This paper explores the transformation of mobility metrics based on individual data to fit grid-aggregate data. Fifteen candidate metrics measuring five indicators of mobility are proposed, and the most suitable one for each indicator is selected. Future research on aggregate-level mobility data may refer to our analysis when selecting suitable mobility metrics.
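As a toy illustration of the kind of transformation studied here, one individual-level metric, the radius of gyration, can be recast for grid-aggregate data by weighting each grid-cell centre by its observation count. The fifteen candidate metrics are defined by the authors; the function below is only an assumed example of moving from individual points to cells.

import numpy as np

def grid_radius_of_gyration(counts, cell_centers):
    # counts: observations per grid cell; cell_centers: (n_cells, 2) coordinates.
    # Weight each cell centre by its count instead of using individual points.
    counts = np.asarray(counts, dtype=float)
    centers = np.asarray(cell_centers, dtype=float)
    centroid = (counts[:, None] * centers).sum(axis=0) / counts.sum()
    sq_dist = ((centers - centroid) ** 2).sum(axis=1)
    return np.sqrt((counts * sq_dist).sum() / counts.sum())

rg = grid_radius_of_gyration([120, 40, 8], [[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])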


2021
Author(s): Guannan Hu, Kensuke Fukuda
