Encrypted Databases for Differential Privacy

2019 ◽  
Vol 2019 (3) ◽  
pp. 170-190
Author(s):  
Archita Agarwal ◽  
Maurice Herlihy ◽  
Seny Kamara ◽  
Tarik Moataz

The problem of privatizing statistical databases is a well-studied topic that has culminated with the notion of differential privacy. The complementary problem of securing these differentially private databases, however, has—as far as we know—not been considered in the past. While the security of private databases is in theory orthogonal to the problem of private statistical analysis (e.g., in the central model of differential privacy the curator is trusted) the recent real-world deployments of differentially-private systems suggest that it will become a problem of increasing importance. In this work, we consider the problem of designing encrypted databases (EDB) that support differentially-private statistical queries. More precisely, these EDBs should support a set of encrypted operations with which a curator can securely query and manage its data, and a set of private operations with which an analyst can privately analyze the data. Using such an EDB, a curator can securely outsource its database to an untrusted server (e.g., on-premise or in the cloud) while still allowing an analyst to privately query it. We show how to design an EDB that supports private histogram queries. As a building block, we introduce a differentially-private encrypted counter based on the binary mechanism of Chan et al. (ICALP, 2010). We then carefully combine multiple instances of this counter with a standard encrypted database scheme to support differentially-private histogram queries.
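To make the building block concrete, here is a minimal Python sketch of the plaintext binary-mechanism counter of Chan et al. described above; the class and helper names are ours, and the paper's actual contribution of maintaining these partial sums under encryption on the untrusted server is deliberately omitted.

```python
import math
import random


def laplace(scale: float) -> float:
    """Sample from Laplace(0, scale) by inverting the CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)


class BinaryMechanismCounter:
    """eps-differentially-private running counter for a stream of at most
    T binary updates, following the binary mechanism of Chan et al.
    (ICALP, 2010): the stream is covered by dyadic partial sums (p-sums),
    each update touches O(log T) of them, and each p-sum is released with
    independent Laplace(log T / eps) noise."""

    def __init__(self, T: int, eps: float):
        self.levels = max(1, T.bit_length())   # number of dyadic levels
        self.scale = self.levels / eps         # Laplace scale per p-sum
        self.t = 0
        self.psum = [0.0] * self.levels        # exact p-sums, one per level
        self.noisy = [0.0] * self.levels       # their noised counterparts

    def update(self, x: int) -> float:
        """Ingest one bit x in {0, 1} and return the current noisy count."""
        self.t += 1
        i = (self.t & -self.t).bit_length() - 1    # lowest set bit of t
        # The new level-i p-sum absorbs all lower-level p-sums plus x.
        self.psum[i] = sum(self.psum[:i]) + x
        for j in range(i):
            self.psum[j] = self.noisy[j] = 0.0
        self.noisy[i] = self.psum[i] + laplace(self.scale)
        # The running count is the sum of the noisy p-sums sitting on
        # the set bits of t's binary representation.
        return sum(self.noisy[j] for j in range(self.levels)
                   if (self.t >> j) & 1)
```

A curator would keep one such counter per histogram bin; in the paper's construction the p-sums additionally live inside an encrypted database scheme, which this plain sketch does not model.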

Author(s):  
Mengdi Huai ◽  
Di Wang ◽  
Chenglin Miao ◽  
Jinhui Xu ◽  
Aidong Zhang

Although releasing crowdsourced data brings many benefits to data analyzers conducting statistical analysis, it may violate crowd users' data privacy. A potential way to address this problem is to employ traditional differential privacy (DP) mechanisms and perturb the data with noise before releasing them. However, because crowdsourced data usually contain conflicts and are large in volume, directly applying these mechanisms cannot guarantee good utility in the setting of releasing crowdsourced data. To address this challenge, in this paper we propose a novel privacy-aware synthesizing method (PrisCrowd) for crowdsourced data, with which the data collector can release users' data under strong privacy protection for their private information while the data analyzer can still achieve good utility from the released data. Both theoretical analysis and extensive experiments on real-world datasets demonstrate the desired performance of the proposed method.
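For reference, the traditional DP baseline the abstract alludes to is typically the Laplace mechanism: add noise calibrated to the query's sensitivity before release. A minimal sketch follows; the function name and parameters are illustrative, and the abstract does not specify PrisCrowd's own algorithm.

```python
import math
import random


def laplace_release(value: float, sensitivity: float, eps: float) -> float:
    """Release a statistic under eps-DP by adding Laplace(sensitivity/eps)
    noise: the classical baseline whose utility, the paper argues, degrades
    on large and mutually conflicting crowdsourced data."""
    scale = sensitivity / eps
    u = random.random() - 0.5
    return value - scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)


# e.g. releasing a count over crowdsourced reports (sensitivity 1):
# noisy_count = laplace_release(42.0, sensitivity=1.0, eps=0.5)
```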


2020 ◽  
Vol 16 (4) ◽  
pp. 291-300
Author(s):  
Zhenyu Gao ◽  
Yixing Li ◽  
Zhengxin Wang

The recently concluded 2019 World Swimming Championships was another major swimming competition that witnessed great progress by human athletes in many events. However, some world records set 10 years ago, back in the era of high-tech swimsuits, remain untouched. Given the advancements in technical skills and training methods over the past decade, the inability to break those world records is a strong indication that records carrying the swimsuit bonus cannot reflect the real progression achieved by human athletes. Many swimming professionals and enthusiasts are eager to know what the real world records would have been had high-tech swimsuits never been allowed. This paper attempts to restore the real world records in Men's swimming without high-tech swimsuits by integrating various advanced methods in probabilistic modeling and optimization. Through the modeling and separation of swimsuit bias, natural improvement, and athletes' intrinsic performance, this paper provides optimal estimates and 95% confidence intervals for the real world records. The proposed methodology can also be applied to a variety of similar studies with multi-factor considerations.
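As a toy illustration of the kind of decomposition described (not the paper's actual model), the sketch below fits a linear improvement trend plus an additive suit-era bonus to synthetic record times and bootstraps a 95% confidence interval for the bonus; all data and coefficients here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2000, 2020)
suit = ((years >= 2008) & (years <= 2009)).astype(float)  # suit-era indicator
# Synthetic record times: baseline, steady natural improvement, suit bonus.
time = 48.5 - 0.05 * (years - 2000) - 0.8 * suit + rng.normal(0, 0.1, 20)

X = np.column_stack([np.ones(20), years - 2000, suit])
coef, *_ = np.linalg.lstsq(X, time, rcond=None)  # [baseline, trend, bonus]

# Residual bootstrap of the suit-bonus coefficient, for a 95% CI.
fitted, resid = X @ coef, time - X @ coef
bonus = [np.linalg.lstsq(X, fitted + rng.choice(resid, 20), rcond=None)[0][2]
         for _ in range(2000)]
lo, hi = np.percentile(bonus, [2.5, 97.5])
print(f"suit bonus {coef[2]:.2f}s, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Subtracting the fitted bonus from suit-era times then gives a de-biased estimate of what the record would have been without the suits.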


Micromachines ◽  
2021 ◽  
Vol 12 (2) ◽  
pp. 118
Author(s):  
Jean-Laurent Pouchairet ◽  
Carole Rossi

For the past two decades, many research groups have investigated new methods for reducing the size and cost of safe and arm-fire systems, while also improving their safety and reliability, through batch processing. Simultaneously, micro- and nanotechnology advancements regarding nanothermite materials have enabled the production of a key technological building block: pyrotechnical microsystems (pyroMEMS). This building block simply consists of microscale electric initiators with a thin thermite layer as the ignition charge. This microscale to millimeter-scale addressable pyroMEMS enables the integration of intelligence into centimeter-scale pyrotechnical systems. To illustrate this technological evolution, we hereby present the development of a smart infrared (IR) electronically controllable flare consisting of three distinct components: (1) a controllable pyrotechnical ejection block comprising three independently addressable small-scale propellers, all integrated into a one-piece molded and interconnected device, (2) a terminal function block comprising a structured IR pyrotechnical loaf coupled with a microinitiation stage integrating low-energy addressable pyroMEMS, and (3) a connected, autonomous, STANAG 4187 compliant, electronic sensor arming and firing block.


2021 ◽  
Author(s):  
Tomas Rosén ◽  
Ruifu Wang ◽  
HongRui He ◽  
Chengbo Zhan ◽  
Shirish Chodankar ◽  
...  

During the past decade, cellulose nanofibrils (CNFs) have shown tremendous potential as a building block to fabricate new advanced materials that are both biocompatible and biodegradable. The excellent mechanical properties...


2002 ◽  
Vol 2 (4-5) ◽  
pp. 423-424 ◽  
Author(s):  
MAURICE BRUYNOOGHE ◽  
KUNG-KIU LAU

This special issue marks the tenth anniversary of the LOPSTR workshop. LOPSTR started in 1991 as a workshop on Logic Program Synthesis and Transformation, but later it broadened its scope to logic-based Program Development in general. The motivating force behind LOPSTR has been a belief that declarative paradigms such as logic programming are better suited to program development tasks than traditional non-declarative ones such as the imperative paradigm. Specification, synthesis, transformation or specialisation, analysis, verification and debugging can all be given logical foundations, thus providing a unifying framework for the whole development process. In the past ten years or so, such a theoretical framework has indeed begun to emerge. Tools have even been implemented for analysis, verification and specialisation. However, it is fair to say that so far the focus has largely been on programming-in-the-small. So the future challenge is to apply or extend these techniques to programming-in-the-large, in order to tackle software engineering in the real world.


1976 ◽  
Vol 50 (4) ◽  
pp. 503-513 ◽  
Author(s):  
Robert Craig West

Students of the origins and accomplishments of government regulation of economic activity have often suspected that the laws on which regulation is based were addressed to problems and conditions of the past that no longer prevailed or, worse, rested on assumptions about the "real world" that were highly unrealistic. This is Professor West's main conclusion about the Federal Reserve Act of 1913, especially as regards its discount rate and international exchange policies.


2001 ◽  
Vol 14 ◽  
pp. 56-57 ◽  
Author(s):  
Bettina Bergmann

We have reached an important moment in the study of the Roman house. The past 20 years have been extremely active, with scholars approaching domestic space down different disciplinary and methodological avenues. Since the important essay on Campanian houses by A. Wallace-Hadrill in 1988, new excavations and scores of books and articles have changed the picture of Pompeii and, with it, that of the Roman house. Theoretical archaeologists have taken the lead, approaching Pompeii as an "archaeological laboratory" in which, armed with the interpretative tools of spatial and statistical analysis, they attempt to recover ancient behavioral patterns. The interdisciplinary picture that emerges is complex and inevitably contradictory. There is so much new information and such a tangle of perspectives that it is time to consider what we have learned and what kinds of interpretative tools we might best employ. Without doubt this is an exciting time in Roman studies. But two overviews of recent scholarship to appear this year, the present one by R. Tybout and another by P. Allison (AJA 105.2 [2001]), express considerable frustration and resort to ad hominem recriminations that signal a heated backlash, at least among some.


2021 ◽  
pp. 026638212110619
Author(s):  
Sharon Richardson

During the past two decades, there have been a number of breakthroughs in the fields of data science and artificial intelligence, made possible by advanced machine learning algorithms trained through access to massive volumes of data. However, their adoption and use in real-world applications remains a challenge. This paper posits that a key limitation in making AI applicable has been a failure to modernise the theoretical frameworks needed to evaluate and adopt outcomes. Such a need was anticipated with the arrival of the digital computer in the 1950s but has remained unrealised. This paper reviews how the field of data science emerged and led to rapid breakthroughs in algorithms underpinning research into artificial intelligence. It then discusses the contextual framework now needed to advance the use of AI in real-world decisions that impact human lives and livelihoods.


Author(s):  
Darrel Moellendorf

This chapter notes that normative International Political Theory (IPT) developed over the past several decades in response to political, social, and economic events. These included the globalization of trade and finance, the increasing credibility of human-rights norms in foreign policy, and a growing awareness of a global ecological crisis. The emergence of normative IPT was not simply an effort to understand these events, but an attempt to offer accounts of what the responses to them should be. Normative IPT, then, was originally doubly responsive to the real world. Additionally, this chapter argues that there is a plausible account of global egalitarianism, which takes the justification of principles of egalitarian justice to depend crucially on features of the social and economic world. The account of global egalitarianism applies to the current circumstances in part because of features of those circumstances.


1941 ◽  
Vol 1 (1) ◽  
pp. 26-41 ◽  
Author(s):  
Simon Kuznets

This paper deals with the relation between statistical analysis as applied in economic inquiry and history as written or interpreted by economic historians. Although both these branches of economic study derive from the same body of raw materials of inquiry—the recordable past and present of economic society—each has developed in comparative isolation from the other. Statistical economists have failed to utilize adequately the contributions that economic historians have made to our knowledge of the past; and historians have rarely employed either the analytical tools or the basic theoretical hypotheses of statistical research. It is the thesis of this essay that such failure to effect a close interrelation between historical approach and statistical analysis needs to be corrected in the light of the final goal of economic study.

