QFlow: Quantitative Information Flow for Security-Aware Hardware Design in Verilog

Author(s):  
Lennart M. Reimann ◽  
Luca Hanel ◽  
Dominik Sisejkovic ◽  
Farhad Merchant ◽  
Rainer Leupers

2012 ◽  
Vol 37 (6) ◽  
pp. 1-5 ◽  
Author(s):  
Quoc-Sang Phan ◽  
Pasquale Malacaria ◽  
Oksana Tkachuk ◽  
Corina S. Păsăreanu

2015 ◽  
Vol 6 (2) ◽  
pp. 23-46
Author(s):  
Tom Chothia ◽  
Chris Novakovic ◽  
Rajiv Ranjan Singh

This paper presents a framework for calculating measures of data integrity for programs in a small imperative language. The authors develop a Markov chain semantics for their language which calculates Clarkson and Schneider's definitions of data contamination, data suppression, program suppression and program transmission. The authors then propose their own definition of program integrity for probabilistic specifications. These definitions are based on conditional mutual information and entropy; they present a result relating them to mutual information, which can be calculated by a number of existing tools. The authors extend a quantitative information flow tool (CH-IMP) to calculate these measures of integrity and demonstrate this tool with examples including error correcting codes, the Dining Cryptographers protocol and the attempts by a number of banks to influence the Libor rate.
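The integrity and leakage measures described above reduce to (conditional) mutual information, which can be computed directly from a joint distribution over secrets and observables. As an illustration only (this is not CH-IMP itself, and the distributions are made up), a minimal sketch of that computation:

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) in bits, given a joint distribution {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(
        p * log2(p / (px[x] * py[y]))
        for (x, y), p in joint.items()
        if p > 0
    )

# Hypothetical 1-bit secret observed exactly: leakage is 1 bit.
noiseless = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(noiseless))  # 1.0

# Output independent of the secret: leakage is 0 bits.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(mutual_information(independent))  # 0.0
```

Tools in this space enumerate (or sample) program executions to estimate such a joint distribution, then evaluate the information-theoretic measure over it.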


2014 ◽  
Vol 25 (2) ◽  
pp. 457-479 ◽  
Author(s):  
MICHAEL BACKES ◽  
BORIS KÖPF

We provide a novel definition of quantitative information flow, called transmissible information, that is suitable for reasoning about information-theoretically secure (or non-cryptographic) systems, as well as about cryptographic systems with their polynomially bounded adversaries, error probabilities, etc. Transmissible information captures deliberate communication between two processes, and it safely over-approximates the quantity of information that a process unintentionally leaks to another process.

We show that transmissible information is preserved under universal composability, which constitutes the prevalent cryptographic notion of a secure implementation. This result enables us to lift quantitative bounds on transmissible information from simple ideal functionalities of cryptographic tasks to actual cryptographic systems.

We furthermore prove a connection between transmissible information in the unconditional setting and channel capacity, based on the weak converse of Shannon's coding theorem. This connection enables us to compute an upper bound on the transmissible information for a restricted class of protocols, using existing techniques from quantitative information flow.
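The channel-capacity connection bounds how much information any protocol in the restricted class can transmit per use of the channel. As a standalone illustration of that quantity (not the paper's own construction), a sketch for the textbook binary symmetric channel, whose capacity is C = 1 - H(p):

```python
from math import log2

def binary_entropy(p):
    """H(p) in bits; H(0) = H(1) = 0 by the usual convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(flip_prob):
    """Capacity of a binary symmetric channel with crossover
    probability flip_prob: C = 1 - H(flip_prob) bits per use."""
    return 1.0 - binary_entropy(flip_prob)

print(bsc_capacity(0.0))  # 1.0  (noiseless: one full bit per use)
print(bsc_capacity(0.5))  # 0.0  (pure noise: nothing transmissible)
```

By the weak converse of Shannon's coding theorem, no scheme can reliably push more than C bits per channel use, which is what lets capacity serve as an upper bound on transmissible information.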

