Bloom Filters in Probabilistic Verification

Author(s): Peter C. Dillinger, Panagiotis Manolios

2018, pp. 47-53
Author(s): B. Z. Shmeylin, E. A. Alekseeva

This paper addresses directory management in cache-coherence systems for multiprocessor systems with a large number of processors (MSLP). In such systems, maintaining the coherence of processor caches becomes significantly harder because of increased traffic on the memory buses and the growing complexity of interprocessor communication, and the problem has been attacked in various ways. Here we propose the use of Bloom filters, which accelerate testing whether an element belongs to a set. In this work, such filters are used to establish whether a processor belongs to a given subset of processors and whether a processor holds a particular cache line. The paper discusses in detail the processes of writing and reading data shared between processors, as well as the replacement of data from private caches, and shows how cache-line addresses and processor numbers are removed from the Bloom filters. The proposed system significantly speeds up the operations that maintain cache coherence in the MSLP compared with conventional systems. In terms of performance and additional hardware and software costs, it is not inferior to the most efficient comparable systems, and on some applications it significantly outperforms them.
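The abstract does not specify the exact filter construction, but since it describes both inserting and removing processor numbers and cache-line addresses, a counting Bloom filter is a natural fit for such deletions. The following Python sketch is purely illustrative and not the authors' design: the counter-array size, the number of hash functions, and the idea of one filter per directory entry holding processor identifiers are all assumptions.

# Illustrative counting Bloom filter for one directory entry (a sketch only;
# the paper's actual hashing scheme and sizing are not given in the abstract).
import hashlib

class CountingBloomFilter:
    def __init__(self, num_counters=64, num_hashes=3):
        self.counters = [0] * num_counters
        self.num_hashes = num_hashes

    def _indexes(self, item):
        # Derive k counter indexes from salted hashes of the item.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % len(self.counters)

    def add(self, item):
        # Record, e.g., that processor `item` now holds this cache line.
        for idx in self._indexes(item):
            self.counters[idx] += 1

    def remove(self, item):
        # The line was evicted from processor `item`'s private cache.
        for idx in self._indexes(item):
            if self.counters[idx] > 0:
                self.counters[idx] -= 1

    def might_contain(self, item):
        # No false negatives; a false positive only costs an extra message.
        return all(self.counters[idx] > 0 for idx in self._indexes(item))

# Hypothetical usage: one filter per directory entry, processor IDs as elements.
entry = CountingBloomFilter()
entry.add(5)                      # processor 5 reads the shared line
entry.add(17)                     # processor 17 reads it too
print(entry.might_contain(5))     # True
entry.remove(5)                   # processor 5 evicts the line
print(entry.might_contain(5))     # likely False (may stay True on a collision)

In such a scheme a false positive only causes an unnecessary invalidation message to a processor that no longer holds the line; the filter never reports a false negative, so the correctness of the coherence protocol is preserved.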


2006, Vol. 39 (21), pp. 310-315
Author(s): Maciej Wołowiec, Jakub Botwicz, Piotr Sapiecha

Electronics, 2021, Vol. 10 (12), pp. 1455
Author(s): Rafael Genés-Durán, Juan Hernández-Serrano, Oscar Esparza, Marta Bellés-Muñoz, José Luis Muñoz-Tapia

Distrust between data providers and data consumers is one of the main obstacles hampering the take-off of digital-data commerce. Data providers want to get paid for what they offer, while data consumers want to know exactly what they are paying for before actually paying for it. In this article, we present a protocol that overcomes this obstacle by building trust on two main ideas: first, a probabilistic verification protocol in which random samples of the real dataset are shown to buyers so that they can assess the data before committing to any payment; and second, a protected payment process enforced with smart contracts on a public blockchain, which releases payment if and only if the provided data meet the agreed terms and otherwise refunds honest players.
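As an illustration of the sampling idea only, the Python sketch below uses hypothetical names and does not reproduce the article's actual sampling rules, commitments, or smart-contract logic. It assumes the seller has committed to per-record digests; the buyer then checks a few random records against those digests and estimates how many samples are needed to detect a given fraction of corrupted records with a chosen confidence.

# A minimal sketch of sample-based (probabilistic) verification, assuming the
# buyer holds per-record digests committed to by the seller.
import hashlib, math, random

def sample_check(records, committed_digests, num_samples, rng=random):
    # Inspect num_samples random records (as bytes) against the commitment.
    indexes = rng.sample(range(len(records)), num_samples)
    return all(
        hashlib.sha256(records[i]).hexdigest() == committed_digests[i]
        for i in indexes
    )

def samples_needed(bad_fraction, confidence):
    # If a fraction `bad_fraction` of records is corrupted, one random sample
    # passes with probability (1 - bad_fraction), and k independent samples
    # all pass with probability (1 - bad_fraction)**k.  Choose k so that this
    # falls below 1 - confidence.
    return math.ceil(math.log(1 - confidence) / math.log(1 - bad_fraction))

# Hypothetical usage with a toy dataset and an honest seller:
records = [f"row-{i}".encode() for i in range(1000)]
digests = [hashlib.sha256(r).hexdigest() for r in records]
print(samples_needed(0.05, 0.99))            # about 90 samples
print(sample_check(records, digests, 90))    # True when nothing is corrupted

Because each additional sample multiplies a cheating seller's chance of going undetected by (1 - bad_fraction), the number of samples the buyer needs grows only logarithmically with the desired confidence.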


2018, Vol. 15 (10), pp. 117-128
Author(s): Jinyuan Zhao, Zhigang Hu, Bing Xiong, Keqin Li
