A Codec Architecture for the Compression of Short Data Blocks

2017 ◽  
Vol 27 (02) ◽  
pp. 1850019 ◽  
Author(s):  
Jürgen Freudenberger ◽  
Mohammed Rajab ◽  
Daniel Rohweder ◽  
Malek Safieh

This work proposes a lossless data compression algorithm for short data blocks. The proposed compression scheme combines a modified move-to-front algorithm with Huffman coding. This algorithm is applicable in storage systems where data compression is performed at the block level with short block sizes, in particular in non-volatile memories. For block sizes in the range of 1 kB, it provides a compression gain comparable to the Lempel–Ziv–Welch algorithm. Moreover, encoder and decoder architectures are proposed that have low memory requirements and provide fast data encoding and decoding.
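
The move-to-front stage is the part of the scheme that can be sketched compactly. The following Python sketch shows a plain (unmodified) MTF transform over a byte alphabet; the paper's modifications and the subsequent Huffman stage are not reproduced, so the recency-list handling below is an illustrative assumption, not the proposed architecture:

def mtf_encode(data: bytes) -> list[int]:
    """Replace each byte by its index in a recency list, then move it to front."""
    alphabet = list(range(256))          # recency list, most recent symbol at index 0
    out = []
    for b in data:
        idx = alphabet.index(b)          # position of the symbol in the recency list
        out.append(idx)                  # recently seen symbols get small indices
        alphabet.pop(idx)
        alphabet.insert(0, b)            # move the symbol to the front
    return out

def mtf_decode(indices: list[int]) -> bytes:
    """Invert the transform by replaying the same recency-list updates."""
    alphabet = list(range(256))
    out = bytearray()
    for idx in indices:
        b = alphabet.pop(idx)
        out.append(b)
        alphabet.insert(0, b)
    return bytes(out)

Because recently seen bytes map to small indices, the MTF output of typical block data is skewed toward small values, which a subsequent static Huffman code can encode with short codewords.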

2013 ◽  
Vol 842 ◽  
pp. 712-716
Author(s):  
Qi Hong ◽  
Xiao Lei Lu

As a lossless data compression method, Huffman coding is widely used in text compression. Nevertheless, the traditional approach has some deficiencies. For example, compressing all characters uniformly overlooks the particularity of keywords and special statements, as well as the regularity of some statements. To address this, a new data compression algorithm based on semantic analysis is proposed in this paper. The new method, which takes C language keywords as the basic elements, is designed for the text compression of C language source files. Experimental results show that this approach improves the compression ratio by roughly 150 percent. The method can be extended to the text compression of other constrained languages.
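
A minimal Python sketch of the tokenization idea, not the paper's implementation: whole C keywords are treated as single source symbols, so a subsequent Huffman coder can assign each frequent keyword one short codeword instead of coding it character by character. The keyword subset below is illustrative.

import re
from collections import Counter

C_KEYWORDS = ["unsigned", "return", "switch", "while", "void", "int", "for", "if"]
pattern = re.compile(r"\b(?:" + "|".join(map(re.escape, C_KEYWORDS)) + r")\b")

def tokenize(source: str) -> list[str]:
    """Split C source into keyword tokens and single characters."""
    tokens, pos = [], 0
    for m in pattern.finditer(source):
        tokens.extend(source[pos:m.start()])   # non-keyword text, char by char
        tokens.append(m.group())               # a whole keyword as one symbol
        pos = m.end()
    tokens.extend(source[pos:])
    return tokens

# Symbol frequencies over tokens (keywords count as single symbols),
# ready to be fed to a standard Huffman code construction.
freqs = Counter(tokenize("for (int i = 0; i < n; i++) if (a[i]) return i;"))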


2016 ◽  
Vol 78 (6-4) ◽  
Author(s):  
Muhamad Azlan Daud ◽  
Muhammad Rezal Kamel Ariffin ◽  
S. Kularajasingam ◽  
Che Haziqah Che Hussin ◽  
Nurliyana Juhan ◽  
...  

A new compression algorithm is proposed to make a modified Baptista symmetric cryptosystem, which is based on a chaotic dynamical system, applicable in practice. The Baptista symmetric cryptosystem is able to produce various ciphers in response to the same message input. This modified Baptista-type cryptosystem suffers from message expansion, which goes against the conventional methodology of a symmetric cryptosystem. A new lossless data compression algorithm based on ideas from Huffman coding for data transmission is therefore proposed. This new compression mechanism does not face the problem of mapping elements from a domain that is much larger than its range; our new algorithm circumvents this problem via a pre-defined codeword list. The proposed algorithm has fast encoding and decoding mechanisms and is proven analytically to be a lossless data compression technique.
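
A hedged sketch of the pre-defined codeword-list idea in Python: encoder and decoder share a fixed table of prefix-free codewords, so encoding is a direct lookup rather than a mapping from a large domain into a smaller range. The table below is a toy example, not the codeword list from the paper.

CODEWORDS = {"a": "0", "b": "10", "c": "110", "d": "111"}   # pre-defined, prefix-free
DECODE = {v: k for k, v in CODEWORDS.items()}

def encode(msg: str) -> str:
    """Direct table lookup per symbol; no on-the-fly code construction."""
    return "".join(CODEWORDS[ch] for ch in msg)

def decode(bits: str) -> str:
    """Read bits until the buffer matches a codeword; prefix-freeness
    guarantees the first match is the correct one."""
    out, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in DECODE:
            out.append(DECODE[buf])
            buf = ""
    return "".join(out)

assert decode(encode("abcd")) == "abcd"   # lossless round trip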


2013 ◽  
Vol 21 (2) ◽  
pp. 133-143
Author(s):  
Hiroyuki Okazaki ◽  
Yuichi Futa ◽  
Yasunari Shidama

Summary. Huffman coding is one of the most famous entropy encoding methods for lossless data compression [16]. The JPEG and ZIP formats employ variants of Huffman encoding as lossless compression algorithms. Huffman coding is a bijective map from source letters to the leaves of the Huffman tree constructed by the algorithm. In this article we formalize an algorithm for constructing a binary code tree, the Huffman tree.
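
For reference, a minimal Python sketch of the classical Huffman tree construction that the article formalizes; this version is illustrative only and does not follow the article's formal development.

import heapq

def huffman_tree(freqs: dict[str, int]):
    """Repeatedly merge the two least frequent subtrees into one node."""
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)                      # unique tie-breaker for equal weights
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, counter, (left, right)))
        counter += 1
    return heap[0][2]

def codebook(tree, prefix="", book=None):
    """Walk the tree; a left edge emits '0', a right edge emits '1'.
    Each leaf is a source letter, so the map from letters to codewords
    (equivalently, to leaves) is a bijection."""
    if book is None:
        book = {}
    if isinstance(tree, tuple):
        codebook(tree[0], prefix + "0", book)
        codebook(tree[1], prefix + "1", book)
    else:
        book[tree] = prefix
    return book

print(codebook(huffman_tree({"a": 45, "b": 13, "c": 12, "d": 5})))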


2012 ◽  
Vol 2012 ◽  
pp. 1-20 ◽  
Author(s):  
Jonathan Gana Kolo ◽  
S. Anandan Shanmugam ◽  
David Wee Gin Lim ◽  
Li-Minn Ang ◽  
Kah Phooi Seng

Energy is an important consideration in the design and deployment of wireless sensor networks (WSNs), since sensor nodes are typically powered by batteries with limited capacity. Because the communication unit on a wireless sensor node is the major power consumer, data compression is one possible technique to reduce the amount of data exchanged between wireless sensor nodes, resulting in power savings. However, wireless sensor networks possess significant limitations in communication, processing, storage, bandwidth, and power. Thus, any data compression scheme proposed for WSNs must be lightweight. In this paper, we present an adaptive lossless data compression (ALDC) algorithm for wireless sensor networks. Our proposed ALDC scheme performs compression losslessly using multiple code options. Adaptive compression schemes allow the compression to dynamically adjust to a changing source. The data sequence to be compressed is partitioned into blocks, and the optimal compression scheme is applied to each block. Using various real-world sensor datasets, we demonstrate the merits of our proposed compression algorithm in comparison with other recently proposed lossless compression algorithms for WSNs.
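
A Python sketch of the block-wise "multiple code options" idea, not the ALDC algorithm itself: each block is trial-encoded with every available coder and the shortest output wins, with a one-byte header identifying the choice for the decoder. The two toy coders below stand in for the paper's actual code options.

def rle(block: bytes) -> bytes:
    """Toy run-length coder: (count, byte) pairs, runs capped at 255."""
    out, i = bytearray(), 0
    while i < len(block):
        j = i
        while j < len(block) and j - i < 255 and block[j] == block[i]:
            j += 1
        out += bytes([j - i, block[i]])
        i = j
    return bytes(out)

def raw(block: bytes) -> bytes:
    """Fallback coder: store the block verbatim."""
    return block

CODERS = [raw, rle]

def encode_block(block: bytes) -> bytes:
    """Trial-encode with every coder, keep the shortest, prefix its index
    so the decoder knows which option was chosen for this block."""
    candidates = [coder(block) for coder in CODERS]
    best = min(range(len(candidates)), key=lambda k: len(candidates[k]))
    return bytes([best]) + candidates[best]

Picking the option per block is what lets the scheme adapt to a changing source: smooth stretches of sensor data and noisy stretches end up with different code options.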


Author(s):  
H. Ferrada ◽  
T. Gagie ◽  
T. Hirvola ◽  
S. J. Puglisi

Advances in DNA sequencing mean that databases of thousands of human genomes will soon be commonplace. In this paper, we introduce a simple technique for reducing the size of conventional indexes on such highly repetitive texts. Given upper bounds on pattern lengths and edit distances, we pre-process the text with the lossless data compression algorithm LZ77 to obtain a filtered text, for which we store a conventional index. Later, given a query, we find all matches in the filtered text, then use their positions and the structure of the LZ77 parse to find all matches in the original text. Our experiments show that this also significantly reduces query times.
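
To make the pre-processing step concrete, here is a minimal greedy LZ77 parse in Python (quadratic time, for illustration only); the paper's filtering of the parse, which keeps enough context around phrase boundaries for the given pattern-length and edit-distance bounds, is omitted.

def lz77_parse(text: str) -> list[tuple[int, int, str]]:
    """Greedy parse into (offset, length, next_char) triples, where each
    phrase copies `length` characters from `offset` positions back."""
    phrases, i = [], 0
    while i < len(text):
        best_off, best_len = 0, 0
        for off in range(1, i + 1):          # candidate match starts at i - off
            length = 0
            while (i + length < len(text) - 1
                   and text[i - off + length] == text[i + length]):
                length += 1
            if length > best_len:
                best_off, best_len = off, length
        nxt = text[i + best_len]
        phrases.append((best_off, best_len, nxt))
        i += best_len + 1
    return phrases

print(lz77_parse("abaababa$"))

On highly repetitive collections such as databases of human genomes, almost all of the text is covered by long phrases that copy earlier occurrences, which is what makes the filtered text so much smaller than the original.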


2011 ◽  
Vol 403-408 ◽  
pp. 2441-2444
Author(s):  
Hong Zhi Lu ◽  
Xue Jun Ren

Based on the theory of the simple linear regression model, this paper designs a lossless sensor data compression algorithm built on a one-dimensional linear regression model. The algorithm computes the linear fitting values of the differences between sensor data samples and the corresponding fitting residuals, which are input to a normal-distribution entropy encoder for compression. Compared with two typical lossless compression algorithms, the proposed algorithm achieved better compression ratios.
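
A Python sketch of the modeling stage described above, under the assumption that the one-dimensional linear regression is an ordinary least-squares fit of the first differences against the sample index; the normal-distribution entropy coder for the residuals is omitted.

def fit_residuals(samples: list[float]) -> list[float]:
    """First-difference the data, fit a line over the sample index by
    least squares, and return the fitting residuals to be entropy coded."""
    diffs = [b - a for a, b in zip(samples, samples[1:])]
    n = len(diffs)
    x_mean = (n - 1) / 2
    y_mean = sum(diffs) / n
    sxy = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(diffs))
    sxx = sum((x - x_mean) ** 2 for x in range(n))
    slope = sxy / sxx if sxx else 0.0
    intercept = y_mean - slope * x_mean
    return [y - (slope * x + intercept) for x, y in enumerate(diffs)]

For slowly varying sensor signals, the residuals cluster tightly around zero, which is what makes a normal-distribution entropy coder an effective back end.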

