
2.1 Introduction

Information theory [5] and coding technology were born during the transition of communication technology from analog to digital communication [6]. In fact, the earliest Morse telegraph system was a digital communication system: it used dots, dashes and spaces to encode English letters and transmitted them as electrical current pulses over a wire. However, the telephone, which adopted analog communication, provided more convenient voice communication and soon replaced the telegraph as the mainstream communication mode. Analog communication is represented by the telephone and the AM/FM receiver. It is very simple in concept and system architecture; its core components are the modem and the amplifier. The main requirement of system design is linearity, that is, the output waveform y(t) should be a linear function of the input waveform x(t), i.e., y(t) = kx(t); otherwise distortion will occur. However, this simple design requirement is very difficult to meet. Because of the nonlinearity of the amplifier, noise interference during transmission, and the unpredictability of the waveform sent by the source, it is difficult to achieve perfect linearity when designing an analog communication system.

In contrast to analog communication systems, a digital communication system is complex in concept and system composition; both the transmitter and the receiver include many functional modules. Nevertheless, the design and implementation of a digital communication system is much easier, and its performance is far better than that of analog communication. Firstly, within a limited time interval a digital communication system sends one waveform out of a finite set of waveforms, while an analog communication system selects one of infinitely many waveforms to send. Secondly, different from analog communication systems, the task of the receiver in a digital communication system is not to accurately reproduce the transmitted waveform, but to determine which waveform was sent by the transmitter according to the received signal disturbed by noise. Therefore, a digital communication system is more concerned with the probability of decision error than with the distortion of the signal waveform, which greatly relaxes the requirements on system linearity. Thirdly, for relay communication, analog communication cannot correct the waveform distortion at an intermediate node, so distortion accumulates along the link. In a digital communication system, as long as an intermediate node can correctly determine which waveform was sent from upstream, it can regenerate the waveform and transmit it downstream, thus eliminating the accumulation of distortion. Finally, digital communication can use coding technology, such as cryptography and error correction codes, which greatly improves the security and reliability of communication.
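To make the decision-oriented view of the digital receiver concrete, the following minimal Python sketch (added here as an illustration, not taken from the text) simulates binary antipodal signaling over an additive Gaussian noise channel: the receiver does not try to reproduce the transmitted waveform, it only decides which of the two symbols was sent, and the figure of merit is the probability of decision error. The amplitude, noise level and number of symbols are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the text)
amplitude = 1.0      # antipodal symbols: +A for bit 1, -A for bit 0
noise_std = 0.5      # standard deviation of the additive Gaussian noise
num_bits = 100_000   # number of transmitted symbols in the simulation

bits = rng.integers(0, 2, num_bits)                   # source bits
tx = amplitude * (2 * bits - 1)                       # map 0 -> -A, 1 -> +A
rx = tx + noise_std * rng.standard_normal(num_bits)   # received samples disturbed by noise

# The receiver only decides which symbol was sent (threshold at 0),
# rather than trying to reproduce the transmitted waveform exactly.
decisions = (rx > 0).astype(int)

error_rate = np.mean(decisions != bits)
print(f"Estimated probability of decision error: {error_rate:.4f}")
```

The simulation reports an error rate rather than a measure of waveform distortion, which is exactly the shift of design criterion described above.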

By analyzing and comparing the technical characteristics of analog and digital communication, it can be found that the whole digital communication system, including the transmitter, the channel and the receiver, exhibits obvious statistical characteristics, which required researchers and engineers to make significant changes to the communication equipment manufacturing industry. Moreover, at the theoretical level, covering information representation, transmission, detection, estimation, sampling, quantization and coding, it was necessary to establish a theoretical framework that could reflect the statistical characteristics of communication systems. Information theory was born in this context.

In the 1940s, as people increasingly viewed communication engineering from the perspective of statistical science and increasingly applied the methods of probability, statistics and stochastic processes to model and analyze communication problems, it became necessary to establish a statistical theory of the quantity of information. In addition, there were several theoretical problems to be solved, including:

(1) What is the essence of information?

(2) How to measure information?

(3) What is the performance bound of channel transmission?

Based on the work of Nyquist, Hartley, and Reeves, Shannon published the paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948, hereinafter referred to as Shannon's 1948 paper. The paper systematically answers the above three questions, builds the framework of information theory, and marks the birth of information theory.

The concept of information in the narrow sense arose from and serves communication engineering, and it reflects the statistical characteristics of information, so it is also called statistical information. Historically, different researchers have put forward a variety of definitions of information. For example, Hartley believed that information is the way the sender selects symbols from a symbol table, and proposed to measure the amount of information by the freedom of selecting symbols. Longe believed that information is something that reflects the form, relationship and difference of things. Shannon's 1948 paper uses uncertainty to describe information and holds that "information is the description of the uncertainty of the motion states or existence ways of things". This understanding better reflects the statistical characteristics of information: because uncertain things are described mathematically by random variables or random processes, an information source can be modelled as a random variable or a random process. Based on this understanding, Shannon inherited and developed Hartley's idea of a logarithmic probability measure and defined entropy as the measure of the amount of information. The concept of entropy is the cornerstone of information theory.
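As a small numerical illustration of the entropy measure (a sketch added here, not part of Shannon's original development), the following Python snippet computes H(X) = -Σ p(x) log2 p(x) for a hypothetical four-symbol source; the probability values are arbitrary assumptions.

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H(X) = -sum(p * log2 p), skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical four-symbol source (probabilities are illustrative assumptions)
p = [0.5, 0.25, 0.125, 0.125]
print(f"H(X) = {entropy(p):.3f} bits per symbol")        # 1.750 bits

# A uniform source over the same alphabet attains the maximum log2(4) = 2 bits
print(f"Uniform source: {entropy([0.25] * 4):.3f} bits per symbol")
```

The less uniform the source, the lower its entropy, which is the intuition behind the source coding discussed below.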

In communication engineering, information, message and signal are three easily confused concepts, and we can understand the meaning of information by comparing them. Information is the description of the uncertainty of the motion states or existence ways of things; it is the connotation. A message is the external manifestation of information, such as text, pictures or voice; it is the extension. Signals are the physical forms of messages, such as sound waves and radio waves.

The purpose of studying information theory is to improve the efficiency and reliability of communication systems. Efficiency refers to describing the information source with as few bits as possible, thus reducing the transmission time. Source coding can improve the efficiency of communication by compressing the redundant components of the source. Reliability means that the message transmission is accurate, that is, the message recovered by the sink through decoding decisions is identical to the message sent by the source, or, even if errors occur, the message sent by the source can be recovered or corrected from the erroneous received sequence. Reliability depends on channel coding, which is also called error control coding or error correction coding. Channel coding and source coding are completely different in function and working principle. Channel coding adds redundant bits to the information bits and forms a certain constraint relationship between the information bits and the redundant bits. Error detection is performed at the sink by checking the constraint relationship between the two parts of bits; if errors occur, they are corrected with best effort. Although the history of source coding is longer than that of channel coding (the earliest Morse code is a source code, and the Hamming code appeared more than 100 years later), its development lags behind that of channel coding. At present, channel coding based on algebraic theory has formed a complete theoretical system and unified construction methods, but source coding has not yet formed a unified construction theory; different source coding methods are needed for different sources (text, voice, image, video, etc.). With the development of information theory, source coding and channel coding have gradually developed into two relatively independent research fields, in which researchers are still working and new coding methods are frequently proposed.
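To make the constraint relationship between information bits and redundant bits concrete, here is a minimal Python sketch of the classical Hamming(7,4) code mentioned above: three parity bits are added to four information bits, and the receiver checks the parity constraints to locate and correct a single bit error. It is included only as an illustration of the principle; the encoding layout is the standard textbook one, not a construction taken from this book.

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into 7 bits [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(r):
    """Re-check the parity constraints; a nonzero syndrome points to the flipped bit."""
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]
    s3 = r[3] ^ r[4] ^ r[5] ^ r[6]
    syndrome = s1 + 2 * s2 + 4 * s3      # 0 means no detectable error
    corrected = list(r)
    if syndrome:
        corrected[syndrome - 1] ^= 1     # flip the bit at the indicated position
    return [corrected[2], corrected[4], corrected[5], corrected[6]]  # recover d1..d4

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
received = list(codeword)
received[4] ^= 1                                  # simulate a single channel error
print(hamming74_decode(received) == data)         # True: the error is corrected
```

The three redundant bits carry no new source information; their only role is to impose the constraint relationship that the sink exploits for error detection and correction.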
