Information Theory
Y. DING
Email: eeyhding@scut.edu.cn    Tel: 87112490
Reference book: Principles of Digital Communications, Suili FENG et al., PHEI.

About the course:
- Material
- Energy
- Information

Chapter 4. Fundamentals of Information Theory
- Measure of information (entropy)
- Discrete channel and capacity
- Continuous source, channel and capacity
- Source coding
- Rate distortion theory

4.1 Introduction

Message vs. information:
(1) A message can be, but is not limited to: symbols, letters, numbers, speech, images, etc.
(2) A message may contain information, or no information (e.g. an SMS or multimedia message).
(3) Amount of information: the amount of uncertainty reduced by the reception of the message.
(4) Purpose of communication: information transmission.
(5) Milestone of information theory: "A Mathematical Theory of Communication" by Claude Elwood Shannon, 1948.

4.2 Measure of information (entropy)

Measure of information:
(1) Amount of information = amount of uncertainty reduced by the reception of a message. Uncertainty relates to likelihood, i.e. to probability, so the amount of information should be a function of probability (what is the relationship?).
(2) Different messages may carry different amounts of information, and the measure should be additive.

Measurement of a discrete source

A statistical discrete source with N possible symbols is described by

  X:    x_1,    x_2,    ...,  x_N
  p(X): p(x_1), p(x_2), ..., p(x_N),   with  \sum_{i=1}^{N} p(x_i) = 1.

(How does this model apply to BPSK, QPSK, 16QAM?)

The amount of information is a function of the probability, I(x_i) = f(p(x_i)). If x_i and x_j are statistically independent, the measure must satisfy the additivity property

  I(x_i x_j) = f(p(x_i x_j)) = f(p(x_i) p(x_j)) = f(p(x_i)) + f(p(x_j)).

Defining f(p(x_i)) = \log \frac{1}{p(x_i)}, we obtain

  I(x_i x_j) = \log \frac{1}{p(x_i) p(x_j)} = \log \frac{1}{p(x_i)} + \log \frac{1}{p(x_j)}.

Definition: the amount of information carried by the message x_i is

  I(x_i) = \log \frac{1}{P(x_i)} = -\log P(x_i),

measured in bits for base-2 logarithms, nats for natural logarithms and hartleys for base-10 logarithms.

Example: the source X follows the distribution

  X:    0,   1,   2,   3
  p(X): 3/8, 1/4, 1/4, 1/8.

The four symbols are statistically independent; calculate the amount of information contained in the sequence S = "113200".

  I(S) = \log \frac{1}{p(S)}
       = 2\log_2 \frac{1}{p(1)} + \log_2 \frac{1}{p(3)} + \log_2 \frac{1}{p(2)} + 2\log_2 \frac{1}{p(0)}
       = 2\log_2 4 + \log_2 8 + \log_2 4 + 2\log_2 \frac{8}{3}
       = 4 + 3 + 2 + 2 \times 1.415 \approx 11.83 bits.
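The self-information computation above is easy to check numerically. The following Python snippet is a minimal sketch (the helper names are illustrative, not from the slides) that reproduces I(S) ≈ 11.83 bits for S = "113200" under the given distribution.

```python
import math

# Source distribution from the example: symbols 0..3 with probabilities 3/8, 1/4, 1/4, 1/8.
p = {"0": 3/8, "1": 1/4, "2": 1/4, "3": 1/8}

def self_information(symbol, dist, base=2.0):
    """I(x) = -log_base p(x): information carried by one symbol (bits for base 2)."""
    return -math.log(dist[symbol], base)

def sequence_information(seq, dist):
    """For statistically independent symbols, the sequence information is the sum
    of the per-symbol self-information values."""
    return sum(self_information(s, dist) for s in seq)

print(sequence_information("113200", p))   # ~11.83 bits
```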
How much information is contained in one QPSK or 16QAM symbol?

Entropy of a discrete source

Definition: the entropy of the source X: x_i, i = 1, 2, ..., N, is defined as

  H(X) = \sum_{i=1}^{N} p(x_i) \log \frac{1}{p(x_i)} = -\sum_{i=1}^{N} p(x_i) \log p(x_i).

Physical significance: the average amount of information contained in one symbol, i.e. the statistical expectation of I(x_i).

Example: calculate the entropy of the source X above.

  H(X) = \frac{3}{8}\log_2 \frac{8}{3} + \frac{1}{4}\log_2 4 + \frac{1}{4}\log_2 4 + \frac{1}{8}\log_2 8 \approx 1.906 bits/symbol.

Example: assuming the symbols are statistically independent, calculate the information contained in the sequence

  201 020 130 213 001 203 210 100 321 010 023 102 002 10 312 032 100 120 210

(1) by exact computation; (2) by approximation using the entropy.

Solution 1 (exact computation based on the symbol probabilities): the sequence contains n_0 = 23, n_1 = 14, n_2 = 13 and n_3 = 7 symbols, so

  I = \sum_{i=1}^{4} n_i \log \frac{1}{p(x_i)} = 23\log_2 \frac{8}{3} + 14\log_2 4 + 13\log_2 4 + 7\log_2 8 \approx 107.55 bits.

Solution 2 (approximation using the entropy):

  I_n \approx (23 + 14 + 13 + 7) H(X) = 57 \times 1.906 \approx 108.62 bits.

Maximum entropy theorem

Definition (convex set): a set \Omega \subseteq R^n is convex if for any x_i = (x_{i1}, x_{i2}, ..., x_{in}) \in \Omega, x_j = (x_{j1}, x_{j2}, ..., x_{jn}) \in \Omega and 0 \le \theta \le 1, we have \theta x_i + (1-\theta) x_j \in \Omega.

Definition: for x_i, x_j \in \Omega \subseteq R^n and 0 \le \theta \le 1,
- a ∪-type (下凸, convex) function satisfies f(\theta x_i + (1-\theta) x_j) \le \theta f(x_i) + (1-\theta) f(x_j);
- a ∩-type (上凸, concave) function satisfies f(\theta x_i + (1-\theta) x_j) \ge \theta f(x_i) + (1-\theta) f(x_j).

A convex function has a minimum; a concave function has a maximum.

[Figure: example of a concave function f(x), showing f(\theta x_1 + (1-\theta) x_2) \ge \theta f(x_1) + (1-\theta) f(x_2) on the chord between x_1 and x_2.]

If f(x) is concave and the probability vector p = (p_1, p_2, ..., p_N) satisfies \sum_{i=1}^{N} p_i = 1, then

  \sum_{i=1}^{N} p_i f(x_i) \le f\!\left( \sum_{i=1}^{N} p_i x_i \right).

Using this conclusion, we have the following theorem.

Theorem: the entropy H(X) is a concave function of the probability vector (p(x_1), p(x_2), ..., p(x_N)).

Q: since H(X) is concave, when does it take its maximum value?

Theorem: the entropy takes its maximum when X follows the equal-probability distribution:

  H(p(x_1), ..., p(x_N)) \le H\!\left( \frac{1}{N}, ..., \frac{1}{N} \right) = \sum_{i=1}^{N} \frac{1}{N} \log N = \log N.

Namely, an equally distributed source has the greatest uncertainty.