Chapter 7  Channel Coding Theory

7.1 The characteristics of the continuous source
7.2 The channel capacity of the continuous source
7.3 Error control and the fundamentals of channel encoding and decoding
7.4 Linear block codes
7.5 Convolutional codes

7.1 The characteristics of the continuous source

7.1.1 The continuous source
7.1.2 The entropy of the continuous source
7.1.3 The maximum entropy of the continuous source

7.1.1 The continuous source

In practice, the output of a source is often a continuous signal, such as a voice signal or a television picture signal. Because such outputs are continuous and random, the source is called a continuous source, and its output can be described by a stochastic process. For a continuous source, the value taken at any given moment is a continuous quantity; that is, the signal is continuous in both time and amplitude.

7.1.2 The entropy of the continuous source

The simplest continuous source can be described by a one-dimensional random variable X. If there exists a non-negative function p(x) satisfying
$$p(x) \ge 0, \qquad \int_{-\infty}^{\infty} p(x)\,dx = 1,$$
then X is said to have a continuous distribution, i.e. X is a continuous random variable; p(x) is its probability density function and $F(x) = \int_{-\infty}^{x} p(t)\,dt$ is its probability distribution function.

The distribution function F(x) of a continuous variable satisfies:
(1) $0 \le F(x) \le 1$;
(2) $F(-\infty) = 0$ and $F(+\infty) = 1$;
(3) F(x) is a monotone non-decreasing function;
(4) F(x) is continuous from the left;
(5) $P(x_1 \le X < x_2) = F(x_2) - F(x_1)$.

Definition 7.1.1  For a continuous source with probability density p(x), the entropy (differential entropy) of the source is
$$h(X) = -\int_{-\infty}^{\infty} p(x)\log p(x)\,dx.$$

7.1.3 The maximum entropy of the continuous source

Theorem 7.1.1  Among continuous sources whose output amplitude is confined to a finite interval [a, b], the source with the uniform probability distribution has the maximum output entropy.

Proof. Under the constraint $\int_a^b p(x)\,dx = 1$, find the density p(x) for which $h(X) = -\int_a^b p(x)\log p(x)\,dx$ reaches its maximum. Let
$$J = -\int_a^b p(x)\log p(x)\,dx + \lambda\Big(\int_a^b p(x)\,dx - 1\Big),$$
calculate its partial derivative with respect to p(x) and set it to zero:
$$-\log p(x) - \log e + \lambda = 0.$$
After simplification, p(x) is a constant on [a, b]. Solving with the constraint gives $p(x) = \frac{1}{b-a}$. Because the density is uniform on [a, b], the maximum entropy is
$$h_{\max}(X) = \log(b-a).$$

The definition of channel coding: a way of encoding data in a communications channel that adds patterns of redundancy into the transmission path in order to lower the error rate. Such methods are widely used in wireless communications.

7.2 The channel capacity of the continuous source

7.2.1 Random coding
7.2.2 Coding theorem

7.2.1 Random coding

For random coding, the M codewords of the code set M are chosen from the $q^N$ points of the N-dimensional vector space, and many different choices are possible. The code set therefore occupies a proportion $M/q^N$ of all points of the space.

If the code set M is chosen from the candidate codewords at random, we consider the error probability $\overline{P}_e$ averaged over all such random choices.

Assume a codeword $x = (x_1, \ldots, x_N)$ of the code set is transmitted and, after passing through the DMC, is received as the word $y = (y_1, \ldots, y_N)$.

An indicator function for the error events is introduced in the derivation of the bound.

Gallager gave the following upper bound on the average error probability:
$$\overline{P}_e \le \exp\{-N[E_0(\rho, Q) - \rho R]\}, \qquad 0 \le \rho \le 1,$$
where the rate R, the input distribution Q and the function $E_0$ are defined in the next subsection.

7.2.2 Coding theorem

Examining the bound on the average probability of error above: the probability of a codeword is the product of the probabilities of its individual symbols, because the channel is memoryless, and the upper bound on $\overline{P}_e$ depends only on the channel, not on the method of encoding.

Encoding rate:
$$R = \frac{\ln M}{N}.$$

Defined function:
$$E_0(\rho, Q) = -\ln \sum_{j}\Big[\sum_{i} Q(i)\,P(j\mid i)^{1/(1+\rho)}\Big]^{1+\rho}.$$

Reliability function:
$$E(R) = \max_{0 \le \rho \le 1}\ \max_{Q}\ \big[E_0(\rho, Q) - \rho R\big].$$

Conclusion: $\overline{P}_e \le e^{-N E(R)}$, and E(R) > 0 whenever R < C, so the average error probability can be made as small as desired by making the block length N large enough.

Three figures (omitted) show the relationships among these channel parameters.
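In place of those figures, here is a minimal numerical sketch of the quantities just defined, assuming the standard Gallager form of $E_0(\rho, Q)$ shown above; the binary symmetric channel, its crossover probability eps, the uniform input distribution and the grid search over $\rho$ are illustrative choices, not taken from the slides.

```python
import math

# Gallager's E_0(rho, Q) for a binary symmetric channel (BSC) with crossover
# probability eps, using the uniform input distribution Q = (1/2, 1/2).
# Rates and exponents are in nats, matching R = ln(M)/N.
def e0_bsc(rho: float, eps: float) -> float:
    s = 1.0 / (1.0 + rho)
    # inner = sum_i Q(i) * P(j|i)^(1/(1+rho)); by symmetry it is the same
    # for both output symbols j.
    inner = 0.5 * eps ** s + 0.5 * (1.0 - eps) ** s
    # E_0 = -ln( sum_j inner^(1+rho) ) = -ln( 2 * inner^(1+rho) )
    return -math.log(2.0 * inner ** (1.0 + rho))

# Reliability (random-coding) exponent E(R) = max over 0 <= rho <= 1 of
# [E_0(rho) - rho * R], evaluated by a simple grid search over rho.
def reliability_exponent(rate: float, eps: float, steps: int = 1000) -> float:
    return max(e0_bsc(k / steps, eps) - (k / steps) * rate
               for k in range(steps + 1))

if __name__ == "__main__":
    eps = 0.05
    # BSC capacity in nats: C = ln 2 - H(eps)
    capacity = math.log(2.0) + eps * math.log(eps) + (1.0 - eps) * math.log(1.0 - eps)
    print(f"C = {capacity:.4f} nat/symbol")
    for rate in (0.1, 0.3, 0.45, capacity, 0.6):
        print(f"R = {rate:.3f}  E(R) = {reliability_exponent(rate, eps):.4f}")
```

Running the sketch shows E(R) positive and decreasing for R below the capacity C and equal to zero once R reaches C, which matches the behaviour of the reliability function described next.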
If the channel is fixed, the relationship between E(R) and R is shown in a figure (omitted): E(R) is positive and decreasing for 0 < R < C and falls to zero at R = C.

Noisy Channel Coding Theorem: if the transmission rate R is less than C, then for any $\varepsilon > 0$ there exists a code with block length n large enough whose error probability is less than $\varepsilon$.

Converse to the Noisy Channel Coding Theorem: if R > C, the probability of error in a decoded block must approach one regardless of the code that is chosen.

These two theorems always appear together and are jointly called the Noisy Channel Coding Theorem.

7.3 Error control and the fundamentals of channel encoding and decoding

7.3.1 The error control methods
7.3.2 Code distance, error correction and error detection
7.3.3 Optimal decoding and maximum-likelihood decoding

7.3.1 The error control methods

For the same bit rate, the larger the channel capacity, the larger the reliability function E(R); for the same channel capacity, E(R) increases as the rate R decreases. (A figure, omitted, illustrates these two ways of increasing E(R).)

The following measures can be taken to reduce the probability of error:

(1) Increase the channel capacity: extend the bandwidth; increase the transmitted power; reduce the noise.

(2) Reduce the bit rate R (a numerical illustration follows): keep q and N unchanged but decrease k, which means reducing the rate of the information source so that it transmits less information per second; keep q and k unchanged but increase N, which means increasing the symbol rate (baud rate) and occupying more bandwidth; keep k and N unchanged but decrease q, which …
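The sketch below is the numerical illustration referred to in measure (2); it is not taken from the slides. It estimates by Monte Carlo simulation the decoded error probability of a binary (q = 2) repetition code over a binary symmetric channel with an assumed crossover probability of 0.1: keeping k = 1 and increasing N lowers the rate R = k/N and, as argued above, the probability of error.

```python
import random

# Rate R = k/N = 1/N binary repetition code over a BSC with crossover
# probability eps, decoded by majority vote.  By the symmetry of the BSC it
# is enough to transmit the all-zero codeword.
def repetition_error_rate(n: int, eps: float, trials: int = 100_000) -> float:
    errors = 0
    for _ in range(trials):
        flips = sum(1 for _ in range(n) if random.random() < eps)
        if flips > n // 2:          # majority decoding fails
            errors += 1
    return errors / trials

if __name__ == "__main__":
    eps = 0.1
    for n in (1, 3, 5, 7, 9):       # odd lengths, so majority voting has no ties
        p_e = repetition_error_rate(n, eps)
        print(f"N = {n}  R = 1/{n}  estimated P_e = {p_e:.5f}")
```

With eps = 0.1 the estimated error probability falls from about 0.1 at N = 1 to roughly 10^-3 at N = 9, at the cost of the rate dropping to 1/9: reliability is bought by giving up rate, which is exactly the trade-off the reliability function E(R) describes.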