Communication Theory
Lecture 3: Random Process

3.1 Basic concepts of random process

- What is a random process?
  A random process is a natural extension of the concept of a random variable when dealing with signals. In analyzing communication systems we deal mostly with time-varying signals. So far we have assumed that all signals are deterministic. In many situations the deterministic assumption is not valid, and it is more appropriate to model signals as random rather than deterministic functions.

- One way is to view a random process as a collection of time functions, or signals, corresponding to the various outcomes of a random experiment.
  [Ex] Suppose n oscilloscopes record the noise outputs of n receivers.
  - Sample function ξ_i(t): one realization of the random process, which is a deterministic function.
  - Random process: ξ(t) = {ξ_1(t), ξ_2(t), ..., ξ_n(t)} is the set of all sample functions.

- A random process can also be viewed as an extension of a random variable:
  - At any time t_1, each sample function has a deterministic value ξ_i(t_1); however, which value occurs is unpredictable.
  - At any time t_1, the value of the process is a random variable, denoted ξ(t_1).
  - In other words, at any fixed time a random process is a random variable; the process as a whole is a collection of random variables indexed by time.

3.1.1 Distribution function of random process

Let ξ(t) be a random process.
At any time t_1, ξ(t_1) is a random variable whose statistical properties are described by its distribution function or probability density function (PDF).

- One-dimensional distribution function:
  F_1(x_1; t_1) = P[ξ(t_1) ≤ x_1]
- One-dimensional PDF:
  f_1(x_1; t_1) = ∂F_1(x_1; t_1)/∂x_1, if the partial derivative exists.
- Two-dimensional distribution function:
  F_2(x_1, x_2; t_1, t_2) = P[ξ(t_1) ≤ x_1, ξ(t_2) ≤ x_2]
- Two-dimensional PDF:
  f_2(x_1, x_2; t_1, t_2) = ∂²F_2(x_1, x_2; t_1, t_2)/(∂x_1 ∂x_2), if the partial derivative exists.
- n-dimensional distribution function:
  F_n(x_1, ..., x_n; t_1, ..., t_n) = P[ξ(t_1) ≤ x_1, ..., ξ(t_n) ≤ x_n]
- n-dimensional PDF:
  f_n(x_1, ..., x_n; t_1, ..., t_n) = ∂ⁿF_n/(∂x_1 ⋯ ∂x_n), if the partial derivative exists.

3.1.2 Statistical averages

- Mean
  At any time t_1, ξ(t_1) is a random variable with mean
  E[ξ(t_1)] = ∫ x_1 f_1(x_1; t_1) dx_1,
  where f_1(x_1; t_1) is the PDF of ξ(t_1). Since t_1 can be any time t,
  a(t) = E[ξ(t)] = ∫ x f_1(x; t) dx.
  The mean of ξ(t) is a deterministic function of time, denoted a(t); it represents the center around which the sample functions of the process fluctuate.

- Variance
  σ²(t) = E{[ξ(t) − a(t)]²} = E[ξ²(t)] − a²(t),
  i.e., the variance is the mean square value minus the square of the mean. (The arbitrary time t_1 has been replaced by t.)

- Correlation function
  R(t_1, t_2) = E[ξ(t_1)ξ(t_2)] = ∫∫ x_1 x_2 f_2(x_1, x_2; t_1, t_2) dx_1 dx_2,
  where ξ(t_1) and ξ(t_2) are the random variables observed at t_1 and t_2, respectively.
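The ensemble averages defined above can be estimated numerically by averaging across many sample functions. The sketch below (assuming NumPy is available; the amplitude-modulated cosine is a hypothetical toy process chosen only because its moments are easy to check, not a process from the text) estimates a(t), σ²(t), and R(t_1, t_2):

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of sample functions of a hypothetical toy process
# xi_i(t) = A_i * cos(2*pi*t), with random amplitude A_i ~ N(0, 1).
n = 100_000                        # number of sample functions (realizations)
t = np.arange(64) / 64             # time grid
A = rng.standard_normal((n, 1))
samples = A * np.cos(2 * np.pi * t)        # shape (n, 64): one row per realization

# Statistical (ensemble) averages are taken ACROSS realizations at each t:
a_t = samples.mean(axis=0)         # mean a(t); theory: 0 for all t
var_t = samples.var(axis=0)        # variance sigma^2(t); theory: cos^2(2*pi*t)

# Correlation function R(t1, t2) = E[xi(t1) xi(t2)] at two chosen instants;
# theory: cos(2*pi*t1) * cos(2*pi*t2)
i1, i2 = 0, 16                     # t1 = 0, t2 = 0.25, so R(t1, t2) should be ~0
R_t1t2 = np.mean(samples[:, i1] * samples[:, i2])
```

Note how each statistic is a deterministic function of the time index, exactly as the text states for a(t) and R(t_1, t_2).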
R(t_1, t_2) is a deterministic function of t_1 and t_2.

- Covariance function
  B(t_1, t_2) = E{[ξ(t_1) − a(t_1)][ξ(t_2) − a(t_2)]},
  where a(t_1) and a(t_2) are the means of ξ(t) at t_1 and t_2, respectively, and f_2(x_1, x_2; t_1, t_2) is the two-dimensional PDF of ξ(t).

- Relationship between the correlation and covariance functions:
  B(t_1, t_2) = R(t_1, t_2) − a(t_1)a(t_2).
  If a(t_1) = a(t_2) = 0, then B(t_1, t_2) = R(t_1, t_2).

- Cross-correlation function
  R_ξη(t_1, t_2) = E[ξ(t_1)η(t_2)],
  where ξ(t) and η(t) are two random processes. Accordingly, R(t_1, t_2) above is also called the autocorrelation function.

3.2 Stationary random process

3.2.1 Definition
If the joint PDF of a random process ξ(t) is independent of the time origin, i.e., for any positive integer n and all real numbers Δt,
  f_n(x_1, ..., x_n; t_1, ..., t_n) = f_n(x_1, ..., x_n; t_1 + Δt, ..., t_n + Δt),
we say that ξ(t) is a strictly stationary random process.

- Properties
  The one-dimensional PDF of a strictly stationary process is independent of t, while the two-dimensional PDF depends only on the time difference τ = t_2 − t_1.
- Statistical averages:
  (1) the mean is independent of t: a(t) = a;
  (2) the autocorrelation depends only on the time difference: R(t_1, t_1 + τ) = R(τ).
  A random process is wide-sense stationary (WSS) if (1) and (2) are satisfied. Obviously, a strictly stationary random process is WSS; however, a WSS random process is not necessarily strictly stationary.
  Most signals and noises in a communication system can be modeled as stationary random processes, so the study of stationary processes is important.

3.2.2 Ergodic process

- Two types of averages:
  - statistical average (or ensemble average), taken across realizations;
  - time average, taken along a single realization.
- In practice, the statistical averages (mean, correlation function) of a random process are often difficult to obtain, since we would have to measure a large number of realizations (samples) simultaneously.
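The identity B(t_1, t_2) = R(t_1, t_2) − a(t_1)a(t_2) can be demonstrated with a quick simulation. The process below is a hypothetical toy example (assumed, not from the text) with a nonzero, time-varying mean so that the two functions actually differ:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical toy process with nonzero mean: xi(t) = (1 + t) + A*cos(2*pi*t),
# A ~ N(0, 1), so that the means a(t1), a(t2) are distinct and nonzero.
n = 200_000
t1, t2 = 0.1, 0.35
A = rng.standard_normal(n)
x1 = (1 + t1) + A * np.cos(2 * np.pi * t1)   # samples of xi(t1)
x2 = (1 + t2) + A * np.cos(2 * np.pi * t2)   # samples of xi(t2)

a1, a2 = x1.mean(), x2.mean()                # means a(t1), a(t2)
R12 = np.mean(x1 * x2)                       # correlation function R(t1, t2)
B12 = np.mean((x1 - a1) * (x2 - a2))         # covariance function B(t1, t2)
# Identity from the text: B(t1, t2) = R(t1, t2) - a(t1)*a(t2)
```

For this toy process the theoretical covariance is cos(2πt_1)cos(2πt_2), and the identity holds exactly for the sample moments as well.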
It is natural to ask whether the statistical averages can instead be obtained as time averages of a single sample function of the stationary process. The answer is yes, for ergodic processes.

- Conditions for ergodicity
  Let x(t) be one realization (sample) of a stationary process ξ(t). Its time-average mean and time-average correlation function are defined as
  ā = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) dt,
  R̄(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) x(t + τ) dt.
  If ā = a and R̄(τ) = R(τ), the stationary process is ergodic.

- Ergodicity: statistical (ensemble) average = time average.
- An ergodic random process is stationary; however, a stationary random process is not necessarily ergodic.
- A significant part of the signals and noises in a communication system can be modeled as ergodic random processes.

[Ex 3-1] Let a cosine wave with random phase be
  ξ(t) = A cos(ω_c t + θ),
where A and ω_c are constants and θ is a random variable uniformly distributed over (0, 2π). Is ξ(t) ergodic?

[Sol] (1) Statistical averages of ξ(t):
  Mean:
  E[ξ(t)] = ∫_0^{2π} A cos(ω_c t + θ) (1/2π) dθ = 0.
  Autocorrelation:
  R(t_1, t_2) = E[ξ(t_1)ξ(t_2)] = (A²/2) cos ω_c(t_2 − t_1).
  Letting t_2 − t_1 = τ, we have R(τ) = (A²/2) cos ω_c τ.
  The mean of ξ(t) is constant and the autocorrelation is independent of t, so ξ(t) is WSS.

(2) Time averages of ξ(t):
  ā = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} A cos(ω_c t + θ) dt = 0,
  R̄(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} A cos(ω_c t + θ) · A cos[ω_c(t + τ) + θ] dt = (A²/2) cos ω_c τ.
  Comparing the statistical and time averages, they coincide: the cosine wave with random phase is ergodic. (This does not hold for every process; other examples fail the comparison.)

3.2.3 Autocorrelation function of a stationary process

Properties of R(τ):
- R(0) = E[ξ²(t)]: the average power of ξ(t);
- R(−τ) = R(τ): R(τ) is an even function of τ;
- |R(τ)| ≤ R(0): R(τ) attains its maximum at τ = 0;
- R(∞) = a²: the DC power of ξ(t);
- R(0) − R(∞) = σ²: the AC power of ξ(t). When a = 0, R(0) = σ².

3.2.4 PSD of a stationary process

- Definition
  For a deterministic signal f(t), the PSD is
  P_f(f) = lim_{T→∞} |F_T(f)|² / T,
  where F_T(f) is the Fourier transform of f_T(t), the truncation of f(t) to [−T/2, T/2].
- Let ξ(t) denote a random process and let f(t) denote a sample function of this process. Different sample functions give rise to different PSDs.
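The conclusion of Ex 3-1 can be checked numerically: the time averages of one long realization should match the ensemble averages over many realizations. A minimal sketch, assuming NumPy and arbitrary example values for A and f_c:

```python
import numpy as np

rng = np.random.default_rng(1)
A, fc = 2.0, 5.0                   # amplitude and carrier frequency (arbitrary values)

# Time averages over ONE long realization x(t) = A*cos(2*pi*fc*t + theta0),
# sampled over an integer number of carrier periods.
fs = 1000.0
t = np.arange(200_000) / fs        # 200 s at 1 kHz -> exactly 1000 periods
theta0 = rng.uniform(0, 2 * np.pi)
x = A * np.cos(2 * np.pi * fc * t + theta0)
time_mean = x.mean()               # theory: a-bar = 0
time_power = np.mean(x ** 2)       # theory: R-bar(0) = A**2 / 2

# Ensemble average at a fixed time t1 = 0.3 over many realizations of theta
theta = rng.uniform(0, 2 * np.pi, 200_000)
ens_mean = np.mean(A * np.cos(2 * np.pi * fc * 0.3 + theta))   # theory: 0
```

Both routes give the same mean (zero) and the time-average power equals R(0) = A²/2, consistent with ergodicity.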
It therefore makes sense to define the power spectrum of the process as the ensemble average of these sample PSDs:
  P_ξ(f) = lim_{T→∞} E[|F_T(f)|²] / T.

- Computation of the PSD: the Wiener-Khinchin theorem
  The autocorrelation function and the PSD of a stationary random process form a Fourier transform pair:
  P_ξ(f) = ∫ R(τ) e^{−j2πfτ} dτ,  R(τ) = ∫ P_ξ(f) e^{j2πfτ} df.
  This is an important tool in the theory and application of stationary random processes; it connects the frequency-domain and time-domain analysis methods.

- Based on the Wiener-Khinchin theorem, the PSD of any realization (sample) of an ergodic random process equals the PSD of the process.
  [Proof] The autocorrelation function of any realization of an ergodic process equals the autocorrelation function of the process; taking the Fourier transform of both sides gives the result.

- The PSD P_ξ(f) is non-negative, real, and even, since R(τ) is real and even.

[Ex 3-2] Obtain the PSD of ξ(t) = A cos(ω_c t + θ), where θ is a random phase.
[Sol] In Ex 3-1 we obtained the autocorrelation of ξ(t):
  R(τ) = (A²/2) cos ω_c τ.
Since the autocorrelation function and the PSD of a stationary process form a Fourier transform pair, the PSD is
  P_ξ(f) = (A²/4)[δ(f − f_c) + δ(f + f_c)],
and the average power is
  S = ∫ P_ξ(f) df = A²/2.

3.3 Gaussian (normal) random process

3.3.1 Definition
- If the n-dimensional PDF of a random process ξ(t) is Gaussian for every n, ξ(t) is a Gaussian (normal) random process.
- The n-dimensional PDF is
  f_n(x_1, ..., x_n; t_1, ..., t_n) = [1 / ((2π)^{n/2} σ_1⋯σ_n |B|^{1/2})] · exp{ −[1/(2|B|)] Σ_{j=1}^{n} Σ_{k=1}^{n} |B|_{jk} [(x_j − a_j)/σ_j][(x_k − a_k)/σ_k] },
  where
  - a_j, σ_j²: the mean and variance of ξ(t_j);
  - |B|: the determinant of the normalized covariance matrix;
  - |B|_{jk}: the cofactor of the element b_{jk} in |B| (the signed determinant of |B| with the j-th row and k-th column removed);
  - b_{jk}: the normalized covariance function of ξ(t_j) and ξ(t_k).

3.3.2 Properties
- A Gaussian random process is completely determined by its means, variances, and normalized covariances.
- A WSS Gaussian random process is also strictly stationary.
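The ensemble-averaged-periodogram definition of the PSD and the result of Ex 3-2 can be checked together: averaging |F_T(f)|²/T over many random-phase realizations should concentrate the power at ±f_c with total power A²/2. A sketch assuming NumPy, with f_c chosen to fall exactly on an FFT bin so there is no spectral leakage:

```python
import numpy as np

rng = np.random.default_rng(2)
A, fc, fs, N = 1.0, 50.0, 1000.0, 1000   # fc sits exactly on an FFT bin (1 s record)
t = np.arange(N) / fs

# Ensemble-averaged periodogram: P(f) ~ E[|F_T(f)|^2] / T, discretized as
# E[|FFT|^2] / (N * fs)  (two-sided PSD estimate, power per Hz).
acc = np.zeros(N)
for _ in range(200):
    theta = rng.uniform(0, 2 * np.pi)
    x = A * np.cos(2 * np.pi * fc * t + theta)
    acc += np.abs(np.fft.fft(x)) ** 2
P = acc / (200 * N * fs)
freqs = np.fft.fftfreq(N, 1 / fs)

total_power = P.sum() * (fs / N)   # integrating the PSD gives the average power A^2/2
peak_f = abs(freqs[np.argmax(P)])  # the power concentrates at +/- fc
```

The integral of the estimated PSD recovers A²/2 and the spectral lines sit at ±f_c, matching the pair of delta functions in Ex 3-2.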
Since the n-dimensional PDF of a Gaussian process depends only on the means, variances, and normalized covariances, if the process is WSS (i.e., these quantities are independent of the time origin), then the n-dimensional PDF is also independent of the time origin, so the process is strictly stationary.

- If the values of a Gaussian process at different times are uncorrelated, i.e., b_{jk} = 0 for all j ≠ k, the PDF simplifies to a product of one-dimensional Gaussian PDFs:
  f_n(x_1, ..., x_n; t_1, ..., t_n) = Π_{j=1}^{n} f(x_j; t_j),
  which states that, for Gaussian processes, uncorrelated implies independent.
- A linear transform of a Gaussian random process is also Gaussian: if the input of a linear system is Gaussian, the output is also Gaussian.

3.3.3 Gaussian random variable

- Definition: at any fixed time, a Gaussian random process is a Gaussian random variable, whose PDF is
  f(x) = [1/(√(2π) σ)] exp{−(x − a)²/(2σ²)},
  where a is the mean and σ² the variance.
- Properties:
  - f(x) is symmetric about x = a;
  - ∫_{−∞}^{∞} f(x) dx = 1, and ∫_{−∞}^{a} f(x) dx = ∫_{a}^{∞} f(x) dx = 1/2;
  - when a = 0 and σ = 1, it is the standard normal distribution.
- Gaussian distribution function
  F(x) = P(ξ ≤ x) = ∫_{−∞}^{x} [1/(√(2π) σ)] exp{−(z − a)²/(2σ²)} dz.
  This integral has no closed-form solution.
- Expressing the Gaussian distribution function by the error function: substitute t = (z − a)/(√2 σ).
We then have
  F(x) = 1/2 + (1/2) erf[(x − a)/(√2 σ)],
where the error function is
  erf(x) = (2/√π) ∫_0^x e^{−t²} dt.

- Expressing the Gaussian distribution function by the complementary error function:
  F(x) = 1 − (1/2) erfc[(x − a)/(√2 σ)],
  where erfc(x) = 1 − erf(x) = (2/√π) ∫_x^∞ e^{−t²} dt.
  When x ≫ 1 (roughly x > 2),
  erfc(x) ≈ e^{−x²}/(x√π).

- Expressing the Gaussian distribution function by the Q-function:
  Definition: Q(x) = (1/√(2π)) ∫_x^∞ e^{−t²/2} dt.
  Relationship with erfc: Q(x) = (1/2) erfc(x/√2).
  Relationship with F(x): F(x) = 1 − Q[(x − a)/σ].

3.4 Transmission over LTI systems

- For deterministic signals:
  v_o(t) = v_i(t) * h(t),
  where v_i is the input signal and v_o the output signal; correspondingly, in the frequency domain,
  V_o(f) = V_i(f) H(f).
- For random signals, assume the input ξ_i(t) is a stationary random process with mean a, autocorrelation function R_i(τ), and PSD P_i(f). We want the mean, autocorrelation function, PSD, and PDF of the output ξ_o(t).

- Mean of ξ_o(t): statistically averaging both sides of
  ξ_o(t) = ∫ h(u) ξ_i(t − u) du,
  and using the stationarity of the input,
  E[ξ_o(t)] = a ∫ h(u) du = a · H(0),
  where H(0) is the frequency response of the LTI system at f = 0.

- Autocorrelation function of ξ_o(t): since the input is stationary,
  R_o(t_1, t_1 + τ) = ∫∫ h(u) h(v) R_i(τ + u − v) du dv = R_o(τ).
  Hence, if the input of an LTI system is stationary, the output is also stationary.

- PSD of ξ_o(t): computing the Fourier transform of the autocorrelation function and substituting τ' = τ + u − v, we obtain
  P_o(f) = |H(f)|² P_i(f).
  Application: obtain R_o(τ) from P_o(f) by the inverse Fourier transform.

- PDF of ξ_o(t): if the input of the LTI system is Gaussian, the output is also Gaussian. By the principle of integration, the convolution integral can be expressed as the limit of a sum, and the sum of (infinitely many) Gaussian variables is also Gaussian.

3.5 Bandpass (narrowband) random process

- Definition: if the PSD of a random process ξ(t) is concentrated in a relatively narrow band Δf around a carrier frequency f_c, i.e., Δf ≪ f_c, where f_c is far from zero frequency, the process is called a bandpass (narrowband) random process.
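The erf/erfc/Q-function identities above are easy to verify with the standard library. A minimal sketch using only `math.erf`/`math.erfc` (the function and variable names are illustrative):

```python
import math

def Q(x: float) -> float:
    """Gaussian tail probability via the identity Q(x) = erfc(x / sqrt(2)) / 2."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def gaussian_cdf(x: float, a: float = 0.0, sigma: float = 1.0) -> float:
    """F(x) = 1/2 + erf((x - a) / (sqrt(2)*sigma)) / 2, equivalently 1 - Q((x - a)/sigma)."""
    return 0.5 + 0.5 * math.erf((x - a) / (math.sqrt(2.0) * sigma))

# Asymptotic form quoted in the text for x > 2: erfc(x) ~ exp(-x^2) / (x * sqrt(pi))
x = 3.0
erfc_exact = math.erfc(x)
erfc_approx = math.exp(-x * x) / (x * math.sqrt(math.pi))
```

At x = 3 the asymptotic form is already within a few percent of the exact value, which is why it is useful for error-probability estimates.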
- Typical sample functions of a bandpass random process look like a sinusoid at frequency f_c whose envelope and phase vary slowly and randomly; the PSD is concentrated around ±f_c.

- Expression of a bandpass random process:
  ξ(t) = a_ξ(t) cos[ω_c t + φ_ξ(t)],
  where a_ξ(t) is the random envelope, φ_ξ(t) the random phase, and ω_c the central angular frequency. The variations of a_ξ(t) and φ_ξ(t) are much slower than those of the carrier cos ω_c t.

- Expansion of a bandpass random process:
  ξ(t) = ξ_c(t) cos ω_c t − ξ_s(t) sin ω_c t,
  where
  ξ_c(t) = a_ξ(t) cos φ_ξ(t): the in-phase component of ξ(t);
  ξ_s(t) = a_ξ(t) sin φ_ξ(t): the quadrature component of ξ(t).
  The statistical properties of ξ(t) are determined by those of a_ξ(t) and φ_ξ(t), or of ξ_c(t) and ξ_s(t).

3.5.1 Statistical properties of ξ_c(t) and ξ_s(t)

- Mean: since ξ(t) is stationary and has zero mean at any time t, E[ξ(t)] = 0, we obtain
  E[ξ_c(t)] = E[ξ_s(t)] = 0.
- Autocorrelation of ξ(t):
  R_ξ(t, t + τ) = R_c(t, t + τ) cos ω_c t cos ω_c(t + τ) − R_cs(t, t + τ) cos ω_c t sin ω_c(t + τ) − R_sc(t, t + τ) sin ω_c t cos ω_c(t + τ) + R_s(t, t + τ) sin ω_c t sin ω_c(t + τ).
  Since ξ(t) is stationary, the right-hand side must be independent of t. Setting t = 0:
  R_ξ(τ) = R_c(τ) cos ω_c τ − R_cs(τ) sin ω_c τ;
  setting t = π/(2ω_c):
  R_ξ(τ) = R_s(τ) cos ω_c τ + R_sc(τ) sin ω_c τ.
  From this analysis, if ξ(t) is stationary, ξ_c(t) and ξ_s(t) are also stationary.
- Since both expressions must hold simultaneously,
  R_c(τ) = R_s(τ) and R_cs(τ) = −R_sc(τ).
  From the property of cross-correlation functions, R_cs(τ) = R_sc(−τ); substituting into the equation above gives R_sc(τ) = −R_sc(−τ), i.e., R_sc(τ) is an odd function, so R_sc(0) = 0; similarly, R_cs(0) = 0.
- Substituting τ = 0 into the two expressions for R_ξ(τ) yields
  R_ξ(0) = R_c(0) = R_s(0), i.e., σ_ξ² = σ_c² = σ_s²:
  ξ(t), ξ_c(t), and ξ_s(t) have the same average power (variance, since the means are zero).
- Since R_sc(0) = R_cs(0) = 0, ξ_c(t) and ξ_s(t) are uncorrelated at τ = 0. If ξ(t) is Gaussian, ξ_c(t_1) and ξ_s(t_2) are also Gaussian; being Gaussian and uncorrelated, ξ_c(t) and ξ_s(t) at the same instant are independent.

- Conclusion: the in-phase component ξ_c(t) and quadrature component ξ_s(t) of a bandpass, stationary, zero-mean Gaussian random process ξ(t) are stationary, zero-mean Gaussian processes, and their variances are the same.
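The zero-mean, equal-variance, uncorrelated-at-τ=0 properties of ξ_c(t) and ξ_s(t) can be illustrated by building a bandpass process from its quadrature components. This sketch assumes NumPy and, for simplicity, uses i.i.d. Gaussian sequences as a crude stand-in for the slowly varying lowpass components (an assumption, not the text's construction):

```python
import numpy as np

rng = np.random.default_rng(3)
sigma, fc, fs, N = 1.5, 50.0, 1000.0, 400_000
t = np.arange(N) / fs

# Crude stand-in for the lowpass Gaussian quadrature components: here xi_c, xi_s
# are independent i.i.d. zero-mean Gaussian sequences with common variance sigma^2.
xi_c = sigma * rng.standard_normal(N)
xi_s = sigma * rng.standard_normal(N)

# Bandpass process xi(t) = xi_c(t) cos(w_c t) - xi_s(t) sin(w_c t)
xi = xi_c * np.cos(2 * np.pi * fc * t) - xi_s * np.sin(2 * np.pi * fc * t)

mean_xi = xi.mean()                          # theory: 0
var_xi = xi.var()                            # theory: sigma^2, same as xi_c and xi_s
corr_cs = np.mean(xi_c * xi_s) / sigma ** 2  # theory: 0 (uncorrelated at tau = 0)
```

The sample variance of ξ(t) matches σ² = σ_c² = σ_s², since cos² ω_c t + sin² ω_c t = 1 at every instant.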
At the same instant, ξ_c and ξ_s are independent.

3.5.2 Statistical properties of the envelope a_ξ(t) and phase φ_ξ(t)

- The joint PDF f(a_ξ, φ_ξ) can be obtained from the joint PDF of ξ_c and ξ_s via the change of variables ξ_c = a_ξ cos φ_ξ, ξ_s = a_ξ sin φ_ξ:
  f(a_ξ, φ_ξ) = [a_ξ/(2π σ_ξ²)] exp(−a_ξ²/(2σ_ξ²)),
  where a_ξ ≥ 0 and φ_ξ ∈ (0, 2π).
- The PDF of a_ξ:
  f(a_ξ) = ∫_0^{2π} f(a_ξ, φ_ξ) dφ_ξ = (a_ξ/σ_ξ²) exp(−a_ξ²/(2σ_ξ²)), a_ξ ≥ 0:
  a_ξ is a Rayleigh-distributed random variable.
- The PDF of φ_ξ:
  f(φ_ξ) = ∫_0^∞ f(a_ξ, φ_ξ) da_ξ = 1/(2π), φ_ξ ∈ (0, 2π):
  φ_ξ is a uniformly distributed random variable.
- Conclusion: for a bandpass, stationary, zero-mean Gaussian random process ξ(t), the envelope a_ξ(t) is a Rayleigh process and the phase φ_ξ(t) is a uniform process; a_ξ(t) and φ_ξ(t) are independent, i.e.,
  f(a_ξ, φ_ξ) = f(a_ξ) f(φ_ξ).

3.6 Cosine wave plus bandpass Gaussian noise

- Expression:
  r(t) = A cos(ω_c t + θ) + n(t),
  where n(t) is bandpass Gaussian noise, θ is the random phase of the cosine wave, uniformly distributed over (0, 2π), and A and ω_c are the deterministic amplitude and angular frequency. Thus
  r(t) = z_c(t) cos ω_c t − z_s(t) sin ω_c t,
  where
  z_c(t) = A cos θ + n_c(t),
  z_s(t) = A sin θ + n_s(t).
- Envelope and phase of the cosine wave plus bandpass Gaussian noise:
  Envelope: z(t) = √[z_c²(t) + z_s²(t)], z ≥ 0;
  Phase: φ(t) = arctan[z_s(t)/z_c(t)].
- Statistical properties: the PDF f(z) of the envelope.
  If θ is held constant, z_c and z_s are independent Gaussian random variables with
  E[z_c] = A cos θ, E[z_s] = A sin θ, σ_c² = σ_s² = σ_n².
  Given θ, the joint PDF of z_c and z_s is
  f(z_c, z_s | θ) = [1/(2πσ_n²)] exp{ −[(z_c − A cos θ)² + (z_s − A sin θ)²] / (2σ_n²) }.
  From the relationship between (z_c, z_s) and (z, φ), namely z_c = z cos φ and z_s = z sin φ, the joint PDF of z and φ given θ is
  f(z, φ | θ) = [z/(2πσ_n²)] exp{ −[z² + A² − 2Az cos(φ − θ)] / (2σ_n²) }.
  Since
  (1/2π) ∫_0^{2π} exp[(Az/σ_n²) cos(φ − θ)] dφ = I_0(Az/σ_n²),
  where I_0(x) is the modified Bessel function of the first kind and zero order, f(z, φ | θ) integrated over φ is independent of θ, and the PDF of z is
  f(z) = (z/σ_n²) exp{ −(z² + A²)/(2σ_n²) } I_0(Az/σ_n²), z ≥ 0.
  This is the Rice (Rician) distribution.
- Discussion: when the signal is weak, i.e.,
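The Rice envelope PDF can be validated by direct simulation of the cosine-plus-noise model. A sketch assuming NumPy (which provides the order-zero modified Bessel function as `np.i0`), with example values for A, σ_n, and a fixed θ; by the derivation above, f(z) does not depend on θ:

```python
import numpy as np

rng = np.random.default_rng(4)
A, sigma_n, N = 2.0, 1.0, 1_000_000   # example amplitude and noise deviation
theta = 0.7                           # carrier phase held fixed (f(z) is theta-free)

# Envelope z = sqrt(z_c^2 + z_s^2), with z_c = A*cos(theta) + n_c,
# z_s = A*sin(theta) + n_s, and n_c, n_s independent zero-mean Gaussians.
n_c = sigma_n * rng.standard_normal(N)
n_s = sigma_n * rng.standard_normal(N)
z = np.hypot(A * np.cos(theta) + n_c, A * np.sin(theta) + n_s)

# Compare the empirical density with the Rice PDF
# f(z) = (z/sigma^2) * exp(-(z^2 + A^2)/(2 sigma^2)) * I0(A*z/sigma^2).
hist, edges = np.histogram(z, bins=60, range=(0.0, 6.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
rice = (centers / sigma_n ** 2) \
    * np.exp(-(centers ** 2 + A ** 2) / (2 * sigma_n ** 2)) \
    * np.i0(A * centers / sigma_n ** 2)
max_err = np.abs(hist - rice).max()   # histogram should track the Rice PDF closely
```

Setting A = 0 in the same simulation reproduces the Rayleigh envelope of Section 3.5.2, consistent with the weak-signal limit discussed next.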
A → 0 and Az/σ_n² is small, then I_0(Az/σ_n²) → 1 and the Rice distribution degenerates into the Rayleigh distribution. When Az/σ_n² is large, using I_0(x) ≈ e^x/√(2πx), f(z) is approximately a Gaussian distribution centered near z = A:
  f(z) ≈ [1/(√(2π) σ_n)] exp{ −(z − A)²/(2σ_n²) }.
- The PDF of the phase, f(φ), is obtained by integrating f(z, φ | θ) over z; it has no simple closed form. It is uniform when the signal is absent (A = 0) and concentrates around θ as A/σ_n grows.

3.7 Gaussian white noise and band-limited white noise

- White noise n(t)
  - Definition: noise with PSD
    P_n(f) = n_0/2 (two-sided), or P_n(f) = n_0 for f ≥ 0 (one-sided),
    where n_0 is a positive constant.
  - Autocorrelation function of white noise (the IFT of the PSD):
    R(τ) = (n_0/2) δ(τ).
  - Power of white noise: since the bandwidth of white noise is unlimited, its average power is infinite:
    R(0) → ∞.
  - White noise is an idealized model of noise. In practice, the assumption is valid when the bandwidth of the noise is much larger than the bandwidth of the communication system.
  - If the PDF of white noise is Gaussian, we call it Gaussian white noise.
  - The values of Gaussian white noise at any two distinct time instants are uncorrelated, and hence independent.

- Lowpass white noise
  - Definition: if the input of an ideal lowpass filter is white noise, the output is lowpass white noise.
  - PSD:
    P_n(f) = n_0/2 for |f| ≤ f_H, and 0 otherwise.
    The PSD is limited to |f| ≤ f_H; noise with such a PSD is called lowpass (band-limited) white noise.
  - Autocorrelation function:
    R(τ) = n_0 f_H · sin(2π f_H τ)/(2π f_H τ).
  - From the curves of the PSD and autocorrelation function, samples of lowpass white noise are uncorrelated only at the time differences
    τ = k/(2 f_H), k = 1, 2, 3, ...

- Bandpass white noise
  - Definition: if the input of an ideal bandpass filter is white noise, the output is bandpass white noise.
  - PSD: the ideal bandpass filter passes f_c − B/2 ≤ |f| ≤ f_c + B/2, where f_c is the central frequency and B the bandwidth, so the PSD of bandpass white noise is
    P_n(f) = n_0/2 for f_c − B/2 ≤ |f| ≤ f_c + B/2, and 0 otherwise.
  - Autocorrelation function:
    R(τ) = n_0 B · [sin(πBτ)/(πBτ)] · cos(2π f_c τ).
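The lowpass-white-noise autocorrelation can be cross-checked by numerically inverse-transforming the rectangular PSD. A sketch assuming NumPy, with arbitrary example values for n_0 and f_H (note NumPy's `sinc(x)` is the normalized sin(πx)/(πx)):

```python
import numpy as np

n0, fH = 4e-6, 3000.0   # example PSD level and cutoff frequency (arbitrary values)

def R_lowpass(tau):
    """Autocorrelation of lowpass white noise with two-sided PSD n0/2 on |f| <= fH:
    R(tau) = n0 * fH * sinc(2*fH*tau), with numpy's sinc(x) = sin(pi*x)/(pi*x)."""
    return n0 * fH * np.sinc(2 * fH * np.asarray(tau))

# Cross-check against a direct numerical inverse Fourier transform of the PSD:
# R(tau) = integral of (n0/2) * cos(2*pi*f*tau) over -fH <= f <= fH.
tau = 1.7e-4
f = np.linspace(-fH, fH, 20001)
integrand = (n0 / 2) * np.cos(2 * np.pi * f * tau)
df = f[1] - f[0]
R_numeric = float(np.sum((integrand[:-1] + integrand[1:]) / 2) * df)  # trapezoid rule
R_closed = float(R_lowpass(tau))

power = float(R_lowpass(0.0))           # R(0) = n0 * fH: finite noise power
zero = float(R_lowpass(1 / (2 * fH)))   # first zero at tau = 1/(2*fH)
```

R(0) = n_0 f_H is finite (unlike ideal white noise), and the zeros at τ = k/(2f_H) are the uncorrelated sampling offsets noted in the text.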