Time Series
Math 419/592, Winter 2009
Prof. Andrew Ross, Eastern Michigan University

Overview of Stochastic Models

But first, a word from our sponsor
- Take Math 560 (Optimization) this fall! Sign up soon or it will disappear.

Outline
- Look at the data!
- Common Models
- Multivariate Data
- Cycles/Seasonality
- Filters

Look at the data! (or else!)
- Atmospheric CO2: years 1958 to now; vertical scale roughly 300 to 400 ppm
- Ancient sunspot data

Our Basic Procedure
1. Look at the data
2. Quantify any pattern you see
3. Remove the pattern
4. Look at the residuals
5. Repeat from step 2 until no patterns are left

Our basic procedure, version 2.0
- Look at the data
- Suck the life out of it
- Spend hours poring over the noise
- What should noise look like?

One of these things is not like the others

Stationarity
- The upper-right-corner plot is stationary.
- Mean doesn't change in time: no trend, no seasons (known frequency), no cycles (unknown frequency)
- Variance doesn't change in time
- Correlations don't change in time
- Up to here: weakly stationary
- Joint distributions don't change in time: that makes it strongly stationary

Our Basic Notation
- Time is "t", not "n", even though it's discrete
- State (value) is Y, not X, to avoid confusion with the x-axis, which is time
- Value at time t is Yt, not Y(t), because time is discrete
- Of course, other books do other things.

Detrending: deterministic trend
- Fit a plain linear regression, then subtract it out:
  fit Yt = m*t + b; the new data is Zt = Yt - m*t - b
- Or use a quadratic fit, an exponential fit, etc.

Detrending: stochastic trend
- Differencing: for a linear trend, the new data is Zt = Yt - Yt-1
- To remove a quadratic trend, do it again: Wt = Zt - Zt-1 = Yt - 2*Yt-1 + Yt-2
- It's like taking derivatives
- What's the equivalent if you think the trend is exponential, not linear?
- It can be hard to decide between regression and differencing.

Removing Cycles/Seasons
- We will get to it later. For the next few slides, assume no cycles or seasons.

A brief big-picture moment
- How do you compare two quantities? Multiply them!
- If they're both positive, you'll get a big, positive answer
- If they're both big and negative, you'll also get a big, positive answer
- If one is positive and one is negative, you'll get a negative answer
- If one is big ...

E.g.:
- River flow data over many decades
- Traffic on computer networks

How to calculate the ACF
- R, S-Plus, SAS, SPSS, Matlab, and Scilab will do it for you
- Excel: download PopTools (free!) from http://www.cse.csiro.au/poptools/
- Excel, etc.: do it yourself (a code sketch follows below):
  1. First find the average and standard deviation of the data
  2. Next, find the AutoCoVariance Function (ACVF)
  3. Then divide by the variance of the data to get the ACF

ACVF at lag h
- Old way: gamma(h) = (1/(N-h)) * sum from t=1 to N-h of (Yt - Ybar)*(Yt+h - Ybar); this can produce correlations > 1
- New way: gamma(h) = (1/N) * sum from t=1 to N-h of (Yt - Ybar)*(Yt+h - Ybar)
- Ybar is the mean of the whole data set, not just the mean of the N-h points used at lag h
- The difference between the two is "end effects" (p. 30 of Peña, Tiao, and Tsay)
- (If it makes a difference, you're up to no good?)

Common Models
- White Noise
- AR
- MA
- ARMA
- ARIMA
- SARIMA
- ARMAX
- Kalman Filter
- Exponential Smoothing (plain, with trend, with seasons)

White Noise
- A sequence of i.i.d. variables et
- Mean zero; finite standard deviation, often unknown
- Often, but not always, Gaussian

AR: AutoRegressive
- Order 1: Yt = a*Yt-1 + et
  (e.g., new = 90% of old + a random fluctuation)
- Order 2: Yt = a1*Yt-1 + a2*Yt-2 + et
- Order p is denoted AR(p); p = 1 or 2 is common, p > 2 is rare
- AR(p) is like a pth-order ODE
- AR(1) is not stationary if |a| >= 1
- E[Yt] = 0; this can be generalized

Things to do with AR
- Find the appropriate order
- Estimate the coefficients, via the Yule-Walker equations
- Estimate the standard deviation of the white noise
- If the estimated |a| > 0.98, try differencing
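The "do it yourself" ACF recipe and the differencing idea above can be written out in a few lines. Here is a minimal Python/NumPy sketch (not part of the original slides); the function names, the AR(1) coefficient 0.9, the trend slope 0.05, and the sample size are all illustrative assumptions.

```python
import numpy as np

def acvf(y, h):
    """Sample autocovariance at lag h, using the divide-by-N 'new way'.

    The mean of the WHOLE series is subtracted (not the mean of just
    N-h points), and the sum of (Y_t - Ybar)(Y_{t+h} - Ybar) is divided
    by N, not N-h, so the resulting ACF stays within [-1, 1]."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    dev = y - y.mean()
    return np.dot(dev[:n - h], dev[h:]) / n

def acf(y, max_lag):
    """Sample ACF: autocovariances divided by the lag-0 autocovariance."""
    gamma0 = acvf(y, 0)  # this equals the (biased) sample variance
    return np.array([acvf(y, h) / gamma0 for h in range(max_lag + 1)])

# Demo: an AR(1) series (a = 0.9) with a deterministic linear trend added.
rng = np.random.default_rng(0)
n = 500
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.9 * y[t - 1] + e[t]
y_trend = y + 0.05 * np.arange(n)   # add the trend m*t with m = 0.05

# Differencing: Z_t = Y_t - Y_{t-1} removes the linear trend.
z = np.diff(y_trend)
print(np.round(acf(z, 5), 3))
```

Running acf on y_trend instead of z would show the slowly decaying, trend-dominated correlations that the detrending slides warn about.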
MA: Moving Average
- Order 1: Yt = b0*et + b1*et-1
- Order q: MA(q)
- In real data, much less common than AR, but still important in the theory of filters
- Stationary regardless of the b values
- E[Yt] = 0; this can be generalized

ACF of an MA process
- Drops to zero after lag q
- That's a good way to determine what q should be! (A simulation sketch appears after these slides.)

ACF of an AR process?
- It never completely dies off, so it is not useful for finding the order p
- AR(1) has exponential decay in its ACF
- Instead, use the Partial ACF (PACF), which dies off after lag p
- The PACF of an MA process never dies off

ARMA
- ARMA(p,q) combines AR and MA
- Often p, q <= 1 or 2

ARIMA
- AR-Integrated-MA: ARIMA(p,d,q)
- d = the order of differencing applied before fitting ARMA(p,q)
- For nonstationary data with a stochastic trend

SARIMA, ARMAX
- Seasonal ARIMA: ARIMA(p,d,q)-and-(P,D,Q)S
- Often S = 12 (monthly), S = 4 (quarterly), or S = 52 (weekly); or S = 7 for daily data inside a week
- ARMAX = ARMA with outside explanatory variables (halfway to multivariate time series)

State Space Model, Kalman Filter
- An underlying process that we don't see; we get noisy observations of it
- Like a Hidden Markov Model (HMM), but the state is continuous rather than discrete
- AR, MA, etc. can be written in this form too
- State evolution (vector): St = F*St-1 + ht
- Observations (scalar): Yt = H*St + et
  (A filtering sketch appears after these slides.)

ARCH, GARCH(p,q)
- (Generalized) AutoRegressive Conditional Heteroskedastic (heteroscedastic?)
- Like ARMA, but the variance also changes randomly in time
- Used for many financial models

Exponential Smoothing
- More a method than a model.
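To see the lag-q cutoff that identifies an MA order, one can simulate an MA(1) and inspect its sample ACF. Below is a minimal Python/NumPy sketch (not from the slides); the coefficient b1 = 0.8 and the sample size are illustrative assumptions.

```python
import numpy as np

def acf(y, max_lag):
    """Sample ACF with the divide-by-N convention from the ACVF slide."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    dev = y - y.mean()
    gamma = [np.dot(dev[:n - h], dev[h:]) / n for h in range(max_lag + 1)]
    return np.array(gamma) / gamma[0]

# Simulate an MA(1): Y_t = b0*e_t + b1*e_{t-1} with b0 = 1, b1 = 0.8.
rng = np.random.default_rng(42)
e = rng.normal(size=5001)
y = e[1:] + 0.8 * e[:-1]

print(np.round(acf(y, 5), 3))
# Expect roughly 1.0 at lag 0, then b1/(1 + b1^2) = 0.49 at lag 1,
# and values near zero from lag 2 on: the cutoff says q = 1.
```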
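The state-evolution and observation equations above are concrete enough to filter with. Below is a minimal scalar Kalman filter sketch in Python/NumPy (not from the slides); F, H, and the noise variances q and r are assumed known here, and their values are illustrative assumptions.

```python
import numpy as np

# Model from the slide, in scalar form:
#   state:       S_t = F*S_{t-1} + h_t,  h_t ~ N(0, q)
#   observation: Y_t = H*S_t     + e_t,  e_t ~ N(0, r)

def kalman_filter(y, F=1.0, H=1.0, q=0.1, r=1.0, s0=0.0, p0=1.0):
    """Return the filtered estimates E[S_t | Y_1, ..., Y_t]."""
    s, p = s0, p0                 # current state estimate and its variance
    estimates = []
    for obs in y:
        s_pred = F * s            # predict: push through the state equation
        p_pred = F * p * F + q
        k = p_pred * H / (H * p_pred * H + r)   # Kalman gain
        s = s_pred + k * (obs - H * s_pred)     # update with the observation
        p = (1.0 - k * H) * p_pred
        estimates.append(s)
    return np.array(estimates)

# Demo: simulate the model, then filter the noisy observations.
rng = np.random.default_rng(1)
T = 200
state = np.zeros(T)
for t in range(1, T):
    state[t] = state[t - 1] + rng.normal(scale=np.sqrt(0.1))  # F = 1
y = state + rng.normal(scale=1.0, size=T)                     # H = 1

filtered = kalman_filter(y)
print(np.mean((filtered - state)**2), np.mean((y - state)**2))
```

If things are working, the first printed mean-squared error (filtered estimates vs. the true state) should come out smaller than the second (raw observations vs. the true state), which is the point of filtering.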