Time Series Analysis
The TimeSeriesAnalysis package is new to Maple 18 and deals with data that varies over time. In particular, it is designed for data where the time intervals between data points are regular, as is typical of macroeconomic data and of data arising in statistics, signal processing, econometrics, and mathematical finance. The package provides tools for analyzing and modeling time series data, for finding patterns and forecasting, and for visualization.
with(TimeSeriesAnalysis)
AIC,AICc,Apply,BIC,BoxCoxTransform,Decomposition,Difference,ExponentialSmoothingModel,Forecast,GetData,GetDateFormat,GetDates,GetHeaders,GetParameter,GetParameters,GetPeriod,Initialize,Join,LogLikelihood,LogTransform,LongestDefinedSubsequence,NumberOfParameters,OneStepForecasts,Optimize,SeasonalSubseriesPlot,SetParameter,Specialize,TimeSeries,TimeSeriesPlot,Unapply
Forecasting
Seasonal Analysis
Using the Time Series Analysis Package
Working with Time Series Data
More Details on Choosing Exponential Smoothing Models
References
The following example uses a data set containing the number of monthly air passengers (in thousands of passengers) from 1949 until 1960. The data is from Box, Jenkins, and Reinsel, noted in the references below.
path := FileTools:-JoinPath([kernelopts('datadir'), "datasets", "air_passengers.csv"]):
data := ImportMatrix(path)
The data has one column of dates (year) and one column of data (monthly passengers). The first row is a header.
data[1 .. 5]
To work with this data, construct a TimeSeries object. Such an object can contain one or more data sets, measured at a common set of time points, as well as data headers and other metadata. You can construct this time series object as follows:
ts := TimeSeries(data, 'header' = 1, 'dates' = 1, 'period' = 12)
The options above indicate that the first row and column contain a header and the dates, respectively, and that you expect any seasonal characteristics to occur with period 12.
To inspect the data, you can use the GetData, GetDates, and GetHeaders commands.
GetData(ts)
GetDates(ts)
GetHeaders(ts)
Monthly Passengers
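You can also check the metadata stored in the time series object. The GetPeriod and GetDateFormat commands appear in the package's export list above; as a sketch, assuming each takes the time series object as its only argument:
GetPeriod(ts);       # the seasonal period specified at construction time (12)
GetDateFormat(ts);   # the format used for the dates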
Alternatively, you can plot the data using the TimeSeriesPlot command.
TimeSeriesPlot(ts)
You can also look for seasonal trends in the data using the SeasonalSubseriesPlot command. The following plot shows the number of passengers on a monthly basis.
SeasonalSubseriesPlot(ts, seasonnames = ["January", "February", "March", "April", "May", "June", "July", "August", "September", "October", "November", "December"], size = [800, 300])
You can now have Maple select a suitable model from a family of 30 related models and fit it to this time series. In fact, you will fit it to only the first 10 years of data; this can be done by specifying the time range as an index.
tstrain := ts[.. "1958-12"]
tsverify := ts["1959" ..]
The family of models used is an exponential smoothing model.
model := ExponentialSmoothingModel(tstrain)
model:=< an ETS(M,M,M) model >
You can observe that the best model in this case has multiplicative errors, a multiplicative trend, and a multiplicative seasonal component. You can get the details of this exponential smoothing model using the GetParameters command.
GetParameters(model)
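You can also query a single parameter. The GetParameter command is exported by the package; as a sketch, assuming it takes the model and a parameter name (here alpha, the level smoothing parameter), and noting that SetParameter can similarly be used to override a value:
GetParameter(model, 'alpha');   # the level smoothing parameter of the fitted model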
Use the model to predict two years of future data using the Forecast command; the actual data for this time range is in tsverify.
forecast := Forecast(model, tstrain, 24)
The forecast is itself a time series object that can be inspected using GetData, GetHeaders, and other commands.
GetHeaders(forecast)
Monthly Passengers (forecast)
TimeSeriesPlot(tstrain, tsverify, forecast)
There is reasonable agreement between the forecast and the verification data. You can get a better sense of the quality of the forecast by including confidence intervals for the forecasted data points; you can then see how often the true data falls within their boundaries.
confidence := Forecast(model, tstrain, 24, output = confidenceintervals(80, 95))
TimeSeriesPlot(tstrain, forecast, confidence, [tsverify, color = "Red", thickness = 3])
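If, like the forecast, the confidence intervals are returned as a time series object, you can inspect which bounds it contains and extract them as numeric data in the same way as before:
GetHeaders(confidence);   # names of the lower and upper bound columns
GetData(confidence);      # the corresponding bounds, one column per header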
Use the model to decompose the data set into several components; the number of components depends on the model. This is done using the Decomposition command. For an exponential smoothing model, there are always level and residual components; there may also be trend and seasonal components, depending on whether the model has those properties. This can be used, for example, to correct for seasonal influences or to smooth the data.
decomposition := Decomposition(model, tstrain)
TimeSeriesPlot(tstrain, decomposition, split = pertimeseries)
Because the seasonal component is multiplicative, you can compensate for it by dividing the original data by it.
trainingdata := GetData(tstrain)
seasonaldata := GetData(decomposition[.., "Monthly Passengers (seasonal)"])
nonseasonaldata := TimeSeries(trainingdata /~ seasonaldata, 'header' = "Monthly Passengers (deseasonalized)", 'dates' = GetDates(tstrain))
TimeSeriesPlot(nonseasonaldata)
There are several commands that can be used to modify existing time series objects. Using the data from above, you can use the Join command to merge the forecast with the training data set.
merged := Join(tstrain, forecast)
TimeSeriesPlot(merged)
Another command, Difference, applies a differencing transformation in a flexible way. The LogTransform command applies a logarithmic transformation, and BoxCoxTransform generalizes this to a Box-Cox transformation with an arbitrary parameter λ.
differenced := Apply(Difference, ts)
logs := Apply(LogTransform, ts)
boxcox := Apply(BoxCoxTransform(lambda = 1/3), ts)
TimeSeriesPlot(ts, differenced, logs, boxcox, split = pertimeseries)
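The package also exports Unapply; assuming it applies the inverse of a transformation with the same calling sequence as Apply, you could recover the original series from its log transform and confirm the round trip in a plot:
recovered := Unapply(LogTransform, logs);   # assumed inverse of the earlier Apply call
TimeSeriesPlot(ts, recovered, split = pertimeseries)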
You can manually step through the process of finding a suitable model for the data set using the Specialize, Initialize, and Optimize commands. The ExponentialSmoothingModel command generates an exponential smoothing model object that represents a wide range of models. In this case, the data clearly has a strong seasonal component, so you can discard the models that do not take a seasonal component into account.
general_model := ExponentialSmoothingModel('seasonal' = ["A", "M"])
general_model:=< an ETS(*,*,*) model >
You can now specialize this to all individual model formulations represented by the general model, that is, those where the seasonal component is additive or multiplicative. Some models are excluded by default because they are subject to numerical difficulties: their forecasts have infinite variance. (They can be included by overriding an option to Specialize.)
individual_models := Specialize(general_model, ts)
individual_models:=< an ETS(A,A,A) model >,< an ETS(A,Ad,A) model >,< an ETS(A,N,A) model >,< an ETS(M,A,A) model >,< an ETS(M,A,M) model >,< an ETS(M,Ad,A) model >,< an ETS(M,Ad,M) model >,< an ETS(M,M,M) model >,< an ETS(M,Md,M) model >,< an ETS(M,N,A) model >,< an ETS(M,N,M) model >
Each individual model now has a slightly different set of model equations. They all still have parameters and initial values for the model's state variables that need to be optimized for the best fit. This optimization process consists of a number of iterations: in each iteration, Maple picks a new set of values for the parameters and initial state values, runs the simulation using the model, and then computes the deviations from the actual observed data. The first set of values, which initializes the optimization process, is computed directly from the actual data by the Initialize command.
initialization_tables := map(Initialize, individual_models, ts):
Here is an example of these initialization values:
individual_models[1], initialization_tables[1]
You can now perform the optimization.
map(Optimize, individual_models, ts)
−579.1189017,−587.8357482,−587.5947092,−577.5718572,−527.4987042,−559.4653237,−528.6819747,−527.8440927,−527.3996887,−565.0019522,−535.5928547
The optimization process sets all parameters and initial state values to the optimal values found. The Optimize command returns a measure of how closely the simulation run fits the actual data. However, these models all have different numbers of parameters, and if a similar fit is achieved by a model with fewer parameters, the principle of parsimony says you should prefer that model. This trade-off is quantified by an information criterion: a function that takes both the closeness of the fit and the number of parameters into account. The TimeSeriesAnalysis package has three information criteria built in: two versions of Akaike's Information Criterion (AICc, which includes a correction for small sample sizes, and AIC, which does not), and the Bayesian Information Criterion (BIC). Let us compare the BIC for all these models.
for m in individual_models do print(m, BIC(m, ts)) end do:
< an ETS(A,A,A) model >,1412.79583526645
< an ETS(A,Ad,A) model >,1445.84094061076
< an ETS(A,N,A) model >,1422.76337107009
< an ETS(M,A,A) model >,1722.03500867297
< an ETS(M,A,M) model >,1133.89411113851
< an ETS(M,Ad,A) model >,1945.38209291183
< an ETS(M,Ad,M) model >,1141.84855312639
< an ETS(M,M,M) model >,1139.65948604423
< an ETS(M,Md,M) model >,1141.03566969852
< an ETS(M,N,A) model >,2087.66988840404
< an ETS(M,N,M) model >,1150.95261586989
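Before drawing conclusions, you could run the same comparison with AICc (or AIC) to check whether the ranking changes under a different criterion; this sketch assumes those commands take the same arguments as BIC above:
for m in individual_models do
  print(m, AICc(m, ts))   # corrected Akaike information criterion for each candidate
end do: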
From the results, the (M,A,M), (M,Ad,M), (M,M,M), (M,Md,M), and (M,N,M) models give the best results in terms of the Bayesian information criterion, with (M,A,M) achieving the lowest BIC.
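If you prefer to select the winning model programmatically rather than by reading off the table, a small loop over the candidates suffices; this sketch reuses only the BIC call shown above:
bestbic := infinity:
for m in individual_models do
  thisbic := BIC(m, ts);
  if thisbic < bestbic then   # keep the candidate with the smallest BIC so far
    best := m;
    bestbic := thisbic
  end if
end do:
best;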
References
Box, G.E.P., Jenkins, G.M., and Reinsel, G.C. (1976) Time Series Analysis, Forecasting and Control. Third Edition. Holden-Day. Series G.
Hyndman, R.J. and Athanasopoulos, G. (2013) Forecasting: principles and practice. http://otexts.org/fpp/. Accessed on 2013-10-09.
Hyndman, R.J., Koehler, A.B., Ord, J.K., and Snyder, R.D. (2008) Forecasting with Exponential Smoothing: The State Space Approach. Springer Series in Statistics. Springer-Verlag Berlin Heidelberg.
See Also
Overview of the TimeSeriesAnalysis Package