# Bridging AIC and BIC: A New Criterion for Autoregression

@article{Ding2018BridgingAA, title={Bridging AIC and BIC: A New Criterion for Autoregression}, author={Jie Ding and Vahid Tarokh and Yuhong Yang}, journal={IEEE Transactions on Information Theory}, year={2018}, volume={64}, pages={4024-4043} }

To address order selection for an autoregressive model fitted to time series data, we propose a new information criterion. It has the benefits of the two well-known model selection techniques: the Akaike information criterion and the Bayesian information criterion. When the data are generated from a finite-order autoregression, the Bayesian information criterion is known to be consistent, and so is the new criterion. When the true order is infinity or suitably high with respect to the sample…
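The abstract contrasts BIC's consistency with AIC's behavior in autoregressive order selection. As a hedged illustration only (this is the classical AIC/BIC comparison, not the paper's proposed bridging criterion, and the simulation settings and helper `ar_ic` are choices made here), the sketch below fits AR(k) models by least squares to simulated AR(2) data and compares the orders the two criteria select:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
n = 500
x = np.zeros(n + 100)
e = rng.standard_normal(n + 100)
for t in range(2, n + 100):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + e[t]
x = x[100:]  # drop burn-in

def ar_ic(x, max_order=6):
    """Fit AR(k) by least squares for k = 1..max_order; return AIC and BIC arrays."""
    n_eff = len(x) - max_order  # common sample so criteria are comparable across orders
    aic, bic = [], []
    for k in range(1, max_order + 1):
        # Design matrix of lagged values, aligned on the common sample
        X = np.column_stack([x[max_order - j: len(x) - j] for j in range(1, k + 1)])
        y = x[max_order:]
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        ll_term = n_eff * np.log(rss / n_eff)  # -2 * max Gaussian log-likelihood, up to a constant
        aic.append(ll_term + 2 * k)            # AIC penalty: 2 per parameter
        bic.append(ll_term + k * np.log(n_eff))  # BIC penalty: log(n) per parameter
    return np.array(aic), np.array(bic)

aic, bic = ar_ic(x)
print("AIC-selected order:", np.argmin(aic) + 1)
print("BIC-selected order:", np.argmin(bic) + 1)
```

Because BIC's per-parameter penalty log(n) exceeds AIC's penalty of 2 for any reasonable sample size, the BIC-selected order is never larger than the AIC-selected order.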


#### 39 Citations

Consistent model selection criteria and goodness-of-fit test for affine causal processes

- Mathematics
- 2019

This paper studies the model selection problem in a large class of causal time series models, which includes ARMA and AR(∞) processes as well as GARCH, ARCH(∞), APARCH, ARMA-GARCH and…

Order selection for possibly infinite-order non-stationary time series

- Computer Science
- 2019

This study proposes a two-stage information criterion (TSIC), and shows that TSIC is asymptotically efficient in predicting integrated AR models when the underlying AR coefficients satisfy a wide range of conditions.

Controlling the error probabilities of model selection information criteria using bootstrapping

- Computer Science
- 2019

The Error Control for Information Criteria (ECIC) method is presented, a bootstrap approach to controlling Type-I error using Difference of Goodness of Fit (DGOF) distributions.

On Statistical Efficiency in Learning

- Computer Science, Mathematics
- IEEE Transactions on Information Theory
- 2021

A generalized notion of Takeuchi’s information criterion is proposed, and it is proved that the proposed method can asymptotically achieve the optimal out-of-sample prediction loss under reasonable assumptions.

Performance Evaluation of AIC and BIC in Time Series Clustering with Piccolo Method

- Computer Science
- 2020

Results show that the Bayesian information criterion (BIC) outperforms the Akaike information criterion (AIC) in time series clustering with the Piccolo method.

Asymptotic analysis of model selection criteria for general hidden Markov models

- Mathematics
- Stochastic Processes and their Applications
- 2021

The paper obtains analytical results for the asymptotic properties of model selection criteria -- widely used in practice -- for a general family of hidden Markov models (HMMs), thereby substantially…

Sensitivity and specificity of information criteria

- Mathematics, Computer Science
- Briefings Bioinform.
- 2020

In some cases the comparison of two models using ICs can be viewed as equivalent to a likelihood ratio test, with the different criteria representing different alpha levels and BIC being a more conservative test than AIC.
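The likelihood-ratio-test view in this snippet can be made concrete. For nested Gaussian models differing by one parameter, AIC prefers the larger model when twice the log-likelihood gain exceeds 2, while BIC's threshold is log n; under the null, that statistic is asymptotically chi-squared with 1 degree of freedom, so each penalty implies a significance level. A minimal sketch (the helper `chi2_sf_1df` is a name chosen here, not from the cited papers):

```python
import math

def chi2_sf_1df(x):
    """Survival function of a chi-squared(1) variable: P(X > x) = 1 - erf(sqrt(x/2))."""
    return 1.0 - math.erf(math.sqrt(x / 2.0))

# AIC: the larger model wins when 2*Δlog-likelihood > 2 (one extra parameter)
alpha_aic = chi2_sf_1df(2.0)
print(f"AIC implied alpha: {alpha_aic:.4f}")  # ≈ 0.1573

# BIC: the threshold log(n) grows with sample size, so the implied test tightens
for n in (50, 500, 5000):
    print(f"BIC implied alpha (n={n}): {chi2_sf_1df(math.log(n)):.4f}")
```

This makes the snippet's point quantitative: AIC behaves like a fixed test at roughly the 16% level, whereas BIC's effective alpha shrinks toward zero as n grows, which is why BIC is the more conservative test.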


A Penalized Method for the Predictive Limit of Learning

- Computer Science
- 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2018

This paper studies a penalized model selection technique that asymptotically achieves the optimal expected prediction loss (referred to as the limit of learning) offered by a set of candidate models.

Analysis of Multistate Autoregressive Models

- Computer Science
- IEEE Transactions on Signal Processing
- 2018

This paper proposes an inference strategy that enables reliable and efficient offline analysis of this class of time series, referred to as multistate autoregressive models, and provides theoretical results and algorithms in order to facilitate the inference procedure.

#### References

Showing 1–10 of 46 references

Order selection for same-realization predictions in autoregressive processes

- Mathematics
- 2005

Assume that observations are generated from an infinite-order autoregressive [AR(∞)] process. Shibata [Ann. Statist. 8 (1980) 147-164] considered the problem of choosing a finite-order AR model,…

Prediction-focused model selection for autoregressive models

- Mathematics
- 2007

In order to make predictions of future values of a time series, one needs to specify a forecasting model. A popular choice is an autoregressive time-series model, for which the order of…

Parametric or nonparametric? A parametricness index for model selection

- Mathematics
- 2011

In model selection literature, two classes of criteria perform well asymptotically in different situations: Bayesian information criterion (BIC) (as a representative) is consistent in selection when…

Prediction/estimation with simple linear models: is it really that simple?

- Mathematics
- Econometric Theory
- 2006

Consider the simple normal linear regression model for estimation/prediction at a new design point. When the slope parameter is not obviously nonzero, hypothesis testing and information criteria can…

Catching up faster by switching sooner: a predictive approach to adaptive estimation with an application to the AIC–BIC dilemma

- Mathematics
- 2012

Prediction and estimation based on Bayesian model selection and model averaging, and derived methods such as the Bayesian information criterion BIC, do not always converge at the fastest…

Finite sample criteria for autoregressive order selection

- Mathematics, Computer Science
- IEEE Trans. Signal Process.
- 2000

The special finite sample information criterion and the combined information criterion are needed because the variance of the residual energy increases at higher model orders, an effect not accounted for by other criteria.

Regression and time series model selection in small samples

- Mathematics
- 1989

A bias correction to the Akaike information criterion, AIC, is derived for regression and autoregressive time series models. The correction is of particular use when the sample size is small,…
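The bias correction described here is the familiar AICc. As a hedged sketch (using the standard Gaussian-likelihood form of AIC, with helper names chosen here for illustration), the extra penalty 2k(k+1)/(n-k-1) can be computed directly:

```python
import math

def aic(n, k, rss):
    """AIC for a Gaussian model with k parameters and residual sum of squares rss."""
    return n * math.log(rss / n) + 2 * k

def aicc(n, k, rss):
    """Small-sample corrected AIC: adds the bias-correction term 2k(k+1)/(n-k-1)."""
    return aic(n, k, rss) + 2 * k * (k + 1) / (n - k - 1)

# With only n = 20 observations, the correction is substantial for moderate k,
# which is why the correction matters most when the sample size is small.
for k in (2, 5, 8):
    extra = 2 * k * (k + 1) / (20 - k - 1)
    print(f"k={k}: AICc penalty exceeds AIC penalty by {extra:.2f}")
```

As n grows with k fixed, the correction term vanishes and AICc converges to AIC, so the two criteria only disagree materially in small samples.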

The Focused Information Criterion

- Mathematics
- 2003

A variety of model selection criteria have been developed, of general and specific types. Most of these aim at selecting a single model with good overall properties, for example, formulated via…

Multimodel Inference

- Mathematics
- 2004

The model selection literature has been generally poor at reflecting the deep foundations of the Akaike information criterion (AIC) and at making appropriate comparisons to the Bayesian information…

Can the Strengths of AIC and BIC Be Shared?

- Mathematics
- 2003

It is well known that AIC and BIC have different properties in model selection. BIC is consistent in the sense that if the true model is among the candidates, the probability of selecting the true…