An efficient method for feature selection in linear regression based on an extended Akaike's information criterion
Citation information for this article was obtained from
Web of Science ,
Scopus
The article was published in a journal indexed in Web of Science and/or Scopus
Date of the last search for the article in external sources: November 10, 2014
Authors:
Vetrov D.P. ,
Kropotov D.A. ,
Ptashko N.O.
Journal:
Computational Mathematics and Mathematical Physics
Volume:
49
Issue:
11
Year:
2009
Publisher:
Pleiades Publishing, Ltd
Publisher location:
Road Town, United Kingdom
First page:
1972
Last page:
1985
DOI:
10.1134/S096554250911013X
Abstract:
A method for feature selection in linear regression based on an extension of Akaike's information criterion is proposed. The use of the classical Akaike information criterion (AIC) for feature selection requires an exhaustive search through all subsets of features, which has an unreasonably high computational cost. A new information criterion is proposed that is a continuous extension of AIC. As a result, the feature selection problem is reduced to a smooth optimization problem, and an efficient procedure for solving it is derived. Experiments show that the proposed method makes it possible to efficiently select features in linear regression. In the experiments, the proposed procedure is compared with the relevance vector machine, a feature selection method based on the Bayesian approach. It is shown that both procedures yield similar results. The main distinction of the proposed method is that certain regularization coefficients are identically zero. This makes it possible to avoid the underfitting effect, which is characteristic of the relevance vector machine. A special case (the so-called nondiagonal regularization) is considered in which both methods are identical. © Pleiades Publishing, Ltd., 2009.
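To illustrate the baseline the abstract refers to, the sketch below implements classical AIC-based feature selection in linear regression via exhaustive subset search. This is not the paper's continuous extension; it is a minimal reference implementation of the O(2^d) procedure whose cost motivates the proposed method. All function names and the synthetic data are illustrative assumptions.

```python
import itertools
import numpy as np


def aic_linear(X, y):
    """AIC for an OLS fit: n * ln(RSS / n) + 2k, up to an additive constant."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return n * np.log(rss / n) + 2 * k


def exhaustive_aic_selection(X, y):
    """Fit every non-empty feature subset and keep the one with minimal AIC.

    This is the classical procedure: 2^d - 1 model fits for d features,
    which quickly becomes infeasible as d grows.
    """
    d = X.shape[1]
    best_subset, best_aic = None, np.inf
    for r in range(1, d + 1):
        for subset in itertools.combinations(range(d), r):
            aic = aic_linear(X[:, subset], y)
            if aic < best_aic:
                best_subset, best_aic = subset, aic
    return best_subset, best_aic


# Synthetic example: only features 0 and 2 carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.1 * rng.normal(size=100)

subset, aic = exhaustive_aic_selection(X, y)
print("selected features:", subset)
```

On data like this, the exhaustive search recovers the informative features, but the number of fits doubles with every added feature, which is exactly the cost the paper's smooth relaxation avoids.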
Added to the system by:
Dmitry Aleksandrovich Kropotov