In mathematics, statistics, finance,[1] and computer science, particularly in machine learning and inverse problems, regularization is the process of adding information in order to solve an ill-posed problem or to prevent overfitting.[2]
Regularization can be applied to objective functions in ill-posed optimization problems. The regularization term, or penalty, imposes a cost on the objective function that makes the optimal solution unique.
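To make the structure of a penalized objective concrete, here is a minimal sketch assuming a least-squares loss with an L2 penalty (the names `regularized_objective`, `w`, and `lam` are illustrative, not from the original text):

```python
import numpy as np

def regularized_objective(w, X, y, lam):
    """Least-squares loss plus an L2 penalty; `lam` sets the trade-off."""
    loss = np.sum((X @ w - y) ** 2)      # data-fitting term
    penalty = lam * np.sum(w ** 2)       # regularization term (penalty)
    return loss + penalty
```

Increasing `lam` biases the optimum toward smaller coefficients, which is what turns an otherwise ill-posed fit into one with a unique solution.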
Before the lasso was introduced, ridge regression was the most popular technique for improving prediction accuracy. Ridge regression reduces prediction error by shrinking the sum of the squares of the regression coefficients to be less than a fixed value, which reduces overfitting, but it does not perform covariate selection and therefore does not make the model more interpretable.
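A minimal sketch of ridge regression, assuming the standard closed-form solution (X^T X + lam*I)^{-1} X^T y and using only NumPy (the function name `ridge_fit` is my own):

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge estimate: solve (X^T X + lam*I) w = X^T y."""
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)   # lam*I shrinks the coefficients
    return np.linalg.solve(A, X.T @ y)       # coefficients shrink but stay nonzero
```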
The lasso achieves both of these goals by forcing the sum of the absolute values of the regression coefficients to be less than a fixed value, which drives certain coefficients to exactly zero and effectively excludes them from the model. This is similar to ridge regression, which only shrinks the size of the coefficients without setting any of them to zero.
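A small comparison sketch, assuming scikit-learn's `Ridge` and `Lasso` estimators are available (the synthetic data and the `alpha` values are arbitrary choices for illustration), showing that the lasso zeroes out coefficients while ridge only shrinks them:

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
true_coef = np.zeros(10)
true_coef[:2] = [3.0, -2.0]                 # only 2 informative features
y = X @ true_coef + rng.normal(scale=0.5, size=100)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

print("ridge coefficients set to zero:", np.sum(ridge.coef_ == 0))  # typically 0
print("lasso coefficients set to zero:", np.sum(lasso.coef_ == 0))  # several exact zeros
```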
Regularization can address two types of problems: ill-posed problems and overfitting.
Original article: https://www.cnblogs.com/qianxinn/p/14763871.html