Why do polynomials have a bad reputation for overfitting?

1 point by andrew_v4 about 4 years ago

2 comments

anothernewdude about 4 years ago
Because the coefficients of the larger powers are more sensitive, and most attempts to fit those models don't account for that by penalising high weights on those terms more heavily. You could have a regularisation regime that accounts for it. It's also why extrapolation fails on them so often.

x^2 grows fast, x^3 even faster. They outpace the other parts of the model. Any change to those coefficients also means you'll need to change the coefficients of the smaller powers to compensate.

A small amount of error in the largest power means two things: first, the error will be exaggerated, because that power has an outsized effect; second, the values of the smaller powers will also be wrongly adjusted to account for that error, and will contribute that poor fit to the model as well.
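
A minimal sketch of the sensitivity described above (my own illustration, not from the comment; the data, degree, noise level, and seed are all arbitrary): fit the same underlying curve twice with a degree-9 polynomial under slightly different noise draws, then compare the leading coefficients and the extrapolated values.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
true_y = np.sin(2 * np.pi * x)

# Two draws of the same data with slightly different noise.
y1 = true_y + 0.05 * rng.standard_normal(x.size)
y2 = true_y + 0.05 * rng.standard_normal(x.size)

# Degree-9 least-squares fits, no regularisation.
c1 = np.polyfit(x, y1, deg=9)
c2 = np.polyfit(x, y2, deg=9)

# The highest-power coefficients swing far more than the 0.05 noise level...
print("leading coefficients:", c1[0], c2[0])

# ...and extrapolating just outside the fitted range diverges badly.
x_out = 1.5
print("extrapolated values at x=1.5:",
      np.polyval(c1, x_out), np.polyval(c2, x_out))
print("true value at x=1.5:", np.sin(2 * np.pi * x_out))
```

Under plain least squares like this, the two fits typically disagree wildly both in their leading coefficients and at x = 1.5, which is the extrapolation failure the comment points to.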
Bostonian about 4 years ago
Cubic splines, which are piecewise cubic polynomials, have been widely used by statisticians for nonparametric regression. "Natural" splines are constrained to be linear in the tails, which reduces the danger of extrapolation.
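
For concreteness, here is a rough sketch of regression on a natural cubic spline basis (my own illustration, following the standard truncated-power construction described in Hastie et al.'s Elements of Statistical Learning; the knot placement, data, and names are just examples). The basis is linear beyond the boundary knots, so predictions outside the fitted range grow at most linearly rather than like a high power of x.

```python
import numpy as np

def natural_spline_basis(x, knots):
    """Natural cubic spline basis (K columns for K knots): 1, x, and
    truncated-cubic terms whose cubic parts cancel outside the boundary knots."""
    x = np.asarray(x, dtype=float)
    knots = np.asarray(knots, dtype=float)
    K = len(knots)
    xi_K = knots[-1]

    def d(k):
        # d_k(x) = [(x - xi_k)_+^3 - (x - xi_K)_+^3] / (xi_K - xi_k)
        return (np.clip(x - knots[k], 0, None) ** 3
                - np.clip(x - xi_K, 0, None) ** 3) / (xi_K - knots[k])

    cols = [np.ones_like(x), x]
    d_last = d(K - 2)                      # d_{K-1} in 1-based notation
    for k in range(K - 2):
        cols.append(d(k) - d_last)
    return np.column_stack(cols)

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 60)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

knots = np.linspace(0.05, 0.95, 6)             # boundary + interior knots (illustrative)
B = natural_spline_basis(x, knots)
coef, *_ = np.linalg.lstsq(B, y, rcond=None)   # ordinary least squares on the basis

# Beyond the boundary knots every basis column is linear in x, so the fit
# extrapolates linearly instead of blowing up like a high-degree polynomial.
x_new = np.array([1.2, 1.5])
print(natural_spline_basis(x_new, knots) @ coef)
```

In practice one would likely reach for a packaged implementation (e.g. R's splines::ns), but the hand-built basis makes the linear-tail constraint explicit.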