
Principal Component Analysis explained visually (2015)

176 points by spking, over 2 years ago

6 comments

aquafox, over 2 years ago
Here is a much better explanation of PCA: https://stats.stackexchange.com/questions/2691/making-sense-of-principal-component-analysis-eigenvectors-eigenvalues

The key insight many are missing is that PCA solves a series of optimization problems: reconstructing the data from the first k PCs gives the best k-dimensional approximation in terms of squared error. Moreover, this is equivalent to assuming that the data lives in a k-dimensional subspace and only becomes truly high-dimensional because of normally distributed noise that spills into every direction (dimension).
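A minimal NumPy sketch of that optimization view (toy data; variable names are illustrative): truncating the SVD of the centered data at k components is the rank-k least-squares reconstruction, and the residual squared error equals the sum of the discarded squared singular values (the Eckart-Young theorem).

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))     # toy data: 100 samples, 5 features
Xc = X - X.mean(axis=0)           # PCA operates on centered data

# The principal components are the right singular vectors of Xc.
_, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2
X_hat = Xc @ Vt[:k].T @ Vt[:k]    # project onto the first k PCs, then reconstruct

# Eckart-Young: this is the best rank-k approximation in squared error,
# and the residual equals the sum of the discarded squared singular values.
print(np.sum((Xc - X_hat) ** 2))  # these two printed values agree
print(np.sum(s[k:] ** 2))
```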
wjnc, over 2 years ago
The best thing I've ever read on PCA is Madeleine Udell's PhD thesis [1]. It extends PCA in many directions and shows that several well-known techniques fit into the framework it develops. (I was also impressed that a 138-page math thesis is so readable. Quite the achievement.)

[1] https://people.orie.cornell.edu/mru8/doc/udell15_thesis.pdf
dang, over 2 years ago
Related:

Principal Component Analysis Explained Visually - https://news.ycombinator.com/item?id=27017675 - May 2021 (44 comments)

Principal Component Analysis Explained Visually (2015) - https://news.ycombinator.com/item?id=14405665 - May 2017 (25 comments)

Principal component analysis explained visually - https://news.ycombinator.com/item?id=9040266 - Feb 2015 (22 comments)
lxe, over 2 years ago
Also see:

- Markov Chains (https://setosa.io/ev/markov-chains/)
- Image Kernels (https://setosa.io/ev/image-kernels/)
- Bus Bunching (https://setosa.io/bus/)

Wish these guys kept producing more visualizations!
blt, over 2 years ago
In the UK eating example, it would be better to examine the feature-space singular vector associated with the first singular value instead of instructing the reader to "go back and look at the data in the table". PCA has already done that work; no additional (error-prone, subjective) interpretation is needed.
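A sketch of that suggestion, assuming NumPy; the food names and the matrix below are hypothetical stand-ins for the article's UK consumption table, not the real data. The loadings in the first right singular vector rank the features by how strongly they drive the first component.

```python
import numpy as np

# Hypothetical stand-ins for the article's table: rows = countries, columns = foods.
foods = ["cheese", "carcass_meat", "fresh_potatoes", "alcoholic_drinks"]
rng = np.random.default_rng(1)
X = rng.normal(size=(4, len(foods)))
Xc = X - X.mean(axis=0)

_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
loadings = Vt[0]  # feature-space singular vector for the first singular value

# Features with the largest |loading| dominate the first principal component --
# the information the comment says PCA has already extracted from the table.
for food, w in sorted(zip(foods, loadings), key=lambda t: -abs(t[1])):
    print(f"{food:>18s}  {w:+.3f}")
```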
nerdponx, over 2 years ago
I'm not sure this is an explanation so much as an introductory demo. Nice visualizations, though.