
Principal Component Analysis

138 points by olooney over 5 years ago

9 comments

Der_Einzige over 5 years ago
BTW, for real-world use: if you want to do PCA but want a better solution than an algorithm that makes linearity assumptions, there are two really hot algorithms for dimensionality reduction right now:

UMAP - a topology/manifold-learning-based method

Ivis - a Siamese triplet-network autoencoder

Both of them will blow PCA out of the water on basically all datasets. PCA's only advantages are speed and interpretability (easy to see explained covariance).
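(A minimal sketch of the comparison the comment suggests, assuming the scikit-learn and umap-learn packages; Ivis ships as a separate ivis package with a similar fit/transform interface and is not shown:)

    # Compare linear PCA against UMAP on the same dataset.
    # Assumes: pip install scikit-learn umap-learn
    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    import umap

    X, y = load_digits(return_X_y=True)  # 1797 samples, 64 features

    # PCA: fast, linear, interpretable via explained variance ratios
    pca = PCA(n_components=2)
    X_pca = pca.fit_transform(X)
    print("PCA explained variance:", pca.explained_variance_ratio_)

    # UMAP: nonlinear, preserves local manifold structure
    X_umap = umap.UMAP(n_components=2, random_state=42).fit_transform(X)
    print(X_pca.shape, X_umap.shape)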
tomkat0789 over 5 years ago
PCA is great. I like this paper, where it holds its own against all the fancy nonlinear techniques:

https://lvdmaaten.github.io/publications/papers/TR_Dimensionality_Reduction_Review_2009.pdf

When have you needed something stronger than PCA? Anybody have good stories?
objektif over 5 years ago
Can someone knowledgeable please give us examples of real-life use of PCA? Not "could be used here, could be used there" toy examples, but actual use.
zmmmmm over 5 years ago
Am I missing something or is equation (9) incorrect unless the mean of the random variable x is zero (which is never specified)?
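(The point checks out in general: E[xxᵀ] equals the covariance matrix only when E[x] = 0; otherwise the two differ by the outer product of the mean, μμᵀ. A quick NumPy check of that identity, independent of the article's equation numbering:)

    # The Gram-matrix "covariance" X^T X / n only equals the true
    # covariance when the data are mean-centered first.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(loc=5.0, scale=2.0, size=(1000, 3))  # nonzero mean

    gram = X.T @ X / len(X)                    # E[x x^T], no centering
    cov = np.cov(X, rowvar=False, bias=True)   # centered covariance

    # The two differ by the outer product of the mean, mu mu^T:
    mu = X.mean(axis=0)
    print(np.allclose(gram - np.outer(mu, mu), cov))  # True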
SubiculumCode over 5 years ago
I've been using a somewhat related technique in my research: Principal Coordinate Analysis (PCoA), also called Multidimensional Scaling (MDS), which works on a dissimilarity matrix. See [1] for the differences.

[1] http://occamstypewriter.org/boboh/2012/01/17/pca_and_pcoa_explained/
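(A minimal sketch of embedding a precomputed dissimilarity matrix, assuming scikit-learn; note that sklearn's MDS optimizes stress via SMACOF rather than the eigendecomposition used in classical PCoA, but it accepts the same input:)

    # MDS/PCoA-style embedding from a precomputed dissimilarity matrix.
    from sklearn.datasets import load_iris
    from sklearn.manifold import MDS
    from sklearn.metrics import pairwise_distances

    X, _ = load_iris(return_X_y=True)
    D = pairwise_distances(X, metric="euclidean")  # n x n dissimilarities

    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    embedding = mds.fit_transform(D)
    print(embedding.shape)  # (150, 2)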
alexcnwy over 5 years ago
Excellent explanation!

I find this visualization really helpful when I need to explain PCA: http://setosa.io/ev/principal-component-analysis/
Patient0 over 5 years ago
This was a great article - I'd been wanting to understand PCA. Particularly liked the digression on Lagrange multipliers.
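(For readers who haven't seen it: that digression derives the first principal component by maximizing the projected variance under a unit-norm constraint, which a Lagrange multiplier turns into an eigenvalue problem. In sketch form:)

    \max_{w} \; w^\top \Sigma w \quad \text{subject to} \quad w^\top w = 1

    \mathcal{L}(w, \lambda) = w^\top \Sigma w - \lambda (w^\top w - 1),
    \qquad \nabla_w \mathcal{L} = 2\Sigma w - 2\lambda w = 0
    \;\Rightarrow\; \Sigma w = \lambda w

so w must be an eigenvector of the covariance matrix Σ, and the achieved variance wᵀΣw = λ is maximized by taking the largest eigenvalue.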
ris over 5 years ago
Some nice intuitive explanations in there.
platz over 5 years ago
Anyone want to comment on how to choose between PCA and ICA?

(Some of the hardest parts of ML, imho, are in this kind of selection.)
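(One common rule of thumb: PCA finds orthogonal directions of maximal variance, while ICA recovers statistically independent, non-Gaussian sources, e.g. for blind source separation. A minimal scikit-learn sketch on synthetic mixed signals:)

    # PCA vs ICA on mixed signals: ICA recovers independent sources,
    # PCA just finds decorrelated directions of maximal variance.
    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)
    s1 = np.sin(2 * t)                        # sinusoidal source
    s2 = np.sign(np.sin(3 * t))               # square-wave source
    S = np.c_[s1, s2] + 0.05 * rng.normal(size=(2000, 2))

    A = np.array([[1.0, 1.0], [0.5, 2.0]])    # mixing matrix
    X = S @ A.T                               # observed mixtures

    S_ica = FastICA(n_components=2, random_state=0).fit_transform(X)
    S_pca = PCA(n_components=2).fit_transform(X)
    # S_ica resembles the original sources (up to sign/scale/order);
    # S_pca merely decorrelates the mixtures.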