
Intro to Hidden Markov Models (2010) [pdf]

181 points by kercker almost 9 years ago

6 comments

platz almost 9 years ago
-- vs Kalman filters:

"In both models, there's an unobserved state that changes over time according to relatively simple rules, and you get indirect information about that state every so often. In Kalman filters, you assume the unobserved state is Gaussian-ish and it moves continuously according to linear-ish dynamics (depending on which flavor of Kalman filter is being used). In HMMs, you assume the hidden state is one of a few classes, and the movement among these states uses a discrete Markov chain. In my experience, the algorithms are often pretty different for these two cases, but the underlying idea is very similar." - THISISDAVE

-- vs LSTM/RNN:

"Some state-of-the-art industrial speech recognition [0] is transitioning from HMM-DNN systems to "CTC" (connectionist temporal classification), i.e., basically LSTMs. Kaldi is working on "nnet3" which moves to CTC, as well. Speech was one of the places where HMMs were _huge_, so that's kind of a big deal." - PRACCU

"HMMs are only a small subset of generative models that offers quite little expressiveness in exchange for efficient learning and inference." - NEXTOS

"IMO, anything that can be done with an HMM can now be done with an RNN. The only advantage that an HMM might have is that training it might be faster using cheaper computational resources. But if you have the $$$ to get yourself a GPU or two, this computational advantage disappears for HMMs." - SHERJILOZAIR
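To make THISISDAVE's contrast concrete, here is a minimal NumPy sketch of the two generative processes; all parameter values are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- HMM: hidden state is one of a few classes, driven by a discrete Markov chain ---
A = np.array([[0.9, 0.1],          # A[i, j] = P(z_{t+1} = j | z_t = i)
              [0.2, 0.8]])
means = np.array([0.0, 5.0])       # emission: Gaussian whose mean depends on z_t

z, hmm_obs = 0, []
for _ in range(100):
    z = rng.choice(2, p=A[z])                # discrete Markov step
    hmm_obs.append(rng.normal(means[z], 1))  # indirect, noisy view of z

# --- Kalman-style model: continuous Gaussian state with linear dynamics ---
x, kf_obs = 0.0, []
for _ in range(100):
    x = 0.95 * x + rng.normal(0, 0.5)        # linear-Gaussian state update
    kf_obs.append(x + rng.normal(0, 1.0))    # noisy linear observation
```

Same shape in both cases (hidden state, indirect observations); the difference is discrete states plus a transition matrix versus a continuous state plus linear dynamics.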
melling almost 9 years ago
Markov Chains Explained Visually: http://setosa.io/ev/markov-chains/
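As a tiny companion to the visual explainer, a sketch that simulates a two-state chain (with made-up transition probabilities) and estimates its stationary distribution by counting visits:

```python
import numpy as np

P = np.array([[0.9, 0.1],   # P[i, j] = probability of moving from state i to state j
              [0.5, 0.5]])

rng = np.random.default_rng(1)
state, counts = 0, np.zeros(2)
for _ in range(100_000):
    state = rng.choice(2, p=P[state])   # one Markov step
    counts[state] += 1

print(counts / counts.sum())   # approaches the stationary distribution, ~[0.83, 0.17]
```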
mjt0229 almost 9 years ago
A coworker of mine used to ask job candidates (usually folks with PhDs) with HMMs on their CV "what's hidden in a hidden Markov model?" Lots of people couldn't answer that question.
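For the record, the usual intended answer is the state sequence: you observe only the emissions, and recovering the most likely hidden states is exactly what the Viterbi algorithm does. A minimal log-space sketch, with all model parameters invented:

```python
import numpy as np

logA = np.log([[0.9, 0.1],    # transition probabilities (hypothetical)
               [0.2, 0.8]])
logpi = np.log([0.5, 0.5])    # initial state distribution
means = np.array([0.0, 5.0])  # Gaussian emission means, unit variance

def viterbi(obs):
    # Emission log-likelihoods, up to an additive constant.
    logB = -0.5 * (obs[:, None] - means) ** 2
    delta = logpi + logB[0]
    back = []
    for t in range(1, len(obs)):
        scores = delta[:, None] + logA     # scores[i, j]: best path ending with i -> j
        back.append(scores.argmax(axis=0))
        delta = scores.max(axis=0) + logB[t]
    path = [int(delta.argmax())]
    for ptr in reversed(back):             # follow back-pointers to recover the path
        path.append(int(ptr[path[-1]]))
    return path[::-1]

print(viterbi(np.array([0.1, 0.3, 4.8, 5.2, 0.2])))  # -> [0, 0, 1, 1, 0]
```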
gallamine almost 9 years ago
Are there any open tools for solving HMMs on large datasets? I.e., if I have millions of observations from millions of users and want to learn an HMM from the data, what are my options?
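One possible starting point (my suggestion, not the poster's): the open-source Python library hmmlearn, which fits an HMM by EM over many concatenated sequences. The data shapes below are invented; at a true millions-of-users scale you would likely need subsampling or a minibatch/distributed variant, which, as far as I know, hmmlearn itself does not provide:

```python
import numpy as np
from hmmlearn import hmm

# Toy stand-in for the real data: 1,000 users, each contributing a
# sequence of 50 one-dimensional observations, concatenated row-wise.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000 * 50, 1))
lengths = [50] * 1000            # one entry per user sequence

model = hmm.GaussianHMM(n_components=3, n_iter=20)
model.fit(X, lengths)            # Baum-Welch (EM) over all sequences
print(model.transmat_)           # learned transition matrix
```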
graycat almost 9 years ago
> A Markov chain is a sequence of random variables X1, X2, X3, ..., Xt, ..., such that the probability distribution of Xt+1 depends only on t and xt (Markov property), in other words:

No. In a Markov process, the future does depend on the past, even all of the past. But what is special is that the past and the future are conditionally independent given the present. If we are not given the present, then all of the past can be relevant in predicting the future.
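A quick empirical check of this point, on an invented two-state chain: the past predicts the future on its own, but adds nothing once you condition on the present:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
rng = np.random.default_rng(0)

xs = [0]
for _ in range(500_000):
    xs.append(rng.choice(2, p=P[xs[-1]]))
xs = np.array(xs)

past, present, future = xs[:-2], xs[1:-1], xs[2:]

# On its own, the past does predict the future (~0.14 vs ~0.30):
print(future[past == 0].mean(), future[past == 1].mean())

# Given the present, the past adds nothing (both ~0.10):
print(future[(present == 0) & (past == 0)].mean(),
      future[(present == 0) & (past == 1)].mean())
```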
platz almost 9 years ago
Link to MIT course: http://ocw.mit.edu/courses/aeronautics-and-astronautics/16-410-principles-of-autonomy-and-decision-making-fall-2010/