Surprised (and delighted!) to see this pop up on the front page of HN today! This course is designed to teach proficient coders the <i>practice</i> of deep learning, that is: how to train accurate models; how to test and debug models; and key deep learning concepts. It covers applications in vision, natural language processing, tabular data, and recommendation systems.<p>If you're interested in diving deeper into the papers and math behind the scenes, as well as coding from the lowest levels (right down to the compiler level), you'll be interested in "Deep Learning from the Foundations", which is coming out in 2 weeks. The last two lessons are co-taught with Chris Lattner (creator of Swift, LLVM, and Clang).<p>If you want to understand the underlying linear algebra implementation details, have a look at "Computational Linear Algebra for Coders": <a href="https://github.com/fastai/numerical-linear-algebra" rel="nofollow">https://github.com/fastai/numerical-linear-algebra</a><p>If you want to learn about decision trees, random forests, linear regression, validation sets, etc, try "Introduction to Machine Learning for Coders": <a href="https://course18.fast.ai/ml" rel="nofollow">https://course18.fast.ai/ml</a><p>(All are free and have no ads. They are provided as a service to the community.)<p>Let me know if you have any questions!
I really love this course. I have no deep learning experience except for this, but in a few hours of watching these videos I was literally building image classifiers that worked pretty well. Coincidentally, just today I started working on a hobby project using the stuff I learned here.
The lessons look interesting from a high-level perspective, and I think they could help guide people in their applications.<p>I think there's also a need for a very low-level course in deep learning, i.e. at the level of someone who wants to write their own deep learning library. Because from high up, sure, it all looks like the chain rule, but down low it gets messy quickly if you want to write a high-performance library on your own.
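For instance, at the very bottom a two-layer net really is just a handful of matrix multiplies plus the chain rule — here's a rough sketch in plain NumPy (toy sizes, made up for illustration). Everything past this point (autograd bookkeeping, memory layout, fused kernels, numerical stability) is where a real library gets messy:
<pre><code>import numpy as np

# Toy two-layer network: x -> W1 -> ReLU -> W2 -> prediction, MSE loss.
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 10))        # batch of 32 inputs, 10 features
y = rng.standard_normal((32, 1))         # regression targets
W1 = rng.standard_normal((10, 64)) * 0.1
W2 = rng.standard_normal((64, 1)) * 0.1

# Forward pass, keeping intermediates for the backward pass
h_pre = x @ W1
h = np.maximum(h_pre, 0)                 # ReLU
pred = h @ W2
loss = ((pred - y) ** 2).mean()

# Backward pass: just the chain rule, applied layer by layer
dpred = 2 * (pred - y) / len(x)          # dL/dpred
dW2 = h.T @ dpred                        # dL/dW2
dh = dpred @ W2.T                        # dL/dh
dh_pre = dh * (h_pre > 0)                # backprop through the ReLU
dW1 = x.T @ dh_pre                       # dL/dW1

# One SGD step
W1 -= 0.01 * dW1
W2 -= 0.01 * dW2
</code></pre>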
As a complete non-expert in ML I have watched some of the fast.ai videos and dabbled around with their library, and I generally like their approach of trying to make state-of-the-art ML as easily approachable as possible for practical purposes.<p>However, what worries me a <i>lot</i> is the complete breakup of the API between versions and some of the discussion about Swift. As someone who isn't a full-time expert, I need a robust and stable framework to keep learning with, so that I can keep building knowledge without worrying that whatever I have learned about a framework (or which language to use!) will be obsolete in a couple of months and I'll need to start from scratch again.<p>So, does anyone know if fast.ai is going to stabilize anytime soon, or is it better for me to spend the little time I have to play with ML and deep learning directly with e.g. PyTorch?
I wrote a similar guide based on my experiences:<p><a href="https://austingwalters.com/neural-networks-to-production-from-an-engineer/" rel="nofollow">https://austingwalters.com/neural-networks-to-production-fro...</a><p>I’ve gone through this guide and others before, as I often teach courses like “Introduction to Deep Learning”.<p>The guide from fast.ai is very good and I highly recommend it. If you want to get into deep learning, I also recommend taking a numerical methods course or working through a guide on it. Once you understand the basics, the hard part is understanding the pitfalls introduced by hardware (and some software) limitations.
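As a concrete example of the kind of hardware pitfall I mean (a generic float32 illustration, not something taken from either guide): the naive softmax overflows in single precision as soon as the logits get moderately large, which is why libraries subtract the max first:
<pre><code>import numpy as np

def naive_softmax(logits):
    # Overflows in float32 once a logit exceeds ~88, since exp(88) > float32 max
    e = np.exp(logits)
    return e / e.sum()

def stable_softmax(logits):
    # Subtracting the max is mathematically a no-op but keeps exp() in range
    e = np.exp(logits - logits.max())
    return e / e.sum()

logits = np.array([100.0, 101.0, 102.0], dtype=np.float32)
print(naive_softmax(logits))   # [nan nan nan] -- exp() overflowed to inf
print(stable_softmax(logits))  # roughly [0.09, 0.245, 0.665]
</code></pre>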
Much deep learning focuses on image classification (the first topic of the fast.ai course) and natural language processing.<p>What is a good course that focuses on "tabular data", in particular predicting continuous outputs from continuous inputs, aka regression?
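For concreteness, the kind of thing I mean is roughly this (plain PyTorch, with a made-up synthetic dataset), and I'd like a course that goes well beyond it — feature preprocessing, embeddings for categorical columns, validation strategy, and so on:
<pre><code>import torch
from torch import nn

# Toy tabular regression: 8 continuous inputs -> 1 continuous output
X = torch.randn(1000, 8)
y = X @ torch.randn(8, 1) + 0.1 * torch.randn(1000, 1)  # synthetic target

model = nn.Sequential(
    nn.Linear(8, 64),
    nn.ReLU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(100):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

print(loss.item())
</code></pre>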
Just please do not adopt their coding style. Use this instead: <a href="http://google.github.io/styleguide/pyguide.html" rel="nofollow">http://google.github.io/styleguide/pyguide.html</a> and try not to drag global state through your entire codebase with kwargs.
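To make the kwargs complaint concrete, here's a generic contrast (not code from the fastai library, just an illustration): forwarding a grab-bag of **kwargs through every layer hides what each function actually depends on, whereas explicit parameters keep the dependencies visible in the signature:
<pre><code># Implicit: every function forwards **kwargs, so nobody can tell where
# 'lr' or 'wd' is actually consumed without reading the whole call chain.
def fit(model, data, **kwargs):
    step(model, data, **kwargs)

def step(model, data, **kwargs):
    lr = kwargs.get("lr", 1e-3)
    wd = kwargs.get("wd", 0.0)
    ...

# Explicit: the signature documents exactly what the function needs.
def fit_explicit(model, data, lr: float = 1e-3, wd: float = 0.0):
    step_explicit(model, data, lr=lr, wd=wd)

def step_explicit(model, data, lr: float, wd: float):
    ...
</code></pre>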
Anyone have thoughts on the utility of the Intro to ML and Computational Linear Algebra courses? I've done Ng's ML course and was interested in the first of these as a more practice-oriented complement to it; the latter sounds interesting, but a bit more of an "elective" to me.
Really awesome course! For those looking to run it locally, these seemed to be the most straightforward setup steps, assuming you have conda installed: <a href="https://course.fast.ai/start_aws.html#step-6-access-fastai-materials" rel="nofollow">https://course.fast.ai/start_aws.html#step-6-access-fastai-m...</a>
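Once you've followed those steps, a quick sanity check from Python (this just assumes fastai v1 and PyTorch ended up in your conda environment):
<pre><code>import torch
import fastai

print(fastai.__version__)         # the installed fastai version
print(torch.cuda.is_available())  # True if your GPU/CUDA setup is working
</code></pre>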
I was expecting this new course to also include a Swift section with Chris Lattner, but couldn't find it. Does anyone know by chance when that will be released (roughly)?