I do recommend getting the book that just came out (I did, it is fantastic): https://www.amazon.com/Deep-Learning-Coders-fastai-PyTorch/dp/1492045527

That said: fast.ai also released a draft of the book (including the notebooks) at https://github.com/fastai/fastbook

Edit: if you can afford it, getting the book is a great way to support the authors.
For those unfamiliar with fast.ai:

It is a practitioner-style deep learning course that, instead of starting with the fundamentals, starts with examples and results and then, layer by layer, reveals what it is all about and how it works in detail until you ask yourself "is that all there is?". A great way to make a seemingly unapproachable topic approachable.

You don't need big data, you don't need a GPU, you don't need to install a ton of dependencies; you only need a browser (to access Jupyter notebooks).

Last but not least: this is kind of the "definitive version" of the course, as it now comes with a book, a new version of the library (rewritten in a more thoughtful way), and new recorded lectures/lessons based on the book, with far better audio quality than the previous ones.

If you were ever curious about deep learning but didn't find the time to take a look, or thought it was unapproachable: now is a great time to dive in, and this is a great course (& book & library & community) to do so.
Oh, neat! I went through an earlier version of the online course when I was just trying to understand what this "deep learning mumbo jumbo" was all about, and it was the clearest, easiest to follow, and most interesting one available, by a long shot! One of the assignments had you train an image recognition model based on Google image results, and after a shockingly small amount of work and time I had a model that could distinguish a picture of a game of Go from a game of Chess almost perfectly. That was a huge eye-opener for me.

That was maybe 1-2 years ago at this point and I had wanted to take another look. What a perfect opportunity! And I'm excited it sounds like there might be a little more discussion of non-DL ML and applications to tabular data (where I'd have the most likely use for it), as well as the nitty gritty like deployment and use in production!

Any progress on the Swift front? Is that mentioned / used / discussed at all in this new course?
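For anyone curious what that assignment looks like with today's fastai v2 API, here is a minimal sketch. It assumes the Go and Chess images have already been downloaded into per-class folders ('go_vs_chess/go' and 'go_vs_chess/chess' are placeholder paths, not from the course):

    from fastai.vision.all import *

    # Images are assumed to live in go_vs_chess/go and go_vs_chess/chess;
    # labels are taken from the parent folder names.
    path = Path('go_vs_chess')
    dls = ImageDataLoaders.from_folder(
        path, valid_pct=0.2, seed=42,   # hold out 20% for validation
        item_tfms=Resize(224))          # resize every image to 224x224

    learn = cnn_learner(dls, resnet34, metrics=error_rate)  # transfer learning from ImageNet weights
    learn.fine_tune(4)                                       # a few epochs is usually enough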
I'm not much interested in deep learning, but I want to stay on top of it enough to understand it well, so I've done a few of the available courses and skimmed a few of the books.

The fast.ai video course was, by a wide margin, the best, most understandable, most practical and most enjoyable of them.

Just wanted to say this. Thanks so much for creating it and regularly keeping it up to date!
A question for Jeremy, perhaps. For the longest time the fast.ai courses have used Adam and one-cycle, at least for CV tasks. More recently Ranger and flat-cos have been dominating the Imagenette leaderboards. I guess I'm curious whether fast.ai intends to switch over to teaching that policy instead of one-cycle?

More generally, I'm curious what criteria the fast.ai team uses for deciding which techniques to teach. My feeling is that the courses have always taught training techniques that are a healthy mix of SOTA, generally applicable, and easy to use.

Ranger + flat-cos has seemed like a really robust combo, and easy to use. So yeah, just interested in whatever internal discussions fast.ai may have had about it and other potential replacements for Adam + one-cycle.
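For context, a rough sketch of what the two policies look like side by side in fastai v2 (not from the course itself; the learning rates here are placeholders):

    from fastai.vision.all import *

    # dls is assumed to be an existing DataLoaders, e.g. from ImageDataLoaders.from_folder(...)

    # Ranger (RAdam + Lookahead) with the flat-then-cosine schedule
    learn = cnn_learner(dls, resnet34, opt_func=ranger, metrics=accuracy)
    learn.fit_flat_cos(5, lr=4e-3)   # flat LR for most of training, cosine anneal at the end

    # The long-standing default taught in the course: Adam + one-cycle
    learn = cnn_learner(dls, resnet34, metrics=accuracy)  # Adam is the default opt_func
    learn.fit_one_cycle(5, lr_max=3e-3)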
I took fast.ai a few years ago, and then again a year or so ago. I like their lectures and their teaching methodology, which enabled me to meet a lot of interesting people in my city. But I ended up building models in vanilla PyTorch instead of using their library as an added layer, simply because they were tweaking and revamping their code so often that at times it was hard to connect the docs with the latest code.
Bought the book and trying out the lesson 1 notebook, but man, I can't seem to make this work. Colab can't import fastbook with the GPU runtime, and the TPU and CPU ones are too slow. Gradient gets a little further, but fails with "self.recorder already registered" on the "#id first_training" cell. Maybe I'm too dumb to be a data scientist, but I didn't expect to have to do this kind of debugging right off the bat.
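For reference, the setup cell the fastbook notebooks expect on Colab looks roughly like this (this is my understanding of the current package, not a guaranteed fix; restarting the runtime after the install sometimes matters):

    # Run in a fresh Colab notebook with a GPU runtime; restart the runtime
    # after the install if the import still fails.
    !pip install -Uqq fastbook   # pulls in fastai v2 and its dependencies
    import fastbook
    fastbook.setup_book()        # Colab-specific setup (widgets, Google Drive, etc.)
    from fastbook import *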
Took this course two times, first when it used TensorFlow and again after it moved to PyTorch.
I like how it is practical from early on and updated with new research. I recommend trying to build networks from scratch alongside the course, so you don't become too dependent on the fast.ai framework.
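To make the "from scratch" suggestion concrete, here is a minimal sketch of a plain PyTorch model and training loop (MNIST-sized inputs assumed; the DataLoader is left out):

    import torch
    from torch import nn

    # A tiny fully connected classifier written directly in PyTorch,
    # with a hand-rolled training loop instead of Learner.fit.
    model = nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, 128), nn.ReLU(),
        nn.Linear(128, 10))

    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()

    def train_epoch(dataloader):
        for xb, yb in dataloader:          # xb: batch of images, yb: batch of labels
            loss = loss_fn(model(xb), yb)  # forward pass
            loss.backward()                # compute gradients
            opt.step()                     # update weights
            opt.zero_grad()                # reset gradients for the next batch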
This course and the accompanying libraries were very good when they were released and have only improved over the past several years. I will echo what others have said - the courses are very approachable and practical.

Fast.ai changed the course of my career and helped give birth to deep learning as a practice at my place of work. Thank you Jeremy!
People have had a lot of negative things to say about fastai v1, claiming it is not very flexible or intuitive and only good for certain Kaggle-type problems. I'd recommend they check out fastai v2 as a serious competitor to other PyTorch-based frameworks like PyTorch Lightning, Catalyst, Ignite, etc. It's very easy to work with standard deep learning problems, but for more complex and unique problems, the mid-level/low-level API and callbacks make it quite painless to use fastai in your workflow. Plus there's tons of community support (forums.fast.ai + Discord), even for a package maintained by only a few people. Check it out!
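As a small illustration of how light the callback API is, here is a hypothetical callback that just prints the training loss every so often (the class and its parameters are made up for the example):

    from fastai.vision.all import *

    # Hypothetical callback: log the training loss every `every` batches.
    class PrintLossCallback(Callback):
        def __init__(self, every=100): self.every = every
        def after_batch(self):
            if self.training and self.iter % self.every == 0:
                print(f'iter {self.iter}: loss {self.loss.item():.4f}')

    # learn = cnn_learner(dls, resnet34, metrics=accuracy)
    # learn.fit_one_cycle(1, cbs=PrintLossCallback(every=50))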
Jeremy should update his JavaScript course as well. He might be one of the few people who could make it look less messy than it is everywhere else. The fast.ai course is wonderful. Definitely recharged my own interest in deep learning.
Has anyone gone through a career change (to something in data science / ML) after going through courses like fast.ai? If so, how difficult / easy was that change?
This is great! Looking forward to trying it out. I explored it a while back when I was looking for a deep learning library that can take a tabular data file and build a multitask predictive model involving different datatypes (for example, some columns may be text). Uber's Ludwig library does it. Would love to check it out.
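For the categorical/continuous part of that use case, fastai's tabular API is quite compact; here is a minimal sketch on the bundled adult-income sample (free-text columns would still need separate handling, which is where Ludwig's all-in-one config differs):

    from fastai.tabular.all import *

    path = untar_data(URLs.ADULT_SAMPLE)
    dls = TabularDataLoaders.from_csv(
        path/'adult.csv', y_names='salary',
        cat_names=['workclass', 'education', 'occupation'],
        cont_names=['age', 'hours-per-week'],
        procs=[Categorify, FillMissing, Normalize])  # handle categories, missing values, scaling

    learn = tabular_learner(dls, metrics=accuracy)
    learn.fit_one_cycle(3)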
Amazing news. I pre-ordered the book a while ago and am a bit surprised (positively) it's over 600 pages now. The German Amazon page still says 350 pages btw.

Worked with fast.ai for a couple of projects starting <1.0 and with the first MOOC. You're doing great work and it's really appreciated.
Hi Jeremy, congratulations on the new releases and thank you.

I see that the original *ML* course[1] link has been removed from the home page. Does that mean it's been superseded by the integration of the ML lessons into the DL courses?

I was pointing people who wanted to learn ML but don't have good access to proper Internet to the old ML course, along with custom scripts to install its requirements on inexpensive SBCs like the Jetson Nano. I was planning to make that setup public, but should I refrain from doing so because of fastai v2? If so, is cloud compute the de facto first-class citizen now?

[1] http://course18.fast.ai/lessonsml1/lesson1.html
As one of the folks that took this course, I was thoroughly engaged. I wouldn't start masquerading as a data scientist after learning this material, but this is a highly-practical approach to deploying new engineering tools.
Jeremy Howard and Andrew Ng are the two teachers who got me into ML, and eventually into it as a career. Amazing to see so much progress! Because of fast.ai I can see ML being used around the world just like Excel or Python.
Looks great - will probably pick up the book.

In Lesson 1 they talk about use cases where deep learning is the best known approach. Are there any popular use cases for which it is not the best known approach?
Though not related to the content of the post, I noticed that the favicon of fast.ai is an H character, which has nothing to do with AI. Somebody should update it.

FYI, the letter H comes from the Hyde theme in Hugo: https://themes.gohugo.io/hyde/?search-input=menu%3Dmai#sidebar-menu
I am trying to get into ML in general and I am having a bit of a problem. I don't know what is what and I lack a basic trajectory. Fortunately I have all the mathematics prereqs so I can just jump in.
What I need is some sort of up-to-date overview of everything ML so that I know what topics to study in which order.
Does anyone know of such a thing?
I got a little ways through the very first course way back when. I am planning to learn ML/DS in my spare time, but I have a particular end goal - self driving cars/computer vision. Does this course cover those topics?
Hi Jeremy, thanks for your awesome library! I followed the last online course and was pretty impressed by how effective your top-down approach is.

Are multi-GPU setups supported in this version of fastai?
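Not an authoritative answer, but my understanding is that fastai v2 ships distributed-training helpers. A rough sketch of how that might look (assuming the fastai.distributed module and its distrib_ctx context manager are in the installed version, with the script launched once per GPU via `python -m fastai.launch train.py`):

    from fastai.vision.all import *
    from fastai.distributed import *

    path = untar_data(URLs.PETS)
    dls = ImageDataLoaders.from_name_re(
        path, get_image_files(path/'images'),
        pat=r'(.+)_\d+.jpg$', item_tfms=Resize(224))

    learn = cnn_learner(dls, resnet34, metrics=error_rate)
    with learn.distrib_ctx():        # wraps training in DistributedDataParallel
        learn.fine_tune(2)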
Any recommendation on how to approach the course? Is it better to read the chapter in the book before you watch the lecture(s) covering the content of the chapter or vice versa?
I took the fast.ai courses and highly recommend them for anyone who really wants to learn ML.

Are there any plans for courses on reinforcement learning?
@dang or someone - I wonder if you can fix the title so it's not just "Fast.ai releases new deep learning course"? The article is just as much about the release of the fastai v2 software library as it is about the course.

The original title was "fast.ai releases new deep learning course, four libraries, and 600-page book", although "fast.ai releases new deep learning course and library" would probably cover what most people are interested in, and is quite a bit shorter.