The Fast and the Furious 2 of machine learning is now available for your pleasure.
Fast.ai is the very best way to learn practical Deep Learning. Period.
The first iteration of courses 1 and 2 used Keras; the new versions use fast.ai's own library built on top of PyTorch. The new library is awesome and bakes in a lot of useful best-practice functions.
If you are interested in learning more about Data Science, you can check out the course page for the CS109 Data Science Course at Harvard University.
Topics covered include, among others:
- Web Scraping
- Regular Expressions
- Data Reshaping
- Data Cleanup
- Frequentist Statistics
- Bias and Regression
- SVM, Decision Trees, Random Forests
- Ensemble Methods
- Bayes Theorem and Bayesian Methods
- Interactive Visualization
- Deep Networks
“Thanks everyone for an amazing month of January. It’s been an inspiring, life-changing experience for me.” – Lex Fridman
Several more lecture recordings are soon to be released.
Here is the official webpage of the course:
After a long wait, the final and much-anticipated course in the Coursera Deep Learning Specialization series taught by Andrew Ng, called Sequence Models, has now been released.
The first week covers Recurrent Neural Networks, the second week addresses Natural Language Processing & Word Embeddings, and the final week is about Sequence Models & the Attention Mechanism.
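To give a taste of the first week's topic, here is a minimal sketch of the recurrence an RNN computes. The scalar weights `w_x`, `w_h` and bias `b` are my own illustrative assumptions, not the course's notation:

```python
import math

def rnn_step(x, h, w_x, w_h, b):
    """One step of a scalar 'vanilla' RNN: h' = tanh(w_x*x + w_h*h + b)."""
    return math.tanh(w_x * x + w_h * h + b)

def run_rnn(xs, w_x=0.5, w_h=0.8, b=0.0):
    """Fold a sequence through the recurrence, starting from h = 0."""
    h = 0.0
    for x in xs:
        h = rnn_step(x, h, w_x, w_h, b)
    return h
```

The point of the recurrence is that the hidden state depends on the order of the inputs, which is what makes RNNs suitable for sequences.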
Google Colaboratory now has GPU support, meaning you can run your Jupyter notebooks stored on Google Drive with GPU acceleration.
Here is a tutorial on how to get started.
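After enabling the GPU runtime in Colab (Runtime → Change runtime type → Hardware accelerator → GPU), a quick sanity check is to probe for the NVIDIA driver tools. This is a generic, stdlib-only sketch, not a Colab-specific API:

```python
import shutil
import subprocess

def gpu_available():
    """Heuristic check: is an NVIDIA GPU visible to this runtime?
    Returns False when nvidia-smi is absent or exits with an error."""
    if shutil.which("nvidia-smi") is None:
        return False
    result = subprocess.run(["nvidia-smi"], capture_output=True)
    return result.returncode == 0
```

On a CPU-only runtime this simply returns False, so it is safe to call anywhere.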
Last year's 2017 fast.ai course, which taught state-of-the-art deep learning to coders, was amazing. The blog post about the fast.ai 2018 launch, available now, is full of goodies. This year they taught the course using PyTorch instead of Keras and wrote their own library to speed up development; they were the first to add implementations of several papers to the library, such as the Learning Rate Finder (Smith 2015) and Stochastic Gradient Descent with Restarts (SGDR). With one line of code, you can also get the images that the classifier gets wrong.
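The Learning Rate Finder idea is simple enough to sketch in a few lines: sweep the learning rate exponentially over successive steps, record the loss at each rate, and pick a rate just below where the loss starts to blow up. The toy quadratic loss below is my own illustration, not fast.ai's implementation:

```python
def lr_finder(loss, grad, w0, lr_min=1e-4, lr_max=2.0, steps=50):
    """Sweep the learning rate exponentially from lr_min to lr_max,
    taking one gradient step per candidate rate, and record the loss."""
    ratio = (lr_max / lr_min) ** (1.0 / (steps - 1))
    w, lr, history = w0, lr_min, []
    for _ in range(steps):
        w = w - lr * grad(w)
        history.append((lr, loss(w)))
        lr *= ratio
    return history

# Toy problem: loss(w) = (w - 3)^2; gradient steps diverge once lr > 1,
# so the tail of the sweep shows the characteristic loss blow-up.
history = lr_finder(lambda w: (w - 3) ** 2, lambda w: 2 * (w - 3), w0=0.0)
losses = [l for _, l in history]
```

In practice you would plot `history` and choose a rate where the loss is still falling steeply.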
17 of the top 20 participants in a Kaggle competition were students in the preview course.
I recommend reading the blog post and taking the course.
We know that computers have beaten humans at chess; that was a great breakthrough and a milestone in AI.
The world's strongest chess AI, Stockfish, was recently dethroned by AlphaZero, a deep reinforcement learning AI from Google's DeepMind.
Here is a walkthrough of the third game, with further explanation, in chess.com's “How Does AlphaZero Play Chess?”
Watch the AlphaZero vs Stockfish Chess Match: Game 3 from Chess on www.twitch.tv
If you are building a robot driven by a Raspberry Pi and want to use image recognition and object detection, you may want to look into Google's MobileNets, a family of mobile-first computer vision models for TensorFlow, combined with an Intel Movidius Neural Compute Stick on a Raspberry Pi 3. MobileNets is designed to run on resource-constrained devices while maintaining accuracy, and the Compute Stick gives you an order of magnitude more compute power than running detection on the Raspberry Pi's CPU.
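The efficiency of MobileNets comes from replacing standard convolutions with depthwise separable ones. Using the mult-add counts from the MobileNets paper (kernel size Dk, input channels M, output channels N, square feature map of side Df), a back-of-the-envelope comparison looks like this; the example layer sizes are made up for illustration:

```python
def standard_conv_cost(dk, m, n, df):
    """Mult-adds for a standard convolution: Dk*Dk*M*N*Df*Df."""
    return dk * dk * m * n * df * df

def depthwise_separable_cost(dk, m, n, df):
    """Depthwise filtering (Dk*Dk*M*Df*Df) plus a 1x1 pointwise
    convolution (M*N*Df*Df)."""
    return dk * dk * m * df * df + m * n * df * df

# Example layer: 3x3 kernels, 64 -> 256 channels, 14x14 feature map.
ratio = (depthwise_separable_cost(3, 64, 256, 14)
         / standard_conv_cost(3, 64, 256, 14))
# The ratio works out to 1/N + 1/Dk^2, roughly an 8-9x saving here.
```

That 1/N + 1/Dk² reduction is why these models fit on devices like the Raspberry Pi.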
There is a new kid on the block in terms of online courses on Deep Learning.
DeepSchool.io is a set of Jupyter notebooks that teach you the basics and the different concepts you need in order to get started and become productive in Deep Learning. There are also videos supporting the notebooks, although not for every notebook yet.
It differs from fast.ai in that the videos are shorter and the notebooks are mostly self-explanatory.
The goal of the project is to make Deep Learning accessible to everyone, to make it practical, and to make learning open source and fun.
These are the topics covered:
- Lesson 0: Introduction to regression.
- Lesson 1: Penalising weights to fit better (scikit learn intro)
- Lesson 2: Gradient Descent. Using basic optimisation methods.
- Lesson 3: Tensorflow intro: zero layer hidden networks (i.e. normal regression).
- Lesson 4: Tensorflow hidden layer introduction.
- Lesson 5: Using Keras to simplify multi-layer neural nets.
- Lesson 6: Embeddings to deal with categorical data. (Keras)
- Lesson 7: Word2Vec. Embeddings to visualise words. (Tensorflow)
- Lesson 8: Application – Bike Sharing predictions
- Lesson 9: Choosing Number of Layers and more
- Lesson 10: XGBoost – A quick detour from Deep Learning
- Lesson 11: Convolutional Neural Nets (MNIST dataset)
- Lesson 12: CNNs and BatchNormalisation (CIFAR10 dataset)
- Lesson 13: Transfer Learning (Dogs vs Cats dataset)
- Lesson 14: LSTMs – Sentiment analysis.
- Lesson 15: LSTMs – Shakespeare.
- Lesson 16: LSTMs – Trump Tweets.
- Lesson 17: Trump – Stacking and Stateful LSTMs.
- Lesson 18: Fake News Classifier
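To give a flavour of the early lessons, here is a tiny, self-contained illustration of Lesson 1's idea of penalising weights: for one-dimensional least squares with an L2 penalty, the closed-form weight shrinks as the penalty grows. The data below is made up for illustration:

```python
def ridge_1d(xs, ys, lam=0.0):
    """Minimise sum((y - w*x)^2) + lam * w^2; setting the derivative
    to zero gives w = sum(x*y) / (sum(x*x) + lam)."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
w_plain = ridge_1d(xs, ys)             # exact fit: w = 2
w_shrunk = ridge_1d(xs, ys, lam=14.0)  # the penalty shrinks w toward 0
```

The same trade-off between fitting the data and keeping weights small is what scikit-learn's regularised regressors implement in higher dimensions.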
You can read more here.