Section 1 : Introduction
|
Lecture 1 | INTRODUCTION TO BRAINMEASURES PROCTOR SYSTEM | |
|
Lecture 2 | Introduction + Course Structure + Demo | 00:16:44 Duration |
|
Lecture 3 | BONUS Learning Paths | |
|
Lecture 4 | Your Three Best Resources | 00:10:43 Duration |
|
Lecture 5 | Download the Resources here | |
|
Lecture 6 | INTRODUCTION TO BRAINMEASURES PROCTOR SYSTEM |
Section 2 : Step 1 - Artificial Neural Network
|
Lecture 1 | Welcome to Step 1 - Artificial Neural Network | |
|
Lecture 2 | Plan of Attack | 00:02:52 Duration |
|
Lecture 3 | The Neuron | 00:16:15 Duration |
|
Lecture 4 | The Activation Function | |
|
Lecture 5 | How do Neural Networks work | 00:12:48 Duration |
|
Lecture 6 | How do Neural Networks learn | 00:12:59 Duration |
|
Lecture 7 | Gradient Descent | 00:10:13 Duration |
|
Lecture 8 | Stochastic Gradient Descent | |
|
Lecture 9 | Backpropagation | 00:05:22 Duration |
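The lectures above cover neurons, activation functions, gradient descent and backpropagation. As a rough illustration of how those pieces fit together (not the course's own code), here is a minimal Keras sketch; the toy dataset, layer sizes and learning rate are arbitrary choices for the example.

```python
# Minimal artificial neural network: one hidden layer of neurons with a
# ReLU activation, trained by stochastic gradient descent, with
# backpropagation handled inside Keras' fit() loop.
import numpy as np
from tensorflow import keras

# Toy data: label is 1 when the inputs sum to a positive number.
X = np.random.randn(1000, 4).astype("float32")
y = (X.sum(axis=1) > 0).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),      # hidden neurons + activation function
    keras.layers.Dense(1, activation="sigmoid"),   # output neuron
])
model.compile(optimizer=keras.optimizers.SGD(learning_rate=0.1),  # stochastic gradient descent
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)  # weights updated via backpropagation
```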
Section 3 : Step 2 - Convolutional Neural Network
|
Lecture 1 | Welcome to Step 2 - Convolutional Neural Network | |
|
Lecture 2 | Plan of Attack | 00:03:32 Duration |
|
Lecture 3 | What are Convolutional Neural Networks | 00:15:49 Duration |
|
Lecture 4 | Step 1 - The Convolution Operation | |
|
Lecture 5 | Step 1 Bis - The ReLU Layer | 00:06:41 Duration |
|
Lecture 6 | Step 2 - Pooling | 00:14:13 Duration |
|
Lecture 7 | Step 3 - Flattening | 00:01:53 Duration |
|
Lecture 8 | Step 4 - Full Connection | 00:19:25 Duration |
|
Lecture 9 | Summary | 00:04:20 Duration |
|
Lecture 10 | Softmax & Cross-Entropy | 00:18:20 Duration |
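The CNN steps listed above (convolution, ReLU, pooling, flattening, full connection, softmax with cross-entropy) map naturally onto a small Keras model. The sketch below is illustrative only; the filter counts, 28x28 input shape and 10-class output are assumptions, not the course's exact configuration.

```python
# The four CNN steps plus the softmax/cross-entropy lecture, end to end.
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),                       # e.g. a grayscale image
    keras.layers.Conv2D(16, (3, 3), activation="relu"),   # Step 1 + 1 Bis: convolution + ReLU
    keras.layers.MaxPooling2D((2, 2)),                    # Step 2: pooling
    keras.layers.Flatten(),                               # Step 3: flattening
    keras.layers.Dense(64, activation="relu"),            # Step 4: full connection
    keras.layers.Dense(10, activation="softmax"),         # softmax over 10 classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",     # cross-entropy loss
              metrics=["accuracy"])
model.summary()
```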
Section 4 : Step 3 - AutoEncoder
|
Lecture 1 | Welcome to Step 3 - AutoEncoder | |
|
Lecture 2 | Plan of Attack | 00:02:12 Duration |
|
Lecture 3 | What are AutoEncoders | 00:10:50 Duration |
|
Lecture 4 | A Note on Biases | |
|
Lecture 5 | Training an AutoEncoder | 00:06:10 Duration |
|
Lecture 6 | Overcomplete Hidden Layers | 00:03:53 Duration |
|
Lecture 7 | Sparse AutoEncoders | 00:06:15 Duration |
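As a quick companion to the lectures above, this is one minimal way to express a plain autoencoder in Keras: an undercomplete hidden layer trained to reconstruct its own input. The 784-dimensional input and 32-unit code are illustrative assumptions, not values from the course.

```python
# Plain (undercomplete) autoencoder: the 32-unit code forces a compressed
# representation, and training reconstructs the input itself.
from tensorflow import keras

inputs = keras.Input(shape=(784,))
code = keras.layers.Dense(32, activation="relu")(inputs)        # bottleneck / code
decoded = keras.layers.Dense(784, activation="sigmoid")(code)   # reconstruction

autoencoder = keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="mse")
# Training uses the same array as both input and target, e.g.:
# autoencoder.fit(X, X, epochs=10, batch_size=128)
```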
Section 5 : Step 4 - Variational AutoEncoder
|
Lecture 1 | Welcome to Step 4 - Variational AutoEncoder | |
|
Lecture 2 | Introduction to the VAE | 00:08:16 Duration |
|
Lecture 3 | Variational AutoEncoders | 00:04:29 Duration |
|
Lecture 4 | Reparameterization Trick | 00:04:56 Duration |
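The reparameterization trick covered in Lecture 4 can be summarized in a few lines of NumPy: rather than sampling z directly from N(mu, sigma^2), sample eps from N(0, I) and compute z = mu + sigma * eps, so that gradients can flow through mu and sigma. The latent size of 8 and the helper name sample_latent below are illustrative.

```python
# Reparameterization trick: sample eps ~ N(0, I) and compute
# z = mu + sigma * eps, instead of sampling z ~ N(mu, sigma^2) directly.
import numpy as np

def sample_latent(mu, log_var, rng):
    """Draw z = mu + sigma * eps with eps ~ N(0, I)."""
    eps = rng.standard_normal(mu.shape)
    sigma = np.exp(0.5 * log_var)      # log-variance -> standard deviation
    return mu + sigma * eps

rng = np.random.default_rng(0)
mu = np.zeros(8)        # illustrative latent mean
log_var = np.zeros(8)   # illustrative latent log-variance
z = sample_latent(mu, log_var, rng)
```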
Section 6 : Step 5 - Implementing the CNN-VAE
|
Lecture 1 | Welcome to Step 5 - Implementing the CNN-VAE | |
|
Lecture 2 | Introduction to Step 5 | 00:08:11 Duration |
|
Lecture 3 | Initializing all the parameters and variables of the CNN-VAE | 00:13:54 Duration |
|
Lecture 4 | Building the Encoder part of the VAE | 00:19:34 Duration |
|
Lecture 5 | Building the Decoder part of the VAE | 00:10:40 Duration |
|
Lecture 6 | Implementing the Training operations | 00:18:34 Duration |
|
Lecture 7 | Full Code Section | |
|
Lecture 8 | The Keras Implementation |
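For orientation, the encoder and decoder built in this step have roughly the shape sketched below in Keras. This is not the course's actual implementation; the 64x64x3 input, filter counts, strides and 32-dimensional latent space are illustrative assumptions.

```python
# Rough shape of a convolutional VAE encoder and decoder (illustrative).
from tensorflow import keras

latent_dim = 32

# Encoder: convolutions down to a latent mean and log-variance.
enc_in = keras.Input(shape=(64, 64, 3))
x = keras.layers.Conv2D(32, 4, strides=2, padding="same", activation="relu")(enc_in)  # 32x32
x = keras.layers.Conv2D(64, 4, strides=2, padding="same", activation="relu")(x)       # 16x16
x = keras.layers.Flatten()(x)
z_mean = keras.layers.Dense(latent_dim)(x)
z_log_var = keras.layers.Dense(latent_dim)(x)
encoder = keras.Model(enc_in, [z_mean, z_log_var])

# Decoder: dense layer reshaped back, then transposed convolutions.
dec_in = keras.Input(shape=(latent_dim,))
x = keras.layers.Dense(16 * 16 * 64, activation="relu")(dec_in)
x = keras.layers.Reshape((16, 16, 64))(x)
x = keras.layers.Conv2DTranspose(32, 4, strides=2, padding="same", activation="relu")(x)    # 32x32
x = keras.layers.Conv2DTranspose(3, 4, strides=2, padding="same", activation="sigmoid")(x)  # 64x64
decoder = keras.Model(dec_in, x)
```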
Section 7 : Step 6 - Recurrent Neural Network
|
Lecture 1 | Welcome to Step 6 - Recurrent Neural Network | |
|
Lecture 2 | Plan of Attack | 00:02:32 Duration |
|
Lecture 3 | What are Recurrent Neural Networks | 00:16:02 Duration |
|
Lecture 4 | The Vanishing Gradient Problem | 00:14:27 Duration |
|
Lecture 5 | LSTMs | 00:19:48 Duration |
|
Lecture 6 | LSTM Practical Intuition | |
|
Lecture 7 | LSTM Variations | 00:03:37 Duration |
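A small Keras sketch of the ideas in this step: an LSTM layer, whose gating mitigates the vanishing gradient problem, followed by a dense output. The sequence length, feature count and unit count are arbitrary placeholders, not values from the course.

```python
# A recurrent model built around an LSTM layer, whose gating helps with
# the vanishing gradient problem covered above.
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(50, 10)),     # 50 timesteps, 10 features per step
    keras.layers.LSTM(64),           # recurrent layer with gated memory
    keras.layers.Dense(1),           # e.g. one-step-ahead prediction
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```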
Section 8 : Step 7 - Mixture Density Network
|
Lecture 1 | Welcome to Step 7 - Mixture Density Network | |
|
Lecture 2 | Introduction to the MDN-RNN | 00:09:28 Duration |
|
Lecture 3 | Mixture Density Networks | 00:09:33 Duration |
|
Lecture 4 | VAE + MDN-RNN Visualization | 00:05:46 Duration |
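The core of a Mixture Density Network is splitting one raw output vector into mixture weights, means and standard deviations. The NumPy sketch below shows that split in isolation; the 5-component mixture and the helper name mdn_params are illustrative, not taken from the course.

```python
# Splitting one raw network output into Gaussian mixture parameters.
import numpy as np

def mdn_params(raw, n_mixtures):
    """Split a flat vector of length 3 * n_mixtures into (pi, mu, sigma)."""
    logit_pi, mu, log_sigma = raw.reshape(3, n_mixtures)
    pi = np.exp(logit_pi - logit_pi.max())
    pi = pi / pi.sum()                 # softmax -> mixture weights
    sigma = np.exp(log_sigma)          # keep standard deviations positive
    return pi, mu, sigma

pi, mu, sigma = mdn_params(np.random.randn(3 * 5), n_mixtures=5)
```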
Section 9 : Step 8 - Implementing the MDN-RNN
|
Lecture 1 | Welcome to Step 8 - Implementing the MDN-RNN | |
|
Lecture 2 | Initializing all the parameters and variables | 00:13:42 Duration |
|
Lecture 3 | Building the RNN - Gathering the parameters | 00:09:54 Duration |
|
Lecture 4 | Building the RNN - Creating an LSTM cell with Dropout | 00:16:15 Duration |
|
Lecture 5 | Building the RNN - Setting up the Input, Target, and Output | 00:14:54 Duration |
|
Lecture 6 | Building the RNN - Getting the Deterministic Output | 00:11:56 Duration |
|
Lecture 7 | Building the MDN - Getting the Input, Hidden Layer and Output | 00:13:22 Duration |
|
Lecture 8 | Building the MDN - Getting the MDN parameters | 00:10:57 Duration |
|
Lecture 9 | Implementing the Training operations (Part 1) | 00:15:31 Duration |
|
Lecture 10 | Implementing the Training operations (Part 2) | 00:13:34 Duration |
|
Lecture 11 | Full Code Section | |
|
Lecture 12 | The Keras Implementation |
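The training operations built in this step minimize, in essence, the negative log-likelihood of the target under the predicted Gaussian mixture. The NumPy function below sketches that loss for a scalar target; it is a simplified stand-in for the TensorFlow version the course constructs, and the example mixture values are illustrative.

```python
# Negative log-likelihood of a scalar target under a Gaussian mixture,
# the quantity an MDN-RNN's training operations minimize (simplified).
import numpy as np

def gaussian_mixture_nll(y, pi, mu, sigma):
    """-log( sum_k pi_k * N(y | mu_k, sigma_k) ) for a scalar target y."""
    norm = np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    likelihood = np.sum(pi * norm)
    return -np.log(likelihood + 1e-8)   # small epsilon for numerical stability

# Illustrative 5-component mixture:
pi = np.full(5, 0.2)
mu = np.linspace(-2.0, 2.0, 5)
sigma = np.ones(5)
print(gaussian_mixture_nll(0.5, pi, mu, sigma))
```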
Section 10 : Step 9 - Reinforcement Learning
|
Lecture 1 | Welcome to Step 9 - Reinforcement Learning | |
|
Lecture 2 | What is Reinforcement Learning | 00:11:27 Duration |
|
Lecture 3 | Pseudo Implementation of Reinforcement Learning | 00:20:00 Duration |
|
Lecture 4 | Full Code Section |
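In the spirit of the pseudo implementation lecture, the sketch below shows the bare agent-environment loop of reinforcement learning: observe a state, choose an action with some exploration, receive a reward, update a value estimate. The toy parity environment and the simple value update are illustrative inventions, not the course's code.

```python
# Bare agent-environment loop: observe, act (with exploration), get a
# reward, update a value estimate. The parity "environment" is a toy.
import random

q = {}                      # state -> {action: estimated value}
alpha, epsilon = 0.1, 0.2   # learning rate and exploration rate

def choose_action(state, actions):
    if state not in q or random.random() < epsilon:
        return random.choice(actions)            # explore
    return max(q[state], key=q[state].get)       # exploit the best-known action

def update(state, action, reward, actions):
    q.setdefault(state, {a: 0.0 for a in actions})
    q[state][action] += alpha * (reward - q[state][action])  # move toward observed reward

actions = [0, 1]
for step in range(1000):
    state = random.randint(0, 1)                 # toy state
    action = choose_action(state, actions)
    reward = 1.0 if action == state else 0.0     # reward for matching the state
    update(state, action, reward, actions)
```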
Section 11 : Step 10 - Deep NeuroEvolution
|
Lecture 1 | Welcome to Step 10 - Deep NeuroEvolution | |
|
Lecture 2 | Deep NeuroEvolution | 00:11:10 Duration |
|
Lecture 3 | Evolution Strategies | 00:09:27 Duration |
|
Lecture 4 | Genetic Algorithms | 00:12:31 Duration |
|
Lecture 5 | Covariance-Matrix Adaptation Evolution Strategy | 00:13:26 Duration |
|
Lecture 6 | Parameter-Exploring Policy Gradients (PEPG) | 00:12:55 Duration |
|
Lecture 7 | OpenAI Evolution Strategy | 00:08:30 Duration |
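The OpenAI-style evolution strategy covered above can be sketched in a few lines of NumPy: perturb the parameters with Gaussian noise, score each perturbation, and move the parameters along the reward-weighted average of the noise. The quadratic fitness function, population size and step sizes below are illustrative assumptions.

```python
# OpenAI-style evolution strategy: perturb, evaluate, and move along the
# reward-weighted average of the noise. The fitness function is a toy.
import numpy as np

def fitness(theta):
    """Toy objective, maximized at theta = target."""
    target = np.array([0.5, -0.3, 0.8])
    return -np.sum((theta - target) ** 2)

theta = np.zeros(3)                     # parameters being evolved
sigma, alpha, pop_size = 0.1, 0.02, 50  # noise scale, step size, population

for generation in range(200):
    noise = np.random.randn(pop_size, theta.size)
    rewards = np.array([fitness(theta + sigma * n) for n in noise])
    rewards = (rewards - rewards.mean()) / (rewards.std() + 1e-8)     # normalize
    theta = theta + alpha / (pop_size * sigma) * (noise.T @ rewards)  # ES update

print(fitness(theta))   # should approach 0 (the optimum)
```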
Section 12 : The Final Run
|
Lecture 1 | The Whole Implementation | 00:19:50 Duration |
|
Lecture 2 | Download the whole AI Masterclass folder here | |
|
Lecture 3 | Installing the required packages | 00:11:38 Duration |
|
Lecture 4 | The Final Race: Human Intelligence vs. Artificial Intelligence | 00:10:16 Duration |