Fundamentals of Machine Learning & Neural Networks
Participants do not need an extensive math background in neural networks or machine learning to take this class. The Fundamentals of Machine Learning and Neural Networks course explains, in understandable steps, how to build from a one-node neural network up to models with multiple features, and explores how machine learning works along the way. Every step uses working code to solve problems.
Course taught by an expert Artificial Intelligence coder.
4 days - $2,100.00
Prerequisites:
A basic understanding of artificial intelligence and machine learning is required. Familiarity with running programs from the command line and with the concept of a function is also helpful.
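As a taste of the hands-on approach, the one-node model the course starts from can be sketched in a few lines of plain Python. This is a minimal illustration, not the course's lab code; it assumes a single weight fit by gradient descent to one input/output pair, with arbitrary starting values.

# A one-node "neural network": one input, one weight, one prediction.
# The weight is nudged step by step until the prediction matches the target.
input_value = 3.0
target = 6.0            # the node should learn that prediction = 2 * input
weight = 0.5            # arbitrary starting weight
learning_rate = 0.05

for step in range(20):
    prediction = input_value * weight                    # forward pass
    error = (prediction - target) ** 2                   # squared error
    gradient = 2 * (prediction - target) * input_value   # d(error)/d(weight)
    weight -= learning_rate * gradient                   # update the weight
    print(f"step {step:2d}  weight = {weight:.4f}  error = {error:.6f}")

After a handful of updates the weight settles near 2.0, the relationship hidden in the example numbers. The course builds from this single-weight idea out to multiple inputs, multiple outputs, and full data sets.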
Course Outline
Basic Supervised Learning Model
The Parts of an ML Model
Training the ML Model
Supervised Regression Models
Goodness of Fit Parameters
Outliers
Learning Rate
Multiple Features
Normalization and Standardization
Polynomial Regression
Overfitting
Regularization
Cross-Validation
Ridge
Supervised Classification Models
The Classification Model
Goodness of Fit
Binary Classification
Multiclass Classification
Solvers and Activation Functions
Normalization and Standardization
Regularization
SVM Models
Linear Separation
Basic SVM
Kernel Modules
Bayesian Models
Bayes Theorem
Naïve Bayes
Decision Trees
Demonstration of a Decision Tree
Overview of Decision Tree Model
Combining Techniques: Ensembles and Forests
Unsupervised Learning: Clustering
What Is Different About Clustering?
KNN Clustering
K-Means Clustering
Hierarchical Clustering
Unsupervised Learning: Anomaly Detection
Elliptic Envelope
Anomaly Detection Plotting
SVM Anomaly Detection
Dimension Reduction
RFE
PCA
Lasso
The Simplest Possible Neural Network
What Is Machine Learning?
What Is a Neural Network?
Building the Simplest Neural Network in Simple Python
Multiple Inputs
Multiple Outputs
Use NumPy to Build Neural Networks
Updating Weights in Simplest Neural Network
Simple Error Analysis
Working with One Attribute
Small Steps
Extending Simplest Neural Network to Multiple Inputs
Extending to Multiple Outputs
Combining Multiple Inputs and Outputs
Extending to Complete Data Sets
Error vs. Cost
Extending Neural Network to Use Multiple Samples
Goodness of Fit Parameters
Understanding Backpropagation
Review of NumPy Arrays
Introduction to Stacked Arrays
Extending
Backpropagation
Coding Examples
Multiple Layers and Backpropagation
Introduction to Deep Learning
Forward Propagation
Backpropagation
Working Example
Parameters Affecting Deep Learning
Normalization
Data Size
Regularization
Weight Initialization
Working Through Coding Changes