Hyperparameter Optimization For Machine Learning

Posted By: Sigha

Last updated 9/2024
MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz
Language: English (US) | Size: 2.65 GB | Duration: 9h 24m

Learn grid and random search, Bayesian optimization, multi-fidelity models, Optuna, Hyperopt, Scikit-Optimize & more.

What you'll learn
Hyperparameter tuning and why it matters
Cross-validation and nested cross-validation
Hyperparameter tuning with Grid and Random search (a minimal sketch follows this list)
Bayesian Optimisation
Tree-Structured Parzen Estimators, Population Based Training and SMAC
Hyperparameter tuning tools, including Hyperopt, Optuna, Scikit-optimize, Keras Tuner and others
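
For a taste of the hands-on work involved, below is a minimal, illustrative sketch of grid and random search using scikit-learn's GridSearchCV and RandomizedSearchCV; the dataset, model and parameter ranges are assumptions chosen for demonstration, not course materials.

```python
# Illustrative sketch: grid search vs. random search with scikit-learn.
# Dataset, model and parameter ranges are assumptions for demonstration only.
from scipy.stats import randint
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = load_breast_cancer(return_X_y=True)
model = RandomForestClassifier(random_state=0)

# Grid search: exhaustively evaluates every combination in the grid.
grid = GridSearchCV(
    model,
    param_grid={"n_estimators": [50, 100, 200], "max_depth": [2, 4, 8]},
    cv=3,
    scoring="roc_auc",
)
grid.fit(X, y)
print("grid search best:", grid.best_params_, grid.best_score_)

# Random search: samples a fixed budget of configurations from distributions.
rand = RandomizedSearchCV(
    model,
    param_distributions={"n_estimators": randint(20, 300), "max_depth": randint(1, 10)},
    n_iter=20,
    cv=3,
    scoring="roc_auc",
    random_state=0,
)
rand.fit(X, y)
print("random search best:", rand.best_params_, rand.best_score_)
```

Grid search tries every combination in the grid, while random search evaluates a fixed number of sampled configurations, which usually scales better as the search space grows.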

Requirements
Python programming, including knowledge of NumPy, Pandas and Scikit-learn
Familiarity with basic machine learning algorithms, e.g., regression, support vector machines and nearest neighbours
Familiarity with decision tree algorithms and Random Forests
Familiarity with gradient boosting machines, e.g., XGBoost and LightGBM
Understanding of machine learning model evaluation metrics
Familiarity with neural networks

Description
Welcome to Hyperparameter Optimization for Machine Learning. In this course, you will learn multiple techniques to select the best hyperparameters and improve the performance of your machine learning models.

If you regularly train machine learning models as a hobby or for your organization and want to improve their performance, if you are keen to climb the leaderboard of a data science competition, or if you simply want to learn more about how to tune the hyperparameters of machine learning models, this course will show you how.

We'll take you step by step through engaging video tutorials and teach you everything you need to know about hyperparameter tuning. Throughout this comprehensive course, we cover almost every available approach to optimize hyperparameters, discussing their rationale, their advantages and shortcomings, the considerations to keep in mind when using each technique, and their implementation in Python.

Specifically, you will learn:
What hyperparameters are and why tuning matters
The use of cross-validation and nested cross-validation for optimization
Grid search and random search for hyperparameters
Bayesian optimization
Tree-structured Parzen estimators
SMAC, Population Based Optimization and other SMBO algorithms
How to implement these techniques with available open-source packages, including Hyperopt, Optuna, Scikit-optimize, Keras Tuner and others

By the end of the course, you will be able to decide which approach you would like to follow and carry it out with available open-source libraries.

This comprehensive machine learning course includes over 50 lectures spanning about 8 hours of video, and ALL topics include hands-on Python code examples which you can use for reference, for practice, and to re-use in your own projects.

So what are you waiting for? Enroll today, learn how to tune the hyperparameters of your models, and build better machine learning models.
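
To illustrate the tool-based approach the course describes, here is a minimal, hedged sketch of Bayesian-style optimization with Optuna, whose default sampler is a Tree-structured Parzen Estimator; the objective function, model and search space are illustrative assumptions rather than course content.

```python
# Illustrative sketch: hyperparameter optimization with Optuna (TPE sampler by default).
# The model, search space and dataset are assumptions for demonstration only.
import optuna
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

def objective(trial):
    # Suggest hyperparameters from the search space for this trial.
    params = {
        "n_estimators": trial.suggest_int("n_estimators", 50, 300),
        "learning_rate": trial.suggest_float("learning_rate", 0.01, 0.3, log=True),
        "max_depth": trial.suggest_int("max_depth", 1, 5),
    }
    model = GradientBoostingClassifier(**params, random_state=0)
    # Cross-validated ROC AUC is the value being maximized.
    return cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
print("best params:", study.best_params)
print("best score:", study.best_value)
```

The same objective-function pattern carries over to other libraries such as Hyperopt and Scikit-optimize, though each defines its search space with its own API.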

Who this course is for:
Students who want to know more about hyperparameter optimization algorithms
Students who want to understand advanced techniques for hyperparameter optimization
Students who want to learn to use multiple open source libraries for hyperparameter tuning
Students interested in building better performing machine learning models
Students interested in participating in data science competitions
Students seeking to expand their breadth of knowledge of machine learning



