Large Language Models on AWS: Building and Deploying Open-Source LLMs

Posted By: IrGens

.MP4, AVC, 1280x720, 30 fps | English, AAC, 2 Ch | 35m | 62.6 MB
Instructor: Noah Gift

In this course, MLOps expert Noah Gift explores the world of open-source Large Language Models (LLMs) on AWS. Learn about essential toolchains such as llama.cpp and how to use them to compile and optimize LLMs. Discover the implications of Amdahl's law for your computational tasks and see practical demonstrations using the GGUF file format. Find out how Python uv scripting and packaging can maximize the functionality and efficiency of your models. Work through key concepts in llama.cpp via detailed walkthroughs and see end-to-end demos of quantized models on AWS G5 instances. Gain practical knowledge and hands-on experience that you can apply directly to your own projects. By the end of the course, you will be able to use and optimize open-source LLMs on AWS effectively, making your AI applications more efficient and powerful.
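
To give a flavor of two of the topics above, here is a minimal sketch that combines them: Amdahl's law expressed as a small, self-contained Python script with uv-style inline metadata (PEP 723). The parallel fraction and worker counts are made-up illustrative values, not figures from the course.

# /// script
# requires-python = ">=3.10"
# dependencies = []
# ///
"""Amdahl's law: the speedup from parallelizing a fraction p of a workload
across n workers is bounded by 1 / ((1 - p) + p / n)."""


def amdahl_speedup(parallel_fraction: float, workers: int) -> float:
    # The serial portion (1 - p) never speeds up; the parallel portion p divides by n.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / workers)


if __name__ == "__main__":
    # Hypothetical example: 90% of the work parallelizes; vary the worker count.
    for n in (1, 2, 4, 8, 16):
        print(f"{n:>2} workers -> speedup {amdahl_speedup(0.9, n):.2f}x")

With uv installed, a single-file script like this can be run directly (for example, uv run amdahl.py), which is the kind of lightweight scripting and packaging workflow the course refers to.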