
LLM Basics and Transformers with Pytorch

Master the core ideas behind transformers and LLMs. Using PyTorch and Hugging Face, you’ll grasp attention, run quick inference, and fine‑tune a mini‑BERT model for real‑world text‑classification tasks.

Ashraf AlMadhoun
What you'll learn

Modern NLP is driven by transformers, yet many developers still treat them as black boxes. In this course, LLM Basics and Transformers with PyTorch, you’ll learn to demystify transformer models and use them in practice. First, you’ll explore why self‑attention replaced recurrent networks and how tokenization, positional encoding, and encoder/decoder stacks work together. Next, you’ll discover how to load pre‑trained transformer checkpoints in PyTorch, convert raw text into tensors, and run lightning‑fast inference. Finally, you’ll learn to fine‑tune a compact BERT model on a small labeled dataset, handling over‑fitting and evaluating accuracy. When you’re finished, you’ll possess the core transformer knowledge—and hands‑on PyTorch examples—needed to explain, deploy, and adapt LLM technology in your own projects.
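The self-attention operation described above, the mechanism that replaced recurrence, can be sketched in a few lines of NumPy. This is an illustrative toy implementation, not code from the course; the function and variable names are my own:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len) token-to-token scores
    weights = softmax(scores, axis=-1)   # each row is a distribution over tokens
    return weights @ V, weights

# Toy example: a "sequence" of 4 tokens with 8-dimensional embeddings.
# Self-attention uses the same matrix for queries, keys, and values.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(x, x, x)
```

Each output row is a weighted mix of all value vectors, with weights determined by query-key similarity; real transformer layers add learned projection matrices and multiple heads on top of this core.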

About the author
Ashraf AlMadhoun

Ashraf is an expert in embedded systems, AI, and digital education. With years of experience, he creates engaging courses that simplify complex topics for learners worldwide.
