Vector Space Models and Embeddings in RAGs
This course introduces the principles of using vector space models and embeddings in retrieval-augmented generation (RAG) systems.
What you'll learn
Building retrieval-augmented generation (RAG) systems requires more than a large language model; it depends on embedding vectors that can represent knowledge and make it searchable. In this course, Vector Space Models and Embeddings in RAGs, you’ll learn to apply embeddings to connect unstructured data with LLM-powered retrieval and generation. First, you’ll explore the fundamentals of embedding vectors: how they transform raw data into numerical representations and how they capture semantic relationships in vector space. Next, you’ll discover how embeddings power RAG systems through semantic search, similarity matching, and relevance ranking, and how to address challenges such as vector quality, storage, and updates. Finally, you’ll learn how to optimize and evaluate embeddings to improve retrieval accuracy and overall RAG performance. When you’re finished with this course, you’ll have the skills and knowledge of vector space models and embeddings needed to build effective and reliable RAG applications.
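To give a flavor of the similarity matching and relevance ranking covered in the course, here is a minimal sketch that ranks documents against a query by cosine similarity. The document titles and embedding values are made up for illustration; in a real RAG system these vectors would come from an embedding model and live in a vector store.

```python
import math

# Hypothetical pre-computed embeddings (toy values for illustration only;
# an embedding model would normally produce these from the document text).
documents = {
    "Intro to vector spaces": [0.9, 0.1, 0.3],
    "Cooking with cast iron": [0.1, 0.8, 0.2],
    "Semantic search basics": [0.8, 0.2, 0.4],
}
query_embedding = [0.85, 0.15, 0.35]  # embedding of the user's question

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Relevance ranking: sort documents by similarity to the query,
# so the most semantically related passages are retrieved first.
ranked = sorted(
    documents.items(),
    key=lambda item: cosine_similarity(query_embedding, item[1]),
    reverse=True,
)
for title, _ in ranked:
    print(title)
```

The same idea scales up in practice: a vector database performs this nearest-neighbor lookup over millions of embeddings, and the top-ranked passages are passed to the LLM as context for generation.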
About the author
Cătălin studied Computer Science in Bucharest, graduating in 1997 and obtaining a PhD in 2006. He has taught at POLITEHNICA București and has been involved in the software development industry for decades.