This course teaches developers how to build, evaluate, and deploy end-to-end conversational agents using LangChain, FAISS, and real-world APIs. Through hands-on exercises, participants gain practical experience with prompt engineering, memory management, document retrieval, function calling, and deployment workflows, and finish by building a fully functional web-based agent that answers questions over internal documents, personalizes interactions, and performs real-world tasks through external APIs.
Prerequisites
To succeed in this course, you will need:
- Experience programming with Python
- Familiarity with using external libraries and working with APIs
| Purpose | Equip developers with the skills to build, evaluate, and deploy conversational agents using LangChain |
| Audience | Developers with Python experience interested in building LLM-powered applications |
| Role | Developers, Software Engineers, AI/ML Practitioners |
| Skill level | Intermediate |
| Style | Lecture, Hands-on Workshop |
| Duration | 2 days |
| Related technologies | Agentic AI, LangChain, LLMs, Python |
Course objectives
- Understand the architecture of modern LLM-based conversational agents
- Use LangChain components to manage prompts, chains, memory, and tools (see the prompt-and-memory sketch below)
- Connect to internal documents using FAISS and embedding models (see the retrieval sketch below)
- Build chatbots that also act as agents by integrating external APIs (see the agent sketch below)
- Evaluate chatbot responses for latency, cost, and safety
- Deploy working agents using Hugging Face Spaces and Streamlit/Gradio (see the deployment sketch below)
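As a taste of the hands-on exercises, here is a minimal sketch of wiring a prompt, a chain, and conversation memory together. It assumes the `langchain` and `langchain-openai` packages are installed and an `OPENAI_API_KEY` is set; the model name is illustrative, and exact import paths vary between LangChain releases.

```python
# Minimal sketch: prompt + chain + conversation memory (classic LangChain memory API).
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)  # illustrative model choice

# The prompt exposes {history} and {input}; ConversationBufferMemory fills {history}
# with the running transcript, which is what lets the model refer to earlier turns.
prompt = PromptTemplate.from_template(
    "You are a helpful internal assistant.\n"
    "Conversation so far:\n{history}\n"
    "Human: {input}\nAssistant:"
)

conversation = ConversationChain(llm=llm, prompt=prompt, memory=ConversationBufferMemory())

print(conversation.predict(input="Hi, I'm Sam from the data platform team."))
print(conversation.predict(input="Which team did I say I'm on?"))
```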
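Next, a sketch of indexing a few internal snippets with FAISS and an embedding model, assuming `faiss-cpu`, `langchain-community`, and `langchain-openai` are installed; the sample texts are placeholders standing in for chunked internal documents.

```python
# Minimal sketch: embed internal snippets and search them with an in-memory FAISS index.
from langchain_openai import OpenAIEmbeddings
from langchain_community.vectorstores import FAISS

snippets = [
    "Expense reports are due by the 5th of each month.",
    "VPN access requests go through the IT service portal.",
]

# Embed each snippet and build a FAISS index over the resulting vectors.
vectorstore = FAISS.from_texts(snippets, embedding=OpenAIEmbeddings())

# Similarity search returns the snippets closest to the question's embedding;
# vectorstore.as_retriever() exposes the same lookup to chains and agents.
for doc in vectorstore.similarity_search("When are expense reports due?", k=1):
    print(doc.page_content)
```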
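The agent sketch below shows a chatbot that performs a task by calling a tool. The `get_weather` tool is a hypothetical stand-in for a real HTTP API call, and the code assumes a tool-calling model and a LangChain release that provides `create_tool_calling_agent`.

```python
# Minimal sketch: turn the chatbot into an agent that can call an external tool.
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_tool_calling_agent

@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""
    # In the course exercise this would call a real weather API (e.g. via requests).
    return f"It is currently sunny and 22 C in {city}."

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant. Use tools when they help."),
    ("human", "{input}"),
    MessagesPlaceholder(variable_name="agent_scratchpad"),  # agent's tool-call trace
])

llm = ChatOpenAI(model="gpt-4o-mini")
agent = create_tool_calling_agent(llm, [get_weather], prompt)
executor = AgentExecutor(agent=agent, tools=[get_weather])

print(executor.invoke({"input": "Should I bring an umbrella in Berlin today?"})["output"])
```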
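Finally, a deployment sketch: a small Gradio chat UI of the kind that can be pushed to Hugging Face Spaces. It assumes `gradio` is installed; the `answer` function is a placeholder where the chain or agent built during the course would be invoked.

```python
# Minimal sketch: Gradio chat front end for the agent.
import gradio as gr

def answer(message, history):
    # Placeholder: call your chain or agent here, e.g. executor.invoke(...)["output"].
    return f"(placeholder) You asked: {message}"

demo = gr.ChatInterface(fn=answer, title="Internal Q&A Agent")

if __name__ == "__main__":
    demo.launch()
```

Committing a file like this as `app.py` together with a `requirements.txt` to a Gradio-type Space is typically all Hugging Face needs to build and serve the agent.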