Frameworks & Tools / LLM Orchestration

LangChain

Intermediate [3/5]
Tags: LC · LLM orchestration framework · AI application framework

Definition

LangChain is a framework for building applications powered by LLMs. It provides modular components for prompts, chains, agents, memory, and retrieval, along with integrations for various LLM providers and external tools.

LangChain simplifies common LLM application patterns like RAG, agents with tools, and multi-step reasoning by providing reusable abstractions and pre-built chains.

Key Concepts

  • Chains: Sequences of LLM calls and operations
  • Agents: LLMs that decide which tools to use
  • Memory: Conversation history and state management
  • Retrievers: Document retrieval for RAG applications
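The core idea behind chains is pipe-style composition: the output of one step feeds the next. A minimal sketch in plain Python (the `Step` class and stand-in functions are illustrative, not LangChain APIs, so this runs without LangChain installed):

```python
# Minimal sketch of pipe-style chain composition (illustrative, not LangChain's classes).
class Step:
    def __init__(self, fn):
        self.fn = fn

    def __or__(self, other):
        # Compose: output of self feeds input of other, like LCEL's `prompt | llm`.
        return Step(lambda x: other.fn(self.fn(x)))

    def invoke(self, x):
        return self.fn(x)

# Stand-ins for a prompt template and a model call.
prompt = Step(lambda q: f"You are a helpful assistant.\nUser: {q}")
fake_llm = Step(lambda p: f"Answer to: {p.splitlines()[-1]}")

chain = prompt | fake_llm
print(chain.invoke("What is RAG?"))  # prints: Answer to: User: What is RAG?
```

LangChain's real runnables work the same way conceptually: `|` builds a new runnable whose `invoke` threads data through each step in order.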

Examples

Architecture
LangChain Components
LANGCHAIN COMPONENT HIERARCHY:

┌─────────────────────────────────────────────────┐
│                  APPLICATION                    │
│  ┌───────────────────────────────────────────┐  │
│  │             AGENTS / CHAINS               │  │
│  │  Orchestrate LLM calls, tools, retrieval  │  │
│  └─────────────────┬─────────────────────────┘  │
│                    │                            │
│    ┌───────────────┼───────────────┐            │
│    ▼               ▼               ▼            │
│ ┌──────┐       ┌──────┐      ┌──────────┐       │
│ │ LLMs │       │Tools │      │Retrievers│       │
│ └──────┘       └──────┘      └──────────┘       │
│                                                 │
│    ┌───────────────┬───────────────┐            │
│    ▼               ▼               ▼            │
│ ┌──────┐       ┌───────┐    ┌───────────┐       │
│ │Memory│       │Prompts│    │VectorStore│       │
│ └──────┘       └───────┘    └───────────┘       │
└─────────────────────────────────────────────────┘

KEY COMPONENTS:

1. MODELS
   - LLM wrappers (OpenAI, Anthropic, local)
   - Chat models vs completion models
   - Embedding models

2. PROMPTS
   - Templates with variables
   - Few-shot examples
   - Output parsers

3. CHAINS
   - Sequential: A → B → C
   - Conditional: if X then A else B
   - Parallel: A and B simultaneously

4. AGENTS
   - ReAct pattern: Reason → Act → Observe
   - Tool selection and execution
   - Multi-step planning

5. MEMORY
   - Conversation buffer
   - Summary memory
   - Entity memory

6. RETRIEVAL
   - Document loaders
   - Text splitters
   - Vector stores
   - Retrievers
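The memory variants above differ in what they carry between turns. The simplest, a conversation buffer, can be sketched in plain Python (the `ConversationBuffer` class is an illustrative stand-in, not a LangChain class):

```python
class ConversationBuffer:
    """Keeps the full turn-by-turn history, like a buffer-style memory."""
    def __init__(self):
        self.messages = []

    def add(self, role, content):
        self.messages.append({"role": role, "content": content})

    def as_prompt_context(self):
        # Render the history so it can be prepended to the next prompt.
        return "\n".join(f"{m['role']}: {m['content']}" for m in self.messages)

memory = ConversationBuffer()
memory.add("user", "My name is Ada.")
memory.add("assistant", "Nice to meet you, Ada!")
memory.add("user", "What's my name?")

# The model sees the whole history on the next call, so it can answer "Ada".
print(memory.as_prompt_context())
```

Summary memory replaces old turns with an LLM-written summary to bound token cost; entity memory tracks facts about named entities instead of raw turns.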
Implementation
Building with LangChain
LANGCHAIN EXAMPLES:

1. SIMPLE CHAIN:

   from langchain_openai import ChatOpenAI
   from langchain_core.prompts import ChatPromptTemplate

   # Create components
   llm = ChatOpenAI(model="gpt-4")
   prompt = ChatPromptTemplate.from_messages([
       ("system", "You are a helpful assistant."),
       ("user", "{question}")
   ])

   # Create chain using LCEL (LangChain Expression Language)
   chain = prompt | llm

   # Invoke
   response = chain.invoke({"question": "What is RAG?"})

---

2. RAG APPLICATION:

   from langchain_openai import OpenAIEmbeddings
   from langchain_community.vectorstores import Chroma
   from langchain_core.runnables import RunnablePassthrough

   # Setup retriever
   embeddings = OpenAIEmbeddings()
   vectorstore = Chroma.from_documents(docs, embeddings)
   retriever = vectorstore.as_retriever(search_kwargs={"k": 3})

   # RAG chain
   rag_prompt = ChatPromptTemplate.from_template("""
   Answer based on context:
   {context}

   Question: {question}
   """)

   def format_docs(docs):
       return "\n\n".join(d.page_content for d in docs)

   rag_chain = (
       {"context": retriever | format_docs, "question": RunnablePassthrough()}
       | rag_prompt
       | llm
   )

   answer = rag_chain.invoke("What is our refund policy?")

---

3. AGENT WITH TOOLS:

   from langchain.agents import create_react_agent, AgentExecutor
   from langchain.tools import tool

   @tool
   def search_web(query: str) -> str:
       """Search the web for current information."""
       # Implementation
       return search_results

   @tool
   def calculate(expression: str) -> str:
       """Evaluate a math expression."""
       # Note: eval() is unsafe on untrusted input; use a math parser in production
       return str(eval(expression))

   tools = [search_web, calculate]

   # prompt here must be a ReAct-style prompt (with {tools} and {agent_scratchpad})
   agent = create_react_agent(llm, tools, prompt)
   executor = AgentExecutor(agent=agent, tools=tools)

   result = executor.invoke({
       "input": "What's the population of Tokyo times 2?"
   })
   # Agent: searches for population, then calculates

---

LCEL (LangChain Expression Language):

   # Pipe syntax for chaining
   chain = prompt | llm | output_parser

   # Parallel execution
   chain = {
       "summary": summarize_chain,
       "keywords": keyword_chain
   } | combine_results

   # Conditional routing
   chain = RunnableBranch(
       (lambda x: "code" in x, code_chain),
       (lambda x: "math" in x, math_chain),
       default_chain
   )
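The agent example above relies on the ReAct loop: Reason → Act → Observe, repeated until the model decides to finish. Stripped of LangChain, the control flow is roughly the following (the scripted `fake_llm` policy and canned tool results are stand-ins so the sketch runs offline; they are not real APIs):

```python
# Toy ReAct loop: the "LLM" is scripted so the example runs without network access.
def calculate(expression: str) -> str:
    return str(eval(expression))  # fine for a demo; never eval untrusted input

def search_web(query: str) -> str:
    return "Tokyo population: about 14 million"  # canned result

TOOLS = {"calculate": calculate, "search_web": search_web}

# Scripted "LLM": picks the next action based on what it has observed so far.
def fake_llm(observations):
    if not observations:
        return ("search_web", "population of Tokyo")   # Reason -> Act
    if len(observations) == 1:
        return ("calculate", "14 * 2")                 # act on the observation
    return ("finish", f"About {observations[-1]} million")

observations = []
while True:
    action, arg = fake_llm(observations)
    if action == "finish":
        answer = arg
        break
    observations.append(TOOLS[action](arg))            # Observe

print(answer)  # prints: About 28 million
```

`AgentExecutor` runs essentially this loop, except the next action comes from a real LLM completion parsed against the ReAct prompt format.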

Interactive Exercise

Design a LangChain Application

You want to build a customer support chatbot that can: 1) Answer questions from a knowledge base, 2) Look up order status via API, 3) Escalate to humans when needed. Which LangChain components would you use?

Pro Tips
  • Use LCEL (pipe syntax) for cleaner, more maintainable chains
  • LangSmith integration helps debug complex chains
  • Start simple; add complexity only when needed
  • Consider LangGraph for complex agent workflows

Related Terms