LangChain is a popular open-source framework for building applications with large language models in Python and TypeScript/JavaScript. By integrating the Helicone AI Gateway with LangChain, you can:
- **Route** to different models and providers through a single endpoint, with automatic failover
- **Unify billing** with pass-through billing or bring-your-own-keys
- **Monitor** all requests with automatic cost tracking in one dashboard
- **Stream** responses with full observability for real-time applications
This integration requires only two changes to your existing LangChain code: updating the base URL and the API key.
```typescript
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage, SystemMessage } from "@langchain/core/messages";
import dotenv from "dotenv";

dotenv.config();

// Initialize ChatOpenAI with the Helicone AI Gateway
const chat = new ChatOpenAI({
  model: "gpt-4.1-mini", // 100+ models supported
  apiKey: process.env.HELICONE_API_KEY,
  configuration: {
    baseURL: "https://ai-gateway.helicone.ai/v1",
    defaultHeaders: {
      // Optional: add custom tracking headers
      "Helicone-Session-Id": "my-session",
      "Helicone-User-Id": "user-123",
      "Helicone-Property-Environment": "production",
    },
  },
});
```
The only changes from a standard LangChain setup are the apiKey, baseURL (or base_url in Python), and optional tracking headers. Everything else stays the same!
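If you want to set tracking headers per user or per session rather than hard-coding them, a small helper keeps them consistent. This is a sketch under our own assumptions: the header names come from the snippet above, but the `trackingHeaders` helper itself is hypothetical, not part of Helicone or LangChain.

```typescript
// Hypothetical helper: build Helicone tracking headers for a given
// user/session. Header names follow the conventions shown above.
function trackingHeaders(opts: {
  userId: string;
  sessionId?: string;
  environment?: string;
}): Record<string, string> {
  const headers: Record<string, string> = {
    "Helicone-User-Id": opts.userId,
  };
  if (opts.sessionId) headers["Helicone-Session-Id"] = opts.sessionId;
  if (opts.environment) headers["Helicone-Property-Environment"] = opts.environment;
  return headers;
}

// Example: pass the result as defaultHeaders when constructing ChatOpenAI
const headers = trackingHeaders({ userId: "user-123", sessionId: "my-session" });
console.log(headers);
```

Passing the result as `defaultHeaders` in the `configuration` object attaches the metadata to every request made through that client.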
Use LangChain normally
Your existing LangChain code continues to work without any changes:
```typescript
// Simple completion
const response = await chat.invoke([
  new SystemMessage("You are a helpful assistant."),
  new HumanMessage("What is the capital of France?"),
]);
console.log(response.content);
```
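Because the gateway authenticates with `HELICONE_API_KEY` rather than a provider key, a missing or empty variable typically surfaces only when the first request fails. A small guard catches this at startup instead; `requireEnv` is our own sketch, not a LangChain or Helicone API.

```typescript
// Hypothetical guard: fail fast with a clear message when a required
// environment variable (e.g. HELICONE_API_KEY) is missing or empty.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage: validate before constructing the client
// const apiKey = requireEnv("HELICONE_API_KEY");
```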
```typescript
async function streamingExample() {
  console.log("\n🌊 Streaming example...\n");

  const stream = await chat.stream([
    new SystemMessage("You are a helpful assistant."),
    new HumanMessage("Write a short story about a robot learning to code."),
  ]);

  console.log("🤖 Assistant (streaming):");
  for await (const chunk of stream) {
    process.stdout.write(chunk.content as string);
  }
  console.log("\n\n✅ Streaming completed!");
}

streamingExample().catch(console.error);
```