Community

LangChain

Using OpenRouter with LangChain

Using LangChain

LangChain provides a standard interface for working with chat models. You can use OpenRouter with LangChain by pointing the client's base URL (base_url in Python, baseURL in the JS client configuration) at OpenRouter's API. For more details on LangChain's model interface, see the LangChain Models documentation.

Resources:

  • Using LangChain for Python: github
  • Using LangChain.js: github
  • Using Streamlit: github
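
The TypeScript example below configures LangChain's ChatOpenAI client to send requests through OpenRouter: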
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage, SystemMessage } from "@langchain/core/messages";

const chat = new ChatOpenAI(
  {
    model: '<model_name>',
    temperature: 0.8,
    streaming: true,
    apiKey: '${API_KEY_REF}',
  },
  {
    baseURL: 'https://openrouter.ai/api/v1',
    defaultHeaders: {
      'HTTP-Referer': '<YOUR_SITE_URL>', // Optional. Site URL for rankings on openrouter.ai.
      'X-Title': '<YOUR_SITE_NAME>', // Optional. Site title for rankings on openrouter.ai.
    },
  },
);

// Example usage
const response = await chat.invoke([
  new SystemMessage("You are a helpful assistant."),
  new HumanMessage("Hello, how are you?"),
]);
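
For Python, here is a minimal sketch of the same setup, assuming the langchain-openai and langchain-core packages are installed (parameter names follow langchain-openai's ChatOpenAI; the placeholders are the same as above):

from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage

chat = ChatOpenAI(
    model="<model_name>",
    temperature=0.8,
    api_key="${API_KEY_REF}",
    base_url="https://openrouter.ai/api/v1",
    default_headers={
        # Optional. Site URL and title for rankings on openrouter.ai.
        "HTTP-Referer": "<YOUR_SITE_URL>",
        "X-Title": "<YOUR_SITE_NAME>",
    },
)

# Example usage
response = chat.invoke([
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="Hello, how are you?"),
])

The model's reply text is available on response.content.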