
Integrating Router

Integrate Martian Into Your Codebase In Just A Few Lines of Code


Integrating Martian's Model Router is incredibly easy. We let you use our router via a drop-in replacement for OpenAI:

  • Update your BASE_URL to https://withmartian.com/api/openai/v1

  • Update your OPENAI_API_KEY to your Martian API Key

    • If you haven't created a Martian API Key yet, you can do so here
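
For example, with the OpenAI Python client, these two changes look like the sketch below. This is a minimal sketch, assuming the openai Python package (v1.x); MARTIAN_API_KEY is a placeholder environment variable name, and the model name is whichever model you already use.

```python
# A minimal sketch, assuming the openai Python package (v1.x).
# Only the base URL and API key change; the rest of your code stays the same.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://withmartian.com/api/openai/v1",  # point at Martian instead of OpenAI
    api_key=os.environ["MARTIAN_API_KEY"],             # your Martian API Key (placeholder env var name)
)

# Keep whichever model you already use; this model name is just an example.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)
```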

Below, you'll find instructions for a drop-in replacement for a number of different languages and libraries.

When you first integrate, you can keep using your existing model and benchmark its performance against the router. Once you're confident in its performance, you can switch to the router.

If you are not currently using one of the methods below, or need any other help, contact us via contact@withmartian.com.

  • OpenAI Python
  • OpenAI Node
  • OpenAI cURL
  • LangChain Python
  • LangChain JS