A type-safe API client for the LiteLLM proxy API, auto-generated from the official OpenAPI specification with full TypeScript support.

Features

  • Type-Safe - Full TypeScript support with auto-generated types from OpenAPI spec
  • LLM Proxy - Unified interface for multiple LLM providers
  • Auto-Updated - Daily regeneration from upstream OpenAPI specs

Installation

pnpm add litellm-api

Usage

Basic Usage

import { createClient } from "litellm-api";

const client = createClient({
  baseUrl: "https://your-litellm-proxy.com",
  headers: {
    Authorization: `Bearer ${process.env.LITELLM_API_KEY}`,
  },
});

// List available models
const models = await client.getModels();

// Create a chat completion
const completion = await client.createChatCompletion({
  body: {
    model: "gpt-4",
    messages: [
      { role: "user", content: "Hello, world!" },
    ],
  },
});
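Calls through the proxy can fail transiently (rate limits, upstream provider hiccups), so it can help to wrap them in a small retry helper. The sketch below is generic application code, not part of litellm-api; it assumes the client methods reject with a thrown error on failure, as fetch-based clients typically do.

```typescript
// Generic retry helper with exponential backoff.
// Assumption: a failed request rejects with a thrown error.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 250,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Back off 250ms, 500ms, 1000ms, ... between attempts.
      await new Promise((resolve) =>
        setTimeout(resolve, baseDelayMs * 2 ** attempt),
      );
    }
  }
  throw lastError;
}
```

Usage mirrors the calls above, e.g. `await withRetry(() => client.createChatCompletion({ ... }))`.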

Type Exports

import type { Types } from "litellm-api";

// Use TypeScript types
type Model = Types.Model;
type ChatCompletion = Types.ChatCompletion;
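The exported types can keep helper code in sync with the API surface. As a minimal illustration, here is a message-builder helper; the `ChatMessage` shape below is a local assumption mirroring the OpenAI-compatible request body, standing in for the corresponding `Types` export:

```typescript
// Local shape assumed to mirror the OpenAI-compatible message format;
// in real code, prefer the matching type exported from litellm-api.
type ChatMessage = {
  role: "system" | "user" | "assistant";
  content: string;
};

// Build a messages array with an optional system prompt first.
function buildMessages(user: string, system?: string): ChatMessage[] {
  const messages: ChatMessage[] = [];
  if (system) {
    messages.push({ role: "system", content: system });
  }
  messages.push({ role: "user", content: user });
  return messages;
}
```

The result can be passed directly as the `messages` field of the chat completion body shown above.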

API Reference

The client exposes LiteLLM API endpoints including:
  • Chat Completions - Chat completion API (OpenAI-compatible)
  • Completions - Text completion API
  • Embeddings - Text embedding generation
  • Models - Available model listing
  • Keys - API key management
  • Users - User management
  • And more…
For the complete API reference, see the LiteLLM documentation.