LiteLLM

Neatlogs offers seamless integration with LiteLLM, a lightweight library that provides a unified interface to call any LLM API.

Installation

To get started, install the neatlogs and litellm packages with your preferred package manager:

pip install neatlogs litellm
poetry add neatlogs litellm
uv add neatlogs litellm

Setting Up API Keys

Before using LiteLLM with Neatlogs, you need to set up your API keys. You can obtain a LiteLLM API key and an OpenAI (or other provider) API key from the respective providers.

To set them up, either export them as environment variables or add them to a .env file:

LITELLM_API_KEY="your_litellm_api_key_here"
OPENAI_API_KEY="your_openai_api_key_here"

Then load the environment variables in your Python code:

from dotenv import load_dotenv
import os

# Load environment variables from .env file
load_dotenv()

os.getenv("LITELLM_API_KEY")
os.getenv("OPENAI_API_KEY")

Usage

Once LiteLLM is set up, adding Neatlogs takes just two lines of code:

import neatlogs
neatlogs.init(api_key="<YOUR_API_KEY>")
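If you prefer not to hardcode the key, you can read it from the environment instead. This is only a sketch; the NEATLOGS_API_KEY variable name is an assumption, so use whatever name you store the key under:

import os
import neatlogs

# NEATLOGS_API_KEY is an illustrative variable name, not one required by Neatlogs
neatlogs.init(api_key=os.getenv("NEATLOGS_API_KEY"))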

Examples

Here's a simple example of how to use LiteLLM with Neatlogs:

import neatlogs
import litellm
from dotenv import load_dotenv
import os

# Load environment variables from .env file
load_dotenv()

# Initialize Neatlogs
neatlogs.init(api_key="YOUR_API_KEY")

# Set your API keys
litellm.api_key = os.getenv("OPENAI_API_KEY")  # or other provider key

# Make a completion request
response = litellm.completion(
    model="gpt-4",
    messages=[
        {"role": "user", "content": "What is the capital of France?"}
    ]
)

print(response.choices[0].message.content)

From then on, every LiteLLM call is automatically traced and visualized in Neatlogs, which is ideal for debugging, evaluation, and collaboration.
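Because LiteLLM exposes a unified interface, the same traced code works with other providers simply by changing the model name. The sketch below assumes you have an Anthropic key exported as ANTHROPIC_API_KEY; the exact model identifier is illustrative:

import neatlogs
import litellm
from dotenv import load_dotenv

# Load provider keys (e.g. ANTHROPIC_API_KEY) from your .env file
load_dotenv()

# Initialize Neatlogs as before
neatlogs.init(api_key="<YOUR_API_KEY>")

# Same call shape as the OpenAI example; LiteLLM routes to Anthropic based on
# the "anthropic/" model prefix and reads the key from the environment.
response = litellm.completion(
    model="anthropic/claude-3-haiku-20240307",
    messages=[
        {"role": "user", "content": "What is the capital of France?"}
    ]
)

print(response.choices[0].message.content)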

For more information on LiteLLM, check out their comprehensive documentation.