Neatlogs

Azure OpenAI

Neatlogs offers seamless integration with Azure OpenAI, Microsoft's cloud-based OpenAI service.

Installation

To get started with Azure OpenAI, install the packages with your preferred package manager (pip, Poetry, or uv):

pip install neatlogs openai
poetry add neatlogs openai
uv add neatlogs openai

Setting Up API Keys

Before using Azure OpenAI with Neatlogs, you need to set up your credentials. You can obtain the endpoint, API key, and deployment name from your Azure OpenAI resource in the Azure portal.

Once you have them, either export them as environment variables or add them to a .env file:

AZURE_OPENAI_ENDPOINT="your_azure_endpoint_here"
AZURE_OPENAI_API_KEY="your_azure_api_key_here"
AZURE_OPENAI_API_VERSION="2024-02-01"
AZURE_OPENAI_DEPLOYMENT_NAME="your_deployment_name_here"

Then load the environment variables in your Python code:

from dotenv import load_dotenv
import os

# Load environment variables from .env file
load_dotenv()

azure_endpoint = os.getenv("AZURE_OPENAI_ENDPOINT")
azure_api_key = os.getenv("AZURE_OPENAI_API_KEY")
azure_api_version = os.getenv("AZURE_OPENAI_API_VERSION")
azure_deployment = os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME")
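
It's easy to forget one of these four variables. A small fail-fast check (a minimal sketch, not part of the Neatlogs API; the helper name is illustrative) can surface the problem before any request is made:

```python
import os

# The four variables the Azure OpenAI client needs, as listed above.
REQUIRED_VARS = (
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_API_VERSION",
    "AZURE_OPENAI_DEPLOYMENT_NAME",
)

def missing_azure_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example with an incomplete configuration dictionary:
incomplete = {"AZURE_OPENAI_ENDPOINT": "https://example.openai.azure.com"}
print(missing_azure_vars(incomplete))  # the three missing variable names
```

Calling `missing_azure_vars()` with no argument checks the real environment; raise a `RuntimeError` if the returned list is non-empty.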

Usage

Once your Azure OpenAI credentials are set up, enabling Neatlogs takes just two lines of code:

import neatlogs
neatlogs.init(api_key="<YOUR_API_KEY>")

Examples

Here's a simple example of how to use Azure OpenAI with Neatlogs:

"""
azure_chat.py
Minimal interactive Azure-OpenAI chat client.
"""
from openai import AzureOpenAI
import os
from typing import List, Dict
from dotenv import load_dotenv
import neatlogs

load_dotenv()

neatlogs.init(api_key="<PROJECT_API_KEY>", tags=["llm-call"])

client = AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version=os.getenv("AZURE_OPENAI_API_VERSION"),
)

DEPLOYMENT = os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME")
if not DEPLOYMENT:
    raise RuntimeError("Set AZURE_OPENAI_DEPLOYMENT_NAME in your .env file")


def generate_response(messages: List[Dict[str, str]]) -> str:
    """Call the Azure OpenAI chat endpoint and return the assistant's reply."""
    response = client.chat.completions.create(
        model=DEPLOYMENT,
        messages=messages,
        temperature=0.7,
        max_tokens=800,
    )
    # `content` can be None (e.g. for filtered responses), so guard before stripping.
    return (response.choices[0].message.content or "").strip()


# -------------------------------------------------
# MAIN LOOP
# -------------------------------------------------
if __name__ == "__main__":
    history: List[Dict[str, str]] = [
        {"role": "system", "content": "You are a helpful assistant."}
    ]

    print("Azure OpenAI Chat (Ctrl-C to quit)\n")
    while True:
        try:
            query = input("> ")
        except (KeyboardInterrupt, EOFError):
            # Exit cleanly on Ctrl-C / Ctrl-D instead of printing a traceback.
            print()
            break
        if not query.strip():
            continue

        history.append({"role": "user", "content": query})
        answer = generate_response(history)
        history.append({"role": "assistant", "content": answer})
        print(f"🤖: {answer}\n")
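
One practical note on the loop above: `history` grows without bound, so a long session will eventually exceed the model's context window. A minimal sketch of one common fix, trimming to the most recent messages while always keeping the system prompt (the helper name and cutoff are illustrative assumptions, not part of Neatlogs or Azure OpenAI):

```python
from typing import Dict, List

def trim_history(history: List[Dict[str, str]],
                 max_messages: int = 10) -> List[Dict[str, str]]:
    """Keep all system messages plus the last `max_messages` other messages."""
    system = [m for m in history if m["role"] == "system"]
    rest = [m for m in history if m["role"] != "system"]
    return system + rest[-max_messages:]

# Example: a system prompt plus 12 user turns trimmed to the last 10.
chat = [{"role": "system", "content": "You are a helpful assistant."}]
chat += [{"role": "user", "content": f"msg {i}"} for i in range(12)]
print(len(trim_history(chat)))  # 11: 1 system message + 10 most recent
```

In the chat loop, you would call `generate_response(trim_history(history))` instead of passing the full history. A token-based cutoff is more precise than a message count, but this keeps the sketch dependency-free.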

After that, every API call is automatically traced and visualized in Neatlogs, making it easy to debug, evaluate, and collaborate.

For more information on Azure OpenAI, check out their comprehensive documentation.