CrewAI Multi-Agent Crew
Instrument a CrewAI crew with per-agent prompt templates and per-task user prompts.
This guide shows how to instrument a CrewAI crew where agents and tasks are defined in separate files.
CrewAI manages LLM calls internally, so you cannot wrap them in a `with neatlogs.trace(kind="LLM"):` block. Neatlogs provides two helpers instead:

- `neatlogs.bind_templates(llm, system_tpl)`: attach a system prompt template to an LLM before passing it to an agent
- `neatlogs.register_crewai_task(task, user_tpl)`: associate a user prompt template with a task after creating it
Setup
```python
import os

import neatlogs

neatlogs.init(
    api_key=os.environ["NEATLOGS_API_KEY"],
    endpoint=os.environ.get("NEATLOGS_ENDPOINT"),
    workflow_name="support-crew",
    instrumentations=["crewai"],
)

# Import CrewAI AFTER init()
from crewai import Agent, Task, Crew, LLM
```

agents.py: Bind System Prompts
Create a PromptTemplate for each agent's system prompt, then pass it to neatlogs.bind_templates(llm, system_tpl). Use the bound LLM when constructing the agent.
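Conceptually, binding works because the template is attached to the LLM instance itself, so the instrumentation can read it back when the call is recorded — which is why the binding must happen before the agent is constructed. A toy illustration of that attach-and-read-back pattern in plain Python (the class, function, and attribute names here are ours for illustration, not Neatlogs internals):

```python
class ToyLLM:
    """Stand-in for an LLM client object in this sketch."""

    def __init__(self, model: str):
        self.model = model


def bind_template(llm: ToyLLM, system_tpl: str) -> ToyLLM:
    # Attach the template to the instance and return the same object,
    # so the *bound* LLM is what gets passed to the agent. Any layer
    # holding this instance can later read the template back.
    llm._system_tpl = system_tpl
    return llm


llm = bind_template(ToyLLM("openai/gpt-4o"), "You are a research specialist.")
print(llm._system_tpl)
```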
```python
# agents.py
import neatlogs
from crewai import Agent, LLM


def research_agent() -> Agent:
    system_tpl = neatlogs.PromptTemplate(
        "You are a research specialist. Find accurate, up-to-date information on the given topic."
    )
    return Agent(
        role="Researcher",
        goal="Research and summarize information accurately",
        backstory=str(system_tpl.template),
        llm=neatlogs.bind_templates(
            LLM(model="openai/gpt-4o"),
            system_tpl,
        ),
        allow_delegation=False,
    )


def writer_agent() -> Agent:
    system_tpl = neatlogs.PromptTemplate(
        "You are a professional writer. Produce clear, concise reports from research findings."
    )
    return Agent(
        role="Writer",
        goal="Write clear reports from research findings",
        backstory=str(system_tpl.template),
        llm=neatlogs.bind_templates(
            LLM(model="openai/gpt-4o"),
            system_tpl,
        ),
        allow_delegation=False,
    )
```

tasks.py: Register User Prompts
After creating each Task, call neatlogs.register_crewai_task(task, user_tpl) to associate a UserPromptTemplate with it.
```python
# tasks.py
import neatlogs
from crewai import Task


def research_task(agent, topic: str) -> Task:
    description = f"Research the following topic thoroughly: {topic}"
    expected_output = "A detailed summary of findings with key facts and sources."
    task = Task(description=description, expected_output=expected_output, agent=agent)
    neatlogs.register_crewai_task(
        task,
        neatlogs.UserPromptTemplate(description + "\n\n" + expected_output),
    )
    return task


def write_report_task(agent, context: list) -> Task:
    description = "Using the research findings, write a concise 3-paragraph report."
    expected_output = "A polished report ready for publication."
    task = Task(description=description, expected_output=expected_output, agent=agent, context=context)
    neatlogs.register_crewai_task(
        task,
        neatlogs.UserPromptTemplate(description + "\n\n" + expected_output),
    )
    return task
```

main.py: Run the Workflow
Decorate the function that starts the workflow with @neatlogs.span(kind="WORKFLOW"). This can be the function that directly calls crew.kickoff(), or any function further up the call stack that represents the entry point:
```python
# main.py
import os

import neatlogs

neatlogs.init(
    api_key=os.environ["NEATLOGS_API_KEY"],
    endpoint=os.environ.get("NEATLOGS_ENDPOINT"),
    workflow_name="support-crew",
    instrumentations=["crewai"],
)

from crewai import Crew

from agents import research_agent, writer_agent
from tasks import research_task, write_report_task


@neatlogs.span(kind="WORKFLOW", name="research_crew")
def run(topic: str) -> str:
    researcher = research_agent()
    writer = writer_agent()
    task1 = research_task(researcher, topic=topic)
    task2 = write_report_task(writer, context=[task1])
    crew = Crew(agents=[researcher, writer], tasks=[task1, task2])
    result = crew.kickoff()
    return result.raw if hasattr(result, "raw") else str(result)


print(run("Advances in vector database technology in 2024"))
neatlogs.flush()
neatlogs.shutdown()
```

What You'll See in the Dashboard
A WORKFLOW span for `research_crew` containing:

- Agent spans for `researcher` and `writer` (captured automatically by the `crewai` instrumentation)
- LLM spans with system prompts from `bind_templates` and user prompts from `register_crewai_task`
Summary
| What | How |
|---|---|
| System prompt on an agent | `neatlogs.bind_templates(llm, system_tpl)`: pass the bound LLM to `Agent(llm=...)` |
| User prompt on a task | `neatlogs.register_crewai_task(task, user_tpl)`: call after `Task(...)` |
| Workflow span | `@neatlogs.span(kind="WORKFLOW")` on the function that starts the workflow |