From 25a6f2871601dde0aed510595a43f8302b0c432c Mon Sep 17 00:00:00 2001
From: UmmeHabiba1312
Date: Thu, 31 Jul 2025 18:52:02 +0500
Subject: [PATCH 1/5] Add tracing guide for non-OpenAI LLMs in docs/tracing.md

---
 docs/tracing.md | 38 ++++++++++++++++++++++++++++++++++++--
 1 file changed, 36 insertions(+), 2 deletions(-)

diff --git a/docs/tracing.md b/docs/tracing.md
index 5182e159f..cf11afadf 100644
--- a/docs/tracing.md
+++ b/docs/tracing.md
@@ -39,7 +39,7 @@ By default, the SDK traces the following:
 - Audio outputs (text-to-speech) are wrapped in a `speech_span()`
 - Related audio spans may be parented under a `speech_group_span()`
 
-By default, the trace is named "Agent workflow". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
+By default, the trace is named "Agent trace". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
 
 In addition, you can set up [custom trace processors](#custom-tracing-processors) to push traces to other destinations (as a replacement, or secondary destination).
 
@@ -97,6 +97,41 @@ To customize this default setup, to send traces to alternative or additional bac
 1. [`add_trace_processor()`][agents.tracing.add_trace_processor] lets you add an **additional** trace processor that will receive traces and spans as they are ready. This lets you do your own processing in addition to sending traces to OpenAI's backend.
 2. [`set_trace_processors()`][agents.tracing.set_trace_processors] lets you **replace** the default processors with your own trace processors. This means traces will not be sent to the OpenAI backend unless you include a `TracingProcessor` that does so.
+
+## Tracing with Non-OpenAI LLMs
+
+You can use an OpenAI API key with non-OpenAI LLMs to enable free tracing on the OpenAI Traces dashboard without disabling tracing.
+
+```python
+from agents import set_tracing_export_api_key, Agent, Runner, AsyncOpenAI, OpenAIChatCompletionsModel
+from dotenv import load_dotenv
+load_dotenv()
+import os
+
+set_tracing_export_api_key(os.getenv("OPENAI_API_KEY"))
+gemini_api_key = os.getenv("GEMINI_API_KEY")
+
+external_client = AsyncOpenAI(
+    api_key=gemini_api_key,
+    base_url="https://generativelanguage.googleapis.com/v1beta"
+    )
+
+model = OpenAIChatCompletionsModel(
+    model="gemini-2.0-flash",
+    openai_client=external_client,
+    )
+
+agent = Agent(
+    name="Assistant",
+    model=model,
+    )
+```
+
+## Notes
+- Set OPENAI_API_KEY in a .env file.
+- View free traces at the OpenAI Traces dashboard.
+
+
 ## External tracing processors list
 
 - [Weights & Biases](https://weave-docs.wandb.ai/guides/integrations/openai_agents)
@@ -117,4 +152,3 @@ To customize this default setup, to send traces to alternative or additional bac
 - [Okahu-Monocle](https://github.com/monocle2ai/monocle)
 - [Galileo](https://v2docs.galileo.ai/integrations/openai-agent-integration#openai-agent-integration)
 - [Portkey AI](https://portkey.ai/docs/integrations/agents/openai-agents)
-- [LangDB AI](https://docs.langdb.ai/getting-started/working-with-agent-frameworks/working-with-openai-agents-sdk)

From 0f244cd7f1a2cc50baf8d4bf184994179d8d5b4f Mon Sep 17 00:00:00 2001
From: UmmeHabiba1312
Date: Thu, 31 Jul 2025 20:24:32 +0500
Subject: [PATCH 2/5] Save current changes

---
 docs/repl.md | 9 +++++----
 1 file changed, 5 insertions(+), 4 deletions(-)

diff --git a/docs/repl.md b/docs/repl.md
index 073b87f51..aeb518be2 100644
--- a/docs/repl.md
+++ b/docs/repl.md
@@ -1,6 +1,7 @@
 # REPL utility
 
-The SDK provides `run_demo_loop` for quick interactive testing.
+The SDK provides `run_demo_loop` for quick, interactive testing of an agent's behavior directly in your terminal.
+
 
 ```python
 import asyncio
@@ -14,6 +15,6 @@
 if __name__ == "__main__":
     asyncio.run(main())
 ```
 
-`run_demo_loop` prompts for user input in a loop, keeping the conversation
-history between turns. By default it streams model output as it is produced.
-Type `quit` or `exit` (or press `Ctrl-D`) to leave the loop.
+`run_demo_loop` starts an interactive chat session in your terminal. It prompts for your input in a loop, keeps the full conversation history between turns (so the agent knows what has already been discussed), and by default streams the agent's responses in real time as they are generated.
+
+To end the chat session, type `quit` or `exit` (and press Enter), or press `Ctrl-D`.

From a81ca196af36684e84a2a868463402bf560e4ebd Mon Sep 17 00:00:00 2001
From: UmmeHabiba1312
Date: Fri, 1 Aug 2025 11:14:12 +0500
Subject: [PATCH 3/5] docs: update tracing example and description for non-OpenAI models

---
 docs/tracing.md | 34 ++++++++++++++--------------------
 1 file changed, 14 insertions(+), 20 deletions(-)

diff --git a/docs/tracing.md b/docs/tracing.md
index cf11afadf..67b0fea43 100644
--- a/docs/tracing.md
+++ b/docs/tracing.md
@@ -39,7 +39,7 @@ By default, the SDK traces the following:
 - Audio outputs (text-to-speech) are wrapped in a `speech_span()`
 - Related audio spans may be parented under a `speech_group_span()`
 
-By default, the trace is named "Agent trace". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
+By default, the trace is named "Agent workflow". You can set this name if you use `trace`, or you can configure the name and other properties with the [`RunConfig`][agents.run.RunConfig].
 
 In addition, you can set up [custom trace processors](#custom-tracing-processors) to push traces to other destinations (as a replacement, or secondary destination).
 
@@ -98,37 +98,30 @@ To customize this default setup, to send traces to alternative or additional bac
 2. [`set_trace_processors()`][agents.tracing.set_trace_processors] lets you **replace** the default processors with your own trace processors. This means traces will not be sent to the OpenAI backend unless you include a `TracingProcessor` that does so.
 
-## Tracing with Non-OpenAI LLMs
+## Tracing with Non-OpenAI Models
 
-You can use an OpenAI API key with non-OpenAI LLMs to enable free tracing on the OpenAI Traces dashboard without disabling tracing.
+You can use an OpenAI API key with non-OpenAI models to enable free tracing in the OpenAI Traces dashboard without needing to disable tracing.
 
 ```python
-from agents import set_tracing_export_api_key, Agent, Runner, AsyncOpenAI, OpenAIChatCompletionsModel
-from dotenv import load_dotenv
-load_dotenv()
-import os
+import os
+
+from agents import set_tracing_export_api_key, Agent, Runner
+from agents.extensions.models.litellm_model import LitellmModel
 
-set_tracing_export_api_key(os.getenv("OPENAI_API_KEY"))
-gemini_api_key = os.getenv("GEMINI_API_KEY")
+set_tracing_export_api_key(os.environ["OPENAI_API_KEY"])
 
-external_client = AsyncOpenAI(
-    api_key=gemini_api_key,
-    base_url="https://generativelanguage.googleapis.com/v1beta"
-    )
-
-model = OpenAIChatCompletionsModel(
-    model="gemini-2.0-flash",
-    openai_client=external_client,
-    )
+model = LitellmModel(
+    model="your-model-name",
+    api_key="your-api-key"
+)
 
 agent = Agent(
     name="Assistant",
     model=model,
-    )
+)
 ```
 
 ## Notes
-- Set OPENAI_API_KEY in a .env file.
 - View free traces at the OpenAI Traces dashboard.
 
 
@@ -152,3 +145,4 @@ To customize this default setup, to send traces to alternative or additional bac
 - [Okahu-Monocle](https://github.com/monocle2ai/monocle)
 - [Galileo](https://v2docs.galileo.ai/integrations/openai-agent-integration#openai-agent-integration)
 - [Portkey AI](https://portkey.ai/docs/integrations/agents/openai-agents)
+- [LangDB AI](https://docs.langdb.ai/getting-started/working-with-agent-frameworks/working-with-openai-agents-sdk)

From 1e163441386092b9ce31b028d766811ea986c9ea Mon Sep 17 00:00:00 2001
From: UmmeHabiba1312
Date: Fri, 1 Aug 2025 11:20:20 +0500
Subject: [PATCH 4/5] docs: update tracing example and description for non-OpenAI models

---
 docs/tracing.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/tracing.md b/docs/tracing.md
index 67b0fea43..e9551fa8a 100644
--- a/docs/tracing.md
+++ b/docs/tracing.md
@@ -112,7 +112,7 @@ set_tracing_export_api_key(os.environ["OPENAI_API_KEY"])
 
 model = LitellmModel(
     model="your-model-name",
-    api_key="your-api-key"
+    api_key="your-api-key",
 )
 
 agent = Agent(

From a6149c48176f33efb1da812492546f221250d6bf Mon Sep 17 00:00:00 2001
From: UmmeHabiba1312
Date: Fri, 1 Aug 2025 12:19:09 +0500
Subject: [PATCH 5/5] Clarify and enhance run_demo_loop utility documentation

---
 docs/tracing.md | 28 ----------------------------
 1 file changed, 28 deletions(-)

diff --git a/docs/tracing.md b/docs/tracing.md
index e9551fa8a..5182e159f 100644
--- a/docs/tracing.md
+++ b/docs/tracing.md
@@ -97,34 +97,6 @@ To customize this default setup, to send traces to alternative or additional bac
 1. [`add_trace_processor()`][agents.tracing.add_trace_processor] lets you add an **additional** trace processor that will receive traces and spans as they are ready. This lets you do your own processing in addition to sending traces to OpenAI's backend.
 2. [`set_trace_processors()`][agents.tracing.set_trace_processors] lets you **replace** the default processors with your own trace processors. This means traces will not be sent to the OpenAI backend unless you include a `TracingProcessor` that does so.
-
-## Tracing with Non-OpenAI Models
-
-You can use an OpenAI API key with non-OpenAI models to enable free tracing in the OpenAI Traces dashboard without needing to disable tracing.
-
-```python
-import os
-
-from agents import set_tracing_export_api_key, Agent, Runner
-from agents.extensions.models.litellm_model import LitellmModel
-
-set_tracing_export_api_key(os.environ["OPENAI_API_KEY"])
-
-model = LitellmModel(
-    model="your-model-name",
-    api_key="your-api-key",
-)
-
-agent = Agent(
-    name="Assistant",
-    model=model,
-)
-```
-
-## Notes
-- View free traces at the OpenAI Traces dashboard.
-
-
 ## External tracing processors list
 
 - [Weights & Biases](https://weave-docs.wandb.ai/guides/integrations/openai_agents)