Description
What happened?
A problem is occurring when interacting with our in-house LLM server.
In our company, all outbound traffic goes through the company's proxy server, so we have set the http_proxy and https_proxy environment variables accordingly.
Our company provides a local Qwen3 LLM service internally, and this service should not go through the proxy. Therefore, we registered its server address in the no_proxy environment variable.
To use Qwen Code, we configured OPENAI_API_KEY, OPENAI_MODEL, and OPENAI_BASE_URL.
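For context, our environment looks roughly like this. The hostnames, port, and key below are illustrative placeholders, not our real internal values:

```shell
# Illustrative values only; the real hostnames and key are internal.
export http_proxy="http://proxy.example.com:8080"
export https_proxy="http://proxy.example.com:8080"
export no_proxy="llm.internal.example.com"   # our Qwen3 server, must NOT use the proxy
export OPENAI_API_KEY="<redacted>"
export OPENAI_MODEL="S-Core/Qwen3-235B-A22B"
export OPENAI_BASE_URL="http://llm.internal.example.com:8000/v1"
```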
When we run Qwen Code and submit a query, a Connection error occurs.
What did you expect to happen?
No Connection error; requests to the host listed in no_proxy should bypass the proxy and succeed.
Client information
Run qwen to enter the interactive CLI, then run the /about command.
╭─────────────────────────────────────────────────────────────────────────────────╮
│ │
│ About Qwen Code │
│ │
│ CLI Version 0.0.14 │
│ Git Commit 9a0cb64a │
│ Model S-Core/Qwen3-235B-A22B │
│ Sandbox no sandbox │
│ OS linux │
│ Auth Method openai │
│ │
╰─────────────────────────────────────────────────────────────────────────────────╯
Login information
No response
Anything else we need to know?
What we found after some trial and error:
It works correctly if we launch Qwen Code with the proxy variables cleared, like this:
http_proxy="" https_proxy="" HTTP_PROXY="" HTTPS_PROXY="" qwen
In other words, Qwen Code appears to honor the http_proxy/https_proxy settings but to ignore no_proxy.
We believe it should be modified to honor the no_proxy environment variable as well.
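To illustrate the behavior we expect, here is a minimal sketch in POSIX sh of the kind of no_proxy host check an HTTP client typically performs before using the proxy. This is illustrative only, not Qwen Code's actual code, and the hostnames are hypothetical:

```shell
#!/bin/sh
# Sketch: decide whether a host should bypass the proxy based on $no_proxy.
# Common convention: an entry matches the host exactly or as a domain suffix,
# and a leading "." on an entry is ignored ("example.com" == ".example.com").
should_bypass() {
  host=$1
  save_ifs=$IFS
  IFS=','
  set -- $no_proxy          # split the comma-separated no_proxy list
  IFS=$save_ifs
  for entry in "$@"; do
    entry=${entry#.}        # treat ".example.com" like "example.com"
    case $host in
      "$entry" | *".$entry") echo yes; return 0 ;;
    esac
  done
  echo no
  return 1
}

no_proxy="llm.internal.example.com,localhost,127.0.0.1"
should_bypass "llm.internal.example.com"   # -> yes (exact match, bypass proxy)
should_bypass "api.example.com"            # -> no  (goes through the proxy)
```

A request to the OPENAI_BASE_URL host should take the "yes" branch and connect directly, which matches what we observe when we clear the proxy variables by hand.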