1. Is this request related to a challenge you're experiencing? Tell me about your story.
I've deployed Dify 1.0.0 in an intranet (air-gapped) environment.
After installing the Ollama model plugin, I tried to create a native conversation application, but it produced no output, and the F12 developer tools showed no errors. On inspecting the API container, it appears that certain dependency files need to be downloaded from the external network, including the file openai_public.py. I modified the contents of openai_public.py as shown below, restarted the container service, and used docker cp to verify that the file was indeed changed, yet the same error persists.
Could you please advise on any steps I might have overlooked?
== openai_public.py: the modifications are as follows:

=== Error from container docker-api-1:
2025-03-14 09:02:40.245 ERROR [Thread-10 (_generate_worker)] [app_generator.py:243] - Unknown Error when generating
Traceback (most recent call last):
File "/app/api/core/app/apps/chat/app_generator.py", line 223, in _generate_worker
runner.run(
File "/app/api/core/app/apps/chat/app_runner.py", line 58, in run
self.get_pre_calculate_rest_tokens(
File "/app/api/core/app/apps/base_app_runner.py", line 86, in get_pre_calculate_rest_tokens
prompt_tokens = model_instance.get_llm_num_tokens(prompt_messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/api/core/model_manager.py", line 195, in get_llm_num_tokens
self._round_robin_invoke(
File "/app/api/core/model_manager.py", line 370, in _round_robin_invoke
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/api/core/model_runtime/model_providers/__base/large_language_model.py", line 299, in get_num_tokens
return plugin_model_manager.get_llm_num_tokens(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/api/core/plugin/manager/model.py", line 231, in get_llm_num_tokens
for resp in response:
^^^^^^^^
File "/app/api/core/plugin/manager/base.py", line 189, in _request_with_plugin_daemon_response_stream
self._handle_plugin_daemon_error(error.error_type, error.message)
File "/app/api/core/plugin/manager/base.py", line 221, in _handle_plugin_daemon_error
raise PluginInvokeError(description=message)
core.plugin.manager.exc.PluginInvokeError: PluginInvokeError: {"args":{},"error_type":"ConnectionError","message":"HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /gpt-2/encodings/main/vocab.bpe (Caused by NameResolutionError(\"<urllib3.connection.HTTPSConnection object at 0x7fc82dd90860>: Failed to resolve 'openaipublic.blob.core.windows.net' ([Errno -2] Name or service not known)\"))"}
2025-03-14 09:02:40.251 ERROR [Dummy-11] [base_app_generate_response_converter.py:123] - {"args":{},"error_type":"ConnectionError","message":"HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /gpt-2/encodings/main/vocab.bpe (Caused by NameResolutionError(\"<urllib3.connection.HTTPSConnection object at 0x7fc82dd90860>: Failed to resolve 'openaipublic.blob.core.windows.net' ([Errno -2] Name or service not known)\"))"}
2. Additional context or comments
No response
3. Can you help us with this feature?
I am interested in contributing to this feature.
Hey @IamWWT! 👋 I'm here to help you with any bugs, questions, or contributions you have. Let's squash those bugs together!
To address the ConnectionError related to openaipublic.blob.core.windows.net in Dify v1.0.0, you can follow these steps to create a local cache for the required files and update your Docker configuration:
These steps should help resolve the issue by ensuring that the necessary files are available locally, avoiding the need for external network requests [1].
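The traceback shows tiktoken trying to fetch gpt-2/encodings/main/vocab.bpe at runtime. One way to build the local cache described above is to pre-download the files on an internet-connected machine into a directory laid out the way tiktoken expects, then copy that directory into the container and point the TIKTOKEN_CACHE_DIR environment variable at it. tiktoken names each cached file after the SHA-1 hex digest of its source URL. A minimal sketch, assuming that cache-naming scheme and a cache directory of your choosing:

```python
import hashlib
import os
import urllib.request

# Files that the gpt2-style tiktoken encoding fetches at runtime
# (vocab.bpe is the one named in the error; encoder.json is fetched
# by the same code path).
URLS = [
    "https://openaipublic.blob.core.windows.net/gpt-2/encodings/main/vocab.bpe",
    "https://openaipublic.blob.core.windows.net/gpt-2/encodings/main/encoder.json",
]

def cache_filename(url: str) -> str:
    """tiktoken caches each blob under the SHA-1 hex digest of its URL."""
    return hashlib.sha1(url.encode()).hexdigest()

def prime_cache(cache_dir: str) -> None:
    """Run this on an internet-connected machine, then copy cache_dir into
    the container and set TIKTOKEN_CACHE_DIR to its in-container path."""
    os.makedirs(cache_dir, exist_ok=True)
    for url in URLS:
        target = os.path.join(cache_dir, cache_filename(url))
        if not os.path.exists(target):
            urllib.request.urlretrieve(url, target)

# Usage (on a connected machine): prime_cache("./tiktoken_cache")
```

This is a sketch, not a verified fix for Dify 1.0.0 specifically; the directory name and the exact set of URLs are assumptions, and newer tiktoken versions may additionally verify a content hash, which genuine downloads will satisfy.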
Wow, thank you so much for your quick response. However, I'm no longer at my company's office; I will give it a try next Monday. Thanks again.
By the way, does the setting above go in the 'api' service? I ask because I noticed that - ./volumes/plugin_daemon:/app/storage is already included in the 'plugin_daemon' service.
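On the question of which service: the traceback is a PluginInvokeError raised while the plugin daemon counts tokens, so the network request appears to originate inside the plugin daemon container rather than the api container. A hypothetical docker-compose override, where the service names and in-container path are assumptions based on the volume line quoted above and TIKTOKEN_CACHE_DIR is tiktoken's standard cache-location variable (not verified against Dify 1.0.0):

```yaml
# Hypothetical override; adjust names/paths to your actual compose file.
services:
  plugin_daemon:
    environment:
      TIKTOKEN_CACHE_DIR: /app/storage/tiktoken_cache
    volumes:
      - ./volumes/plugin_daemon:/app/storage   # already present per the comment above
```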