
The ollama plugin cannot engage in dialogue and returns empty. #15849

Open
IamWWT opened this issue Mar 14, 2025 · 2 comments

IamWWT commented Mar 14, 2025

Self Checks

  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

1. Is this request related to a challenge you're experiencing? Tell me about your story.


I've deployed Dify 1.0.0 in an intranet environment.
After installing the ollama model plugin, I attempted to create a native conversation application, but it failed to produce any output. The F12 developer tools did not indicate any errors. Upon examining the API container, it appears that certain dependent files need to be downloaded from the external network, including the file openai_public.py. I have modified the content of openai_public.py as follows. Despite restarting the container service and using docker cp to verify that the file has been successfully modified, the same error persists.

Could you please advise on any steps that I might have overlooked?

== openai_public.py: the modifications are as follows:

def gpt2():
    mergeable_ranks = data_gym_to_mergeable_bpe_ranks(
        # Original remote URLs, replaced with pre-downloaded local copies:
        # vocab_bpe_file="https://openaipublic.blob.core.windows.net/gpt-2/encodings/main/vocab.bpe",
        # encoder_json_file="https://openaipublic.blob.core.windows.net/gpt-2/encodings/main/encoder.json",
        vocab_bpe_file="/app/api_deps/vocab.bpe",
        encoder_json_file="/app/api_deps/encoder.json",
        # sha256 digests of the files above; tiktoken verifies the loaded bytes against them
        vocab_bpe_hash="1ce1664773c50f3e0cc8842619a93edc4624525b728b188a9e0be33b7726adc5",
        encoder_json_hash="196139668be63f3b5d6574427317ae82f612a97c5d1cdaf36ed2256dbf636783",
    )
    return {
        "name": "gpt2",
        "explicit_n_vocab": 50257,
        "pat_str": r50k_pat_str,
        "mergeable_ranks": mergeable_ranks,
        "special_tokens": {ENDOFTEXT: 50256},
    }
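
Incidentally, the two hash arguments above give you a direct way to rule out a corrupted local copy: tiktoken checks the loaded file bytes against those sha256 digests. A minimal sketch (using the paths and digests from the snippet above) to confirm the files inside the container are intact:

import hashlib

# Expected sha256 digests, copied from the gpt2() call above
expected = {
    "/app/api_deps/vocab.bpe": "1ce1664773c50f3e0cc8842619a93edc4624525b728b188a9e0be33b7726adc5",
    "/app/api_deps/encoder.json": "196139668be63f3b5d6574427317ae82f612a97c5d1cdaf36ed2256dbf636783",
}
for path, digest in expected.items():
    with open(path, "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()
    print(path, "OK" if actual == digest else "MISMATCH: " + actual)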

=== Error from container docker-api-1:

2025-03-14 09:02:40.245 ERROR [Thread-10 (_generate_worker)] [app_generator.py:243] - Unknown Error when generating
Traceback (most recent call last):
  File "/app/api/core/app/apps/chat/app_generator.py", line 223, in _generate_worker
    runner.run(
  File "/app/api/core/app/apps/chat/app_runner.py", line 58, in run
    self.get_pre_calculate_rest_tokens(
  File "/app/api/core/app/apps/base_app_runner.py", line 86, in get_pre_calculate_rest_tokens
    prompt_tokens = model_instance.get_llm_num_tokens(prompt_messages)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/api/core/model_manager.py", line 195, in get_llm_num_tokens
    self._round_robin_invoke(
  File "/app/api/core/model_manager.py", line 370, in _round_robin_invoke
    return function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/api/core/model_runtime/model_providers/__base/large_language_model.py", line 299, in get_num_tokens
    return plugin_model_manager.get_llm_num_tokens(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/api/core/plugin/manager/model.py", line 231, in get_llm_num_tokens
    for resp in response:
                ^^^^^^^^
  File "/app/api/core/plugin/manager/base.py", line 189, in _request_with_plugin_daemon_response_stream
    self._handle_plugin_daemon_error(error.error_type, error.message)
  File "/app/api/core/plugin/manager/base.py", line 221, in _handle_plugin_daemon_error
    raise PluginInvokeError(description=message)
core.plugin.manager.exc.PluginInvokeError: PluginInvokeError: {"args":{},"error_type":"ConnectionError","message":"HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /gpt-2/encodings/main/vocab.bpe (Caused by NameResolutionError(\"\u003curllib3.connection.HTTPSConnection object at 0x7fc82dd90860\u003e: Failed to resolve 'openaipublic.blob.core.windows.net' ([Errno -2] Name or service not known)\"))"}
2025-03-14 09:02:40.251 ERROR [Dummy-11] [base_app_generate_response_converter.py:123] - {"args":{},"error_type":"ConnectionError","message":"HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /gpt-2/encodings/main/vocab.bpe (Caused by NameResolutionError(\"\u003curllib3.connection.HTTPSConnection object at 0x7fc82dd90860\u003e: Failed to resolve 'openaipublic.blob.core.windows.net' ([Errno -2] Name or service not known)\"))"}
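
One detail worth noting in this traceback: the ConnectionError is wrapped in a PluginInvokeError and arrives through _request_with_plugin_daemon_response_stream, which suggests the failing download is attempted inside the plugin daemon's runtime, not the api container. If so, patching openai_public.py under /app/api in docker-api-1 would leave the error untouched. An alternative to patching is pre-seeding tiktoken's download cache and pointing TIKTOKEN_CACHE_DIR at it. A sketch for deriving the cache filenames (assuming tiktoken's usual behavior of naming each cached file after the SHA-1 hex digest of its source URL):

import hashlib

urls = [
    "https://openaipublic.blob.core.windows.net/gpt-2/encodings/main/vocab.bpe",
    "https://openaipublic.blob.core.windows.net/gpt-2/encodings/main/encoder.json",
]
for url in urls:
    # Each file is stored in TIKTOKEN_CACHE_DIR under the SHA-1 of its URL
    print(hashlib.sha1(url.encode()).hexdigest(), "<-", url.rsplit("/", 1)[-1])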


2. Additional context or comments

No response

3. Can you help us with this feature?

  • I am interested in contributing to this feature.
@dosubot dosubot bot added the 🐞 bug Something isn't working label Mar 14, 2025

dosubot bot commented Mar 14, 2025

Hey @IamWWT! 👋 I'm here to help you with any bugs, questions, or contributions you have. Let's squash those bugs together!

To address the ConnectionError related to openaipublic.blob.core.windows.net in Dify v1.0.0, you can follow these steps to create a local cache for the required files and update your Docker configuration:

  1. Create and populate a local tiktoken cache:

    mkdir tiktoken
    cd tiktoken
    wget https://openaipublic.blob.core.windows.net/gpt-2/encodings/main/vocab.bpe
    cp vocab.bpe 6d1cbeee0f20b3d9449abfede4726ed8212e3aee
    wget https://openaipublic.blob.core.windows.net/gpt-2/encodings/main/encoder.json
    cp encoder.json 6c7ea1a7e38e3a7f062df639a5b80947f075ffe6
  2. Update your Docker configuration to use the local cache by adding these settings:

    environment:
      TIKTOKEN_CACHE_DIR: /app/api/.tiktoken/
    volumes:
      - ./volumes/plugin_daemon:/app/storage
      - ./tiktoken:/app/api/.tiktoken

These steps should help resolve the issue by ensuring that the necessary files are available locally, avoiding the need for external network requests [1].
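
To confirm the cache is actually being picked up before retrying in the app, a quick offline check (a sketch; run it inside whichever container performs the tokenization, with the environment variable set before tiktoken loads anything):

    import os
    os.environ["TIKTOKEN_CACHE_DIR"] = "/app/api/.tiktoken"  # before importing tiktoken

    import tiktoken

    # Succeeds with no network access if the cached files are found
    enc = tiktoken.get_encoding("gpt2")
    print(enc.encode("hello world"))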


@crazywoola crazywoola removed the 🐞 bug Something isn't working label Mar 14, 2025

IamWWT commented Mar 14, 2025

Wow, thank you so much for your quick response. However, I'm no longer at my company's office, so I will give it a try next Monday. Thanks again.

By the way, should the settings above be written under the 'api' service? I ask because I noticed that - ./volumes/plugin_daemon:/app/storage is already included under the plugin_daemon service.
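
For reference, a sketch of one possible placement (assumptions: compose service names api and plugin_daemon as in the stock docker-compose.yaml, and the same in-container cache path in both; since the traceback above goes through the plugin daemon, mounting the cache into both services is the conservative choice):

    services:
      api:
        environment:
          TIKTOKEN_CACHE_DIR: /app/api/.tiktoken/
        volumes:
          - ./tiktoken:/app/api/.tiktoken
      plugin_daemon:
        environment:
          TIKTOKEN_CACHE_DIR: /app/api/.tiktoken/
        volumes:
          - ./volumes/plugin_daemon:/app/storage   # existing mount
          - ./tiktoken:/app/api/.tiktoken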
