
About the Deepseek model configuration #8

@yanjip

Description

First of all, thanks to the author for sharing this self-evolving agent project.

I used the API provided by Deepseek's official website. When I configured model: deepseek-reasoner in the yaml file and ran the example, the following error was reported:

```
LoongFlow/src/evolux/react/components/default_reasoner.py", line 47, in reason
    raise Exception(
Exception: Error code: litellm_error, error: litellm.BadRequestError: DeepseekException - {"error":{"message":"Missing reasoning_content field in the assistant message at message index 2. For more information, please refer to https://api-docs.deepseek.com/guides/thinking_mode#tool-calls","type":"invalid_request_error","param":null,"code":"invalid_request_error"}}
```

This problem is still unsolved.
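Per the error message and the linked Deepseek docs, deepseek-reasoner rejects a replayed chat history whose assistant messages lack a reasoning_content field when tool calls are involved. A minimal, hypothetical workaround is to backfill that field before resending the history; the helper name, message shape, and the use of an empty string are all assumptions, not LoongFlow or Deepseek API:

```python
def backfill_reasoning_content(messages):
    """Hypothetical workaround: make sure every assistant message in a
    chat history carries a reasoning_content key before the history is
    replayed to deepseek-reasoner, which rejects assistant turns that
    lack it (see the DeepseekException above)."""
    patched = []
    for msg in messages:
        msg = dict(msg)  # shallow copy so the caller's history is untouched
        if msg.get("role") == "assistant" and "reasoning_content" not in msg:
            # Assumption: an empty string satisfies the schema check.
            msg["reasoning_content"] = ""
        patched.append(msg)
    return patched

# Example history shaped like the failing one: an assistant turn with
# tool_calls but no reasoning_content (ids here are made up).
history = [
    {"role": "user", "content": "What is the weather?"},
    {"role": "assistant", "content": "", "tool_calls": [{"id": "call_1"}]},
]
fixed = backfill_reasoning_content(history)
```

In LoongFlow this would presumably have to be applied where default_reasoner.py rebuilds the message list, before the request is handed to litellm.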

Secondly, when configuring model: deepseek-chat, the error {"message":"Invalid max_tokens value, the valid range of max_tokens is [1, 8192]"} was reported. I solved this one myself: just add max_tokens when building the request.

```python
llm_request = CompletionRequest(messages=[user_message])
resp_generator = self.model.generate(llm_request)
```
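A runnable sketch of that fix, using a minimal stand-in for LoongFlow's CompletionRequest (the real class lives in the project; its exact signature and the max_tokens keyword are assumptions based on the fix described above):

```python
from dataclasses import dataclass

# Stand-in sketched from the snippet above, NOT the real LoongFlow class.
@dataclass
class CompletionRequest:
    messages: list
    max_tokens: int = 8192  # deepseek-chat's valid range is [1, 8192]

user_message = {"role": "user", "content": "hello"}
# The fix: pass max_tokens explicitly so it stays inside [1, 8192].
llm_request = CompletionRequest(messages=[user_message], max_tokens=8192)
```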

So I'd like to ask: is there a problem with my configuration?

Metadata

Labels: enhancement (New feature or request)