
Error: Local DeepSeek - 400 #10975

@Hun31

Description


Error Details

Model: Local DeepSeek
Provider: openai
Status Code: 400

Error Output

400 You passed 13439 input characters and requested 4096 output tokens. However, the model's context length is only 4096 tokens, resulting in a maximum input length of 0 tokens (at most 0 characters). Please reduce the length of the input prompt. (parameter=input_text, value=13439)
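The arithmetic behind this 400 is that the request reserves the full context window for output: the server subtracts the requested output tokens (4096) from the model's total context length (also 4096), leaving 0 tokens for the prompt. A minimal sketch of that budget calculation (illustrative only, not code from this issue; the function name and constants are assumptions):

```python
CONTEXT_LENGTH = 4096  # the local model's total context window, per the error


def max_input_tokens(requested_output_tokens: int,
                     context_length: int = CONTEXT_LENGTH) -> int:
    """Tokens left for the prompt after reserving the requested output."""
    return max(context_length - requested_output_tokens, 0)


# Reproducing the numbers from the error: 4096 - 4096 = 0,
# so a 13439-character prompt cannot fit at all.
print(max_input_tokens(4096))  # 0
print(max_input_tokens(1024))  # 3072 tokens left for the prompt
```

Under this reading, likely workarounds are to request fewer output tokens (e.g. a `max_tokens` well below 4096) or to serve the model with a larger context window.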


Metadata


Labels

kind:bug: Indicates an unexpected problem or unintended behavior
