fix: omit temperature from OpenAI-compatible requests when not explicitly set#12143

Draft
roomote-v0[bot] wants to merge 1 commit into main from fix/openai-compatible-omit-default-temperature
Conversation


@roomote-v0 roomote-v0 bot commented Apr 17, 2026

Related GitHub Issue

Closes: #12141

Description

This PR attempts to address Issue #12141. Feedback and guidance are welcome.

Problem: The OpenAI-compatible handler always sent temperature: 0 in API requests when no custom temperature was configured. Some OpenAI-compatible APIs (e.g. Kimi Coding) require a specific temperature value and reject any other with a 400 error.

Fix: When modelTemperature is not explicitly set by the user (null or undefined), the temperature parameter is now omitted from the request entirely, letting the upstream API apply its own model-specific default. When the user explicitly configures a temperature (including 0), it is sent as before.

Changes:

  • src/api/providers/openai.ts: Modified both streaming and non-streaming request paths to conditionally include temperature only when the user has explicitly set modelTemperature, or when using a DeepSeek reasoner model (which has its own default).
  • src/api/providers/__tests__/openai.spec.ts: Added 6 new tests covering the temperature omission behavior for both streaming and non-streaming modes. Updated 1 existing Azure AI Inference test to match the new behavior.
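The conditional-inclusion pattern described above can be sketched as follows. This is an illustrative reconstruction, not the actual openai.ts code: the helper name buildRequestBody, the request shape, and the 0.6 value for DEEP_SEEK_DEFAULT_TEMPERATURE are assumptions.

```typescript
interface ProviderOptions {
  // undefined/null means "user never set a temperature"
  modelTemperature?: number | null;
}

// Assumed constant; the real value lives in the repo's shared constants.
const DEEP_SEEK_DEFAULT_TEMPERATURE = 0.6;

// Hypothetical helper showing the idea: only spread `temperature` into the
// request body when the user set one explicitly, or when the model is a
// DeepSeek reasoner with its own default.
function buildRequestBody(
  options: ProviderOptions,
  isDeepSeekReasoner: boolean,
): Record<string, unknown> {
  // Check against null/undefined, NOT falsiness, so an explicit 0 still counts.
  const hasExplicitTemperature =
    options.modelTemperature !== undefined && options.modelTemperature !== null;

  return {
    model: "some-model",
    messages: [],
    // Spreading {} omits the key entirely, so upstream defaults apply.
    ...(hasExplicitTemperature
      ? { temperature: options.modelTemperature }
      : isDeepSeekReasoner
        ? { temperature: DEEP_SEEK_DEFAULT_TEMPERATURE }
        : {}),
  };
}
```

The key detail is distinguishing "unset" from "explicitly 0": a simple `options.modelTemperature || defaultTemp` check would wrongly treat a user-configured 0 as unset.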

Test Procedure

  • Ran cd src && npx vitest run api/providers/__tests__/openai.spec.ts -- all 53 tests pass
  • Ran all provider tests (cd src && npx vitest run api/providers/__tests__/) -- all 812 tests pass
  • Ran model-params tests to verify no side effects -- all 57 tests pass
  • Lint and type-check pass via pre-push hooks

Pre-Submission Checklist

  • Issue Linked: This PR is linked to an approved GitHub Issue (see "Related GitHub Issue" above).
  • Scope: My changes are focused on the linked issue (one major feature/fix per PR).
  • Self-Review: I have performed a thorough self-review of my code.
  • Testing: New and/or updated tests have been added to cover my changes.
  • Documentation Impact: No documentation updates are required.
  • Contribution Guidelines: I have read and agree to the Contributor Guidelines.

Documentation Updates

  • No documentation updates are required.

Additional Notes

The DeepSeek provider has its own createMessage() override and is not affected by this change -- it already falls back to DEEP_SEEK_DEFAULT_TEMPERATURE independently.


fix: omit temperature from OpenAI-compatible requests when not explicitly set

When no custom temperature is configured, the OpenAI handler was always
sending temperature: 0 in API requests. Some OpenAI-compatible APIs
(e.g. Kimi Coding) require specific temperature values and reject others.

This change omits the temperature parameter entirely when the user has
not explicitly set one, letting upstream APIs use their own model-specific
defaults. When a user configures a custom temperature, it is still sent.

Fixes #12141

Development

Successfully merging this pull request may close these issues.

[BUG] Error 400 with Kimi Coding Plan API "temperature can only be 0.6"