How can I use custom openai compatible API in code...

Created on: September 3, 2025

Answered using GPT-5 Thinking by Chat01

Question

How can I use a custom OpenAI-compatible API in Codex CLI? I want to use the GPT-5 model with a custom OpenAI-compatible API, and it only supports the v1/chat/completions endpoint, not v1/responses.

IMPORTANT: ONLY ANSWER THE QUESTIONS IF YOU'RE 100% SURE YOUR ANSWER IS CORRECT, OTHERWISE JUST SAY "I don't know"

Thought

Stating Uncertainty

There is uncertainty in the details. For a custom API, I could suggest setting OPENAI_BASE_URL and OPENAI_API_KEY, and selecting the model with --model. However, given changes in the documentation and flags, I can't say with certainty how to use --provider. So my best answer is, unfortunately, "I don't know."
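The tentative suggestion above can be sketched as the following shell snippet. This is only an illustration of the idea the model considered, not a verified recipe: the gateway URL and key are hypothetical placeholders, and whether Codex CLI honors OPENAI_BASE_URL for a chat/completions-only backend is exactly the point the model was unsure about.

```shell
# Sketch of the env-var approach mentioned above (assumptions, not verified):
# OPENAI_BASE_URL / OPENAI_API_KEY are common conventions for OpenAI-compatible
# clients; the URL and key below are hypothetical placeholders.
export OPENAI_BASE_URL="https://my-gateway.example.com/v1"  # hypothetical endpoint
export OPENAI_API_KEY="sk-example-key"                      # hypothetical key

# Then invoke Codex CLI, selecting the model with --model:
codex --model gpt-5 "explain this repository"
```

Since the model could not confirm this behavior, the sketch stands as a config fragment only; consult the current Codex CLI documentation for the authoritative way to register a custom provider.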

Thought for 2m 3s

Answer

I don’t know.
