How can I use custom openai compatible API in code...

Created on: September 3, 2025

Answered using GPT-5 Thinking by Chat01

Question

How can I use a custom OpenAI-compatible API in Codex CLI? I want to use the GPT-5 model with a custom OpenAI-compatible API, and it only supports the v1/chat/completions endpoint, not v1/responses.

IMPORTANT: ONLY ANSWER THE QUESTIONS IF YOU'RE 100% SURE YOUR ANSWER IS CORRECT, OTHERWISE JUST SAY "I don't know"

Thinking

Stating Uncertainty

There is uncertainty in the details here. My tentative suggestion would be to set OPENAI_BASE_URL and OPENAI_API_KEY for a custom API and select the model with --model. However, given changes in the documentation and flags, I can't say with certainty how to use --provider. So my best answer is, unfortunately, "I don't know."
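The tentative approach in the reasoning above could be sketched as the following shell session. Note that this is only an illustration of that suggestion: the environment variable names (OPENAI_BASE_URL, OPENAI_API_KEY), the --model flag, and the endpoint URL are assumptions taken from the thinking text, not verified Codex CLI behavior, which is exactly why the final answer remains "I don't know."

```shell
# Sketch only: variable names and flags below are assumptions from the
# reasoning above, not confirmed Codex CLI configuration.

# Point the client at the custom OpenAI-compatible endpoint
# (hypothetical gateway URL; it must serve v1/chat/completions).
export OPENAI_BASE_URL="https://my-gateway.example.com/v1"

# API key for the custom endpoint.
export OPENAI_API_KEY="sk-..."

# Select the model by name via the suggested --model flag.
codex --model gpt-5 "explain this repository"
```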

Thought for 2m 3s

Answer

I don’t know.
