Most projects also give you the option of providing a base URL for the API so that people can use Azure's endpoints. The same config option works with LiteLLM or a similar proxy tool, which exposes an OpenAI-compatible interface in front of other models, whether that's a competitor like Claude or a local model like Llama or Mistral.
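As a sketch of how that might look with the LiteLLM proxy: the config below maps an OpenAI-style model name onto an Anthropic model, and the client then only needs the proxy's base URL. The model names, port, and environment variable are illustrative examples, not prescriptions; check the LiteLLM docs for the exact options your version supports.

```yaml
# config.yaml — LiteLLM proxy routing an OpenAI-style name to Claude
model_list:
  - model_name: gpt-4o                       # name clients will request
    litellm_params:
      model: claude-3-5-sonnet-20240620      # actual backend model (example)
      api_key: os.environ/ANTHROPIC_API_KEY  # read key from the environment
```

Start the proxy with `litellm --config config.yaml`, then point any OpenAI-compatible client's base URL at it (by default `http://localhost:4000`), and requests for `gpt-4o` are transparently served by Claude.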