How can I make the API response from the TaskingAI server “OpenAI compatible” when I self-host the inference server?
Please see the blog post: TaskingAI Example | Integrate with Ease: Using OpenAI-Compatible APIs with TaskingAI Services. It shows how to use the OpenAI-compatible API with the TaskingAI cloud platform. To use it with a self-hosted deployment, simply change the base_url to http://127.0.0.1:8080/v1 (assuming you did not change the default service port) and it should work.
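For reference, here is a minimal sketch using the official OpenAI Python SDK against a self-hosted instance. The API key and model ID are placeholders (use your own TaskingAI API key and the model ID configured in your project), and the base_url assumes the default port mentioned above:

```python
from openai import OpenAI

# Point the OpenAI SDK at the self-hosted TaskingAI server instead of api.openai.com.
# The api_key and model values below are placeholders.
client = OpenAI(
    base_url="http://127.0.0.1:8080/v1",  # default self-hosted service port
    api_key="YOUR_TASKINGAI_API_KEY",
)

response = client.chat.completions.create(
    model="YOUR_MODEL_ID",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```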