
OpenAI Introduces ChatGPT, Whisper APIs For Developers

Microsoft-backed OpenAI is now allowing third-party developers to integrate ChatGPT and Whisper into their products via its new application programming interface (API) releases.

So get ready to meet the AI chatbot in more apps and services than ever before.

Several platforms, such as Snap, Shopify, Instacart, and Quizlet, have already implemented the ChatGPT and/or Whisper API. The latter is an automatic speech recognition system that can transcribe and translate audio into text.

OpenAI has managed to reduce the cost of running ChatGPT by a whopping 90% since December 2022, and those savings are being passed on to developers using the API. The price is $0.002 per 1,000 tokens.
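To put that rate in perspective, here is a small back-of-the-envelope helper. The $0.002-per-1,000-tokens figure comes from the article; the function name is our own, purely for illustration.

```python
# Hypothetical cost estimator based on the quoted gpt-3.5-turbo rate of
# $0.002 per 1,000 tokens (the rate is from the article, not an SDK constant).

def estimate_cost_usd(tokens: int, rate_per_1k: float = 0.002) -> float:
    """Estimate the API cost in US dollars for a given token count."""
    return tokens / 1000 * rate_per_1k

# A 5,000-token exchange (prompt plus completion) works out to about one cent.
print(f"${estimate_cost_usd(5000):.3f}")
```

At that rate, even a million tokens costs only a couple of dollars, which is what makes the 90% reduction significant for high-volume integrations.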

The newly launched API is fueled by gpt-3.5-turbo, the very same model used in OpenAI’s ChatGPT product. It is said to be the best model even for many non-chat use cases.
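A minimal sketch of what calling the new API looks like from Python, using only the standard library. The endpoint URL and chat-format request body follow OpenAI's API documentation; everything else (function names, the assumption that an `OPENAI_API_KEY` environment variable is set) is illustrative, and no request is sent unless you call `chat()` yourself.

```python
import json
import os
from urllib import request

# Chat Completions endpoint, per OpenAI's API documentation.
API_URL = "https://api.openai.com/v1/chat/completions"

def build_payload(user_message: str) -> dict:
    """Assemble the chat-format request body the gpt-3.5-turbo API expects."""
    return {
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": user_message}],
    }

def chat(user_message: str) -> str:
    """Send one user turn and return the assistant's reply text.

    Assumes OPENAI_API_KEY is set in the environment.
    """
    req = request.Request(
        API_URL,
        data=json.dumps(build_payload(user_message)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Note the `messages` list: unlike the older text-completion endpoints, the chat API takes a structured conversation, which is also how non-chat tasks are expressed with this model.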

For developers requiring a humongous number of tokens per day, a dedicated-capacity option is also on the menu. Dedicated capacity allows for features such as longer context limits to be enabled, says TechCrunch. Longer context could also help keep gpt-3.5-turbo from making up facts, an AI phenomenon dubbed “hallucination”.

For those who need to design a bot that can handle voice commands rather than text queries, there’s the Whisper API. This AI-powered speech-to-text model is capable of transcribing audio into text at the cost of a mere $0.006 per minute. It can work with M4A, MP3, MP4, WAV, MPEG, MPGA, and WEBM file formats.

Apart from announcing the new APIs, OpenAI also pinky swore that it wouldn’t use data submitted through the API for service improvements, including model training. However, developers can volunteer to share their data, should they wish to do so. It is also introducing a default 30-day data retention policy for API users.

The T&Cs are being simplified, including the parts that cover data ownership—it’s been spelled out that users own the input and output of the models. This policy change is extremely important if OpenAI wants more companies to use its APIs since a lot of businesses are wary about feeding proprietary information to the bot.

OpenAI acknowledged that its uptime still needs a lot of work and said that improving it is the engineering team’s top priority at the moment.