A new report claims to know how much ChatGPT costs to run per day, including approximately how much each query costs. The popular AI chatbot may have set off an AI revolution in November 2022, but it has proven extremely expensive to maintain.
The new report comes from Dylan Patel, chief analyst at the research firm SemiAnalysis, who says it costs approximately $700,000 per day, or 36 cents per query, to keep the chatbot up and running.
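Taken together, the two figures imply a rough daily query volume. A minimal sketch of that arithmetic, using only the estimates reported by SemiAnalysis (these are not official OpenAI numbers):

```python
# Back-of-the-envelope check on the reported SemiAnalysis estimates.
DAILY_COST_USD = 700_000       # reported daily operating cost
COST_PER_QUERY_USD = 0.36      # reported average cost per query

# Dividing the daily cost by the per-query cost gives the implied volume.
implied_queries_per_day = DAILY_COST_USD / COST_PER_QUERY_USD
print(f"Implied query volume: {implied_queries_per_day:,.0f} queries/day")
# → roughly 1.9 million queries per day
```

This is purely illustrative; the real per-query cost varies with prompt length and model load, so the implied volume is an order-of-magnitude figure at best.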
It’s so expensive, in fact, that Microsoft might be developing its own proprietary AI chips to assist in the maintenance of OpenAI’s operation of ChatGPT, according to Windows Central.
In addition to quickly reaching 100 million active users in January, a feat that previously took tech brands years to achieve, ChatGPT has struggled with high traffic and capacity issues slowing down and crashing its servers. The company attempted to remedy this by introducing a paid ChatGPT Plus tier, which costs $20 per month; however, there is no word on how many users subscribe to the paid option.
OpenAI currently uses Nvidia GPUs not only for its own ChatGPT processes but also for those of the brands it partners with. Industry analysts expect the company will likely require an additional 30,000 GPUs from Nvidia to maintain its commercial performance through the remainder of 2023 alone.
With Microsoft as one of its primary collaborators and investors, OpenAI may be looking to the tech brand to help develop hardware that brings down ChatGPT's operating costs. According to Windows Central, Microsoft already has such an AI chip in the works. Code-named Athena, it is currently being tested internally with the brand's own teams. The chip is expected to be introduced next year for Microsoft's Azure AI services.
There is no word on how or when the chip will trickle down to OpenAI and ChatGPT, but the assumption is that it will; the connection stems from the fact that ChatGPT runs on Azure services. The AI chip might not fully replace Nvidia GPUs, but it could reduce demand for that hardware and thereby lower the cost of running ChatGPT, Windows Central added.