In 2025, entrepreneurs will unleash a flood of AI-powered apps. Generative AI will finally proliferate through a new crop of affordable consumer and business applications. This is not the consensus view today. OpenAI, Google, and xAI are locked in an arms race to train the most powerful large language models (LLMs) in pursuit of artificial general intelligence, known as AGI, and their gladiatorial battle dominates the mindshare and revenue share of the fledgling generative AI ecosystem.

For example, Elon Musk raised $6 billion to launch newcomer xAI and bought 100,000 Nvidia H100 GPUs, the expensive chips used for AI processing; training its model, Grok, cost $3 billion. At those prices, only tech giants can afford to build these huge LLMs.

The exorbitant spending by companies like OpenAI, Google, and xAI has created a lopsided ecosystem that is bottom-heavy and top-light. The LLMs trained on these huge GPU farms are typically also very expensive for inference, the process of entering a prompt and generating a response from the large language model that is embedded in every AI-powered app. It is as if everyone had 5G smartphones, but using data to watch TikTok videos or browse social media was too expensive for anyone. As a result, excellent LLMs with high inference costs have made it prohibitive for killer apps to proliferate.

This lopsided ecosystem of ultra-rich tech moguls battling one another has enriched Nvidia while forcing application developers into a dilemma: use lower-cost, lower-performance models and frustrate users, or pay exorbitant inference costs and risk going bankrupt.

In 2025, a new approach will emerge that could change everything. It will echo what we have learned from past technology revolutions, such as the PC era of Intel and Windows or the mobile era of Qualcomm and Android, in which Moore's law improved PCs and apps annually, and lower bandwidth costs improved mobile phones and apps year after year.

But what about those high inference costs? A new law for AI inference is emerging: the cost of inference has been falling by a factor of 10 per year, driven by new AI algorithms, inference technologies, and better chips at lower prices.

As a reference point, if a third-party developer had used OpenAI's top-of-the-line models to build an AI search engine in May 2023, the cost would have been about $10 per query, compared with roughly $0.01 for a Google non-generative-AI search: a 1,000x difference. By May 2024, the price of using OpenAI's top model had dropped to about $1 per query. At this unprecedented rate of a 10x price drop per year, application developers will be able to use ever higher-quality and lower-cost models, leading to a proliferation of AI apps over the next two years.
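To make the arithmetic concrete, here is a minimal illustrative sketch, not from the article itself: it takes the two data points quoted above ($10 per query in May 2023, $1 in May 2024) and extrapolates the assumed 10x-per-year decline to show when generative-AI query costs would reach rough parity with the ~$0.01 conventional-search benchmark. The function name and later-year figures are purely illustrative extrapolations.

```python
# Illustrative extrapolation of the 10x-per-year drop in per-query inference cost.
# May 2023 ($10/query) and May 2024 ($1/query) come from the text above;
# subsequent years are projections under that assumed rate, not reported prices.

GOOGLE_NON_GENAI_COST = 0.01   # approx. cost of a conventional search query ($)
ANNUAL_COST_DROP = 10          # assumed 10x price decline per year

def projected_cost(start_cost: float, years_elapsed: int) -> float:
    """Per-query cost after a given number of years of 10x annual declines."""
    return start_cost / (ANNUAL_COST_DROP ** years_elapsed)

if __name__ == "__main__":
    start_year, start_cost = 2023, 10.0   # ~$10/query in May 2023 (per the text)
    for years in range(4):
        cost = projected_cost(start_cost, years)
        note = "~ parity with non-gen-AI search" if cost <= GOOGLE_NON_GENAI_COST else ""
        print(f"May {start_year + years}: ~${cost:,.2f} per query {note}")
    # Prints $10 (2023), $1 (2024), $0.10 (2025), $0.01 (2026): the point at which
    # a generative-AI query costs roughly the same as a conventional search.
```

Under this trajectory, per-query costs reach the conventional-search benchmark around 2026, which is the economic shift the rest of the section argues will unlock affordable AI apps.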
