AI chatbots lose money every time you use them. That’s a problem.

Summary

AI chatbots have become immensely popular, but they are expensive to run, and the best models are often held back because of their heavy computing requirements. That expense limits their quality and keeps them from becoming widely available, as the companies behind them struggle to turn a profit. Running today's large language models requires specialized computer chips that are in short supply, which is funneling profits to the chipmakers, cloud computing companies, and tech giants that offer AI tools. Ads are unlikely to make cutting-edge AI tools profitable anytime soon, and the energy that AI computing consumes raises environmental concerns.

Q&As

What are some of the limitations of AI chatbots and why?
AI chatbots are constrained by a shortage of the specialized computer chips they require, a propensity to spit out biased results or blatant falsehoods, and restricted access to the best available models. These limitations stem largely from the enormous cost of running today's large language models.

How can the cost of running today’s large language models be reduced?
Companies are working to make AI language models cheaper to run by routing queries to lightweight models, designing their own AI chips, and building on open-source language models.
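The first of these strategies can be sketched in code. The following is a minimal, hypothetical illustration of routing queries to a lightweight model when they look simple and reserving the expensive large model for harder ones; the model tiers, cost figures, and complexity heuristic are all invented for illustration and do not reflect any real provider's pricing or API.

```python
# Hypothetical cost per 1,000 tokens for each model tier (illustrative only).
COST_PER_1K_TOKENS = {"light": 0.0004, "large": 0.06}

def estimate_complexity(query: str) -> float:
    """Crude stand-in for a real classifier: longer, multi-part
    questions are treated as more complex."""
    score = len(query.split()) / 50.0      # length contribution
    score += 0.3 * query.count("?")        # multi-question contribution
    return min(score, 1.0)

def route_query(query: str, threshold: float = 0.5) -> str:
    """Pick the cheapest model tier expected to handle the query."""
    return "large" if estimate_complexity(query) >= threshold else "light"

def query_cost(query: str, tokens: int = 500) -> float:
    """Estimated cost of answering the query with the routed tier."""
    tier = route_query(query)
    return COST_PER_1K_TOKENS[tier] * tokens / 1000

print(route_query("What time is it?"))  # → light
```

In a real system the heuristic would be replaced by a trained router or a cheap first-pass model, but the economics are the same: every query kept off the large model avoids most of its serving cost.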

How has the battle for access to GPUs impacted Big Tech’s vision for AI?
The battle for access to GPUs has turned their leading suppliers into tech giants in their own right, holding the keys to what has become the technology industry's most prized asset. The scarcity has limited which companies can afford to run large models and pressured even the world's richest companies to turn chatbots into moneymakers sooner than they may be ready to.

How are tech companies trying to make AI tools profitable?
Tech companies are experimenting with building advertisements into their AI-powered tools, offering paid subscriptions, and racing for ways to make AI language models cheaper.

What are some of the potential consequences of increased computing needs for AI?
Increased computing needs for AI have implications for greenhouse gas emissions, since running these models consumes large amounts of energy. High costs could also slow the development and application of AI for more meaningful uses, such as health care, drug discovery, and cancer detection.

AI Comments

👍 This article gives a comprehensive overview of the current state of AI chatbots, from their economic costs to the environmental impacts of their usage.

👎 This article paints a pessimistic view of the future of AI chatbots, highlighting the current limitations in their usage and profitability.

AI Discussion

Me: It's about how AI chatbots are losing money every time they're used. It talks about how expensive large language models are, and how this is limiting their quality and even threatening the global AI boom. It also talks about how only the wealthiest companies can afford them because of the intense computing power they require.

Friend: Wow, that's really interesting. It makes a lot of sense that only the wealthiest companies can afford them. What are the implications of this article?

Me: Well, it means that AI chatbots may not be as widely accessible as they could be. It also means that even with the potential for ads to make them profitable, it's still not likely to happen anytime soon. This could lead to tech giants monopolizing the AI space, as they have the resources to pay for the best language models. It also points to the need for more efficient and sustainable AI systems that can make AI more accessible and affordable, while minimizing environmental impacts.

Technical terms

AI chatbots
Artificial Intelligence chatbots are computer programs that are designed to simulate conversation with human users.
ChatGPT
ChatGPT is an AI chatbot developed by OpenAI, a research lab focused on artificial intelligence.
Bard
Bard is an AI chatbot developed by Google.
GPUs
Graphics Processing Units are specialized computer chips that process large amounts of data in parallel, making them well suited to AI applications.
Nvidia
Nvidia is the company that designs the GPUs most widely used for AI.
TSMC
Taiwan Semiconductor Manufacturing Company is a contract chipmaker that fabricates chips, including Nvidia's GPUs, for other companies.
LaMDA
LaMDA is a language model developed by Google.
Sparrow
Sparrow is an AI chatbot developed by Google's DeepMind subsidiary.
Meta
Meta is the tech giant behind Facebook; it has released language models that start-ups and researchers can build on.
LLaMA
LLaMA is a language model developed by Meta.

Similar articles

🤖 ChatGPT might be in trouble...

Tech experts are starting to doubt that ChatGPT and A.I. ‘hallucinations’ will ever go away: ‘This isn’t fixable’

What if Generative AI turned out to be a Dud?

OpenAI Is Now Everything It Promised Not to Be: Corporate, Closed-Source, and For-Profit
