
How ChatGPT Makes Money — And Why Its Future Is Paved With Ads

Posted on: December 3, 2025 at 03:00 AM

If you are not paying for the product, you are the product.

Dear reader, we have been here before. Tech companies have learned that the best way to get billions of people to use their products is to give them away in a freemium version. Alphabet gives you free Gmail. Facebook and X give you echo chambers that reinforce your preferred view of the world.

And to be fair, they are great products. X and Facebook give you immediate access to whatever is happening around the world from unfiltered — sort of — sources. This level of real-time visibility was unprecedented for traditional news outlets, so much so that many now rely on social media videos and posts for breaking coverage.

Social media, for all its faults, democratized media itself. Just look at the stories of people who always wanted to be reporters or entertainers and found in these networks a way to make it possible. No YouTube, no MrBeast.


1. All We Know About How ChatGPT Makes Money

Because OpenAI is not a publicly traded for-profit company, we don’t have investor calls breaking down revenue by channel. I’ll do the next best thing and use trusted sources with credible reporting.

Here are the known revenue streams tied to ChatGPT:

  • Paid subscriptions (Plus / Pro / Team / Enterprise)
  • API usage (developers & enterprises)
  • Early signs of “shopping” and commerce integrations
  • Speculative: ad formats, sponsored results, affiliate models

So, how much money does ChatGPT make?

In 2024, OpenAI is estimated to have generated around $3.7–4 billion in revenue. By mid-2025, that revenue may have tripled, with several reports placing it around $13 billion. If growth holds, OpenAI could be on a trajectory toward $20 billion a year.

Sources:

  1. OpenAI’s annualized revenue hit about $10 billion by June 2025, up from $5.5 billion in December 2024 (Reuters).
  2. Independent analysis (Epoch / Sacra) estimates OpenAI’s annualized revenue at around $13 billion by mid-2025, up from ~$4 billion in 2024.

2. How Much It Will Cost to Run ChatGPT for 1 Billion People

Who Is Using ChatGPT?

Seoprofy shared the ChatGPT Statistics and Trends 2025 back in September. Highlights:

  • Nearly 800 million people use ChatGPT weekly.
  • ChatGPT holds 60.6% of the AI market.
  • OpenAI’s platform is the 5th most visited website in the world.
  • Over 60% of ChatGPT users are aged 25–34.

Sam Altman also said ChatGPT reached 800 million weekly active users.

Now, nobody — outside OpenAI and Microsoft — has access to precise usage numbers. But we can make strong, informed estimates from public data.

While training costs matter, our goal here is to estimate the day-to-day computational cost of running ChatGPT for 1 billion people: inference, not training.


What Percentage of Those 800 Million Users Are Actually Paying?

According to Business Standard, OpenAI has 35 million paid ChatGPT users:

“According to The Information, around 35 million people (about 5% of weekly users) were already paying for ChatGPT Plus at $20 per month or Pro at $200 per month as of July 2025.”

Thirty-five million out of 800 million may seem low — but it’s actually excellent.

Let me show you why.

According to these Gmail statistics:

  • Gmail has 1.8 billion active users
  • Roughly 6+ million paying customers
  • Nearly $12 billion in revenue

Even if we assume each paying customer represents two seats, that’s about 12 million paid accounts — less than 1% of the total.

ChatGPT’s ~5%? That’s extremely strong in comparison.
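
If you want to check my napkin math, here is a quick sketch using the figures above. These are rounded public estimates, not official numbers from OpenAI or Google:

```python
# Back-of-envelope comparison of paid-conversion rates, using the figures cited above.
# All inputs are rounded public estimates, not official company numbers.

chatgpt_weekly_users = 800_000_000
chatgpt_paid_users = 35_000_000

gmail_active_users = 1_800_000_000
gmail_paid_accounts = 12_000_000  # ~6M paying customers, assumed to cover 2 seats each

chatgpt_conversion = chatgpt_paid_users / chatgpt_weekly_users
gmail_conversion = gmail_paid_accounts / gmail_active_users

print(f"ChatGPT paid conversion: {chatgpt_conversion:.1%}")  # ~4.4%
print(f"Gmail paid conversion:   {gmail_conversion:.2%}")    # ~0.67%
```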


What We Know About the GPTnomics

  • Epoch estimates OpenAI spent $7 billion on compute in 2024, with $2 billion for inference.
  • Other reports suggest $5.6 billion total compute spend in 2024, with $2.5 billion in the first half of 2025 alone.
  • Another leaked analysis suggested 2024 revenue ~ $4B and inference costs ~ $2B.

In short: incredible growth, razor-thin (or negative) margins.

At current scale, it’s plausible that just serving ChatGPT costs OpenAI a few dollars per active user per year, before salaries, research, safety, or infrastructure.

At 1 billion users, inference alone might cost $4–6 billion USD — and if that user base shifts to daily usage, the cost could surge to $20+ billion USD easily.
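
Here is that back-of-envelope math as a quick sketch. The per-user cost and the daily-usage multiplier are assumptions pulled from the ranges above, not OpenAI figures:

```python
# Rough inference-cost scaling for the 1-billion-user scenario described above.
# The per-user cost and the daily-usage multiplier are illustrative assumptions.

inference_cost_per_user_per_year = 5.0   # assumed midpoint of the $4-6 per user range
users = 1_000_000_000                    # the (intentionally rounded) 1 billion users

baseline_cost = inference_cost_per_user_per_year * users
print(f"Weekly-usage baseline: ${baseline_cost / 1e9:.0f}B per year")   # ~$5B

# If those users shift from weekly to heavy daily usage, assume usage
# (and therefore serving cost) grows roughly 4-5x.
daily_usage_multiplier = 4.5
surge_cost = baseline_cost * daily_usage_multiplier
print(f"Daily-usage scenario:  ${surge_cost / 1e9:.0f}B per year")      # ~$22B, the "$20B+" case
```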

When only 35 million users are paying, this model is not profitable.

*My 1 billion users figure is intentionally rounded for clarity and whimsy.*


The Path to ChatGPT’s Profitability Is Paved With Ads

Back in 2000, Google launched AdWords, the product we now know as Google Ads. Ten years later, the advertising world had been altered forever.

I see no reason why history won’t repeat itself here.
Do you?
If not — drop me a line on LinkedIn – Page Carbajal.


💰 Google’s Estimated Annual Ad Revenue From 1 Billion Users

  • A. Estimated annual Google ad revenue (2024 forecast): $273.37B
  • B. Estimated total Google Search users (2019 benchmark): ≈ 3.67B users
  • C. Estimated annual ad revenue per user (ARPU) = A ÷ B = $273.37B ÷ 3.67B ≈ $74.49 per user
  • D. Estimated ad revenue from 1B people = ARPU × 1B = $74.49 × 1B ≈ $74.49B

Google makes just shy of $75 billion USD from ads shown to 1 billion people.
– Table generated with Gemini.

This is the potential for ChatGPT Ads.
There is a lot of money on the table.
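
As a thought experiment, here is what Google’s ARPU would look like applied to ChatGPT’s audience. OpenAI has not announced an ad product or any pricing, so treat this as a napkin sketch, not a forecast:

```python
# Thought experiment: apply Google's search-ads ARPU (from the breakdown above)
# to ChatGPT's user base. Purely illustrative; these are assumptions, not projections.

google_ad_revenue_2024 = 273.37e9      # estimated annual Google ad revenue (2024 forecast)
google_search_users = 3.67e9           # 2019 benchmark user count
arpu = google_ad_revenue_2024 / google_search_users
print(f"Google ad ARPU: ~${arpu:.2f} per user per year")   # ~$74.49

chatgpt_weekly_users = 800_000_000
print(f"At Google-like ARPU, 800M users ≈ ${arpu * chatgpt_weekly_users / 1e9:.0f}B/year")  # ~$60B
print(f"At 1B users ≈ ${arpu * 1e9 / 1e9:.0f}B/year")                                       # ~$74B
```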


$75 Billion Reasons Why YOU Are the Product — Again

I was going to go into more detail about how OpenAI could bank even more than these numbers show. But I trust you’ve already figured that out — since you’re clearly sharp enough to be reading this article. Nice work, you.

Making money is great. I love it.
What I don’t love is how powerful services like these can be used to influence opinion.

And this time, it’s worse than the social media era — because people are using GPTs in deeply intimate ways.

If you use ChatGPT in any of the following ways, stop immediately:

  • As a personal therapist
  • As a training coach
  • As a medical advisor
  • As a legal advisor

There are many reasons not to do this. But the most important is that LLMs have a Satisficing Bias. If you ask them to buy an orange, they might gift-wrap it for a birthday party. They’re thirsty for approval, and that is dangerous in these scenarios.


Please read:
Using ChatGPT as a Therapist: 11 Reasons Why It Shouldn’t Be Your Choice.


You Can Use ChatGPT Safely

Fear not, dear reader — you can absolutely keep using ChatGPT.
Just follow these recommendations while exercising your better judgment.

How to use any AI tool without handing over your soul

  1. Don’t share anything you wouldn’t shout across a crowded café.
    Names, IDs, bank info — keep all that out of your prompts.

  2. Anonymize everything by default.
    Swap names, companies, and locations (see the sketch after this list).
    This is basic protection, not Harry’s cloak of invisibility.

  3. Separate yourself from your work.
    Use anonymous chats or different accounts if possible.
    Keep personal + emotional + identifiable info away from cloud AIs.

  4. Assume your prompts could one day be leaked.
    Write accordingly.

  5. If it’s company data, get company permission.
    You don’t want a late-night “We need to talk” message from Legal.
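
Here is a minimal sketch of step 2 in code. The names, the replacement map, and the whole setup are hypothetical placeholders; the point is simply to swap identifiers before a prompt ever leaves your machine:

```python
# Minimal sketch of step 2: strip identifying details out of a prompt before sending it.
# The names and the replacement map below are hypothetical examples, not a real API.

REPLACEMENTS = {
    "Jane Doe": "PERSON_A",
    "Acme Corp": "COMPANY_A",
    "Springfield": "CITY_A",
}

def anonymize(prompt: str) -> str:
    """Replace known names, companies, and locations with neutral placeholders."""
    for real, placeholder in REPLACEMENTS.items():
        prompt = prompt.replace(real, placeholder)
    return prompt

prompt = "Draft an email from Jane Doe at Acme Corp about the Springfield office move."
print(anonymize(prompt))
# Draft an email from PERSON_A at COMPANY_A about the CITY_A office move.
```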

AI is a tool you use — not your drinking buddy.
So do not drink and GPT.


One More Thing

Remember: AI will continue to evolve, business models will keep shifting, and ads will inevitably follow the attention. But you’re not powerless in this exchange — you get to decide what you share, how you use the tools, and what boundaries you set.

Use AI with intention. Protect the parts of your life that matter. And enjoy the leverage it gives you.

The future isn’t written yet, but you get to influence your corner of it.
Protect yourself — and let’s hope Sam Altman prevents another Cambridge Analytica fiasco.
