ZDNET Highlights
- GitHub has changed the pricing for its flagship Copilot service.
- Under the new AI Credit approach, if you run out of credit, you can’t use the service.
- Users, who expect to see far higher bills, already hate the deal.
It has been an open secret that people have been paying nowhere near the full cost of their AI services. Now the bill is coming due. GitHub announced that by June 1, 2026, all GitHub Copilot plans will move to usage-based billing.
This is a radical change from its current Premium Request Unit (PRU) system. Going forward, users will consume a monthly allocation of GitHub AI credits based on token consumption, including inputs, outputs, and cached tokens, at published API rates. In other words, GitHub is moving towards a token-based pricing model.
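As a rough illustration, a token-metered bill is just the sum of tokens consumed multiplied by per-token rates. The rates and token counts below are hypothetical placeholders, not GitHub's actual published API rates; this is only a sketch of why a long agentic session costs far more than a quick chat under token-based pricing.

```python
# Hypothetical sketch of usage-based billing. Rates and token counts are
# made-up placeholders, NOT GitHub's actual published API rates.

def session_cost(input_tokens, output_tokens, cached_tokens,
                 in_rate, out_rate, cached_rate):
    """Cost in dollars for one session; rates are in $ per million tokens."""
    return (input_tokens * in_rate
            + output_tokens * out_rate
            + cached_tokens * cached_rate) / 1_000_000

# A long agentic run burns orders of magnitude more tokens than a quick
# chat question, so under token metering it costs proportionally more.
quick_chat = session_cost(2_000, 500, 0, 3.00, 15.00, 0.30)
agent_run = session_cost(4_000_000, 800_000, 2_000_000, 3.00, 15.00, 0.30)
print(f"quick chat: ${quick_chat:.4f}, agent session: ${agent_run:.2f}")
```

Under the old PRU model both sessions could cost the same flat amount; under token metering the second one is roughly a thousand times more expensive.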
Smart people saw this coming. A week ago, GitHub blocked users from signing up for new GitHub Copilot subscriptions. It also began restricting the models available on its individual subscription plans, and shut down access to Opus models entirely. A price increase was clearly on the way.
Why? According to GitHub, Copilot is no longer the service it used to be. What was once a smart programming editor has evolved into “an agentic platform capable of running long, multi-step coding sessions, using the latest models, and iterating across entire repositories.” On top of this, “agentic use is becoming the default, and this brings significantly greater compute and inference demands.”
GitHub claims that its current premium request model is not sustainable. After all, the company said, “a quick chat question and a multi-hour autonomous coding session may cost the user the same amount of money,” with GitHub absorbing the rising inference costs. The usage-based model, it argues, is meant to maintain long-term service reliability.
The good news is that, for now anyway, base subscription prices remain unchanged. Copilot Pro is $10 per month, and Pro+ is $39 per month. However, these subscriptions will now include monthly AI credits matching their dollar value. That means Pro subscribers get $10 in credits, while Pro+ users get $39. I don’t know why GitHub felt the need to spell this out.
Code completions and next edit suggestions will remain included without consuming AI credits. Users on annual plans will continue with PRU-based pricing until their plans expire, at which point they will transition to Copilot Free with upgrade options, or they can switch to the new monthly plans with a prorated credit.
Copilot Business ($19 per user per month) and Copilot Enterprise ($39 per user per month) keep their current prices while adding the equivalent monthly AI credit per seat. To ease the transition, GitHub will offer promotional credits for June, July, and August 2026: Business customers get an extra $30 per month, and Enterprise customers get an extra $70 per month.
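Putting the announced numbers together: a seat's monthly credit during the promotional window is simply the plan price plus the promotional credit. A trivial sketch using the figures above:

```python
# Included monthly AI credit equals the seat price; the June-August 2026
# promotional credit stacks on top (figures from GitHub's announcement).
def monthly_credit(seat_price: float, promo: float = 0.0) -> float:
    return seat_price + promo

business_per_seat = monthly_credit(19, promo=30)    # Copilot Business
enterprise_per_seat = monthly_credit(39, promo=70)  # Copilot Enterprise
```

So during the promotion a Business seat carries $49 in monthly credits and an Enterprise seat $109, before dropping back to the bare seat price in September.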
However, and this is important, in the past, when you ran out of PRUs, you simply dropped down to a less capable model. Under the new AI credit approach, when you run out of credits, you’re out of luck. If you want to keep working, you’ll have to buy more credits.
Organizations will benefit from pooled usage across teams, so capacity no longer sits trapped in individual users’ unused credits. Administrators will gain budget controls at the enterprise, cost center, and user levels, with options to allow additional purchases or cap spending once the included pool is exhausted.
GitHub plans to launch a billing preview in early May. This will give you a look at your estimated costs before your first new bill arrives in June.
Many users aren’t waiting to dismiss this new pricing plan as a bad deal. As one Reddit poster put it, “I don’t think companies will be so happy when they get a bill 50 times bigger. People really underestimate how many tokens they use.” Another shrugged, “They could have shut down Copilot entirely. Literally the only reason to stay is because you’re familiar with it and aren’t willing to invest 30 minutes of your life to become familiar with Claude Code, Codex, or whatever.”
However, despite all the complaints, this news is hardly surprising. Those who noticed the rising cost of AI – memory is more expensive than ever, and gigawatt datacenters don’t build themselves – knew it was coming.
Other companies have already started raising their rates. For example, OpenAI raised the price developers pay for its flagship GPT-5.2 model to $5.75 per million input tokens, up from $1.25 for the previous GPT-5.1. Meanwhile, Anthropic confirmed a real price increase for its Claude Enterprise edition on April 15, when it moved from fixed pricing to a dynamic usage-based model.
(Disclosure: ZDNET’s parent company Ziff Davis filed a lawsuit against OpenAI in April 2025, alleging it infringed Ziff Davis copyrights in the training and operation of its AI systems.)
Like it or not, the days of cheap AI are almost over. I expect costs to increase 2 to 3 times by the end of the year, and I wouldn’t be surprised if prices go much higher than that.
