OpenAI Rolls Back ChatGPT’s Model Router System for Most Users

OpenAI has reversed a big change in how hundreds of millions of people use ChatGPT.

In a low-profile blog that tracks product changes, the company said it was rolling back ChatGPT's model router — an automated system that sends complex user questions to more advanced “reasoning” models — for users on its Free and $5-a-month Go tiers. Instead, those users will now default to GPT-5.2 Instant, the fastest and cheapest-to-serve version in OpenAI's new model series. Free and Go users will still have access to reasoning models, but they will have to select them manually.

The model router launched just four months ago as part of OpenAI's push to unify the user experience with the debut of GPT-5. The feature analyzes user queries before choosing whether ChatGPT answers them with a fast-reacting, low-cost-to-serve AI model or a slower, more expensive reasoning AI model. Ideally, the router would direct users to OpenAI's smartest AI models exactly when they need them. Previously, users accessed advanced systems through a confusing “model picker” menu, a feature that CEO Sam Altman said the company “hates as much as you do.”
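To make the idea concrete, here is a minimal, purely illustrative sketch of how a query router of this kind might choose between a fast model and a reasoning model. The model names, keyword list, and scoring rule are hypothetical assumptions for the example, not OpenAI's actual routing logic.

```python
# Toy query router: picks a model tier from a crude complexity heuristic.
# All names and thresholds below are hypothetical, for illustration only.

FAST_MODEL = "gpt-5.2-instant"        # cheap, low-latency default tier
REASONING_MODEL = "reasoning-model"   # hypothetical slower, costlier tier

COMPLEX_HINTS = ("prove", "step by step", "debug", "derive", "compare", "plan")

def route(query: str) -> str:
    """Return which model tier should answer the query."""
    score = 0
    score += len(query.split()) // 50                              # long prompts tend to be harder
    score += sum(hint in query.lower() for hint in COMPLEX_HINTS)  # task-type keywords
    return REASONING_MODEL if score >= 2 else FAST_MODEL

if __name__ == "__main__":
    print(route("What's the weather like today?"))                        # -> fast tier
    print(route("Derive and compare these two algorithms step by step"))  # -> reasoning tier
```

A production router would presumably use a learned classifier rather than keyword heuristics, but the shape of the decision is the same: estimate whether a query needs the expensive model before committing compute to it.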

In practice, the router seemed to send many more free users to OpenAI's advanced reasoning models, which are more expensive for OpenAI to serve. Shortly after its launch, Altman said the router increased the use of reasoning models among free users from less than 1 percent to 7 percent. It was a costly bet aimed at improving ChatGPT's responses, but the model router was not as widely embraced as OpenAI expected.

One source familiar with the matter tells WIRED that the router negatively affected the company's daily active users metric. While reasoning models are widely seen as the frontier of AI performance, they can spend minutes working through complex queries at significantly higher computational cost. Most consumers don't want to wait, even if it means getting a better answer.

Fast-reacting AI models continue to dominate general consumer chatbots, according to Chris Clark, the chief operating officer of AI inference provider OpenRouter. On these platforms, he says, the speed and tone of responses are paramount.

“If someone is typing something, and then you have to show bullet points for 20 seconds, it's just not very engaging,” Clark says. “For general AI chatbots, you are competing with Google [Search]. Google has always focused on making Search as fast as possible; they never said, 'Gee, we've got to get a better answer, but slow it down.'”


