I cancelled my ChatGPT Subscription for Pay-As-You-Go

I ditched my ChatGPT Plus subscription for a pay-as-you-go AI setup. Tired of rising costs and "subscription fatigue," I now use Open WebUI and OpenRouter AI.


The rise of AI in recent years has prompted many of us to explore its potential, and I was among the early adopters. I genuinely enjoyed it for what it was, once you stripped away the marketing hype. Over time, though, I found myself evaluating my usage patterns and ultimately decided to cancel my OpenAI ChatGPT Plus subscription in favor of a more flexible pay-as-you-go approach. Instead of paying a flat monthly fee, I opted for a self-hosted setup built on Open WebUI and OpenRouter AI that meets my family's occasional AI needs without unnecessary expense.

The Decision to Leave a Subscription Model

The crux of my decision revolved around my usage patterns. At $20 a month, the OpenAI ChatGPT Plus program became an expense I could no longer justify. Despite my appreciation for AI as a useful tool, my personal life did not demand the constant access provided by a subscription service. In my professional life, AI tools are useful for daily operations, but in a personal setting, those needs are significantly lower. According to a post from Lifehacker, users can access OpenAI services on a pay-as-you-go basis for as little as $0.002 per 1,000 tokens with the GPT-3.5 Turbo model. This pricing model makes it far more appealing for infrequent users who may only need to access AI occasionally (Lifehacker, 2023).
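To put that pricing in perspective, here is a quick back-of-the-envelope sketch in Python. The per-token rate is the GPT-3.5 Turbo figure cited above; the usage numbers are purely illustrative assumptions about light, occasional use, not measurements from my own account.

```python
# Rough cost comparison: ChatGPT Plus vs. pay-as-you-go API usage.
# The $0.002 per 1K tokens rate is the GPT-3.5 Turbo figure cited above;
# the usage numbers below are illustrative guesses for light family use.

PLUS_MONTHLY = 20.00            # ChatGPT Plus subscription, USD per month
RATE_PER_1K_TOKENS = 0.002      # GPT-3.5 Turbo, USD per 1,000 tokens

chats_per_month = 40            # assumption: a few chats per week
tokens_per_chat = 2_000         # assumption: prompt + response combined

monthly_tokens = chats_per_month * tokens_per_chat
payg_monthly = monthly_tokens / 1_000 * RATE_PER_1K_TOKENS

print(f"Pay-as-you-go: ~${payg_monthly:.2f}/month for {monthly_tokens:,} tokens")
print(f"Subscription:   ${PLUS_MONTHLY:.2f}/month flat")
```

Under those assumptions the pay-as-you-go bill comes out to cents per month rather than dollars, which is exactly the gap that made the flat fee hard to justify.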

With my existing setup using Open WebUI integrated with OpenRouter AI, my family can use AI whenever they want without worrying about runaway costs. OpenRouter provides access to a diverse range of AI models and providers, letting us pick the engine that fits each task, in contrast to the more rigid, single-provider service that comes with a ChatGPT subscription. Open WebUI can also point at locally run LLMs, which lets us use AI with sensitive data, personal documents like taxes and bills, that I would never entrust to a public service. For example, I can load those documents and ask a local model to find specific information without anything leaving my network.
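If you're wondering what "integrated with OpenRouter" actually means in practice, OpenRouter exposes an OpenAI-compatible API, so any OpenAI-style client can talk to it by pointing at openrouter.ai. The sketch below uses the openai Python package to show the idea; the model ID and environment variable name are my own placeholder assumptions, not a copy of my exact Open WebUI configuration.

```python
# Minimal sketch: talking to OpenRouter through its OpenAI-compatible API.
# Assumes an OPENROUTER_API_KEY environment variable; the model ID is
# illustrative, OpenRouter's catalog lists the currently available models.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenAI-compatible endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="openai/gpt-4o-mini",  # any model ID OpenRouter exposes
    messages=[{"role": "user", "content": "Summarize this week's grocery list."}],
)
print(response.choices[0].message.content)
```

The same pattern works against a locally hosted, OpenAI-compatible server, which is how the sensitive-document use case can stay entirely on your own hardware.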

Instead of the $20 a month I was paying for ChatGPT, I now throw roughly $5 into my OpenAI and OpenRouter accounts every couple of months, plus whatever it costs to run light models on my own GPU for more sensitive data. I also set my Open WebUI default to the free Gemini tier, which tends to cover most use cases anyway.

The Inherent Limitations of Subscription Models

One of the main drawbacks of subscription-based AI services is their inflexibility. Continual membership often leads to a phenomenon called "subscription fatigue," as consumers become overwhelmed by the plethora of monthly fees for services they may not use regularly. A survey from CNET indicated that nearly half of smartphone owners were unsure about paying extra for AI functionalities, reflecting a general unease about taking on yet another subscription (CNET, 2024). The increasing prevalence of "free" AI services also masks future costs, which can contribute to user confusion and unexpected expenses; the new Samsung Galaxy S25, whose bundled AI features are free for now but may not stay that way, is a good example.

Moreover, high monthly fees can be misaligned with the value perceived by users who primarily want convenience rather than heavy, always-on access. TechCrunch recently reported that OpenAI is considering raising the monthly price of ChatGPT Plus, taking it from $20 to potentially $44 by 2029 (TechCrunch, 2024). While these tools are great to use, they share the same problem as every other subscription service we already have: the cost of using them will keep growing as the companies behind them need to show continued revenue increases.

A Multi-Model Approach

Using OpenRouter AI allows me to connect to different models beyond just OpenAI's offerings. The flexibility of being able to mix and match based on the task at hand offers a significant advantage. Tech companies like OpenRouter advocate for a multi-provider strategy, focusing not only on constant availability but also on price optimization through diversified sources (OpenRouter, 2023).

A multi-model approach can optimize costs further, a critical advantage for a user looking to control expenses while maximizing functionality. With numerous AI engines available, when my family or I do decide to use a model, we can choose the one best suited to the task, whether that is text generation, summarization, or image generation. We can even mix models within a single chat, comparing outputs side by side or routing cheaper tasks to a cheaper model (see the sketch below).
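As a concrete illustration of that routing idea, here is a hedged sketch that maps tasks to models through the same OpenRouter endpoint, sending cheap jobs to a lighter model and harder jobs to a stronger one. The model IDs and tier choices are illustrative assumptions; check OpenRouter's catalog for current names and per-token prices.

```python
# Sketch of task-based model routing through OpenRouter.
# Model IDs are illustrative assumptions, not recommendations.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

# Map each kind of task to a model tier we're willing to pay for.
MODEL_FOR_TASK = {
    "summarize": "google/gemini-flash-1.5",      # cheap and fast (assumed ID)
    "draft": "anthropic/claude-3.5-sonnet",      # stronger, pricier (assumed ID)
}

def run(task: str, prompt: str) -> str:
    """Send the prompt to whichever model the task is mapped to."""
    response = client.chat.completions.create(
        model=MODEL_FOR_TASK[task],
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(run("summarize", "Summarize this utility bill in two sentences: ..."))
```

Open WebUI handles this kind of switching through its model picker, so in day-to-day use nobody in my family ever touches code; the point is simply that the per-task choice of model is where the savings come from.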

Mobile App

However, this approach also presents some challenges, most notably the absence of a native mobile app. Instead, you access the web UI from your phone's browser and save it to your home screen as a web app. Your phone will treat it much like a native app, but it is not one, so it lacks native OS integration such as sharing content into the app. The Open WebUI team has done a good job of making the interface usable on a phone, but every once in a while it still feels like it wasn't designed for one, especially once you drill into the more advanced features.

Conclusion

As AI continues to expand in various sectors of our lives, it's crucial to examine how best these tools can serve individual needs without imposing undue financial burdens. The decision to shift from a subscription model to a pay-as-you-go framework represents not only a personal choice for me, but also a broader conversation we should have around subscription fatigue.

In reevaluating my use of AI services, I've acknowledged that while they are beneficial, their value lies mostly in convenience rather than necessity. This has led to a paradigm shift in how I access AI tools; by taking advantage of multi-model platforms and frameworks like OpenRouter and Open WebUI, my family can maximize the use of AI while minimizing costs, paving the way for a more sensible and flexible approach to AI integration in our daily lives.


References

  1. Lifehacker. (2023). OpenAI’s Pay-As-You-Go is the Best Way to Use ChatGPT. Retrieved from Lifehacker
  2. TechCrunch. (2024). Will People Really Pay $200 a Month for OpenAI’s New Chatbot?. Retrieved from TechCrunch
  3. OpenRouter. (2023). OpenRouter: Multi-Model and Multi-Provider AI Service. Retrieved from OpenRouter