CursorAI "unlimited" plan rug pull: Cursor AI silently changed their "unlimited" Pro plan to severely rate-limited without notice, locking users out after 3-7 requests
-
Someone just got the AWS bill.
That's got to be it. Cloud compute is expensive when you're not being funded in Azure credits. Once the dust settles from the AI bubble bursting, most of the AI we'll see will probably be specialized agents running small models locally.
-
"Prolly"
I mean yeah? I wasn’t counting in detail, it’s an estimate.
Previously you got 500 requests a month and then it’d start charging you, even on “auto.” So the current charging scheme seems designed to encourage “auto” so they can route you to cheaper LLMs when they make sense (honestly a good thing).
-
I mean yeah? I wasn’t counting in detail, it’s an estimate.
Previously you got 500 requests a month and then it’d start charging you, even on “auto.” So the current charging scheme seems designed to encourage “auto” so they can route you to cheaper LLMs when they make sense (honestly a good thing).
I was questioning the use of the word "prolly"
-
I was questioning the use of the word "prolly"
it means "probably"
-
I was questioning the use of the word "prolly"
Nah, you should find a new bone to pick.
-
That's got to be it. Cloud compute is expensive when you're not being funded in Azure credits. Once the dust settles from the AI bubble bursting, most of the AI we'll see will probably be specialized agents running small models locally.
I'm still running Qwen32b-coder on a Mac mini. Works great, a little slow, but fine.
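For anyone who wants to poke at a setup like that, here's a rough sketch of hitting a locally served model from Python. It assumes the model is served by something like Ollama on its default port (11434) with a Qwen coder build pulled; the runner and model tag are just examples, not necessarily what's running on that Mac mini.

```python
import json
import urllib.request

# Assumed setup: a local Ollama server on its default port with a Qwen coder
# model pulled. Swap the model tag for whatever you actually have installed.
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "qwen2.5-coder:32b"  # example tag, not necessarily this exact build

payload = {
    "model": MODEL,
    "prompt": "Write a Python one-liner that reverses a string.",
    "stream": False,  # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Nothing leaves the machine here; the request only goes to localhost.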
-
This post did not contain any content.
Imagine the price hikes when they need to see a return on the hundreds of billions they've poured into these models, datacenters and electricity.
-
This post did not contain any content.
Ah they're learning from the "unlimited" mobile carriers.
"Unlimited" until you meet your limit, then throttled.
-
This post did not contain any content.
Common People
-
I'm still running Qwen32b-coder on a Mac mini. Works great, a little slow, but fine.
I'm somewhat tech savvy; how do I run an LLM locally? Any suggestions?
How do I know my local data is safe?
-
I'm somewhat tech savvy; how do I run an LLM locally? Any suggestions?
How do I know my local data is safe?
Check out LM Studio https://lmstudio.ai/ and you can pair it with the Continue extension for VS Code https://docs.continue.dev/getting-started/overview.
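To expand a bit: once LM Studio's local server is running, it speaks an OpenAI-compatible API (default http://localhost:1234/v1), so you can also script against it directly. Rough sketch below; the model name is just a placeholder, use whatever identifier LM Studio lists for the model you downloaded.

```python
from openai import OpenAI  # pip install openai

# LM Studio's local server is OpenAI-compatible; the API key can be any
# non-empty string since auth isn't enforced for the local server by default.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

resp = client.chat.completions.create(
    model="qwen2.5-coder-32b-instruct",  # placeholder: use the name LM Studio shows
    messages=[
        {"role": "user", "content": "Explain what a rug pull is in one sentence."},
    ],
)

print(resp.choices[0].message.content)
```

And since this only talks to localhost, your prompts and code stay on your machine unless you deliberately expose the server.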
-
Hopefully (?) this is the start of a trend and people begin to realize that all those products are not worth their price, and that AI is an overhyped mess made to hook users before exploiting them...
The whole industry is projecting something like negative $200B for the coming years. They know it's not worth the price.