I wonder what will happen with all the compute once the AI bubble bursts.
It seems like gaming scaled GPU manufacturing enough for GPUs to become viable as general-purpose compute. Bitcoin then pumped billions into that market, driving down prices per FLOP, and AI reaped the benefit once crypto mining moved to ASICs and later crashed.
But what’s next? We’ve got more compute than we could reasonably use. The factories are already there, the knowledge and techniques exist.
Compute becomes cheaper and larger undertakings happen. LLMs are huge, but there is new tech moving things along. The key component of LLMs, the transformer, is getting new competition that may surpass it, both for LLMs and for other machine learning uses.
Otherwise, cheaper GPUs for us gamers would be great.
Most of the GPUs belong to the big tech companies, like OpenAI, Google, and Amazon. AI startups rarely buy their own GPUs (often they're just using the OpenAI API). I don't think big tech will have any problem figuring out what to do with all that GPU compute.
Finally, very detailed climate simulations to know how hard we're screwed.
…made using arguably the most environmentally disastrous tech we've invented in the past few decades. How ironic!
I think open source will build actually useful integrations thanks to the available compute.
It will probably be used for more AI research.
I'll buy a couple of top-tier GPUs from a failed startup on eBay to run my own AI at home.