According to numbers floating around online, that would mean one Llama query is roughly as expensive as ten Google searches. And it’s likely that those costs will increase further.
It still seems like the biggest factor here is the scale of adoption. Unfortunately, the total energy cost of AI might even scale exponentially, since the more complex the queries get, the better the responses will likely be, and that will further drive adoption.
This pace is so clearly unsustainable it’s horrifying. It was obvious to some degree, but it seems it’s even worse than I thought.
Oh damn. Very good article btw.