Why not replace the CEO with an LLM? Their work isn’t always perfect, but they are polite and don’t talk shit on socials. They’re cheaper than a human CEO too, aside from being thirsty lil devils.
The big bonus is that everyone will be able to have a healthy chat with the CEO.
- Hey CEO, what will be my raise this year?
- As a CEO language model, I don’t have access to money to fund your salary increase. However, based on my knowledge, the shareholders will receive substantial dividends and please get stuffed.
Fwiw an LLM uses as much power as 10 regular Google searches… So it’s almost nothing in the grand scope of things. It might even save some for the people who don’t know how to utilize search engines properly.
We also need more data centers, not fewer. And they use almost no water compared to other utilities.
I’m not sure that’s even a valid comparison? I’d love to know where you got that data point.
LLMs run until they decide to output an end-of-text token. So the amount of power used will vary massively depending on the prompt.
Search results, on the other hand, return nearly instantaneously and can cache huge amounts of data between requests, unlike LLMs, which have to run inference for every request individually.
I’d estimate responding to a typical ChatGPT query uses at least 100x the power of a single Google search, based on my knowledge of databases and running LLMs at home.
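The disagreement above really comes down to the per-query figures you assume. A quick back-of-envelope sketch (all numbers here are illustrative assumptions for comparing the two claims, not measurements):

```python
# Compare the "10x" and "100x" claims against an assumed per-search figure.
# 0.3 Wh per Google search is an often-cited (and dated) figure; the LLM
# numbers are picked purely to match each side's multiplier, not measured.
GOOGLE_SEARCH_WH = 0.3  # assumed energy per search, in watt-hours

llm_estimates_wh = {
    "low (the '10 searches' claim)": 3.0,
    "high (the 'at least 100x' estimate)": 30.0,
}

for label, wh in llm_estimates_wh.items():
    ratio = wh / GOOGLE_SEARCH_WH
    print(f"{label}: {wh} Wh per query, about {ratio:.0f} Google searches")
```

Either way, the ratio is pure division; the argument is over which watt-hour figures are realistic, and those vary with model size, prompt length, and how much of the response gets generated before the end-of-text token.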
The talking shit on socials is a feature not a bug