For discussion of Electric Universe and Plasma Cosmology. The ideas and opinions expressed on this forum do not necessarily reflect those of T-Bolts Group Inc or The Thunderbolts Project™
by BeAChooser » Sun Feb 18, 2024 7:11 pm
by BeAChooser » Tue Jan 23, 2024 5:50 pm
OpenAI's CEO Sam Altman has said that he believes an energy 'breakthrough' is necessary to advance AI models, reports Reuters. During a panel discussion with Bloomberg at the World Economic Forum in Davos, Switzerland, Altman said low-carbon energy sources, including nuclear fusion, are needed to meet the unexpected energy demands of AI. "There's no way to get there without a breakthrough," he said. "It motivates us to go invest more in fusion."

… snip …

The generative AI boom has seen huge investment in the compute that will enable it. If the trend continues, the energy requirements will be monumental. According to a peer-reviewed analysis published in Joule in October 2023, current trends are set to see Nvidia [which sells 95 per cent of the GPUs used for AI] shipping around 1.5 million AI server units per year by 2027. Those servers would, if running at full capacity, consume at least 85.4 terawatt-hours of electricity annually.
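A quick sanity check on the figures quoted above. This sketch assumes, as the article's "full capacity" wording suggests, that all 1.5 million servers run at full load for every hour of the year; the implied per-server draw is not stated in the article and is derived here:

```python
# Sanity check on the Joule projection quoted above.
# Assumption (not in the article): all servers run at full load 24/7.
servers = 1_500_000        # projected annual Nvidia AI server shipments by 2027
annual_energy_twh = 85.4   # projected annual consumption, TWh
hours_per_year = 8_760

total_wh = annual_energy_twh * 1e12  # TWh -> Wh
per_server_watts = total_wh / (servers * hours_per_year)
print(f"Implied draw per server: {per_server_watts:.0f} W")  # ≈ 6.5 kW
```

Roughly 6.5 kW per server is in the range of a multi-GPU AI server, so the quoted 85.4 TWh figure is at least internally consistent.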
Generative AI’s Energy Problem Today Is Foundational
Before AI can take over, it will need to find a new approach to energy
by BeAChooser » Thu Nov 30, 2023 4:25 am
AI’s Dirty Secret: The Shocking Power Consumption Problem

… snip …

AI comes at a significant cost. High-performance chipsets are essential for performing complex calculations, and they require considerable energy. Training AI models on hundreds of thousands of data points also consumes energy. Using AI at scale therefore requires a substantial amount of power, and the power consumed by AI could rival that of a country. According to Schneider Electric, a global energy solutions company, the power consumed to run AI workloads in countries excluding China reaches 4.3 gigawatts (GW), and they warned that this could reach 20 GW by 2028. Dr. Alex de Vries of Vrije Universiteit Amsterdam claimed in a report published in the scientific journal Joule that a single use of AI consumes as much energy as leaving an LED light bulb on for an hour. Data centers handling AI operations also consume a significant amount of power: the well-known AI service ChatGPT receives an average of about 200 million requests daily.

… snip …

As more fields adopt AI, consumption will undoubtedly increase steeply. Dr. de Vries estimated that by around 2027, AI data centers worldwide would consume around 100 terawatt-hours (TWh) of electricity annually, equivalent to the annual power consumption of Sweden, the Netherlands, or Argentina.

… snip …

As power consumption increases, the burden on companies providing AI services also grows. On October 9 (local time), the Wall Street Journal (WSJ) reported that Microsoft (MS) is losing about $20 per user per month on its AI service GitHub Copilot, which launched last year, and that heavy users of the AI features are costing Microsoft around $80 per month.

… snip …

In May this year, Microsoft signed a contract to purchase power through 2028 from Helion Energy, a company working to generate electricity through nuclear fusion.
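The per-query and total figures quoted above can be cross-checked against each other. This sketch assumes a typical LED bulb draws about 3 W (an assumption not in the article, but a common rating), so one request would be roughly 3 Wh:

```python
# Cross-check of the quoted figures. Assumption (not in the article):
# a typical LED bulb draws about 3 W, so one request ≈ 3 Wh of energy.
led_watts = 3                    # assumed bulb power (hypothetical)
requests_per_day = 200_000_000   # quoted average daily ChatGPT requests

wh_per_request = led_watts * 1   # bulb left on for one hour
daily_mwh = requests_per_day * wh_per_request / 1e6
annual_gwh = daily_mwh * 365 / 1e3
print(f"Daily: {daily_mwh:.0f} MWh, annual: {annual_gwh:.0f} GWh")
# prints "Daily: 600 MWh, annual: 219 GWh"
```

Under that assumption, ChatGPT's queries alone would come to roughly 0.2 TWh per year, a small fraction of the 100 TWh projected for all AI data centers by 2027, which is consistent with inference queries being only one part of the total load.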
NotebookCheck, the PC media outlet that reported the news, stated that Microsoft plans to use small modular reactors (SMRs) to meet the power demands of its AI data centers. As part of this plan, Microsoft has posted job advertisements for nuclear technology program managers.
As more fields apply AI, energy consumption is rising sharply, prompting calls to avoid using AI technology indiscriminately: do not force AI into areas where it is unnecessary, and do not overuse it. For example, for users who simply want to search for keywords on a web search service, an AI-generated summary of the results is a feature that wastes energy unnecessarily. Experts have likewise urged people to cut back on generating texts or images with generative AI unless necessary. Some users, fascinated by the novelty of image-generation AI, repeatedly create images they will never use, and considerable energy is wasted each time they press the create button. Eliminating such cases alone would go a long way toward reducing energy consumption.
by allynh » Fri Feb 24, 2023 12:37 am
by BeAChooser » Tue Feb 21, 2023 11:16 pm
allynh wrote: ↑Mon Feb 20, 2023 4:45 am Stumbled across this video that puts things in context.
by allynh » Mon Feb 20, 2023 4:45 am
by allynh » Fri Feb 17, 2023 11:48 pm
by BeAChooser » Wed Feb 15, 2023 5:00 pm
allynh wrote: ↑Wed Feb 15, 2023 1:03 am It's too late.
by allynh » Wed Feb 15, 2023 1:03 am
by BeAChooser » Tue Feb 14, 2023 11:46 pm
Roshi wrote: ↑Tue Feb 14, 2023 8:53 pm AI does not decide anything.
by Roshi » Tue Feb 14, 2023 8:53 pm
BeAChooser wrote: ↑Tue Feb 14, 2023 6:52 pm What if the AI decides that what you are posting is disinformation and bans you from posting? It's judged disinformation not because the AI, unfettered, would deem it so, but because the developers of the AI don't like what you believe and put constraints on the AI to stop the spread of such *disinformation*.
by BeAChooser » Tue Feb 14, 2023 6:52 pm
Roshi wrote: ↑Tue Feb 14, 2023 8:25 am I'm not worried about AI deciding anything.
Roshi wrote: ↑Tue Feb 14, 2023 8:25 amThose in power will use it to strengthen their power
by Roshi » Tue Feb 14, 2023 8:25 am
BeAChooser wrote: ↑Mon Feb 13, 2023 8:53 pm That may OR MAY NOT be true, but regardless, you've missed my point. These rudimentary AIs are already affecting decisions. If the AIs are biased because the programmers are biased, then those biases will affect decisions. And we might accept those decisions unaware of the biases. That could seriously impact humanity on many levels ... not the least of which is spending on science projects.
by BeAChooser » Mon Feb 13, 2023 8:53 pm
Roshi wrote: ↑Mon Feb 13, 2023 8:33 pm AI is a program, written by humans. It's a complex thing, and it "learns". It should not be mistaken for a real intelligent life form, because it's not. It's a complex machine, set up to do complex stuff that is sometimes unexpected, that's all. Look: there are cases in history where humans have put themselves in danger to save other humans. Or even for lesser things. Things that a program written to compare options and then decide on a good solution would never consider logical. This is the difference between talking to an AI that can mimic us and talking to an intelligent life form. The life form is alive, driven by motives beyond what can be programmed or imitated using millions of cases to learn from. Yes, it can learn to be like us, to talk like us. But what is the underlying reason for what it does? The underlying reason is the initial conditions written by the programmers, based on what they think an intelligent life form should do. Like "go this way, learn and get better at this". But it can't come up with its own initial conditions or reasons to be. There is that expression "how do you sleep at night". Because at night there is silence, and we can hear ourselves. The rational mind is stopped from drowning out the inner voice. Can AI have such problems? No. Well, this means it's not alive; it's a complex machine.
by Roshi » Mon Feb 13, 2023 8:33 pm