Ouch! Said the AI ...

Has science taken a wrong turn? If so, what corrections are needed? Chronicles of scientific misbehavior. The role of heretic-pioneers and forbidden questions in the sciences. Is peer review working? The perverse "consensus of leading scientists." Good public relations versus good science.
BeAChooser
Posts: 1052
Joined: Thu Oct 15, 2015 2:24 am

Re: The Danger of ChatGPT

Unread post by BeAChooser » Wed Feb 15, 2023 5:00 pm

allynh wrote: Wed Feb 15, 2023 1:03 am It's too late.
Probably true. I'm likely blowing into the wind, like on most topics these days. :(

allynh
Posts: 1115
Joined: Sat Aug 23, 2008 12:51 am

Re: The Danger of ChatGPT

Unread post by allynh » Fri Feb 17, 2023 11:48 pm

This is a great example of how this is going wrong.

More news outlets get caught up in nasty conversations with Bing chatbot over facts
https://www.geekwire.com/2023/nasty-con ... bing-chat/

Read all of the links in the article for what I mean.

Yikes! I'm starting to sound like "Sydney".

allynh
Posts: 1115
Joined: Sat Aug 23, 2008 12:51 am

Re: The Danger of ChatGPT

Unread post by allynh » Mon Feb 20, 2023 4:45 am

Stumbled across this video that puts things in context.

I tried using AI. It scared me.
https://www.youtube.com/watch?v=jPhJbKBuNnA

Yikes!

BeAChooser
Posts: 1052
Joined: Thu Oct 15, 2015 2:24 am

Re: Ouch! Said the AI ...

Unread post by BeAChooser » Tue Feb 21, 2023 11:16 pm

allynh wrote: Mon Feb 20, 2023 4:45 am Stumbled across this video that puts things in context.
Thanks, that was interesting.

Speaking of which …

https://www.pcmag.com/news/everyone-is- ... en-chatgpt “The self-published section of Amazon's Kindle store is filling up with AI-written books”

https://www.cnet.com/culture/internet/c ... -happened/ “ChatGPT Rewrote My Dating Profile”

https://www.thestreet.com/technology/th ... s-shooting “This University Used ChatGPT to Tell Students About a Mass Shooting”

https://abcnews.go.com/US/vanderbilt-un ... d=97365993 “Vanderbilt University apologizes after using ChatGPT to console students”

https://www.seattletimes.com/business/t ... -you-tell/ “ChatGPT wrote cover letters for these job seekers.”

https://www.msn.com/en-us/news/technolo ... r-AA17uJ9t “Cornell AI tool designed to prevent online conversations from escalating into 'incendiary language’”

allynh
Posts: 1115
Joined: Sat Aug 23, 2008 12:51 am

Re: Ouch! Said the AI ...

Unread post by allynh » Fri Feb 24, 2023 12:37 am

Yikes!

Thanks for the links.

It's terrifying watching people play with sweating sticks of dynamite, not realizing they can blow up in their faces.

I'm turning all this stuff into Story to help me understand.

BeAChooser
Posts: 1052
Joined: Thu Oct 15, 2015 2:24 am

Re: Ouch! Said the AI ...

Unread post by BeAChooser » Thu Nov 30, 2023 4:25 am

https://viewusglobal.com/report/article/50753/
AI’s Dirty Secret: The Shocking Power Consumption Problem … snip … AI comes at a significant cost. High-performance chipsets are essential for performing complex calculations, requiring considerable energy. Training AI models on hundreds of thousands of data points also consumes energy. In short, the convenience of AI comes with a substantial power requirement.

The power consumed by AI could rival that of a country.

According to Schneider Electric, a global energy solutions company, the power consumed to run AI workloads in countries excluding China has reached 4.3 gigawatts (GW), and it warned that this could reach 20 GW by 2028. Dr. Alex de Vries of Vrije Universiteit Amsterdam claimed in a report published in the scientific journal ‘Joule’ that a single AI query consumes roughly as much electricity as leaving an LED light bulb on for an hour.

Data centers handling AI operations also consume a significant amount of power. The famous AI service ChatGPT receives an average of about 200 million AI operation requests daily. … snip … As more fields introduce AI, consumption will undoubtedly increase steeply. Dr. de Vries estimated that by around 2027, AI data centers worldwide would consume around 100 terawatt-hours (TWh) of power annually, equivalent to the annual power consumption of Sweden, the Netherlands, or Argentina.

… snip …

As power consumption increases, the burden on companies providing AI services also grows. On October 9 (local time), the Wall Street Journal (WSJ) reported that Microsoft (MS) is losing about $20 per user per month on its AI service GitHub Copilot, which launched last year, and that the heaviest users of the AI features cost Microsoft as much as $80 per month.

… snip …

In May this year, Microsoft signed a contract to purchase power from Helion Energy, a company that generates nuclear energy through nuclear fusion, until 2028. NotebookCheck, an overseas PC media outlet that reported the news, stated that Microsoft plans to use its small modular reactors (SMRs) to meet the power consumed in AI data centers. As part of this plan, it has posted job advertisements for nuclear technology program managers.
Someone needs to tell whoever wrote this article that Helion isn’t currently generating ANY electricity by fusion and likely won’t be by 2028, or for long after that. So what exactly is Microsoft buying with this hyped contract? Or have they bought a pack of lies?
As more and more fields apply AI, energy consumption is rising steeply. This has led to calls not to use AI technology indiscriminately: the argument is to avoid forcing AI into areas where it is unnecessary, and to avoid overusing it where it adds little.

For example, for users who simply want to look up a keyword on a web search service, an AI-generated summary of the results is a feature that wastes energy unnecessarily. Experts have likewise urged people to cut back on generating text or images with AI unless they actually need the output. Some users, fascinated by the novelty of image-generation AI, repeatedly create images they will never use, and considerable energy is wasted each time they press the create button. Eliminating such cases alone would go a long way toward reducing energy consumption.
In other words, AI is going to be only for the elites … not you or me. And between AI and the electric cars and electric houses they insist we use, the elites are REALLY, REALLY going to need those fusion reactors they keep touting. Too bad those probably won’t be online till 2050 … if at all. ;)
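
For anyone who wants a feel for the scale of the figures quoted above, here is a rough back-of-envelope sketch. The per-query energy is an assumption on my part (about 3 Wh, in line with the LED-bulb-for-an-hour comparison); the 200 million daily requests is the number quoted above.

# Back-of-envelope check (the per-query figure is assumed, not from the article):
#   energy per AI query ~ 3 Wh (roughly an LED bulb left on for an hour)
#   queries per day ~ 200 million (the ChatGPT figure quoted above)

wh_per_query = 3.0            # watt-hours per query (assumed)
queries_per_day = 200e6       # daily requests (quoted above)

daily_gwh = wh_per_query * queries_per_day / 1e9      # Wh -> GWh
yearly_twh = daily_gwh * 365 / 1e3                    # GWh -> TWh
print(f"Queries alone: ~{daily_gwh:.1f} GWh/day, ~{yearly_twh:.2f} TWh/year")
# -> Queries alone: ~0.6 GWh/day, ~0.22 TWh/year

So the per-query energy is small change; the ~100 TWh-a-year projection for 2027 comes from the hardware itself, millions of servers running around the clock, not from any single user pressing the create button.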

BeAChooser
Posts: 1052
Joined: Thu Oct 15, 2015 2:24 am

Re: Ouch! Said the AI ...

Unread post by BeAChooser » Tue Jan 23, 2024 5:50 pm

So one bad idea (AI) promotes another (FUSION) ….

https://www.datacenterdynamics.com/en/n ... am-altman/
OpenAI's CEO Sam Altman has said that he believes an energy 'breakthrough' is necessary to advance AI models, reports Reuters.

Altman said low-carbon energy sources including nuclear fusion are needed for the unexpected energy demands of AI, during a panel discussion with Bloomberg at the World Economic Forum in Davos, Switzerland.

"There's no way to get there without a breakthrough," he said. "It motivates us to go invest more in fusion."

… snip …

The generative AI boom has seen a huge investment in the compute that will enable it. If the trend continues, the energy requirements will be monumental. According to a peer-reviewed analysis published in Joule in October 2023, current trends are set to see Nvidia [which sells 95 per cent of the GPUs used for AI] shipping around 1.5 million AI server units per year by 2027. Those servers would, if running at full capacity, consume at least 85.4 terawatt-hours of electricity annually.
Now just to put that in perspective, the world currently uses over 25,000 terawatt-hours of electricity annually. So it sounds like EACH YEAR they plan to add enough AI servers to consume about 0.3% of total electricity consumption. And this article (https://www.scientificamerican.com/arti ... ectricity/) says that “around the globe, data centers currently account for about 1 to 1.5 percent of global electricity use.” Seems to me, folks, we could soon all be working to feed electricity to these *intelligent* machines. Slaves to the machines.
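
A quick back-of-envelope for anyone who wants to check that 0.3% figure. The per-server power draw is my assumption (roughly what a DGX-class AI server pulls at full load); the shipment count and the world total are the numbers quoted above.

servers_per_year = 1.5e6      # Nvidia AI server shipments per year by 2027 (quoted above)
kw_per_server = 6.5           # kW per server at full load (assumed, DGX-class)
hours_per_year = 8760

twh = servers_per_year * kw_per_server * hours_per_year / 1e9   # kWh -> TWh
world_twh = 25_000            # rough world electricity consumption (quoted above)
print(f"~{twh:.1f} TWh/year, about {100 * twh / world_twh:.2f}% of world electricity use")
# -> ~85.4 TWh/year, about 0.34% of world electricity use, per year's worth of shipments

With that assumed per-server draw the numbers line up with the 85.4 TWh figure from the Joule analysis and with the 0.3% estimate above.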

Some articles on this are even honest about that …

https://spectrum.ieee.org/ai-energy-consumption
Generative AI’s Energy Problem Today Is Foundational: Before AI can take over, it will need to find a new approach to energy.

BeAChooser
Posts: 1052
Joined: Thu Oct 15, 2015 2:24 am

Expert Opinions On The Threat Of AI

Unread post by BeAChooser » Sun Feb 18, 2024 7:11 pm

Here’s an interesting paper on the deployment of AI over the next 2 decades.

https://ieeexplore.ieee.org/document/10380243

Perhaps most alarming is that all 12 *experts* they interviewed predict that competition between nation-states will lead to irresponsible deployment in the 2040s timeframe. The majority expect it to lead to large numbers of deaths. Most predict several “smaller” incidents of around 10,000 deaths each, but two foresee an incident killing a million people.

They also point out that AIs will make it difficult for people to distinguish between truth and fiction … as if we don’t have enough problems with that already. In fact, some of the experts foresee a “battle” between AIs, with some trying to fake things and others trying to detect those fakes. They are particularly concerned about the proliferation of conspiracy theories … you know, like my charge that those promoting mainstream astrophysics, climate alarmism, and the urgent expenditure of hundreds of billions of dollars on unproven fusion technology are doing it for their own monetary benefit.

Now, what the authors and the experts they interviewed consider a conspiracy theory is not indicated, other than a worry about such theories proliferating and being used by AIs to manipulate "marginalized sectors of society". I would note that the charge of “conspiracy theory” has already been used by the powers in the mainstream to discredit various claims by marginalized sectors that turned out to be entirely true.

The authors warn that AIs could lead to “subversion of democratic decision making” and “poor individual health decisions in the COVID pandemic.” But it should be noted that both sides of the political spectrum already accuse each other of subverting democratic decision making, and many of us, for good reason, didn’t believe much of what the establishment was telling us about COVID and the vaccines (and it turns out we were right). In other words, it seems to me the AIs aren’t the real problem here. It’s the people who will likely program and control the AIs in the future. They are the ones we should worry about.
