“Millions of people have changed the way they search, they write and they create images… Generative AI has created enormous upheaval in the tech world,” said Kate Crawford at the ABC’s Futurecast Conference today.
Professor Kate Crawford, a leading Australian scholar of artificial intelligence and its impacts, now based in the United States, pointed out that Generative AI has developed “planet-scale infrastructure that uses enormous amounts of water and more energy than all of Japan.”
“AI is neither artificial nor intelligent. It is propped up by millions of underpaid crowd workers who are making this system work. It’s having enormous impact on the planet,” she said.
A Generative AI search is 15 to 20 times more energy intensive than a normal search, according to Crawford, who has researched AI systems.
Discussing how AI bots have scraped internet-hosted photo services to develop new images within their generative photo tools, Crawford pointed out that AI has not yet developed discriminative tools good enough to distinguish between different animals, so it simply makes up hybrid animal pictures. AI is “hallucinating” when it does that, because AI tools use statistical probability to place one word after another, or one pixel after another in a picture.
One of the things we don’t talk about enough when thinking about the future effect of AI, is who controls the world’s AI systems.
“There are three companies in the West and one in China that control most of the AI infrastructure. Four companies that own the backbone that AI depends on. They have an extraordinary amount of power over the billions of people who use the tools. They own the pipelines of distribution,” said Crawford, who wonders whether the small number of people who control these companies act responsibly.
“Elon Musk owns X and Grok. They have an incentive to allow misinformation to proliferate because it brings viewers and dollars. Elon Musk fired all the fact checkers at X… we are in uncharted waters.”
Asked by moderator Gavin Fang if she is concerned about the implications for trust and society, Crawford said “yes.”
“I think regulators have a very powerful role to play. In the EU the AI Act was passed and is looking very promising, but technology companies were active in watering it down in the months before it was enacted. Now that those rules are being implemented, we will see how it plays out.”
She believes this is a conversation we need to have “with urgency” in Australia. “How will we make sure we protect what is important to us in terms of society and media? Tools built by tech companies are in the DNA of your organisation; what happens to the content you produce?”
The fact that AI systems and the companies behind them are “planetary” means that it is difficult for nation states to regulate them, a fact that is being played out in Australia and elsewhere in the world at this time. The idea of states building their own AI systems, known as Sovereign AI, is gaining traction in countries that have the human and technological capacity to do so and the money to fund it.
Fang asked whether media should be intertwining itself with these companies. While Crawford, and other speakers in subsequent panels, agreed that Generative AI is exciting and has potentially great benefits for media, there will need to be transparency and guidelines as media companies adopt AI.
“We’ve seen some problems when media companies didn’t declare it… and the other question is, ‘do media companies want to be led by AI?’
“We’ve been through this before, Facebook has been leading us … organisations found that they needed to jump to the call of social media when platforms changed their algorithms, prioritised things like video, or deprioritised other elements that companies had been relying on. We’ve learnt that this can be dangerous, so we need to be more proactive, to come out with strong positions about what media companies will and won’t do with AI and also to be conscious of whether what we are doing is fair to the original creators.”
Crawford thinks it is “too early” to know if any media companies are getting it right yet. She says there is a wave of excitement for the new technology, which is understandable, but that media companies need to pause to think about where the technology is really serving them best and where it is exploiting their content and their journalism.
“AI is not going away, so we need a grounded perspective on how we engage with it. To help navigate through this time, remain sceptical, look at how it is done, and what it may be costing us… regulate, make some hard decisions and make them in a timely way.”
In another session of the conference, the ABC’s Chief Content Officer Chris Oliver-Taylor spoke about how he looks at innovation within the ABC. “The amount of content the ABC produces is immense, so I look at innovation in a specific way. Can it increase value and serve the Australian people… I also consider our public broadcaster responsibilities and ethics. If the ABC loses trust it loses relevance, so we need cautious innovation,” he said.
“Third party platforms are a complex conversation for the ABC. We can’t ignore the opportunities of third party platforms while we also support ABC owned and operated environments. Third party platforms allow us to connect to audiences who don’t normally engage with the ABC every week.”
In preparing this report we used an AI tool, Perplexity, to fact check Kate Crawford’s claims about energy usage by generative AI systems and to see how well Generative AI would handle questions about itself. The first answer said:
AI-based searches consume significantly more energy compared to traditional search methods. For instance, a single query using ChatGPT, a popular AI model, consumes nearly ten times the power required for a traditional Google search. This increased energy consumption is due to the complex computations and vast amounts of data that AI models process, as well as the high-performance hardware required to run these models continuously. Traditional search engines primarily rely on keyword matching and indexing, which are less computationally intensive. In contrast, AI models, especially those used in generative AI, involve deep learning algorithms that require extensive training on large datasets and continuous inference operations, leading to higher energy demands. As a result, data centers housing AI servers are projected to consume about 1.5% of global energy usage, with expectations to double by 2026.
Noticing that Perplexity did not dob itself in over its own energy usage, we refined the prompt: Include perplexity’s watt-hours (Wh) of energy in this article near the ChatGPT mention. Perplexity answered:
Each interaction with ChatGPT can consume up to 2.9 watt-hours (Wh) of energy, which is nearly ten times the energy expenditure of a standard Google search at 0.3 Wh. While specific data for Perplexity’s energy consumption is not provided in the search results, it’s reasonable to assume that it falls within a similar range as other AI-powered search tools.
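As a quick sanity check on the “nearly ten times” claim, the two figures Perplexity quoted can be compared directly. This is a minimal sketch using only the Wh values reported in the answer above, which we have not independently verified:

```python
# Energy figures as quoted by Perplexity (not independently verified)
chatgpt_wh = 2.9  # Wh per ChatGPT query, per the quoted answer
google_wh = 0.3   # Wh per traditional Google search, per the quoted answer

ratio = chatgpt_wh / google_wh
print(f"A ChatGPT query uses roughly {ratio:.1f}x the energy of a Google search")
```

The quoted figures work out to a ratio of a little under ten, which sits below Crawford’s 15 to 20 times estimate for generative AI searches, a reminder that published per-query energy numbers vary widely depending on who is measuring and what they count.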