
How Artificial Intelligence is not Energy-Intelligent


Introduction

Artificial Intelligence is changing much of how we live and work. AI systems have been built to open extraordinary possibilities and opportunities for businesses. Massive computing power, machine learning, and automation tools are just a few of the capabilities that make these systems world-changing across numerous industries, service sectors, and research institutions.

AI is finding applications and delivering on promises that were made long before the technology matured. Anthropologists are using AI systems to make discoveries that were previously missed or overlooked. Financial institutions are automating and standardizing reporting tasks. Businesses are creating more effective marketing campaigns while optimizing their supply chains. With all this exciting news, it almost sounds too good to be true.

But it is true. Although the potential of AI systems is vast and growing with every new application, it is not all sunshine and roses. Researchers are beginning to reveal that this remarkable invention may also pose a danger to the Earth, at a time when the world is finally paying attention to the environmental effects of its actions.

All technology depends on energy, and the world's primary source of energy is still fossil fuel. The waste gases emitted when this energy is produced, chiefly carbon dioxide (CO2), are detrimental to the environment and the climate. Managing energy consumption aims to reduce these harmful emissions while the world develops and integrates cleaner energy sources.

How Much Carbon Does AI Emit?

According to Spanish researcher Carlos Gomez-Rodriguez, most researchers had previously trivialized the ecological effects of AI energy consumption. A paper submitted by Schwartz and colleagues compares the carbon emissions of training a single AI model to the emissions five cars produce over their entire lifetimes.

Common Carbon Footprint Benchmarks

As reported by MIT Technology Review, standard benchmarks include the following (a rough comparison using these figures appears just after the list):

  • A round-trip trans-American flight for one passenger: about 1,984 pounds of CO2.
  • An average person's emissions over one year: about 11,023 pounds of CO2.
  • A US car over its lifetime, including fuel: about 126,000 pounds of CO2.
  • Training one AI model with neural architecture search (NAS): about 626,155 pounds of CO2.
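
To put the NAS figure in perspective, here is a rough back-of-the-envelope comparison against the other benchmarks. It is only a sketch using the rounded figures above, so treat the ratios as approximate:

    # Rough comparison of the NAS training footprint against the other
    # benchmarks listed above (all figures in pounds of CO2, rounded).
    BENCHMARKS_LBS = {
        "round-trip trans-American flight (1 passenger)": 1_984,
        "average person, one year": 11_023,
        "US car over its lifetime (incl. fuel)": 126_000,
    }
    NAS_MODEL_LBS = 626_155  # one model trained with neural architecture search

    for name, lbs in BENCHMARKS_LBS.items():
        print(f"NAS model ~= {NAS_MODEL_LBS / lbs:.0f} x {name}")
    # Roughly 316x the flight, 57x a person-year, and 5x a car lifetime,
    # which is where the "five cars" comparison comes from.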

The Carbon Footprint of Natural Language Processing

Researchers at the University of Massachusetts Amherst set out to measure how much energy is used, and what the baseline carbon footprint is, when developing AI systems. They focused on natural language processing (NLP), the branch of machine learning concerned with building computers that can work with human language. The researchers examined four NLP models: the Transformer, ELMo, BERT, and GPT-2. Each model was trained for up to 24 hours while its energy consumption was measured. The energy usage was then converted to pounds of carbon dioxide, based on the average US (and Amazon's AWS) energy mix.
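
The conversion step itself is simple arithmetic. The sketch below is a minimal illustration rather than the study's actual code: the emissions factor of roughly 0.954 pounds of CO2 per kWh is an assumed US-average grid value, and the sample energy figure is hypothetical.

    # Minimal sketch: convert measured training energy into pounds of CO2.
    # The emissions factor (~0.954 lbs CO2 per kWh) is an assumed US-average
    # grid value, not a number taken from this article.
    US_AVG_LBS_CO2_PER_KWH = 0.954

    def training_footprint_lbs(energy_kwh: float,
                               lbs_per_kwh: float = US_AVG_LBS_CO2_PER_KWH) -> float:
        """Estimated carbon footprint of a training run, in pounds of CO2."""
        return energy_kwh * lbs_per_kwh

    # Hypothetical example: a run that draws about 1,500 kWh lands near the
    # ~1,400-pound figure quoted for BERT below.
    print(round(training_footprint_lbs(1_500)))  # ~1431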

The research found that even though training a single model is the most basic task, the most costly model (BERT) had a carbon footprint of about 1,400 pounds of CO2, roughly the same as one person's round-trip trans-American flight. Additional tuning techniques increased this amount further.

The Privatization of AI Research

Major AI systems have been developed by private companies, often in partnership with governments. In the West, corporations such as Google, Amazon, IBM, and Microsoft are scaling up AI research. In the East, there are Hikvision (video surveillance), iFlytek (voice recognition), Yitu (machine vision), Megvii, and SenseTime (image recognition). Recently, Alibaba announced its enhanced AI chip.

Renewables for AI

Most of the energy consumption of AI systems comes during the development stage. One such process, neural architecture search (NAS), automates the design of the neural network itself. Training a single model takes about 84 hours, but the full NAS process consumes more than 270,000 hours of processing. Researchers are therefore looking not only into creating more efficient AI models but also into suitable renewable energy sources.
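
A quick bit of arithmetic using only the two hour figures above shows why the development stage dominates; this is a rough scale comparison, not an energy measurement:

    # Rough scale comparison between one training run and a full neural
    # architecture search, using the hour figures quoted above.
    SINGLE_TRAINING_HOURS = 84
    NAS_SEARCH_HOURS = 270_000  # "more than 270,000 hours"

    ratio = NAS_SEARCH_HOURS / SINGLE_TRAINING_HOURS
    print(f"NAS uses roughly {ratio:,.0f}x the compute of a single training run")
    # About 3,214x, which is why architecture search dominates the
    # development-stage footprint.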

Amazon has announced its commitment to investing in and integrating wind and solar farms for generating electricity. The project is still underway, but according to its website, more than 50 per cent of its operations were powered by renewable energy last year. Google has similar plans and projects in motion. For now, more private companies need to follow these examples.

Conclusion

The advantages of AI systems cannot be ignored or overlooked. Scientific advancement is already in motion, and it would be ill-advised to stop now. It would be better to find cleaner energy sources and better systems. A few proposals have already been made to reduce the adverse effects of AI energy consumption.

One such proposal is to increase awareness of the issue and push for the development of greener, more eco-friendly policies. Alongside awareness, researchers should also accurately report relevant parameters such as energy consumption, training time, and hardware used. Every report should follow widely accepted standards.
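
As a concrete illustration of what standardized reporting could look like, here is a minimal sketch of a machine-readable training report. The field names and example values are hypothetical and do not come from any existing standard:

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class TrainingReport:
        """Hypothetical record of the parameters researchers are urged to report."""
        model_name: str
        training_time_hours: float
        energy_kwh: float
        hardware: str
        estimated_co2_lbs: float

    # Example values are illustrative only.
    report = TrainingReport(
        model_name="example-nlp-model",
        training_time_hours=24.0,
        energy_kwh=1_500.0,
        hardware="8x GPU, cloud region us-east",
        estimated_co2_lbs=1_431.0,
    )
    print(json.dumps(asdict(report), indent=2))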

Academic institutions should also avoid being drawn into the AI race on their own. By banding together, they can pool resources to build higher-quality systems, and shared goals would be reached sooner when more institutions collaborate. Researchers are also encouraged to focus on creating efficient models that use less energy. Finally, government bodies should implement stricter policies to bring giant corporations and multinationals to heel.
