+++ Experience a live debate between the first AI system to debate humans on complex topics, Project Debater, and a top human debater +++ How Google, Microsoft, and Big Tech Are Automating the Climate Crisis +++ How to train artificial intelligence that won’t destroy the environment +++ Deep learning has a size problem +++ Quantifying the Carbon Emissions of Machine Learning +++
Breakthrough — Or So They Say
Experience a live debate between the first AI system to debate humans on complex topics, Project Debater, and a top human debater. The IBM Research blog reports on a live, public debate between an AI system and a human debate champion at Think 2019 in San Francisco. During a twenty-minute debate on a complex topic, IBM’s AI system Project Debater digested massive amounts of text, constructed a well-structured speech, delivered it with clarity and purpose, and rebutted its opponent. Both the champion debater and Project Debater began by preparing arguments for and against the resolution, “We should subsidize preschool.” Each side then delivered a four-minute opening statement, a four-minute rebuttal, and a two-minute summary. Both had only 15 minutes to prepare, so neither had the chance to train on the topic in advance. In other words, an AI system engaged with an expert human debater, listened to his argument, and responded convincingly with its own unscripted reasoning to persuade the audience to consider its position on a controversial topic.
Business News and Applications
How Google, Microsoft, and Big Tech Are Automating the Climate Crisis. An article on Gizmodo, published back in February, collects facts about the engagement of tech giants with the oil and gas industry. Rockwell Automation and Schlumberger Limited, the world’s largest oilfield services firm and a competitor to Halliburton, announced a joint venture called Sensia. The new company will “sell equipment and services to advance digital technology and automation in the oilfield”. Amazon, Google, and Microsoft, too, have all struck lucrative arrangements—collectively worth billions of dollars—to provide automation, cloud, and AI services to some of the world’s biggest oil companies, and they are actively pursuing more. Last year, Google quietly started an oil, gas, and energy division. It hired Darryl Willis, a 25-year veteran of BP, to head up what the Wall Street Journal described as “part of a new group Google has created to court the oil and gas industry.” Google Cloud and the French oil giant Total signed an agreement to jointly develop artificial intelligence solutions for subsurface data analysis in oil and gas exploration and production. Google also struck major deals with the oilfield services company Baker Hughes and the aforementioned Schlumberger, and even entered into talks to build a tech hub and data centers for Aramco, Saudi Arabia’s incomprehensibly massive oil company. Similarly, Microsoft is two years into a seven-year deal—rumored to be worth over a billion dollars—to help Chevron, one of the world’s largest oil companies, better extract and distribute oil. Microsoft Azure has sold machine vision software to Shell and is powering its all-out “machine learning push.” It has helped BP build an AI tool to determine how much oil in a given reserve is recoverable. Microsoft’s data services are helping XTO, a subsidiary of Exxon, “to gain new insights into well operations and future drilling possibilities.” And Microsoft Azure has also partnered with Equinor (formerly Statoil, Norway’s state-owned oil company) to provide data services in a deal worth hundreds of millions of dollars. Finally, Amazon, besides likewise partnering with many companies in the oil and gas industry, is now itself in the oil business, releasing its own Amazon Basics Synthetic Motor Oil.
Publications
How to train artificial intelligence that won’t destroy the environment. The carbon footprint of machine learning is bigger than you think. An article by The Outline features recent publications on the environmental consequences of deep learning. A study (arXiv:1906.02243) by researchers at the University of Massachusetts Amherst found that training just one AI model can produce nearly five times as much carbon dioxide equivalent as an average American car emits over its lifetime. In a similar vein, a preprint (arXiv:1907.10597) by researchers from the Allen Institute for AI, Carnegie Mellon University and the University of Washington, drawing partly on the earlier OpenAI blog post “AI and Compute”, reports that the amount of compute used in the largest AI training runs has been increasing exponentially with a 3.4-month doubling time (by comparison, Moore’s Law had a 2-year doubling period). Since 2012, this metric has grown by more than 300,000x (a 2-year doubling period would have yielded only a 7x increase). Potential remedies are discussed, e.g. sharing pre-trained models and transparent benefit-to-cost assessments of potential model runs (see the articles below).
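To make those growth rates concrete, here is a back-of-the-envelope Python sketch of the doubling arithmetic. The roughly 62-month window (AlexNet in 2012 to the top of the measured trend) is our own assumed endpoint choice, not a figure taken from the cited texts:

    # Back-of-the-envelope check on the compute-growth figures quoted above.
    MONTHS = 62  # assumed window: AlexNet (2012) to the top of the trend

    for label, doubling_time in [("3.4-month doubling", 3.4),
                                 ("2-year doubling (Moore's Law)", 24.0)]:
        growth = 2 ** (MONTHS / doubling_time)
        print(f"{label}: ~{growth:,.0f}x over {MONTHS} months")

Under these assumptions the fast doubling comes out at roughly 300,000x, while the Moore’s Law pace yields only a single-digit factor, matching the orders of magnitude quoted above.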
Deep learning has a size problem. A related, very good post on model efficiency: it goes deep in certain areas but remains extremely accessible throughout. In the long term, the resource hunger of modern deep learning models is going to cause a few problems. Quote: “First, it hinders democratization. If we believe in a world where millions of engineers are going to use deep learning to make every application and device better, we won’t get there with massive models that take large amounts of time and money to train. Second, it restricts scale. There are probably less than 100 million processors in every public and private cloud in the world. But there are already 3 billion mobile phones, 12 billion IoT devices, and 150 billion micro-controllers out there. In the long term, it’s these small, low power devices that will consume the most deep learning, and massive models simply won’t be an option.”
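One of the standard techniques for shrinking models to fit such devices is post-training quantization. The following is a minimal PyTorch sketch of our own with a toy model, not code from the post: dynamic quantization stores the Linear-layer weights as int8 and cuts the serialized size roughly fourfold.

    import os
    import torch
    import torch.nn as nn

    # Toy stand-in for a large trained network.
    model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10))

    # Post-training dynamic quantization: Linear weights are stored as int8;
    # activations are quantized on the fly at inference time.
    quantized = torch.quantization.quantize_dynamic(
        model, {nn.Linear}, dtype=torch.qint8
    )

    def size_mb(m, path="tmp.pt"):
        """Serialized size of a model's weights, in megabytes."""
        torch.save(m.state_dict(), path)
        size = os.path.getsize(path) / 1e6
        os.remove(path)
        return size

    print(f"fp32 model: {size_mb(model):.2f} MB")      # ~1.1 MB
    print(f"int8 model: {size_mb(quantized):.2f} MB")  # roughly 4x smaller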
Quantifying the Carbon Emissions of Machine Learning. Researchers from Element AI, Université de Montréal and Polytechnique Montréal just published a preprint (arXiv:1910.09700) on their Machine Learning Emissions Calculator. Quote: “From an environmental standpoint, there are a few crucial aspects of training a neural network that have a major impact on the quantity of carbon that it emits. These factors include: the location of the server used for training and the energy grid that it uses, the length of the training procedure, and even the make and model of hardware on which the training takes place. In order to approximate these emissions, we present our Machine Learning Emissions Calculator, a tool for our community to better understand the environmental impact of training ML models. We accompany this tool with an explanation of the factors cited above, as well as concrete actions that individual practitioners and organizations can take to mitigate their carbon emissions.”
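To first order, the arithmetic behind such a calculator reduces to the energy drawn by the hardware times the carbon intensity of the grid feeding the server. A minimal sketch, in which every input is a hypothetical placeholder rather than a number from the paper:

    # First-order emissions estimate, mirroring the factors in the quote.
    # All values are hypothetical placeholders, not taken from the paper.
    gpu_power_kw = 0.3     # hardware make/model: e.g. a ~300 W GPU
    num_gpus = 8
    training_hours = 240   # length of the training procedure
    grid_intensity = 0.4   # kg CO2eq per kWh for the server's energy grid

    energy_kwh = gpu_power_kw * num_gpus * training_hours
    emissions_kg = energy_kwh * grid_intensity

    print(f"Energy drawn:  {energy_kwh:.0f} kWh")   # 576 kWh
    print(f"CO2eq emitted: {emissions_kg:.0f} kg")  # ~230 kg

The published calculator refines these same inputs with real hardware profiles and regional grid data, which is exactly what the factors quoted above describe.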