No, AI did not accidentally learn Ecology by playing a video game.

Adil Syed
Apr 29, 2021

A response to an article on the website TecTalk with the title “How AI Accidentally Learned Ecology by Playing StarCraft”.

NOTE: This article was originally supposed to be a thread on Twitter, but its length made me decide otherwise. That is also the reason for the slightly more casual style of writing. At various points I have either quoted from the article being responded to or taken explanations from it.

I recently came across an interesting article on the r/artificial subreddit with the unfortunately misleading and inaccurate headline “AI Accidentally learns Ecology by playing StarCraft”.

Here are the highlights from the article and my thoughts on it.

The article is about a paper published by researchers in mid-2020 titled “Artificial Intelligence Accidentally Learned Ecology through Video Games”, which, honestly, is also a misleading and inaccurate title.

StarCraft II, the video game in question, requires players to choose an alien species to play as. These species then compete for resources in a virtual environment. And that’s basically what an ecosystem is.

Each species has certain strengths and weaknesses. The researchers draw parallels between these alien characters and real-life plants. For example, the “Terran” in the game are like cacti in real life: “slow growers, but good at defence.”

Enter AlphaStar, the AI-based program made by DeepMind that, according to some quick googling, ranked above “99.8% of all active players” of StarCraft II. In other words, it’s really good at what it does.

The researchers argue that due to the similarity of the game to real life ecosystems, AlphaStar could be repurposed to study these ecosystems, possibly allowing them to test hypotheses that traditional approaches were not capable of.

Stated like this, everything so far seems reasonable. The problem I have is with the statement that AI “accidentally learned ecology through video games.”

This statement, at least to me, evokes images of a possibly sentient AI that now knows and understands the contents of a university ecology textbook.

That’s not what AlphaStar is about.

And even if we leave aside pedantic arguments about what learning means, the fact that AlphaStar can (perhaps) be repurposed to study ecological systems is not something that happened by accident. It was by design.

This is because the game StarCraft II by its very design mimics ecosystems, as was observed by the researchers who wrote the paper. AlphaStar was trained to become good at this game. So the fact that it became good at what the game mechanics mimic is no accident at all.

The more accurate statement would be about the unintended use of AlphaStar to study ecological systems.

And the researchers have said something along those lines in the abstract (I think that’s what it’s called) for their paper on cell.com. To quote:

“In a virtual ecosystem, players compete for habitats and resources, unintentionally reproducing many ecological phenomena.”

This sentence acknowledges that AlphaStar’s ability to mimic ecosystems is a result of the nature of the game, and not what the title of the article or the paper would suggest.

Apart from the issue I have with the title, the article actually makes for an interesting read.

Here are some fascinating insights from the article.

1. Applications of AI in Ecology

Some applications of AI in ecology include classifying species, predicting beetle outbreaks in pine forests, and classifying bird calls in the Arctic and predicting the birds’ migration patterns.

Interestingly, AI applications in ecology have recently moved from simple classification models to prediction models based on “messy, high dimensional data — the kind ecology tends to generate.”

I imagine this is in part due to the development and success of neural networks and the availability of cheap computing and storage on the cloud.
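To give a concrete sense of what a “simple classification model” looks like, here is a toy sketch: a nearest-centroid classifier over made-up bird-call features. The species names, feature choices (peak frequency in kHz, call duration in seconds) and values are all invented for illustration; real models in the field are trained on far messier, higher-dimensional data.

```python
import math

def centroid(points):
    """Mean of a list of equal-length feature vectors."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def nearest_centroid(sample, centroids):
    """Return the label whose centroid lies closest to the sample."""
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

# Synthetic training data: two hypothetical species, each described by
# [peak frequency (kHz), call duration (s)].
training = {
    "snow_bunting": [[4.1, 0.30], [4.3, 0.28], [3.9, 0.33]],
    "arctic_tern":  [[6.8, 0.12], [7.1, 0.10], [6.5, 0.14]],
}
centroids = {label: centroid(calls) for label, calls in training.items()}

print(nearest_centroid([4.0, 0.31], centroids))  # snow_bunting
print(nearest_centroid([6.9, 0.11], centroids))  # arctic_tern
```

Even this toy version shows why the move to messy, high-dimensional data matters: hand-picking two clean features like this is exactly what large neural networks no longer require.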

AlphaStar is far more complex than most traditional ecological models, which are usually tiny in comparison. So it’s exciting to see what modern AI can bring to the table.

That being said, nature is more complex than a video game. From the article:

“A problem, in my opinion, is that the game mechanics — which are designed for being as entertaining as possible — are only superficially similar to the real physical world”.

2. Challenges to the adoption of AI in Ecology

Some challenges to AI in ecology mentioned in the article include:

  • Ecologists can be intimidated by the programming skills needed to train models.
  • Collecting data to train models can be difficult. Different kinds of data come with unique challenges. Satellite imagery, for example, is easier to obtain than soil samples.
  • Another issue at times is the lack of “money and skilled collaborators available for ecology”. This is in contrast to the resources, financial and human, that a company like DeepMind has.

These challenges highlight an important point: the neural network, or any other AI model, is only one part of the process.

One has to keep in mind the entire end-to-end process. From collecting data to having domain experts verify and interpret the model’s outputs, a lot is required to derive value from applying AI techniques to a given domain.

3. Artificial Intelligence as an enabling technology

Another point to note is the role of AI as an enabling technology. It enables people to tackle problems that were previously either impossible or infeasible to solve.

As mentioned earlier, this includes being able to make predictions on messy, high dimensional real world data as opposed to only simple classifications.

4. On the interdisciplinary nature of AI and the perception of the field

As for why even researchers in other fields make statements about AI that are not exactly accurate, I think it comes down to two key issues.

  • First, AI as a field is interdisciplinary. It not only employs individuals from different fields, but also draws inspiration from various fields. Think neural networks. As a result, people outside the field tend to think that AI mimics not only the tasks humans do, but also the exact mechanisms behind them. This, I believe, is inaccurate.
  • Secondly, AI techniques are just tools. Classification, prediction, pattern matching: these are things that find applications in many fields. As a result, individuals in fields far removed from computer science now have access to AI tools and potential applications of them in their domains. While that’s a good thing, the preconceived notions people have about these tools and the technology behind them are shaped less by what modern AI is, and more by the popular hype surrounding it. That includes a future of sentient robot overlords and mass human unemployment.

Now, some, all or none of those concerns may turn out to be true. I’m not commenting on that. What I’m saying is that the perception of the technology is a little removed from what the tools at hand are designed to do, or even capable of, for that matter.

I hope that, as we move forward, a more structured and realistic understanding of AI develops and spreads to broader society. One that is driven not by hype, speculation and movies, but by the actual abilities and limits of the technology.

While job losses are a genuine concern, before you worry about sentient robots taking over, worry about the potential damage of human bias in data creeping into the AI models being deployed by companies or, worse, by governments.

That’s a more pertinent thing to worry about.

Lastly, I want to clarify that my intention is not to criticise the people who wrote the article or the researchers who published the paper. The article is actually very interesting. And while I haven’t read the paper, I think this is something we need more of. AI is a tool and as such it will find applications in many different fields.

If you find I have been incorrect in my assessment, please comment down below.

I hope this has been useful to you.

I encourage you to read the original article.


Adil Syed

Web Developer with an Interest in Artificial Intelligence and Indie Hacking