The following competitions will be held at IEEE CoG 2020:
Details about each competition can be found below:
Bot Bowl II is the second AI competition in the board game Blood Bowl. The competition uses the Fantasy Football AI (FFAI) framework, which simulates the game and provides a Python API for scripted bots and machine learning algorithms. Blood Bowl is a major challenge due to the complexity introduced by having multiple actions each turn. For more details on why we think Blood Bowl should be the next board game challenge for AI, see our CoG paper. Watch the state of the art in this video of the Bot Bowl I final series: https://youtu.be/6qv_pzeYoOU.
FFAI framework: https://github.com/njustesen/ffai. Niels Justesen, Lasse Møller Uth, Christopher Jakobsen, Peter David Moore, Julian Togelius, and Sebastian Risi. "Blood Bowl: A New Board Game Challenge and Competition for AI." IEEE Conference on Games (CoG), 2019.
The collectible online card game Hearthstone offers a rich testbed and poses unique challenges for artificial intelligence agents. It is a turn-based card game between two opponents, each using a constructed deck of thirty cards along with a selected hero that has a unique power. Players spend their limited mana crystals to cast spells or summon minions to attack their opponent, with the goal of reducing the opponent's health to zero. The competition aims to promote the stepwise development of fully autonomous AI agents in the context of Hearthstone.
During the game, both players need to play the best combination of hand cards while facing a large amount of uncertainty. The upcoming card draw, the opponent's hand cards, and hidden effects played by the opponent can all influence the player's next move and the succeeding rounds. Predicting the opponent's deck from previously seen cards, and estimating the chances of drawing particular cards from one's own deck, can help in finding the best cards to play. The order in which cards are played, their effects, and the choice of attack targets have a large influence on the player's chances of winning the game.
Beyond playing premade decks, players have the opportunity to create a deck of 30 cards from the more than 1,000 available in the current game, most of which provide unique effects and card synergies that can help in developing combos. Generating a strong deck is a key step toward consistently winning against a diverse set of opponents.
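The draw-estimation idea mentioned above reduces to a hypergeometric calculation: drawing without replacement from the cards remaining in one's own deck. A minimal sketch (the function name and the example numbers are illustrative, not part of any competition API):

```python
from math import comb

def draw_probability(copies_left: int, deck_size: int, draws: int) -> float:
    """Probability of drawing at least one of `copies_left` copies of a card
    in the next `draws` draws from a deck of `deck_size` cards
    (hypergeometric, i.e. drawing without replacement)."""
    if copies_left <= 0 or deck_size <= 0:
        return 0.0
    draws = min(draws, deck_size)
    # P(at least one copy) = 1 - P(no copy in any of the draws)
    miss = comb(deck_size - copies_left, draws) / comb(deck_size, draws)
    return 1.0 - miss

# Two copies still in a 20-card remaining deck, looking three draws ahead:
p = draw_probability(2, 20, 3)  # ≈ 0.284
```

The same computation, applied to cards seen from the opponent, supports the deck-prediction side: cards already revealed shrink the space of decks the opponent can still be holding.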
The competition encourages submissions to the following two separate tracks, both of which will be available in this second year of the competition:
In the “Premade Deck Playing” track, participants receive a list of decks and play out all combinations against each other. Determining and exploiting the characteristics of the player's and the opponent's decks will help in winning the game. This track will feature an updated list of decks to better represent the current meta-game.
The “User Created Deck Playing” track invites all participants to create their own decks or to choose from the vast number of decks available online. Finding a deck that can consistently beat multiple other decks will play a key role in this track. Additionally, it gives participants the chance to optimize their agents' strategy to the characteristics of their chosen deck.
You can find more information on this year’s competition and the evaluation of last year’s submissions on our webpage. It also features a list of previously submitted bots and their source code as well as information about how to get started.
Legends of Code and Magic (LOCM) is a small implementation of a strategy card game, designed for AI research. Its advantage over the AI engines of real card games is that it is much simpler for agents to handle, and thus allows testing more sophisticated algorithms and quickly implementing theoretical ideas.
All card effects are deterministic, so nondeterminism is introduced only by the ordering of cards and the unknown opponent's deck. The game board consists of two lanes (similar to TES: Legends), so it favors deeper strategic thinking. Also, LOCM is based on the fair arena mode, i.e., before every game, both players create their decks secretly from symmetrical yet limited choices. Because of this, deck building is dynamic and cannot simply be reduced to using human-created top-meta decks.
This competition aims to play the same role for the Hearthstone AI Competition that microRTS plays for the various StarCraft AI contests: to encourage advanced research free of the drawbacks of working with a full-fledged game. In this domain, that means, among other things, embedding deck building into the game itself (limiting the use of premade decks) and allowing efficient search beyond a one-turn depth.
The contest is based on LOCM 1.2, the same version used in the CEC 2019 competition. The one-lane 1.0 version of the game was used for a CodinGame contest in August 2018.
The General Video Game AI (GVGAI) Learning Competition explores the problem of transferring and reusing the knowledge learnt on given levels of single-player games to play unseen levels, without access to any forward model or explicit game rules. More about this competition can be found on the GVGAI website (http://www.gvgai.net/) and the competition webpage (http://www.aingames.cn). Participants are invited to submit their agents via the competition webpage, where detailed submission instructions are given. The final ranking and winners will also be announced there.
- Docker (GVGAI+RL baselines): https://github.com/SUSTechGameAI/GVGAI_GYM/blob/master/docs/Guidelines-Docker-GVGAI-RLbaselines.md
The goal of the competition is to build AI agents for Geometry Friends, a two-player collaborative physics-based puzzle platformer. Each agent controls a different character (circle or rectangle) with distinct characteristics. Their goal is to collaborate in order to collect a set of diamonds across a set of levels as fast as possible. The game presents problems of combined task and motion planning and promotes collaboration at different levels. Participants can tackle cooperative levels with the full complexity of the problem, or single-player levels that address task and motion planning without the complexity of collaboration.
The competition has three tracks: Cooperative, Single-Player Circle, and Single-Player Rectangle.
A video is provided in the guides section of the website: https://geometryfriends.gaips.inesc-id.pt/guides/
Several AI competitions have been organized around RTS games in the past (such as the ORTS competitions and the StarCraft AI competitions), which has spurred a new wave of research into RTS AI. However, as has been reported numerous times, developing bots for RTS games such as StarCraft involves a very large amount of engineering, which often relegates the research aspects of the competition to the background. The microRTS competition was created to motivate research into the basic questions underlying the development of AI for RTS games, while minimizing the amount of engineering required to participate. A key difference with respect to the StarCraft competition is that the AIs have access to a "forward model" (i.e., a simulator) with which they can simulate the effect of actions or plans, allowing planning and game-tree search techniques to be developed easily. This will be the third edition of the competition, after the 2017 and 2018 editions hosted at the IEEE CIG conferences.
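To make the forward-model point concrete, here is a minimal sketch of how a simulator enables game-tree search. The "game" is a deliberately tiny stand-in (a subtraction game, not microRTS), and all function names are illustrative; the structure — a `forward` function that returns the successor state, searched by minimax — is what the competition's forward model makes possible:

```python
def legal_actions(state):
    """Actions: remove 1-3 tokens from the pile; taking the last token wins."""
    pile, _ = state
    return [n for n in (1, 2, 3) if n <= pile]

def forward(state, action):
    """Forward model: simulate the effect of an action, returning a new state."""
    pile, player = state
    return (pile - action, 1 - player)

def minimax(state, maximizing_player):
    pile, player = state
    if pile == 0:
        # The player to move faces an empty pile: the opponent took the
        # last token, so the player to move has lost.
        return -1 if player == maximizing_player else 1
    scores = [minimax(forward(state, a), maximizing_player)
              for a in legal_actions(state)]
    return max(scores) if player == maximizing_player else min(scores)

def best_action(state):
    """Search the game tree via the forward model and pick the best move."""
    player = state[1]
    return max(legal_actions(state),
               key=lambda a: minimax(forward(state, a), player))

# From a pile of 5 with player 0 to move, taking 1 leaves the opponent a
# losing pile of 4:
move = best_action((5, 0))  # → 1
```

In an RTS setting the state and action spaces are vastly larger, which is exactly why search techniques (and abstractions over them) are the research focus here.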
What are promising techniques for developing general fighting-game AIs whose performance is robust across a variety of settings and opponents? The platform is the Java-based FightingICE, which also supports Python programming and the development of visual-based deep learning AIs. Two leagues (Standard and Speedrunning) are associated with each of the three character types: Zen, Garnet, and Lud, where the character data of the last two are not revealed. In the Standard League, the winner of a round is the AI whose hit points (HP) remain above zero when its opponent's HP reaches zero. In the Speedrunning League, the winner for a given character type is the AI with the shortest average time to beat our sample MCTS AI. The competition winner is decided from both leagues' results based on the 2015 Formula-1 scoring system.
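For readers unfamiliar with the scoring system: the 2015 Formula-1 table awards 25, 18, 15, 12, 10, 8, 6, 4, 2, and 1 points to the top ten finishers, and entrants collect points in every league. The sketch below shows this aggregation; the league names and bot names are made-up examples, not actual competition data:

```python
# 2015 Formula-1 points for places 1 through 10; places below 10 score zero.
F1_POINTS = [25, 18, 15, 12, 10, 8, 6, 4, 2, 1]

def points_for_rank(rank: int) -> int:
    """rank is 1-based."""
    return F1_POINTS[rank - 1] if 1 <= rank <= len(F1_POINTS) else 0

def total_points(league_ranks: dict) -> dict:
    """league_ranks maps a league name to the ordered list of entrants
    (best first); returns each entrant's point total across all leagues."""
    totals = {}
    for ranking in league_ranks.values():
        for position, entrant in enumerate(ranking, start=1):
            totals[entrant] = totals.get(entrant, 0) + points_for_rank(position)
    return totals

standings = total_points({
    "Standard": ["BotA", "BotB", "BotC"],
    "Speedrunning": ["BotB", "BotA", "BotC"],
})
# BotA and BotB each score 25 + 18 = 43; BotC scores 15 + 15 = 30.
```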
ColorShapeLinks is an AI competition for the Simplexity board game with arbitrary game dimensions. The first player to place n pieces of the same type in a row wins. In this regard, the base game, with a 6 x 7 board and n = 4, is similar to Connect Four. However, the type of a piece is defined not only by its color, but also by its shape, which can be round or square. Round or white pieces offer the win to player 1, while square or red pieces do the same for player 2. Contrary to color, players start the game with pieces of both shapes. This means that a less observant player can lose on their own turn, especially since shape has priority over color as a winning condition. Given this fact, as well as the arbitrary game dimensions, the challenges for the AI, particularly at the level of the heuristic evaluation function, are manifold.
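The shape-over-color priority can be made concrete with a small win-check sketch. This is an illustrative implementation, not the official ColorShapeLinks API; pieces are modeled as (color, shape) tuples on an arbitrary-sized board:

```python
def n_in_a_row(board, n, attribute):
    """board[r][c] is None or a (color, shape) tuple; attribute is 0 (color)
    or 1 (shape). Returns the winning attribute value, or None."""
    rows, cols = len(board), len(board[0])
    directions = [(0, 1), (1, 0), (1, 1), (1, -1)]  # right, down, diagonals
    for r in range(rows):
        for c in range(cols):
            if board[r][c] is None:
                continue
            value = board[r][c][attribute]
            for dr, dc in directions:
                cells = [(r + i * dr, c + i * dc) for i in range(n)]
                if all(0 <= rr < rows and 0 <= cc < cols
                       and board[rr][cc] is not None
                       and board[rr][cc][attribute] == value
                       for rr, cc in cells):
                    return value
    return None

def winner(board, n):
    """Shape is checked before color, since shape has priority as a winning
    condition: round favors player 1, square favors player 2; likewise
    white favors player 1 and red favors player 2."""
    shape = n_in_a_row(board, n, attribute=1)
    if shape is not None:
        return 1 if shape == "round" else 2
    color = n_in_a_row(board, n, attribute=0)
    if color is not None:
        return 1 if color == "white" else 2
    return None
```

Note the trap the rules describe: a row of three white square pieces with n = 3 hands the win to player 2 via shape, even though all the pieces are player 1's color.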
The Ludii AI Competition is a general game playing competition focussed on developing agents that can play a wide variety of board, card, dice and tile games. This competition will use Ludii, a recently developed general game system, to provide the necessary games and API. Games will be provided in the Ludii .lud game description format. The current version of Ludii includes over 200 games, with new games being added frequently. Submitted agents will play against all other competition entrants on a selected set of 20 games in a round-robin format. These games will not be named or provided to the agents beforehand. Agents will have a set amount of time, typically a few seconds, to make each move. Agents that fail to return a move, or return an illegal move, within this period will have a random move made for them. The agent that achieves the highest average win-rate across all games will win the competition.
This first iteration of the competition will only include games that are:
We may also include games that are:
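The per-move time limit above (with a random fallback for agents that fail to answer in time, or answer illegally) can be sketched as a generic wrapper. This is not the actual Ludii API — `agent_fn`, `legal_moves`, and `choose_move` are hypothetical names used only to illustrate the rule:

```python
import random
import concurrent.futures

def choose_move(agent_fn, legal_moves, budget_seconds):
    """Run agent_fn(legal_moves) under a time budget; fall back to a random
    legal move if the agent times out, raises, or returns an illegal move."""
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(agent_fn, legal_moves)
    try:
        move = future.result(timeout=budget_seconds)
    except Exception:  # covers the timeout as well as agent errors
        move = None
    finally:
        pool.shutdown(wait=False)
    return move if move in legal_moves else random.choice(legal_moves)
```

A design consequence worth noting for entrants: since an overrun costs only a random move rather than a forfeit, anytime algorithms that always hold a fallback answer (e.g., iterative deepening) are a natural fit.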
The IEEE CoG StarCraft competitions have seen considerable progress in the development and evolution of StarCraft bots. Participants have used various approaches to build their bots, which has enriched game AI with methods such as HMMs, Bayesian models, CBR, potential fields, and reinforcement learning. However, developing AI for the game remains quite challenging, because a bot must handle a large number of units and buildings while considering resource management and high-level tactics. The purpose of this competition is to develop RTS game AI and to address its challenging issues, such as uncertainty, real-time processing, and unit management. Participants submit bots that use BWAPI to play 1v1 StarCraft matches.
This competition focuses on a variation of the classic Snakes game, providing an entertaining environment for studying and teaching artificial intelligence (AI) and machine learning (ML) techniques. Snakes is an arcade game in which two players control their respective snakes, attempting to collect the largest number of apples while manoeuvring to avoid colliding with each other or leaving the board. Participants are requested to submit an implementation of the bot that operates its respective snake, using the API provided and simple specifications. The tournament will be held in a round-robin format, in which each pair of participants plays against each other ten times. In each match, snakes capture apples to increase their size; after 2 minutes, the longest snake wins, unless its opponent dies first. The champion is the participant whose bot achieves the highest winning ratio among the participants. The Snakes game has been designed to facilitate and encourage the use of AI, ML, and other advanced computational techniques by first- and second-year students in university-level Computer Science courses.
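The round-robin scoring can be sketched as follows. This is an illustrative scorer, not competition code: participant names and results are made up, and the simplification assumes every one of the ten matches per pair produces a winner:

```python
MATCHES_PER_PAIR = 10  # each pair of participants plays ten matches

def win_ratios(results):
    """results maps an ordered pair (a, b) to a's win count out of the
    MATCHES_PER_PAIR matches that pair plays; returns each participant's
    overall winning ratio. The champion has the highest ratio."""
    wins, played = {}, {}
    for (a, b), a_wins in results.items():
        wins[a] = wins.get(a, 0) + a_wins
        wins[b] = wins.get(b, 0) + (MATCHES_PER_PAIR - a_wins)
        played[a] = played.get(a, 0) + MATCHES_PER_PAIR
        played[b] = played.get(b, 0) + MATCHES_PER_PAIR
    return {p: wins[p] / played[p] for p in wins}

ratios = win_ratios({("A", "B"): 7, ("A", "C"): 5, ("B", "C"): 4})
# A wins 12 of 20 matches (ratio 0.6), C 11 of 20 (0.55), B 7 of 20 (0.35).
```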
This year we will run our fifth Angry Birds Level Generation Competition. The goal of this competition is to build computer programs that can automatically create fun and challenging Angry Birds levels. The difficulty of this competition compared to similar competitions is that the generated levels must be stable under gravity; robust, in the sense that a single action should not destroy large parts of the generated structure; and, most importantly, fun to play, visually interesting, and challenging to solve. Participants can ensure the solvability and difficulty of their levels by using open-source Angry Birds AI agents developed for the Angry Birds AI competition. Each level generator will be evaluated based on the overall fun or enjoyment factor of the levels it creates. Aside from the main prize for "most enjoyable levels", two additional prizes for "most aesthetic levels" and "most challenging levels" will also be awarded. This evaluation will be done by an impartial panel of judges. Restrictions will be placed on which objects can be used in the generated levels (in order to prevent pre-generation of levels). We will generate 100 levels with each submitted generator and randomly select a fraction of those for the competition. There will be a penalty if levels are too similar. Each entrant will be evaluated for all prizes. More details on the competition rules can be found on the competition website, aibirds.org. The competition will be based on "Science Birds", a Unity3D implementation of the physics game by Lucas Ferreira.
The aim of the competition is to promote the production of short videos highlighting any research relevant to IEEE CoG. The videos may be related to CoG papers, but this is not necessary. A similar competition was run at IEEE CIG 2018 and IEEE CoG 2019; the winning entries are listed below.
The videos should be informative and well presented. Submitted videos must be no longer than 5 minutes; there is no lower limit. Each video should include a title page at the beginning and must mention that it is an entry for the IEEE CoG 2020 Short Video Competition.
To enter the competition at least one author of the video must be registered for the conference.
Entries should be submitted via the conference paper submission server, selecting "Short Video Competition" from the list of special sessions. The information required is the video title, authors, a brief description (approx. 150 words), and a link to the video, which can be hosted on any easily viewable video streaming service such as YouTube, Youku, or ieee.tv.
Links to all accepted videos will be published on the conference website after the conference. A panel will select the set of finalist videos to be judged by the audience during the short video competition session at the conference.
The session should ideally be run in plenary mode to attract maximum audience participation and feature just the top 6 videos shortlisted by the panel. An ideal time to do this would be just prior to the conference dinner.
The winner will be chosen by an audience vote at the end of this session. The organisers reserve the right to exclude any video they deem to be offensive or inappropriate.
Evolving Mario Levels in the Latent Space of a Deep Convolutional Generative Adversarial Network
SAPEO for Games