Researchers at Sony AI and Polyphony Digital created an AI bot that can outperform the best championship-level Gran Turismo players. The virtual racer, named Gran Turismo Sophy (GT Sophy), bested its human counterparts in the latest racing showdown.
Under the hood, GT Sophy uses a technique called “reinforcement learning” to ace the racing simulation game: the agent learns through trial and error, refining its strategy based on rewards for good driving decisions. This lets the AI make split-second decisions in real time throughout a simulated race.
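To give a feel for the idea, here is a minimal sketch of tabular Q-learning, one of the simplest forms of reinforcement learning. The toy “track” below is entirely hypothetical and vastly simpler than anything used to train GT Sophy (which relies on deep reinforcement learning at scale): an agent moves along positions 0 through 5 and is rewarded only for reaching the finish line.

```python
import random

N_STATES = 6          # positions 0..5; position 5 is the finish line
ACTIONS = [0, 1]      # 0 = hold position, 1 = accelerate (advance one step)
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1  # learning rate, discount, exploration rate

def step(state, action):
    """Toy environment: accelerating advances one position; finishing pays reward 1."""
    next_state = min(state + action, N_STATES - 1)
    reward = 1.0 if next_state == N_STATES - 1 else 0.0
    done = next_state == N_STATES - 1
    return next_state, reward, done

def train(episodes=500, seed=0):
    random.seed(seed)
    # Q-table: estimated value of taking each action in each state
    q = [[0.0, 0.0] for _ in range(N_STATES)]
    for _ in range(episodes):
        state, done = 0, False
        while not done:
            # Epsilon-greedy: usually exploit the best-known action, sometimes explore
            if random.random() < EPSILON:
                action = random.choice(ACTIONS)
            else:
                action = 0 if q[state][0] > q[state][1] else 1
            next_state, reward, done = step(state, action)
            # Q-learning update: nudge the estimate toward reward + discounted future value
            q[state][action] += ALPHA * (
                reward + GAMMA * max(q[next_state]) - q[state][action]
            )
            state = next_state
    return q

q = train()
# After training, "accelerate" is the preferred action in every non-terminal state
policy = ["accelerate" if q[s][1] > q[s][0] else "hold" for s in range(N_STATES - 1)]
print(policy)
```

The same loop of act, observe reward, and update an estimate is what a racing agent does too; the difference is that GT Sophy’s state (speed, position, nearby cars) and action space (steering, throttle, braking) are continuous, so a neural network replaces the lookup table.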
Trial and Triumph
Sony AI and Polyphony Digital started collaborating and training GT Sophy in April 2020. The following year, the racing AI became ready to face off against human players.
In July 2021, GT Sophy performed tremendously well in time trials, even beating records previously set by human players. However, its performance dropped in a multiplayer race, where it competed against four of the best GT championship players.
Later, the team behind GT Sophy took insights from that first trial against humans and fine-tuned the AI. The results were quite different when the bot took on actual players again in October: it completely dominated the races and beat the human players by a wide margin.
Interestingly, the AI showed remarkable resilience, winning the overall competition despite trailing at certain points. Showing appreciation, elite GT player Takuma Miyazono said, “I want to race with GT Sophy more in the future. I learned a lot from the AI agent.”
Sony AI COO Michael Spranger highlighted Sophy’s impressive progress by calling it “an AI agent that learned to drive by itself at a very competitive level and can compete with the best drivers in the world.”
But Gran Turismo fans need not worry about facing this AI any time soon. So far it has only competed in a highly competitive racing trial, and it isn’t being prepped for a public release.