AI vs Paul the Octopus

Dhanum Nursigadoo
5 min read

It’s been 8 long years since Paul the Octopus passed away. Purveyor of predictions, he managed to correctly foresee Spain as the 2010 World Cup winners. So, how does the most advanced AI machinery in finance stack up against the seemingly magical powers of Paul the Octopus? There’s also the recent predictive piggie Mystic Marcus who wasn’t quite as accurate as the beloved octopus…

Stay tuned for more on this topic very soon, as this Friday we release our podcast on AI that you’ve been asking for.

It’s All About the Numbers

The World Cup has existed in its modern format since 1930, with two breaks in 1942 and 1946 when the Second World War made the tournament impossible. That's only 21 tournaments, and at 64 matches each, that's at most 1,344 unique results: not much of a data set. But you can add in individual players' performances at club level, friendly matches between countries, and tournaments like the Euros, so eventually there's a fairly large data pool to draw from. AI is all about data: it needs a sizeable dataset to find useful patterns and forecast outcomes. So it's curious to see how it stacks up against the prophetic powers of an octopus and a pig. Financial AI currently resides in the hands of several companies, including Goldman Sachs, Danske Bank, and UBS, all of which have powerful AI programs built on Machine Learning. AI excels at forecasting trends, so how did the banks hold up?

Goldman Sachs

Goldman Sachs took a couple of stabs at guessing the winner, first forecasting a Brazil-England final before switching to Belgium-England on July 9th after Brazil were knocked out. Needless to say, Goldman Sachs' AI could not beat the octopus or the pig. Goldman Sachs didn't pour every scrap of available data into its algorithm: it used data on team characteristics, individual players, and team performance. Four different Machine Learning models then analysed the relationship between those inputs and goals scored per match. All that, plus World Cup data since 2005, led to a result that was off-target.
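To make the "goals scored per match" idea concrete, here is a minimal sketch of one such model: a toy least-squares fit that maps a single team-strength feature to expected goals, then compares two teams' predicted scores. The feature values and data points are invented for illustration; Goldman Sachs' actual models and inputs are not public in this detail.

```python
# Toy training data: (team strength rating, goals scored in a match).
# These numbers are illustrative only, not Goldman Sachs' data.
data = [(0.2, 0), (0.5, 1), (0.8, 1), (1.1, 2), (1.4, 2), (1.7, 3)]

# One-variable least squares fitted by gradient descent.
w, b = 0.0, 0.0
lr = 0.05
for _ in range(5000):
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * grad_w
    b -= lr * grad_b

def expected_goals(strength):
    """Predicted goals for a team of the given strength (floored at zero)."""
    return max(0.0, w * strength + b)

# The stronger side gets the higher predicted score.
print(expected_goals(1.6), expected_goals(0.4))
```

A real model would use many features (squad value, recent form, and so on) and richer learners than a straight line, but the principle is the same: learn the mapping from team data to goals, then simulate matches from the predicted scores.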

Danske Bank

Danske Bank fared no better than Goldman Sachs; its AI model also predicted a Brazilian victory, this time in a Brazil-Germany final. To be fair to Danske Bank, it gave Brazil only a 17% chance of total victory, but Brazil was still at the top of its leaderboard. Danske Bank's modelling used net goal difference, variations in income levels, differing football traditions (I have no idea how that was quantified), the difference in FIFA ranking between national teams, and, most interestingly, GDP per capita in PPP terms. None of it mattered in the end, as Danske's modelling only managed to pick France as a semi-finalist. Brazil, Argentina, and Germany were predicted but failed to make the cut.

UBS

UBS forecasted straight away that Germany would hold on to the trophy and emerge as World Cup winners, with a 24% chance of victory. They were a bit off with that prediction. Brazil, Spain, and England were the next most likely according to UBS’ AI model; sadly, that was also not the case. AI models can take a variety of data inputs, but UBS’ may have had the ‘smartest’ choice of data, as its prediction was in line with the bookmakers’. Using Elo (skill) ratings of teams along with a mix of other data, UBS ran the model through a Monte Carlo simulation. A Monte Carlo simulation plays out a large number of randomised trials and tallies the results, essentially replicating the wild-card factor that’s common in championship tournaments across all sports. UBS repeated this around 10,000 times and counted up how many times each team won to decide its overall victor. Unfortunately, none of its three most likely victors made it to the final.
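The Elo-plus-Monte-Carlo approach can be sketched in a few lines: convert Elo ratings into match-win probabilities, simulate a knockout bracket with random draws, repeat many times, and count titles. The ratings below are illustrative placeholders, and this is a bare single-elimination version of what UBS did, not their actual model.

```python
import random

# Hypothetical Elo ratings -- illustrative values, not UBS's actual inputs.
elo = {"Germany": 2070, "Brazil": 2130, "Spain": 2040, "England": 1940}

def win_prob(elo_a, elo_b):
    """Standard Elo expected score for team A against team B."""
    return 1 / (1 + 10 ** ((elo_b - elo_a) / 400))

def simulate_knockout(teams):
    """Play randomised single-elimination rounds until one team remains."""
    teams = list(teams)
    random.shuffle(teams)
    while len(teams) > 1:
        winners = []
        for a, b in zip(teams[::2], teams[1::2]):
            winners.append(a if random.random() < win_prob(elo[a], elo[b]) else b)
        teams = winners
    return teams[0]

def tournament_odds(teams, runs=10_000):
    """Repeat the simulated tournament and tally each team's share of titles."""
    counts = {t: 0 for t in teams}
    for _ in range(runs):
        counts[simulate_knockout(teams)] += 1
    return {t: c / runs for t, c in counts.items()}

print(tournament_odds(list(elo)))
```

Even with this toy version, the upshot matches the article: the model can only say which team wins *most often* across thousands of simulated tournaments, never which team wins the one tournament that actually gets played.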

Forecasts not Prophecies

Paul managed to get 12 of his 14 predictions correct, including Spain as the eventual 2010 World Cup winners. That’s a success rate of around 86%. Mystic Marcus correctly predicted Brexit and the Trump presidency in addition to choosing the four semi-finalists of this World Cup. None of the banks came close to replicating that sort of accuracy. But that’s not the most interesting part of this story. Banks are now capable of creating prediction models that forecast behaviour with a fair degree of accuracy. Many of the top picks were favourites to win before the tournament, so while AI can’t grant us visions of the future, it’s delivering more accurate models every day.

No one can tell you the future, not even an octopus with an impressive record. Really, Paul just wanted a mussel and picked one for any of a thousand reasons; Mystic Marcus is a pig who likes apples. AI doesn’t want anything. All it can do is use data to make a best guess about future outcomes based on previous trends. But banking isn’t the same thing as the World Cup; it’s not as volatile as sport. Forecasting is a useful tool for any bank to have, and Machine Learning is currently the best way to enhance it. At the moment, the best AI in banks seems to have lesser psychic powers than an all-knowing octopus, but might be just about as prophetic as a pig that England fans want to see made into bacon. It’s also worth noting that this was an incredibly unpredictable World Cup by any standard. As an aside, my prediction is that in 2020 it’ll finally be coming home.

Set your alarms: this Friday at 4pm we’re releasing our most highly anticipated podcast yet. We’re taking on AI.