Hybrid model reveals people act less rationally in complex games, more predictably in simple ones


Published on July 9, 2025 10:15 AM GMT

The title is the summary of the paper in a popular writeup on Phys.org, which made me curious. The full paper title is Capturing the complexity of human strategic decision-making with machine learning (Nature).

Does the claim hold up? Are people more rational in simple games? The authors trained a machine learning model to predict the strategic decisions of humans in various two-player matrix games. The claim is backed by the paper, but even the complex games are simple in some sense: the complexity results from the "what would the other person choose" dynamic.

Example of a simple game:

            Left      Right
  Top       70, 70    20, 0
  Bottom    0, 20     10, 10

Example of a complex game:

            Left      Right
  Top       80, 0     20, 70
  Bottom    10, 60    90, 10

(A small best-response check for both matrices is sketched after the abstract.)

Full paper abstract:

Strategic decision-making is a crucial component of human interaction. Here we conduct a large-scale study of strategic decision-making in the context of initial play in two-player matrix games, analysing over 90,000 human decisions across more than 2,400 procedurally generated games that span a much wider space than previous datasets. We show that a deep neural network trained on this dataset predicts human choices with greater accuracy than leading theories of strategic behaviour, revealing systematic variation unexplained by existing models. By modifying this network, we develop an interpretable behavioural model that uncovers key insights: individuals’ abilities to respond optimally and reason about others’ actions are highly context dependent, influenced by the complexity of the game matrices. Our findings illustrate the potential of machine learning as a tool for generating new theoretical insights into complex human behaviours.
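To make the simple/complex contrast concrete, here is a minimal sketch (mine, not code from the paper) that checks each 2x2 example matrix for strictly dominant strategies and pure-strategy Nash equilibria. In the simple game both players have a dominant choice, so no reasoning about the opponent is needed; in the complex game there is no dominant strategy and no pure equilibrium, so each player's best move hinges on predicting the other's.

```python
import numpy as np

def analyze(row_payoffs, col_payoffs, row_labels, col_labels):
    """Check a 2x2 game for strictly dominant strategies and pure Nash equilibria."""
    R = np.array(row_payoffs, dtype=float)  # R[i, j]: row player's payoff
    C = np.array(col_payoffs, dtype=float)  # C[i, j]: column player's payoff

    # A row strategy strictly dominates if it is better against every column choice.
    for i, other in ((0, 1), (1, 0)):
        if np.all(R[i] > R[other]):
            print(f"  Row player: '{row_labels[i]}' strictly dominates")
    for j, other in ((0, 1), (1, 0)):
        if np.all(C[:, j] > C[:, other]):
            print(f"  Column player: '{col_labels[j]}' strictly dominates")

    # A cell (i, j) is a pure Nash equilibrium if neither player gains by deviating.
    equilibria = [
        (row_labels[i], col_labels[j])
        for i in range(2)
        for j in range(2)
        if R[i, j] >= R[:, j].max() and C[i, j] >= C[i, :].max()
    ]
    print(f"  Pure Nash equilibria: {equilibria or 'none (only mixed)'}")

labels = (["Top", "Bottom"], ["Left", "Right"])

print("Simple game:")
analyze([[70, 20], [0, 10]], [[70, 0], [20, 10]], *labels)
# -> both players have a dominant strategy; (Top, Left) is the unique equilibrium

print("Complex game:")
analyze([[80, 20], [10, 90]], [[0, 70], [60, 10]], *labels)
# -> no dominant strategies and no pure equilibrium, so each player's best
#    choice depends on what they expect the other player to do
```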