Bridging the AI Gap

By Alex J. Coyne © Great Bridge Links

AI has come a long way since Deep Blue defeated chess world champion Garry Kasparov in 1997, but research shows that Bridge AI isn't yet the match for strong human players we might have thought. Here's how far Bridge AI has come so far, and what could still be holding it back…

Computer Bridge: Way Back in 1997

Back in 1997, Bridge Baron won the Baron Barclay World Bridge Computer Challenge and made history for computers everywhere. "Although computer programs have done well in games such as Chess and Checkers, they have not done as well in the game of Contract Bridge," says the study Computer Bridge: A Big Win for AI Planning by researchers Smith, Nau and Throop, published in AI Magazine Vol. 19 No. 2 (1998).

In fact, according to this study, AI had no problems mastering games like Connect Four, Go-Moku, Qubic and Nine Men's Morris. It plays Othello 'probably better than any living human' and Backgammon 'better than all but 10 humans'. At Bridge, however, it ranks 'worse than the best players at many local clubs'. Why?

The problem lies in how computers think. Bridge is a game of incomplete information: neither you nor the computer opponent can see the other's cards. To decide what to do next, the AI searches a 'game tree' of possible plays, and for Bridge the number of possibilities is huge. As the study notes, "Because a Bridge hand is typically played in just a few minutes, there is not enough time for a game tree search to search enough of this tree to make good decisions."
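To make the scale of that hidden information concrete, here's a quick back-of-the-envelope count in Python (our own arithmetic, not a figure from the study). Even once the dummy is face up, a declarer still can't see how 26 cards are split between the two defenders:

```python
from math import comb

# Once dummy is visible, 26 cards remain hidden in the two
# defenders' hands. Count the ways they can split 13-13:
hidden_layouts = comb(26, 13)
print(hidden_layouts)            # 10400600

# And the number of distinct ways to deal all four 13-card hands:
total_deals = comb(52, 13) * comb(39, 13) * comb(26, 13)
print(f"{total_deals:.2e}")      # about 5.36e+28
```

Over ten million possible layouts of the unseen hands, and a game tree branching on every legal card beneath each of them: that's what has to be searched "in just a few minutes."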

Smith, Nau and Throop tackled this with AI planning techniques: instead of branching on every individual card, the computer builds much smaller game trees whose branches are the strategies a player might pursue. "We have developed an algorithm for declarer play in Bridge that uses planning techniques to develop game trees whose size depends on the number of different strategies that a player might pursue rather than the number of different possible ways to play the card."
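A toy illustration of why grouping plays shrinks the tree (our own sketch, not Smith, Nau and Throop's actual planner): within one suit, held cards with no live opponent card ranked between them are strategically interchangeable, so each such run collapses into a single choice.

```python
RANKS = "23456789TJQKA"   # low to high

def strategic_classes(hand, opp_live):
    """Group one suit's held cards into equivalence classes: cards
    with no live opponent card ranked between them are strategically
    interchangeable, so each class is one branch in the tree."""
    hand_set = set(hand)
    classes, current = [], []
    for r in RANKS:
        if r in hand_set:
            current.append(r)
        elif r in opp_live and current:
            classes.append(current)
            current = []
    if current:
        classes.append(current)
    return classes

# Holding A Q 7 5 2 with K J T 9 8 still out against us:
print(strategic_classes("AQ752", {"K", "J", "T", "9", "8"}))
# [['2', '5', '7'], ['Q'], ['A']] -- 3 choices instead of 5
```

Five legal cards collapse to three meaningful options, and savings like that compound multiplicatively at every level of the tree.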

Cool, right?

Modern Times… Now What?

Both AI and the processing power of the average computer have advanced in huge leaps since the '90s, and according to The State of Automated Bridge Play by Paul M. Bethe (2010), "Computer bridge players have surpassed humans in the time they take to solve double-dummy problems, and the best programs are attaining expert status as card players."
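A 'double-dummy' problem is one where all four hands are face up, so there's no hidden information and the position can be solved exactly by minimax search. Here's a deliberately tiny sketch of the idea (a one-suit, two-player endgame of our own invention, nowhere near a full solver):

```python
from functools import lru_cache

RANK = {r: i for i, r in enumerate("23456789TJQKA")}

@lru_cache(maxsize=None)
def tricks(declarer, defender, declarer_leads):
    """Max tricks declarer takes with both hands open (the
    double-dummy assumption): one suit, higher card wins the
    trick, and the trick winner leads to the next trick."""
    if not declarer:
        return 0
    outcomes = []
    for lead in (declarer if declarer_leads else defender):
        replies = []
        for follow in (defender if declarer_leads else declarer):
            decl_card = lead if declarer_leads else follow
            def_card = follow if declarer_leads else lead
            decl_wins = RANK[decl_card] > RANK[def_card]
            rest = tricks(
                tuple(c for c in declarer if c != decl_card),
                tuple(c for c in defender if c != def_card),
                decl_wins,                    # winner leads next
            )
            replies.append(decl_wins + rest)
        # The follower picks their best reply...
        replies_best = min(replies) if declarer_leads else max(replies)
        outcomes.append(replies_best)
    # ...and the leader picks the best lead against it.
    return max(outcomes) if declarer_leads else min(outcomes)

# A-K against Q-2: declarer takes both tricks on any defense.
print(tricks(("A", "K"), ("Q", "2"), True))   # 2
```

Real double-dummy solvers work the same way in spirit, with four suits, trumps, partnerships and heavy pruning layered on top; it's that exhaustive search that modern programs now run faster than humans.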

The study notes that computers don't handle 'deceptive play' (techniques like bluffing) all that well, and there are still some problems with computer logic, especially when it comes to grasping the idea that a human might 'let' it make a mistake in play. "With all these problems and sub-problems, it appears researchers still have a lot of work ahead of them to create a bridge player to rival the top experts," says Bethe.

Neural Networks for Contract Bridge Bidding by Yegnanarayana, Khemani and Sarkar (1996) (not currently available) was an early attempt to "explore the possibility of capturing the reasoning process used in bidding a hand in a bridge game by an artificial neural network". Yes, this is basically a computer that learns to bid by, well, thinking the way a player would. That brings us to something a little more modern.

Contract Bridge Bidding by Learning by Chun-Yen Ho and Hsuan-Tien Lin (2015) takes a closer look at how systems can cope with an 'incomplete information game' like Bridge. Now computers can learn strategy instead of just reading decisions off a hand-built decision tree. From the study: "Our initiative justifies the possibility that machine learning may be able to do better than human-designed bidding systems on [the] bridge bidding problem." Isn't that how the machines took over in every AI movie ever? (Fine, we suppose, if it makes for a better Bridge game against the machines…)
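What does a learning system actually consume? Typically the hand is flattened into a numeric feature vector first. Here's a minimal sketch of that step (the particular features are our illustration, not the paper's exact encoding):

```python
def hand_features(hand):
    """Encode a hand (a list of (rank, suit) pairs) as numbers a
    learner can consume: high-card points plus the four suit lengths."""
    HCP = {"A": 4, "K": 3, "Q": 2, "J": 1}
    lengths = {"S": 0, "H": 0, "D": 0, "C": 0}
    points = 0
    for rank, suit in hand:
        lengths[suit] += 1
        points += HCP.get(rank, 0)
    return [points, lengths["S"], lengths["H"], lengths["D"], lengths["C"]]

# A 3-card fragment, just to show the shape of the output:
print(hand_features([("A", "S"), ("K", "H"), ("2", "C")]))
# [7, 1, 1, 0, 1]
```

A bidding model then learns a mapping from vectors like this, plus the auction so far, to the next bid.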

Battle of the Machines

The World Computer Bridge Championship looks to find the best of the best in Bridge AI. The first champion was Bridge Baron in 1997, while Jack has taken the top spot a total of ten times, most recently in 2015. In 2016, the winner of the WCBC was WBridge5. You can find a more extensive history of the champions from Bridge Guys (Summary of Participants at Computer Olympiads).

What’s next? Are you an ambitious programmer with a love for Bridge who has taken a stab at improving the AI behind Bridge games? Let us know!