BBO Discussion Forums: Has AI solved Bridge - BBO Discussion Forums

Page 1 of 1

Has AI solved Bridge?

#1 User is offline   Mr74Notes 

  • Group: Members
  • Posts: 2
  • Joined: 2026-April-13

Posted 2026-April-13, 10:14

I have been playing on Funbridge recently and have noticed how highly ranked the Argine AI is on the app. It is ranked 3rd worldwide on average matchpoint performance (70.61% MPs) and 1042nd worldwide at IMPs (3.51 IMPs). I have also seen world and national champion players play against it, and Argine seems to win 40-45% of the time.

Is there a reason for this ranking other than its performance, or is it really that good?

Personally, I suspect the world champion players are usually not playing their hardest against the bot. I also note that these are not head-to-head matches where a player competes against the AI directly.

I am not sure, though. What do the other players here think?

#2 User is offline   pescetom 

  • Group: Advanced Members
  • Posts: 9,668
  • Joined: 2014-February-18
  • Gender:Male
  • Location:Italy

Posted 2026-April-13, 14:44

The short answer is "not yet".

The slightly longer answer starts with the old Italian saying that one should not ask the innkeeper (in this case Funbridge) if the wine is good :)
There have been several Bridge Bot World Championships (see this link), but I do not recall Argine getting anywhere, despite Funbridge sponsoring an early edition.
This discussion forum was disappointed by Argine, and BBO (owned by the same holding as Funbridge) has never dared to use Argine as a replacement for GIB in 2/1, only in Acol (where it was simpler to tweak the robot rules than in GIB).
Given that GIB has never beaten Jack or Wbridge5 or even the Synrey robot, that speaks volumes about Argine.

Whether AI can sooner or later play at World Class level (or better) is more controversial.
I suspect "yes, there is just not enough financial interest", but it is taking longer than many of us expected, and some still think it may be impossible.

#3 User is online   johnu 

  • Group: Advanced Members
  • Posts: 5,395
  • Joined: 2008-September-10
  • Gender:Male

Posted Yesterday, 18:38

Mr74Notes, on 2026-April-13, 10:14, said:

I have been playing on Funbridge recently and have noticed how highly ranked the Argine AI is on the app. It is ranked 3rd worldwide on average matchpoint performance (70.61% MPs) and 1042nd worldwide at IMPs (3.51 IMPs). I have also seen world and national champion players play against it, and Argine seems to win 40-45% of the time.

Is there a reason for this ranking other than its performance, or is it really that good?

Where are those statistics from? Playing against other users on Funbridge? If so, that says more about the players on Funbridge than about the competency of Argine. And who are the world and national champion players? Are they playing with another world-class partner, or with a much weaker one?

As pescetom notes, Funbridge and BBO are owned by the same company. If Argine were clearly superior to GIBBO, wouldn't it make sense to abandon GIBBO and switch to Argine on both platforms? That would cut programming and support costs: one robot to maintain instead of two.

The English Bridge Union has a rating system, and GIB(BO) is rated there against English club/tournament players. IIRC, it scores in the high 50% range on average. So even though GIBBO makes some awful bids and plays, as documented in the robot discussion forum, it is still much better than the average human player on BBO.

There used to be an annual world championship for bridge-playing programs. That seems to have lost momentum, and the best robot lately has been a program called Wbridge5. It's not clear how it compares to GIBBO or Argine, since neither BBO nor Funbridge has bothered to enter recent competitions. A decades-old standalone version of GIB did enter a relatively recent robot championship and didn't do particularly well, but BBO has done quite a bit of work on its version in the past 20 years, so who knows how good it actually is compared to some of the more recently upgraded robots.

It's also not clear how much AI goes into some of these robots. GIB uses a database of bids for most of its bidding, and uses double-dummy analysis to determine its plays: it deals out many layouts of the unseen hands, solves each layout with all four hands visible, and picks the card that does best on average. While that is quite impressive programming, I don't think the way it is normally implemented constitutes AI. GIB is also supposed to use some double-dummy analysis to decide on bids. Again, very good programming, but not AI.
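The Monte Carlo idea behind double-dummy-based play can be sketched in a few lines. This is an illustrative toy, not GIB's actual code: `deal_unseen`, `solve_double_dummy`, and `choose_play` are hypothetical names, and the "solver" here is a stub returning a fake trick count, since a real double-dummy solver is a large program in its own right.

```python
import random

# Hypothetical sketch of Monte Carlo play selection: sample layouts of
# the unseen cards, score each candidate play in every layout with a
# (stubbed) double-dummy solver, then pick the best average.

def deal_unseen(unseen_cards, split, rng):
    """Randomly divide the unseen cards between the two hidden hands."""
    cards = list(unseen_cards)
    rng.shuffle(cards)
    return cards[:split], cards[split:]

def solve_double_dummy(candidate, layout):
    """Placeholder for a real double-dummy solver, which would compute
    the exact trick count for 'candidate' with all four hands visible.
    Here we just return a fake but repeatable score in 0..13."""
    fake = sum(ord(ch) for card in (candidate, *layout[0]) for ch in card)
    return fake % 14

def choose_play(candidates, unseen_cards, split, samples=100, seed=0):
    """Return the candidate play with the best average simulated result."""
    rng = random.Random(seed)
    layouts = [deal_unseen(unseen_cards, split, rng) for _ in range(samples)]

    def avg_tricks(play):
        return sum(solve_double_dummy(play, lay) for lay in layouts) / samples

    return max(candidates, key=avg_tricks)

# Toy usage: pick between leading the ace or the deuce of spades, with
# four unseen cards assumed split 2-2 between the hidden hands.
best = choose_play(["SA", "S2"], ["H3", "H7", "D4", "C9"], split=2, samples=50)
print(best)
```

The key property is that the number of sampled layouts trades accuracy for speed, which is why these engines can play quickly yet still make the occasional "random-looking" error when the sample happens to be unrepresentative.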

BBO does have "Ben", which is based on AI. Judging from comments in the Robot Discussion forum, Ben makes its own sort of seemingly random bidding and play mistakes. In some older tests, Advanced GIB(BO) played better than Ben, mostly by small margins. That's actually impressive for Ben, since it was only started a few years ago, whereas development of the original GIB stopped in 2009. BBO bought GIB around 2005 and has independently done bug fixes and a few enhancements since then.

#4 User is offline   akwoo 

  • Group: Advanced Members
  • Posts: 1,703
  • Joined: 2010-November-21

Posted Yesterday, 22:50

The average human bridge player is just that bad.

But consider this: a good chess program running on a PC in the 1980s, with none of the last 40 years of AI development and less computing power than a phone from 10 years ago, was already at Master level (over 2200 Elo), though not Grandmaster level.
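For anyone unfamiliar with the scale: Elo ratings map to expected scores through the standard logistic formula, so a 2200-rated Master is predicted to score about 98% against a typical 1500-rated club player. A minimal sketch:

```python
def elo_expected(rating_a, rating_b):
    """Expected score (win = 1, draw = 0.5) of player A against player B
    under the standard Elo model with its 400-point scale."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

# A 2200-rated Master against a 1500-rated club player:
print(round(elo_expected(2200, 1500), 3))  # about 0.983
```

That gap is roughly the picture being painted here: even an old program on weak hardware dominates the average human player.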

