The level of community involvement and engagement in this project has convinced us to extend Innovation Week for another week, through to January 21. Now, more of you will be able to join in the fun and push Ben to its limits. With the power of AI and BBO's extensive data, we're not just playing bridge, we're modeling bridge's future.
For over two decades, your loyalty has made BBO the premier online destination for bridge enthusiasts. Your input has made it more than just a game platform; it's a thriving community where friendships are made over a shared love for bridge.
In tandem with our thriving community, our AI bridge partners, GIB and Argine, have also played their part in our story. These AI 'companions' have brought another dimension to our game, offering both challenges and surprises. Your engagement and feedback have been instrumental in their evolution.
In pursuit of a better experience for you, our passionate tech team has been working tirelessly to improve BBO. As part of this, they've taken the innovative leap of training Ben, the open-source bridge engine, with various traits using BBO's rich data repository. For those of you who don't know, Ben is Lorand Dali's Bridge Engine, a bridge-playing robot that can be trained with machine learning. Depending on the hands we "feed" it, Ben can become anything the human mind imagines, and beyond.
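To make that idea concrete: training a bot like Ben on played hands is, at its core, supervised learning over logged auctions. The sketch below is a minimal, hypothetical illustration of that shape; the encodings, sizes, and names are our own assumptions for illustration, not Ben's or BBO's actual code. It shows a small network learning to predict the next call a human made, given the hand and the auction so far.

```python
# Hypothetical sketch of the general training loop (illustrative only,
# not Ben's or BBO's actual code): learn to predict the next call a
# human made, given the hand and the auction so far.
import torch
import torch.nn as nn

N_CALLS = 38       # 35 contract bids + pass, double, redouble
HAND_DIM = 52      # one-hot membership of the 13 cards held (assumed encoding)
AUCTION_DIM = 40   # fixed-length encoding of the auction so far (assumed)

class BidNet(nn.Module):
    def __init__(self, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(HAND_DIM + AUCTION_DIM, hidden),
            nn.ReLU(),
            nn.Linear(hidden, N_CALLS),  # logits over the next call
        )

    def forward(self, hand: torch.Tensor, auction: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([hand, auction], dim=-1))

model = BidNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One training step on a dummy batch; in reality each example would be
# (hand, auction so far, the call the human actually chose next).
hands = torch.rand(64, HAND_DIM)
auctions = torch.rand(64, AUCTION_DIM)
next_calls = torch.randint(0, N_CALLS, (64,))

loss = loss_fn(model(hands, auctions), next_calls)
opt.zero_grad()
loss.backward()
opt.step()
```

Feed the same loop different hands (ACBL games, a SAYC field, expert pairs) and you get robots with different traits, which is exactly the experiment described here.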
Several of Ben's traits have already been tried, most notably ACBL Ben, a model trained on hands played by humans in BBO's ACBL games.
If this all sounds intriguing, hop in and take part in the adventure. We're opening the door to our experimental AI models from January 8 to January 21. This is your chance to preview and interact with our latest AI innovations, especially ACBL Ben.
Play the free daylong games and explore the free Arcade game featuring ACBL-Ben. Your feedback will be invaluable as we continue to shape the future of digital bridge.
It is really a completely underwhelming experience playing with this robot. Most of the time it gives wrong or misleading answers in bidding, with absolutely horrific behavior when playing defense. NOT an improvement in the BBO experience.
Out of curiosity, I've played with this engine most if not all days since the start. Is the engine expected to improve in this timeframe (start to now)?
From the perspective of my boards, it is getting worse. Today the robot mis-responded to Blackwood; counting keycards is not a difficult programming task (see the sketch just after this comment), and it has nothing to do with simulating hands opposite.
The robot also unveiled a bizarre MSS Stayman sequence over my 2NT opening (3!S), followed by 4!C/3NT, then 5!D/4NT. This was explained as 5+!C, 6+!D, and some number of points I don't recall. Not quite, as dummy tabled 4315 shape.
If this is the best that can be accomplished, I would suggest a different path/algorithm.
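The commenter's point that keycard counting is mechanical is easy to illustrate. Here is a minimal sketch; the hand representation (suit mapped to ranks, high to low) and the 0314 response scheme are illustrative assumptions, not BBO's code, and the 1430 variant swaps the 5C/5D meanings.

```python
# Minimal sketch of keycard counting and the RKC response; the hand
# representation and the 0314 scheme are illustrative assumptions.
def keycards(hand: dict[str, str], trump: str) -> tuple[int, bool]:
    """Return (keycards held, whether the trump queen is held).
    Keycards are the four aces plus the king of trumps."""
    aces = sum(1 for suit in "SHDC" if "A" in hand[suit])
    return aces + ("K" in hand[trump]), "Q" in hand[trump]

def rkc_response(n_keycards: int, has_queen: bool) -> str:
    """Standard (0314) responses to 4NT."""
    if n_keycards in (0, 3):
        return "5C"
    if n_keycards in (1, 4):
        return "5D"
    return "5S" if has_queen else "5H"   # 2 or 5 keycards

hand = {"S": "AK873", "H": "K5", "D": "T92", "C": "A64"}
print(keycards(hand, "S"))                  # (3, False): two aces + trump king
print(rkc_response(*keycards(hand, "S")))   # 5C (shows 0 or 3)
```

The hard part for a bot is presumably not counting the keycards but keeping an agreed response scheme consistent within its bidding model, which is where the failures reported here seem to occur.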
How bizarre is this auction:
https://www.bridgebase.com/tools/handviewer.html?lin=pn|Margo2017,~~v3fakebot,~~v3fakebot,~~v3fakebot|st||md|3SAT632HDA532CKQJ3,S85H87632DJTCA764,SQJ74HAKQT5DQ764C,SK9HJ94DK98CT9852|sv|o|rh||ah|Board%201|mb|1H|an|Major%20suit%20opening%20--%205+%20!H;%2011-21%20HCP;%2012-22%20total%20points|mb|P|mb|1S|an|One%20over%20one%20--%204+%20!S;%206+%20total%20points|mb|P|mb|3S|an|Jump%20raise%20--%205+%20!H;%204+%20!S;%2016-18%20total%20points|mb|P|mb|4N|an|Blackwood%20(S)%20--%204+%20!S;%2016+%20total%20points|mb|P|mb|5N|an|Even%20number%20of%20keycards%20and%20some%20void%20--%205+%20!H;%204+%20!S;%2016-18%20total%20points|mb|P|mb|6D|an|Cue%20bid%20--%201+%20!C;%204+%20!S;%20no%20!CA;%20!DA;%2019+%20total%20points;%20forcing|mb|P|mb|P|mb|P|pc|S8|pc|SQ|pc|SK|pc|SA|pc|D2|pc|DT|pc|DQ|pc|DK|pc|S9|pc|S2|pc|S5|pc|SJ|pc|D4|pc|D8|pc|DA|pc|DJ|pc|C3|pc|C4|pc|D6|pc|C2|pc|HA|pc|H4|pc|CJ|pc|H2|pc|HK|pc|H9|pc|CQ|pc|H3|pc|HQ|pc|HJ|pc|CK|pc|H6|pc|D7|pc|D9|pc|D3|pc|H7|pc|CT|pc|D5|pc|C6|pc|H5|mc|11|
Unfortunately, this is the worst bot you can find on the net...
A pity...
Long way to go! I don't agree with the comments that AI learning from human play in ACBL tournaments leads to inferior bidding/play. No bridge robot, bar none, can play at even a regional A/X level and score over 40%. So learning from good ACBL players might work if the programming is up to snuff. Robots must learn to signal, not peek at each other's hands like the GIB robots do, and have fewer brain farts. How is it that, with both opponents in the bidding, a robot partner will take you to seven-something with fewer than 3 HCP in your hand, because it assigns you 30+ HCP (a mathematical impossibility, since the entire deck contains only 40 HCP)? For example: 1♥ - Dbl - 2♥ - Pass - Pass - 3♦ - Pass - 4NT (???), etc. The 3♦ bidder has 20 minus partner's HCP, not more.
Absolutely disagree that this is a better robot... it's the worst! If this is the face of the AI world, we are in big trouble...
Might be helpful to show us where these "free arcade games" are located. I don't see anything labeled as such.
In "Robot World" there is a button at the bottom of the page.
I cannot even find where the AI bridge games are. It said they were in the Casual area, but I cannot find them there. Perhaps they should extend the event until you can publish how to find the game.
Hi Russ, it's in the Robot World section; you'll find the tournament at the top. Or you can click the button below the list of tournaments.
Please see the details on how to find it here: https://news.bridgebase.com/2024/01/05/try-our-ai-bridge-engine-innovation-week/
I played a game on the AI bridge engine on Jan 15, 2024, where the robot partner opened a weak 2S. I, as the partner, passed, and then an opponent entered with 2NT. However, the robot bid again, going to 3S with only six spades. This had fatal consequences for my score.
In the end, it made me wonder whether this robot was set up correctly or not. So in the last 5 games I made abnormal bids: since the robot seems to be programmed incorrectly, I felt it was useless and saw no reason to bid correctly.
I have not played enough AI games to judge whether they are better than GIB, but I have enjoyed them more because they are not Best Hand. Best Hand games are not real bridge.
I do not take robot games seriously, though I play them quite often. It is a great way to improve my declarer play and nothing else.
Bridge is a game that requires 100% trust between partners (probably that's why it's called bridge). With that in mind, Ben is a lying, self-indulgent, arrogant sub-mediocrity. The horrible conclusion, if Ben is based on "100 million ACBL played boards", is that it is a reflection of the average bridge player.
I feel better playing with the AI robots, and generally score higher in the free AI tournaments than in the free GIB tournaments. It could be my playing style, and/or the field may be stronger in the GIB tournaments.
Terrible bidders, these bots... They should probably stop this experiment, go back to the drawing board to rework this AI, and stop training it on us. The results in these AI tournaments don't matter much because of the terrible bids and play by the bot.
They had AI train itself in the game of Go. Now it can easily beat any human. And it makes many moves humans do not understand!
Twice Ben has screwed up an RKC auction. The first time, it showed the wrong number of keycards (it showed two without the Q but had 3 with the Q). The second time, it not only bid a slam off two keycards but bid 5NT looking for a grand (it had 2 keycards, I showed 1 or 4, and then it bid 5NT). This seems like really basic stuff.
Yes, I caught Ben lying about keycards at least once - answered "two without the trump Queen" when it actually had the Queen.
In general, Ben is much more aggressive than GIB, frequently doubling opponents' (and my) contracts for penalty. That's probably a better model, but sometimes it's stress-inducing (and/or stupid).
It promised me 0 or 3 and then produced 2!!!
I just played a hand with the opponents in 3C. My bot partner led a singleton queen of clubs. Huh??
I cannot play.
This bot still makes terrible leads. Much like GIB. Doesn't seem like any improvement there.
Much of the root cause of robots behaving poorly in auctions is that their bidding is based on points rather than tricks. Why was Culbertson a much better player than Goren, who introduced points? This is the very difference between ugly point evaluation and what really matters: tricks. My robots therefore perform much better than the BBO ones, because they have a sounder evaluation of a hand's potential, based on tricks to be taken as well as inescapable losers. It's really straightforward: robots should not be taught anything less than my students are: bidding is just preparation for play, and what matters is tricks, only tricks...
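For readers unfamiliar with trick-based evaluation, the classic Losing Trick Count is one simple example of what this commenter is describing. Here is a minimal sketch; the hand representation is an assumption, and common refinements (discounting unguarded queens, half-loser adjustments) are deliberately omitted.

```python
# A sketch of the classic Losing Trick Count (LTC), one simple
# trick-oriented evaluation; the hand representation is assumed.
def suit_losers(cards: str) -> int:
    """Losers in one suit; `cards` is sorted high to low, e.g. "AQ873".
    Only the top three cards count, and only A, K, Q can be winners
    (A in a singleton; A and K in a doubleton)."""
    length = min(3, len(cards))
    winners = sum(1 for honor in "AKQ"[:length] if honor in cards[:3])
    return length - winners

def losing_trick_count(hand: dict[str, str]) -> int:
    return sum(suit_losers(hand[suit]) for suit in "SHDC")

hand = {"S": "AQ873", "H": "K5", "D": "T92", "C": "K64"}
print(losing_trick_count(hand))  # 7 = 1 (AQ8) + 1 (K5) + 3 (T92) + 2 (K64)
```

The usual rule of thumb is that expected partnership tricks are roughly 24 minus the two hands' combined losers, a number a bidding engine can reason with directly, which is the commenter's point.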
Excellent comment.
I totally agree... Tricks and high cards, the aces and kings, are the most important... then finding a fit... then valuing short suits... But as a rule, queens and jacks are unimportant... To me, for example, the 2/1 system is antiquated and much overvalued... And for the most part, the bots are useless at sensing the value of the factors above... in short, they are pretty stupid, at least to me...
Bots should be programmed from sound, dynamic rules, as humans should be. Humans are usually taught by means of stupid static rules, which makes most of them poor players. This is the way most classes and textbooks are unfortunately made, again and again for commercial reasons, keeping students in the dark about the essence of our wonderful game. Bad bots are also fed static rules, and AI helps them ape humans: usually poor players, and in any case not the brilliant players you could learn from. They deserve a better fate, somewhere in the territory Asimov envisioned. Now, how can we get sensible bots if their designers do not know what to teach them? The programmer should ally with a brilliant theorist and practitioner of the game, so that the what marries the how. Common sense, isn't it?
In the game of Go they had the net play against itself. It now beats any human.
I wonder if BBO could contact DeepMind to learn their methods?
I can only agree. This is the whole point of using AI properly: AI is not about aping, it is about giving the bot structured means to learn good material, not whatever garbage passes by.
I think that the AI approach used in advanced chess engines is the wrong approach when it comes to bridge (especially bidding), where there are two players, as long as the partner of the AI is a competent player.
The proper approach, IMO, is to have the robots adhere to system, improve on the (awful in competition) GIB system, and have a set of principles and guidelines for when system runs out. Simulations and "learning" can apply to marginal (close) cases.
The approach chosen will likely end up making the robot play like a pro playing with a client (as opposed to another pro). This will be very frustrating to above-average players.
I notice that every tournament awarding points lists hands as "advanced". That listing appears despite the ranking system. And, of course, the ranking system here is flawed by foreign competition, where advanced players aren't identifiable by ACBL points. I would remind ACBL that you need new players, and making the goals "mission impossible" is hurting at the club level as well. We cannot get our "new" players to even want to play in an open game. They use our club as a "cheap room" for social bridge, with cheap lessons if interested.
I have been playing bridge for a long time; I used to be piruli or cafeto with 5,000 connections, and then I STOPPED. I came back at the beginning of last year and have played all the time. My connections are always at 1,000, and I must say I'm rather disappointed. A lot of the people who started are now at 5,000. Can you do something for me? Thanks, and Happy New Year.
I must say I am surprised to see so few comments left here. This topic is probably shaping the future of bridge, indeed. Perhaps the reason is that so few suggestions have been answered so far by the BBO team, when at least all the relevant ones are expected to be. It is really a pity, as the offered services have been great since the creation of the site. If those suggestions remain unanswered, we can expect the useful comments to dwindle the same way.
You mentioned some bots you made using A K Q J instead of high card points. How can we see them in action?
While I appreciate the BBO experience a lot, I still don't get why BBO is not implementing simple suggestions like, for example, naming dummy as "dummy" and giving its cards a distinct background, so that daylong players don't mistake their own hand for dummy, as everyone has done at some time or other. Also, when suggestions are made, it would be nice to see the response of the BBO team.
I agree!!!
Would love to play with Ben.
On January 1-3 it could not register me; I hope Ben's week will work.
Some directors keep entry limited to certain players, which shouldn't be allowed by BBO, and sometimes the TD is not there, so a TD (or a robot as TD) should always be present.
Looking forward to playing with Ben 🙂
Happy to meet you. I will give it a chance and try it.
When will a bot ask partner to first input the parameters for each opening bid of the system to be played?
That would be great, along with asking what system they would like to play.
Yes, this should have happened years ago.
GIB is outdated and should be reprogrammed so you can create your own robot (partner), like Bridge Base has, using the 2/1 system and a few extra conventions which work very well with 2/1.
GIB shows total points, for example 6-11, and 90% of the time it is at the low end of that count; but when you bid and show 6-11 total points, it always counts your points at the high end and bids again.
Never double a contract for penalties, because GIB will bid again at another level.
I also think the robot should not expect a fourth-seat opening bid at the two level to be preemptive.
I feel another price increase coming on.
I have been doing data modelling to build bidding bridge robots for years, and I am very disappointed at the level reached so far by the BBO robots. Not to mention their awful defense. Anyone interested in augmenting my robots with AI trained simply on double-dummy random hands? And of course no learning from so-called bridge champions, except for the very best in psychology (Zia, Seres, for instance). I also need a program computing the best probabilistic (not double-dummy) minimax outcome for every possible contract in a given hand; something to replace the double-dummy tables available today (which have already been a substantial step forward).
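The probabilistic evaluation this commenter asks for is usually approximated by Monte Carlo: sample many layouts of the unseen cards, solve each one double-dummy, and aggregate, instead of trusting a single double-dummy table. The sketch below shows that loop under stated assumptions: the double-dummy solver itself is injected as a function (for example from a binding to a solver library such as DDS), and none of these names are an existing BBO API.

```python
# A sketch of probabilistic contract evaluation: sample layouts of the
# 26 unseen cards and aggregate double-dummy results per contract.
# `solver` is an injected stand-in (e.g. a DDS binding), not a real API.
import random
from typing import Callable

DECK = [rank + suit for suit in "SHDC" for rank in "AKQJT98765432"]

def complete_deal(my_hand: list[str], dummy: list[str],
                  rng: random.Random) -> tuple[list[str], list[str]]:
    """Randomly split the unseen cards between the two hidden hands."""
    unseen = [c for c in DECK if c not in my_hand and c not in dummy]
    rng.shuffle(unseen)
    return unseen[:13], unseen[13:]

def prob_making(my_hand: list[str], dummy: list[str], level: int,
                solver: Callable[[list[str], list[str], list[str], list[str]], int],
                n_samples: int = 1000, seed: int = 0) -> float:
    """Monte Carlo estimate of the probability a contract makes,
    rather than a single double-dummy trick number."""
    rng = random.Random(seed)
    made = 0
    for _ in range(n_samples):
        lho, rho = complete_deal(my_hand, dummy, rng)
        if solver(my_hand, dummy, lho, rho) >= 6 + level:
            made += 1
    return made / n_samples
```

This yields a making probability per contract at the cost of n_samples solver calls, which is the step beyond today's single double-dummy tables that the comment describes.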
Having worked in the AI field in its earlier days, my real wish is to get the robots out of human games other than as emergency subs. Bridge is a human game full of human errors and human leaps of logic, and we're all the better when we play against each other.
agree
Strongly agree. Humans tend to excel when competing against other humans. AI will take that edge away. Not looking forward to playing against a neural engine.
ok
I may be repeating a comment already made, but essentially, the decisions the robot makes in bidding, selecting a lead, signalling suit preference, and playing a hand must show themselves to be "humanly" logical.
Otherwise, it isolates itself from its human partner, making the whole experience less and less interactive, and leaving you feeling that you have no real control when you engage.
+1 to this. GIB bidding something because "it concluded the response worked well in its simulation" is as useless as a human partner saying "I bid game because it felt right".
If you are a pro, you need to learn how to quickly adapt to the weaknesses of your client lest you lose him/her. Learn to treat bots the same way.
But I'm not a pro and GIB isn't my client.
If GIB were my client, she would either stop some of the silliness or be a former client. After all, pros do improve clients' games.
No matter HOW you posit it, a bridge "game" with 3 robots is STILL only a 2-player game!!!
I don't think you should use novices to train Ben, just good advanced and expert players. This is the Ben I'd want to play with and against, even if I were a novice, so I could learn to play better.
AI is neither artificial nor intelligent
"AI" simply suggests it gives the impression of intelligence... so it might not be either, but it is both.
The bridge AI innovation I'm most interested in is one that can honestly explain its bidding to opponents - at the moment all robots quasi-cheat by having implicit understandings that humans aren't privy to.
The present robot occasionally bids erratically.
Occasionally? Personally, I'd go with most of the time. It's your best guess as to what is in the robot's hand when it's bidding. It doesn't follow bridge rules about 80% of the time, and about 90% of the time it doesn't have what the "bubble" says it has. There wouldn't be a need for the "bubble" if the robot followed some bridge rules. I could go on, but I think the point has been made.
Truth
Very good idea. I will surely take part in it.
If the results in the previous "Robots vs. Best Players" event are anything to go by, it seems this "star player" ACBL-Ben did significantly worse than even GIB Basic.
While it's true that ACBL Ben scored low against star players and advanced bridge robots, it's important to note that ACBL Ben was trained on hands played by humans. Typically, human players' skill levels are lower than those of robots, including the GIB Basic robot.
We encourage the BBO community to preview and experience an AI model that has been trained with human gameplay. This project is ongoing research, not a final product. Exploring AI's potential to enhance robot play, for instance by training a model to play 'like a human,' is a significant step towards making bridge more accessible and understandable for newcomers, and even for the average bridge player (who is not usually an expert).
All I want from any bridge robot is count and attitude on defense.
Do any of these do that?
Try the SAYC daylong, see if you like it better than the usual GIB games.
This does not answer the direct question:
Yes or no: does Ben use signals when defending a bridge hand?
Please don’t respond unless you know for sure.
If you were the person who said "All I want from any bridge robot is count and attitude on defense. Do any of these do that?", then my suggestion to try the SAYC daylong directly answers the question, because yes, the SAYC robot does give standard signals.
Leaving that aside, the Ben model trained on human hands will give whatever count and attitude it learned from the humans. I would guess the answer is yes most of the time, given that in BBO's ACBL games it's extremely common to have an established system that includes signals. Best is to come and see for yourself in January 🙂
Ben does not in fact use signals in defense yet. At least not deliberately.
Amen!!! And I already play the daylongs - a LOT - and their count and attitude is horrible.
I was really worried (no joke), so thank you for pointing out that Ben just learned from our silly games and makes the same mistakes, only more frequently, ha ha. Imagine the disaster if it were like the modern chess engines, which no human can beat.
I'm not even sure it makes mistakes more frequently than the "normal" bridge player. The games where we pitched Ben against humans had fields much stronger than the average bridge field.
Need a partner. I want to play.
Hi Devon, you don't need a partner for this; it's going to be a free daylong game and an arcade game, and you'll play with the robots 🙂
ok