Blackjack and pair splitting

During the last couple of weeks, I have been spending a lot of time re-visiting a problem that first attracted my interest almost 15 years ago: casino blackjack.  My interest was renewed recently when a friend at work directed me to an online message board where people were discussing some software I had written for playing and analyzing the game.  After getting involved in the discussion, I ended up making some significant improvements and additions to the code, and learning some interesting mathematics in the process.

The software is available at the usual location here.  Everything but the pre-built executables is also on GitHub.

There are two programs in the release zip archive.  One is a game/trainer that lets you play blackjack with customizable rule variations, number of decks, etc.  Here is a screen shot:

Screenshot for blackjack game.

As you can see, this was a long time ago.  The graphics are thanks to the Allegro game programming library.

The other program is a strategy calculator that generates a printable strategy table for particular rule variations, and allows you to inspect expected values for specific player hands.

Although there are a lot of blackjack software packages out there, I think a few features combined make this one unique:

1. It’s exact.  (At least, it is now– more on this shortly.)  Both the game and strategy calculator use the same underlying “engine” to compute exact probabilities and expected values based on combinatorial analysis, as opposed to Monte Carlo simulation.  This is where the interesting mathematics lives, and is the main reason I enjoy studying this and other casino games.  (Unfortunately, I am not actually much of a gambler.)

2. It’s fast.  Although traversing the “tree” of possible outcomes of a particular hand is pretty simple in principle (e.g., a recursive function of a dozen lines or so would compute the dealer probabilities; see the sketch following this list), doing so quickly is another matter.  The game is fast enough to allow “perfect card counting,” continuously updating the exact player edge (vs. a “true count” estimate) prior to each hand.  This is where the interesting computer science lives, and is the other reason I continue to re-visit the game.

3. It’s free.  Executables (pre-built for Windows) and complete source code are available for anyone who wants to use them, subject to the GPL.  The code for the computational engine is separate from any external interface, and so may be used in other applications.  (I know of a few instances where people have done just that; for example, an old blackjack game in the GNOME distribution used this engine.)
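
To give a flavor of the simple-in-principle traversal mentioned in item 2, here is a minimal sketch of the recursive dealer calculation, assuming an infinite deck and a dealer who stands on soft 17.  All of the names here are made up for illustration; the real engine works with exact finite-shoe compositions, conditions appropriately on the up card, and is heavily optimized:

    #include <cstdio>

    // Probability that the dealer finishes with the given target total
    // (17 through 21), starting from `total` with `soft` indicating an ace
    // currently counted as 11.  Infinite deck; dealer stands on soft 17.
    double dealerProb(int total, bool soft, int target) {
        if (soft && total > 21) { total -= 10; soft = false; } // demote the ace
        if (total > 21) return 0.0;                            // dealer busts
        if (total >= 17) return (total == target) ? 1.0 : 0.0; // dealer stands
        double p = 0.0;
        for (int card = 1; card <= 10; ++card) {
            double w = (card == 10) ? 4.0 / 13.0 : 1.0 / 13.0; // infinite deck
            if (card == 1 && total + 11 <= 21)
                p += w * dealerProb(total + 11, true, target); // ace as 11
            else
                p += w * dealerProb(total + card, soft, target);
        }
        return p;
    }

    int main() {
        for (int t = 17; t <= 21; ++t)
            printf("P(dealer makes %d | up card 6) = %.6f\n",
                   t, dealerProb(6, false, t));
    }

Even this toy version hints at where the time goes: repeated naively for every depleted shoe encountered during play, this traversal visits an enormous number of shoe states, and avoiding that redundant work is where the optimization effort lives.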

But about that claim of “exact.”  It turns out that blackjack would be a relatively simple game to analyze, if not for one particular feature: splitting pairs.  Splits greatly complicate analysis of the game; there are many challenges with even specifying a strategy involving splits, let alone evaluating or optimizing such a strategy.  For example, should the player be allowed to consider all cards among all split hands in a round when making decisions, or just the cards in the current (possibly split) hand?  The former would be optimal but hopelessly complex to memorize, while the latter is realistically playable but sub-optimal.  And there are more possibilities in between these two extremes.

The most significant of these latest improvements is to the algorithm for computing expected values for re-splitting pairs.  Prior to these changes, split values were exact only when no re-splitting was allowed; i.e., when a pair could be split only once.  When re-splitting is allowed (usually up to a maximum of four split hands), my previous algorithm only approximated the corresponding expected values.

With these latest changes, the code now computes exact expected values for re-splitting as well, for several different choices of complexity of “post-split” playing strategy.  The details of the algorithm are described in this paper.  (Most of this paper is over eight years old; the last section is the recent addition that allows an efficient implementation.)

Finally, recall the red/black card puzzle from several months ago?  That solution turns out to be useful once again here, by providing an easy test for proving that a particular algorithm for computing expected values– in almost any card game, not just blackjack– is either incorrect or an inaccurate approximation.  The idea is as follows: consider using some algorithm to compute the overall expected return from a single round of blackjack, played from the top of a full shoe, and the corresponding playing strategy that realizes that expected return.  (For example, the expected return for a typical 6-deck game is -0.615389123%.)

Now recompute overall expected return again, using that same playing strategy, but this time using a shoe with one ace removed.  Similarly, compute the expected value with one two removed, then with one three removed, etc., all with the same playing strategy.

The “Extended” True Count Theorem (first proved by Thorp, generalized slightly here) says that if we compute the average of these expected values, weighted by the probability of each corresponding removed card, this should equal the overall expected value computed for the full shoe.  Put another way, the average “effect of removal” should be zero.
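
In symbols: if p(c) is the probability that the first card drawn from the full shoe has rank c, and E(c) is the expected return realized by the fixed full-shoe strategy against the shoe with one card of rank c removed, then the theorem says that the sum over c of p(c)*E(c) equals the full-shoe expected return E; equivalently, the sum of p(c)*(E(c) - E) is zero.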

So if using the same algorithm to compute the overall expected value in these two different ways yields two different answers, then the algorithm must be incorrect.  Of course, the converse is not true: equality does not imply correctness.  But I have still found this to be a useful additional sanity check and debugging tool when working on algorithms like this one.

Edit: Here is some interesting simple analysis using the strategy engine.  The following plots show expected return versus shoe penetration for both basic and optimal strategy.  For each plot, a single simulated player played 500 shoes.  On the left (think “perfect card counting”), the player used the same composition-dependent, “full-shoe” basic strategy for every hand.  On the right (think “perfect card counting and perfect play”), the player used a different strategy continuously re-optimized for each hand, based on the current depleted shoe.

Expected return vs. shoe penetration.

Each blue point indicates exact expected return (i.e., not just a true count) for the corresponding depleted shoe prior to each hand.  The red curves indicate 10th percentiles of the distribution of expected return.

50 thoughts on “Blackjack and pair splitting”

  1. I have not tested on a Mac (I don’t have a Mac to test :)). However, both have been tested on Linux, and the Allegro site (the graphics API for the game) advertises support for OS X, so I would not expect any problems. Also, the strategy calculator is just a text interface, so building it is easy: just ‘g++ strategy.cpp blackjack.cpp’ from the src folder.

  2. Pingback: Blackjack and number of decks | Possibly Wrong

  3. Pingback: Blackjack CA optimization as a Steiner tree problem | Possibly Wrong

  4. Pingback: Efficiency of card counting in blackjack (Part 1) | Possibly Wrong

    • I had to poke around a while before finding Wizard of Odds split EVs to compare with; I assume you are referring to those in his Appendix 9?

      The challenge, as mentioned in this post, is that we have to be very careful about specifying what playing strategy is used for the new hands created *after* splitting a pair. For example, my software has 3 options: CDZ-, CDP1, and CDP. This is reasonably common notation; see the readme.txt for details. CDZ- is the simplest, and arguably the only one actually realizable in practice by a human at the blackjack table. At the other extreme, CDP is the most “optimal,” but at the expense of an arguably impossible-to-remember increase in complexity of playing strategy.

      From the description in the various Wizard of Odds appendix pages: “The program only considered how many hands the player had already split, but not the other cards drawn in those hands.” Unfortunately, it isn’t quite clear exactly what this means. It *sounds* like it might be CDP, in which case you’re right that there is disagreement. But he might mean something slightly different, in which case his values aren’t necessarily wrong, they just depend on different assumptions. We would need more details on his methodology to resolve the discrepancy. (This cross-checking is extremely useful; see this archived forum thread for an example of what I’m talking about.)

      • Would the fact that the Wizard does composition-dependent calculations instead of total-dependent calculations affect the EVs slightly?

    • No, in fact that is how I got started on this– I have implemented a couple of different infinite-deck CAs, one in C++, and a second in Mathematica. This is much easier than finite-deck, particularly for computing split EVs: all of the necessary component EVs described here simplify to just a few different terms, since all of those E(X;a,b) values no longer depend on a and b (the number of pair and non-pair cards removed from the shoe).

      • It’s not in the distribution described in this post, if that’s what you mean. The infinite deck code is pretty old, first written in 1997, when I was new to C, so you may have an interesting time even getting it to compile (e.g., at first glance, I see a #include in there :). Let me know if you want me to post it somewhere, or just email it to you, etc.

      • Sorry… I meant are the changes needed to implement it in the current code localized in a few functions, or does it require a major refactorization?

      • Could you discuss how to implement an algorithm for computing the EV when hitting? I find it difficult to determine when to exit the recursive algorithm. Computing the dealer probabilities is far easier since the exit from the recursive algorithm is easy to figure out.

      • @Mike, re computing the EV for hitting: there are a couple of ways to approach this. First, if you’re doing it recursively, your “base case” is a busted hand. That is, to compute_ev_hit(hand), first check if the hand is busted, if so, return -1. Otherwise, return a weighted sum of the optimal value of each possible drawn hand, something like 1/13*max(compute_ev_stand(hand.deal(1)), compute_ev_hit(hand.deal(1))) + … + 4/13*max(compute_ev_stand(hand.deal(10)), compute_ev_hit(hand.deal(10))).

        I wouldn’t recommend this, though, since in the recursive traversal of possible drawn hands, you will re-visit a lot of hands multiple times. Instead, you can visit each hand just once– but in a carefully chosen order– so that when doing the “recursive” weighted sum, you can guarantee that the EVs for hitting that you need have already been calculated. Do you have a copy of Griffin’s Theory of Blackjack? If so, Chapter 11 does a good job of describing the basic idea. Let me know if this makes sense.
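
        To make the ordering idea concrete, here is a minimal sketch using hard totals only and an infinite deck, so that a “hand” is just its total.  The stand EVs below are placeholders, not real numbers, and the real computation tracks exact hand compositions and a finite shoe:

        #include <algorithm>
        #include <cstdio>

        int main() {
            double evStand[22], evHit[22], evOpt[22];
            // Placeholder stand EVs; a real CA computes these from the
            // dealer probabilities for a particular up card.
            for (int t = 4; t <= 21; ++t)
                evStand[t] = (t >= 17) ? 0.0 : -0.2;

            // Visit totals from 21 down to 4, so that every drawn hand's
            // optimal EV is already known by the time E(hit) needs it.
            for (int total = 21; total >= 4; --total) {
                double ev = 0.0;
                for (int card = 1; card <= 10; ++card) {
                    double w = (card == 10) ? 4.0 / 13.0 : 1.0 / 13.0;
                    int t = total + card;
                    ev += (t > 21) ? -w            // drawing busts the hand
                                   : w * evOpt[t]; // already computed
                }
                evHit[total] = ev;
                evOpt[total] = std::max(evStand[total], evHit[total]);
            }
            printf("E(hit hard 16) = %.4f\n", evHit[16]);
        }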

      • I have the code running and it works fine.  Can it be modified so that, if the dealer does NOT peek for blackjack, both the ace and 10 are ignored?

        Also, is it terribly difficult to modify the code so that the number of resplits can be specified, i.e. 2 to 4 hands?

  5. Could you help understand how the overall house edge for a blackjack game is computed? I have managed to write my own program to compute the optimal return for all 550 unique upcard/player-hole-card combinations. If I sum up all ev(i)*p(i) (where i=1 to 550) the overall return is off a bit. I have determined that all ev(i) and p(i) match published results on the web. If I sum up the evs by upcard, they match published for all but A and 10. Do the final evs need “adjustment” for dealer Blackjack? My algorithm already adjusts the optimal returns for such cases. I’m puzzled on this one.

    • @Pete, from your description of the discrepancy, where everything looks fine except for ace and ten up cards, you’re right, the “adjustment” needed is to condition (or “un-condition”) your EVs on the dealer not having blackjack. If you have a specific number of decks and rule variations (S17/H17, etc.) in mind, I can provide the corresponding table of 550×2 ev(i) and p(i) values, and the accompanying calculation of overall EV for comparison.

      • How about 1 deck;
        Dealer hits S17
        DAS
        No late surrender
        Double on any first 2 cards
        Split to 4 hands

      • My calculations condition returns and probabilities so that the dealer does NOT have blackjack. Are you saying that the final result needs to be unconditioned so that the dealer may have blackjack?

  6. @Pete: Yes. When you compute the sum of ev(i)*p(i), each ev(i) needs to be the “overall” EV for that hand and up card, including the case where the dealer has blackjack. Let’s use the rules you indicated (where the overall EV is -0.00008734733, with no resplitting aces, and CDZ- post-split strategy), and consider the specific example of player 10,6 vs. dealer ace (so that p(i)=64/16575 or about 0.00386124). If I understand your notation correctly, I am guessing that your ev(i) for that hand is -0.52896517978 (from hitting), which assumes that the dealer does *not* have blackjack, right?

    The value for ev(i) should instead be P(bj)*(-1)+(1-P(bj))*(-0.529), where in this case P(bj)=15/49 (the probability of a ten hole card).
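
    Carrying out that arithmetic for this example: ev(i) = (15/49)*(-1) + (34/49)*(-0.52896517978), which is about -0.30612 - 0.36704 = -0.67316.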

  7. @Mike, the ENHC rule wouldn’t be too difficult to incorporate. The limited resplits would take a bit more work– it’s actually *easier* to compute unlimited resplits– basically requiring the same calculations described in the linked paper and implemented in the distribution. But at that point it seems like there is a fidelity mismatch; why not just use the finite-deck calculation (i.e., the existing strategy.exe) with, say, 1000 decks? You can specify limited resplits, and the resulting EVs won’t differ from the infinite deck approximation until the fourth digit or so.

  8. @Mike: Yes, but note that my CA computes composition-dependent strategy as well (by default, at least– the API allows for an arbitrary playing strategy to be specified). At any rate, no matter *what* playing strategy is used, computing exact EVs requires considering all possible *compositions* of hands separately. In other words, to *compute* EVs, we must always consider hand composition, whether the *playing strategy* decision to stand/hit/whatever considers composition or not.

    Even if it can be verified that the Wizard’s numbers and mine both use “composition-dependent strategy,” there is still the potential for differences depending on *what* composition-dependent strategy is being used. (I would be happy to communicate with him, btw, to resolve this if he is available for discussion.) This is where things get tricky, because there is not just one reasonable composition-dependent playing strategy, depending on how we decide to play hands “pre-split” or “post-split.” My approach, referred to as CDZ-, is to compute optimal CD strategy temporarily assuming that we never split any pairs… then to apply *that* strategy to split hands. Note that this does not necessarily yield the best *overall* EV… but no one else computes this either, since it is intractable to do so– the earlier linked thread at blackjackinfo.com discusses this, and I have more detailed discussion archived from the old bjmath.com if you’re interested.

  9. What do you recommend one do when you are trying to determine the total-dependent strategy from composition-dependent results and you have more than one option?

    • If I understand your question correctly, then unfortunately this is in fact a very difficult problem which no one has solved that I know of. That is, suppose that you provide me a complete total-dependent (TD) strategy, specifying the stand/hit/double/etc. action for *every* possible hand total and dealer up card. Then I can *evaluate* that strategy relatively easily, and compute the exact overall EV of playing that strategy.

      But among all possible such complete TD strategies, which one yields the *maximum* resulting overall EV? This is a computationally hard problem. It may seem counter-intuitive, especially since it is actually *easy* to compute such a maximum-EV *composition*-dependent (CD) strategy!

      To see why, consider *how* we compute optimal CD strategy, essentially “bottom-up” through the tree of possible player hands: first, computing the expected value of *standing* on any particular player hand is easy (for the purpose of this discussion, at least 🙂 ). To compute the expected value of *hitting*, we must consider each of the 10 possible new hands obtained by drawing each possible card ace through ten.

      So suppose that we *start* by considering hands that already total hard 21. Then computing E(hit) is easy, because if we draw *any* card, we bust.

      Now consider hands that total hard 20. If we hit, and draw anything but an ace, we bust. And if we draw an ace, we have hard 21… the EV of which we have already computed!

      We then evaluate all hands totaling hard 19, then hard 18, then hard 17, etc. By being careful about the *order* in which we evaluate each composition-dependent hand, we guarantee that we can compute E(hit) in terms of expected values for drawn hands that we have already computed.

      The problem is that this trick doesn’t work for TD strategy. Should we stand or hit on, say, hard 16? That will depend on our strategy for a hard 17 (if we draw an ace)… but how often we *encounter* a hard 17 (of which there are many different possible compositions) depends on our strategy for hitting lower-total hands… which is exactly what we are trying to compute!

      In this sense, computing optimal CD strategy is like a straightforward dynamic programming problem, while TD strategy requires “global” optimization.
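
      To illustrate the easy direction, here is a rough sketch of *evaluating* a given, fixed TD strategy, using an infinite deck and hit/stand only.  The names are hypothetical, and evStand() is a placeholder for the real conditioned stand EVs; the point is that nothing here is circular, because the strategy is an input:

      #include <cstdio>

      // Placeholder stand EV for a hard total; a real CA computes this from
      // the dealer probabilities for a particular up card.
      static double evStand(int total) { return (total >= 17) ? 0.0 : -0.2; }

      // EV of playing a *fixed* total-dependent strategy from a given hard
      // total: hitTable[t] == true means the strategy says to hit hard t.
      static double evaluateTD(int total, const bool hitTable[22]) {
          if (total > 21) return -1.0;                 // busted
          if (!hitTable[total]) return evStand(total); // strategy says stand
          double ev = 0.0;
          for (int card = 1; card <= 10; ++card) {
              double w = (card == 10) ? 4.0 / 13.0 : 1.0 / 13.0;
              ev += w * evaluateTD(total + card, hitTable);
          }
          return ev;
      }

      int main() {
          bool hitTable[22] = {};
          for (int t = 4; t <= 16; ++t) hitTable[t] = true; // "hit to hard 17"
          printf("EV from hard 12 = %.4f\n", evaluateTD(12, hitTable));
      }

      The hard part is not this evaluation, but searching over all possible hitTable[] choices for the one that maximizes the resulting overall EV.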

  10. Would it be terribly difficult to modify your code so that the player can hit split aces? If you could at least identify the locations where the code needs to be modified it would be a great help.

    • That depends on which code you are talking about. That is, at a minimum, the underlying CA engine (blackjack.*) would need to be modified to accurately compute the corresponding expected values. This would involve (1) changing the BJRules interface to *specify* whether hitting split aces is allowed; (2) changing the logic that currently *prevents* hitting split aces. At first glance, this happens in the blocks starting at lines 514, 600, and 992 of blackjack.cpp. And finally, (3) increasing the size of the array holding the player hand subsets in blackjack.h, since there are now more possible subsets of cards that can be dealt.

      If you are doing your own analysis using the engine, then this is all that’s needed. But if you are wanting to modify the game or the strategy calculator, then the corresponding changes would be required to game.cpp and/or strategy.cpp to “forward” the rule changes to the CA engine.

  11. I simply want to modify the engine so it can calculate with or without hitting split aces. So if I understand correctly:

    Blackjack.cpp
    1. Line 514:
    for (int card = 2; card <= 10; ++card)
    becomes
    for (int card = (hitSplitAces ? 1 : 2); card <= 10; ++card)

    2. Line 600:
    … >= splitHands) && (currentHand.count – pairCard * (splitHands – 1) … >= 2 && (pairCard != 1 || numCards == 2));
    Not sure of the extent of changes needed… can you comment?
    Is it just removing the pairCard != 1 condition?

    3. Line 992:
    if (pairCard == 1) {
    value = hand.valueStand[true][upCard];
    becomes
    if (!hitSplitAces && pairCard == 1) {
    value = hand.valueStand[true][upCard];

    Blackjack.h
    Increasing the size of the array holding the player hand subsets: is that playerHands[25941]?  How much bigger does it need to be?

    • Right. Re line 600, that last condition (pairCard != 1 || numCards == 2) can be removed altogether. And I did miss line 965 as well. Re the size of the array, this is something I would do differently if I had it to do over again (use a std container instead of a native array). You don’t have to get the size exactly right; increasing it to 50000 will be plenty, and it’s not a memory hog so you’re not wasting enough to matter.

      One approach to testing I have found handy in the past is to use some *very* small shoe subsets, where you can compute the correct expected values by hand and compare with the output.

  12. For those interested in the details:
    Line 515:
    for (int card = (_hitSplitAces ? 1 : 2); card <= 10; ++card)

    Line 603:
    (numCards >= 2 && (_hitSplitAces || (pairCard != 1 || numCards == 2)));

    Line 966:
    if (_hitSplitAces || pairCard != 1) {

    Line 993:
    if (!_hitSplitAces && pairCard == 1) {

  13. Seems my posts are being garbled…

    Line 603:
    (numCards >= 2 && (_hitSplitAces || (pairCard != 1 || numCards == 2)));

  14. Do you know why the following “code” doesn’t quite compute the correct EVs when the dealer upcard is an ace or 10?

    long double CalculateDoubleDownEv(Hand& player, Hand& dealer, Shoe& shoe)
    {
        long double ddEv = 0;
        for (int card = 1; card <= 10; ++card)
        {
            long double wt;
            if (!shoe.RemoveCardGetWeight(card, wt))
            {
                continue; // no cards of this rank left in the shoe
            }
            player.DealCard(card);
            int total = player.getTotal();
            if (total <= 21)
            {
                ddEv += wt * CalculateStandEv(player, dealer, shoe) * 2;
            }
            else
            {
                ddEv += -2.0 * wt; // lose it all if hand busts
            }
            shoe.RestoreCard(card);
            player.RestoreCard(card);
        }
        return ddEv;
    }

    It works for all starting hands, except for those where the dealer has an ace or 10.  The result probably needs to be conditioned on the dealer not having blackjack, but I can't seem to figure it out.

      • There are a couple of problems. First, the straightforward weighted summation only works for *unconditioned* expected values for standing on the resulting drawn hands. Fortunately, it’s relatively simple to convert values already conditioned on no dealer blackjack to unconditioned values; see BJPlayer::conditionNoBlackjack() in blackjack.cpp, for example (although these are converting in the other direction).

        Second, note that, unless you are playing ENHC, the player doesn’t lose his additional doubling wager if the dealer *does* have blackjack. So that multiplier 2 is only appropriate when the dealer *doesn’t* have blackjack. Again, see BJPlayer::conditionNoBlackjack() for an example of how to handle this (specifically, you can keep the fixed 2 multiplier in the summation, just start with an offset initial expected value instead of just starting with zero).
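
        To spell out the relation: with the peek rule (i.e., not ENHC), a dealer blackjack costs the doubling player only the initial one-unit wager, so the unconditioned value is E = P(bj)*(-1) + (1-P(bj))*E(no bj), just as in the 10,6 vs. ace example in an earlier comment.  And if I have the algebra right (under the simplifying assumption that the player’s drawn card doesn’t change P(bj)), that “offset initial expected value” works out to starting the summation at +P(bj) instead of zero, exactly compensating for the fixed 2 multiplier charging two units instead of one whenever the dealer has blackjack.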

  15. Pingback: Distribution and variance in blackjack | Possibly Wrong

  16. Hi
    Is there a way to calculate the average bet (for optimal play) based on all the information that is available in your code?

    • Maybe, depending on what exactly you mean by “average bet.”  One possible interpretation is, for a single round, how often do you just risk your initial wager, vs. how often do you double down, vs. how often do you split (and how many times), etc.?  Even this can be a difficult question to answer exactly if resplitting is allowed, but if it’s not, then compute_pdf() could be modified to aggregate *risked* money (vs. just the outcome, which is all that’s currently recorded).

      On the other hand, perhaps you mean, if I not only *play* optimally but also *bet* optimally across some specified bet spread, over a specified number of shoes dealt to particular penetration, etc., then this is even harder to answer, since “optimal betting” has a lot of wiggle-room for particular assumptions (are bets constrained to be integral multiples of a base amount, etc.).

      • My interest is in the average wager per round as you describe in the first paragraph.

  17. @Mike, see here for a gist showing how blackjack_pdf.cpp can be modified to compute the entire probability distribution of amount wagered (instead of the distribution of net outcome). As with the net outcome, this is constrained to no resplits, but otherwise you can specify whatever rules you want.

  18. Good afternoon.  Can you publish a version of the game using HSA (hit split aces), as well as whichever of CDP/CDZ- gives the best EV with three splits allowed, in the form of source code that can be compiled?  The code posted on GitHub produces a lot of errors.  Or, as a big favor, please contact me by email.  Thanks.

    • I didn’t see a corresponding issue posted on GitHub; can you describe the compilation errors that you are seeing? What platform and compiler are you using? What version of the Allegro library are you using?

    • Execution time difference between CDZ- and CDP, for otherwise-identical rule variations, is negligible. Since execution time for my implementation is nearly exactly linear in the number of shoe states for which dealer probabilities are computed, the biggest drivers are (1) the number of splits allowed, with SPL3 taking longer than SPL1 (since we need to evaluate more shoe states); followed by (2) the number of decks, with more decks requiring more shoe states to evaluate… but only up to a point, that is, 1 deck is faster than 6 decks, which is in turn barely faster than 8 decks, but after that, no faster than, say, 1000 decks.

  19. @nothingTOsay, both CDZ- and CDP expected values are equally accurate (namely, they are both exact). The values will in general differ, because these are different playing strategies, where CDP allows playing strategy to vary with more complexity than CDZ-, depending on whether your current hand is the result of a split. For example, suppose that CDZ- strategy indicates hitting on 6-2 vs. dealer 5. If so, then you have to always hit on 6-2 vs. dealer 5, no matter whether that 6-2 is the initial hand in the round, or the result of splitting 2s, or the result of splitting 6s. But CDP strategy is allowed to be more complex, so that the player might have to remember to hit 6-2 vs. 5 on the initial deal, but double down if the 6-2 is the result of splitting 2s, etc.

    (And CDP is actually more complex than CDP1, where the latter only depends on whether the current hand is the result of a particular pair split, while CDP is allowed to vary depending on how many additional pair cards have been dealt in the possibly multiple splits during the round.  IMO only CDP1 is useful, both pragmatically, due to the amount of additional complexity of CDP, and theoretically, for reasons that I can discuss in another post.)
