Lately I have been revisiting– again– the problem of efficiently calculating strategy and expected values in blackjack. (See this previous post for a description of the software and download links.) The last improvement was one of precision, involving an update to the algorithm for evaluating pair splits to produce exact results for several variants of re-splitting rules and playing strategies.

This time it’s about speed. When computing the overall expected value of a round, almost all of the execution time is spent in a bottleneck that is easy to describe: computing the probabilities of the possible outcomes of the dealer’s hand (i.e., stand on 17, 18, 19, 20, or 21; or bust; or blackjack), for each of many different subsets of cards in the shoe. This post describes an improved algorithm for doing this, as well as some interesting unanswered questions about how close to optimally efficient it is.

(A valid question at this point is: why bother? The current version of the code already takes less than a couple of seconds on recent hardware. At least, on PC hardware. In the process of porting the code as part of an Android app, I learned the hard way just how slow embedded systems can be. On my dual-core, 1.2 GHz Droid Razr Maxx, the same code runs nearly *50 times slower* than on my 2.5 GHz laptop. So it is definitely still useful to try to squeeze out additional performance if possible.)

In principle, this problem is relatively easy to solve. Following is a straightforward recursive implementation that evaluates every possible sequence of cards drawn to the dealer’s hand. Note that for clarity in all of the following discussion, we assume that the dealer stands on all hands totaling 17; the common variant where the dealer hits soft 17 is easily handled.

```cpp
void computeRecursive(double p) {
    if (count < 17) {
        // Dealer must draw another card; branch on each possible rank.
        for (int card = 1; card <= 10; ++card) {
            if (cards[card] > 0) {
                double p_card = (double)cards[card] / totalCards;
                deal(card);   // updates cards[], totalCards, and count
                computeRecursive(p * p_card);
                undeal(card); // restores state for the next branch
            }
        }
    } else if (count <= 21) {
        probabilityCount[count] += p; // dealer stands on 17 through 21
    } else {
        probabilityBust += p;
    }
}
```

As we consider various improvements to this simple algorithm, a useful “back-of-the-envelope” metric for execution time will be the number of floating-point multiplications and divisions involved. The recursive approach above requires 60,470 multiplications– and the same number of divisions– to visit every possible dealer hand starting from each possible up card (ace through ten).

The problem with this approach is that it revisits most hands, when viewed as *unordered subsets* of cards, multiple times, corresponding to the different orders in which the cards may be drawn. For example, starting with a 6 up card, the dealer may draw a 4, 5, and finally another 6, yielding the subset {4, 5, 6, 6} for a total of 21. But the dealer also could have drawn a 4, 6, *then* a 5, yielding the same subset and total… and more importantly, each of these two possible sequences of draws has the *same probability* of occurring.

So we can improve the algorithm by enumerating the 1676 different (unordered) subsets of completed, non-busted, non-blackjack dealer hands ahead of time, including the number of possible orderings of cards for each. To compute the probabilities for a given shoe of cards, we evaluate each dealer hand just once, computing the probability of *any one convenient ordering* of cards in the hand (since each ordering has the same probability), and multiplying by the number of possible *valid* orderings of those cards.

(Note that the number of valid orderings is not a simple multinomial coefficient, and in general depends on which card in the dealer’s hand was the up card. Consider the above example of the dealer drawing {4, 5, 6} to a 6 up card. Two of the six possible sequences of draws are invalid– (6, 5, 6, 4) and (6, 6, 5, 4)– since the dealer would stand on 17 and not draw the final 4.)

Let’s make what we have so far more precise by introducing some notation. We identify a dealer hand as an unordered subset of cards represented by a 10-vector $h = (h_1, h_2, \ldots, h_{10})$, where $h_i$ indicates the number of cards of rank $i$ in the hand (rank 1 is an ace, and rank 10 is any ten-valued card). For example, the hand $\{4, 5, 6, 6\}$ described above is represented by $h = (0, 0, 0, 1, 1, 2, 0, 0, 0, 0)$. Let

$$H = \bigcup_{t=17}^{21} H_t$$

be the set of all 1676 possible non-busted, non-blackjack dealer hands, where $H_t$ contains the hands with total $t$. If $s = (s_1, s_2, \ldots, s_{10})$ is a shoe also represented as a 10-vector, then the probability that the dealer starts with up card $u$ and draws to the hand $h \in H_t$ from that shoe is given by

$$P(h; u, s) = m(h, u) \, \frac{\prod_{i=1}^{10} (s_i)_{h_i}}{(n)_{|h|}}$$

where $n = \sum_i s_i$ is the total number of cards in the shoe, $|h| = \sum_i h_i$ is the number of cards in the hand, $(x)_k = x (x-1) \cdots (x-k+1)$ is the falling factorial, and $m(h, u)$ is the “multiplier,” or number of valid orderings of cards in $h$ starting from up card $u$.

Even performing the above calculation “directly,” we have already achieved a significant speedup, requiring just 12,756 multiplications and 10,676 divisions in total– about a five-fold improvement over the recursive algorithm.

However, this direct calculation still involves a lot of redundancy. To see this, consider the partial order, defined by inclusion, on the set of *all* 10-vectors of card counts (i.e., all possible subsets of drawn cards), and the directed acyclic graph describing the corresponding covering relation. The empty set is the “root” minimum element in this graph, and following an edge corresponds to drawing a single additional card.

In fact, this graph gives us the recipe for computing the probability of any particular sequence of draws to a completed dealer hand. Starting with a probability value of 1, follow any path from the root to the desired hand, and for each edge along the path, multiply by the conditional probability of drawing the corresponding additional card to the hand. There are many different possible paths to the same hand– each corresponding to a particular order in which the cards are drawn– but as before, *the choice of path does not matter*. The probability of a hand is independent of the choice of path leading to it.

The key observation is that, when performing this calculation for each of the 1676 completed dealer hands, there is significant overlap among the 1676 paths from the root to each of those hands. Since each edge corresponds to a multiplication (more on this shortly), minimizing the total number of calculations is equivalent to minimizing the number of edges in a connected subgraph spanning the 1676 completed dealer hands.

This is a Steiner tree problem… which unfortunately is NP-complete even for many restricted classes of graphs. However, this particular graph has a *lot* of structure, structure which may be exploitable to yield an algorithm for efficiently computing the desired tree of minimum size.

I say “may,” because I am almost certain that the tree that I came up with is *not* optimal; it required a couple of iterations and some *ad hoc* massaging until I couldn’t trim it down any further. The following figure shows a zoomed-in view of the top levels of the tree. Note that while not every vertex is a completed dealer hand, every *leaf* is.

And the next figure shows all 2034 vertices of the entire tree. The 1676 dealer hands are shown in red, while the blue vertices are the “extra” vertices (of which we want as few as possible) on the path to one or more red vertices.

So, how much performance improvement does this buy us? First, recall that I said that each edge in the tree corresponds to a multiplication. That wasn’t quite accurate: since each edge is a conditional probability, it also involves a ratio (i.e., a division) that is a function of the number of cards of a particular rank remaining in the shoe, as well as of the *total* number of cards remaining. However, although there are 2033 edges in the tree, it turns out that they correspond to just 217 distinct conditional probabilities.

The result is an algorithm requiring, in total, just 5779 multiplications and 267 divisions– almost *another* four-fold speedup over the previous improvement. The latest version of the software now computes exact overall expected value and complete composition-dependent strategy for all possible player hands in about half a second in the worst case on typical hardware. At this point, it starts to become feasible to “count cards” *perfectly*– for both playing *and* wagering– at least, if electronic devices were allowed at the table.

Pingback: Efficiency of card counting in blackjack (Part 1) | Possibly Wrong

I came across a paper by John Nairn, an engineer, from ’08 giving what he believed were the 1st perfect pair-splitting results for a full 4 splits. He claims it took 45 days for runs over a few rulesets. You appear to have done several orders of magnitude better than this. Are you familiar with Nairn’s paper and can you confirm that this impression is correct? http://www.cof.orst.edu/cof/wse/faculty/Nairn/papers/splitting.pdf

Yes– Nairn and I exchanged emails back in 2011 about this, when I was working on the splitting calculations referenced in the link at the beginning of this post. (Actually, the main focus was on one of his references for comparison, a study by Manson, Barr, and Goodnight, which contained values that disagreed with his results. I had used this paper as a reference as well back in 1999, and found the same discrepancy, which turned out to be a flaw in their algorithm that incorrectly conditioned values on the dealer not having blackjack. I found this flaw… because my first CA software made the same mistake!)

You are correct in your impression that the current state of the art has improved significantly beyond this work, in at least two specific areas: one is the efficiency of calculation of dealer probabilities (which is the subject of this post), and the other is the recurrence (described in the paper linked at the start of this post) allowing exact calculation of resplits without the “full” enumeration done by Nairn.

It isn’t clear whether *my* software represents the state of the art, though. Granted, I don’t know of any algorithms that are faster– having compared performance with several other developers’ CAs in various blackjack-related forums, mine is faster by roughly an order of magnitude. But blackjack is (or can be) a business, and there may be someone out there with something faster who is less willing to share :).

Wow, ok, thanks for the quick response. I thought I could improve on Nairn’s algorithm, but there’s no way I’m improving on what you’re doing here. I’m going to stare at this a while and see what I can learn. Thanks 🙂

Pingback: Distribution and variance in blackjack | Possibly Wrong