Had an interesting mathematics argument in psych class today. Here's the problem: you have a binary, zero-sum game. At the start of each round, you put X dollars into the pot and a fair coin is flipped. Heads, you lose your stake; tails, you get back 2X, for a net gain of X. Is there a way to do better than break even on average?

Betting random amounts doesn't help, and if you always bet the same amount, you will obviously average 0. So let's make an assumption: you have infinite capital. (We'll deal with this later.)

Here's the trick: every time you lose, double your bet. The first time you win, stop playing. (At which point you can start over if you want.) You start with 0 dollars net gain and infinite dollars in your capital pool.

First bet ($1): you either win and gain a dollar (50% chance), or lose and bet $2.
Second bet ($2): win and gain -1 + 2 = 1 dollar (25% chance overall), or lose and bet $4.
Third bet ($4): win and gain -1 - 2 + 4 = 1 dollar (12.5% chance overall), or lose and bet $8.

See the trend here? You have a 50% chance of winning a dollar, then another 25% chance of winning a dollar, then another 12.5%, and so on. As long as you can keep betting, you win a dollar with probability 1.

Now what if you don't have infinite capital? What if you have to stop sometime? Let's say you have enough money for k bets. There is a (1/2)^k = 2^(-k) chance of losing every bet, which costs you 1 + 2 + ... + 2^(k-1) = 2^k - 1 dollars, and a 1 - 2^(-k) chance of winning 1 dollar. Expected winnings:

-(2^(-k) * (2^k - 1)) + (1 - 2^(-k)) * 1 = -(1 - 2^(-k)) + (1 - 2^(-k)) = 0.

It's kind of like an inverse lottery: you have a very good chance of winning a dollar, and a very slim chance of losing a lot of dollars. Depending on your utility function for money (which, for most humans, is certainly not linear), you may in fact find a greater expected utility playing this bet with a finite k, even though the expected monetary value is 0.
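The finite-capital case is easy to check numerically. Here's a quick Monte Carlo sketch (the function name, trial count, and k = 10 are my own choices for illustration): every round ends either up $1 or down 2^k - 1 dollars, and the average hovers around 0.

```python
import random

def martingale_round(max_bets, rng):
    """Play one martingale round: bet $1, double after each loss,
    stop after the first win or after max_bets losing bets in a row."""
    bet = 1
    net = 0
    for _ in range(max_bets):
        net -= bet                  # stake goes into the pot
        if rng.random() < 0.5:      # tails: take back double the stake
            return net + 2 * bet    # always works out to net +1
        bet *= 2                    # heads: lost the stake, double up
    return net                      # out of capital: down 2^k - 1 dollars

rng = random.Random(42)
k = 10
results = [martingale_round(k, rng) for _ in range(100_000)]
wins = sum(1 for r in results if r == 1)
print(f"P(win $1) ~ {wins / len(results):.4f}   (exact: {1 - 2**-k:.4f})")
print(f"mean result ~ {sum(results) / len(results):+.3f}   (exact: 0)")
```

With k = 10 you win $1 about 99.9% of the time, but the rare losing streak costs $1023, so the long-run average still comes out to zero.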