Differences between version 3 and previous revision of DecisionTheory.


Newer page: version 3 Last edited on Sunday, December 29, 2002 4:01:46 pm. by DavidLucifer
Older page: version 2 Last edited on Sunday, December 29, 2002 3:35:08 pm. by DavidLucifer
@@ -1,5 +1,16 @@
 See [definition | http://pespmc1.vub.ac.be/ASC/DECISI_THEOR.html] at PrincipiaCybernetica. 
+ 
+Here is a fairly simple example of analyzing a decision under uncertainty. Say you are given the opportunity to play a street game. The cost of playing is $10. You draw a card from a deck and the payoff depends on what card you draw. If you draw a black card you get nothing. If you draw a heart, you get $10 (your money back). If you draw a diamond, you get $20 (double your money back). The decision is whether or not to play.  
+ 
+To apply decision theory to the question of whether or not to play, you calculate the expected payoff of each alternative. The expected payoff of playing is the sum, over all possible outcomes, of each outcome's payoff multiplied by its probability. There are 3 possible outcomes of playing the game:  
+* draw a black card (probability = 0.5, payoff = 0, expected payoff = 0.5 x 0 = $0.00)  
+* draw a heart (probability = 0.25, payoff = 10, expected payoff = 0.25 x 10 = $2.50)  
+* draw a diamond (probability = 0.25, payoff = 20, expected payoff = 0.25 x 20 = $5.00)  
+ 
+The expected payoff of the game is the sum of the calculations above, minus the $10 cost of playing, i.e. $0.00 + $2.50 + $5.00 - $10.00 = -$2.50.  
+ 
+Now compare that to the expected payoff of not playing the game ($0: no risk, no reward). The expected payoff of playing the game (-$2.50) is lower, therefore you should not play the game (assuming, of course, that you want to maximize your expected payoff).  
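The calculation above can be sketched in a few lines of Python. The probabilities assume a standard 52-card deck (26 black cards, 13 hearts, 13 diamonds); the dollar amounts are the payoffs from the game description.

```python
# Expected value of the street card game described above.
# Probabilities assume a standard 52-card deck:
# black = 26/52 = 0.50, heart = 13/52 = 0.25, diamond = 13/52 = 0.25.
cost = 10.0  # cost of playing, in dollars

outcomes = [
    (0.50, 0.0),   # black card: no payoff
    (0.25, 10.0),  # heart: $10 (your money back)
    (0.25, 20.0),  # diamond: $20 (double your money back)
]

# Sum of payoff x probability over all outcomes: $7.50
expected_payoff = sum(p * payoff for p, payoff in outcomes)

# Net expected value of playing, after the $10 cost: -$2.50
expected_value = expected_payoff - cost

# Not playing has an expected value of $0, so decline when playing is worse.
should_play = expected_value > 0.0
print(expected_value, should_play)
```

Comparing `expected_value` against the $0 of not playing gives the decision directly: here it is negative, so a payoff-maximizing player declines.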
  
 ---- 
  
 See other InterestingMemes.