These are chat archives for drvinceknight/Year_3_game_theory_course

10th
May 2016
@Huaraz2
May 10 2016 10:21
@drvinceknight when looking at the transition probabilities for stochastic games am I right in assuming that the first number is the prob of staying in that game and the second is the prob of moving on to the next?
Vince Knight
@drvinceknight
May 10 2016 10:21
Yes.
@Huaraz2
May 10 2016 10:23
cool and also if all of the boxes are connected does that mean that they are different strategies for the same game?
@Huaraz2
May 10 2016 10:38
@drvinceknight in your notes you have a stochastic game where one of the states has a payoff of (0,0) and a transition probability of (0,1) which as far as I understand it would mean that players would definitely move on from this game once they have played it once. Why is it then that you refer to it as an absorbing state?
Vince Knight
@drvinceknight
May 10 2016 10:40
In the notes the game that has a payoff of (0,0) is the second game.
So the transitions (0,1) mean that you always go to the second game (which is the same game). So once you’re there you’re absorbed.

is the prob of staying in that game and the second is the prob of moving on to the next?

Sorry, I scanned: that’s not correct.

If $\pi$ is the transition vector, then $\pi_i$ is the probability of transitioning to game $i$.
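That interpretation of the transition vector can be sketched in a few lines of Python (the two-game setup and the probabilities here are illustrative, not the exact game from the notes):

```python
import random

# transitions[g] is the transition vector pi for game g:
# pi[i] is the probability of moving to game i after playing game g.
transitions = {
    0: [0.5, 0.5],  # from game 1: stay with prob 1/2, move to game 2 with prob 1/2
    1: [0.0, 1.0],  # from game 2: always go to game 2 (itself) -- absorbing
}

def next_game(current):
    """Sample the index of the next game according to the transition vector."""
    pi = transitions[current]
    return random.choices(range(len(pi)), weights=pi)[0]
```

With three games, each transition vector would simply have three entries summing to one.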
@Huaraz2
May 10 2016 10:42
oh right so if there were three games then the transition probability vector would have three parts?
Vince Knight
@drvinceknight
May 10 2016 10:49
Yup.
So (0,1) just means you always go to the second game.
@Huaraz2
May 10 2016 10:50
thanks :+1: this is making a lot of sense now
Vince Knight
@drvinceknight
May 10 2016 10:50
As the utility at that second game is (0,0), basically nothing changes and the game is effectively ‘over’.
@Huaraz2
May 10 2016 10:51
is that why when working out the equilibria for these games you only look at the probability of staying in the same game?
Vince Knight
@drvinceknight
May 10 2016 10:51
Yes.
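One way to see why only the probability of staying matters (a sketch, assuming a single game that repeats with probability $p$ and otherwise exits to an absorbing game paying (0,0)): if each play yields payoff $r$, the expected total payoff $v$ satisfies $v = r + pv$, so $v = r/(1-p)$. A tiny numeric check:

```python
def expected_total(r, p, rounds=10_000):
    """Approximate r / (1 - p) by summing r * p**k over many rounds."""
    return sum(r * p**k for k in range(rounds))
```

For example, with $r = 1$ and $p = 1/2$ the sum converges to $2 = 1/(1 - 1/2)$.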