These are chat archives for **drvinceknight/Year_3_game_theory_course**

This repository contains source code for a game theory course.

@drvinceknight when looking at the transition probabilities for stochastic games, am I right in assuming that the first number is the prob of staying in that game and the second is the prob of moving on to the next?

cool, and also: if all of the boxes are connected, does that mean that they are different strategies for the same game?

@drvinceknight in your notes you have a stochastic game where one of the states has a payoff of (0,0) and a transition probability of (0,1) which as far as I understand it would mean that players would definitely move on from this game once they have played it once. Why is it then that you refer to it as an absorbing state?

In the notes the game that has a payoff of (0,0) is the second game.

So from the second game the transitions (0,1) mean that you always go to the second game (i.e. you stay where you are). So once you’re there you’re absorbed.

> is the prob of staying in that game and the second is the prob of moving on to the next?

Sorry, I only scanned your message: that’s not correct.

If $\pi$ is the transition vector, then $\pi_i$ is the probability of transitioning to game $i$.
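A tiny sketch of that interpretation (the three-game setup and the numbers in the vector are invented for illustration): given the transition vector $\pi$ for the current game, the index of the next game is drawn with probability $\pi_i$ of landing in game $i$.

```python
import random

# Hypothetical transition vector for a three-game stochastic game:
# pi[i] is the probability of transitioning to game i (0-indexed here).
pi = [0.5, 0.3, 0.2]

def next_game(pi):
    """Sample the index of the next game from the transition vector pi."""
    return random.choices(range(len(pi)), weights=pi)[0]

# Sampling many transitions recovers the vector as empirical frequencies.
counts = [0, 0, 0]
for _ in range(10_000):
    counts[next_game(pi)] += 1
print([c / 10_000 for c in counts])  # roughly [0.5, 0.3, 0.2]
```

So with three games the vector has three entries, one per destination game, and they sum to one.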

oh right so if there were three games then the transition probability vector would have three parts?

So (0,1) just means you always go to the second game.

As the utility at that second game is (0,0) basically nothing changes and the game is effectively ‘over’.
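A short simulation of what “absorbed” looks like (the transition vector for the first game and its stage payoff are made up for illustration; the second game’s (0, 1) transitions and (0, 0) payoff match the notes): once play reaches the second game, the transition vector always sends you back to it, and every payoff from then on is (0, 0).

```python
import random

# Hypothetical two-game stochastic game. transitions[g] is the
# transition vector used after playing game g (games indexed 0 and 1).
transitions = {
    0: [0.25, 0.75],  # from game 0: stay with prob 0.25, move on with 0.75
    1: [0.0, 1.0],    # from game 1: always return to game 1 -> absorbing
}
payoffs = {0: (2, 2), 1: (0, 0)}  # stage payoffs; game 0's is illustrative

random.seed(1)
game, history = 0, []
for _ in range(12):
    history.append(payoffs[game])
    game = random.choices([0, 1], weights=transitions[game])[0]

print(history)  # after the first visit to game 1, every payoff is (0, 0)
```

Nothing forces the play to stop: the process keeps running, but all remaining payoffs are zero, which is why the state is called absorbing even though you “keep playing”.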

is that why when working out the equilibria for these games you only look at the probability of staying in the same game?

thanks have been trying to get my head around that for a while today
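One way to see why only the probability of staying matters, as a sketch (the payoff matrix and the value of `p` are invented, and for simplicity the probability of staying in the first game is assumed constant across action pairs, which is a simplification of the course setting): since the absorbing game pays (0, 0), the total expected utility is the game-1 stage payoff times the expected number of visits to game 1, which is a geometric series.

```python
# Expected number of plays of game 1 before absorption, when the
# probability of staying in game 1 is p each round:
#   1 + p + p^2 + ... = 1 / (1 - p)
def expected_visits(p):
    assert 0 <= p < 1
    return 1 / (1 - p)

# Illustrative stage payoff matrix for the row player in game 1.
A = [[3, 1],
     [0, 2]]

p = 0.25  # hypothetical probability of staying in game 1
# Effective payoff matrix for the whole stochastic game: each entry is
# scaled by the expected number of times game 1 gets played.
A_eff = [[a * expected_visits(p) for a in row] for row in A]
print(A_eff)
```

With a constant `p` this just rescales the matrix by a positive factor, which leaves the equilibria unchanged; when the transition probabilities depend on the action pair (as in the notes), different entries are scaled differently, which is exactly why the probability of staying enters the equilibrium computation.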