##### Reference no: EM13909545

Consider a Markov decision problem with M states in which some state, say state 1, is inherently reachable from each other state.

**(a)** Show that there must be some other state, say state 2, and some decision, $k_2$, such that $P^{(k_2)}_{21} > 0$.

**(b)** Show that there must be some other state, say state 3, and some decision, $k_3$, such that either $P^{(k_3)}_{31} > 0$ or $P^{(k_3)}_{32} > 0$.

**(c)** Assume, for some $i$ and some set of decisions $k_2, \ldots, k_i$, that for each $j$, $2 \le j \le i$, we have $P^{(k_j)}_{jl} > 0$ for some $l < j$ (i.e., each state from 2 to $i$ has a non-zero transition probability to some lower-numbered state). Show that there is some other state (other than $1, \ldots, i$), say $i+1$, and some decision $k_{i+1}$ such that $P^{(k_{i+1})}_{i+1,\,l} > 0$ for some $l \le i$.

**(d)** Use (a), (b), and (c) to observe that there is a stationary policy $\boldsymbol{k} = (k_1, \ldots, k_M)$ for which state 1 is accessible from each other state.
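The inductive construction underlying (a)-(c) can be sketched as a greedy search: starting from the set $\{1\}$, repeatedly find an unreached state with some decision giving a positive transition into the reached set, and fix that decision for it. The sketch below is illustrative only; the nested-dict representation `P[j][k][l]` (probability of moving from state `j` to state `l` under decision `k`) is an assumed encoding, not the book's notation.

```python
def find_next(P, reached):
    """Part (c) step: return (j, k) where j is an unreached state and
    decision k gives a positive transition into the reached set,
    or None if no such state exists."""
    for j in P:
        if j in reached:
            continue
        for k, row in P[j].items():
            if any(row.get(l, 0.0) > 0.0 for l in reached):
                return j, k
    return None


def build_policy(P):
    """Inductively build a stationary policy under which state 1 is
    accessible from every other state (mirrors parts (a)-(d)).
    State 1 itself needs no decision here, so k_1 is left arbitrary."""
    reached = {1}   # states known to reach state 1 under the chosen decisions
    policy = {}     # policy[j] = chosen decision k_j
    while len(reached) < len(P):
        nxt = find_next(P, reached)
        if nxt is None:
            raise ValueError("state 1 is not accessible from every state")
        j, k = nxt
        policy[j] = k
        reached.add(j)
    return policy


# Hypothetical 3-state example: state 2 can reach 1 directly under
# decision 0; state 3 reaches 1 only via decision 1 (through state 2).
P = {1: {0: {1: 1.0}},
     2: {0: {1: 0.3, 2: 0.7}},
     3: {0: {3: 1.0}, 1: {2: 1.0}}}
print(build_policy(P))  # {2: 0, 3: 1}
```

The loop terminates after at most $M-1$ extensions, which is exactly the induction in (c): each pass adds one state whose chosen decision moves it, with positive probability, toward the already-reached set.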

Textbook: *Stochastic Processes: Theory for Applications* by Robert G. Gallager.