Steady-state probability of Markov chains

Assignment Help Basic Statistics
Reference no: EM13903548

Question 1. The condition in which the probability of one event changes as a result of the occurrence of another, related event is known as:

1. initial probability

2. transition matrix

3. stationary probability

4. conditional probability

Question 2. The steady-state probabilities of a doubly stochastic transition matrix are equal to

1. 1.0

2. 1/(m*n), where m is the number of rows of the matrix and n is the number of columns of the matrix

3. 1/n, where n is the number of columns in the matrix

4. 0.50

5. None of the above
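The doubly stochastic case in Question 2 can be checked numerically: when both the rows and the columns of a transition matrix sum to 1.0, repeatedly multiplying any starting distribution by the matrix converges to the uniform distribution 1/n, where n is the number of states. A minimal pure-Python sketch, using an illustrative matrix of our own (not one from the question):

```python
def step(pi, P):
    # One step of the chain: pi' = pi * P (row vector times matrix).
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

# Doubly stochastic: every row AND every column sums to 1.0.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.3, 0.2, 0.5]]

pi = [1.0, 0.0, 0.0]        # start entirely in state 0
for _ in range(100):        # power iteration toward the steady state
    pi = step(pi, P)

print([round(p, 4) for p in pi])  # -> [0.3333, 0.3333, 0.3333], i.e. 1/n each
```

Starting from any other initial distribution gives the same limit, which is what "steady state" means here.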

Question 3. The Markov state that, under all conditions, will never converge to a steady state is

1. Periodic State

2. Absorbing State

3. Ergodic State

4. Trapping State

5. Transient

Question 4. Steady-state probabilities of Markov chains are

1. States that reach a probability value equal to 1.0 in all cases

2. The convergence to an equilibrium or "steady state" condition, which applies to all Markov chains

3. Obtained where the vector of steady-state probabilities equals the product of that vector and the transition matrix

4. All of the above

5. None of the above
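The fixed-point property in option 3 of Question 4 can be illustrated with a small two-state chain: the steady-state vector pi satisfies pi = pi · P together with the requirement that its entries sum to 1. The transition matrix below is a hypothetical example chosen so the system is easy to solve by hand:

```python
# Two-state chain: pi = pi * P plus pi[0] + pi[1] = 1 gives a linear system.
# For P below: 0.1 * pi0 = 0.5 * pi1  =>  pi0 = 5 * pi1, so pi = (5/6, 1/6).
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = [5/6, 1/6]                       # solved by hand from the two equations

# Multiplying the steady-state vector by P reproduces the same vector.
check = [pi[0] * P[0][0] + pi[1] * P[1][0],
         pi[0] * P[0][1] + pi[1] * P[1][1]]
print(check)  # equals pi up to floating-point rounding: a fixed point
```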

Question 5. A transition matrix is

1. The current states of a system at time t

2. Conditional probabilities that involve moving from one state to another

3. The stationary assumption of a Markov chain

4. An m-by-n matrix of probabilities

5. none of the above

Question 6. A transition matrix in which the probabilities in each column also sum to 1.0 is referred to as

2. Conditional Probability Matrix

3. Stationary Matrix

4. Double Stochastic Transition Matrix

5. None of the above

Question 7. Which of the following is true regarding the Markov analysis methodology?

1. States of nature are outcomes of a process (machine operating or broken, percentage of customers buying products A and B, etc.)

2. There exists an initial probability distribution associated with the states of nature (100% operational and 0% broken; 80% of customers buy product A and 20% buy product B)

3. There are also transition (or conditional) probabilities of moving from one state to another (represented by the transition matrix)

4. All of the above

5. None of the above

Question 8. A Markov Chain is

1. A discrete-time stochastic process that describes the relation between the random variables at successive points in time X0, X1, X2, ...

2. A continuous-time stochastic process in which the state of the system can be viewed at any time, not just at discrete instants in time

3. A probability assessment that is not conditional

4. All of the above

5. None of the above
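The discrete-time description in option 1 of Question 8 can be sketched as a simulation: each X(t+1) is drawn using only the row of the transition matrix for the current state X(t), never the earlier history. The two-state matrix and state labels below are assumptions for illustration:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# P[i][j] = P(next state = j | current state = i)
P = [[0.8, 0.2],    # from state 0 ("operating")
     [0.6, 0.4]]    # from state 1 ("broken")
states = [0, 1]

x = 0               # X0: start in the operating state
path = [x]
for _ in range(10):
    # The next state depends only on the current state's row of P.
    x = random.choices(states, weights=P[x])[0]
    path.append(x)

print(path)         # a sample trajectory X0..X10 of the chain
```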

Question 9. The condition in which the probability relating the next period's state to the current state does not change over time is referred to as

1. Transition Matrix

2. Stationary Assumption

3. Markov Process

4. Markov Chain

5. Initial Probability Distribution

Question 10. A process whereby the input variables are random and are defined by distributions rather than single numbers is known as a

1. Markov Process

2. Stochastic Process

3. Deterministic Process

4. All of the above

5. None of the above

Question 11. Which of the following is a true statement regarding Markov analysis?

1. Markov analysis is a technique that involves predicting the probabilities of future occurrences

2. Markov Analysis is a stochastic process in which current states of a system depend on previous states

3. The objective of Markov Analysis is to predict future states of nature given the probabilities of existing states

4. All of the above are true regarding Markov Analysis

5. None of the above is true regarding Markov analysis

Question 12. Which of the following are Markov properties?

1. The states of nature are mutually exclusive and collectively exhaustive

2. Each entry in the transition matrix is a conditional probability that is nonnegative in value

3. The probabilities in the transition matrix sum to a value of 1.0 along each row of the matrix

4. All of the above

5. None of the above
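The properties listed in Question 12 translate directly into a validity check for a transition matrix: the matrix is square, every entry is a nonnegative conditional probability, and each row sums to 1.0. A sketch of such a check (the helper name and tolerance are our own choices):

```python
def is_valid_transition_matrix(P, tol=1e-9):
    # Markov properties: square matrix, nonnegative entries, row sums of 1.0.
    n = len(P)
    if any(len(row) != n for row in P):
        return False                      # not square
    if any(p < 0 for row in P for p in row):
        return False                      # negative "probability"
    return all(abs(sum(row) - 1.0) < tol for row in P)

print(is_valid_transition_matrix([[0.7, 0.3], [0.4, 0.6]]))   # True
print(is_valid_transition_matrix([[0.7, 0.4], [0.4, 0.6]]))   # False: first row sums to 1.1
```

The tolerance guards against floating-point rounding (e.g. 0.7 + 0.3 is not exactly 1.0 in binary floating point).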

Question 13. The Markov state characterized by all zeros in the retention cells (the diagonal of the matrix) and all ones or zeros in the non-retention cells is referred to as

1. Periodic State

2. Absorbing State

3. Trapping State

4. Ergodic State

5. Transient
