Reference no: EM131005756
1. The state of a process changes daily according to a two-state Markov chain. If the process is in state i during one day, then it is in state j the following day with probability Pi,j, where
P0,0 = 0.4, P0,1 = 0.6, P1,0 = 0.2, P1,1 = 0.8
Every day a message is sent. If the state of the Markov chain that day is i, then the message sent is "good" with probability pi and is "bad" with probability qi = 1 - pi, for i = 0, 1.
(a) If the process is in state 0 on Monday, what is the probability that a good message is sent on Tuesday?
(b) If the process is in state 0 on Monday, what is the probability that a good message is sent on Friday?
(c) In the long run, what proportion of messages are good?
(d) Let Yn equal 1 if a good message is sent on day n and let it equal 2 otherwise. Is {Yn, n ≥ 1} a Markov chain? If so, give its transition probability matrix. If not, briefly explain why not.
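Parts (a)-(c) can be checked numerically. The message probabilities p0 and p1 are not given in the problem, so the values below (p0 = 0.9, p1 = 0.3) are purely illustrative assumptions; only the transition matrix comes from the problem statement. A minimal sketch:

```python
def mat_mul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_pow(P, n):
    """n-th power of a 2x2 transition matrix (n >= 1)."""
    R = P
    for _ in range(n - 1):
        R = mat_mul(R, P)
    return R

P = [[0.4, 0.6],
     [0.2, 0.8]]
p = [0.9, 0.3]  # hypothetical values for p0, p1 (not in the problem)

# (a) Tuesday is one step after Monday, so condition on row 0 of P.
good_tue = P[0][0] * p[0] + P[0][1] * p[1]

# (b) Friday is four steps after Monday, so use row 0 of P^4.
P4 = mat_pow(P, 4)
good_fri = P4[0][0] * p[0] + P4[0][1] * p[1]

# (c) For a two-state chain the stationary distribution is
#     pi0 = P(1,0) / (P(0,1) + P(1,0)),  pi1 = 1 - pi0.
pi0 = P[1][0] / (P[0][1] + P[1][0])
pi1 = 1 - pi0
good_long_run = pi0 * p[0] + pi1 * p[1]

print(good_tue, good_fri, good_long_run)
```

With these illustrative parameters, the one-step answer is 0.4 p0 + 0.6 p1, and the long-run proportion of good messages is 0.25 p0 + 0.75 p1, since the stationary distribution of P is (0.25, 0.75).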