Date: 2024-07-04
MA 4704 STOCHASTIC PROCESS
Wanchunzi YU
Department of Mathematics
Bridgewater State University
wyu@bridgew.edu
Contents
1 Chapter 1. Warming Up
1.1 Sample Space, Events and Probability
1.2 Independence and Conditioning
1.3 Discrete Random Variables
1 Chapter 1. Warming Up
1.1 Sample Space, Events and Probability
Some notations:
• Set: Ω
• Subset: A,B
• Union: A∪B
• Intersection: A∩B
• Complement: Ac, A¯,CA
• A + B: A and B are disjoint, so A + B represents the union. Similarly, ∑_{k=1}^{∞} A_k is used for ∪_{k=1}^{∞} A_k only when the A_k's are pairwise disjoint.
• Indicator function: The indicator function of the subset A ⊆ Ω is the function I_A : Ω → {0,1} defined by
I_A(ω) = 1 if ω ∈ A,
I_A(ω) = 0 if ω ∉ A.
• Outcome: Random phenomena are observed by means of experiments. Each experiment
results in an outcome ω .
• Sample Space Ω: The collection of all possible outcomes ω is called the sample space Ω.
• Event: Any subset A of the sample space Ω represents some event.
Example: Toss a die. The experiment consists in tossing a die once.
The possible outcomes are ω = 1, 2, ··· , 6 and the sample space is the set Ω = {1,2,3,4,5,6}.
The subset A = {1,3,5} is the event "the result is odd".
Example: Head or tails. The experiment is an infinite succession of coin tosses.
One can take for the sample space the collection of all sequences ω = {x_n}_{n≥1}, where x_n = 1 or 0, depending on whether the n-th toss results in heads or tails.
The subset A= {ω : xk = 1 for k = 1 to 1000} is a lucky event for anyone betting on heads.
Notation: the sets A_1, A_2, ··· form a partition of Ω if ∑_{k=1}^{∞} A_k = Ω. We say that the events A_1, A_2, ··· are mutually exclusive and exhaustive. They are exhaustive in the sense that any outcome ω realizes at least one among them.
The Probability Rules
The probability P(A) of an event A ∈ F measures the likelihood of its occurrence. The probability rules include 3 basic axioms of probability:
A probability on (Ω, F) is a mapping P : F → R such that
(1) 0 ≤ P(A) ≤ 1 for all A ∈ F,
(2) P(Ω) = 1, and
(3) P(∑_{k=1}^{∞} A_k) = ∑_{k=1}^{∞} P(A_k) for all sequences {A_k}_{k≥1} of pairwise disjoint events in F.
The triple (Ω, F, P) is called a probability space, or probability model.
Example: Toss a die. An event A is a subset of Ω = {1,2,3,4,5,6}. The formula
P(A) = |A| / 6,
where |A| is the cardinality of A (the number of elements in A), defines a probability P.
Example: Heads or Tails. Choose a probability P such that for any event of the form A = {x_1 = a_1, ··· , x_n = a_n}, where a_1, a_2, ··· , a_n are in {0,1},
P(A) = 1 / 2^n.
(4) For any event A, P(A^c) = 1 − P(A) and P(∅) = 0.
(5) Probability is monotone, that is, for any events A and B,
A ⊆ B ⇒ P(A) ≤ P(B).
(6) For any sequence A_1, A_2, ··· of events,
P(∪_{k=1}^{∞} A_k) ≤ ∑_{k=1}^{∞} P(A_k).
1.2 Independence and Conditioning
Definition 1.1. (Independent). Two events A and B are called independent if and only if
P(A∩B) = P(A)P(B)
A family {A_n}_{n∈N} of events is called independent if for any finite set of indices i_1 < ··· < i_r, where i_j ∈ N (1 ≤ j ≤ r),
P(A_{i_1} ∩ A_{i_2} ∩ ··· ∩ A_{i_r}) = P(A_{i_1}) P(A_{i_2}) ··· P(A_{i_r}).
One also says that the A_n's (n ∈ N) are jointly independent.
Definition 1.2. (Conditional probability). The conditional probability of A given B is the number
P(A|B) = P(A ∩ B) / P(B),
defined when P(B) > 0.
If A and B are independent, then P(A|B) = P(A).
Theorem 1.3. (Bayes’ Rule). With P(A)> 0, we have the Bayes’ rule of retrodiction:
P(B|A) = P(A|B) P(B) / P(A).
(Bayes' Rule of Total Cases). Let B_1, B_2, ··· be events forming a partition of Ω. Then for any event A, we have the Bayes rule of total cases:
P(A) = ∑_{i=1}^{∞} P(A|B_i) P(B_i).
(Bayes Sequential Formula). For any sequence of events A_1, ··· , A_n, we have the Bayes sequential formula:
P(∩_{i=1}^{n} A_i) = P(A_1) P(A_2|A_1) P(A_3|A_1 ∩ A_2) ··· P(A_n | ∩_{i=1}^{n−1} A_i).
Example: Should we always believe doctors? Doctors apply a test that gives a positive result in
99% of the cases where the patient is affected by the disease. However, it happens in 2% of the
cases that a healthy patient has a positive test. Statistical data show that one individual out of 1000
has the disease. What is the probability that a patient with a positive test is affected by the disease?
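The answer follows from the two Bayes rules above. A minimal numerical sketch (the variable names are my own, not from the notes):

```python
# Bayes computation for the medical-test example.
# D = "patient has the disease", + = "test is positive".
p_d = 0.001                 # prevalence: 1 individual out of 1000
p_pos_given_d = 0.99        # P(+|D): test detects the disease
p_pos_given_healthy = 0.02  # P(+|D^c): false positive on a healthy patient

# Rule of total cases: P(+) = P(+|D)P(D) + P(+|D^c)P(D^c)
p_pos = p_pos_given_d * p_d + p_pos_given_healthy * (1 - p_d)

# Rule of retrodiction: P(D|+) = P(+|D)P(D) / P(+)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 4))  # ≈ 0.0472
```

Despite the 99% detection rate, fewer than 5% of patients with a positive test actually have the disease, because the disease itself is rare.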
Example: Three cards are randomly selected, without replacement, from an ordinary deck of 52
playing cards. Compute the conditional probability that the first card selected is a spade given that
the second and third cards are spades.
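Given that two of the other cards are spades, the first card is equally likely to be any one of the remaining 50 cards, of which 11 are spades, so the answer is 11/50 = 0.22. A brute-force enumeration of ordered triples (a sketch, not part of the notes) confirms this:

```python
from itertools import permutations

deck = ['S'] * 13 + ['O'] * 39   # 13 spades, 39 non-spades
both = 0        # ordered draws in which cards 2 and 3 are spades
all_three = 0   # ... and card 1 is a spade as well
for c1, c2, c3 in permutations(range(52), 3):
    if deck[c2] == 'S' and deck[c3] == 'S':
        both += 1
        if deck[c1] == 'S':
            all_three += 1

print(all_three / both)  # 11/50 = 0.22
```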
Example: There are 3 coins in a box. One is a two-headed coin, another is a fair coin, and the
third is a biased coin that comes up heads 75 percent of the time. When one of the three coins is
selected at random and flipped, it shows heads. What is the probability that it was the two-headed
coin?
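A sketch of the Bayes computation with exact fractions (the three coins are a priori equally likely):

```python
from fractions import Fraction

priors = [Fraction(1, 3)] * 3                            # coin chosen at random
p_heads = [Fraction(1), Fraction(1, 2), Fraction(3, 4)]  # two-headed, fair, biased

# Rule of total cases for P(heads), then rule of retrodiction.
p_h = sum(pr * ph for pr, ph in zip(priors, p_heads))
posterior = priors[0] * p_heads[0] / p_h
print(posterior)  # 4/9
```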
Example: There are 15 tennis balls in a box, of which 9 have not previously been used. Three of
the balls are randomly chosen, played with, and then returned to the box. Later, another 3 balls are
randomly chosen from the box. Find the probability that none of these balls has ever been used.
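Conditioning on how many never-used balls appear in the first draw and applying the rule of total cases gives an answer of about 0.089; a sketch:

```python
from math import comb

total = comb(15, 3)  # equally likely ways to draw 3 of the 15 balls
p = sum(
    comb(9, i) * comb(6, 3 - i) / total   # first draw contains i new balls
    * comb(9 - i, 3) / total              # second draw: 3 still-unused balls
    for i in range(4)
)
print(round(p, 4))  # ≈ 0.0893
```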
Example: A and B play a series of games. Each game is independently won by A with probability
p and by B with probability 1 − p. They stop when the total number of wins of one player is two
greater than that of the other player. The player with the greater number of total wins is declared
the winner of the series.
(a) Find the probability that a total of 4 games are played.
(b) Find the probability that A is the winner of the series.
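One way to see the structure: group the games into successive pairs. A pair either splits one-one (the score difference returns to zero and play continues) or is swept by one player (the series ends). A sketch of the resulting closed forms:

```python
def p_four_games(p):
    # Exactly 4 games are played iff games 1-2 split (either order)
    # and games 3-4 are both won by the same player.
    return 2 * p * (1 - p) * (p ** 2 + (1 - p) ** 2)

def p_a_wins(p):
    # Each pair is decisive with probability p^2 + (1-p)^2, independently
    # of earlier split pairs; given a decisive pair, A swept it with
    # probability p^2 / (p^2 + (1-p)^2).
    return p ** 2 / (p ** 2 + (1 - p) ** 2)

print(p_four_games(0.5), p_a_wins(0.5))  # 0.25 0.5
```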
Exercise: Suppose that E and F are mutually exclusive events of an experiment. Show that if
independent trials of this experiment are performed, then E will occur before F with probability
P(E)/[P(E)+P(F)].
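The identity in the exercise can be checked by Monte Carlo simulation; the values P(E) = 0.2 and P(F) = 0.3 below are arbitrary choices for illustration:

```python
import random

def sim_e_before_f(p_e, p_f, trials=100_000, seed=2024):
    # Repeat independent trials of the experiment until E or F occurs,
    # and record how often E occurs first.
    rng = random.Random(seed)
    e_first = 0
    for _ in range(trials):
        while True:
            u = rng.random()
            if u < p_e:           # E occurred on this trial
                e_first += 1
                break
            if u < p_e + p_f:     # F occurred first
                break
            # neither E nor F: perform another trial of the experiment
    return e_first / trials

est = sim_e_before_f(0.2, 0.3)
print(est)  # close to P(E)/[P(E)+P(F)] = 0.4
```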
1.3 Discrete Random Variables
Definition 1.4. (Discrete Random Variable). Let E be a countable set. A function X : Ω → E is called a discrete random variable if for all x ∈ E,
{ω; X(ω) = x} ∈ F.
Example: Tossing a die. The sample space is the set Ω = {1,2,3,4,5,6}.
Take for X the identity: X(ω) = ω.
Then X is the random number obtained by tossing the die.
Definition 1.5. (Probability Distribution Function). Let X be a discrete random variable taking its values in E. Its probability distribution function is the function π : E → [0,1], where
π(x) := P(X = x)  (x ∈ E).
Example: The Gambler's Fortune. The number of occurrences of heads in n tosses is S_n = X_1 + X_2 + ··· + X_n, where X_i = 1 if the i-th toss results in heads and X_i = 0 otherwise. This random variable is the fortune at time n of a gambler systematically betting on heads. It takes integer values from 0 to n. We have
P(S_n = k) = C(n, k) / 2^n,
where C(n, k) is the binomial coefficient.
Definition 1.6. (Expectation). Let X be a discrete random variable taking its values in a countable set E and let the function g : E → R be either non-negative or such that it satisfies the absolute summability condition
∑_{x∈E} |g(x)| P(X = x) < ∞.
Then one defines E[g(X)], the expectation of g(X), by the formula
E[g(X)] = ∑_{x∈E} g(x) P(X = x).
Example: The Gambler’s Fortune. Consider the random variable Sn = X1 + · · ·+Xn taking its
values in {0,1, · · · ,n}. Its expectation is E(Sn) = n/2.
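Both the distribution of S_n and the value E(S_n) = n/2 can be checked numerically; the choice n = 10 below is arbitrary:

```python
from math import comb

n = 10
pmf = [comb(n, k) / 2 ** n for k in range(n + 1)]  # P(S_n = k) = C(n,k)/2^n

assert abs(sum(pmf) - 1) < 1e-12                   # the pmf sums to 1
e_sn = sum(k * p for k, p in enumerate(pmf))       # E[S_n] = sum_k k P(S_n = k)
print(e_sn)  # 5.0 = n/2
```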
Theorem 1.7. (Basic Property of Expectation). Let A be some event. The expectation of the
indicator random variable X = I_A is E[I_A] = P(A).
Example: Two balls are chosen randomly from a box containing 8 white, 4 black, and 2 orange balls. Suppose that we win $2 for each black ball selected and we lose $1 for each white
ball selected. Let X denote our winnings. What are the possible values of X , and what are the
probabilities associated with each value?
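Enumerating all C(14, 2) = 91 equally likely pairs answers the question; in the sketch below each ball is encoded by its dollar contribution (an encoding of my own, not from the notes):

```python
from fractions import Fraction
from itertools import combinations

balls = [-1] * 8 + [2] * 4 + [0] * 2   # white: -$1, black: +$2, orange: $0
pairs = list(combinations(range(len(balls)), 2))  # the 91 possible draws

pmf = {}
for i, j in pairs:
    x = balls[i] + balls[j]                        # winnings for this pair
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, len(pairs))

for x in sorted(pmf):
    print(x, pmf[x])
# X takes the values -2, -1, 0, 1, 2, 4 with probabilities
# 28/91, 16/91, 1/91, 32/91, 8/91, 6/91 respectively
```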