Probabilistic Systems Analysis and Applied Probability
Theorem 1.1 De Morgan's law relates all three basic operations: $(A \cup B)^c = A^c \cap B^c$.
Theorem 1.2 For mutually exclusive events $A_1$ and $A_2$, $P[A_1 \cup A_2] = P[A_1] + P[A_2]$.
Theorem 1.3 If $A = A_1 \cup A_2 \cup \cdots \cup A_m$ and $A_i \cap A_j = \emptyset$ for $i \neq j$, then $P[A] = \sum_{i=1}^{m} P[A_i]$.
Theorem 1.4 The probability measure $P[\cdot]$ satisfies:
- $P[\emptyset] = 0$
- $P[A^c] = 1 - P[A]$
- For any $A$ and $B$ (not necessarily mutually exclusive), $P[A \cup B] = P[A] + P[B] - P[A \cap B]$
- If $A \subset B$, then $P[A] \leq P[B]$
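The properties in Theorem 1.4 can be checked concretely on a small equiprobable sample space. A minimal sketch, assuming a fair-die example (the events `A` and `B` are illustrative, not from the text):

```python
from fractions import Fraction

# Assumed example: one roll of a fair die, equiprobable outcomes.
S = {1, 2, 3, 4, 5, 6}

def P(event):
    """Probability measure for equally likely outcomes: |event| / |S|."""
    return Fraction(len(event & S), len(S))

A = {1, 2, 3}   # "roll at most 3"
B = {2, 4, 6}   # "roll is even"

assert P(set()) == 0                          # P[empty set] = 0
assert P(S - A) == 1 - P(A)                   # P[A^c] = 1 - P[A]
assert P(A | B) == P(A) + P(B) - P(A & B)     # inclusion-exclusion
assert P({2}) <= P(B)                         # {2} is a subset of B, so P is monotone
print("Theorem 1.4 properties hold on this sample space")
```

Using `Fraction` keeps every probability exact, so the identities hold with `==` rather than floating-point tolerances.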
Theorem 1.5 The probability of an event $B = \{s_1, s_2, \ldots, s_m\}$ is the sum of the probabilities of the outcomes contained in the event: $P[B] = \sum_{i=1}^{m} P[\{s_i\}]$.
Theorem 1.6 For an experiment with sample space $S = \{s_1, \ldots, s_n\}$ in which each outcome $s_i$ is equally likely, $P[\{s_i\}] = 1/n$ for $1 \leq i \leq n$.
Theorem 1.7 A conditional probability measure $P[A|B]$ has the following properties, corresponding to the axioms of probability:
Axiom 1: $P[A|B] \geq 0$.
Axiom 2: $P[B|B] = 1$.
Axiom 3: If $A = A_1 \cup A_2 \cup \cdots$ with $A_i \cap A_j = \emptyset$ for $i \neq j$, then $P[A|B] = P[A_1|B] + P[A_2|B] + \cdots$
Theorem 1.8 For a partition $B = \{B_1, B_2, \ldots\}$ and any event $A$ in the sample space, let $C_i = A \cap B_i$. For $i \neq j$, the events $C_i$ and $C_j$ are mutually exclusive and $A = C_1 \cup C_2 \cup \cdots$
Theorem 1.9 For any event $A$ and partition $\{B_1, B_2, \ldots, B_m\}$, $P[A] = \sum_{i=1}^{m} P[A \cap B_i]$.
Theorem 1.10 Law of total probability
For a partition $\{B_1, B_2, \ldots, B_m\}$ with $P[B_i] > 0$ for all $i$, $P[A] = \sum_{i=1}^{m} P[A|B_i]P[B_i]$.
Theorem 1.11 Bayes' theorem $P[B|A] = \dfrac{P[A|B]P[B]}{P[A]}$.
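Theorems 1.10 and 1.11 are typically used together: the law of total probability supplies the denominator $P[A]$ that Bayes' theorem needs. A numeric sketch, with an assumed prior and assumed test accuracies chosen purely for illustration:

```python
# Assumed numbers for illustration: event B ("condition present") with prior
# P[B], and an observation A ("test positive") with known conditionals.
p_B = 0.01            # prior P[B]
p_A_given_B = 0.99    # P[A|B]
p_A_given_Bc = 0.05   # P[A|B^c]

# Law of total probability over the partition {B, B^c}:
p_A = p_A_given_B * p_B + p_A_given_Bc * (1 - p_B)

# Bayes' theorem: P[B|A] = P[A|B] P[B] / P[A]
p_B_given_A = p_A_given_B * p_B / p_A
print(round(p_B_given_A, 4))  # 0.1667: a positive observation is far from conclusive
```

With a rare event, even an accurate observation leaves the posterior $P[B|A]$ at only about $1/6$, which is the usual takeaway of this calculation.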
Definition 1.1 Outcome An outcome of an experiment is a possible result of the experiment.
Definition 1.2 Sample space The sample space of an experiment is the finest-grain, mutually exclusive, collectively exhaustive set of all possible outcomes of the experiment.
Definition 1.3 Event An event is a subset of the sample space.
Definition 1.4 Axioms of Probability A probability measure $P[\cdot]$ is a function that maps events in the sample space to real numbers such that:
Axiom 1 For any event $A$, $P[A] \geq 0$.
Axiom 2 $P[S] = 1$.
Axiom 3 For any countable collection $A_1, A_2, \ldots$ of mutually exclusive events, $P[A_1 \cup A_2 \cup \cdots] = P[A_1] + P[A_2] + \cdots$
Definition 1.5 Conditional probability The conditional probability of an event $A$ given the occurrence of event $B$ is $P[A|B] = \dfrac{P[A \cap B]}{P[B]}$.
Conditional probability is defined only when $P[B] > 0$.
Definition 1.6 Two independent events Two events $A$ and $B$ are independent if and only if $P[A \cap B] = P[A]P[B]$.
Definition 1.7 Three Independent Events Events $A_1$, $A_2$, and $A_3$ are mutually independent if and only if
(a) $A_1$ and $A_2$ are independent,
(b) $A_2$ and $A_3$ are independent,
(c) $A_1$ and $A_3$ are independent,
(d) $P[A_1 \cap A_2 \cap A_3] = P[A_1]P[A_2]P[A_3]$.
Definition 1.8 More than Two Independent Events
If $n \geq 3$, the events $A_1, A_2, \ldots, A_n$ are mutually independent if and only if
(a) all collections of $n - 1$ events chosen from $A_1, A_2, \ldots, A_n$ are mutually independent,
(b) $P[A_1 \cap A_2 \cap \cdots \cap A_n] = P[A_1]P[A_2] \cdots P[A_n]$.
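The product condition on the full intersection is not redundant: pairwise independence does not imply mutual independence. A sketch of the classic counterexample, assuming two fair coin flips (the example is standard but not from the text):

```python
from fractions import Fraction
from itertools import combinations

# Assumed example: two fair coin flips, four equally likely outcomes.
S = {"HH", "HT", "TH", "TT"}

def P(event):
    """Probability measure for equally likely outcomes."""
    return Fraction(len(event & S), len(S))

A1 = {"HH", "HT"}   # first flip is heads
A2 = {"HH", "TH"}   # second flip is heads
A3 = {"HH", "TT"}   # both flips agree

# Every pair of events is independent:
for X, Y in combinations([A1, A2, A3], 2):
    assert P(X & Y) == P(X) * P(Y)

# But the product condition on the triple intersection fails,
# so the three events are NOT mutually independent:
assert P(A1 & A2 & A3) != P(A1) * P(A2) * P(A3)
print(P(A1 & A2 & A3), "vs", P(A1) * P(A2) * P(A3))  # 1/4 vs 1/8
```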
Theorem 2.1 An experiment consists of two subexperiments. If one subexperiment has $k$ outcomes and the other subexperiment has $n$ outcomes, then the experiment has $nk$ outcomes.
Theorem 2.2 The number of $k$-permutations of $n$ distinguishable objects is $(n)_k = n(n-1)(n-2) \cdots (n-k+1) = \dfrac{n!}{(n-k)!}$.
Theorem 2.4 Given
Theorem 2.5 For
Theorem 2.6 The number of observation sequences for
Theorem 2.7 For n repetitions of a subexperiment with sample space
Theorem 2.8 The probability of
Theorem 2.9 A subexperiment has sample space
Definition 2.1 $n$ choose $k$ For an integer $n \geq 0$, $\binom{n}{k} = \dfrac{n!}{k!(n-k)!}$ for $0 \leq k \leq n$, and $0$ otherwise.
Definition 2.2 Multinomial coefficient For an integer $n \geq 0$ and integers $n_1, \ldots, n_r$ with $n_1 + \cdots + n_r = n$, the multinomial coefficient is $\binom{n}{n_1, \ldots, n_r} = \dfrac{n!}{n_1! \, n_2! \cdots n_r!}$.
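The counting formulas of this chapter can be exercised directly with Python's `math` module; the sizes `n = 5`, `k = 3` and the group sizes below are assumed for illustration:

```python
from math import comb, factorial, perm

n, k = 5, 3   # assumed example sizes

# Theorem 2.2: the number of k-permutations of n objects is n!/(n-k)!.
assert perm(n, k) == factorial(n) // factorial(n - k)   # 5 * 4 * 3 = 60

# The number of ways to choose k of n objects (order irrelevant) is n choose k.
assert comb(n, k) == factorial(n) // (factorial(k) * factorial(n - k))  # 10

def multinomial(*counts):
    """Definition 2.2: n! / (n1! n2! ... nr!) for n = sum of the group sizes."""
    result = factorial(sum(counts))
    for c in counts:
        result //= factorial(c)   # each partial quotient is still an integer
    return result

print(multinomial(2, 1, 2))  # 5!/(2! 1! 2!) = 30 ways to split 5 items into groups
```

`math.perm` and `math.comb` (Python 3.8+) avoid the overflow-prone floating-point route through `factorial` ratios.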
Theorem 3.1 For a discrete random variable X with PMF $P_X(x)$ and range $S_X$: (a) for any $x$, $P_X(x) \geq 0$; (b) $\sum_{x \in S_X} P_X(x) = 1$; (c) for any event $B \subset S_X$, $P[B] = \sum_{x \in B} P_X(x)$.
Theorem 3.2 For any discrete random variable
Theorem 3.3 For all
Theorem 3.4 The Bernoulli
Theorem 3.5 The geometric
Theorem 3.6
(a) For the binomial
(b) For the Pascal
(c) For the discrete uniform
Theorem 3.8 Perform
Theorem 3.9 For a discrete random variable
Theorem 3.10 Given a random variable
Theorem 3.11 For any random variable
Theorem 3.12 For any random variable
Theorem 3.13 In the absence of observations, the minimum mean square error estimate of random variable $X$ is $\hat{x} = E[X]$.
Theorem 3.14
Theorem 3.15
Theorem 3.16
(a) If X is Bernoulli (p), then $E[X] = p$ and $Var[X] = p(1-p)$.
(b) If X is geometric (p), then $E[X] = 1/p$ and $Var[X] = (1-p)/p^2$.
(c) If X is binomial (n, p), then $E[X] = np$ and $Var[X] = np(1-p)$.
(d) If X is Pascal (k, p), then $E[X] = k/p$ and $Var[X] = k(1-p)/p^2$.
(e) If X is Poisson ($\alpha$), then $E[X] = \alpha$ and $Var[X] = \alpha$.
(f) If X is discrete uniform (k, l), then $E[X] = (k+l)/2$ and $Var[X] = (l-k)(l-k+2)/12$.
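The family moments summarized in Theorem 3.16 can be checked numerically by summing directly over the PMF. A sketch for the binomial and Poisson cases, with assumed parameters $n = 10$, $p = 0.3$, $\alpha = 2.5$:

```python
from math import comb, exp, factorial, isclose

def mean_var(pmf):
    """E[X] and Var[X] computed directly from (value, probability) pairs."""
    mean = sum(x * p for x, p in pmf)
    var = sum((x - mean) ** 2 * p for x, p in pmf)
    return mean, var

# Binomial (n, p) with assumed n = 10, p = 0.3: E[X] = np, Var[X] = np(1-p).
n, p = 10, 0.3
binom_pmf = [(x, comb(n, x) * p**x * (1 - p) ** (n - x)) for x in range(n + 1)]
m, v = mean_var(binom_pmf)
assert isclose(m, n * p) and isclose(v, n * p * (1 - p))

# Poisson (alpha) with assumed alpha = 2.5: E[X] = Var[X] = alpha.
# The infinite support is truncated at 60 terms; the neglected tail is negligible.
alpha = 2.5
poisson_pmf = [(x, alpha**x * exp(-alpha) / factorial(x)) for x in range(60)]
m, v = mean_var(poisson_pmf)
assert isclose(m, alpha, abs_tol=1e-9) and isclose(v, alpha, abs_tol=1e-9)
print("moment formulas confirmed for binomial and Poisson")
```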
Definition 3.1 Random Variable
A random variable consists of an experiment with a probability measure $P[\cdot]$ defined on a sample space $S$ and a function that assigns a real number to each outcome in the sample space of the experiment.
Definition 3.2 Discrete Random Variable $X$ is a discrete random variable if the range of $X$ is a countable set.
Definition 3.3 Probability Mass Function (PMF) The probability mass function (PMF) of a discrete random variable $X$ is $P_X(x) = P[X = x]$.
Definition 3.4 Bernoulli (p) Random Variable $X$ is a Bernoulli $(p)$ random variable if the PMF of $X$ has the form $P_X(0) = 1 - p$, $P_X(1) = p$, and $P_X(x) = 0$ otherwise, where the parameter $p$ is in the range $0 < p < 1$.
Definition 3.5 Geometric (p) Random Variable $X$ is a geometric $(p)$ random variable if the PMF of $X$ has the form $P_X(x) = p(1-p)^{x-1}$ for $x = 1, 2, \ldots$, and $0$ otherwise,
where the parameter $p$ is in the range $0 < p < 1$.
Definition 3.6 Binomial (n, p) Random Variable $X$ is a binomial $(n, p)$ random variable if the PMF of $X$ has the form $P_X(x) = \binom{n}{x} p^x (1-p)^{n-x}$,
where $0 < p < 1$ and $n$ is an integer such that $n \geq 1$.
Definition 3.7 Pascal (k, p) Random Variable $X$ is a Pascal $(k, p)$ random variable if the PMF of $X$ has the form $P_X(x) = \binom{x-1}{k-1} p^k (1-p)^{x-k}$,
where $0 < p < 1$ and $k$ is an integer such that $k \geq 1$.
Definition 3.8 Discrete Uniform (k, l) Random Variable $X$ is a discrete uniform $(k, l)$ random variable if the PMF of $X$ is $P_X(x) = 1/(l - k + 1)$ for $x = k, k+1, \ldots, l$, and $0$ otherwise,
where the parameters k and l are integers such that $k < l$.
Definition 3.9 Poisson ($\alpha$) Random Variable $X$ is a Poisson $(\alpha)$ random variable if the PMF of $X$ is $P_X(x) = \alpha^x e^{-\alpha}/x!$ for $x = 0, 1, 2, \ldots$, and $0$ otherwise,
where the parameter $\alpha$ is in the range $\alpha > 0$.
Definition 3.10 Cumulative Distribution Function (CDF) The cumulative distribution function (CDF) of a discrete random variable $X$ is $F_X(x) = P[X \leq x]$.
Definition 3.11 Mode A mode of random variable $X$ is a number $x_{mod}$ satisfying $P_X(x_{mod}) \geq P_X(x)$ for all $x$.
Definition 3.12 Median A median $x_{med}$ of random variable $X$ is a number satisfying $P[X \leq x_{med}] \geq 1/2$ and $P[X \geq x_{med}] \geq 1/2$.
Definition 3.13 Expected Value The expected value of $X$ is $E[X] = \mu_X = \sum_{x \in S_X} x P_X(x)$.
Definition 3.14 Derived Random Variable Each sample value y of a derived random variable $Y = g(X)$ is a mathematical function $g(x)$ of a sample value $x$ of another random variable $X$.
Definition 3.15 Variance The variance of random variable $X$ is $Var[X] = E[(X - \mu_X)^2]$.
Definition 3.16 Standard Deviation The standard deviation of random variable $X$ is $\sigma_X = \sqrt{Var[X]}$.
Definition 3.17 Moments For random variable $X$:
(a) The nth moment is $E[X^n]$.
(b) The nth central moment is $E[(X - \mu_X)^n]$.
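These moment definitions can be exercised on a small PMF; the example distribution below is assumed for illustration. Note that the variance (the second central moment) also equals $E[X^2] - (E[X])^2$:

```python
# Assumed example PMF for illustration: X takes values 0, 1, 2.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

mean = sum(x * p for x, p in pmf.items())               # E[X], the first moment
second = sum(x**2 * p for x, p in pmf.items())          # E[X^2], the second moment
var = sum((x - mean) ** 2 * p for x, p in pmf.items())  # second central moment
std = var ** 0.5                                        # standard deviation

# Variance identity: Var[X] = E[X^2] - (E[X])^2
assert abs(var - (second - mean**2)) < 1e-12
print(mean, var, std)  # approximately 1.1, 0.49, 0.7
```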
Theorem 4.1 For any random variable $X$,
(a) $F_X(-\infty) = 0$
(b) $F_X(\infty) = 1$
(c) $P[x_1 < X \leq x_2] = F_X(x_2) - F_X(x_1)$
Theorem 4.2 For a continuous random variable $X$,
(a) $f_X(x) \geq 0$ for all $x$
(b) $F_X(x) = \int_{-\infty}^{x} f_X(u)\,du$
(c) $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$
Theorem 4.3
Theorem 4.4 The expected value of a function, $g(X)$, of random variable $X$ is $E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx$.
Theorem 4.5 For any random variable $X$,
(a) $E[X - \mu_X] = 0$
(b) $E[aX + b] = aE[X] + b$
(c) $Var[X] = E[X^2] - \mu_X^2$
(d) $Var[aX + b] = a^2 Var[X]$
Theorem 4.6 If $X$ is a uniform $(a, b)$ random variable,
- The CDF of $X$ is $F_X(x) = 0$ for $x \leq a$; $F_X(x) = (x - a)/(b - a)$ for $a < x \leq b$; $F_X(x) = 1$ for $x > b$.
- The expected value of $X$ is $E[X] = \frac{a + b}{2}$.
- The variance of $X$ is $Var[X] = \frac{(b - a)^2}{12}$.
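The uniform moments in Theorem 4.6 can be verified by numeric integration of the PDF. A midpoint-rule sketch with assumed endpoints $a = 2$, $b = 5$:

```python
# Assumed endpoints for illustration.
a, b = 2.0, 5.0
f = 1.0 / (b - a)          # uniform PDF value on (a, b)

# Midpoint-rule integration of x f(x) and (x - E[X])^2 f(x) over [a, b].
N = 100_000
dx = (b - a) / N
xs = [a + (i + 0.5) * dx for i in range(N)]

mean = sum(x * f * dx for x in xs)
var = sum((x - mean) ** 2 * f * dx for x in xs)

assert abs(mean - (a + b) / 2) < 1e-6          # (a + b)/2 = 3.5
assert abs(var - (b - a) ** 2 / 12) < 1e-6     # (b - a)^2/12 = 0.75
print(mean, var)
```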
Theorem 4.7 Let
Theorem 4.8 If $X$ is an exponential $(\lambda)$ random variable,
- The CDF of $X$ is $F_X(x) = 1 - e^{-\lambda x}$ for $x \geq 0$, and $F_X(x) = 0$ otherwise.
- The expected value of $X$ is $E[X] = \frac{1}{\lambda}$.
- The variance of $X$ is $Var[X] = \frac{1}{\lambda^2}$.
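The exponential CDF implies the memoryless property $P[X > s + t \mid X > s] = P[X > t]$, a standard fact about this family. A numeric sketch with an assumed rate $\lambda = 2$ (the identity $E[X] = \int_0^\infty P[X > x]\,dx$ used below is also a standard fact, not from the text):

```python
from math import exp, isclose

lam = 2.0                                  # assumed rate parameter

def F(x):
    """Exponential CDF: 1 - e^{-lambda x} for x >= 0."""
    return 1 - exp(-lam * x) if x >= 0 else 0.0

def tail(x):
    return 1 - F(x)                        # P[X > x]

# Memoryless property: the conditional tail P[X > s + t | X > s]
# equals the unconditional tail P[X > t].
s, t = 0.4, 1.1
assert isclose(tail(s + t) / tail(s), tail(t))

# E[X] = 1/lambda, checked via E[X] = integral of P[X > x] dx
# (midpoint rule on [0, 20]; the tail beyond 20 is negligible for lam = 2).
dx = 1e-4
mean = sum(tail((i + 0.5) * dx) * dx for i in range(200_000))
assert abs(mean - 1 / lam) < 1e-6
print(mean)
```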
Theorem 4.9 If
Theorem 4.10 If
Theorem 4.11 Let
Theorem 4.12 If
Theorem 4.13 If
Theorem 4.14 If
The probability that
Theorem 4.15
Theorem 4.16 For any continuous function g(x),
Theorem 4.17
Theorem 4.18 For a random variable
Definition 4.1 Cumulative Distribution Function (CDF) The cumulative distribution function (CDF) of random variable $X$ is $F_X(x) = P[X \leq x]$.
Definition 4.2 Continuous Random Variable $X$ is a continuous random variable if the CDF $F_X(x)$ is a continuous function.
Definition 4.3 Probability Density Function (PDF) The probability density function (PDF) of a continuous random variable $X$ is $f_X(x) = \dfrac{dF_X(x)}{dx}$.
Definition 4.4 Expected Value The expected value of a random variable $X$ is $E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$.
Definition 4.5 Uniform Random Variable $X$ is a uniform $(a, b)$ random variable if the PDF of $X$ is $f_X(x) = 1/(b - a)$ for $a \leq x < b$, and $0$ otherwise, where $b > a$.
Definition 4.6 Exponential Random Variable $X$ is an exponential $(\lambda)$ random variable if the PDF of $X$ is $f_X(x) = \lambda e^{-\lambda x}$ for $x \geq 0$, and $0$ otherwise, where the parameter $\lambda > 0$.
Definition 4.7 Erlang Random Variable $X$ is an Erlang $(n, \lambda)$ random variable if the PDF of $X$ is $f_X(x) = \dfrac{\lambda^n x^{n-1} e^{-\lambda x}}{(n-1)!}$ for $x \geq 0$, and $0$ otherwise, where $\lambda > 0$ and $n$ is a positive integer.
Definition 4.8 Gaussian Random Variable $X$ is a Gaussian $(\mu, \sigma)$ random variable if the PDF of $X$ is $f_X(x) = \dfrac{1}{\sqrt{2\pi\sigma^2}} e^{-(x-\mu)^2/2\sigma^2}$, where $\mu$ can be any real number and $\sigma > 0$.
Definition 4.9 Standard Normal Random Variable The standard normal random variable $Z$ is the Gaussian $(0, 1)$ random variable.
Definition 4.10 Standard Normal CDF The CDF of the standard normal random variable $Z$ is $\Phi(z) = \dfrac{1}{\sqrt{2\pi}} \int_{-\infty}^{z} e^{-u^2/2}\,du$.
Definition 4.11 Standard Normal Complementary CDF The standard normal complementary CDF is $Q(z) = P[Z > z] = \dfrac{1}{\sqrt{2\pi}} \int_{z}^{\infty} e^{-u^2/2}\,du = 1 - \Phi(z)$.
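Neither $\Phi$ nor $Q$ has a closed form, but both can be computed from `math.erf` via the standard identity $\Phi(z) = \tfrac{1}{2}\left(1 + \mathrm{erf}(z/\sqrt{2})\right)$ (the identity is standard, not from the text):

```python
from math import erf, isclose, sqrt

def Phi(z):
    """Standard normal CDF, via the erf identity."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def Q(z):
    """Standard normal complementary CDF: Q(z) = 1 - Phi(z)."""
    return 1 - Phi(z)

assert isclose(Phi(0), 0.5)         # half the mass lies below the mean
assert isclose(Phi(-1.3), Q(1.3))   # symmetry: Phi(-z) = Q(z)
print(round(Phi(1.0), 4))           # 0.8413
```

The symmetry check mirrors how standard normal tables are used: tables list $\Phi(z)$ for $z \geq 0$ only, and $\Phi(-z) = Q(z) = 1 - \Phi(z)$ covers negative arguments.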
Definition 4.12 Unit Impulse (Delta) Function Let
The unit impulse function
Definition 4.13 Unit Step Function The unit step function is $u(x) = 0$ for $x < 0$, and $u(x) = 1$ for $x \geq 0$.
Definition 4.14 Mixed Random Variable $X$ is a mixed random variable if and only if its PDF contains both impulses and nonzero, finite values.