Exam 1: Sets and Combinatorics
Ch 1: Sets
1.1: set operations:
intersection, union, complement, Cartesian product.
1.2: Venn diagrams
+ counting
* partitions (disjoint unions): n(A "+" B) = n(A) + n(B),
where "+" means disjoint union.
* Cartesian products: n(A x B) = n(A)*n(B)
+ De Morgan's laws:
* (A union B)' = A' intersect B'
* (A intersect B)' = A' union B'
1.3: counting
* n(A union B) = n(A) + n(B) - n(A intersect B).
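A quick sanity check of these counting rules in Python (a sketch; the sets A, B and the universal set U are made-up examples):

  from itertools import product

  U = set(range(1, 11))      # universal set {1, ..., 10}
  A = {1, 2, 3, 4, 5}
  B = {4, 5, 6, 7}

  # inclusion-exclusion: n(A union B) = n(A) + n(B) - n(A intersect B)
  assert len(A | B) == len(A) + len(B) - len(A & B)

  # De Morgan: (A union B)' = A' intersect B', (A intersect B)' = A' union B'
  assert U - (A | B) == (U - A) & (U - B)
  assert U - (A & B) == (U - A) | (U - B)

  # Cartesian product: n(A x B) = n(A)*n(B)
  assert len(set(product(A, B))) == len(A) * len(B)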
1.4: trees: counting the possible outcomes of a sequence
of experiments by multiplying the number of possible
outcomes at each stage.
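A minimal sketch of this multiplication principle; the two-stage experiment (coin flip, then die roll) is an invented example:

  from itertools import product

  coin = ["H", "T"]
  die = [1, 2, 3, 4, 5, 6]

  outcomes = list(product(coin, die))            # one tuple per path through the tree
  assert len(outcomes) == len(coin) * len(die)   # 2 * 6 = 12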
Ch 2: Combinatorics (counting)
2.1: Probabilities
* sum of probabilities of all outcomes is 1.
* Pr(E) = n(E)/n(sample space) if all outcomes are equally likely
2.2: Permutations (order matters)
P(n,r) = n!/(n-r)! = n*(n-1)*...*(n-r+1)
= permutations of n objects taken r at a time
2.3: Combinations (order doesn't matter)
C(n,r) = P(n,r)/r!
= combinations of n objects taken r at a time
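Python's math module has both formulas built in (math.perm and math.comb, available since Python 3.8); a quick check with arbitrary n and r:

  import math

  n, r = 10, 3
  # P(n,r) = n!/(n-r)!  (order matters)
  assert math.perm(n, r) == math.factorial(n) // math.factorial(n - r)
  # C(n,r) = P(n,r)/r!  (order doesn't matter)
  assert math.comb(n, r) == math.perm(n, r) // math.factorial(r)
  print(math.perm(n, r), math.comb(n, r))   # 720 120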
Exam 2: Probability
Ch 3: Probability of events
3.1: axioms and properties of probability
* probability measures the size of a set of outcomes.
* axioms:
1. 0 <= Pr[E] <= 1
2. Pr[S] = 1
3. Pr[E "+" F] = Pr[E] + Pr[F].
* properties:
1. Pr[E'] = 1 - Pr[E]
2. Pr[E union F] = Pr[E] + Pr[F] - Pr[E intersect F]
3.2: conditional probability
* Pr[A|B] := Pr[A and B]/Pr[B]
* A and B are independent if Pr[A and B] = Pr[A]*Pr[B],
i.e., knowledge of one event gives no information
about the probability of the other.
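Both definitions can be checked by brute-force counting over an equally likely sample space; the two-dice events below are made-up examples:

  from fractions import Fraction
  from itertools import product

  S = list(product(range(1, 7), repeat=2))   # two dice: 36 equally likely outcomes

  def pr(event):                             # Pr[E] = n(E)/n(S)
      return Fraction(sum(1 for s in S if event(s)), len(S))

  A = lambda s: s[0] + s[1] == 7             # the sum is 7
  B = lambda s: s[0] == 3                    # the first die shows 3

  pr_AB = pr(lambda s: A(s) and B(s))
  print(pr_AB / pr(B))                       # Pr[A|B] = 1/6
  assert pr_AB == pr(A) * pr(B)              # A and B are independent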
3.3: trees (to depict outcomes of stochastic processes)
3.4: Bayes probabilities
* the probability of a leaf of a tree is the
product of the conditional probabilities along
the branches to that leaf;
* Bayes' formula expresses this idea:
Pr[A|B] = Pr[B|A]Pr[A]/Pr[B] (for two events).
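A sketch of the tree/Bayes computation in Python; the defect rate and test accuracies are invented numbers:

  from fractions import Fraction

  pr_A = Fraction(2, 100)                # Pr[A], A = "part is defective"
  pr_B_given_A = Fraction(95, 100)       # Pr[B|A], B = "test flags the part"
  pr_B_given_notA = Fraction(10, 100)    # Pr[B|A']

  # Pr[B] = sum over leaves of the products along the branches
  pr_B = pr_B_given_A * pr_A + pr_B_given_notA * (1 - pr_A)

  # Bayes: Pr[A|B] = Pr[B|A]Pr[A]/Pr[B]
  print(pr_B_given_A * pr_A / pr_B)      # 19/117, about 0.16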
* Bernoulli process: for n independent Bernoulli trials
each with probability of success p and probability
of failure q = 1-p,
Pr[r successes] = C(n,r) p^r q^(n-r)
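The same formula as a short Python function (n, r, p below are arbitrary example values):

  from math import comb

  def binom_pr(n, r, p):
      """Pr[r successes in n independent Bernoulli trials]: C(n,r) p^r q^(n-r)."""
      q = 1 - p
      return comb(n, r) * p**r * q**(n - r)

  print(binom_pr(10, 3, 0.5))            # about 0.117
  # the probabilities over r = 0, ..., n sum to 1
  assert abs(sum(binom_pr(10, r, 0.5) for r in range(11)) - 1) < 1e-12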
Ch 4: Random Variables (random numbers)
4.1: probability density function
* binomial random variable = number of successes in a Bernoulli process
4.2: Expected value
* definition
Let X = random variable with k outcomes.
Let x_j = outcome number j.
Let p_j = P(X=x_j).
Then E[X] = x_1*p_1 + x_2*p_2 + ... + x_k*p_k.
* expectation of binomial random variable:
E[X] = n*p, (n trials each with success probability p)
* variance and standard deviation:
Let m = E[X].
Var[X] = (x_1 - m)^2*p_1 + (x_2 - m)^2*p_2 +...+ (x_k - m)^2*p_k.
standard deviation = sigma = sqrt(Var[X]).
* variance of binomial random variable:
Var[X] = n*p*q.
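A numeric check of these definitions against the binomial facts (a sketch; n and p are arbitrary):

  from math import comb, sqrt

  n, p = 8, 0.25
  q = 1 - p
  density = {r: comb(n, r) * p**r * q**(n - r) for r in range(n + 1)}

  m = sum(x * pr for x, pr in density.items())              # E[X]
  var = sum((x - m)**2 * pr for x, pr in density.items())   # Var[X]
  sigma = sqrt(var)                                         # standard deviation

  assert abs(m - n * p) < 1e-12        # E[X] = np = 2.0
  assert abs(var - n * p * q) < 1e-12  # Var[X] = npq = 1.5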
Exam 3: Linear Algebra
Ch 5: Systems of equations
5.1: lines
* slope = m = rise over run = (y_2 - y_1)/(x_2 - x_1)
* use point-slope formula and/or slope-intercept formula
to find the equation of a line through two points
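A tiny helper that does both steps (the two points are hypothetical; vertical lines, where x_2 = x_1, have no slope):

  def line_through(p1, p2):
      """Slope m and intercept b of y = m*x + b through two points."""
      (x1, y1), (x2, y2) = p1, p2
      m = (y2 - y1) / (x2 - x1)    # rise over run
      b = y1 - m * x1              # from point-slope: y - y1 = m(x - x1)
      return m, b

  print(line_through((1, 2), (3, 8)))   # (3.0, -1.0), i.e. y = 3x - 1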
5.2: linear systems
* substitution method
* reduction method (combining equations)
5.3: linear systems in many variables (*important*)
* 3D graphs of planes
* matrix representation
* row operations
* row reduction
* reduced form
* solution sets
+ unique solution
+ overdetermined case (inconsistent system)
+ underdetermined case (infinite family of solutions)
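A minimal Gauss-Jordan sketch of row reduction on an augmented matrix (the example system is made up; exact arithmetic via Fractions):

  from fractions import Fraction

  def row_reduce(M):
      """Return the reduced form of augmented matrix M (row operations only)."""
      M = [[Fraction(x) for x in row] for row in M]
      pivot = 0
      for col in range(len(M[0]) - 1):          # last column holds the constants
          r = next((i for i in range(pivot, len(M)) if M[i][col] != 0), None)
          if r is None:
              continue                          # no pivot in this column
          M[pivot], M[r] = M[r], M[pivot]       # swap rows
          M[pivot] = [x / M[pivot][col] for x in M[pivot]]   # scale pivot to 1
          for i in range(len(M)):               # clear the column elsewhere
              if i != pivot and M[i][col] != 0:
                  M[i] = [a - M[i][col] * b for a, b in zip(M[i], M[pivot])]
          pivot += 1
      return M

  # x + y = 3, 2x - y = 0  has the unique solution x = 1, y = 2
  R = row_reduce([[1, 1, 3], [2, -1, 0]])
  print([[int(x) for x in row] for row in R])   # [[1, 0, 1], [0, 1, 2]]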
Ch 6: Matrix algebra
- matrix addition and multiplication
- matrix identity
- matrix inverse
- using matrix inverse to solve linear systems
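A sketch of the inverse method with numpy (assumed available; same made-up system as above):

  import numpy as np

  A = np.array([[1.0, 1.0],
                [2.0, -1.0]])
  b = np.array([3.0, 0.0])

  x = np.linalg.inv(A) @ b                       # x = A^(-1) b
  print(x)                                       # [1. 2.]
  assert np.allclose(x, np.linalg.solve(A, b))   # same answer, no explicit inverse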
Exam 4: Applications
Ch 7: Linear programming
- feasible sets
- evaluation at corner points and auxiliary points
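A sketch of the corner-point method; the feasible set's corners and the objective function are invented examples (in practice the corners come from graphing the constraints):

  corners = [(0, 0), (5, 0), (3, 4), (0, 6)]   # hypothetical corner points

  def f(pt):
      x, y = pt
      return 3 * x + 2 * y                     # objective function 3x + 2y

  best = max(corners, key=f)
  print(best, f(best))                         # (3, 4) gives the maximum, 17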
Ch 8: Markov Chains
- state transition matrix
- state vectors
- regular Markov chains
- stable state vector
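A sketch for finding the stable vector by iteration (the transition matrix is a made-up regular example; columns give the transition probabilities out of each state, so the next state vector is A @ v):

  import numpy as np

  A = np.array([[0.9, 0.2],     # state transition matrix (columns sum to 1)
                [0.1, 0.8]])
  v = np.array([0.5, 0.5])      # initial state vector

  for _ in range(100):          # regular chains converge to the stable vector
      v = A @ v

  print(v)                      # approx [2/3, 1/3]
  assert np.allclose(A @ v, v)  # stable: A v = v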
Final material
Ch 9: Financial math
- compound interest
- present value and (future) amount
- present value of annuity
- amount of annuity
- payment of amortized loan
- payment of sinking fund
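Sketches of these formulas as short Python functions (i = interest rate per period, n = number of periods; the sample numbers are invented):

  def amount(P, i, n):            # compound interest: future amount of principal P
      return P * (1 + i) ** n

  def annuity_amount(R, i, n):    # amount of an annuity of n payments of R
      return R * ((1 + i) ** n - 1) / i

  def annuity_pv(R, i, n):        # present value of an annuity
      return R * (1 - (1 + i) ** -n) / i

  def loan_payment(P, i, n):      # payment on an amortized loan of P
      return P * i / (1 - (1 + i) ** -n)

  def sinking_fund(A, i, n):      # payment that accumulates to amount A
      return A * i / ((1 + i) ** n - 1)

  # e.g. $1000 at 6% compounded monthly for 10 years:
  print(amount(1000, 0.06 / 12, 120))   # about 1819.40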