1. Basic Probability Concepts
| Concept | Formula / Definition | Remarks |
| --- | --- | --- |
| Probability of an event A | P(A) = Number of favourable outcomes / Total outcomes (equally likely) | 0 ≤ P(A) ≤ 1 |
| Complement | P(A') = 1 – P(A) | |
| Addition Law | P(A ∪ B) = P(A) + P(B) – P(A ∩ B) | |
| Multiplication Law (General) | P(A ∩ B) = P(A) × P(B\|A) = P(B) × P(A\|B) | |
| Independent events | P(A ∩ B) = P(A) × P(B) | |
| Conditional Probability | P(A\|B) = P(A ∩ B) / P(B), provided P(B) > 0 | |
| Bayes’ Theorem | P(A_i \mid B) = \frac{P(B \mid A_i)\, P(A_i)}{\sum_j P(B \mid A_j)\, P(A_j)} | For a partition A₁, A₂, … |
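A minimal Python sketch of the total-probability denominator and Bayes’ theorem, assuming hypothetical numbers for a diagnostic-test style problem (prevalence 1%, sensitivity 95%, false-positive rate 10%); the figures are illustrative only:

```python
# Bayes' theorem with hypothetical numbers: P(disease | positive test).
# Partition: A1 = "has disease", A2 = "no disease" (illustrative values only).
p_a1 = 0.01            # prior P(A1): prevalence
p_a2 = 1 - p_a1        # prior P(A2)
p_b_given_a1 = 0.95    # P(positive | disease): sensitivity
p_b_given_a2 = 0.10    # P(positive | no disease): false-positive rate

# Total probability: P(B) = sum of P(B|Ai) P(Ai) over the partition.
p_b = p_b_given_a1 * p_a1 + p_b_given_a2 * p_a2

# Bayes' theorem: P(A1|B) = P(B|A1) P(A1) / P(B).
p_a1_given_b = p_b_given_a1 * p_a1 / p_b
print(f"P(positive) = {p_b:.4f}, P(disease | positive) = {p_a1_given_b:.4f}")
# With these numbers, P(disease | positive) is only about 0.088.
```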
2. Random Variables
| Type | Definition | Probability Function |
| --- | --- | --- |
| Discrete Random Variable | Takes countable values (finite or countably infinite) | PMF: p(x) = P(X = x) |
| Continuous Random Variable | Takes uncountable values in an interval | PDF: f(x) such that P(a ≤ X ≤ b) = \int_a^b f(x)\, dx |
- Properties of PMF: Σ p(x_i) = 1, p(x_i) ≥ 0
- Properties of PDF: \int_{-\infty}^{\infty} f(x)\, dx = 1, f(x) ≥ 0
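As a quick numerical sanity check of these two properties, the sketch below uses a small hypothetical PMF and the PDF f(x) = 2x on [0, 1], both chosen only for illustration:

```python
# Check Σ p(x_i) = 1 and p(x_i) ≥ 0 for a hypothetical discrete PMF.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}
assert abs(sum(pmf.values()) - 1.0) < 1e-12 and all(p >= 0 for p in pmf.values())

# Check ∫ f(x) dx = 1 for f(x) = 2x on [0, 1] using a midpoint Riemann sum.
n = 100_000
h = 1.0 / n
integral = sum(2 * ((k + 0.5) * h) * h for k in range(n))
print(f"sum of PMF = {sum(pmf.values()):.4f}, integral of PDF ≈ {integral:.6f}")
```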
3. Expectation and Variance
| Quantity | Discrete | Continuous |
| --- | --- | --- |
| Expectation E(X) | Σ x p(x) | \int x f(x)\, dx |
| E[g(X)] | Σ g(x) p(x) | \int g(x) f(x)\, dx |
| Variance Var(X) | E(X²) – [E(X)]² = Σ (x – μ)² p(x) | \int (x – μ)² f(x)\, dx |
| Standard Deviation σ | √Var(X) | √Var(X) |
| E(aX + b) | a E(X) + b | a E(X) + b |
| Var(aX + b) | a² Var(X) | a² Var(X) |
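A short sketch, assuming an illustrative four-point PMF, that evaluates these formulas and checks the E(aX + b) and Var(aX + b) rules numerically:

```python
# Expectation and variance of a hypothetical discrete random variable.
pmf = {0: 0.1, 1: 0.4, 2: 0.3, 3: 0.2}   # illustrative values, sums to 1

mean = sum(x * p for x, p in pmf.items())                    # E(X) = Σ x p(x)
e_x2 = sum(x**2 * p for x, p in pmf.items())                 # E(X²)
var = e_x2 - mean**2                                         # Var(X) = E(X²) - [E(X)]²
var_alt = sum((x - mean) ** 2 * p for x, p in pmf.items())   # Σ (x - μ)² p(x)

a, b = 3, 5   # check the linear-transformation rules
mean_ab = sum((a * x + b) * p for x, p in pmf.items())                 # should equal a E(X) + b
var_ab = sum((a * x + b - mean_ab) ** 2 * p for x, p in pmf.items())   # should equal a² Var(X)

print(f"E(X) = {mean:.2f}, Var(X) = {var:.4f} (= {var_alt:.4f})")
print(f"E(aX+b) = {mean_ab:.2f} vs {a * mean + b:.2f}, Var(aX+b) = {var_ab:.4f} vs {a**2 * var:.4f}")
```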
4. Important Discrete Distributions
| Distribution | PMF p(x) | Conditions | Mean μ | Variance σ² | Remarks / Use |
| --- | --- | --- | --- | --- | --- |
| Binomial | \binom{n}{x} p^x (1-p)^{n-x} | x = 0, 1, …, n | np | np(1-p) | Fixed n trials, success probability p |
| Poisson | e^{-\lambda} \lambda^x / x! | x = 0, 1, 2, … | \lambda | \lambda | Rare events; \lambda = np (limit of binomial) |
Recurrence relations (useful in problems; see the sketch after this list):
- Binomial: p(x+1)/p(x) = \frac{(n-x)}{(x+1)} \cdot \frac{p}{1-p}
- Poisson: p(x+1)/p(x) = \lambda/(x+1)
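A minimal sketch showing how the recurrence relations generate the whole PMF from p(0); the parameters n = 10, p = 0.3 and λ = 2 are illustrative, and the results are spot-checked against the closed-form PMFs:

```python
import math

# Build binomial and Poisson PMFs from p(0) via the recurrence relations.
n, p, lam = 10, 0.3, 2.0

# Binomial: p(x+1) = p(x) * (n-x)/(x+1) * p/(1-p), starting from p(0) = (1-p)^n.
binom = [(1 - p) ** n]
for x in range(n):
    binom.append(binom[-1] * (n - x) / (x + 1) * p / (1 - p))

# Poisson: p(x+1) = p(x) * λ/(x+1), starting from p(0) = e^(-λ).
poisson = [math.exp(-lam)]
for x in range(8):
    poisson.append(poisson[-1] * lam / (x + 1))

# Spot-check against the closed-form PMFs.
assert abs(binom[3] - math.comb(n, 3) * p**3 * (1 - p) ** (n - 3)) < 1e-12
assert abs(poisson[4] - math.exp(-lam) * lam**4 / math.factorial(4)) < 1e-12
print([round(v, 4) for v in binom[:5]], [round(v, 4) for v in poisson[:5]])
```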
5. Normal Distribution (Continuous)
| Property | Formula |
| --- | --- |
| PDF | f(x) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left[ -\frac{(x-\mu)^2}{2\sigma^2} \right] |
| Standard Normal (Z) | Z = \frac{X - \mu}{\sigma} \sim N(0,1) |
| Mean | \mu |
| Variance | \sigma^2 |
| Symmetry | f(\mu + k) = f(\mu - k) |
| Linear transformation | aX + b \sim N(a\mu + b, a^2\sigma^2) |
Important probabilities (memorize or use table; see the sketch after this list):
- P(–1 ≤ Z ≤ 1) ≈ 0.6827 (68%)
- P(–2 ≤ Z ≤ 2) ≈ 0.9545 (95%)
- P(–3 ≤ Z ≤ 3) ≈ 0.9973 (99.7%)
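These values can be reproduced with the standard normal CDF Φ(z) = ½[1 + erf(z/√2)]; the sketch below also standardises a hypothetical X ∼ N(50, 5²) as an example of Z = (X − μ)/σ:

```python
import math

# Standard normal CDF via the error function: Φ(z) = 0.5 * [1 + erf(z / √2)].
def phi(z: float) -> float:
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Reproduce the 68-95-99.7 values from the list above.
for k in (1, 2, 3):
    prob = phi(k) - phi(-k)          # P(-k ≤ Z ≤ k)
    print(f"P(-{k} <= Z <= {k}) ≈ {prob:.4f}")

# Standardising a general X ~ N(μ, σ²): P(X ≤ x) = Φ((x - μ)/σ), e.g. μ = 50, σ = 5.
print(f"P(X <= 58) ≈ {phi((58 - 50) / 5):.4f}")
```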
Summary Table of All Key Distributions
| Distribution | Parameters | Mean | Variance | PMF / PDF | MGF (if required) |
| --- | --- | --- | --- | --- | --- |
| Binomial B(n, p) | n, p | np | np(1–p) | \binom{n}{x} p^x q^{n-x}, where q = 1 – p | (q + p e^t)^n |
| Poisson(λ) | λ | λ | λ | e^{-\lambda} \lambda^x / x! | \exp[\lambda(e^t - 1)] |
| Normal N(μ, σ²) | μ, σ² | μ | σ² | \frac{1}{\sigma\sqrt{2\pi}} \exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right] | \exp(\mu t + \sigma^2 t^2 / 2) |
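A small numerical check that the MGFs above reproduce the tabulated means via M′(0) = E(X), using a central difference at t = 0; the parameters λ = 2, n = 10, p = 0.3 are illustrative:

```python
import math

lam, n, p = 2.0, 10, 0.3
q = 1 - p
h = 1e-6  # step for the central-difference derivative

def mgf_pois(t: float) -> float:
    return math.exp(lam * (math.exp(t) - 1))      # Poisson MGF

def mgf_binom(t: float) -> float:
    return (q + p * math.exp(t)) ** n             # Binomial MGF

mean_pois = (mgf_pois(h) - mgf_pois(-h)) / (2 * h)     # ≈ λ
mean_binom = (mgf_binom(h) - mgf_binom(-h)) / (2 * h)  # ≈ np
print(f"Poisson mean ≈ {mean_pois:.4f} (λ = {lam}), Binomial mean ≈ {mean_binom:.4f} (np = {n * p})")
```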
Quick Revision Formulas (Most Frequently Asked)
| Concept | Formula |
| --- | --- |
| Total Probability | P(B) = Σ P(B\|A_i) P(A_i) |
| Bayes’ Theorem | P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)} |
| E(XY) for independent X, Y | E(X) E(Y) |
| Binomial mean & variance | np, npq |
| Poisson approximation to Binomial | n → ∞, p → 0, np = λ constant ⇒ Poisson(λ) |
| Normal approximation to Binomial | X ∼ B(n, p) ≈ N(np, npq) when n is large, np > 5, nq > 5 |
| Continuity correction | P(X = k) ≈ P(k – 0.5 < Y < k + 0.5) where Y ∼ Normal |
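A brief sketch of the normal approximation with continuity correction, comparing the exact binomial P(X = k) with the corrected normal estimate for the illustrative values n = 40, p = 0.4, k = 18:

```python
import math

# Normal approximation to Binomial with continuity correction (illustrative n, p).
n, p = 40, 0.4
q = 1 - p
mu, sigma = n * p, math.sqrt(n * p * q)      # Y ~ N(np, npq)

def phi(z: float) -> float:                  # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

k = 18
exact = math.comb(n, k) * p**k * q**(n - k)                          # exact binomial P(X = k)
approx = phi((k + 0.5 - mu) / sigma) - phi((k - 0.5 - mu) / sigma)   # continuity-corrected normal
print(f"exact P(X = {k}) = {exact:.4f}, normal approximation = {approx:.4f}")
```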
These are the essential formulas and concepts from Module IV (Probability & Distributions) as covered in most engineering/mathematics syllabi. For best exam performance, focus on solving numerical problems on Bayes’ theorem, expectation-variance calculations, and the identification and application of the Binomial, Poisson, and Normal distributions.