Study/Lecture - Basic

W7.L1-2. Bayesian Network - Probability Concepts

공부해라이 2023. 5. 7. 21:55

| Introduction

Graphical model: expresses the relationships among random variables

 

Conditional probability

 

$$ P(\textup{H}=\textup{True} \mid \textup{F}=\textup{True}) = \frac{P(\textup{H}=\textup{True}, \textup{F}=\textup{True})}{P(\textup{F}=\textup{True})} $$

 

We are mainly interested in "predicting what will happen" based on "the information we already have".
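As a minimal sketch, the conditional probability above can be computed directly from a joint table. The joint values below are made-up illustration numbers, not from the lecture:

```python
# Hypothetical joint table over (H, F); values are made up for illustration.
joint = {
    (True, True): 0.375,   # P(H=True,  F=True)
    (True, False): 0.25,   # P(H=True,  F=False)
    (False, True): 0.125,  # P(H=False, F=True)
    (False, False): 0.25,  # P(H=False, F=False)
}

p_f_true = sum(p for (h, f), p in joint.items() if f)  # P(F=True) = 0.5
p_h_given_f = joint[(True, True)] / p_f_true           # P(H=True | F=True)
print(p_h_given_f)  # 0.375 / 0.5 = 0.75
```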

 

 

| Law of Total Probability

Law of total probability: "summing out", "marginalization"

 

$$P(a) = \sum_{b} P(a, b) = \sum_{b} P(a \mid b) \cdot P(b) $$

$P(a) = \sum_{b} P(a, b) = P(a, b=\textup{True}) + P(a, b=\textup{False})$
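A quick sketch of summing out a binary variable $b$ to obtain $P(a)$, using an arbitrary made-up joint table:

```python
# Made-up joint over two binary variables (a, b), for illustration only.
joint = {
    (True, True): 0.25,   # P(a=True,  b=True)
    (True, False): 0.25,  # P(a=True,  b=False)
    (False, True): 0.125,
    (False, False): 0.375,
}

# P(a=True) = P(a=True, b=True) + P(a=True, b=False)
p_a_true = sum(p for (a, b), p in joint.items() if a)
print(p_a_true)  # 0.25 + 0.25 = 0.5
```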

 

 

Given a joint distribution $P(a, b, c, d)$

We can obtain any "marginal probability" (e.g. $P(b)$) by summing out the other variables

 

$P(b) = \sum_{a} \sum_{c} \sum_{d} P(a, b, c, d) $
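The same idea works over four variables: sum out $a$, $c$, $d$ to get $P(b)$. The joint here is an arbitrary normalized table, purely for illustration:

```python
from itertools import product

# Arbitrary positive values over all 2^4 assignments, normalized to sum to 1.
vals = {bits: i + 1 for i, bits in enumerate(product([False, True], repeat=4))}
total = sum(vals.values())
joint = {bits: v / total for bits, v in vals.items()}  # (a, b, c, d) -> prob

# P(b=True) = sum_a sum_c sum_d P(a, b=True, c, d)
p_b_true = sum(p for (a, b, c, d), p in joint.items() if b)
print(p_b_true)
```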

 

Given a joint distribution $P(a, b, c, d)$

We can obtain any "conditional probability" of interest

 

$P(c \mid b) = \sum_{a} \sum_{d} P(a, c, d \mid b) = \frac{1}{P(b)} \sum_{a} \sum_{d} P(a, c, d, b) $

 

where $\frac{1}{P(b)}$ is just a normalization constant
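A sketch of obtaining a conditional from the joint: $P(c \mid b)$ by summing out $a$ and $d$, then dividing by $P(b)$. The joint table below is an arbitrary normalized example:

```python
from itertools import product

# Arbitrary normalized joint over (a, b, c, d), for illustration only.
vals = {bits: i + 1 for i, bits in enumerate(product([False, True], repeat=4))}
total = sum(vals.values())
joint = {bits: v / total for bits, v in vals.items()}  # (a, b, c, d) -> prob

def p_c_given_b(c, b):
    # numerator: sum_a sum_d P(a, b, c, d)
    num = sum(p for (a, bb, cc, d), p in joint.items() if bb == b and cc == c)
    # denominator P(b): the 1/P(b) factor is just a normalization constant
    p_b = sum(p for (a, bb, cc, d), p in joint.items() if bb == b)
    return num / p_b

# Because 1/P(b) normalizes, the conditional sums to 1 over the values of c:
print(p_c_given_b(True, True) + p_c_given_b(False, True))  # 1.0
```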

 

 

The joint distribution contains all the information we need to compute any probability of interest

 

If we know the joint (e.g. $P(a, b)$), we can also obtain each individual probability (e.g. $P(a)$ or $P(b)$).

And since the joint gives us the individual probabilities, it gives us the conditional probabilities as well.

(However, the joint distribution requires an exponentially growing number of parameters: a table for $P(a, b, c, d)$ over binary variables already has $2^4$ entries...)
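A rough sketch of that parameter blow-up: a full joint over $n$ binary variables has $2^n$ entries ($2^n - 1$ free parameters, since they must sum to 1), while $n$ mutually independent binary variables need only $n$ parameters:

```python
# Free parameters of a full joint over n binary variables vs. n independent ones.
def joint_params(n):
    return 2 ** n - 1  # 2^n table entries, minus 1 for the sum-to-one constraint

def independent_params(n):
    return n  # one Bernoulli parameter per variable

for n in (4, 10, 20):
    print(n, joint_params(n), independent_params(n))
```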

 

 

| Chain Rule (Factorization)

 

By the definition of conditional probability

 

$$ P(a,\ b,\ c,\ ...\ ,\ z) = P(a\mid b,\ c,\ ...\ ,\ z)\cdot P(b,\ c,\ ...\ ,\ z) $$

$$ P(a,\ b,\ c,\ ...\ ,\ z) = P(a\mid b,\ c,\ ...\ ,\ z)\cdot P(b \mid c,\ ...\ ,\ z)\cdot P(c \mid \ ...\ ,\ z) \ ...\  P(z) $$
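The factorization can be checked numerically on a small case, $P(a,b,c) = P(a \mid b,c)\cdot P(b \mid c)\cdot P(c)$, with a made-up joint table:

```python
from itertools import product

# Arbitrary normalized joint over three binary variables, for illustration only.
vals = {bits: i + 1 for i, bits in enumerate(product([False, True], repeat=3))}
total = sum(vals.values())
joint = {bits: v / total for bits, v in vals.items()}  # (a, b, c) -> prob

ok = True
for (a, b, c), p_abc in joint.items():
    p_c = sum(p for x, p in joint.items() if x[2] == c)                 # P(c)
    p_bc = sum(p for x, p in joint.items() if x[1] == b and x[2] == c)  # P(b, c)
    # P(a|b,c) * P(b|c) * P(c) should reproduce P(a,b,c)
    chain = (p_abc / p_bc) * (p_bc / p_c) * p_c
    ok = ok and abs(chain - p_abc) < 1e-12
print(ok)  # True
```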

 

| Independence

 

Variables A and B are independent if

 

$$ P(A \mid B) = P(A) $$

$$ P(A, B) = P(A) \cdot P(B) $$

$$ P(B \mid A) = P(B) $$

 

Example: $n$ coin flips

$$ P(C_{1}, ... , C_{n}) = \prod_{i=1}^{n}P(C_{i}) $$
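Because the flips are independent, the joint factorizes into a product of marginals; for instance, $P(\text{all } n \text{ flips are heads}) = 0.5^n$ for fair coins (the fairness is an assumption for illustration):

```python
# Product of marginals for n independent fair coin flips.
n, p_head = 10, 0.5
p_all_heads = 1.0
for _ in range(n):
    p_all_heads *= p_head  # multiply P(C_i = heads) for each independent flip
print(p_all_heads)  # 0.5 ** 10 = 0.0009765625
```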

 

 

Marginal independence

A and B are marginally independent:

$$ P(A \mid B) = P(A) $$

 

Conditional independence

A and B are conditionally independent given C.

C ... Commander (in the lecture's example, once the commander C's order is known, the two officers A and B act independently)

$$ P (A \mid C) = P(A \mid B, \ C) $$
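As a sketch, we can build a joint that is conditionally independent by construction, $P(a,b,c) = P(c)\,P(a \mid c)\,P(b \mid c)$, and check numerically that $P(A \mid C) = P(A \mid B, C)$. All numbers below are made up for illustration:

```python
from itertools import product

# Made-up conditional tables; joint is P(c) * P(a|c) * P(b|c) by construction.
p_c = {True: 0.6, False: 0.4}
p_a_c = {True: 0.9, False: 0.2}   # P(a=True | c)
p_b_c = {True: 0.7, False: 0.1}   # P(b=True | c)

def bern(p, x):
    # P(x) for a binary variable with P(True) = p
    return p if x else 1 - p

joint = {
    (a, b, c): p_c[c] * bern(p_a_c[c], a) * bern(p_b_c[c], b)
    for a, b, c in product([False, True], repeat=3)
}

def cond(a, c, b=None):
    # P(a | c) if b is None, else P(a | b, c), both read off the joint table
    num = sum(p for (aa, bb, cc), p in joint.items()
              if aa == a and cc == c and (b is None or bb == b))
    den = sum(p for (aa, bb, cc), p in joint.items()
              if cc == c and (b is None or bb == b))
    return num / den

# Knowing B adds nothing once C is given: P(A|C) == P(A|B,C)
print(abs(cond(True, True) - cond(True, True, b=True)) < 1e-12)  # True
```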

 

 

 

 

Reference
Prof. Il-Chul Moon's lecture
https://www.youtube.com/watch?v=mnUcZbT5E28&list=PLbhbGI_ppZISMV4tAWHlytBqNq1-lb8bz&index=40