

W7.L3. Bayesian Network - Interpretation of Bayesian Network

| Detour: Naive Bayes Classifier

Given

- Class Prior $ P(Y) $

- $d$ features $X_1, \dots, X_d$ that are conditionally independent given the class $Y$: for each $X_i$, we have the likelihood $P(X_i \mid Y)$

 

$$ f_{NB}(x) = \underset{Y=y}{\operatorname{argmax}}\ P(Y=y) \cdot \prod_{i} P(X_i = x_i \mid Y = y) $$

"Y 가 Given 인 상황에서 개별 Feature X 들은 Conditional independent 하다라는 Naive 한 가정"

 

Graphical representation

| Bayesian Network

 

A graphical notation of

- Random variables

- Conditional independence

- To obtain a compact representation of the full joint distribution

 

Syntax

- An acyclic, directed graph (a DAG)

- A set of nodes, each representing

  * A random variable

  * A conditional distribution given its parents

  * $P(X_i \mid \textup{Parents}(X_i)) $

- A set of links

  * Direct influence from the parent to the child
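Taken together, this syntax encodes how the full joint distribution factorizes into one local conditional distribution per node (the standard Bayesian network factorization):

$$ P(X_1, \dots, X_n) = \prod_{i=1}^{n} P(X_i \mid \textup{Parents}(X_i)) $$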

 

| Interpretation of Bayesian Network

 

 

Weather is independent of the other variables

Toothache and Stench are conditionally independent given Cavity

Cavity influences the probability of Toothache and Stench
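To make this concrete, the sketch below evaluates one entry of the joint distribution by multiplying the local factors along the graph (Weather is a root node; Toothache and Stench each have Cavity as their only parent). All CPT values are hypothetical placeholders, not numbers from the lecture.

```python
# Hypothetical CPTs for the Weather / Cavity / Toothache / Stench network.
P_weather = {"sunny": 0.7, "rainy": 0.3}              # P(Weather)
P_cavity = {True: 0.2, False: 0.8}                    # P(Cavity)
P_toothache_given_cavity = {True: 0.6, False: 0.1}    # P(Toothache=True | Cavity)
P_stench_given_cavity = {True: 0.5, False: 0.05}      # P(Stench=True | Cavity)

def joint(weather, cavity, toothache, stench):
    """P(W, C, T, S) = P(W) * P(C) * P(T | C) * P(S | C), following the graph."""
    p_t = P_toothache_given_cavity[cavity]
    p_s = P_stench_given_cavity[cavity]
    return (P_weather[weather]
            * P_cavity[cavity]
            * (p_t if toothache else 1.0 - p_t)
            * (p_s if stench else 1.0 - p_s))

print(joint("sunny", True, True, False))  # 0.7 * 0.2 * 0.6 * 0.5 = 0.042
```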

 

 

| Components of Bayesian Network

 

Both structural and numerical information are used together.

 

Qualitative components (the structural aspect)

- Prior knowledge of causal relations

- Learning from data

- Frequently used structures

 

Quantitative components (the numerical aspect)

- Conditional probability tables (CPTs)

- A probability distribution assigned to each node

 

Computing a probability uses both (see the worked example below)

- The qualitative components (structure)

- The quantitative components (CPTs)
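For example, marginalizing out a parent combines the structure (which variable to sum over) with the table entries. Using the hypothetical CPT numbers from the sketch above:

$$ P(\text{Toothache} = \text{true}) = \sum_{c} P(\text{Cavity} = c) \cdot P(\text{Toothache} = \text{true} \mid \text{Cavity} = c) = 0.2 \cdot 0.6 + 0.8 \cdot 0.1 = 0.2 $$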

 

 

 

Reference
Professor Il-Chul Moon's lecture
https://www.youtube.com/watch?v=mnUcZbT5E28&list=PLbhbGI_ppZISMV4tAWHlytBqNq1-lb8bz&index=41