| Detour: Naive Bayes Classifier
Given
- Class Prior $ P(Y) $
- $ d $ features $X_1, \dots, X_d$ that are conditionally independent given the class $Y$; for each $X_i$ we have the likelihood $P(X_i \mid Y)$
$$ f_{NB}(x) = \underset{Y=y}{\operatorname{argmax}}\ P(Y=y) \cdot \prod_{i} P(X_i = x_i \mid Y = y) $$
"Y 가 Given 인 상황에서 개별 Feature X 들은 Conditional independent 하다라는 Naive 한 가정"
Graphical representation
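As a concrete illustration of the decision rule above, here is a minimal sketch in Python. The spam/ham classes, the three binary features, and all probability values are invented toy numbers, not taken from the lecture.

```python
import numpy as np

# Toy Naive Bayes setup: class prior P(Y) and per-feature likelihoods P(X_i = 1 | Y).
# All classes, features, and numbers are made up purely to illustrate the argmax rule.
prior = {"spam": 0.4, "ham": 0.6}                       # P(Y = y)
likelihood = {                                          # P(X_i = 1 | Y = y), binary features
    "spam": np.array([0.8, 0.1, 0.7]),
    "ham":  np.array([0.2, 0.5, 0.3]),
}

def f_nb(x):
    """Return argmax_y P(Y = y) * prod_i P(X_i = x_i | Y = y)."""
    x = np.asarray(x)
    scores = {}
    for y, p_y in prior.items():
        p_xi = np.where(x == 1, likelihood[y], 1.0 - likelihood[y])  # P(X_i = x_i | Y = y)
        scores[y] = p_y * np.prod(p_xi)
    return max(scores, key=scores.get)

print(f_nb([1, 0, 1]))  # -> 'spam' with these toy numbers (0.2016 vs. 0.018)
```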
| Bayesian Network
A graphical notation of
- Random variables
- Conditional independence
- To obtain a compact representation of the full joint distribution
Syntax
- An acyclic, directed graph
- A set of nodes
* A random variable
* A conditional distribution given its parents (see the sketch after this list)
* $P(X_i \mid \textup{Parents}(X_i)) $
- A set of links
* Direct influence from the parent to the child
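As a rough illustration of this syntax, each node can be stored with its parent list and a conditional probability table keyed by the parents' values. The dictionary layout and the Cavity/Toothache/Stench numbers below are my own toy choices, not a standard API.

```python
# Each node: its parents plus a CPT giving P(node = True | parent values).
# Root nodes are keyed by the empty tuple. All numbers are toy values.
nodes = {
    "Cavity":    {"parents": [],         "cpt": {(): 0.2}},
    "Toothache": {"parents": ["Cavity"], "cpt": {(True,): 0.6, (False,): 0.1}},
    "Stench":    {"parents": ["Cavity"], "cpt": {(True,): 0.7, (False,): 0.05}},
}

def prob(var, value, assignment):
    """P(var = value | Parents(var)), with parent values read from `assignment`."""
    node = nodes[var]
    key = tuple(assignment[p] for p in node["parents"])
    p_true = node["cpt"][key]
    return p_true if value else 1.0 - p_true

print(prob("Toothache", True, {"Cavity": True}))  # -> 0.6
```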
| Interpretation of Bayesian Network
Weather is independent of the other variables
Toothache and Stench are conditionally independent given Cavity
Cavity influences the probability of Toothache and Stench
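These conditional-independence statements translate directly into how probabilities are computed: once Cavity is fixed, the conditional joint of Toothache and Stench is just the product of the two individual conditionals, and Weather does not appear at all. A small self-contained sketch with the same toy numbers as above:

```python
# Toy CPT values: P(Toothache = True | Cavity) and P(Stench = True | Cavity).
p_toothache_given_cavity = {True: 0.6, False: 0.1}
p_stench_given_cavity    = {True: 0.7, False: 0.05}

def p_t_and_s_given_c(t, s, c):
    """P(Toothache = t, Stench = s | Cavity = c) under conditional independence given Cavity."""
    p_t = p_toothache_given_cavity[c] if t else 1 - p_toothache_given_cavity[c]
    p_s = p_stench_given_cavity[c] if s else 1 - p_stench_given_cavity[c]
    return p_t * p_s  # factors apart because Toothache and Stench are independent given Cavity

print(p_t_and_s_given_c(True, True, True))  # 0.6 * 0.7 = 0.42
```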
| Components of Bayesian Network
Structural and numerical information are used together
Qualitative components
- Prior knowledge of causal relations
- Learning from data
- Frequently used structures
- Structural aspects
Quantitative components
- Conditional probability tables
- Probability distribution assigned to nodes
Computing probabilities involves both components (see the sketch after this list)
- Qualitative components
- Quantitative components
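A minimal sketch of how both components enter a probability computation: the qualitative part is the parent structure, the quantitative part is the CPT entries, and one entry of the full joint is the product of each node's conditional given its parents. The network and numbers are the same toy example as above, with an assumed prior for Weather.

```python
from itertools import product

# Qualitative component: the graph structure (who the parents are).
parents = {"Weather": [], "Cavity": [], "Toothache": ["Cavity"], "Stench": ["Cavity"]}

# Quantitative component: CPT entries P(node = True | parent values) -- toy numbers.
cpt = {
    "Weather":   {(): 0.3},
    "Cavity":    {(): 0.2},
    "Toothache": {(True,): 0.6, (False,): 0.1},
    "Stench":    {(True,): 0.7, (False,): 0.05},
}

def joint(assignment):
    """P(assignment) = product over nodes of P(node | Parents(node))."""
    p = 1.0
    for node, pa in parents.items():
        key = tuple(assignment[q] for q in pa)
        p_true = cpt[node][key]
        p *= p_true if assignment[node] else 1.0 - p_true
    return p

# One entry of the full joint, plus a sanity check that all entries sum to 1.
print(joint({"Weather": True, "Cavity": True, "Toothache": True, "Stench": False}))  # 0.0108
total = sum(joint(dict(zip(parents, vals))) for vals in product([True, False], repeat=len(parents)))
print(round(total, 6))  # -> 1.0
```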
Reference
Lecture by Professor Il-Chul Moon (문일철)
https://www.youtube.com/watch?v=mnUcZbT5E28&list=PLbhbGI_ppZISMV4tAWHlytBqNq1-lb8bz&index=41
'Study > Lecture - Basic' 카테고리의 다른 글
W7.L5. Bayesian Network - Factorization of Bayesian Networks (0) | 2023.06.05 |
---|---|
W7.L4. Bayesian Network - Bayes Ball Algorithm (0) | 2023.06.05 |
W7.L1-2. Bayesian Network - Probability Concepts (0) | 2023.05.07 |
W6.L1-7. Training Testing and Regularization (0) | 2023.05.07 |
W5.L1-9. Support Vector Machine (0) | 2023.05.06 |