The most incomprehensible thing about the world is that it is comprehensible
Albert Einstein
Bayesian Cognition
Julien Diard (CNRS, Laboratoire de Psychologie et NeuroCognition)
Pierre Bessière (CNRS, Laboratoire de Physiologie de la Perception et de l'Action)
Probabilistic models of action, perception, inference, decision and learning
To get more information
Bayesian-Programming.org
ftp://ftp-serv.inrialpes.fr/pub/emotion/bayesian-programming/Cours
http://diard.wordpress.com [email protected]
Plan / planning
• Bessière, c1, 15/11 – Incompleteness, uncertainty, Bayesian program, Bayesian inference
• Diard, c2 29/11, c3 13/12, c4 03/01 – Bayesian models in robotics and cognitive science
• Diard, c5 10/01 – Model selection, machine learning, model distinguishability
• Bessière, c6 17/01 – Complements: inference algorithms, maximum entropy
Perception test
Daniel J. Simons & Christopher Chabris, Harvard University
http://www.youtube.com/watch?v=ubNF9QNEQLA
http://nivea.psycho.univ-paris5.fr/demos/BONETO.MOV
http://viscog.beckman.illinois.edu/flashmovie/12.php
Probability Theory as an alternative to Logic
The actual science of logic is conversant at present only with things either certain, impossible, or entirely doubtful, none of which (fortunately) we have to reason on. Therefore the true logic for this world is the calculus of Probabilities, which takes account of the magnitude of the probability which is, or ought to be, in a reasonable man's mind.
James Clerk Maxwell
Incompleteness and Uncertainty
A very small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we say that the effect is due to chance.
H. Poincaré
Shape from Image
Shape from Motion
Colas, F., Droulez, J., Wexler, M. & Bessière, P. (2008). Unified probabilistic model of perception of three-dimensional structure from optic flow. Biological Cybernetics (in press).
Colas, F. (2006). Perception des objets en mouvement : Composition bayésienne du flux optique et du mouvement de l'observateur. Thèse INPG.
Credit card fraud detection
Beam-in-the-Bin experiment (Set-up)
Beam-in-the-Bin experiment (Results)
Logical Paradigm
[Diagram: a classical program AvoidObs(O1), begin ... end, linking percepts P to actions A, annotated "Incompleteness"]
Bayesian Paradigm
[Diagram: the "Avoid Obstacle" behavior as a probabilistic description: preliminary knowledge (C) and experimental data (D) yield the joint $P(M \land S \mid D \land C)$ relating sensor variables $S$ and motor variables $M$; the behavior is used through the question $P(M \mid S \land D \land C)$]
Principle
Incompleteness → Uncertainty
Preliminary Knowledge + Experimental Data = Probabilistic Representation (Bayesian learning)
Probabilistic Representation → Decision (Bayesian inference)

$P(a) + P(\lnot a) = 1$

$P(a \land b) = P(a) \, P(b \mid a) = P(b) \, P(a \mid b)$
Thesis
Probabilistic inference and learning theory, considered as a model of reasoning, is a new paradigm (an alternative to logic) to explain and understand perception, inference, decision, learning and action.

Probability theory is nothing but common sense reduced to calculation.
Marquis Pierre-Simon de Laplace

The actual science of logic is conversant at present only with things either certain, impossible, or entirely doubtful, none of which (fortunately) we have to reason on. Therefore the true logic for this world is the calculus of Probabilities, which takes account of the magnitude of the probability which is, or ought to be, in a reasonable man's mind.
James Clerk Maxwell

By inference we mean simply: deductive reasoning whenever enough information is at hand to permit it; inductive or probabilistic reasoning when - as is almost invariably the case in real problems - all the necessary information is not available. Thus the topic of "Probability as Logic" is the optimal processing of uncertain and incomplete knowledge.
E.T. Jaynes

Subjectivist vs objectivist epistemology of probabilities?
A water treatment unit (1)
A water treatment unit (2)
$[I_0 = 2] \land [I_1 = 8] \land [C = 0] \land [H = 0]$

$Q = \mathrm{Int}\left(\frac{I_0 + I_1 + F}{3}\right) \qquad O^* = \mathrm{Int}\left(\frac{I_0 + I_1 + 10}{3}\right)$
A water treatment unit (3)
$[I_0 = 2] \land [I_1 = 8] \land [F = 8] \land [H = 0]$

$\alpha = \mathrm{Int}\left(\frac{I_0 + I_1 + F + C - H}{3}\right)$

$O = \begin{cases} \alpha & \text{if } 0 \le \alpha \le O^* \\ 2O^* - \alpha & \text{if } \alpha > O^* \\ 0 & \text{otherwise} \end{cases}$
A water treatment unit (4)
$[I_0 = 2] \land [I_1 = 8] \land [F = 8] \land [C = 0]$

$\alpha = \mathrm{Int}\left(\frac{I_0 + I_1 + F + C - H}{3}\right)$

$O = \begin{cases} \alpha & \text{if } 0 \le \alpha \le O^* \\ 2O^* - \alpha & \text{if } \alpha > O^* \\ 0 & \text{otherwise} \end{cases}$
A water treatment unit (5)
$S = \mathrm{Int}\left(\frac{I_0 + F}{2}\right)$
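The equations of the water treatment unit above ($Q$, $O^*$, $\alpha$, $O$ and $S$) are purely deterministic, so they can be transcribed directly. A minimal C++ sketch follows, assuming $\mathrm{Int}(\cdot)$ means truncation toward zero; the function names and the illustrative range of the control $C$ are mine, not the course's:

#include <iostream>

int Int(double x) { return static_cast<int>(x); }   // assumed: truncation

int Q(int i0, int i1, int f)  { return Int((i0 + i1 + f) / 3.0); }   // quality
int Ostar(int i0, int i1)     { return Int((i0 + i1 + 10) / 3.0); }  // O*

// O: the piecewise output definition from the slides
int O(int i0, int i1, int f, int c, int h) {
    int oStar = Ostar(i0, i1);
    int alpha = Int((i0 + i1 + f + c - h) / 3.0);
    if (alpha >= 0 && alpha <= oStar) return alpha;
    if (alpha > oStar)                return 2 * oStar - alpha;
    return 0;
}

// S: the (inaccurate) sensor reading
int S(int i0, int f) { return Int((i0 + f) / 2.0); }

int main() {
    // The slides' running case: I0 = 2, I1 = 8, F = 8, H = 0;
    // sweep the control C over an illustrative range
    for (int c = 0; c <= 10; ++c)
        std::cout << "C = " << c << "  O = " << O(2, 8, 8, c, 0) << "\n";
}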
Uncertainty on O due to inaccuracy on S
$[I_0 = 2] \land [I_1 = 8] \land [C = 2] \land [H = 0]$
Uncertainty due to the hidden variable H
$[I_0 = 2] \land [I_1 = 8] \land [C = 2]$
Not taking into account the effect of hidden variables may lead to wrong decisions (1)
$[I_0 = 2] \land [I_1 = 8] \land [F = 8]$
Not taking into account the effect of hidden variables may lead to wrong decisions (2)
$[I_0 = 2] \land [I_1 = 8] \land [F = 8]$
Principle
Incompleteness → Uncertainty
Preliminary Knowledge + Experimental Data = Probabilistic Representation (Bayesian learning)
Probabilistic Representation → Decision (Bayesian inference)

$P(a) + P(\lnot a) = 1$

$P(a \land b) = P(a) \, P(b \mid a) = P(b) \, P(a \mid b)$
Basic Concepts
Far better an approximate answer to the right question, which is often vague, than an exact answer to the wrong question, which can always be made precise.
John W. Tukey
Bayesian Spam Detection
• Classify texts into two categories, "spam" or "non-spam"
  – Only available information: a set of words
• Adapt to the user and learn from experience
Variable
$Spam$

$W_0, W_1, \ldots, W_{N-1}$
Probability
$P([Spam = \mathrm{false}]) = 0.25$

$P([Spam = \mathrm{true}]) = 0.75$
Normalization postulate
$P([Spam = \mathrm{false}]) + P([Spam = \mathrm{true}]) = 1.0$

$\sum_{x \in X} P([X = x]) = 1.0$

$\sum_X P(X) = 1.0$
Conditional probability
$P([W_n = \mathrm{false}] \mid [Spam = \mathrm{true}]) = 0.9996$

$P([W_n = \mathrm{true}] \mid [Spam = \mathrm{true}]) = 0.0004$

$\sum_X P(X \mid Y) = 1.0$
Variable conjunction
$P(Spam \land W_n)$

$P(Spam \land W_0 \land \ldots \land W_n \land \ldots \land W_{N-1})$
Conjunction postulate
$P(X \land Y) = P(X) \times P(Y \mid X) = P(Y) \times P(X \mid Y)$

$P(X \mid Y) = \frac{P(X) \times P(Y \mid X)}{P(Y)}$

$P([Spam = \mathrm{true}] \land [W_n = \mathrm{true}]) = P([Spam = \mathrm{true}]) \times P([W_n = \mathrm{true}] \mid [Spam = \mathrm{true}]) = 0.75 \times 0.0004 = 0.0003 = P([W_n = \mathrm{true}]) \times P([Spam = \mathrm{true}] \mid [W_n = \mathrm{true}])$
Syllogisms
• Logical syllogisms:
  – Modus ponens: $a \land [a \Rightarrow b] \vdash b$
  – Modus tollens: $\lnot b \land [a \Rightarrow b] \vdash \lnot a$
• Probabilistic syllogisms:
  – Modus ponens: $P(b \mid a) = 1$
  – Modus tollens: $P(b \mid a) = 1 \Leftrightarrow P(\lnot a \mid \lnot b) = 1$
  – Weaker forms: $P(b \mid a) = 1 \Rightarrow P(a \mid b) \ge P(a)$ and $P(b \mid a) = 1 \Rightarrow P(b \mid \lnot a) \le P(b)$

Example: $a \equiv$ "$x$ divisible by 9", $b \equiv$ "$x$ divisible by 3"
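A worked instance of the two weaker syllogisms, using the slide's propositions and assuming, for illustration, that $x$ is drawn uniformly from $\{1, \ldots, 9\}$: then $P(a) = \frac{1}{9}$, $P(b) = \frac{1}{3}$ and $P(b \mid a) = 1$. Learning $b$ makes $a$ more plausible, $P(a \mid b) = \frac{P(a \land b)}{P(b)} = \frac{1/9}{1/3} = \frac{1}{3} \ge P(a)$, and learning $\lnot a$ makes $b$ less plausible, $P(b \mid \lnot a) = \frac{P(b \land \lnot a)}{P(\lnot a)} = \frac{2/9}{8/9} = \frac{1}{4} \le P(b)$.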
Marginalization rule
$\sum_X P(X \land Y) = P(Y)$
Joint distribution and questions (1)
$P(Y) = \sum_X P(X \land Y) \qquad P(X) = \sum_Y P(X \land Y)$

$P(Y \mid X) = \frac{P(X \land Y)}{\sum_Y P(X \land Y)} \qquad P(X \mid Y) = \frac{P(X \land Y)}{\sum_X P(X \land Y)}$
Joint distribution and questions (2)
$P(Spam \land W_0 \land \ldots \land W_n \land \ldots \land W_{N-1})$

$P(Spam) = \sum_{W_0 \land \ldots \land W_{N-1}} P(Spam \land W_0 \land \ldots \land W_{N-1})$

$P(W_n) = \sum_{Spam \land W_{i \ne n}} P(Spam \land W_0 \land \ldots \land W_{N-1})$
Joint distribution and questions (3)
$P(W_n \mid [Spam = \mathrm{true}]) = \frac{\sum_{W_{i \ne n}} P(Spam \land W_0 \land \ldots \land W_{N-1})}{\sum_{W_0 \land \ldots \land W_{N-1}} P(Spam \land W_0 \land \ldots \land W_{N-1})}$

$P(Spam \mid W_0 \land \ldots \land W_{N-1}) = \frac{P(Spam \land W_0 \land \ldots \land W_{N-1})}{\sum_{Spam} P(Spam \land W_0 \land \ldots \land W_{N-1})}$
Decomposition
$P(Spam \land W_0 \land \ldots \land W_{N-1}) = P(Spam) \times P(W_0 \mid Spam) \times P(W_1 \mid W_0 \land Spam) \times \ldots \times P(W_{N-1} \mid W_{N-2} \land \ldots \land W_0 \land Spam)$

Assuming each word is conditionally independent of the other words given $Spam$, this simplifies to:

$P(Spam \land W_0 \land \ldots \land W_{N-1}) = P(Spam) \times \prod_{n=0}^{N-1} P(W_n \mid Spam)$
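The practical gain of this conditional-independence assumption can be counted directly: the full joint over $Spam$ and $N$ binary words has $2^{N+1}$ entries ($2^{N+1} - 1$ free parameters), while the decomposed form needs only $1 + 2N$ parameters: one for $P(Spam)$ and two for each $P(W_n \mid Spam)$, one per value of $Spam$.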
Bayesian Network
Parametric Forms (1)
$P([Spam = \mathrm{false}]) = 0.25$

$P(W_n \mid [Spam = \mathrm{false}]) = \frac{\mathit{Nbre}_f^n}{\mathit{Nbre}_f} \qquad P(W_n \mid [Spam = \mathrm{true}]) = \frac{\mathit{Nbre}_t^n}{\mathit{Nbre}_t}$

$P(Spam \land W_0 \land \ldots \land W_{N-1}) = P(Spam) \times \prod_{n=0}^{N-1} P(W_n \mid Spam)$
Parametric Forms (2)
$P(Spam \land W_0 \land \ldots \land W_{N-1}) = P(Spam) \times \prod_{n=0}^{N-1} P(W_n \mid Spam)$

$P([Spam = \mathrm{false}]) = 0.25$

$P(W_n \mid [Spam = \mathrm{false}]) = \frac{1 + \mathit{Nbre}_f^n}{2 + \mathit{Nbre}_f} \qquad P(W_n \mid [Spam = \mathrm{true}]) = \frac{1 + \mathit{Nbre}_t^n}{2 + \mathit{Nbre}_t}$
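As an illustration (with made-up counts): if word $n$ appeared in 10 of the 100 spam messages identified so far, $P([W_n = \mathrm{true}] \mid [Spam = \mathrm{true}]) = \frac{1 + 10}{2 + 100} \approx 0.108$; a word never yet seen in spam gets $\frac{1}{102} \approx 0.01$ rather than 0, so a single new word can never veto the classification. This is Laplace's law of succession.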
Identification
$P(W_n \mid [Spam = \mathrm{false}]) = \frac{1 + \mathit{Nbre}_f^n}{2 + \mathit{Nbre}_f} \qquad P(W_n \mid [Spam = \mathrm{true}]) = \frac{1 + \mathit{Nbre}_t^n}{2 + \mathit{Nbre}_t}$
Specification = Variables + Decomposition + Parametric Forms
• Variables: the choice of the variables relevant to the problem
• Decomposition: the expression of the joint probability distribution as a product of simpler distributions
• Parametric forms: the choice of a mathematical function for each of these distributions
Description = Specification + Identification
$P(W_n \mid [Spam = \mathrm{false}]) = \frac{1 + \mathit{Nbre}_f^n}{2 + \mathit{Nbre}_f} \qquad P(W_n \mid [Spam = \mathrm{true}]) = \frac{1 + \mathit{Nbre}_t^n}{2 + \mathit{Nbre}_t}$
Questions (1)
$P(Spam \land W_0 \land \ldots \land W_n \land \ldots \land W_{N-1}) = P(Spam) \times \prod_{n=0}^{N-1} P(W_n \mid Spam)$

$P(Spam) = \sum_{W_0 \land \ldots \land W_{N-1}} \left[ P(Spam) \times \prod_{n=0}^{N-1} P(W_n \mid Spam) \right] = P(Spam)$
Questions (2)
$P(W_n) = \sum_{Spam \land W_{i \ne n}} \left[ P(Spam) \times \prod_{n=0}^{N-1} P(W_n \mid Spam) \right] = \sum_{Spam} P(Spam) \times P(W_n \mid Spam)$

$P([W_n = \mathrm{true}]) = 0.25 \times \frac{1 + \mathit{Nbre}_f^n}{2 + \mathit{Nbre}_f} + 0.75 \times \frac{1 + \mathit{Nbre}_t^n}{2 + \mathit{Nbre}_t}$
Questions (3)
$P(W_n \mid [Spam = \mathrm{true}]) = \frac{1 + \mathit{Nbre}_t^n}{2 + \mathit{Nbre}_t}$

$P(Spam \mid [W_n = \mathrm{true}]) = \frac{P(Spam) \times P([W_n = \mathrm{true}] \mid Spam)}{\sum_{Spam} P(Spam) \times P([W_n = \mathrm{true}] \mid Spam)}$
Question (4)
$P(Spam \mid W_0 \land \ldots \land W_{N-1}) = \frac{P(Spam) \times \prod_{n=0}^{N-1} P(W_n \mid Spam)}{\sum_{Spam} \left[ P(Spam) \times \prod_{n=0}^{N-1} P(W_n \mid Spam) \right]}$

$\frac{P([Spam = \mathrm{true}] \mid W_0 \land \ldots \land W_{N-1})}{P([Spam = \mathrm{false}] \mid W_0 \land \ldots \land W_{N-1})} = \frac{P([Spam = \mathrm{true}])}{P([Spam = \mathrm{false}])} \times \prod_{n=0}^{N-1} \frac{P(W_n \mid [Spam = \mathrm{true}])}{P(W_n \mid [Spam = \mathrm{false}])}$
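Question (4) is what a spam filter actually evaluates. The C++ sketch below is a minimal illustration (not the course's ProBT implementation; all counts, names and priors are invented): it computes the posterior through the odds-ratio form above, with the Laplace-smoothed estimates of the identification slides, working in log space to avoid underflow for large $N$:

#include <cmath>
#include <iostream>
#include <vector>

struct WordCounts {
    int inSpam;     // Nbre_t^n: spam messages containing word n
    int inNonSpam;  // Nbre_f^n: non-spam messages containing word n
};

// Posterior P([Spam=true] | w0 ... wN-1) via the odds ratio of question (4)
double posteriorSpam(const std::vector<WordCounts>& counts,
                     const std::vector<bool>& present,   // observed Wn values
                     int nSpam, int nNonSpam,            // Nbre_t, Nbre_f
                     double pSpam)                       // P([Spam = true])
{
    double logOdds = std::log(pSpam) - std::log(1.0 - pSpam);
    for (std::size_t n = 0; n < counts.size(); ++n) {
        // Laplace succession law, as on the identification slide
        double pt = (1.0 + counts[n].inSpam)    / (2.0 + nSpam);
        double pf = (1.0 + counts[n].inNonSpam) / (2.0 + nNonSpam);
        if (!present[n]) { pt = 1.0 - pt; pf = 1.0 - pf; }  // case [Wn = false]
        logOdds += std::log(pt) - std::log(pf);
    }
    double odds = std::exp(logOdds);
    return odds / (1.0 + odds);
}

int main() {
    // Invented counts for three words, 75 spam and 25 non-spam messages
    std::vector<WordCounts> counts  = {{40, 1}, {5, 30}, {25, 2}};
    std::vector<bool>       present = {true, false, true};
    std::cout << posteriorSpam(counts, present, 75, 25, 0.75) << "\n";
}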
Bayesian Program = Description + Question
Program
    Description
        Specification (preliminary knowledge)
            • Variables
            • Decomposition
            • Parametric forms or recursive question
        Identification (experimental data)
    Question
Utilization
Results
SpamSieve: http://c-command.com/spamsieve/
Theoretical Basis
Content:
• Definitions and notations
• Inference rules
• Bayesian program
• Model specification
• Model identification
• Model utilization
Logical Proposition
Logical propositions are denoted by lowercase names: $a$

Usual logical operators: $a \land b$, $a \lor b$, $\lnot a$
Probability of Logical Proposition
We assume that to assign a probability to a given proposition $a$, it is necessary to have at least some preliminary knowledge, summed up by a proposition $\pi$:

$P(a \mid \pi) \in [0, 1]$

Of course, we will be interested in reasoning on the probabilities of the conjunctions, disjunctions and negations of propositions, denoted, respectively, by:

$P(a \land b \mid \pi) \qquad P(a \lor b \mid \pi) \qquad P(\lnot a \mid \pi)$

We will also be interested in the probability of proposition $a$ conditioned by both the preliminary knowledge and some other proposition $b$:

$P(a \mid b \land \pi)$
Normalization and Conjunction Postulates
Normalization postulate: $P(a \mid \pi) + P(\lnot a \mid \pi) = 1$

Conjunction postulate (Bayes rule, Cox theorem):
$P(a \land b \mid \pi) = P(a \mid \pi) \times P(b \mid a \land \pi) = P(b \mid \pi) \times P(a \mid b \land \pi)$

Disjunction rule (the probabilistic counterpart of the resolution principle); why don't we take it as an axiom?
$P(a \lor b \mid \pi) = P(a \mid \pi) + P(b \mid \pi) - P(a \land b \mid \pi)$
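The slide's question answers itself: the disjunction rule need not be an axiom because it follows from the two postulates. One standard derivation (not spelled out on the slide):

$P(a \lor b \mid \pi) = 1 - P(\lnot a \land \lnot b \mid \pi)$   (normalization, with $\lnot(a \lor b) = \lnot a \land \lnot b$)
$= 1 - P(\lnot a \mid \pi) \, P(\lnot b \mid \lnot a \land \pi)$   (conjunction)
$= 1 - P(\lnot a \mid \pi) \bigl(1 - P(b \mid \lnot a \land \pi)\bigr)$   (normalization)
$= P(a \mid \pi) + P(\lnot a \land b \mid \pi)$
$= P(a \mid \pi) + P(b \mid \pi) \bigl(1 - P(a \mid b \land \pi)\bigr)$   (conjunction, applied twice)
$= P(a \mid \pi) + P(b \mid \pi) - P(a \land b \mid \pi)$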
Discrete Variable
• Variables are denoted by names starting with one uppercase letter: $X$
• By definition, a discrete variable $X$ is a set of propositions $x_i$:
  – Mutually exclusive: $i \ne j \Rightarrow [x_i \land x_j] = \mathrm{false}$
  – Exhaustive: at least one $x_i$ is true
The cardinal of $X$ is denoted $\lfloor X \rfloor$
Variable Conjunction
$X \land Y = \{ x_i \land y_j \}$

$X \lor Y = \{ x_i \lor y_j \}$ is not a variable: the propositions $x_i \lor y_j$ are not mutually exclusive.
Conjunction rule
$\forall x_i \in X, \forall y_j \in Y:$
$P(x_i \land y_j \mid \pi) = P(x_i \mid \pi) \times P(y_j \mid x_i \land \pi) = P(y_j \mid \pi) \times P(x_i \mid y_j \land \pi)$

$P(X \land Y \mid \pi) = P(X \mid \pi) \times P(Y \mid X \land \pi) = P(Y \mid \pi) \times P(X \mid Y \land \pi)$
Normalization rule
$\sum_X P(X \mid \pi) = 1$
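Proof sketch: the propositions $x_i$ making up $X$ are exhaustive, so $P(x_1 \lor \ldots \lor x_{\lfloor X \rfloor} \mid \pi) = 1$, and mutually exclusive, so every term $P(x_i \land x_j \mid \pi)$ with $i \ne j$ vanishes; applying the disjunction rule derived above repeatedly therefore gives $\sum_X P(X \mid \pi) = \sum_i P(x_i \mid \pi) = 1$.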
Marginalization rule
$\sum_X P(X \land Y \mid \pi) = P(Y \mid \pi)$
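Proof sketch: by the conjunction rule, $\sum_X P(X \land Y \mid \pi) = \sum_X P(Y \mid \pi) \, P(X \mid Y \land \pi) = P(Y \mid \pi) \sum_X P(X \mid Y \land \pi) = P(Y \mid \pi)$, the last step using the normalization rule.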
Contraction/Expansion rule
$X \land Y = A \;\Rightarrow\; P(X \land Y \mid \pi) = P(A \mid \pi)$
Rules
Conjunction: $P(X \land Y \mid \pi) = P(X \mid \pi) \times P(Y \mid X \land \pi) = P(Y \mid \pi) \times P(X \mid Y \land \pi)$

Normalization: $\sum_X P(X \mid \pi) = 1$

Marginalization: $\sum_X P(X \land Y \mid \pi) = P(Y \mid \pi)$

Contraction/expansion: $X \land Y = A \Rightarrow P(X \land Y \mid \pi) = P(A \mid \pi)$
Description
The purpose of a description is to specify an effective method to compute a joint distribution on a set of variables $\{X_1, X_2, \ldots, X_n\}$, given some preliminary knowledge $\pi$ and a set of experimental data $\delta$. This joint distribution is denoted:

$P(X_1 \land X_2 \land \ldots \land X_n \mid \delta \land \pi)$

Spam example: $\{Spam, W_0, \ldots, W_{N-1}\}$ and $P(Spam \land W_0 \land \ldots \land W_{N-1} \mid \delta \land \pi)$
Decomposition
Partition the variables into $K$ subsets: $L_i = X_{i_1} \land X_{i_2} \land \ldots$

Conjunction rule:
$P(X_1 \land X_2 \land \ldots \land X_n \mid \delta \land \pi) = P(L_1 \mid \delta \land \pi) \times P(L_2 \mid L_1 \land \delta \land \pi) \times \ldots \times P(L_K \mid L_{K-1} \land \ldots \land L_1 \land \delta \land \pi)$

Conditional independence (where $R_i$ is a subset of the variables appearing in $L_1, \ldots, L_{i-1}$):
$P(L_i \mid L_{i-1} \land \ldots \land L_1 \land \delta \land \pi) = P(L_i \mid R_i \land \delta \land \pi)$

Decomposition:
$P(X_1 \land X_2 \land \ldots \land X_n \mid \delta \land \pi) = P(L_1 \mid \delta \land \pi) \times P(L_2 \mid R_2 \land \delta \land \pi) \times \ldots \times P(L_K \mid R_K \land \delta \land \pi)$

Spam example: with $L_1 = Spam$, $L_2 = W_0$, ..., $L_{N+1} = W_{N-1}$, the conjunction rule gives
$P(Spam \land W_0 \land \ldots \land W_{N-1}) = P(Spam) \times P(W_0 \mid Spam) \times P(W_1 \mid W_0 \land Spam) \times \ldots \times P(W_{N-1} \mid W_{N-2} \land \ldots \land W_0 \land Spam)$
and the conditional independence $P(W_n \mid W_{n-1} \land \ldots \land W_0 \land Spam) = P(W_n \mid Spam)$ yields the decomposition
$P(Spam \land W_0 \land \ldots \land W_{N-1}) = P(Spam) \times \prod_{n=0}^{N-1} P(W_n \mid Spam)$
Parametric Forms or Recursive Questions

Parametric form: $P(L_i \mid R_i \land \delta \land \pi) = f_{\mu(R_i, \delta)}(L_i)$

Recursive question: $P(L_i \mid R_i \land \delta \land \pi) = P(L_i \mid R_i \land \delta' \land \pi')$

Spam example: $P(W_n \mid Spam \land \delta \land \pi) \leftarrow$ Laplace succession law
Question
Given a description, a question is obtained by partitioning the set of variables into three subsets: the searched variables, the known variables and the free variables.
We define Search, Known and Free as the conjunctions of the variables belonging to these three sets.
The corresponding question is the distribution:

$P(Search \mid Known \land \delta \land \pi)$
Inference
$P(Search \mid Known \land \delta \land \pi) = \sum_{Free} P(Search \land Free \mid Known \land \delta \land \pi)$

$= \frac{\sum_{Free} P(Search \land Free \land Known \mid \delta \land \pi)}{P(Known \mid \delta \land \pi)}$

$= \frac{\sum_{Free} P(Search \land Free \land Known \mid \delta \land \pi)}{\sum_{Search \land Free} P(Search \land Free \land Known \mid \delta \land \pi)}$

$= \frac{1}{Z} \times \sum_{Free} P(Search \land Free \land Known \mid \delta \land \pi)$
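In the spam program this machinery stays implicit: for the question $P(Spam \mid W_0 \land \ldots \land W_{N-1})$, Search is $Spam$, Known is $W_0 \land \ldots \land W_{N-1}$, and Free is empty, so the sum over Free disappears and $Z$ reduces to the two-term sum over the values of $Spam$, exactly the denominator obtained in question (4).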
Two optimization problems

$\mathrm{Draw}\bigl(P(Search \mid Known \land \delta \land \pi)\bigr)$

$P(Search \mid Known \land \delta \land \pi) = \frac{1}{Z} \times \sum_{Free} P(Search \land Known \land Free \mid \delta \land \pi)$
API and Inference Engine

// ProBT example from the course, lightly cleaned up; assumes the ProBT
// headers are installed and available as <pl.h>.
#include <pl.h>
#include <iostream>
using namespace std;

int main()
{
    // Variables
    plFloat read_time;
    plIntegerType id_type(0, 1);
    plFloat times[5] = {1, 2, 3, 5, 10};
    plSparseType time_type(5, times);
    plSymbol id("id", id_type);
    plSymbol time("time", time_type);

    // Parametric forms
    // Construction of P(id)
    plProbValue id_dist[2] = {0.75, 0.25};
    plProbTable P_id(id, id_dist);

    // Construction of P(time | id = john)
    plProbValue t_john_dist[5] = {20, 30, 10, 5, 2};
    plProbTable P_t_john(time, t_john_dist);

    // Construction of P(time | id = bill)
    plProbValue t_bill_dist[5] = {2, 6, 10, 40, 20};
    plProbTable P_t_bill(time, t_bill_dist);

    // Construction of P(time | id)
    plKernelTable Pt_id(time, id);
    plValues t_and_id(time ^ id);
    t_and_id[id] = 0;
    Pt_id.push(P_t_john, t_and_id);
    t_and_id[id] = 1;
    Pt_id.push(P_t_bill, t_and_id);

    // Decomposition: P(time id) = P(id) P(time | id)
    plJointDistribution jd(time ^ id, P_id * Pt_id);

    // Question: getting the question P(id | time)
    plCndKernel Pid_t;
    jd.ask(Pid_t, id, time);

    // Read a time from the keyboard
    cout << "P(id,time) = " << Pid_t << "\n";
    cout << "Time? : ";
    cin >> read_time;

    // Getting P(id | time = read_time)
    plKernel Pid_readTime;
    // ... (the slide's excerpt ends here)
}
ProBT® – ProBAYES.com – Bayesian-Programming.org