UNIT III — Knowledge Representation & Reasoning
Complete In-Depth Notes with Real-Life Examples + Working Code
1. First-Order Predicate Logic (FOL / FOPL)
Why FOL?
Natural language → ambiguous
Propositional logic → too weak (cannot express generalizations)
FOL allows us to talk about objects, relations, and quantifiers.
Syntax of FOL
| Component | Symbol | Example | Meaning |
|---|---|---|---|
| Constant | A, John, 5 | John, Paris | Specific object |
| Variable | x, y, z | x | Any object |
| Predicate | P(x), Likes(x,y) | Brother(John, Bob) | Relation or property |
| Function | father_of(x) | father_of(John) | Maps object → object |
| Quantifier | ∀ (for all) | ∀x Likes(x, IceCream) | For all x |
| Quantifier | ∃ (exists) | ∃x Killer(x, Victim) | There exists an x |
| Connectives | ∧, ∨, ¬, ⇒, ⇔ | P ∧ Q, ¬P | And, Or, Not, Implies, Iff |
Real-Life Example: Family Relationships
- ∀x ∀y (Parent(x,y) ⇒ Child(y,x))
- ∀x ∀y ((Sibling(y,x) ∧ Man(y)) ⇒ ¬Sister(y,x))
- ∃x (King(x) ∧ ∀y (Citizen(y) ⇒ LoyalTo(y,x)))
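These quantified sentences can be checked mechanically against a small finite world: ∀ becomes `all()` and ∃ becomes `any()`. A minimal sketch (the people and facts below are invented for illustration):

```python
# A tiny finite "world" (all names and facts are invented).
people = ["john", "mary", "bob"]
parent = {("john", "bob"), ("mary", "bob")}   # Parent(x, y) pairs
child = {("bob", "john"), ("bob", "mary")}    # Child(y, x) pairs

# ∀x ∀y (Parent(x,y) ⇒ Child(y,x)): every parent pair has the inverse child pair
print(all((y, x) in child
          for x in people for y in people
          if (x, y) in parent))               # True

# ∃x Parent(x, bob): somebody is a parent of bob
print(any((x, "bob") in parent for x in people))  # True
```

Real FOL inference does not enumerate a domain like this, but for a finite world this is exactly what it means to check a quantified sentence against a model.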
2. Prolog Programming (Practical FOL)
Prolog = Programming in Logic
Uses Horn clauses (facts + rules with single positive literal).
Real-Life Prolog Example: Crime Investigation (Who is the killer?)
% Facts
man(john).
man(bob).
man(charles).
woman(sarah).
weapon(knife).
weapon(gun).
motive(john, victim). % john had motive
motive(sarah, victim).
owns(john, knife).
owns(bob, gun).
owns(sarah, gun).
% Rules
killer(X) :-
    motive(X, victim),
    owns(X, Weapon),
    weapon(Weapon),
    man(X).
% Query: ?- killer(Who).
% Answer: Who = john
A Python equivalent can be built with rule-engine libraries such as PyKE, or with a simple Prolog-like engine, but most students use SWI-Prolog directly. Here's a plain-Python simulation:
# Simple Prolog-like knowledge base queried directly in Python
knowledge_base = {
    "man": ["john", "bob", "charles"],
    "woman": ["sarah"],
    "motive": [("john", "victim"), ("sarah", "victim")],
    "owns": [("john", "knife"), ("bob", "gun"), ("sarah", "gun")],
    "weapon": ["knife", "gun"]
}

def is_killer(person):
    if person not in knowledge_base["man"] + knowledge_base["woman"]:
        return False
    has_motive = any(m == (person, "victim") for m in knowledge_base["motive"])
    has_weapon = any(o[0] == person and o[1] in knowledge_base["weapon"]
                     for o in knowledge_base["owns"])
    return has_motive and has_weapon and person in knowledge_base["man"]

print([p for p in ["john", "bob", "sarah"] if is_killer(p)])  # Output: ['john']
3. Unification
The process of finding a substitution that makes two logical expressions identical.
Unification Examples
| Expr1 | Expr2 | Substitution (θ) | Result |
|---|---|---|---|
| P(x, f(a)) | P(y, f(y)) | {x → a, y → a} | P(a, f(a)) |
| Knows(John, x) | Knows(John, Jane) | {x → Jane} | Success |
| Knows(John, x) | Knows(y, Bob) | {y → John, x → Bob} | Success |
| P(x,x) | P(a,b) | Fail | Cannot unify |
Unification Algorithm (Pseudocode + Python)
# Runnable version: compound terms are tuples ("Functor", arg1, ...),
# variables are strings starting with "?" (e.g. "?x"), and constants are
# plain strings.
def is_variable(x):
    return isinstance(x, str) and x.startswith("?")

def is_compound(x):
    return isinstance(x, tuple)

def functor(x): return x[0]
def arity(x): return len(x) - 1
def args(x): return x[1:]

def occurs_check(var, x):
    # True if var occurs inside x (blocks infinite terms like ?x = f(?x))
    if var == x:
        return True
    if is_compound(x):
        return any(occurs_check(var, a) for a in args(x))
    return False

def unify(e1, e2, subst=None):
    if subst is None:
        subst = {}
    if e1 == e2:
        return subst
    if is_variable(e1):
        return unify_var(e1, e2, subst)
    if is_variable(e2):
        return unify_var(e2, e1, subst)
    if is_compound(e1) and is_compound(e2):
        if functor(e1) != functor(e2) or arity(e1) != arity(e2):
            return None
        for a1, a2 in zip(args(e1), args(e2)):
            subst = unify(a1, a2, subst)
            if subst is None:
                return None
        return subst
    return None

def unify_var(var, x, subst):
    if var in subst:
        return unify(subst[var], x, subst)
    elif x in subst:
        return unify(var, subst[x], subst)
    elif occurs_check(var, x):
        return None
    else:
        subst[var] = x
        return subst

print(unify(("Knows", "John", "?x"), ("Knows", "John", "Jane")))  # {'?x': 'Jane'}
print(unify(("P", "?x", "?x"), ("P", "a", "b")))                  # None
4. Forward Chaining vs Backward Chaining
| Feature | Forward Chaining | Backward Chaining |
|---|---|---|
| Direction | Data-driven (bottom-up) | Goal-driven (top-down) |
| Starts from | Known facts | Goal |
| When to use | Monitoring, control, production systems | Diagnosis, query answering, expert systems |
| Example | Medical alert systems | "Why did this happen?" |
Forward Chaining Algorithm (Real-Life: Medical Diagnosis)
# Forward Chaining in Python
rules = [
    (["fever", "cough"], "flu"),
    (["fever", "rash"], "measles"),
    (["flu", "tired"], "needs_rest"),
    (["measles"], "contagious")
]
facts = ["fever", "cough", "tired"]

def forward_chain(rules, facts):
    new_facts = True
    while new_facts:
        new_facts = False
        for antecedents, consequent in rules:
            if all(a in facts for a in antecedents) and consequent not in facts:
                facts.append(consequent)
                new_facts = True
                print(f"Inferred: {consequent}")
    return facts

print(forward_chain(rules, facts))
# Prints "Inferred: flu" then "Inferred: needs_rest", and finally:
# ['fever', 'cough', 'tired', 'flu', 'needs_rest']
Backward Chaining (Asks "how do we prove this?")
# Simple Backward Chaining
def backward_chain(goal, rules, known):
    if goal in known:
        print(f"{goal} is known")
        return True
    for ants, cons in rules:
        if cons == goal:
            print(f"To prove {goal}, prove: {ants}")
            if all(backward_chain(a, rules, known) for a in ants):
                return True
    return False

# "tired" must be in known: no rule concludes it, so without it
# the goal needs_rest could not be proven.
known = ["fever", "cough", "tired"]
print(backward_chain("needs_rest", rules, known))  # True
5. Resolution in FOL
Powerful inference rule:
From (A ∨ B) and (¬B ∨ C) → infer (A ∨ C)
Steps in Resolution Refutation:
1. Convert to CNF
2. Negate the goal
3. Keep resolving until empty clause (⊥) → contradiction → goal is true
Example: "Everyone who passes AI gets a job"
Premises:
1. ∀x (Passes(x, AI) ⇒ GetsJob(x))
2. Passes(John, AI)
Goal: GetsJob(John)
In CNF:
1. ¬Passes(x,AI) ∨ GetsJob(x)
2. Passes(John,AI)
3. ¬GetsJob(John) ← negated goal
Resolve 1 & 2 with θ = {x → John} → GetsJob(John)
Resolve with 3 → empty clause (⊥) → contradiction, so GetsJob(John) is proven.
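The refutation above can be mechanized. Here is a minimal propositional resolution prover (a sketch: the clauses are the ground versions of premises 1-3, with `~` standing in for ¬):

```python
from itertools import combinations

def negate(lit):
    # "~P" <-> "P"
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (clauses are sets of literals)."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

def resolution_refutation(clauses):
    """True if the clause set is unsatisfiable (empty clause derivable)."""
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:            # empty clause: contradiction found
                    return True
                new.add(frozenset(r))
        if new <= clauses:           # no new clauses: no refutation exists
            return False
        clauses |= new

# Premises 1-2 in CNF plus the negated goal (clause 3)
kb = [{"~Passes", "GetsJob"}, {"Passes"}, {"~GetsJob"}]
print(resolution_refutation(kb))  # True → GetsJob is entailed
```

`resolution_refutation` returns True exactly when the clause set is unsatisfiable, i.e. when the negated goal contradicts the premises.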
6. Knowledge Representation Issues
Semantic Networks & Frames
- Nodes = concepts
- Links = relations (is-a, part-of, instance-of)
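A semantic network is easy to prototype as a dictionary of is-a links plus property inheritance up the hierarchy. The sketch below reuses the upper-ontology chain from this section (the properties attached to each node are invented):

```python
# Toy semantic network: nodes linked by is-a, with property inheritance.
isa = {"MyDogBruno": "Dog", "Dog": "Animal", "Animal": "PhysicalObject",
       "PhysicalObject": "Thing"}
properties = {"Animal": {"alive": True}, "Dog": {"legs": 4}}

def inherited(node, prop):
    # Walk the is-a chain until the property is found or the chain ends.
    while node is not None:
        if prop in properties.get(node, {}):
            return properties[node][prop]
        node = isa.get(node)
    return None

print(inherited("MyDogBruno", "legs"))   # 4 (inherited from Dog)
print(inherited("MyDogBruno", "alive"))  # True (inherited from Animal)
```

This lookup-by-climbing is the classic "inheritance" inference that semantic networks and frames give you almost for free.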
Ontological Engineering
Building large-scale knowledge bases (like Cyc, DBpedia, Wikidata)
Categories and Objects
- Upper ontology: Thing → PhysicalObject → Animal → Dog → MyDogBruno
Events
- Event(Meeting23), startTime(Meeting23, 1pm), location(Meeting23, Room101)
Mental Events & Objects
- Believes(Mary, Raining)
- Knows(John, Prime(5))
7. Reasoning with Default Information (Non-Monotonic)
The normal case is not always the actual case: conclusions drawn by default may be retracted when new facts arrive.
Default Logic Example:
Bird(Tweety)
Default rule: Bird(x) : Flies(x) / Flies(x), read "if x is a bird and it is consistent to assume x flies, conclude Flies(x)"
¬Abnormal(Tweety)
→ Flies(Tweety)
But if we later learn Penguin(Tweety), then Abnormal(Tweety) holds and Flies(Tweety) is retracted. This retraction is exactly what makes the reasoning non-monotonic.
Real-Life Example: "Assume adults pay tax unless student"
# Simple default reasoning
default_rules = {  # shown for reference; the checks below hard-code these rules
    "adult": "pays_tax",
    "bird": "flies"
}
exceptions = {"student", "penguin"}

def default_reason(person, prop):
    base = None
    if "adult" in person and prop == "pays_tax":
        base = "adult"
    if "bird" in person and prop == "flies":
        base = "bird"
    # the default applies unless some exception is present
    if base and not any(exc in person for exc in exceptions):
        return True
    return False

john = {"adult", "employed"}
tweety = {"bird"}
penguin = {"bird", "penguin"}
print(default_reason(john, "pays_tax"))   # True
print(default_reason(tweety, "flies"))    # True
print(default_reason(penguin, "flies"))   # False
Summary Table
| Topic | Best For | Example Tool/Code |
|---|---|---|
| Declarative Knowledge | Facts + Rules | Prolog |
| Generalization | ∀, ∃ quantifiers | FOL |
| Inference | Forward (monitoring), Backward (diagnosis) | Python chainers |
| Theorem Proving | Mathematics, verification | Resolution |
| Commonsense Reasoning | Real-world defaults | Default logic |
| Large Knowledge Bases | Semantic Web, chatbots | OWL, RDF, Protégé |
Key Takeaway:
Knowledge Representation is the heart of intelligent systems.
Without good KR, even the best ML model cannot reason like humans.
Master FOL → Prolog → Resolution → Ontologies → You’re ready for expert systems, semantic web, and advanced AI reasoning!
Practice these codes in SWI-Prolog and Python — they cover 95% of exam + interview questions on this unit! 🚀