Rule Induction
Rule Induction Algorithms
• Hypothesis Space: Sets of rules (any boolean function)
– Many ways to search this large space
– Decision trees -> Rules is one (simultaneous covering)
• Following example: greedy sequential covering algorithm (similar to CN2)
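The greedy sequential covering loop can be sketched as follows: learn one rule at a time, then remove the positive examples it covers and repeat. This is a minimal propositional sketch with hypothetical helper names, not CN2 itself (CN2 additionally uses beam search and a significance test); examples are dicts of boolean features plus a "label" key, and a rule is a set of features that must all be true.

```python
def covers(rule, example):
    # A rule fires when every feature it names is true in the example.
    return all(example[f] for f in rule)

def learn_one_rule(examples, features):
    """Greedily add the literal whose covered set is most pure."""
    rule = set()
    covered = list(examples)
    while any(not e["label"] for e in covered):  # still covers negatives
        def precision(f):
            c = [e for e in covered if e[f]]
            return sum(e["label"] for e in c) / len(c) if c else 0.0
        candidates = [f for f in features if f not in rule]
        if not candidates:
            break
        best = max(candidates, key=precision)
        rule.add(best)
        covered = [e for e in covered if e[best]]
    return rule

def sequential_covering(examples, features):
    """Learn rules one at a time, removing the positives each covers."""
    rules = []
    remaining = list(examples)
    while any(e["label"] for e in remaining):  # positives still uncovered
        rule = learn_one_rule(remaining, features)
        rules.append(rule)
        remaining = [e for e in remaining
                     if not (e["label"] and covers(rule, e))]
    return rules

# Toy data: positive exactly when both a and b are true.
examples = [
    {"a": True,  "b": True,  "label": True},
    {"a": True,  "b": False, "label": False},
    {"a": False, "b": True,  "label": False},
    {"a": False, "b": False, "label": False},
]
print([sorted(r) for r in sequential_covering(examples, ["a", "b"])])
# -> [['a', 'b']]
```

Note the key contrast with decision trees: each rule here is grown independently on the examples that remain, rather than all rules sharing one tree structure (hence "sequential" vs. "simultaneous" covering).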
Some FOL Terminology
• Constants: (Mary, 23, Joe)
• Variables: (e.g., x, can refer to any constant)
• Predicates: (have a truth value; e.g. Female as in Female(Mary))
• Functions: (apply to terms and evaluate to a constant value, e.g. Age(Mary))
• Terms: any constant, variable, or function applied to terms (e.g. Mary, x, Age(x))
• Literals: any predicate applied to terms, e.g. Female(x) or Greater_than(Age(Mary), 20)
Some FOL Terminology (cont.)
• Clause: Disjunction of literals with universally quantified variables, e.g. Greater_than(Age(x), 23) v Female(Mary)
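One way to make the terminology concrete is a small Python encoding: constants as Python values, functions and predicates as callables. This is purely illustrative (Joe's age below is an invented value, not from the slides).

```python
AGES = {"Mary": 23, "Joe": 41}   # Joe's age is a hypothetical value

def Age(term):
    # Function: applies to a term and evaluates to a constant.
    return AGES[term]

def Female(term):
    # Predicate: has a truth value.
    return term in {"Mary"}

def Greater_than(a, b):
    # Predicate over two terms.
    return a > b

# The literal Greater_than(Age(Mary), 20) evaluates to True,
# since Age(Mary) = 23 and 23 > 20.
print(Greater_than(Age("Mary"), 20))  # -> True
```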
Example
• Learning Granddaughter(x,y)
• Training Data:
Target Predicates:
Granddaughter(Victor, Sharon)

Input Predicates:
Father(Sharon, Bob)
Father(Tom, Bob)
Female(Sharon)
Father(Bob, Victor)
All other ground predicates over these constants are false (e.g. Granddaughter(Tom, Bob)), so there are 15 negative examples of Granddaughter(x, y) as well.
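The count of negatives follows directly from the closed-world assumption. A quick sketch, with the training facts encoded as Python sets of ground tuples:

```python
# The training data as ground facts; anything not listed is false.
constants = {"Victor", "Sharon", "Bob", "Tom"}
father = {("Sharon", "Bob"), ("Tom", "Bob"), ("Bob", "Victor")}
female = {"Sharon"}
granddaughter = {("Victor", "Sharon")}

# 4 constants give 4 * 4 = 16 ordered pairs (x, y); one pair is the
# positive example, leaving 15 negatives of Granddaughter(x, y).
pairs = {(x, y) for x in constants for y in constants}
print(len(pairs - granddaughter))  # -> 15
```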
Example: Learning a Rule
• Learning one rule:
Granddaughter(x, y) ←
Classifies all examples as positive (makes 15 mistakes)

Granddaughter(x, y) ← Father(y, z)
Makes fewer mistakes

Granddaughter(x, y) ← Father(y, z) ^ Father(z, x)
Makes only one mistake

Granddaughter(x, y) ← Father(y, z) ^ Father(z, x) ^ Female(y)
Makes zero mistakes – output rule; because the rule set now covers all positive examples, we are done.
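The mistake counts for this sequence of candidate rules can be checked by evaluating each rule body over all 16 ground pairs. A sketch, with rule bodies written as Python functions and an explicit existential quantifier over z:

```python
from itertools import product

constants = ["Victor", "Sharon", "Bob", "Tom"]
father = {("Sharon", "Bob"), ("Tom", "Bob"), ("Bob", "Victor")}
female = {"Sharon"}
positive = {("Victor", "Sharon")}

def mistakes(body):
    """Count errors of the rule Granddaughter(x, y) <- body(x, y)."""
    errs = 0
    for x, y in product(constants, repeat=2):
        predicted = body(x, y)
        actual = (x, y) in positive
        errs += predicted != actual
    return errs

def exists_z(pred):
    # "There exists a constant z such that pred(z) holds."
    return any(pred(z) for z in constants)

rules = [
    lambda x, y: True,                                  # empty body
    lambda x, y: exists_z(lambda z: (y, z) in father),
    lambda x, y: exists_z(lambda z: (y, z) in father
                          and (z, x) in father),
    lambda x, y: exists_z(lambda z: (y, z) in father
                          and (z, x) in father) and y in female,
]
print([mistakes(r) for r in rules])  # -> [15, 11, 1, 0]
```

The empty-bodied rule misclassifies all 15 negatives; each added literal excludes more false positives until the final rule is perfect on the training data.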