24-09-2013, 02:13 PM
Learning from Observations
Observations.ppt (Size: 866.5 KB / Downloads: 17)
Learning
Learning is essential for unknown environments,
i.e., when the designer lacks omniscience
Learning is useful as a system construction method,
i.e., expose the agent to reality rather than trying to write it down
Learning modifies the agent's decision mechanisms to improve performance
Inductive learning
Simplest form: learn a function from examples
f is the target function
An example is a pair (x, f(x))
Problem: find a hypothesis h
such that h ≈ f
given a training set of examples
(This is a highly simplified model of real learning:
Ignores prior knowledge
Assumes examples are given)
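The setup above can be sketched in code: the learner never sees f itself, only a training set of pairs (x, f(x)), and searches a hypothesis space for an h consistent with those pairs. The hypothesis space here (simple threshold functions) is an illustrative assumption, not part of the original notes.

```python
# Inductive learning sketch: find h ~ f from examples (x, f(x)).
# Hypothesis space (an assumption for illustration): threshold
# functions h_t(x) = (x >= t).

def learn_threshold(examples):
    """Return a threshold t such that h(x) = (x >= t) is consistent
    with every example, or None if no such hypothesis exists."""
    xs = [x for x, _ in examples]
    candidates = xs + [min(xs) - 1]  # thresholds worth trying
    for t in candidates:
        if all((x >= t) == y for x, y in examples):
            return t
    return None  # target f is not in this hypothesis space

# Training set generated by a hidden target f(x) = (x >= 5);
# the learner only sees the pairs, never f.
train = [(1, False), (4, False), (5, True), (8, True)]
t = learn_threshold(train)          # -> 5
h = lambda x: x >= t                # the learned hypothesis h ~ f
```

Note how the simplifications listed above show up directly: the learner uses no prior knowledge beyond its fixed hypothesis space, and the examples are simply handed to it.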
Learning decision trees
Problem: decide whether to wait for a table at a restaurant, based on the following attributes:
Alternate: is there an alternative restaurant nearby?
Bar: is there a comfortable bar area to wait in?
Fri/Sat: is today Friday or Saturday?
Hungry: are we hungry?
Patrons: number of people in the restaurant (None, Some, Full)
Price: price range ($, $$, $$$)
Raining: is it raining outside?
Reservation: have we made a reservation?
Type: kind of restaurant (French, Italian, Thai, Burger)
WaitEstimate: estimated waiting time (0-10, 10-30, 30-60, >60)
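One example can be encoded as a dictionary over these attributes, and a decision tree is then just nested tests on attribute values. The tiny tree below is a hedged sketch: the first split on Patrons (None → No, Some → Yes, Full → look deeper) matches the tree typically learned for this problem, but the Full subtree is deliberately simplified.

```python
# One restaurant example: attribute names from the list above,
# attribute values chosen only for illustration.
x = {"Alternate": True, "Bar": False, "Fri/Sat": False, "Hungry": True,
     "Patrons": "Some", "Price": "$", "Raining": False,
     "Reservation": True, "Type": "Thai", "WaitEstimate": "0-10"}

def will_wait(example):
    """Classify one example with a small hand-written decision tree."""
    patrons = example["Patrons"]
    if patrons == "None":
        return False          # empty restaurant: leave
    if patrons == "Some":
        return True           # free tables soon: wait
    # Patrons == "Full": simplified subtree on the wait estimate only.
    return example["WaitEstimate"] in ("0-10", "10-30")
```

Note that the tree ignores most attributes (Price, Type, Raining, ...): a good tree only tests the attributes that actually matter.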
Finding ‘compact’ decision trees
Motivated by Ockham’s razor.
However, finding the smallest consistent decision tree is intractable (the problem is NP-complete).
There are heuristics that find reasonable decision trees in most practical cases.
Computational Learning Theory – why learning works
PAC learning (Probably Approximately Correct)
This has been a breakthrough in the theory of machine learning.
Basic idea: a really bad hypothesis is easy to identify, because with high probability it will err on at least one of the training examples.
A consistent hypothesis will be probably approximately correct.
Notice that with more training examples, the probability that a consistent hypothesis is approximately correct becomes higher!
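This can be made quantitative with the standard PAC bound for a finite hypothesis space H: a hypothesis consistent with m >= (1/eps) * (ln|H| + ln(1/delta)) examples has, with probability at least 1 - delta, error at most eps. A small sketch (the specific |H|, eps, delta values are illustrative):

```python
import math

def pac_sample_bound(h_size, eps, delta):
    """Number of examples sufficient for a consistent hypothesis from a
    finite hypothesis space of size h_size to be probably (prob >= 1-delta)
    approximately (error <= eps) correct."""
    return math.ceil((math.log(h_size) + math.log(1 / delta)) / eps)

# E.g. |H| = 2**10 hypotheses, 10% error tolerance, 95% confidence:
m = pac_sample_bound(h_size=2**10, eps=0.1, delta=0.05)  # -> 100
```

The bound makes the remark above precise in both directions: demanding a smaller eps or delta drives the required number of examples up, and conversely, more examples buy a tighter "approximately correct" guarantee at the same confidence.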