Monday, May 7, 2012

Common errors on the final: One last mail before grades

Since you will be in no mood to think about any more AI once you get your grades, here is one last chance to cram two near-universal errors:

1. Even after you are given the topology of a Bayes network, you can reduce the number of independent parameters by using "parameterized" distributions (such as Noisy-OR) rather than full CPTs. For example, for a boolean node with n boolean parents, a CPT needs 2^n independent parameters, while a Noisy-OR distribution needs only n, one per parent. (Only one or two students got this right.)
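A minimal sketch of the parameter saving, using hypothetical numbers: a Noisy-OR node stores one "suppression" probability q[i] per parent, and every one of the 2^n CPT entries is derived from those n numbers.

```python
from itertools import product

# Hypothetical example: a boolean effect with n = 3 boolean parents.
# Noisy-OR parameter q[i] = P(effect stays false | only parent i is active).
q = [0.6, 0.3, 0.1]
n = len(q)

# Reconstruct the full CPT (2**n rows) from just the n parameters:
# P(effect = true | parents) = 1 - product of q[i] over the active parents.
cpt = {}
for assignment in product([False, True], repeat=n):
    p_false = 1.0
    for qi, active in zip(q, assignment):
        if active:
            p_false *= qi
    cpt[assignment] = 1.0 - p_false

print(len(cpt))                  # 2**n = 8 rows, generated from only 3 numbers
print(cpt[(True, True, False)])  # 1 - 0.6*0.3 = 0.82
```

So the full table is still available when inference needs it; you just never have to elicit or store more than n parameters.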

2. Even if a proposition p occurs at some level of a planning graph with mutexes, it may still not actually be achievable at that level, since standard mutexes only consider interactions between pairs of propositions. If an action has more than two preconditions, it is possible that every pair of them is non-mutex and yet all of them together cannot hold. In that case, if that action gives p, p will be added at the next level even though the action is not actually executable at the previous level.
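The failure mode above can be checked concretely. In this hypothetical sketch, three propositions are pairwise co-achievable (each pair appears together in some reachable state), so binary mutexes flag nothing, yet no single state contains all three:

```python
from itertools import combinations

# Hypothetical illustration: the states actually reachable at some level.
# Each state is a set of propositions that can jointly hold.
reachable_states = [{"p1", "p2"}, {"p2", "p3"}, {"p1", "p3"}]

# Preconditions of an action that requires all three propositions.
preconditions = {"p1", "p2", "p3"}

# Binary mutex check: a pair is non-mutex if some reachable state has both.
pairwise_non_mutex = all(
    any(a in state and b in state for state in reachable_states)
    for a, b in combinations(preconditions, 2)
)

# Joint achievability: does any reachable state contain all preconditions?
jointly_achievable = any(preconditions <= state for state in reachable_states)

print(pairwise_non_mutex)  # True  -> no pair is marked mutex in the graph
print(jointly_achievable)  # False -> yet the action cannot actually fire
```

This is exactly why the planning graph is an optimistic (admissible) approximation: tracking only pairwise mutexes can let an action, and hence its effect p, appear a level earlier than it is truly achievable.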

