CSU520 Artificial Intelligence - Spring 2009 - Final Exam notes

Professor Futrelle - College of Computer and Information Sciences, Northeastern U., Boston, MA

Version of April 16th 2009

Final at 8AM on Tuesday, April 21st in 11 Snell Library


There will be four questions:

Each of the items below refers to just a few pages. But you will need to read more than those pages in order to be sure you fully understand the relevant material.

BE SURE TO CAREFULLY STUDY THE MATERIAL ON DECISION LISTS BELOW.

1. Search

The question will be about a route-finding problem such as the Romania example in Chapter 3. You will need to show how both depth-first and breadth-first search can solve the problem I will give you (Sec. 3.4). Make sure you understand and can use the technical description of search on page 62. (Chapter 4 topics on informed search will not be involved.)
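As a concrete illustration (my own sketch, not part of the exam materials), the two uninformed strategies can be run on a fragment of the Romania road map; the only difference between them is whether the frontier is treated as a FIFO queue or a LIFO stack:

```python
from collections import deque

# A fragment of the Romania road map from Chapter 3 (adjacency only;
# uninformed search ignores the road distances).
GRAPH = {
    "Arad": ["Zerind", "Sibiu", "Timisoara"],
    "Zerind": ["Arad", "Oradea"],
    "Oradea": ["Zerind", "Sibiu"],
    "Sibiu": ["Arad", "Oradea", "Fagaras", "Rimnicu Vilcea"],
    "Timisoara": ["Arad", "Lugoj"],
    "Lugoj": ["Timisoara", "Mehadia"],
    "Mehadia": ["Lugoj"],
    "Fagaras": ["Sibiu", "Bucharest"],
    "Rimnicu Vilcea": ["Sibiu", "Pitesti"],
    "Pitesti": ["Rimnicu Vilcea", "Bucharest"],
    "Bucharest": ["Fagaras", "Pitesti"],
}

def search(start, goal, breadth_first=True):
    """Return a path from start to goal, or None.
    The frontier is a FIFO queue for breadth-first search
    and a LIFO stack for depth-first search."""
    frontier = deque([[start]])
    explored = set()
    while frontier:
        path = frontier.popleft() if breadth_first else frontier.pop()
        node = path[-1]
        if node == goal:
            return path
        if node in explored:
            continue
        explored.add(node)
        for neighbor in GRAPH[node]:
            if neighbor not in explored:
                frontier.append(path + [neighbor])
    return None

print(search("Arad", "Bucharest", breadth_first=True))
print(search("Arad", "Bucharest", breadth_first=False))
```

Breadth-first finds the route with the fewest road segments (Arad, Sibiu, Fagaras, Bucharest), while depth-first commits to the most recently generated branch and can return a longer route.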

2. First-order logic

You will be asked one or both of the following types of questions:

2A. Forward chaining as in Sec. 9.3, pages 280-282.
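For intuition, here is a small Python sketch of the forward-chaining loop on the book's crime example from Sec. 9.3, propositionalized so that ground atoms are just strings (the first-order version adds unification on top of the same control loop):

```python
# Definite clauses as (set-of-premises, conclusion) pairs;
# facts is the initial knowledge base.
rules = [
    ({"American(West)", "Weapon(M1)", "Sells(West, M1, Nono)", "Hostile(Nono)"},
     "Criminal(West)"),
    ({"Missile(M1)"}, "Weapon(M1)"),
    ({"Missile(M1)", "Owns(Nono, M1)"}, "Sells(West, M1, Nono)"),
    ({"Enemy(Nono, America)"}, "Hostile(Nono)"),
]
facts = {"American(West)", "Missile(M1)", "Owns(Nono, M1)", "Enemy(Nono, America)"}

# Keep firing rules whose premises are all known until nothing new appears.
changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print("Criminal(West)" in facts)  # True: derived on the second pass
```

The first pass adds Weapon(M1), Sells(West, M1, Nono), and Hostile(Nono); the second pass can then fire the Criminal rule, mirroring the two levels of the proof tree in the book.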

2B. Conversion to conjunctive normal form (CNF) as in Sec. 9.5, pages 295-297. Since this is a complicated process, I will include some hints to help you along, including Fig. 7.11. Though you will not be asked to do a resolution proof, be ready to explain the strategy behind resolution proofs and how and why CNF is needed - understanding the steps in Fig. 7.13 is all you need to know about resolution proofs, even though that example is for propositional logic.

3. Uncertainty

The material on page 495 shows how to answer a query using the various probabilities in a Bayesian network. That is the type of question you will be asked. You may also find the material in Sec. 13.4 and on pages 496, 504, and 505 helpful.
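As a sketch of how such a query is computed (assuming the book's burglary network and its CPT numbers), the following Python enumerates the full joint distribution to answer P(Burglary | JohnCalls = true, MaryCalls = true):

```python
from itertools import product

# CPTs for the burglary network (numbers as in the book's Ch. 14).
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}   # P(JohnCalls=true | Alarm)
P_M = {True: 0.70, False: 0.01}   # P(MaryCalls=true | Alarm)

def joint(b, e, a, j, m):
    """Full joint probability via the chain rule over the network."""
    p = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
    p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
    p *= P_J[a] if j else 1 - P_J[a]
    p *= P_M[a] if m else 1 - P_M[a]
    return p

# P(Burglary=true | JohnCalls=true, MaryCalls=true) by enumeration:
# sum out Earthquake and Alarm in the numerator, everything hidden
# plus Burglary in the normalizing denominator.
num = sum(joint(True, e, a, True, True) for e, a in product([True, False], repeat=2))
den = sum(joint(b, e, a, True, True) for b, e, a in product([True, False], repeat=3))
print(round(num / den, 3))  # 0.284, the book's answer
```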

4. Learning

This question will focus on decision lists. There is only a small discussion of decision lists in the book at the bottom of page 670 and as shown in Figure 18.13. But decision lists are closely related to decision trees and there's more material on decision trees in the chapter. You will be asked to construct and evaluate a decision list for a simple example. The decision list you will be given will have a level of difficulty between Dataset 1 and 2 below.

Decision lists

Decision lists are powerful alternatives to decision trees. Rather than basing each decision at each step on the value of a single attribute, a different strategy is followed:

1. Each decision can be any term, that is, any conjunction of the attributes, with a result that is true or false. Thus, the first decision in Decision List 1 is written as (x1, x2, 0), which means that if x1 and x2 are both true (their conjunction), the result is false (the zero).


2. When a decision succeeds, as (x1, x2, 0) does for example 4, that example is considered classified, and no further analysis of it is needed.

In the following, the attributes x1 and x2 have the values shown for four examples, where 0 = false, 1 = true. The "Goal" is as in the book's restaurant example, also true or false.

Dataset 1:

Example  x1  x2  Goal  Step #
   1      0   0    0     #3
   2      0   1    0     #3
   3      1   0    1     #2
   4      1   1    0     #1

Here are the two conjunctive terms for a correct decision list, followed by the default, which is 0 in this case:

Decision List 1: (x1, x2, 0) ⇒ (x1, ¬ x2, 1) ⇒ (True, 0)

Explanation: By considering various possibilities, we discover that (x1, x2, 0) will correctly classify example 4. That is Step #1. We then remove example 4 from consideration and look for a term that can correctly classify the remaining three examples. We discover that a term made up of x1 and the negation of x2 (¬ x2) matches example 3 with goal value 1, so we write this term as (x1, ¬ x2, 1). That is Step #2. The two remaining goal values are both false, 0, so we end with a single default decision, with value 0.
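This step-by-step construction can be automated. Below is a rough Python sketch of the greedy procedure (my own illustration, not the book's pseudocode): repeatedly search for a conjunction, shortest first, that matches at least one remaining example and only examples sharing the same goal value; emit it as a term and remove the matched examples. Because more than one term can qualify at a given step, it may produce a different, but equally correct, list than Decision List 1.

```python
from itertools import combinations, product

def find_term(remaining, attrs, max_size):
    """Find the shortest conjunction that matches at least one remaining
    example, where all matched examples share the same goal value."""
    for size in range(1, max_size + 1):          # prefer short terms
        for names in combinations(attrs, size):
            for vals in product([0, 1], repeat=size):
                cond = dict(zip(names, vals))
                goals = [g for e, g in remaining
                         if all(e[a] == v for a, v in cond.items())]
                if goals and len(set(goals)) == 1:
                    return cond, goals[0]
    return None  # no pure term exists within the size bound

def learn_decision_list(examples, attrs, max_size=2):
    """examples: list of (attribute-dict, goal) pairs."""
    remaining, dlist = list(examples), []
    while remaining:
        if len({g for _, g in remaining}) == 1:  # all agree: default term
            dlist.append(({}, remaining[0][1]))
            break
        cond, value = find_term(remaining, attrs, max_size)
        dlist.append((cond, value))
        remaining = [(e, g) for e, g in remaining
                     if not all(e[a] == v for a, v in cond.items())]
    return dlist

DATASET1 = [({"x1": 0, "x2": 0}, 0), ({"x1": 0, "x2": 1}, 0),
            ({"x1": 1, "x2": 0}, 1), ({"x1": 1, "x2": 1}, 0)]
dlist = learn_decision_list(DATASET1, ["x1", "x2"])
for cond, value in dlist:
    print(cond, "->", value)  # one correct list for Dataset 1
```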

A good way to understand Decision List 1 is to assume that it has been created through a learning algorithm and then hand it examples and observe how it classifies them. Assume that we give the system Example 3 from Dataset 1, which has x1=1, and x2=0. The x1, x2 values do not match the first test, since it requires that they both be true. But the x1, x2 values do match the second term, since x1 is true (1) and x2 is false (0). The second term then requires that we return 1 (true) which matches the Goal 1 for Example 3. Examples 1 and 2 "fall through" to the default, since they don't match either the first term or the second. The fall-through or default term is (True, 0) which tells us to assign 0 as the class for both examples 1 and 2, which is correct. Note that once an example matches a term in the list, the result is returned and no further processing is done.
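The walk-through above can be checked mechanically. Here is a short Python sketch (my own encoding, not from the book): each term is written as a condition dictionary plus a return value, with the empty dictionary {} as the always-true default:

```python
# Decision List 1, one (condition, value) pair per term.
DLIST1 = [({"x1": 1, "x2": 1}, 0),   # (x1, x2, 0)
          ({"x1": 1, "x2": 0}, 1),   # (x1, ¬x2, 1)
          ({}, 0)]                   # (True, 0), the default

def classify(dlist, example):
    """Return (step, value) for the first term whose condition matches;
    later terms are never examined, as described above."""
    for step, (cond, value) in enumerate(dlist, start=1):
        if all(example[attr] == v for attr, v in cond.items()):
            return step, value

DATASET1 = [({"x1": 0, "x2": 0}, 0), ({"x1": 0, "x2": 1}, 0),
            ({"x1": 1, "x2": 0}, 1), ({"x1": 1, "x2": 1}, 0)]

for i, (example, goal) in enumerate(DATASET1, start=1):
    step, value = classify(DLIST1, example)
    # Each line matches the Goal and Step # columns of Dataset 1.
    print(f"Example {i}: step #{step} returns {value} (goal is {goal})")
```

Example 3 matches the second term and returns 1; examples 1 and 2 fall through to the default, exactly as in the prose above.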

If you study the following more complex example, you'll see that the same technique works, with the decision list,

Decision List 2: (¬ x2, 1) ⇒ (x2, ¬ x3, 0) ⇒ (¬ x1, 1) ⇒ (True, 0)

and the dataset of eight examples,

Dataset 2:

Example  x1  x2  x3  Goal  Step #
   1      0   0   0    1     #1
   2      0   0   1    1     #1
   3      0   1   0    0     #2
   4      0   1   1    1     #3
   5      1   0   0    1     #1
   6      1   0   1    1     #1
   7      1   1   0    0     #2
   8      1   1   1    0     #4
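The same mechanical check works here. In the Python sketch below (my own encoding), each term is a (condition, value) pair and {} is the always-true default; note that the one example that falls through all three terms, example 8, has goal 0, so the default must return 0:

```python
DLIST2 = [({"x2": 0}, 1),            # (¬x2, 1)
          ({"x2": 1, "x3": 0}, 0),   # (x2, ¬x3, 0)
          ({"x1": 0}, 1),            # (¬x1, 1)
          ({}, 0)]                   # default: example 8's goal is 0

DATASET2 = [((0, 0, 0), 1), ((0, 0, 1), 1), ((0, 1, 0), 0), ((0, 1, 1), 1),
            ((1, 0, 0), 1), ((1, 0, 1), 1), ((1, 1, 0), 0), ((1, 1, 1), 0)]

fired = []
for i, ((x1, x2, x3), goal) in enumerate(DATASET2, start=1):
    example = {"x1": x1, "x2": x2, "x3": x3}
    for step, (cond, value) in enumerate(DLIST2, start=1):
        if all(example[a] == v for a, v in cond.items()):
            break
    fired.append(step)
    assert value == goal              # every example is classified correctly
    print(f"Example {i}: step #{step} returns {value}")

print(fired)  # matches the Step # column: [1, 1, 2, 3, 1, 1, 2, 4]
```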
