CSG120 Artificial Intelligence - Spring 2005 - Exams

Midterm Exam, February 24th

The exam focused on logic. A copy of it is now available online here.

Final Exam, April 21st - Updated April 17th

The Final Exam will be open-book, open-notes. A copy of it is now available online here.

The Final exam is designed in the following way: the questions will be closely based on examples given in the AIMA text, rather than on the exercises at the end of the chapters. This should let you prepare and answer better than if you had to work from the more open-ended exercises, whose answers are not readily available or explained anywhere other than in some of the grading and discussion I've given in class.

Final Exam, possible topics

1. Search: Given a problem similar to the road map problem, Sec. 3.1, or vacuum world Sec. 3.2, you would be asked to describe how one or two of the search strategies would deal with it: breadth-first search or iterative deepening depth-first search (Chap. 3) or greedy best-first search (Sec. 4.1). Drawing figures will help on this problem.
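As a concrete illustration of the first strategy, here is a minimal breadth-first search over a fragment of the Romania road map of Sec. 3.1 (double-check the adjacencies against the figure in the text; step costs are omitted because breadth-first search ignores them):

```python
from collections import deque

# A fragment of the Romania road map from AIMA Sec. 3.1
# (undirected adjacency; step costs omitted, since BFS ignores them).
roads = {
    "Arad": ["Zerind", "Sibiu", "Timisoara"],
    "Zerind": ["Arad", "Oradea"],
    "Oradea": ["Zerind", "Sibiu"],
    "Sibiu": ["Arad", "Oradea", "Fagaras", "Rimnicu Vilcea"],
    "Timisoara": ["Arad", "Lugoj"],
    "Lugoj": ["Timisoara", "Mehadia"],
    "Mehadia": ["Lugoj", "Dobreta"],
    "Fagaras": ["Sibiu", "Bucharest"],
    "Rimnicu Vilcea": ["Sibiu", "Pitesti", "Craiova"],
    "Pitesti": ["Rimnicu Vilcea", "Craiova", "Bucharest"],
    "Craiova": ["Rimnicu Vilcea", "Pitesti", "Dobreta"],
    "Dobreta": ["Mehadia", "Craiova"],
    "Bucharest": ["Fagaras", "Pitesti"],
}

def breadth_first_search(start, goal):
    """Return a path with the fewest road segments from start to goal."""
    frontier = deque([[start]])   # queue of partial paths
    explored = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for city in roads[path[-1]]:
            if city not in explored:
                explored.add(city)
                frontier.append(path + [city])
    return None

print(breadth_first_search("Arad", "Bucharest"))
```

Note that breadth-first search finds the path with the fewest segments (Arad–Sibiu–Fagaras–Bucharest), which is not the cheapest path by road distance; that distinction is exactly the kind of point the exam question may probe.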

2. Constraint Satisfaction: Given a problem similar to the map-coloring of Chap. 5, you would be asked to describe how simple backtracking (Fig. 5.4) could be used to solve it.
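The following sketch applies simple backtracking, in the spirit of Fig. 5.4, to the Australia map-coloring problem from Chap. 5 (no variable- or value-ordering heuristics, just depth-first assignment with undo):

```python
# The Australia map-coloring CSP from AIMA Chap. 5: variables are regions,
# domains are three colors, constraints say neighboring regions must differ.
neighbors = {
    "WA": ["NT", "SA"],
    "NT": ["WA", "SA", "Q"],
    "SA": ["WA", "NT", "Q", "NSW", "V"],
    "Q": ["NT", "SA", "NSW"],
    "NSW": ["Q", "SA", "V"],
    "V": ["SA", "NSW"],
    "T": [],          # Tasmania has no neighbors
}
colors = ["red", "green", "blue"]

def backtrack(assignment):
    """Simple backtracking search in the style of AIMA Fig. 5.4."""
    if len(assignment) == len(neighbors):
        return assignment
    # Pick any unassigned variable (no clever ordering heuristics here).
    var = next(v for v in neighbors if v not in assignment)
    for color in colors:
        # Only try values consistent with already-colored neighbors.
        if all(assignment.get(n) != color for n in neighbors[var]):
            assignment[var] = color
            result = backtrack(assignment)
            if result is not None:
                return result
            del assignment[var]   # undo the assignment and try the next color
    return None

print(backtrack({}))
```

On the exam you would trace the same process by hand: assign, check consistency against neighbors, and back up when no color works.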

3. Propositional Logic and Resolution: Solve a problem similar to Question 4 or 5 of the Midterm. Do not be fooled by the "common sense" interpretation of symbols such as the "Smoke" and "Fire" in Question 4. They are merely symbols - replace them by "A" and "B" if necessary, to emphasize this.
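To make the "symbols are just symbols" point concrete, here is a tiny propositional resolution refutation. Clauses are sets of literals (a leading "-" marks negation), and we prove that a knowledge base containing Smoke and Smoke => Fire entails Fire by refuting KB plus the negated goal; renaming Smoke and Fire to A and B would change nothing:

```python
# A tiny propositional resolution refutation. Clauses are frozensets of
# string literals; "-Smoke" is the negation of "Smoke". The symbol names
# are arbitrary, exactly as the note above stresses.
def negate(lit):
    return lit[1:] if lit.startswith("-") else "-" + lit

def resolve(c1, c2):
    """Yield every resolvent of two clauses."""
    for lit in c1:
        if negate(lit) in c2:
            yield frozenset((c1 - {lit}) | (c2 - {negate(lit)}))

def resolution_refutes(clauses):
    """Return True if the empty clause is derivable, i.e. the set is unsatisfiable."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a is b:
                    continue
                for r in resolve(a, b):
                    if not r:          # derived the empty clause
                        return True
                    new.add(r)
        if new <= clauses:             # no progress: satisfiable
            return False
        clauses |= new

kb = [frozenset({"Smoke"}), frozenset({"-Smoke", "Fire"})]   # Smoke, Smoke => Fire
print(resolution_refutes(kb + [frozenset({"-Fire"})]))       # KB entails Fire
```

Without the negated goal clause the procedure reaches a fixed point and reports the set satisfiable, which is the other outcome you should be able to recognize in a hand trace.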

4. First-Order Logic and Resolution: Solve a problem similar to Question 6 of the Midterm. I did go over this in class. I do not consider it particularly difficult. What you need to do is to pay careful attention to the six steps in the step-by-step directions laid out in Sec. 9.5. When students did this, they did fine. When they didn't, they had problems.
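The full six-step procedure of Sec. 9.5 is best practiced on paper, but its core subroutine, unification, can be sketched compactly. The sketch below is in the spirit of AIMA Fig. 9.1 (occurs check omitted for brevity); lowercase strings are variables and tuples are compound terms, a representation chosen here for illustration:

```python
# A minimal unification routine in the spirit of AIMA Fig. 9.1, the core
# subroutine of first-order resolution. Lowercase strings are variables;
# compound terms are tuples such as ("Knows", "John", "x").
# NOTE: the occurs check is omitted for brevity.
def is_variable(t):
    return isinstance(t, str) and t[0].islower()

def unify(x, y, theta):
    """Return a substitution dict making x and y identical, or None on failure."""
    if theta is None:
        return None
    if x == y:
        return theta
    if is_variable(x):
        return unify_var(x, y, theta)
    if is_variable(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for xi, yi in zip(x, y):
            theta = unify(xi, yi, theta)
        return theta
    return None

def unify_var(var, term, theta):
    if var in theta:                   # follow an existing binding
        return unify(theta[var], term, theta)
    theta = dict(theta)                # copy, then extend the substitution
    theta[var] = term
    return theta

# Unify Knows(John, x) with Knows(y, Mother(y)):
print(unify(("Knows", "John", "x"), ("Knows", "y", ("Mother", "y")), {}))
```

The result binds y to John and x to Mother(y), which is the kind of substitution you will need to produce, and then apply, when resolving first-order clauses by hand.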

5. Bayesian Networks: For Fig. 14.2, you should be able to compute the probability of the three lower nodes, once both the Burglary and Earthquake are given definite values. Be ready to do this for some other example problem of the same type, for which I will give you the diagram and the CPTs.
Also, be ready to discuss how the number of CPTs is changed if the nodes are introduced in another order, as in Fig. 14.3.
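The computation for the three lower nodes can be sketched as follows, using what I believe are the standard CPT numbers of the burglary network in Fig. 14.2 (verify them against your copy of the text before relying on them):

```python
# The three lower nodes of the burglary network (AIMA Fig. 14.2), given
# definite values for Burglary and Earthquake. CPT numbers are the usual
# ones from the figure -- check them against your copy of the text.
P_alarm = {  # P(Alarm=true | Burglary, Earthquake)
    (True, True): 0.95, (True, False): 0.94,
    (False, True): 0.29, (False, False): 0.001,
}
P_john_given_alarm = {True: 0.90, False: 0.05}   # P(JohnCalls=true | Alarm)
P_mary_given_alarm = {True: 0.70, False: 0.01}   # P(MaryCalls=true | Alarm)

def lower_nodes(burglary, earthquake):
    """Return P(Alarm), P(JohnCalls), P(MaryCalls) given B and E."""
    p_a = P_alarm[(burglary, earthquake)]        # read straight off the CPT
    # The callers depend only on Alarm, so marginalize over Alarm:
    p_j = p_a * P_john_given_alarm[True] + (1 - p_a) * P_john_given_alarm[False]
    p_m = p_a * P_mary_given_alarm[True] + (1 - p_a) * P_mary_given_alarm[False]
    return p_a, p_j, p_m

# A burglary with no earthquake:
print(lower_nodes(True, False))
```

With Burglary true and Earthquake false this gives P(Alarm) = 0.94, then P(JohnCalls) = 0.94 × 0.90 + 0.06 × 0.05 = 0.849, and P(MaryCalls) likewise by summing over the two Alarm cases, which is exactly the two-line marginalization you should be able to do by hand.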

6. Utility Functions: Be able to do simple (almost trivial) computations similar to those on pg. 590.
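The arithmetic involved is of this shape: weight each outcome's utility by its probability, sum, and pick the action with the maximum expected utility. The numbers below are made up for illustration, not taken from pg. 590:

```python
# Expected-utility arithmetic: each action leads to outcomes with given
# probabilities and utilities; pick the action with maximum expected
# utility. The numbers here are invented for illustration.
def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

actions = {
    "accept_bet": [(0.5, 100), (0.5, -60)],   # hypothetical gamble
    "decline":    [(1.0, 0)],                 # keep what you have
}
best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best, expected_utility(actions[best]))
```

Here accepting the bet has expected utility 0.5 × 100 + 0.5 × (−60) = 20 versus 0 for declining, so the rational agent accepts; the exam computations should be no harder than this.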

7. Decision Tree Learning: You would be given a table similar to the Restaurant example of Fig. 18.3 and asked some questions about decision trees related to the table. You'll be given rather specific questions, e.g., Does building a tree with nodes in a certain order (of depth) that I will specify result in a more or less efficient tree than building it using nodes in a different order? Does limiting the depth lead to more accurate performance for one tree versus another, for additional test examples I will give you?
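The quantity behind "which node order gives the more efficient tree" is information gain. Here is a small sketch of entropy and gain on a toy table invented for illustration (not the Restaurant data of Fig. 18.3): the attribute with the higher gain is the better first split.

```python
import math

# Entropy and information gain, the quantities behind choosing which
# attribute to split on first in decision-tree learning (AIMA Chap. 18).
# The toy table below is invented for illustration, not the Restaurant
# data of Fig. 18.3.
def entropy(pos, neg):
    """Entropy in bits of a pos/neg class split."""
    total = pos + neg
    h = 0.0
    for k in (pos, neg):
        if k:
            p = k / total
            h -= p * math.log2(p)
    return h

def gain(examples, attr):
    """Information gain of splitting `examples` on `attr`.
    examples: list of (attribute_dict, boolean_label) pairs."""
    pos = sum(1 for _, y in examples if y)
    base = entropy(pos, len(examples) - pos)
    remainder = 0.0
    for value in {ex[attr] for ex, _ in examples}:
        subset = [(ex, y) for ex, y in examples if ex[attr] == value]
        sp = sum(1 for _, y in subset if y)
        remainder += len(subset) / len(examples) * entropy(sp, len(subset) - sp)
    return base - remainder

examples = [
    ({"Hungry": True,  "Raining": False}, True),
    ({"Hungry": True,  "Raining": True},  True),
    ({"Hungry": False, "Raining": False}, False),
    ({"Hungry": False, "Raining": True},  False),
]
# Hungry separates the classes perfectly (gain 1 bit); Raining tells us nothing.
print(gain(examples, "Hungry"), gain(examples, "Raining"))
```

Splitting first on the high-gain attribute yields the shallower tree; splitting first on the zero-gain attribute forces extra levels, which is the kind of comparison the exam question will ask you to make.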

That's it, and I'm sure you'll agree that it's enough to keep you busy.

--------------------------------------------------------

Below is the previous note about the Final. It has been superseded by the note and list above and is kept here for reference only.

The Final Exam will focus on Probabilistic Reasoning (Chap. 14) including Decisions (Chap. 16) and Learning. The Learning topics will be chosen from those that are discussed in lecture, focusing on decision trees and lists, chapter 18. The questions will be quite similar to the ones on Assignment #4 - see the Assignments page. There will be no questions on natural language (chapters 22 or 23) though the topic will be discussed in lectures.