B003725 - Artificial Intelligence - Fall 2016

Learning Objectives

You will learn the basic concepts and the main research areas of AI. Topics covered include search, constraint programming, logic, probabilistic reasoning, and machine learning.

Prerequisites

B003368 (Algorithms and Data Structures)

Suggested readings

Textbooks:

[RN10] Stuart Russell and Peter Norvig. Artificial Intelligence: A Modern Approach. 3rd edition. Pearson, 2010.

[B12] D. Barber. Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012.

Other texts:

[HTF09] T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning. Data Mining, Inference, and Prediction. 2nd edition. Springer, 2009.

[STC00] John Shawe-Taylor and Nello Cristianini. Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge University Press, 2000.

[PM10] David L. Poole, Alan K. Mackworth. Artificial Intelligence: Foundations of Computational Agents. Cambridge University Press, 2010.

[D03] Rina Dechter. Constraint Processing. Morgan Kaufmann, 2003.

[F94] P. Flach. Simply Logical: Intelligent Reasoning by Example. Wiley, 1994.

[GN87] M. Genesereth, N. J. Nilsson. Logical Foundations of Artificial Intelligence. Morgan Kaufmann Publishers, 1987.

Assessment

There is a final project and a final oral exam. The final project will be assigned by the teacher on one of the main topics of the course (Learning, Searching, Constraints, Logic programming). The oral exam covers the rest of the course.

Schedule and reading materials

Date Topics Readings/Handouts
2016-09-27 Administrivia. Introduction to Artificial Intelligence. Intelligence and rationality. Some basic theoretical computer science notions.
  • RN10 1
2016-09-29 Agents, percepts and actions. The agent function. Environment types. Examples. Structure of agents. Reflex agents. Goal-driven agents.
  • RN10 2.1-2.4
2016-10-04 Problem-solving agents. Examples. Search graphs and search trees. Performance measures. Blind search (Depth and breadth first).
Homework: Write general pseudo-code for blind search that works for both BFS and DFS. Try variants with and without a closed list, and with the goal test applied either at expansion or when a node is popped from the open list. Analyze running time and space.
  • RN10 3.1, 3.2, 3.3
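One way to approach the homework (a minimal sketch, not the assigned solution): a single frontier structure, parameterized by which end nodes are popped from, yields both BFS (pop from the left) and DFS (pop from the right). The function and parameter names below are illustrative; this variant applies the goal test at pop and keeps an optional closed list.

```python
from collections import deque

def blind_search(start, goal_test, successors, strategy="bfs", use_closed=True):
    """Generic blind search. BFS pops from the left of the frontier,
    DFS from the right; the goal test is applied when a node is popped."""
    frontier = deque([start])
    closed = set()
    while frontier:
        node = frontier.popleft() if strategy == "bfs" else frontier.pop()
        if goal_test(node):
            return node
        if use_closed:
            if node in closed:
                continue           # already expanded: skip duplicates
            closed.add(node)
        frontier.extend(successors(node))
    return None                    # frontier exhausted: no goal reachable
```

With the closed list, space is O(number of states); without it, DFS uses less memory but may loop on cyclic graphs, which is part of the analysis the homework asks for.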
2016-10-06 More blind search (uniform cost search, depth-limited search, iterative deepening, bidirectional search).
  • RN10 3.4
2016-10-11 Heuristics. Greedy best first search. Admissibility and consistency. A*. Optimality of A*. Performance measures. Local search, hill climbing, simulated annealing.
  • RN10 3.5, 3.6, 4.1.1, 4.1.2, 4.1.3
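The A* algorithm from this lecture orders the open list by f(n) = g(n) + h(n); with an admissible and consistent heuristic, the first goal popped is optimal. A minimal sketch (function names and the successor/heuristic interfaces are illustrative):

```python
import heapq

def astar(start, goal_test, successors, h):
    """A* search. successors(n) yields (child, step_cost) pairs;
    h is an admissible heuristic. Returns (goal, cost) or None."""
    frontier = [(h(start), 0, start)]      # entries are (f, g, node)
    best_g = {start: 0}
    while frontier:
        f, g, node = heapq.heappop(frontier)
        if goal_test(node):
            return node, g                 # first goal popped is optimal
        if g > best_g.get(node, float("inf")):
            continue                       # stale queue entry: skip
        for child, cost in successors(node):
            g2 = g + cost
            if g2 < best_g.get(child, float("inf")):
                best_g[child] = g2
                heapq.heappush(frontier, (g2 + h(child), g2, child))
    return None
```

Setting h to the zero function recovers uniform cost search, which connects this lecture back to the previous one.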
2016-10-13 Python practice on blind and heuristic search.
2016-10-18 Constraint satisfaction problems. Examples. Node and arc consistency. AC-3.
  • RN10 6.1, 6.2.1, 6.2.2
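AC-3 from this lecture can be sketched as follows: revise each arc (X, Y) by deleting values of X with no support in Y, and re-enqueue the arcs pointing into X whenever its domain shrinks. The representation of domains and constraints below is an illustrative choice, not the book's.

```python
from collections import deque

def ac3(domains, constraints):
    """AC-3 arc consistency. domains: var -> set of values.
    constraints: (x, y) -> binary predicate on (value_x, value_y).
    Prunes domains in place; returns False iff a domain is wiped out."""
    queue = deque(constraints)
    while queue:
        x, y = queue.popleft()
        pred = constraints[(x, y)]
        revised = False
        for vx in set(domains[x]):
            if not any(pred(vx, vy) for vy in domains[y]):
                domains[x].discard(vx)     # vx has no support in Y
                revised = True
        if revised:
            if not domains[x]:
                return False               # empty domain: inconsistent
            for (a, b) in constraints:
                if b == x and a != y:      # neighbors of X must be re-checked
                    queue.append((a, b))
    return True
```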
2016-10-20 Backtracking search. Variable and value ordering. Maintaining arc consistency. Forward checking. Examples.
  • RN10 6.2, 6.3.1, 6.3.2
2016-10-25 Min-conflicts. Path consistency and K-consistency. Directed arc consistency. Solving tree problems. Cutset conditioning. Junction trees.
  • RN10 6.2.3, 6.2.4, 6.4, 6.5
2016-10-27 Constraint modeling with MiniZinc and Numberjack.
2016-11-03 Logic-based agents. Knowledge bases. Entailment and logical inference. Model checking.
  • RN10 7.1-7.3
2016-11-08 Propositional logic. Syntax and semantics. Decidability. Satisfiability. Deduction theorem. Propositional theorem proving.
  • RN10 7.4, 7.5.1
2016-11-10 Resolution. Proofs by resolution. Conjunctive normal form. Ground resolution theorem. Definite clauses and Horn clauses.
  • RN10 7.5.2, 7.5.3
2016-11-15 Forward chaining. SAT and the DPLL procedure. Random SAT problems. WalkSAT. First order logic. Syntax and semantics.
  • RN10 7.6, 8.1, 8.2, 8.3
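The DPLL procedure from this lecture can be sketched as recursive simplification plus unit propagation plus branching; pure-literal elimination is omitted here for brevity, and the CNF encoding (sets of signed integers) is an illustrative choice.

```python
def dpll(clauses, assignment=None):
    """DPLL on a CNF formula given as a list of sets of integer literals
    (positive = variable true, negative = variable false).
    Returns a satisfying assignment {var: bool} or None if unsatisfiable."""
    if assignment is None:
        assignment = {}
    simplified = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue                       # clause already satisfied
        rest = {l for l in clause if abs(l) not in assignment}
        if not rest:
            return None                    # empty clause: conflict
        simplified.append(rest)
    if not simplified:
        return assignment                  # all clauses satisfied
    for clause in simplified:              # unit propagation
        if len(clause) == 1:
            l = next(iter(clause))
            return dpll(simplified, {**assignment, abs(l): l > 0})
    var = abs(next(iter(simplified[0])))   # branch on an unassigned variable
    for value in (True, False):
        result = dpll(simplified, {**assignment, var: value})
        if result is not None:
            return result
    return None
```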
2016-11-17 Inference in first-order logic. Universal and existential elimination. Skolemization. Propositionalization. Unification. Theorem proving using generalized modus ponens. Resolution in FOL.
  • RN10 9.1, 9.2, 9.5
2016-11-22 Beliefs, probabilities, and probabilistic reasoning. Conditional independence. Examples.
  • RN10 ch. 13, B12 ch. 1
2016-11-24 Directed graphical models (Bayesian networks). Semantics of directed networks (d-separation). Conditional independence entailment. Examples.
  • RN10 14.2, 14.3, B12 3.1, 3.2, 3.3
2016-11-29 D-separation and conditional independence in Bayesian networks. Inference. Message passing in linear chains. Belief propagation. Junction trees.
  • RN10 14.4.1, 14.4.3, 14.4.4, B12 5.1
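Message passing in a linear chain reduces to repeated sum-product steps: the marginal of each variable is the previous marginal multiplied into the transition table and summed out. A minimal forward-pass sketch for a Markov chain without evidence (the function name and table layout are illustrative):

```python
def chain_marginals(prior, transitions):
    """Exact marginals in a chain x1 -> x2 -> ... -> xn by forward
    message passing. prior: list with P(x1); each transition T is a
    nested list with T[i][j] = P(x_{t+1} = j | x_t = i)."""
    marginals = [list(prior)]
    for T in transitions:
        prev = marginals[-1]
        k = len(T[0])
        # message to the next node: sum_i P(x_t = i) * P(x_{t+1} = j | x_t = i)
        marginals.append([sum(prev[i] * T[i][j] for i in range(len(prev)))
                          for j in range(k)])
    return marginals
```

Conditioning on evidence adds a backward pass with the same sum-product structure, which is the belief propagation scheme discussed in lecture.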
2016-12-01 Algebra of probability tables. Construction of junction trees for directed networks. Propagation in junction trees.
  • RN10 14.4.4
2016-12-06 Maximum likelihood learning for Bernoulli and categorical distributions. Learning Bayesian networks with maximum likelihood (complete data). Introduction to Bayesian learning. Beta distribution. Bayesian conjugacy. Bayesian learning for Bernoulli.
  • RN10 20.1, 20.2.1, 20.2.4; B12 8.6, 9.1, 9.3, 9.4.2
2016-12-13 Dirichlet priors. Bayesian learning of belief nets (complete data). Structure learning. Supervised learning. The Naive Bayes classifier. Bernoulli model. Laplace smoothing. Estimating the prediction accuracy of a classifier. Learning curves.
  • RN10 20.2.4, 20.2.5, 18.1, 18.2, 18.4, 20.2.2
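The Bernoulli Naive Bayes model with Laplace smoothing from this lecture can be sketched as counting plus add-alpha estimates; function and parameter names below are illustrative.

```python
import math
from collections import Counter

def train_bernoulli_nb(X, y, alpha=1.0):
    """Bernoulli Naive Bayes with Laplace (add-alpha) smoothing.
    X: list of binary feature tuples; y: list of class labels."""
    n_features = len(X[0])
    class_counts = Counter(y)
    feat_counts = {c: [0] * n_features for c in class_counts}
    for xi, yi in zip(X, y):
        for j, v in enumerate(xi):
            feat_counts[yi][j] += v
    priors = {c: n / len(y) for c, n in class_counts.items()}
    # P(x_j = 1 | c), smoothed over the two Bernoulli outcomes
    cond = {c: [(feat_counts[c][j] + alpha) / (class_counts[c] + 2 * alpha)
                for j in range(n_features)]
            for c in class_counts}
    return priors, cond

def predict_nb(x, priors, cond):
    """Return the class maximizing log P(c) + sum_j log P(x_j | c)."""
    def score(c):
        s = math.log(priors[c])
        for j, v in enumerate(x):
            p = cond[c][j]
            s += math.log(p if v else 1 - p)
        return s
    return max(priors, key=score)
```

With alpha = 1 this is exactly Laplace smoothing; it also coincides with the Bayesian posterior mean under a Beta(1, 1) prior, tying this lecture to the previous one.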
2016-12-15 Induction of decision trees. Top-down algorithm. Splitting criteria. Why misclassification error does not work. Gini index. Entropy criterion. Pre- and post-pruning. Rule pruning.
  • RN10 18.3
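The three impurity measures compared in this lecture can be computed side by side; the helper below is an illustrative sketch.

```python
import math
from collections import Counter

def impurities(labels):
    """Entropy (in bits), Gini index, and misclassification error
    of a multiset of class labels."""
    counts = Counter(labels)
    n = len(labels)
    probs = [c / n for c in counts.values()]
    entropy = -sum(p * math.log2(p) for p in probs)
    gini = 1.0 - sum(p * p for p in probs)
    misclass = 1.0 - max(probs)
    return entropy, gini, misclass
```

A standard example of why misclassification error fails as a splitting criterion: splitting a (400, 400) parent into (300, 100)/(100, 300) or into (200, 400)/(200, 0) gives the same weighted misclassification error (0.25), but the Gini index prefers the second split, which creates a pure node.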

Note

Full text of linked papers is normally accessible when connecting from a UNIFI IP address. Use the proxy proxy-auth.unifi.it:8888 (with your credentials) if you are connecting from outside the campus network.