B003725 - Artificial Intelligence - Fall 2018

Learning Objectives

You will learn the basic concepts and the main research areas of AI. Broad topics covered include: searching, constraint programming, logic, probabilistic reasoning, and machine learning.


Prerequisites

B003368 (Algorithms and Data Structures)

Suggested readings


[RN10] Stuart Russell and Peter Norvig. Artificial Intelligence: A Modern Approach. 3rd edition. Pearson, 2010.

[B12] D. Barber. Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012.

Other texts:

[HTF09] T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning. Data Mining, Inference, and Prediction. 2nd edition. Springer, 2009.

[STC00] John Shawe-Taylor and Nello Cristianini. Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge University Press, 2000.

[PM10] David L. Poole, Alan K. Mackworth. Artificial Intelligence: Foundations of Computational Agents. Cambridge University Press, 2010.

[D03] Rina Dechter. Constraint Processing. Morgan Kaufmann, 2003.

[F94] P. Flach. Simply Logical: Intelligent Reasoning by Example. Wiley, 1994.

[GN87] M. Genesereth, N. J. Nilsson. Logical Foundations of Artificial Intelligence. Morgan Kaufmann Publishers, 1987.


Exam

There is a final project and a final oral exam.

The final project will be assigned by the teacher on one of the main topics of the course (Learning, Searching, Constraints, Logic programming, Graphical Models); the topic will be decided in discussion with the student. The project typically consists of modeling a simple real problem and applying AI techniques to it, or of implementing and evaluating algorithmic techniques described in the course. The project will be discussed during the oral examination.

The oral exam covers the rest of the course. During the exam, students should demonstrate both theoretical and practical mastery of the methods and algorithmic techniques described in the course.

Office Hours

Tuesday, 10:45-12:45

Please do not email me about office hours; just check the School of Engineering website for (unlikely) changes.

Schedule and reading materials

Date Topics Readings/Handouts
2018-09-24 Administrivia. Introduction to Artificial Intelligence. Intelligence and rationality. Some basic theoretical computer science notions.
  • RN10 1
2018-09-27 Agents, percepts and actions. The agent function. Rational agents. Environment types (discrete, static, deterministic, etc.). Examples. Structure of agents. Reflex agents.
  • RN10 2.1-2.4
2018-10-01 No class today
2018-10-04 No class today
2018-10-08 Searching. Formulation of a search problem. Examples. Search graphs and search trees. Depth and breadth first search algorithms. Completeness and optimality. Time and space analysis.
  • RN10 3.1, 3.2, 3.3
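As an illustration of the graph-search scheme from this lecture, here is a minimal Python sketch of breadth-first search with an explored set; the toy graph and its node labels are invented for the example:

```python
from collections import deque

def breadth_first_search(start, goal, successors):
    """Graph search: an explored set avoids revisiting states,
    and FIFO expansion finds a shallowest goal first."""
    frontier = deque([(start, [start])])
    explored = {start}
    while frontier:
        state, path = frontier.popleft()
        if state == goal:
            return path
        for nxt in successors(state):
            if nxt not in explored:
                explored.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None  # goal unreachable

# Hypothetical toy graph for the example.
graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': ['E'], 'E': []}
print(breadth_first_search('A', 'E', lambda s: graph[s]))
```

Depth-first search is the same sketch with `popleft()` replaced by `pop()` (a LIFO frontier), which trades completeness on infinite graphs for O(depth) memory.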
2018-10-11 Depth-limited search and iterative deepening. Uniform cost search. Optimality.
  • RN10 3.4
2018-10-15 Bidirectional search. Heuristics. Greedy best first search. Admissibility and consistency. A*. Optimality of A*. Performance measures. Designing heuristics. Pattern databases. Local search. Hill climbing. Beam search. Simulated annealing.
  • RN10 3.5, 3.6, 4.1.1, 4.1.2, 4.1.3
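A short Python sketch of A* may help fix the f(n) = g(n) + h(n) idea; the weighted toy graph is made up, and h ≡ 0 (trivially admissible) reduces it to uniform-cost search:

```python
import heapq

def astar(start, goal, successors, h):
    """A* search: expand nodes in order of f = g + h.
    With an admissible h, the first goal popped is optimal."""
    frontier = [(h(start), 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, state, path = heapq.heappop(frontier)
        if g > best_g.get(state, float('inf')):
            continue  # stale queue entry
        if state == goal:
            return g, path
        for nxt, cost in successors(state):
            g2 = g + cost
            if g2 < best_g.get(nxt, float('inf')):
                best_g[nxt] = g2
                heapq.heappush(frontier, (g2 + h(nxt), g2, nxt, path + [nxt]))
    return None

# Hypothetical weighted graph; h = 0 makes this uniform-cost search.
graph = {'A': [('B', 1), ('C', 4)], 'B': [('C', 1), ('D', 5)],
         'C': [('D', 1)], 'D': []}
print(astar('A', 'D', lambda s: graph[s], lambda s: 0))
```

A better-informed (but still admissible) h would pop fewer nodes while returning the same optimal cost.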
2018-10-18 Python practice on blind and heuristic search.
2018-10-22 Introduction to constraint programming. Constraint satisfaction problems. Examples. Inference for CSPs (constraint propagation). Node and arc consistency. AC-3 and its analysis.
  • RN10 6.1, 6.2.
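The arc-consistency algorithm from this lecture can be sketched in a few lines of Python; the X < Y example problem below is invented, and constraints are given as one predicate per directed arc:

```python
from collections import deque

def ac3(domains, constraints):
    """AC-3: domains maps var -> set of values; constraints maps a
    directed arc (x, y) -> binary predicate. Prunes unsupported values
    in place; returns False iff some domain becomes empty."""
    def revise(x, y):
        pred = constraints[(x, y)]
        removed = {a for a in domains[x]
                   if not any(pred(a, b) for b in domains[y])}
        domains[x] -= removed
        return bool(removed)

    queue = deque(constraints)
    while queue:
        x, y = queue.popleft()
        if revise(x, y):
            if not domains[x]:
                return False
            # re-check every arc pointing at the pruned variable
            for (u, v) in constraints:
                if v == x and u != y:
                    queue.append((u, v))
    return True

# Toy CSP: X < Y with both domains {1, 2, 3}.
domains = {'X': {1, 2, 3}, 'Y': {1, 2, 3}}
constraints = {('X', 'Y'): lambda a, b: a < b,
               ('Y', 'X'): lambda a, b: b < a}
ac3(domains, constraints)
print(domains)
```

Here AC-3 prunes X = 3 and Y = 1, since neither has a supporting value on the other side of the constraint.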
2018-10-25 Limitations of arc-consistency. Path and k-consistency. Backtracking search. Variable and value ordering. Maintaining arc consistency. Forward checking.
  • RN10 6.2, 6.3
2018-10-29 Directed arc consistency. Solving tree problems. Cutset conditioning. Dual problems and their networks. Junction trees.
  • RN10 6.5
2018-11-01 No class today
2018-11-05 Constraint modeling with Minizinc and Numberjack.
2018-11-05 Logic-based agents. Knowledge bases. Formulae, syntax, semantics. Entailment and logical inference.
  • RN10 7.1-7.3
2018-11-08 Propositional logic. Syntax and semantics. Decidability. Satisfiability. Deduction theorem. Propositional theorem proving. Clauses.
  • RN10 7.4, 7.5.1
2018-11-12 Resolution. Proofs by resolution. Conjunctive normal form. Ground resolution theorem. Definite clauses and Horn clauses. Forward and backward chaining.
  • RN10 7.5.2, 7.5.3
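Forward chaining over a definite-clause knowledge base is simple enough to sketch directly; the propositional symbols in the example KB are arbitrary:

```python
def forward_chaining(kb, query):
    """Forward chaining for definite clauses.
    kb: list of (premises, conclusion) pairs; facts have empty premises.
    Repeatedly fires any rule whose premises are all inferred."""
    inferred = {c for prem, c in kb if not prem}
    changed = True
    while changed:
        changed = False
        for prem, c in kb:
            if c not in inferred and all(p in inferred for p in prem):
                inferred.add(c)
                changed = True
    return query in inferred

# Hypothetical KB: facts A, B and rules A∧B→L, A∧L→M, L∧M→P, P→Q.
kb = [([], 'A'), ([], 'B'),
      (['A', 'B'], 'L'), (['A', 'L'], 'M'),
      (['L', 'M'], 'P'), (['P'], 'Q')]
print(forward_chaining(kb, 'Q'))
```

Backward chaining would instead start from the query and recurse on the premises of rules concluding it, exploring only goal-relevant clauses.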
2018-11-13 SAT and the DPLL procedure. Random SAT problems. WalkSAT. Syntax and semantics of first order logic.
  • RN10 7.5.4, 7.6, 8.1, 8.2
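A compact recursive sketch of the DPLL procedure, using unit propagation and branching (pure-literal elimination is omitted for brevity); clauses are frozensets of signed integer literals, a common SAT convention:

```python
def simplify(clauses, lit):
    """Assign lit = True: drop satisfied clauses, delete -lit elsewhere.
    Returns None on an empty clause (conflict)."""
    out = []
    for c in clauses:
        if lit in c:
            continue
        if -lit in c:
            c = c - {-lit}
            if not c:
                return None
        out.append(c)
    return out

def dpll(clauses):
    """Satisfiability test for a CNF formula given as a list of
    frozensets of literals (positive/negative ints)."""
    if not clauses:
        return True
    # unit propagation: a unit clause forces its literal
    unit = next((next(iter(c)) for c in clauses if len(c) == 1), None)
    if unit is not None:
        reduced = simplify(clauses, unit)
        return reduced is not None and dpll(reduced)
    # branch on an arbitrary literal, trying both truth values
    lit = next(iter(clauses[0]))
    for choice in (lit, -lit):
        reduced = simplify(clauses, choice)
        if reduced is not None and dpll(reduced):
            return True
    return False

# (1∨2) ∧ (¬1∨2) ∧ (¬2∨3) ∧ (¬3) is unsatisfiable.
print(dpll([frozenset({1, 2}), frozenset({-1, 2}),
            frozenset({-2, 3}), frozenset({-3})]))
```

WalkSAT differs in spirit: it performs incomplete local search over full assignments, flipping variables in unsatisfied clauses, so it can report "satisfiable" but never prove unsatisfiability.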
2018-11-19 Examples of sentences/theories in FOL. Inference in FOL. Universal and existential elimination. Skolemization. Propositionalization. Unification. Resolution in FOL.
  • RN10 8.3, 9.1, 9.2, 9.5
2018-11-22 Beliefs, probabilities, and probabilistic reasoning. Examples.
  • RN10 13.1-13.3, B12 ch. 1
2018-11-26 Probabilistic reasoning. Conditional independence. Examples. Directed graphical models (Bayesian networks). Semantics of directed networks. Examples.
  • RN10 13.4-13.6, 14.1, 14.2, B12 3.1, 3.2
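The factorized-joint idea behind Bayesian networks fits in a few lines; the two-node Rain → WetGrass network and its CPT numbers below are made up for illustration:

```python
# Hypothetical network Rain -> WetGrass with invented CPTs.
p_rain = {True: 0.2, False: 0.8}
p_wet_given_rain = {True:  {True: 0.9, False: 0.1},
                    False: {True: 0.2, False: 0.8}}

def joint(rain, wet):
    """The network's factorization of the joint: P(r, w) = P(r) P(w | r)."""
    return p_rain[rain] * p_wet_given_rain[rain][wet]

# Inference by enumeration: P(Rain = true | WetGrass = true)
num = joint(True, True)
den = sum(joint(r, True) for r in (True, False))
posterior = num / den
print(posterior)
```

Observing wet grass raises P(Rain) from the 0.2 prior to about 0.53; larger networks factor the joint the same way, one CPT per node given its parents.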
2018-11-29 Conditional independence entailment. D-separation and conditional independence in directed networks.
  • RN10 14.2, 14.3, B12 3.3
2018-12-03 Brief introduction to Hugin
2018-12-03 Inference. Junction trees for probabilistic inference. Algebra of probability tables. Absorption. Local and global consistency. Propagation in junction trees. Construction of junction trees for directed graphical models.
  • RN10 14.4.4
2018-12-06 Introduction to machine learning. Supervised learning. Hypothesis spaces. Training error (empirical risk), true error (risk), estimating the true error using a test set or using cross-validation. The Naive Bayes classifier.
  • RN10 18.1, 18.2, 20.2.1, 20.2.2
2018-12-10 Laplace smoothing for Naive Bayes. Multinomial model for text categorization. Decision trees (definition and their hypothesis space)
  • B12 ch. 10, RN10 18.3.1, 18.3.2.
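The multinomial Naive Bayes model with Laplace smoothing can be sketched directly; the tiny spam/ham corpus below is invented for the example:

```python
import math
from collections import Counter

def train_nb(docs, alpha=1.0):
    """Multinomial Naive Bayes with add-alpha (Laplace) smoothing.
    docs: list of (word_list, label) pairs. Returns a predict function."""
    labels = Counter(y for _, y in docs)
    vocab = {w for ws, _ in docs for w in ws}
    counts = {y: Counter() for y in labels}
    for ws, y in docs:
        counts[y].update(ws)

    def log_posterior(ws, y):
        total = sum(counts[y].values())
        lp = math.log(labels[y] / len(docs))  # log prior
        for w in ws:
            # smoothing keeps unseen words from zeroing the product
            lp += math.log((counts[y][w] + alpha) /
                           (total + alpha * len(vocab)))
        return lp

    return lambda ws: max(labels, key=lambda y: log_posterior(ws, y))

# Hypothetical toy corpus for text categorization.
docs = [("win money now".split(), "spam"),
        ("meeting at noon".split(), "ham"),
        ("win big prize".split(), "spam"),
        ("lunch at noon".split(), "ham")]
predict = train_nb(docs)
print(predict("win prize".split()))
```

Working in log space avoids underflow; without smoothing (alpha = 0), a single unseen word would drive a class's likelihood to zero.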
2018-12-17 Top down induction of decision trees. Splitting criteria (Gini, entropy, why classification error does not work as measure of impurity). Handling continuous attributes. Combatting overfitting with pre-, post-, and rule-pruning. Perceptron. Block-Novikoff theorem.
  • RN10 18.3.3, 18.3.4, 18.3.5, 18.3.6, 18.4.1, 18.6.3
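The perceptron update rule from the last lecture is a few lines of Python; the linearly separable toy data set is invented, and on separable data the Block-Novikoff theorem guarantees the mistake loop terminates:

```python
def perceptron(samples, epochs=100):
    """Perceptron: on each mistake, move (w, b) toward the
    misclassified example. samples: list of (features, label in {-1,+1})."""
    d = len(samples[0][0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        mistakes = 0
        for x, y in samples:
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
                mistakes += 1
        if mistakes == 0:      # all points correctly classified
            return w, b
    return w, b

# Hypothetical separable data: label = sign(x1 - x2).
data = [((2.0, 0.0), 1), ((0.0, 2.0), -1),
        ((3.0, 1.0), 1), ((1.0, 3.0), -1)]
w, b = perceptron(data)
print(w, b)
```

The Block-Novikoff bound says the number of updates is at most (R/gamma)^2, where R bounds the example norms and gamma is the margin of some separating hyperplane; it says nothing about non-separable data, where the loop simply runs out of epochs.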


Full text of linked papers is normally accessible when connecting from a UNIFI IP address. Use the proxy proxy-auth.unifi.it:8888 (with your credentials) if you are connecting from outside the campus network.