B003725 - Artificial Intelligence - Fall 2020

Bachelor degree in Computer Engineering, University of Florence

Contacts

Paolo Frasconi, DINFO, via di S. Marta 3, 50139 Firenze

email: (please do not use my @unifi.it address: it was forcibly moved to Gmail by the central administration and has all sorts of problems; messages may be answered with a delay or not answered at all).

Office Hours

Tuesday, 10:45-12:45

Until further notice, office hours will be held on Skype. Please send me your Skype ID the day before and I will reply with a tentative meeting time.

Learning Objectives

This course introduces some basic concepts in AI and covers five main topics:

  1. Searching
  2. Constraint programming
  3. Probabilistic reasoning and graphical models
  4. Learning in graphical models
  5. Supervised learning

Prerequisites

B003368 (Algorithms and Data Structures)

Suggested readings

Textbooks

[RN10] Stuart Russell and Peter Norvig. Artificial Intelligence: A Modern Approach. 3rd edition. Pearson, 2010.
[D17] Hal Daume III. A Course in Machine Learning. 2017 (free PDF).
[B12] D. Barber. Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012 (free PDF).

Other texts

[PM17] David L. Poole, Alan K. Mackworth. Artificial Intelligence: Foundations of Computational Agents. 2nd edition. Cambridge University Press, 2017 (free online).
[HTF09] T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning. Data Mining, Inference, and Prediction. 2nd edition. Springer, 2009 (free PDF).
[D03] Rina Dechter. Constraint Processing. Morgan Kaufmann, 2003.
[J97] F. Jensen. Introduction to Bayesian Networks. 1997.

Assessment

There is a final project and a final oral exam.

The final project is assigned by the teacher on one of the main topics of the course (Searching, Constraints, Logic programming, Graphical Models, Learning). The topic is agreed upon with the student during office hours. The project typically consists of modeling or applying AI techniques to simple real-world problems, or of implementing and experimentally verifying algorithmic techniques described in the course. Students should be ready to answer questions about their project during the oral exam.

The oral exam covers the rest of the course. During the exam, students should demonstrate both theoretical and practical mastery of the methods and algorithmic techniques described in the course.

Schedule and reading materials

For videos please go to the Moodle-WebEx connector (UniFI credentials required).

Date Topics Readings/Handouts
2020-09-21 Administrivia. Introduction to Artificial Intelligence. Intelligence and rationality. Basic notions of theoretical computer science: decidability and NP-completeness.
  • RN10 1
2020-09-24 Agents, percepts and actions. The agent function. Rational agents. Environment types (discrete, static, deterministic, etc.). Examples. Structure of agents. Reflex agents. Using a model. Goal based agents. Examples of search problems.
  • RN10 2.1-2.4, 3.1
2020-09-28 Formulation of a search problem. Search graphs and search trees. General structure of the search procedure. Main algorithms: DFS, BFS, Uniform Cost (completeness, optimality, time and space analysis). Introduction to Iterative Deepening.
  • RN10 3.2, 3.3, 3.4.1-3.4.4
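
As a rough illustration of the blind-search template (frontier plus explored set), here is a minimal breadth-first search sketch; the graph, function names, and toy example are made up and are not the lab code distributed on Moodle:

    from collections import deque

    def bfs(start, goal, successors):
        """Breadth-first search; returns a shortest path (fewest edges) as a list of states, or None."""
        frontier = deque([start])
        parent = {start: None}                 # doubles as the explored set
        while frontier:
            state = frontier.popleft()
            if state == goal:
                path = []
                while state is not None:       # walk parent pointers back to the start
                    path.append(state)
                    state = parent[state]
                return path[::-1]
            for child in successors(state):
                if child not in parent:
                    parent[child] = state
                    frontier.append(child)
        return None

    # toy undirected graph as an adjacency dict
    graph = {'A': ['B', 'C'], 'B': ['A', 'D'], 'C': ['A', 'D'], 'D': ['B', 'C', 'E'], 'E': ['D']}
    print(bfs('A', 'E', lambda s: graph[s]))   # ['A', 'B', 'D', 'E']
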
2020-10-01 More blind search: details on Iterative Deepening, Bidirectional Search. Comparisons. Heuristics. Greedy best first search. Admissibility and consistency. A*. Optimality of A*.
  • RN10 3.4.5-3.4.7, 3.5
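
A compact A* sketch, purely illustrative: the heuristic h is passed in as a function and is assumed admissible, and all names are invented rather than taken from the course code.

    import heapq, itertools

    def astar(start, goal, successors, h):
        """A* graph search. successors(s) yields (child, step_cost) pairs; h is the heuristic."""
        tie = itertools.count()                  # tie-breaker so states are never compared directly
        frontier = [(h(start), next(tie), start)]
        best_g = {start: 0}
        came_from = {start: None}
        while frontier:
            _, _, state = heapq.heappop(frontier)
            if state == goal:
                path = [state]
                while came_from[path[-1]] is not None:
                    path.append(came_from[path[-1]])
                return list(reversed(path)), best_g[goal]
            for child, cost in successors(state):
                g = best_g[state] + cost
                if g < best_g.get(child, float('inf')):   # found a cheaper path to child
                    best_g[child] = g
                    came_from[child] = state
                    heapq.heappush(frontier, (g + h(child), next(tie), child))
        return None, float('inf')
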
2020-10-05 Designing heuristics. Dominance. Linear conflicts. Pattern databases. Performance measures, effective branching factor. Local search. Hill climbing. Beam search.
  • RN10 3.6, 4.1.1, 4.1.3
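
A very small steepest-ascent hill-climbing sketch; the objective and neighborhood below are made up for illustration:

    def hill_climbing(initial, neighbors, value):
        """Steepest-ascent hill climbing: move to the best neighbor until no improvement."""
        current = initial
        while True:
            candidates = neighbors(current)
            best = max(candidates, key=value) if candidates else None
            if best is None or value(best) <= value(current):
                return current            # stop at a local maximum (or on a plateau)
            current = best

    # toy run: maximize -(x - 3)**2 over the integers, moving by +/- 1
    print(hill_climbing(0, lambda x: [x - 1, x + 1], lambda x: -(x - 3) ** 2))   # prints 3
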
2020-10-08 Python practice on blind and heuristic search (code is on Moodle).
2020-10-08 More on local search. Simulated annealing.
  • RN10 4.1.2
2020-10-12 No class today
2020-10-15 Constraint satisfaction problems. Examples. Inference for CSPs (constraint propagation). Node and arc consistency. AC-3 and its analysis. Limitations of arc-consistency. Path and k-consistency. Introduction to backtracking search.
  • RN10 6.1, 6.2, 6.3
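
A sketch of AC-3 under the usual binary-CSP assumptions (domains stored as sets, constraints given as a boolean function); the names and the toy constraint are illustrative:

    from collections import deque

    def ac3(variables, domains, neighbors, constraint):
        """AC-3 arc consistency. domains[x] is a set; constraint(x, vx, y, vy) says whether
        assigning x=vx and y=vy is allowed. Returns False if some domain is wiped out."""
        queue = deque((x, y) for x in variables for y in neighbors[x])
        while queue:
            x, y = queue.popleft()
            if revise(domains, x, y, constraint):
                if not domains[x]:
                    return False                       # inconsistency detected
                for z in neighbors[x]:
                    if z != y:
                        queue.append((z, x))           # re-check arcs pointing at x
        return True

    def revise(domains, x, y, constraint):
        """Remove values of x that have no support in the domain of y."""
        removed = False
        for vx in set(domains[x]):
            if not any(constraint(x, vx, y, vy) for vy in domains[y]):
                domains[x].discard(vx)
                removed = True
        return removed

    # toy CSP: X < Y with domains {1, 2, 3}
    doms = {'X': {1, 2, 3}, 'Y': {1, 2, 3}}
    nbrs = {'X': ['Y'], 'Y': ['X']}
    ac3(['X', 'Y'], doms, nbrs, lambda a, va, b, vb: va < vb if (a, b) == ('X', 'Y') else vb < va)
    print(doms)   # X is reduced to {1, 2}, Y to {2, 3}
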
2020-10-19 Backtracking search. Variable and value ordering. Maintaining arc consistency. Forward checking. Solving tree problems. Cutset conditioning. Dual problems and their networks. Junction trees.
  • RN10 6.3, 6.5
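
A bare-bones backtracking sketch, without the forward checking and ordering heuristics discussed in class; the map-colouring instance is invented:

    def backtracking_search(variables, domains, consistent, assignment=None):
        """Plain chronological backtracking (no forward checking or variable ordering).
        consistent(var, value, assignment) checks var=value against the partial assignment."""
        if assignment is None:
            assignment = {}
        if len(assignment) == len(variables):
            return dict(assignment)
        var = next(v for v in variables if v not in assignment)
        for value in domains[var]:
            if consistent(var, value, assignment):
                assignment[var] = value
                result = backtracking_search(variables, domains, consistent, assignment)
                if result is not None:
                    return result
                del assignment[var]        # undo and try the next value
        return None

    # toy map colouring: a triangle A-B-C plus a pendant node D
    nbrs = {'A': {'B', 'C'}, 'B': {'A', 'C'}, 'C': {'A', 'B', 'D'}, 'D': {'C'}}
    doms = {v: ['red', 'green', 'blue'] for v in nbrs}
    ok = lambda var, val, asg: all(asg.get(n) != val for n in nbrs[var])
    print(backtracking_search(list(nbrs), doms, ok))   # e.g. {'A': 'red', 'B': 'green', 'C': 'blue', 'D': 'red'}
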
2020-10-22 Triangulation of graphs. Min-conflicts. Introduction to logical agents.
  • RN10 6.4, 7.1
2020-10-22 Constraint modeling with Minizinc.
2020-10-26 Knowledge bases. Formulae, syntax, semantics. Entailment and logical inference. Propositional theorem proving. Inference rules. Resolution.
  • RN10 7.1-7.5.2
2020-10-29 Proofs by resolution. Ground resolution theorem. Definite clauses. Forward and backward chaining. SAT and the DPLL procedure. MaxWalkSAT. Random SAT problems.
  • RN10 7.5, 7.6
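
A minimal DPLL sketch with unit propagation only (no pure-literal rule or clever branching); the clause encoding and the example formula are made up for illustration:

    def dpll(clauses, assignment=None):
        """DPLL on CNF clauses; each clause is a set of signed integers, e.g. {1, -2} means x1 OR NOT x2.
        Returns a satisfying assignment {var: bool} or None."""
        if assignment is None:
            assignment = {}
        clauses = simplify(clauses, assignment)
        if any(len(c) == 0 for c in clauses):
            return None                                  # empty clause: conflict
        if not clauses:
            return assignment                            # all clauses satisfied
        for c in clauses:                                # unit propagation
            if len(c) == 1:
                lit = next(iter(c))
                return dpll(clauses, {**assignment, abs(lit): lit > 0})
        lit = next(iter(clauses[0]))                     # branch on some unassigned variable
        for value in (True, False):
            result = dpll(clauses, {**assignment, abs(lit): value})
            if result is not None:
                return result
        return None

    def simplify(clauses, assignment):
        """Drop satisfied clauses and remove falsified literals from the remaining ones."""
        out = []
        for c in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in c):
                continue
            out.append({l for l in c if abs(l) not in assignment})
        return out

    # (x1 OR x2) AND (NOT x1 OR x3) AND (NOT x2 OR NOT x3)
    print(dpll([{1, 2}, {-1, 3}, {-2, -3}]))   # one satisfying assignment, e.g. {1: True, 3: True, 2: False}
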
2020-11-02 Beliefs, probabilities, and probabilistic reasoning. Updating beliefs from evidence using Bayes' rule. Conditional independence. Examples.
  • RN10 13.1-13.6, B12 ch. 1
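
A tiny numerical illustration of updating a belief from evidence with Bayes' rule; all probabilities below are invented for the example:

    # made-up diagnostic example:
    # P(disease) = 0.01, P(positive | disease) = 0.9, P(positive | no disease) = 0.05
    prior = 0.01
    lik_pos_given_d = 0.9
    lik_pos_given_not_d = 0.05

    evidence = lik_pos_given_d * prior + lik_pos_given_not_d * (1 - prior)   # P(positive)
    posterior = lik_pos_given_d * prior / evidence                           # P(disease | positive)
    print(round(posterior, 3))   # about 0.154: the belief rises from 1% to roughly 15%
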
2020-11-05 Directed graphical models (Bayesian networks). Semantics. Examples. Conditional independence entailment. D-separation and conditional independence in directed networks.
  • RN10 14.1, 14.2, B12 3.3
2020-11-09 Limitations of D-separation. Markov networks. Inference in directed networks. Junction trees for probabilistic inference.
  • RN10 14.4.4; B12 ch. 6
2020-11-12 Frequentist and Bayesian parameter learning. Priors and posteriors. Beta distribution.
  • RN10 20.2.4, 20.2.5
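
A small numerical illustration of Bayesian parameter learning with a Beta prior; the prior and the data are made up:

    # Beta-Bernoulli updating: with a Beta(a, b) prior on the probability of heads,
    # observing h heads and t tails gives a Beta(a + h, b + t) posterior.
    a, b = 2, 2                     # mildly informative prior (pseudo-counts)
    h, t = 7, 3                     # invented data: 7 heads, 3 tails

    a_post, b_post = a + h, b + t
    posterior_mean = a_post / (a_post + b_post)
    mle = h / (h + t)
    print(posterior_mean, mle)      # 0.642857... vs 0.7: the prior pulls the estimate toward 0.5
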
2020-11-16 No class today
2020-11-19 More Bayesian conjugates: Dirichlet distribution. Hyperparameters and pseudo-counts. Parameter learning for directed models in the complete data case. Structure learning. Naive Bayes classifier.
  • RN10 20.2; B12 9.4, 9.5
2020-11-23 Laplace smoothing for Naive Bayes. Multinomial Naive Bayes. Some performance measures for binary classification: Accuracy, Precision, Recall, Breakeven point. General formulation of the supervised learning problem. Naive Bayes as a linear classifier.
  • RN10 20.2.2, 18.1, 18.2; D17 5.5, ch. 9 (in particular 9.3, 9.4); B12 10.1, 10.2
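
A compact multinomial Naive Bayes sketch with add-one (Laplace) smoothing; the tiny spam/ham corpus is invented:

    import math
    from collections import Counter, defaultdict

    def train_multinomial_nb(docs, labels, alpha=1.0):
        """Multinomial Naive Bayes with Laplace (add-alpha) smoothing.
        docs: list of token lists; labels: list of class labels."""
        vocab = {w for d in docs for w in d}
        class_docs = defaultdict(list)
        for d, y in zip(docs, labels):
            class_docs[y].append(d)
        log_prior, log_lik = {}, {}
        for y, ds in class_docs.items():
            log_prior[y] = math.log(len(ds) / len(docs))
            counts = Counter(w for d in ds for w in d)
            total = sum(counts.values())
            log_lik[y] = {w: math.log((counts[w] + alpha) / (total + alpha * len(vocab)))
                          for w in vocab}
        return log_prior, log_lik

    def predict(doc, log_prior, log_lik):
        """Pick the class maximizing log prior + sum of log likelihoods (unknown words are ignored)."""
        scores = {y: log_prior[y] + sum(log_lik[y].get(w, 0.0) for w in doc) for y in log_prior}
        return max(scores, key=scores.get)

    docs = [['cheap', 'pills'], ['meeting', 'tomorrow'], ['cheap', 'meeting', 'pills']]
    labels = ['spam', 'ham', 'spam']
    model = train_multinomial_nb(docs, labels)
    print(predict(['cheap', 'tomorrow'], *model))   # 'spam' (the word 'cheap' dominates)
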
2020-11-26 Hyperplanes and the perceptron algorithm. Block-Novikoff theorem. Voted perceptron. Dual form of the perceptron algorithm.
  • RN10 18.1, 18.2, 18.6.3, 18.7.2; D17 4.
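
A short sketch of the perceptron learning rule in its primal form; the toy dataset is made up and numpy is assumed to be available:

    import numpy as np

    def perceptron(X, y, epochs=100):
        """Perceptron learning rule: y must be in {-1, +1}; returns weights and bias.
        Makes a bounded number of mistakes on linearly separable data (Block-Novikoff theorem)."""
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            mistakes = 0
            for xi, yi in zip(X, y):
                if yi * (xi @ w + b) <= 0:       # misclassified (or on the boundary)
                    w += yi * xi
                    b += yi
                    mistakes += 1
            if mistakes == 0:                    # a full pass with no mistakes: done
                break
        return w, b

    # toy separable data: positive only when both inputs are 1
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([-1, -1, -1, 1])
    w, b = perceptron(X, y)
    print(np.sign(X @ w + b))   # [-1. -1. -1.  1.]
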
2020-11-30 Decision trees. Top down induction algorithm. Handling continuous attributes. Pruning. Splitting criteria (Gini, entropy, why classification error does not work).
  • RN10 18.3; D17 1.
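
A small sketch of the splitting criteria as functions of a label multiset; the example split is invented:

    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy of a label multiset, in bits."""
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def gini(labels):
        """Gini impurity of a label multiset."""
        n = len(labels)
        return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

    def information_gain(parent, children):
        """Entropy reduction obtained by splitting `parent` into the lists in `children`."""
        n = len(parent)
        return entropy(parent) - sum(len(c) / n * entropy(c) for c in children)

    labels = ['+'] * 5 + ['-'] * 5
    print(entropy(labels), gini(labels))                                      # 1.0  0.5
    print(information_gain(labels, [['+'] * 4 + ['-'], ['-'] * 4 + ['+']]))   # about 0.278
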
2020-12-03 Pruning and rule-pruning for decision trees. Sketch of the random forest algorithm.
  • RN10 18.3; D17 13.3; HTF09 15.2
2020-12-03 Introduction to Scikit-Learn.
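
A minimal scikit-learn example in the spirit of the lab; the choice of the Iris dataset and of a decision tree is illustrative, not necessarily what was shown in class:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import accuracy_score

    # load a built-in dataset and split it into training and test sets
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    # fit a decision tree with the entropy splitting criterion
    clf = DecisionTreeClassifier(criterion='entropy', random_state=0)
    clf.fit(X_train, y_train)

    # evaluate on the held-out test set
    print(accuracy_score(y_test, clf.predict(X_test)))
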

Note

Full text of linked papers is normally accessible when connecting from a UNIFI IP address. Use the proxy proxy-auth.unifi.it:8888 (with your credentials) if you are connecting from outside the campus network.