B003725 - Artificial Intelligence. Fall 2025

Bachelor degree in Computer Engineering, University of Florence

Instructor

Paolo Frasconi, DINFO, via di S. Marta 3, 50139 Firenze

email: .

Office Hours

Friday 11:15-13:15 and Monday 10:45-12:45 (S. Marta). Check here for any variations

Description

The course will introduce problems and basic algorithmic techniques in the following areas: search, constraint satisfaction, logical reasoning, probabilistic reasoning with graphical models, and machine learning.

Learning objectives

You will be able to design and apply simple search agents and solvers for constraint satisfaction problems, understand and apply simple probabilistic models, and solve basic data-driven classification problems. The course will serve as a foundation for further study in AI, as well as in engineering areas where AI is becoming predominant.

Prerequisites

B003368 (Algorithms and Data Structures)

Suggested readings

Textbook

[RN20]
S. Russell and P. Norvig. Artificial Intelligence: A Modern Approach. 4th edition. Pearson, 2020.

Other texts

[B12]
D. Barber. Bayesian Reasoning and Machine Learning. Cambridge University Press, 2012 (free PDF).
[P88]
J. Pearl. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, 1988 (PDF available from the Unifi network).
[D17]
H. Daumé III. A Course in Machine Learning. 2017 (free PDF).
[PM17]
D. L. Poole and A. K. Mackworth. Artificial Intelligence: Foundations of Computational Agents. Cambridge University Press, 2017 (free online).
[HTF09]
T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. 2nd edition. Springer, 2009 (free PDF).
[D03]
R. Dechter. Constraint Processing. Morgan Kaufmann, 2003.
[J96]
F. V. Jensen. An Introduction to Bayesian Networks. UCL Press, 1996.
Misc
Additional reading materials (papers or book chapters) are often listed on the side of each lecture.

Assessment

There is a final project and a final oral exam.

The final project will be assigned after a short discussion during office hours. Settling on a topic often takes several iterations, so please do not ask for a project assignment by email. You have some freedom in choosing the area and the modality of your project, but not the exact problem. The project typically consists of applying AI techniques to a simple real-world problem, or of implementing and experimentally verifying algorithmic techniques described in the course. You should be ready to answer practical and theoretical questions about your project during the oral exam.

The oral exam covers the rest of the course. During the exam, you should demonstrate both theoretical and practical mastery of the methods and algorithmic techniques described in the course.

Schedule and reading materials

Relevant sections of the textbook(s) are listed on the right side. You are expected to study these sections thoroughly and be able to discuss their contents during the exam. The listed additional materials (usually papers) are to be studied only when marked as "required", unless they are relevant to your project work, in which case you must study them thoroughly.

Date Topics Readings/Handouts
2025-09-17 Administrivia. Main topics in current AI and what will be covered in this course.
  • RN20 1
2025-09-19 Agents and environments. Search problems. States, actions, costs. Search graph. Search tree. General best-first algorithm.
  • RN20 2, 3.1-3.3.2
2025-09-24 Blind search algorithms: DFS, BFS, Uniform Cost. Analysis. Iterative deepening. Bidirectional search.
  • RN20 3.4, 3.5.1-3.5.4, 3.6.
2025-09-26 Heuristics. Algorithm A* and conditions for optimality. Dominance and empirical evaluation of heuristics. Pattern databases.
  • RN20 3.5.5, 4.1
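The A* scheme from this lecture can be sketched compactly. Below is a minimal, illustrative Python implementation; the toy graph, step costs, and heuristic values are invented for the example, not taken from the textbook:

```python
import heapq

def a_star(start, goal, neighbors, h):
    """A* search: always expand the frontier node minimising f(n) = g(n) + h(n).
    Returns (cost, path), or (None, None) if the goal is unreachable."""
    frontier = [(h(start), 0, start, [start])]   # entries: (f, g, node, path)
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path                       # optimal when h is admissible
        if g > best_g.get(node, float("inf")):
            continue                             # stale queue entry, skip
        for succ, step_cost in neighbors(node):
            g2 = g + step_cost
            if g2 < best_g.get(succ, float("inf")):
                best_g[succ] = g2
                heapq.heappush(frontier, (g2 + h(succ), g2, succ, path + [succ]))
    return None, None

# Toy weighted graph and an admissible heuristic (both invented).
graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)],
         "C": [("D", 1)], "D": []}
h = {"A": 2, "B": 1, "C": 1, "D": 0}.get
cost, path = a_star("A", "D", lambda n: graph[n], h)  # cost 3, path A-B-C-D
```

The `best_g` table plays the role of the reached/closed set: re-expanding a node is avoided by skipping entries whose recorded cost has since been improved.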
2025-10-01 Searching and optimizing goal states. Metaheuristics for local search: Hill climbing, local beam search, simulated annealing. Constraint satisfaction problems.
  • RN20 4.1.4, 6.1, 6.2
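Simulated annealing, mentioned above, can be sketched in a few lines. The objective, move set, and geometric cooling schedule below are invented for illustration:

```python
import math, random

def simulated_annealing(state, energy, neighbor, schedule, steps, rng):
    """Simulated annealing: always accept an improving move; accept a
    worsening move of size delta with probability exp(-delta / T)."""
    current, e = state, energy(state)
    for t in range(steps):
        T = schedule(t)
        nxt = neighbor(current, rng)
        e2 = energy(nxt)
        delta = e2 - e
        if delta < 0 or rng.random() < math.exp(-delta / T):
            current, e = nxt, e2
    return current, e

# Illustrative run: minimise f(x) = (x - 3)^2 over the integers with +-1
# moves and a geometric cooling schedule.
rng = random.Random(0)
best, val = simulated_annealing(
    0, lambda x: (x - 3) ** 2,
    lambda x, r: x + r.choice([-1, 1]),
    lambda t: 10.0 * 0.95 ** t,
    500, rng)
```

At high temperature the walk explores almost freely; as T decays the acceptance rule becomes effectively greedy, which is why the schedule matters as much as the move set.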
2025-10-03 Beam search. Examples of CSPs. Inference for CSPs (constraint propagation): Arc consistency, AC-3 and its analysis.
  • RN20 6.3
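The AC-3 procedure from this lecture can be written directly from its definition. A minimal sketch follows; the CSP instance (X < Y < Z over {1,2,3}) and the dictionary-of-arcs encoding are invented for the example:

```python
from collections import deque

def ac3(domains, constraints):
    """AC-3: prune values that have no support on any binary constraint.
    `constraints[(x, y)]` is a predicate on (value_of_x, value_of_y).
    Returns False if some domain becomes empty (inconsistent CSP)."""
    queue = deque(constraints)                   # all directed arcs (x, y)
    while queue:
        x, y = queue.popleft()
        pred = constraints[(x, y)]
        # Revise: collect values of x with no supporting value in D(y).
        revised = {vx for vx in domains[x]
                   if not any(pred(vx, vy) for vy in domains[y])}
        if revised:
            domains[x] -= revised
            if not domains[x]:
                return False
            # Arcs pointing at x may have lost support: re-enqueue them.
            queue.extend((z, w) for (z, w) in constraints if w == x)
    return True

# Invented toy CSP: X < Y and Y < Z over domain {1, 2, 3}.
doms = {v: {1, 2, 3} for v in "XYZ"}
cons = {("X", "Y"): lambda a, b: a < b, ("Y", "X"): lambda a, b: b < a,
        ("Y", "Z"): lambda a, b: a < b, ("Z", "Y"): lambda a, b: b < a}
ok = ac3(doms, cons)   # domains shrink to X={1}, Y={2}, Z={3}
```

On this instance constraint propagation alone solves the problem; in general AC-3 only narrows domains and search is still needed.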
2025-10-08 Backtracking search. Heuristics for variable and value ordering. Inference during backtracking: maintaining arc consistency and forward checking. Cutset conditioning. Dual problems and cluster trees.
  • RN20 6.5, 6.4
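Backtracking with MRV ordering and forward checking, as covered above, can be sketched for the special case of map coloring (every constraint is "neighbours differ"); the triangle instance is invented for the example:

```python
def backtracking_search(domains, neighbors):
    """Backtracking with MRV variable ordering and forward checking,
    specialised to map coloring (all constraints are inequality)."""
    def solve(assignment, domains):
        if len(assignment) == len(neighbors):
            return assignment
        # MRV: pick the unassigned variable with the fewest remaining values.
        var = min((v for v in domains if v not in assignment),
                  key=lambda v: len(domains[v]))
        for val in domains[var]:
            # Forward checking: remove val from each unassigned neighbour.
            pruned = {n: domains[n] - {val}
                      for n in neighbors[var] if n not in assignment}
            if all(pruned.values()):            # no domain wiped out
                result = solve({**assignment, var: val}, {**domains, **pruned})
                if result:
                    return result
        return None                              # backtrack
    return solve({}, domains)

# Invented instance: a triangle graph coloured with three colours.
nbrs = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B"}}
doms = {v: {"r", "g", "b"} for v in nbrs}
sol = backtracking_search(doms, nbrs)
```

Because forward checking removes an assigned value from every neighbour's domain, later assignments are automatically consistent with earlier ones and no separate consistency check is needed.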
2025-10-10 No class today.
2025-10-15 Junction trees and triangulated graphs. Cliques in triangulated graphs. Triangulation via variable elimination. Introduction to logic. Knowledge bases. Formulae. Syntax, semantics, interpretations.
  • RN20 6.5, 7.1-7.2
2025-10-17 Constraint modeling with Minizinc and CPMpy.
Ontological and epistemological commitments of a logic. Semantics. Interpretation functions. Propositional logic and its semantics. Entailment and its decidability.
  • RN20 7.1-7.4
2025-10-22 Propositional formulae. Valid and satisfiable formulae. Logical equivalence. Deduction theorem. Decidability of propositional logic. Reduction of entailment to SAT. Introduction to SAT solving. Conjunctive normal form.
  • RN20 7.4, 7.5
2025-10-24 SAT and the DPLL algorithm. Randomized algorithms. WalkSAT. Randomized k-SAT and phase transition. Introduction to theorem proving.
  • RN20 7.6.
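The core of DPLL (splitting plus unit propagation) fits in a short sketch. The clause encoding is DIMACS-style (positive/negative integers for literals); pure-literal elimination and the heuristics discussed in lecture are omitted, and the formula at the end is invented:

```python
def dpll(clauses, assignment=None):
    """DPLL on CNF: each clause is a set of nonzero ints, positive for a
    literal and negative for its negation. Returns a satisfying
    assignment (dict var -> bool) or None if unsatisfiable."""
    assignment = dict(assignment or {})
    # Unit propagation: repeatedly satisfy unit clauses.
    while True:
        units = [next(iter(c)) for c in clauses if len(c) == 1]
        if not units:
            break
        lit = units[0]
        assignment[abs(lit)] = lit > 0
        new = []
        for c in clauses:
            if lit in c:
                continue                  # clause satisfied, drop it
            c = c - {-lit}                # falsified literal removed
            if not c:
                return None               # empty clause: conflict
            new.append(c)
        clauses = new
    if not clauses:
        return assignment                 # all clauses satisfied
    # Split on the first literal of the first remaining clause.
    lit = next(iter(clauses[0]))
    for choice in (lit, -lit):
        result = dpll(clauses + [{choice}], assignment)
        if result is not None:
            return result
    return None

# Invented formula: (p or q) and (not p or q) and (not q or r).
model = dpll([{1, 2}, {-1, 2}, {-2, 3}])
```

Adding the branching choice as a unit clause lets the same propagation loop handle both decisions and their consequences.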
2025-10-29 Theorem proving. Inference rules and theorem proving as search. Soundness and completeness. Resolution. Ground resolution theorem.
  • RN20 7.5
2025-10-31 Definite clauses. Backward chaining and main ideas behind logic programming. Limitations of logic under uncertainty. Probability distributions on discrete spaces. Introduction to probabilistic inference. Computing conditional probabilities by marginalization. Hugin.
  • RN20 7.5.4, 12, B12 1
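Computing conditional probabilities by marginalising a full joint distribution, as introduced above, can be shown on a tiny table. The rain/sprinkler/wet-grass numbers below are invented for the example:

```python
from itertools import product

# Invented joint distribution P(Rain, Sprinkler, WetGrass); variables are
# binary and tuples are ordered (r, s, w).
P = {}
for r, s, w in product([0, 1], repeat=3):
    pr = 0.2 if r else 0.8
    ps = 0.1 if s else 0.9              # sprinkler independent of rain here
    pw = {(0, 0): 0.0, (0, 1): 0.9, (1, 0): 0.8, (1, 1): 0.99}[(r, s)]
    P[(r, s, w)] = pr * ps * (pw if w else 1 - pw)

def conditional(i, val, evidence):
    """P(X_i = val | evidence) by summing out the full joint table.
    `evidence` maps tuple positions to observed values."""
    match = lambda t: all(t[j] == v for j, v in evidence.items())
    num = sum(p for t, p in P.items() if match(t) and t[i] == val)
    den = sum(p for t, p in P.items() if match(t))
    return num / den

p_rain_given_wet = conditional(0, 1, {2: 1})   # P(Rain=1 | WetGrass=1)
```

This brute-force summation is exponential in the number of variables, which motivates the graphical-model inference algorithms covered in the following lectures.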
2025-11-05 Beliefs. Evidence and probabilistic reasoning. Independence and conditional independence. Using graphs to represent conditional independence. Markov networks and undirected separation.
  • RN20 12.2-12.4
2025-11-07 Examples of Markov networks. Markov chains. Language models. Hidden Markov models. I-maps, D-maps, perfect maps. Hammersley-Clifford theorem. Potential functions, normalized exponentials and clique features. Directed probabilistic graphical models. Factorization of joint distributions according to a directed graph. D-separation and conditional independence in directed networks.
  • B12 4.1-4.2, 3.1-3.3
  • Optional: B12 23
2025-11-12 Review of D-separation. Conditional probability tables. Expressiveness of Bayesian and Markov networks. Algorithmic problems for graphical models. Moralization. Junction trees for probabilistic inference. Initialization, message passing, inserting evidence.
  • B12 6; RN20 13.1-3
2025-11-14 Application example: POS-tagging with HMMs. Sketch of the Viterbi algorithm with beam search. Parameter learning in graphical models. Problem setup and assumptions. Maximum likelihood estimation.
  • RN20 20.2.5, 20.2.7, 20.2.2; B12 8.
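The Viterbi sketch from this lecture can be written out in log space. The two-tag model (N/V) and the two-word "sentence" below are invented for the example; beam-search pruning is omitted:

```python
import math

def log(p):
    return math.log(p) if p > 0 else float("-inf")

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most probable hidden-state sequence of an HMM, in log space.
    delta[s] = (best log-probability of a path ending in s, that path)."""
    delta = {s: (log(start_p[s] * emit_p[s].get(obs[0], 0.0)), [s])
             for s in states}
    for o in obs[1:]:
        new = {}
        for s in states:
            # Best predecessor for state s at this step.
            lp, path = max((delta[p][0] + log(trans_p[p][s]), delta[p][1])
                           for p in states)
            new[s] = (lp + log(emit_p[s].get(o, 0.0)), path + [s])
        delta = new
    return max(delta.values())

# Invented toy POS model: N = noun, V = verb.
lp, tags = viterbi(
    ["dogs", "bark"], ["N", "V"],
    {"N": 0.6, "V": 0.4},
    {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.6, "V": 0.4}},
    {"N": {"dogs": 0.4, "bark": 0.1}, "V": {"dogs": 0.1, "bark": 0.5}})
# tags == ["N", "V"]
```

A beam-search variant would simply keep only the k best entries of `delta` at each step instead of all of them.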
2025-11-19 Frequentist (maximum likelihood) and Bayesian parameter learning. Beta distribution. Bayesian conjugacy. Laplace smoothing. Dirichlet priors. Sketch of the structure learning algorithm for Bayesian networks.
  • RN20 20.1, 20.2
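To see Laplace smoothing concretely, here is a sketch comparing the maximum-likelihood estimate of a categorical distribution with the add-alpha estimate, which coincides with the posterior mean under a symmetric Dirichlet(alpha) prior; the counts are invented:

```python
def ml_estimate(counts):
    """Maximum-likelihood estimate of a categorical distribution."""
    n = sum(counts.values())
    return {v: c / n for v, c in counts.items()}

def laplace_estimate(counts, alpha=1):
    """Add-alpha (Laplace) smoothing: posterior mean under a
    symmetric Dirichlet(alpha) prior over k outcomes."""
    n = sum(counts.values())
    k = len(counts)
    return {v: (c + alpha) / (n + alpha * k) for v, c in counts.items()}

counts = {"a": 3, "b": 1, "c": 0}   # invented counts; "c" never observed
# ML assigns "c" probability 0; Laplace with alpha=1 assigns it 1/7.
```

The unseen outcome is exactly where the two estimates disagree: maximum likelihood rules it out entirely, while the smoothed estimate keeps a small reserve of probability mass for it.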
2025-11-21 Supervised learning. Examples. Empirical risk minimization. Generalization. Train and test error. Naive Bayes and its discrimination surface. Hyperplanes and the perceptron algorithm.
  • RN20 19.1,19.2
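The perceptron algorithm from this lecture can be sketched in a few lines. The AND-style dataset is invented for illustration:

```python
def perceptron(X, y, epochs=100):
    """Perceptron learning rule: on each mistake, move the hyperplane
    toward the misclassified example. X: feature tuples; y: labels in
    {-1, +1}. Returns weights w and bias b."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, t in zip(X, y):
            if t * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + t * xi for wi, xi in zip(w, x)]   # w += t * x
                b += t
                errors += 1
        if errors == 0:
            break          # converged (guaranteed if data are separable)
    return w, b

# Invented linearly separable dataset (AND-like labels in {-1, +1}).
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [-1, -1, -1, 1]
w, b = perceptron(X, y)
```

On linearly separable data the number of mistakes is bounded (the Block-Novikoff theorem covered in the following lecture), so the loop terminates; on non-separable data it would cycle forever without the epoch cap.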
2025-11-26 The separation surface of Naive Bayes (more details). Linearly and nonlinearly separable data. Block-Novikoff theorem (with proof). Dual form of the perceptron algorithm and its interpretation. Basic ideas behind kernel methods.
2025-11-28 No class today.
2025-12-03 No class today.
2025-12-05 Decision trees for classification. Greedy top-down algorithm. Impurity measures. Hyperparameters. The decision tree hypothesis space. Bias-variance tradeoff. Bagging. Random forests.
  • RN20 19.8
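The greedy split selection at the heart of top-down tree induction can be sketched for one numeric feature with Gini impurity; the data are invented:

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a label multiset: 1 - sum_k p_k^2."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def best_split(xs, labels):
    """Greedy choice of a threshold on a single numeric feature:
    minimise the size-weighted impurity of the two children."""
    best = None
    for t in sorted(set(xs))[:-1]:        # candidate thresholds
        left = [l for x, l in zip(xs, labels) if x <= t]
        right = [l for x, l in zip(xs, labels) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if best is None or score < best[0]:
            best = (score, t)
    return best

# Invented 1-D dataset with a clean class boundary between 3 and 10.
xs = [1, 2, 3, 10, 11, 12]
labels = ["a", "a", "a", "b", "b", "b"]
score, threshold = best_split(xs, labels)   # perfect split at threshold 3
```

A full tree builder applies this choice recursively to each child, stopping via hyperparameters such as maximum depth or minimum leaf size.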
2025-12-10 Adaboost and its analysis.
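AdaBoost's weighting scheme can be sketched in an illustrative setup where the weak hypotheses come from a fixed pool (the lecture's version learns decision stumps on the fly); the stump pool and dataset below are invented:

```python
import math

def adaboost(weak_learners, X, y, rounds=5):
    """AdaBoost over a fixed pool of weak hypotheses h: x -> {-1, +1}."""
    n = len(X)
    D = [1.0 / n] * n                         # example weights
    ensemble = []                             # pairs (alpha_t, h_t)
    for _ in range(rounds):
        def w_err(h):
            return sum(d for d, x, t in zip(D, X, y) if h(x) != t)
        h = min(weak_learners, key=w_err)     # lowest weighted error
        err = w_err(h)
        if err >= 0.5:
            break                             # no hypothesis beats chance
        alpha = 0.5 * math.log((1 - err) / max(err, 1e-12))
        ensemble.append((alpha, h))
        # Reweight: misclassified examples gain weight, then renormalise.
        D = [d * math.exp(-alpha * t * h(x)) for d, x, t in zip(D, X, y)]
        z = sum(D)
        D = [d / z for d in D]
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

# Invented pool of 1-D threshold stumps and a separable dataset.
stumps = [lambda x: 1 if x > 0 else -1, lambda x: 1 if x <= 0 else -1]
X = [-2, -1, 1, 2]
y = [-1, -1, 1, 1]
clf = adaboost(stumps, X, y, rounds=3)
```

The analysis covered in lecture shows that the training error of the weighted vote drops exponentially in the number of rounds, as long as each weak hypothesis has error bounded away from 1/2.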

Note

Full text of linked papers is normally accessible when connecting from a UNIFI IP address. Use proxy-auth.unifi.it:8888 (with your credentials) if you are connecting from outside the campus network.