CS670: Graduate Computer Vision

• Fall 2013; In LGRC A310, 10:30-11:50, Monday and Wednesday

Instructor

Erik Learned-Miller
elm at cs.umass.edu
(413) 545-2993

Grader: Raelen "Rae" Recto. Email: recto@cs.umass.edu
Draft Syllabus: syllabus.pdf

Prerequisites

Readings

1. Introduction to Computer Vision

2. A review of basic probability.

3. Supervised learning and Bayesian classification

4. Entropy and mutual information

5. Image Alignment, by Rick Szeliski

6. Lightness Perception and Lightness Illusions

7. Unsupervised learning in vision.

8. Light and cameras.

Resources

MATLAB Tutorial

Matlab Diary for Lecture on Sep. 4


Interesting Links

The Necker Cube illusion

Checker shadow illusion

Movie on optical illusions

Early color photographs by S. M. Prokudin-Gorsky

Description

Schedule
Date | Lecture topic | New assignments | Assignments due | Reading
Sep. 4 UNIT 1: Introduction. What is Computer Vision? What are the goals of computer vision? What should we take from humans? Also, Intro to Matlab. See the Matlab Diary of the in-class session under "Resources" above. Lecture slides. Assignment 1: Probabilistic classification.

Digit data for assignment 1

Due by midnight on Sep. 11. Please email the solution to Rae as a zipped tar file of all necessary files.



Readings 1, 2, and 3 from the "Readings" list above by Wednesday, September 11.
Sep. 9 UNIT 2: Probability, Statistics, and Learning Basics. Review of basic discrete probability. Sample spaces. Events. Joint Probability. Conditional Probability. Marginalization. Role of Probability and Statistics in Computer Vision. Bayes' rule. Likelihoods, priors, and posteriors. Estimating likelihoods, priors, and posteriors.

Lecture slides
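
As a quick illustration of Bayes' rule from this lecture, here is a minimal MATLAB sketch for a two-class problem with a single binary pixel feature; all of the numbers are made up for illustration and are not taken from the assignment data.

    % Bayes' rule for two classes given one binary pixel feature x.
    % All numbers are hypothetical, for illustration only.
    prior  = [0.5 0.5];             % P(class = 1), P(class = 2)
    lik_x1 = [0.8 0.3];             % P(x = 1 | class), for each class
    lik_x0 = 1 - lik_x1;            % P(x = 0 | class)

    x = 1;                          % the observed feature value
    if x == 1, lik = lik_x1; else lik = lik_x0; end

    evidence  = sum(lik .* prior);        % P(x), by marginalization
    posterior = (lik .* prior) / evidence % P(class | x), Bayes' rule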



Sep. 11

MAP classification. Optimality of MAP classification with true posteriors. Feature selection. Which pixel is best? Which two pixels are best?

Statistical independence. Mutual information. Information gain. Lecture slides



Assignment 1 Due by midnight
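
A rough MATLAB sketch of estimating the mutual information (information gain) between one discrete feature (e.g., a binarized pixel) and the class label from counts. The function name feature_mi and the variable names are assumptions for illustration, not part of the assignment files; save it as feature_mi.m.

    function mi = feature_mi(x, c)
    % Mutual information I(X; C), in bits, between a discrete feature x
    % and class labels c, both N x 1 vectors, estimated from counts.
    xs = unique(x);  cs = unique(c);  N = numel(x);
    mi = 0;
    for i = 1:numel(xs)
        px = sum(x == xs(i)) / N;                     % P(X = x_i)
        for k = 1:numel(cs)
            pc  = sum(c == cs(k)) / N;                % P(C = c_k)
            pxc = sum(x == xs(i) & c == cs(k)) / N;   % joint P(X = x_i, C = c_k)
            if pxc > 0
                mi = mi + pxc * log2(pxc / (px * pc));
            end
        end
    end
    end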
Sep. 16 Feature selection. Maximum information gain and greedy maximum information gain. Lecture slides

Reading number 4, for next lecture.
Assignment 2: Greedy Feature Selection. Due on Sep. 30, midnight.
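
A rough sketch of the greedy selection loop for two binary pixel features, reusing the feature_mi helper sketched under Sep. 11. X (an N x D binarized data matrix) and c (labels) are assumed variable names; this is only an illustration of the idea, not the assignment's required interface.

    % Greedy selection of two binary pixels by information gain.
    % X: N x D binarized image matrix, c: N x 1 labels (assumed names).
    D = size(X, 2);
    mi1 = zeros(1, D);
    for j = 1:D
        mi1(j) = feature_mi(X(:, j), c);       % MI of each single pixel
    end
    [~, first] = max(mi1);                     % best single pixel

    mi2 = -inf(1, D);
    for j = 1:D
        if j == first, continue; end
        pair = 2 * X(:, first) + X(:, j);      % encode the pixel pair as 0..3
        mi2(j) = feature_mi(pair, c);          % info the PAIR gives about c
    end
    [~, second] = max(mi2);                    % best pixel to add greedily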
Sep. 18 UNIT 3: Alignment. Motivation and definitions. Image-to-image, image-to-model, and joint alignment. The core matching problem: aligning patch J to image I. Families of transformations: translations, rigid, similarity, affine, linear, homographies (perspective), diffeomorphisms. Implementing transformations as "looking back" to the original image using the transform inverse.



Reading number 5, SECTIONS 2.0, 2.1, 3.0, 3.1
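
A minimal MATLAB sketch of the "looking back" idea for a similarity transform, assuming a grayscale image I has already been loaded; the parameter values are arbitrary examples.

    % Warp grayscale image I by a similarity transform, implemented by
    % mapping each OUTPUT pixel back through the inverse transform.
    theta = 0.1;  s = 1.2;  t = [3; -2];              % example parameters
    A = s * [cos(theta) -sin(theta); sin(theta) cos(theta)];

    [h, w]   = size(I);
    [xo, yo] = meshgrid(1:w, 1:h);                    % output pixel grid
    p  = A \ [xo(:)' - t(1); yo(:)' - t(2)];          % inverse map: A^(-1) (x - t)
    xs = reshape(p(1, :), h, w);
    ys = reshape(p(2, :), h, w);
    J  = interp2(double(I), xs, ys, 'linear', 0);     % "look back" and sample I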
Sep. 23 Transformations continued. Mechanics of forward and backward transformations.

Alignment criteria, briefly.

Optimization of alignment criteria. Exhaustive search, keypoint methods, gradient descent.

Mutual-information-based alignment of medical images.
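
As a sketch of the mutual-information criterion, here is one way to estimate the MI between two same-size grayscale images A and B from a joint intensity histogram; the 32-bin choice and the assumption of intensities in 0..255 are arbitrary, not from the lecture.

    % Mutual information between two same-size grayscale images A and B,
    % estimated from a 32 x 32 joint histogram of intensities (0..255).
    nbins = 32;
    a = double(A(:));  b = double(B(:));
    ai = min(floor(a / 256 * nbins) + 1, nbins);          % bin index per pixel
    bi = min(floor(b / 256 * nbins) + 1, nbins);
    joint = accumarray([ai bi], 1, [nbins nbins]);        % joint counts
    pab = joint / sum(joint(:));                          % joint distribution
    pa  = sum(pab, 2);  pb = sum(pab, 1);                 % marginals
    prod_marg = pa * pb;                                  % outer product of marginals
    nz = pab > 0;
    mi = sum(pab(nz) .* log2(pab(nz) ./ prod_marg(nz)))   % I(A; B) in bits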



Sep. 25 Alignment continued.

Brief detour: high-level anatomy of the eye: pupil, iris, retina, rod and cone cells.

Trichromatic color vision (tristimulus theory) and the three human cone types. Reproducing the three cone responses is enough to reproduce any color percept.

Prokudin-Gorsky photographs.

Lecture slides from last 3 lectures.



Sep. 30 Congealing: Joint alignment. Lecture 1. Assignment 3: Automatic alignment of Prokudin-Gorsky plates.

Plates

Due by midnight on Oct. 14.
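
A very rough baseline sketch for aligning one plate channel B to another channel A by exhaustive search over small integer translations, scored with sum of squared differences on the image interior; the search range and scoring choice are assumptions, not the required method for the assignment.

    % Align channel B to channel A over integer shifts in [-15, 15],
    % scoring the overlap with sum of squared differences (SSD).
    A = double(A);  B = double(B);
    best = inf;  best_dx = 0;  best_dy = 0;
    for dx = -15:15
        for dy = -15:15
            Bs = circshift(B, [dy dx]);                    % shifted candidate
            d  = A(16:end-15, 16:end-15) - Bs(16:end-15, 16:end-15);
            score = sum(d(:) .^ 2);                        % SSD on the interior
            if score < best
                best = score;  best_dx = dx;  best_dy = dy;
            end
        end
    end
    aligned = circshift(B, [best_dy best_dx]);             % best translation found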



Oct. 2 Congealing continued. Complex image congealing. Congealing on 3D arrays. Congealing with brightness transformations.

Lecture slides from last 2 lectures.



Oct. 7 Convolution.

Convolution and non-parametric density estimation slides.
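
A tiny MATLAB example of 2-D convolution, smoothing a grayscale image I with a box filter; the filter size is arbitrary.

    % Smooth an image by convolving with a 5 x 5 box filter.
    box = ones(5) / 25;                        % averaging kernel, sums to 1
    smoothed = conv2(double(I), box, 'same');  % 'same' keeps the image size
    imagesc(smoothed); colormap gray;          % display the result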



Oct. 9 Maximum likelihood estimation and non-parametric density estimation (also known as Parzen window estimation, kernel density estimation).

Some simple matlab examples.
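
A small sketch of a 1-D Parzen-window (kernel) density estimate with a Gaussian kernel; the samples and bandwidth below are invented for illustration.

    % 1-D kernel density estimate from samples x, evaluated at points t.
    x = [1.1 1.9 2.3 4.0 4.2];                    % made-up samples
    h = 0.5;                                      % kernel bandwidth (std. dev.)
    t = linspace(0, 6, 200);                      % where to evaluate the density
    p = zeros(size(t));
    for i = 1:numel(x)
        p = p + exp(-(t - x(i)).^2 / (2*h^2));    % one Gaussian bump per sample
    end
    p = p / (numel(x) * h * sqrt(2*pi));          % normalize so p integrates to 1
    plot(t, p)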



Oct. 15 Distribution fields. Exploding an image. Convolving with a Gaussian. Basin of attraction with distribution fields. Likelihood match. Sharpening match.
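
A rough MATLAB sketch of "exploding" a grayscale image I into a distribution field over 8 intensity bins and then smoothing each layer spatially with a Gaussian; the bin count and smoothing width are arbitrary choices, and fspecial requires the Image Processing Toolbox.

    % Build a simple distribution field: one layer per intensity bin,
    % then blur each layer spatially ("explode", then smooth).
    nbins = 8;
    bin = min(floor(double(I) / 256 * nbins) + 1, nbins);   % bin index per pixel
    [h, w] = size(I);
    df = zeros(h, w, nbins);
    for k = 1:nbins
        df(:, :, k) = (bin == k);                       % indicator layers
    end
    g = fspecial('gaussian', 9, 2);                     % 9 x 9 Gaussian, sigma = 2
    for k = 1:nbins
        df(:, :, k) = conv2(df(:, :, k), g, 'same');    % spatial smoothing
    end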



Oct. 16 Distribution fields continued.

Slides from last few lectures on distribution fields and image comparison functions.



Oct. 21 Finish distribution fields. Start Features Unit.

Oct. 23 Features

Oct. 28 PROJECT DESCRIPTIONS. CHOOSE PROJECT BY Nov. 4. Turn in a short description of the proposed project to Prof. Learned-Miller as a single-page PDF by Nov. 4.

More features. SIFT. SIFT keypoints. Difference of Gaussian scale space. Extrema of difference of Gaussians. Unstable keypoints: low contrast and low minimum curvature points.

SIFT slides used in class. NOTE: see the notes at the bottom of the linked slides; they give a short explanation of each slide.

Project descriptions from class.
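
A small sketch of one level of a difference-of-Gaussian (DoG) computation on a grayscale image I, of the kind used to build the SIFT scale space; the sigma values are illustrative only.

    % One difference-of-Gaussians layer: blur at two nearby scales, subtract.
    sigma = 1.6;  k = sqrt(2);                                 % illustrative scales
    g1 = fspecial('gaussian', 2*ceil(3*sigma) + 1, sigma);
    g2 = fspecial('gaussian', 2*ceil(3*k*sigma) + 1, k*sigma);
    L1 = conv2(double(I), g1, 'same');
    L2 = conv2(double(I), g2, 'same');
    dog = L2 - L1;    % local extrema across space and scale are candidate keypoints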



Oct. 30 SIFT continued. More technical detail on low minimum curvature keypoints. Orientation of a keypoint: maxima of angle-frequency distribution.
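
A rough sketch of orientation assignment from the peak of a magnitude-weighted gradient-orientation histogram over a patch P around a keypoint; the 36-bin choice follows the usual SIFT recipe, and other weighting details are omitted.

    % Orientation assignment: histogram of gradient angles over patch P,
    % weighted by gradient magnitude; the peak gives the keypoint orientation.
    [gx, gy] = gradient(double(P));
    mag = sqrt(gx.^2 + gy.^2);
    ang = atan2(gy, gx);                               % angles in (-pi, pi]
    edges = linspace(-pi, pi, 37);                     % 36 orientation bins
    h36 = zeros(1, 36);
    for b = 1:36
        in = ang >= edges(b) & ang < edges(b+1);
        h36(b) = sum(mag(in));                         % magnitude-weighted votes
    end
    [~, peak] = max(h36);
    orientation = (edges(peak) + edges(peak+1)) / 2    % dominant orientation (radians)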

Nov. 4 SIFT descriptors.

Nov. 6 Image formation: Light, cameras, and signal transduction.

Nov. 11 HOLIDAY, NO CLASS.

Nov. 13 Guest Lecture: Andrew Kae. Topic: Conditional Random Fields, Restricted Boltzmann machines, and face segmentation.

Andrew Kae guest lecture slides.



Nov. 18 Image formation: lecture 2

Nov. 20 Image formation: lecture 3

Slides from last 3 lectures

Study Guide, Part 1

Study Guide, Part 2



Nov. 25 TEST IN CLASS

Nov. 27

Dec. 2

Dec. 4 LAST DAY of class.