CS370: Introduction to Computer Vision

• Spring 2014; Monday and Wednesday, 10:35-11:50 • Computer Science Building, Room 142

Instructor

Erik Learned-Miller
elm at cs.umass.edu
(413) 545-2993

Teaching Assistant

Cheni Chadowitz
Email: First name, followed by "@", followed by "cs.umass.edu"
TA Hours: Tuesdays 4:00-5:00, Wednesdays 3:30-4:30, Room 256.

See Cheni's web page here for more information.

Please write "CS370" in the subject line of any email regarding the class or containing assignments.

Prerequisites

CMPSCI 240 or CMPSCI 383, with a grade of 'C' or better in whichever of these courses you took. Math 132 is also required, since it is a prerequisite for CMPSCI 240.

Assignments

Assignments should be emailed to the Teaching Assistant only.

Reading Materials

Textbook: Computer Vision: Algorithms and Applications, by Richard Szeliski: On-line copy available here

I do NOT recommend buying the textbook unless you want it for your own purposes. We will use it somewhat in this course, but not a lot. You should be able to get by with the on-line version.

Resources

MATLAB Tutorial

Interesting Links

Beau Lotto Ted Talk on optical illusions

Checker shadow illusion

Early color photographs by S. M. Prokudin-Gorsky

Flower Garden movie.

Description

Schedule
Date Lecture topic New assignments Assignments due Reading
Jan. 22 UNIT 1: Introduction. What is Computer Vision? What are the goals of computer vision? What can we learn by studying the human vision system?

Transcript from the MATLAB session in class.

Lecture slides.

Introduction to MATLAB.

Assignment 1: Colorizing the Prokudin-Gorsky photo collection
As. 1 due Jan. 29th, at 11:59pm

Introduction to computer vision
Jan. 27 UNIT 2: Image Formation. EM Spectrum. Distribution of wavelengths of common light sources. Linearity of light and its interaction with surfaces.

Lecture slides.



Jan. 29 UNIT 2: Image Formation. How light travels from a point source to a surface and then to the eye, and finally how it is absorbed by the different photoreceptors in the eye. Intro to the anatomy of the eye: rod and cone cells on the retina.

Lecture slides.



Feb. 3 UNIT 2: Image Formation. Pinhole cameras and perspective projection. Cameras with lenses. Pros and cons of pinhole cameras and cameras with lenses. Basics of perspective projection.

Lecture slides.
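
A minimal MATLAB sketch of perspective projection under the pinhole model; the focal length and the 3-D points below are made-up values, used only to show that doubling the depth halves the projected coordinates:

  % Perspective projection with a pinhole camera:
  % a 3-D point (X, Y, Z) in camera coordinates maps to
  % image coordinates (x, y) = (f*X/Z, f*Y/Z).
  f = 0.05;                    % focal length (hypothetical, in meters)
  P = [1.0 0.5 2.0;            % each row is a 3-D point [X Y Z]
       1.0 0.5 4.0];           % the same point, twice as far away
  x = f * P(:,1) ./ P(:,3);    % projected x-coordinates
  y = f * P(:,2) ./ P(:,3);    % projected y-coordinates
  disp([x y])                  % the farther point projects closer to the image center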



Feb. 5 SNOW DAY



Feb. 10 UNIT 2: Image Formation. Digital camera sensors. Charge coupled devices. 1-CCD and 3-CCD cameras. Bayer filters. Interpolating from Bayer filters to 3-Color images.

Digitization of brightness signals. Trade-off of spatial resolution and brightness resolution. Number of bits per pixel.

Increasing the image contrast: Histogram equalization versus brightness remapping. Lecture slides.
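
A minimal MATLAB sketch of histogram equalization done by hand with a cumulative histogram and a look-up table (rather than calling histeq directly); cameraman.tif is a stock MATLAB test image, not one used in class, and imhist/imshow assume the Image Processing Toolbox:

  % Histogram equalization via the cumulative distribution of gray levels.
  I = imread('cameraman.tif');          % 8-bit grayscale test image
  counts = imhist(I, 256);              % histogram of the 256 gray levels
  cdf = cumsum(counts) / numel(I);      % cumulative distribution, in [0, 1]
  lut = uint8(round(255 * cdf));        % look-up table: old gray level -> new gray level
  J = lut(double(I) + 1);               % apply the look-up table (MATLAB indices start at 1)
  figure;
  subplot(1,2,1); imshow(I); title('original');
  subplot(1,2,2); imshow(J); title('equalized');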



Feb. 12 Image enhancement! Histogram equalization, look-up tables, and pseudo-color.

UNIT 3: Pattern Recognition and Classification. Elements of pattern recognition. The sets of images of objects of a given class. Measures of differences (and distances) between images.

Supervised learning as a formalization of learning from examples.

Handwritten digit recognition. Features.

Nearest neighbor classification. Euclidean distance in 2, 3, and many dimensions. Lecture slides.

Assignment 2: Image Formation
As. 2 due Wed., Feb. 19th, at 11:59pm

Feb. 18 UNIT 3: Pattern Recognition and Classification. More supervised learning.

Nearest neighbor classification. Euclidean distance in 2, 3, and many dimensions. Lecture slides.


Introduction to supervised learning
Feb. 19 UNIT 3: Pattern Recognition and Classification. Dealing with variability of images due to translation, rotation, and scaling.
Four methods: Massive training sets, alignment, invariant metrics, invariant features.
Weighted Euclidean distance.
No lecture slides for this class.

Feb. 24 UNIT 3: Pattern Recognition and Classification. MATLAB tutorial. Vectorizing code. Efficiently computing nearest neighbor, etc.
See Cheni's web page for code.

No lecture slides for this class.
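
This is not the code from Cheni's page, just a minimal MATLAB sketch of a vectorized 1-nearest-neighbor classifier; the random matrices and all variable names are made up for illustration:

  % Vectorized 1-nearest-neighbor classification with Euclidean distance.
  N = 500; M = 20; d = 784;                        % e.g. 28x28 digit images as vectors
  train = rand(N, d);  trainLabels = randi(10, N, 1);
  test  = rand(M, d);
  % Squared distances between every test and training example, with no loops:
  % ||a - b||^2 = ||a||^2 - 2*a.b + ||b||^2
  D = bsxfun(@plus, sum(test.^2, 2), sum(train.^2, 2)') - 2 * (test * train');
  [~, idx] = min(D, [], 2);                        % index of the nearest training example
  predicted = trainLabels(idx);                    % predicted label for each test example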

Feb. 26 UNIT 3: Pattern Recognition and Classification. A probabilistic approach to supervised learning.
Likelihoods, priors, and posteriors. Bayes' rule.
Estimating likelihoods and priors from data.
How much data do you need to estimate a likelihood?
Impracticality of estimating likelihoods from images and the need for simple features.

No lecture slides for this class.

A review of basic probability.
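
A tiny MATLAB sketch of Bayes' rule for a two-class digit problem with one binary feature; every probability below is invented for illustration:

  % Bayes' rule: posterior is proportional to likelihood times prior.
  % Classes are '1' and '7'; the feature is "the center pixel is on" (made up).
  prior      = [0.6 0.4];          % P(class=1), P(class=7)
  likelihood = [0.9 0.2];          % P(feature on | class=1), P(feature on | class=7)
  evidence   = sum(likelihood .* prior);            % P(feature on)
  posterior  = (likelihood .* prior) / evidence;    % P(class | feature on)
  disp(posterior)                  % sums to 1; class '1' is the more probable class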

March 3 UNIT 3: Pattern Recognition and Classification. Simple features: single pixels, two pixels, n pixels.
Statistical independence, mutual information.
Greedy feature selection and maximizing the information gain.


Entropy, Joint entropy, and mutual information
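
A minimal MATLAB sketch of entropy and mutual information for a binary feature X and a binary class label Y; the joint distribution below is invented for illustration, and the information gain of a feature is exactly this mutual information with the class label:

  % Entropy, joint entropy, and mutual information from a 2x2 joint distribution.
  Pxy = [0.30 0.10;                % P(X=0,Y=0)  P(X=0,Y=1)
         0.15 0.45];               % P(X=1,Y=0)  P(X=1,Y=1)
  Px = sum(Pxy, 2);                % marginal distribution of X
  Py = sum(Pxy, 1);                % marginal distribution of Y
  H = @(p) -sum(p(p > 0) .* log2(p(p > 0)));        % entropy in bits
  Hx = H(Px);  Hy = H(Py);  Hxy = H(Pxy(:));
  MI = Hx + Hy - Hxy;              % I(X;Y) = H(X) + H(Y) - H(X,Y)
  fprintf('H(X)=%.3f  H(Y)=%.3f  I(X;Y)=%.3f bits\n', Hx, Hy, MI);
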
March 5 UNIT 3: Pattern Recognition and Classification. Information gain, continued.


Assignment 3: Supervised digit classification
March 10 UNIT 4: Filtering, edges, and complex features.
Different types of features, and criteria of good features. Why not use just pixels? Why not use the whole image? Motivations for good features.

Lecture slides.
March 12 UNIT 4: Filtering, edges, and complex features. Linear filtering.

Filtering as a comparison operation. Applications of filtering. Smoothing, edge enhancement, edge detection, shifting.

Lecture slides.
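
A minimal MATLAB sketch contrasting a smoothing (averaging) filter with a simple derivative filter for edge enhancement; cameraman.tif is a stock MATLAB test image, and imshow assumes the Image Processing Toolbox:

  % Linear filtering: smoothing vs. edge enhancement with conv2.
  I = double(imread('cameraman.tif'));
  box = ones(5) / 25;                      % 5x5 averaging (smoothing) filter
  dx  = [-1 0 1];                          % simple horizontal derivative filter
  smoothed = conv2(I, box, 'same');        % blurred image
  edges    = abs(conv2(I, dx, 'same'));    % strong response at vertical edges
  figure;
  subplot(1,3,1); imshow(I, []);        title('original');
  subplot(1,3,2); imshow(smoothed, []); title('smoothed');
  subplot(1,3,3); imshow(edges, []);    title('|d/dx|');
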
March 17, 19 SPRING BREAK


March 24 UNIT 4: SIFT features. SIFT slides used in class (Michal Erel's slides)
Two core elements of SIFT: keypoints and descriptors. Gaussian pyramid. Building the Gaussian pyramid by repeated filtering with a Gaussian filter. Difference-of-Gaussian pyramid. Keypoints as local extrema of the DOG pyramid.
MATLAB Code examples:
pyramid.m
filt.m
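
Not the pyramid.m / filt.m used in class, just a minimal MATLAB sketch of the same idea for one octave: blur repeatedly with a Gaussian filter and subtract neighboring levels to get the DOG stack (fspecial assumes the Image Processing Toolbox; the sigma and number of levels are arbitrary):

  % Gaussian and Difference-of-Gaussian (DOG) stack for a single octave.
  I = double(imread('cameraman.tif'));
  nLevels = 5;
  g = fspecial('gaussian', 9, 1.6);        % 9x9 Gaussian filter, sigma = 1.6
  G = cell(1, nLevels);
  G{1} = I;
  for k = 2:nLevels
      G{k} = conv2(G{k-1}, g, 'same');     % repeated filtering = increasing blur
  end
  D = cell(1, nLevels - 1);
  for k = 1:nLevels - 1
      D{k} = G{k+1} - G{k};                % one DOG level: difference of adjacent blurs
  end
  % Keypoints are then the local extrema of D across space and level.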


March 26 UNIT 4: SIFT continued. Review of the Gaussian pyramid and Difference-of-Gaussian pyramid. Finding the extrema of the DOG pyramid. Refining the extrema using the Taylor series method (you will not be responsible for this particular material). Eliminating unstable keypoints.

Estimating the dominant orientation of a patch.


Exam review
March 31 MID-TERM, IN CLASS


April 2 SIFT: the descriptor as a 4x4 array of orientation histograms.
Detailed discussion of finding the local maxima of a function from a discrete set of function samples.
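
A minimal MATLAB sketch of the standard trick for refining a discrete maximum: fit a parabola through the peak sample and its two neighbors and take the vertex (the sample values below are made up):

  % Sub-sample peak location from three samples around a discrete maximum.
  y = [3.0 7.0 9.5 9.0 4.0];               % function samples at integer positions 1..5
  [~, k] = max(y);                          % discrete argmax (k = 3 here)
  offset = (y(k-1) - y(k+1)) / (2 * (y(k-1) - 2*y(k) + y(k+1)));
  peak = k + offset;                        % refined, sub-sample estimate of the maximum
  fprintf('discrete max at %d, refined max at %.3f\n', k, peak);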


April 7 Random Sample Consensus (RANSAC). Application to Image Stitching.

Lecture slides.
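
A minimal MATLAB sketch of the RANSAC loop, shown here for a 2-D line fit; in the stitching application the model becomes a homography estimated from keypoint matches, and the threshold, sample size, and iteration count below are arbitrary:

  % RANSAC: repeatedly fit the model to a random minimal sample and
  % keep the model that agrees with the most points (the inliers).
  x = rand(100,1) * 10;  y = 2*x + 1 + 0.1*randn(100,1);   % points near a line
  y(1:20) = rand(20,1) * 20;                               % contaminate with 20 outliers
  bestInliers = [];
  for iter = 1:500
      s = randperm(100, 2);                 % minimal sample: 2 points define a line
      p = polyfit(x(s), y(s), 1);           % candidate line y = p(1)*x + p(2)
      err = abs(y - polyval(p, x));         % residual of every point to the candidate
      inliers = find(err < 0.5);            % points within the inlier threshold
      if numel(inliers) > numel(bestInliers)
          bestInliers = inliers;            % best model so far
      end
  end
  pFinal = polyfit(x(bestInliers), y(bestInliers), 1);     % refit using all inliers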

April 9 Alignment: Alternatives to keypoint methods.
Basics of alignment. Ingredients of alignment. Exhaustive search. Gradient methods. Mutual information alignment.


April 14 Alignment continued. Tracking.

Lecture slides.

Assignment 4: Image Stitching with RANSAC
April 16 Finish alignment.

MATLAB code demonstrated in class:
Forward transform version 1
Forward transform version 2
Forward transform version 3
Forward transform version 4
Forward transform version 5

Reverse transform version 1
Reverse transform version 2
Reverse transform version 3
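
The files above are what was shown in class; as a rough sketch of the reverse-transform idea, here is a minimal MATLAB rotation warp that loops over destination pixels, inverse-maps each one into the source image, and samples there, so the output has no holes (nearest-neighbor sampling and rotation about the image center are arbitrary choices):

  % Reverse (inverse) warp: for each destination pixel, ask where it
  % came from in the source image and sample the source there.
  I = double(imread('cameraman.tif'));
  [h, w] = size(I);
  theta = 15 * pi/180;
  R = [cos(theta) -sin(theta); sin(theta) cos(theta)];   % forward rotation matrix
  J = zeros(h, w);
  for r = 1:h
      for c = 1:w
          src = R \ [c - w/2; r - h/2];                  % inverse-map this destination pixel
          sc = round(src(1) + w/2);  sr = round(src(2) + h/2);
          if sr >= 1 && sr <= h && sc >= 1 && sc <= w
              J(r, c) = I(sr, sc);                       % nearest-neighbor sample of the source
          end
      end
  end
  imshow(J, []);                                          % requires the Image Processing Toolbox
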
April 21 HOLIDAY: PATRIOTS' DAY. NO CLASS.



April 23 Alignment by gradient descent. Tracking using alignment by gradient descent. Derivative of image matching function with respect to x-coordinate. Derivative with respect to y-coordinate. Calculating the normalized gradient vector.

Coordinate descent.

Assignment 5: A simpler tracker
Due on the last day of class.
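
Not the assignment solution, just a minimal MATLAB sketch of translation-only alignment by gradient descent on the sum-of-squared-differences matching function, with finite-difference image gradients and a fixed step along the normalized gradient (the template location, initial guess, step size, and iteration count are all made up):

  % Align a template T to an image I by gradient descent on
  % E(tx,ty) = sum over pixels ( I(x+tx, y+ty) - T(x,y) )^2.
  I = double(imread('cameraman.tif'));
  T = I(101:180, 121:200);               % template cut from the image itself
  [th, tw] = size(T);                    % true offset is (tx, ty) = (120, 100)
  [X, Y] = meshgrid(1:tw, 1:th);
  t = [117; 97];                         % initial guess, a few pixels off
  step = 0.25;                           % fixed step length, in pixels
  for iter = 1:100
      W  = interp2(I, X + t(1), Y + t(2), 'linear', 0);         % patch under current shift
      Ix = interp2(I, X + t(1) + 1, Y + t(2), 'linear', 0) - W; % finite-difference x-gradient
      Iy = interp2(I, X + t(1), Y + t(2) + 1, 'linear', 0) - W; % finite-difference y-gradient
      err = W - T;                                              % residual image
      g = [sum(err(:) .* Ix(:)); sum(err(:) .* Iy(:))];         % dE/dtx, dE/dty (up to a factor of 2)
      t = t - step * g / (norm(g) + eps);                       % step along the normalized gradient
  end
  disp(t')                               % should move toward the true offset [120 100]
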
April 28 Face recognition: Part 1.

The face recognition pipeline: detection, alignment, recognition.
Recognition: verification vs. identification.
Hyper-features: Learning which features are consistent and discriminative.
April 30 LAST DAY OF CLASS (NOTE: THERE WILL BE A REVIEW FOR THE FINAL ON THURSDAY, TIME TO BE ANNOUNCED IN CLASS)

Face recognition: Part 2.

May 2nd FINAL EXAM: 1:30-3:30pm, Engineering Laboratory (ELAB), Room 303.