Pattern Recognition - CS 663 - Winter 2018

Schedule: TU 9-10:20 AM

Building: 61 Room: 106G

Textbooks:

Machine Learning: A Probabilistic Perspective by Kevin P. Murphy (main textbook)

Pattern Classification by Richard O. Duda, Peter E. Hart, and David G. Stork

Other Useful Resources:

Pattern Recognition and Machine Learning by Christopher Bishop

Pattern Recognition by Sergios Theodoridis

Artificial Intelligence: A Modern Approach by Stuart J. Russell and Peter Norvig

Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and Nir Friedman

Course Description:

Pattern recognition techniques aim to build automated systems that learn from data to make decisions about categories. This course introduces the theory, principles, and practice of statistical pattern recognition from a variety of perspectives. Topics include supervised learning (discriminative vs. generative models, parametric vs. non-parametric models, kernel methods, neural networks, etc.), unsupervised learning (e.g., k-means clustering), and semi-supervised learning (e.g., label propagation).


Students must have graduate standing in computer science or computer engineering. In particular, they should be familiar with linear algebra, probability, and statistics (e.g., CS461 or an equivalent course). Students should also have programming experience in one of the following languages: MATLAB, Python, Java, C, or C++.


Course Objectives:

Upon completion of the course, students will be able to:

formulate machine learning problems corresponding to different real-world applications

apply an appropriate machine learning method to solve a problem of moderate complexity

read and critique research articles in the pattern recognition and machine learning literature

understand and present current research in the field

propose a solution to a machine learning problem and write a research article


Course Outline:

Topic (Week)

Introduction to Pattern Recognition and Machine Learning
Probability Theory (review) 1
Probability Distributions (review) 1
Supervised Learning Models 2
Parametric vs. Non-parametric Learning Models 2
Linear Predictions 2
Kernel Methods 3
Dimensionality Reduction 3
Support Vector Machines 3
Perceptrons and Neural Networks 4
Deep Neural Networks 4
Unsupervised Learning Models 5
Clustering 5
Semi-supervised Learning Algorithms 6
Feature Selection and Sampling Methods 7
Paper Reading and Critiquing 8-12
Project Presentations 12-14


Paper Reading:

Each student should present at least two papers from the following proceedings:

AAAI 2015, AAAI 2016, AAAI 2017

NIPS 2015, NIPS 2016, NIPS 2017

Last updated: 1/17/2018 1:33:02 PM