ORIE 6730
Last Updated
- Schedule of Classes - September 10, 2024 10:17AM EDT
- Course Catalog - September 10, 2024 9:48AM EDT
Course Description
Course information provided by the Courses of Study 2024-2025, which is scheduled to publish in mid-June.
Empirical observation of deep neural networks reveals surprising phenomena that classical statistical theory fails to fully describe. For example, classical statistical bounds on generalization error grow with the flexibility of the model class, yet neural networks often exhibit low generalization error despite being extremely flexible. Likewise, training deep neural networks with stochastic gradient descent produces accurate models despite the non-convexity of the loss landscape. An emerging literature is developing new theory to explain these and other mysteries. After presenting relevant theoretical results from classical statistics and a brief refresher on deep learning, the course will present theoretical results from recent research articles, complementing them with empirical evidence and emphasizing what is unknown along with what is known.
When Offered: Fall.
Prerequisites/Corequisites: CS 4782 or CS 5787, and ORIE 6500.
Comments: Familiarity with deep learning and graduate-level familiarity with probability are encouraged.
Outcomes
- Identify the main ideas in classical statistical theory and what phenomena they fail to explain in deep learning.
- Identify the main hypotheses in the recent literature for the emergence of these phenomena and the evidence for and against these hypotheses.
- Demonstrate an understanding of the main ideas in recent theoretical results explaining why deep learning works so well.
Regular Academic Session.
Credits and Grading Basis
3 Credits, Student Option (Letter or S/U grades)