Machine Learning: Capabilities, Limitations and Misconceptions

Dean's Office - Faculty of Informatics

Date: 7 July 2022 / 11:00 - 12:30

USI Campus EST, room D0.02, Sector D - Online on MS Teams


Speaker: Vladimir Cherkassky, Dept. of Electrical & Computer Engineering, University of Minnesota

Abstract: This talk presents an overview of predictive data-analytic methods, using the VC-theoretical framework. This framework helps to better understand the technical limitations of modern approaches (such as Deep Learning), and to separate the conceptual, theoretical, and computational aspects of machine learning. On the technical side, I will show that the 'double descent' phenomenon recently discovered in Deep Learning can be fully explained by classical Vapnik-Chervonenkis (VC) theory. On the methodological side, I will present examples of 'non-standard' learning problems, such as Universum Learning and Learning Using Privileged Information (LUPI). These new learning problem settings are appropriate for many real-life applications that have complex data, in addition to the labeled samples used for traditional supervised learning. Overall, this talk will emphasize the practical importance of a scientific and conceptual framework for machine learning, rather than brute-force computational approaches for Big Data.
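The 'double descent' curve mentioned in the abstract can be reproduced in a toy experiment. The sketch below is not from the talk itself; it assumes minimum-norm least-squares polynomial regression (via the pseudo-inverse), one common way to demonstrate the effect: test error grows as model capacity approaches the interpolation threshold (number of parameters ≈ number of training samples), then can descend again in the over-parameterized regime. All names and parameter choices here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem: y = sin(pi * x) + noise
n_train = 20
x_train = rng.uniform(-1.0, 1.0, n_train)
y_train = np.sin(np.pi * x_train) + 0.1 * rng.standard_normal(n_train)
x_test = np.linspace(-1.0, 1.0, 200)
y_test = np.sin(np.pi * x_test)

def poly_features(x, degree):
    # Polynomial feature map [1, x, x^2, ..., x^degree]
    return np.vander(x, degree + 1, increasing=True)

test_errors = {}
for degree in [1, 3, 10, 19, 40, 100]:
    Phi_tr = poly_features(x_train, degree)
    Phi_te = poly_features(x_test, degree)
    # Minimum-norm least-squares fit: in the over-parameterized regime
    # (degree + 1 > n_train) this interpolates the training data.
    w = np.linalg.pinv(Phi_tr) @ y_train
    test_errors[degree] = np.mean((Phi_te @ w - y_test) ** 2)

# test_errors maps model capacity (degree) to test mean-squared error;
# plotting it against degree traces the double-descent-style curve.
```

The interesting comparison is between the error near the interpolation threshold (degree 19, i.e. 20 parameters for 20 samples) and the error of much larger models; VC theory, as the abstract argues, accounts for this behavior without new machinery.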

Biography: Vladimir Cherkassky is Professor of Electrical and Computer Engineering at the University of Minnesota, Twin Cities. He received an MS in Operations Research from the Moscow Aviation Institute in 1976 and a PhD in Electrical and Computer Engineering from the University of Texas at Austin in 1985. He has worked on the theory and applications of statistical learning since the late 1980s and co-authored the monograph Learning from Data (Wiley-Interscience), now in its second edition. He is also the author of the textbook Predictive Learning (see www.VCtextbook.com).

He has served on the editorial boards of IEEE Transactions on Neural Networks (TNN), Neural Networks, Natural Computing, and Neural Processing Letters. He was a Guest Editor of the IEEE TNN Special Issue on VC Learning Theory and Its Applications in 1999. Dr. Cherkassky was Director of the NATO Advanced Study Institute (ASI) From Statistics to Neural Networks: Theory and Pattern Recognition Applications, held in France in 1993. He received the IBM Faculty Partnership Award in 1996 and 1997 for his work on learning methods for data mining. In 2007, he became a Fellow of the IEEE for 'contributions and leadership in statistical learning'. In 2008, he received the A. Richard Newton Breakthrough Award from Microsoft Research for 'development of new methodologies for predictive learning'.

Faculty

Events

- 25 June 2022
- 30 June 2022: Device Accelerated solvers with PETSc: current status, future perspectives, and applications (Faculty of Biomedical Sciences, Faculty of Economics, Faculty of Informatics)
- 01 July 2022: Scalable Gaussian Processes (Faculty of Informatics)
- 06 July 2022: Robust Sensor-based Recognition of Human Behavior (Faculty of Informatics)