Talk by Alex Smola, Yahoo! Research
Yahoo! Academic Relations and The Columbia University Center for Computational Learning Systems Present:
Title: Fast and Sloppy – Scaling Up Linear Models
In this talk I present an overview of methods for scaling up linear models, both in model complexity and in the amount of data they can process. The first aspect is addressed by hashing feature vectors, for both prediction and matrix factorization. The second is addressed by parallelizing stochastic gradient descent; I will present an algorithm suitable for multicore parallelism.
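The feature-hashing idea mentioned above can be sketched in a few lines. This is a minimal illustration of the general "hashing trick", not the specific construction from the talk: each feature name is hashed into a fixed number of buckets, and a second hash bit supplies a sign so that inner products between hashed vectors remain unbiased in expectation. The bucket count and hash function here are illustrative choices.

```python
import hashlib

def hash_features(tokens, num_buckets=2**10):
    """Map sparse string features into a fixed-size dense vector via hashing.

    A sketch of the hashing trick for linear models: the bucket index and
    the sign are both derived from one hash of the feature name, so the
    model's parameter vector stays at a fixed size regardless of how many
    distinct raw features appear in the data.
    """
    vec = [0.0] * num_buckets
    for tok in tokens:
        h = int(hashlib.md5(tok.encode("utf-8")).hexdigest(), 16)
        idx = h % num_buckets                     # which bucket the feature lands in
        sign = 1.0 if (h >> 64) & 1 == 0 else -1.0  # sign bit keeps collisions unbiased
        vec[idx] += sign
    return vec

# Example: two documents hashed into the same 1024-dimensional space,
# regardless of their vocabularies.
doc_a = hash_features(["fast", "sloppy", "linear"])
doc_b = hash_features(["linear", "models"])
```

A linear model then trains directly on these fixed-size vectors, which is what makes the approach attractive at scale: no dictionary of feature names needs to be stored or synchronized.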
Alex Smola studied physics at the University of Technology, Munich, at the Universita degli Studi di Pavia, and at AT&T Research in Holmdel. During this time he was at the Maximilianeum München and the Collegio Ghislieri in Pavia. In 1996 he received his Master's degree from the University of Technology, Munich, and in 1998 his doctoral degree in computer science from the University of Technology Berlin. Until 1999 he was a researcher in the IDA Group of the GMD Institute for Software Engineering and Computer Architecture in Berlin (now part of the Fraunhofer Gesellschaft). After that, he worked as a Researcher and Group Leader at the Research School for Information Sciences and Engineering of the Australian National University. From 2004 onwards, Alex worked as a Senior Principal Researcher and Program Leader of the Statistical Machine Learning Program at NICTA. He is currently a Principal Research Scientist at Yahoo! Research.
Talk Sponsored by CCLS and Yahoo! Academic Relations