
Communication-Avoiding Algorithms for Linear Algebra, Machine Learning and Beyond

James Demmel (Berkeley)
Algorithms have two costs: arithmetic and communication, i.e., moving data between levels of a memory hierarchy or between processors over a network. Communication costs (measured in time or energy per operation) greatly exceed arithmetic costs, so our goal is to design algorithms that minimize communication. We survey known algorithms that communicate asymptotically less than their classical counterparts, for a variety of linear algebra and machine learning problems, often attaining lower bounds. We also discuss recent work on automating the design and implementation of these algorithms, starting from a simple specification as nested loops.
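As an illustrative sketch (not taken from the talk itself), the simplest example of communication avoidance is blocked (tiled) matrix multiplication: by working on b-by-b tiles that fit in fast memory, each loaded element is reused about b times, reducing the number of words moved between slow and fast memory from O(n^3) for a naive loop order toward the O(n^3 / sqrt(M)) lower bound, where M is the fast-memory size. The function name and tile size below are illustrative choices, not part of the abstract.

```python
def matmul_blocked(A, B, b=2):
    """Multiply square matrices A and B (lists of lists) using b-by-b tiles.

    Illustrative sketch: in a tuned kernel each tile would be chosen to fit
    in cache or local memory, so the innermost loops run entirely out of
    fast memory and each loaded element is reused ~b times.
    """
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, b):            # tile row of C
        for jj in range(0, n, b):        # tile column of C
            for kk in range(0, n, b):    # tile of the shared inner dimension
                # Multiply one pair of tiles and accumulate into C's tile.
                for i in range(ii, min(ii + b, n)):
                    for j in range(jj, min(jj + b, n)):
                        s = C[i][j]
                        for k in range(kk, min(kk + b, n)):
                            s += A[i][k] * B[k][j]
                        C[i][j] = s
    return C
```

The same arithmetic is performed in a different order; only the data-movement pattern changes, which is why blocking is a communication optimization rather than an arithmetic one.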

Speaker bio

James Demmel is the Dr. Richard Carl Dehmel Distinguished Professor of Computer Science and Mathematics at the University of California at Berkeley, and former Chair of the EECS Department. His research is in numerical linear algebra, high-performance computing, and communication-avoiding algorithms. He is known for his work on the widely used LAPACK and ScaLAPACK linear algebra libraries. He is a member of the National Academy of Sciences, the National Academy of Engineering, and the American Academy of Arts and Sciences; a Fellow of the AAAS, ACM, AMS, IEEE, and SIAM; and winner of the IPDPS Charles Babbage Award, the IEEE Computer Society Sidney Fernbach Award, the ACM Paris Kanellakis Award, and numerous best paper prizes.
