Seminar: Daniel Kroening, AI Accelerators 101 – And Why You Should Build AI Accelerators with Us
Speaker
Daniel Kroening, Senior Principal Applied Scientist, Amazon
Abstract
The development of the technology and infrastructure to train large language models (LLMs) at scale is arguably one of the most significant engineering achievements in the history of IT. I will make the case for custom hardware for training the largest LLMs, and argue that compilers and deep program analysis are the key enabling technologies for this hardware. I will give a quick introduction to AI accelerators based on systolic multiplier arrays and how these are used in AWS's latest Trainium processor. We are looking for students to help us build, and for the faculty present I will give the rationale for and details of Amazon's $110M Build on Trainium research grant program, which aims to accelerate AI research and education at universities.
The talk targets a broad audience including undergraduates, graduate students, and faculty members. No prior knowledge of deep learning is assumed. Pizza will be provided.
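For those who have not encountered systolic arrays before, the toy Python sketch below simulates an output-stationary systolic multiplier array computing a matrix product cycle by cycle: operands stream in from the left and top edges with a one-cycle skew per row and column, and each processing element performs one multiply-accumulate per cycle while passing its operands to its right and lower neighbours. This is purely illustrative; the function name and array organisation are my own choices and make no claim about the actual Trainium microarchitecture or the Neuron compiler.

```python
def systolic_matmul(A, B):
    """Cycle-by-cycle toy simulation of an output-stationary systolic array.

    Processing element PE(i, j) holds one accumulator for C[i][j].
    Rows of A enter from the left edge and columns of B from the top edge,
    each skewed by one cycle per row/column so that matching operands meet
    at the right PE at the right time.
    """
    m, k = len(A), len(A[0])
    k2, n = len(B), len(B[0])
    assert k == k2, "inner dimensions must match"

    acc = [[0] * n for _ in range(m)]    # one accumulator per PE
    a_reg = [[0] * n for _ in range(m)]  # A operand currently held by PE(i, j), flowing right
    b_reg = [[0] * n for _ in range(m)]  # B operand currently held by PE(i, j), flowing down

    cycles = m + n + k - 2               # enough cycles for the last operands to reach PE(m-1, n-1)
    for t in range(cycles):
        # Data moves one PE per cycle: A operands shift right, B operands shift down.
        for i in range(m):
            for j in range(n - 1, 0, -1):
                a_reg[i][j] = a_reg[i][j - 1]
        for j in range(n):
            for i in range(m - 1, 0, -1):
                b_reg[i][j] = b_reg[i - 1][j]
        # Feed the skewed edges: row i of A is delayed by i cycles, column j of B by j cycles.
        for i in range(m):
            a_reg[i][0] = A[i][t - i] if 0 <= t - i < k else 0
        for j in range(n):
            b_reg[0][j] = B[t - j][j] if 0 <= t - j < k else 0
        # Every PE performs one multiply-accumulate per cycle.
        for i in range(m):
            for j in range(n):
                acc[i][j] += a_reg[i][j] * b_reg[i][j]
    return acc


if __name__ == "__main__":
    # 2x2 example: expected result [[19, 22], [43, 50]]
    print(systolic_matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
```

The appeal of this organisation in hardware is that every processing element communicates only with its immediate neighbours, so the array scales to very large numbers of multipliers without global wiring.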
Bio
Daniel Kroening is a Senior Principal Applied Scientist at Amazon, where he works on the correctness of the Neuron Compiler for distributed training and inference. Prior to joining Amazon, he was a Professor of Computer Science at the University of Oxford, and he is a co-founder of Diffblue Ltd., an Oxford University spinout that develops AI targeting code and code-like artefacts. He has received the Semiconductor Research Corporation (SRC) Inventor Recognition Award, an IBM Faculty Award, a Microsoft Research SEIF Award, and the Wolfson Research Merit Award. He serves on the CAV steering committee, was co-chair of FLoC 2018 and Editor-in-Chief of Springer's Formal Methods in System Design (FMSD), and is a co-author of the textbooks Decision Procedures and Model Checking.