Nir Shavit
Nir Shavit (Hebrew: ניר שביט) is an Israeli computer scientist. He is a professor in the Computer Science Department at Tel Aviv University and a professor of electrical engineering and computer science at the Massachusetts Institute of Technology.
| Nir Shavit | |
| --- | --- |
| Alma mater | Technion, Hebrew University of Jerusalem |
| Known for | Software transactional memory, wait-free algorithms |
| Awards | Gödel Prize, Dijkstra Prize |
| Scientific career | |
| Fields | Computer science: concurrent and parallel computing |
| Thesis | (1990) |
| Website | www |
Nir Shavit received B.Sc. and M.Sc. degrees in computer science from the Technion – Israel Institute of Technology in 1984 and 1986, respectively, and a Ph.D. in computer science from the Hebrew University of Jerusalem in 1990. Shavit is a co-author of the book The Art of Multiprocessor Programming. He won the 2004 Gödel Prize in theoretical computer science for his work on applying tools from algebraic topology to model shared-memory computability, and the 2012 Dijkstra Prize for the introduction and first implementation of software transactional memory. He is a past program chair of the ACM Symposium on Principles of Distributed Computing (PODC) and the ACM Symposium on Parallelism in Algorithms and Architectures (SPAA).
His research covers techniques for designing, implementing, and reasoning about multiprocessors, and in particular the design of concurrent data structures for multi-core machines.
Recognition
- 2004 Gödel Prize
- 2012 Dijkstra Prize
- 2013 Fellow of the Association for Computing Machinery[1]
He co-founded the company Neural Magic with Alexander Matveev. The company claims to use highly sparse neural networks to make deep learning so computationally efficient that GPUs are not needed, citing speedups of up to 175x for certain use cases.[2]
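The general idea behind the claim can be sketched as follows. This is an illustrative example only, not Neural Magic's actual implementation: in a highly sparse layer most weights are zero, so storing only the nonzero entries lets a matrix-vector product do work proportional to the number of nonzeros rather than the full layer size.

```python
def dense_matvec(weights, x):
    """Dense matrix-vector product: touches every weight, zero or not."""
    return [sum(w * v for w, v in zip(row, x)) for row in weights]

def to_sparse(weights):
    """Store each row as (column_index, value) pairs for nonzero weights only."""
    return [[(j, w) for j, w in enumerate(row) if w != 0.0] for row in weights]

def sparse_matvec(sparse_rows, x):
    """Sparse product: multiplies only the stored nonzero weights."""
    return [sum(w * x[j] for j, w in row) for row in sparse_rows]

# A toy layer where most weights are zero.
weights = [
    [0.0, 2.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.5],
    [3.0, 0.0, 0.0, 0.0],
]
x = [1.0, 2.0, 3.0, 4.0]

# Both representations give the same result, but the sparse version
# performs 3 multiplications instead of 12.
assert dense_matvec(weights, x) == sparse_matvec(to_sparse(weights), x)
```

Production systems use compressed formats such as CSR and cache-aware kernels rather than Python lists, but the compute saving comes from the same skip-the-zeros principle.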
References
1. "ACM Names Fellows for Computing Advances that Are Transforming Science and Society" Archived 2014-07-22 at the Wayback Machine, Association for Computing Machinery, accessed 2013-12-10.
2. "The Future of Deep Learning is Sparse", Neural Magic, 12 July 2019.