Background on VLSI, Supercomputing, and other engines of growth in computer hardware

Many of the following topics appear in my undergraduate Computer Systems course (CS 3650), as part of the computer architecture component. Since there is no obvious place for this material in the graduate program (the graduate Computer Architecture course is almost never taught), I'm putting it here.

VLSI -- How are CPUs built, and when will Moore's Law end?
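
Moore's Law is the observation that the number of transistors on a chip doubles roughly every two years. As a quick back-of-the-envelope sketch of what that doubling implies (taking the Intel 4004's roughly 2,300 transistors in 1971 as the baseline; the strict two-year doubling model is an idealization):

    # Moore's Law as arithmetic: transistor count doubles every ~2 years.
    DOUBLING_PERIOD_YEARS = 2.0
    BASE_YEAR, BASE_TRANSISTORS = 1971, 2_300   # Intel 4004 baseline

    def projected_transistors(year):
        """Project the transistor count in `year` under a strict doubling model."""
        doublings = (year - BASE_YEAR) / DOUBLING_PERIOD_YEARS
        return BASE_TRANSISTORS * 2 ** doublings

    print(projected_transistors(2017))   # ~1.9e10: the right ballpark for the
                                         # largest chips of that era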

What are the largest supercomputers today?
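
The usual answer comes from the TOP500 list, which ranks machines by measured floating-point throughput (FLOPS) on the LINPACK benchmark. A minimal sketch of the underlying peak-FLOPS arithmetic (the machine parameters below are illustrative, not any real system):

    # Theoretical peak = nodes x cores/node x clock (Hz) x FLOPs per cycle.
    def peak_flops(nodes, cores_per_node, clock_hz, flops_per_cycle):
        """Upper bound on throughput; measured LINPACK results reach only a
        fraction of this."""
        return nodes * cores_per_node * clock_hz * flops_per_cycle

    # Illustrative machine: 10,000 nodes, 16 cores each, 2.5 GHz, 16
    # double-precision FLOPs per core per cycle (wide SIMD units).
    print(peak_flops(10_000, 16, 2.5e9, 16))   # 6.4e15 = 6.4 petaFLOPS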

NEWS: Coral supercomputers (appearing 2016--2018; see especially the table lower down in that article)

NEWS: Intel Announces Knights Mill: A Xeon Phi For Deep Learning (see also the Coral supercomputers item above)

Deep Learning on NVIDIA GPUs

DeepMind Beats Human Champion at Game of Go (in 2015)

"[The deep learning algorithm of] AlphaGo doesn't actually use that much hardware in play, but we needed a lot of hardware to train it and do all the different versions and have them play each other in tournaments on the cloud. That takes quite a lot of hardware to do efficiently, so we couldn't have done it in this time frame without those resources."