Very Large Scale Integration (VLSI): What does it mean?

"Technology has come a long way in a short time" is the sentence we keep hearing from our elders. The transition from telephones to pagers to cell phones felt earned and hard-fought; the switch from bulky phones with external keyboards to sleek, powerful, addictive smartphones was abrupt. The rate of progress over the past few years has been astounding, and at the forefront of this revolution is the technology we call semiconductor devices.

You thought I was going to say VLSI? Semiconductor devices have been around for a long time: the semiconductor effect was first noted in 1874 by Karl Ferdinand Braun. Crystal detectors for microwave radiation followed in 1901, when Jagadish Chandra Bose demonstrated their operation. The biggest breakthrough came in 1947, when the first transistor was invented at Bell Labs by John Bardeen, William Shockley, and Walter Brattain, and we have never looked back since. The tiny transistor was the answer to the bulky vacuum tube: switching 1s to 0s and back became easier and, crucially, scalable.

The starting point, in the early 1960s, was small scale integration (SSI): only 1-10 transistors per chip. A very humble beginning, driven mostly by military applications. Many such chips were required to build a single processor, and this unwieldiness pushed scientists and engineers to fit more and more transistors onto one chip. Medium scale integration (MSI) arrived in the mid 1960s thanks to improvements in photolithography resolution, packing 10-1,000 transistors into one chip. The period between 1970 and 1980 saw chips with 1,000-100,000 transistors, marking the arrival of large scale integration (LSI); the invention of ion implantation machines and precision oxidation made this feat possible.

The need to keep pushing technology led to the topic of this post: VLSI. It refers to chips with 100,000 to millions of transistors, made possible by the advent of CMOS technology and computer aided design (CAD) tools (yes, we use computers to make more computers). The famous Moore's law, which states that the transistor density of a chip doubles from one generation to the next, was in full effect during this period, i.e. 1980-2010. The next generation of computing lies in ultra large scale integration (ULSI), with transistor counts in the billions. Moore's law begins to fail here because we are starting to wrestle with the fundamental building blocks of our existence: these devices have dimensions comparable to a line of about 50 atoms. Silicon has a bond length of 2.35 angstrom, where an angstrom is one ten-billionth of a meter, and the channel length of a ULSI-level transistor can be around 10-20 nanometers. These advancements have to contend with quantum effects such as quantum mechanical tunnelling, short channel effects, quantum confinement, and the like.
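The scales above are easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch (the numbers are illustrative, taken from the figures quoted in this post):

```python
# Silicon's bond length is 2.35 angstrom = 0.235 nm.
SI_BOND_LENGTH_NM = 0.235

# A line of 50 silicon atoms spans roughly:
line_of_50_atoms_nm = 50 * SI_BOND_LENGTH_NM
print(f"50-atom line: ~{line_of_50_atoms_nm:.2f} nm")  # ~11.75 nm, squarely in the 10-20 nm channel range

# Moore's law as repeated doubling, starting from a VLSI-era count (~1980):
transistors = 100_000
for generation in range(15):
    transistors *= 2
print(f"After 15 doublings: {transistors:,} transistors")  # ~3.3 billion, ULSI territory
```

Fifteen doublings take a 1980-era chip from a hundred thousand transistors to the billions quoted for ULSI, which is why the 1980-2010 window matches the law so neatly.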

Figure 1: A graph depicting Moore's law. (Reference: The Centre for Computing History)


Seems like the age of VLSI is over. Yet why do we keep talking about it? Why is the demand for engineers working in VLSI higher than ever before?

Each integration scale we talked about has its own use case, depending on the information density it must handle. All of the devices shown in Figure 1 are digital central processing units (CPUs), but small digital devices and analog devices work in tandem with the CPU to process and transmit information. Devices like operational amplifiers, comparators, multiplexers, counters, and registers do not require such a high density of transistors. While analog devices scale under separate constraints and do not follow this progression, digital devices can be categorized by information density and complexity.

A brief comparison of the various scaling usages:
  • SSI: logic gates (AND, OR, NAND, NOR, etc.), 2:1 MUX
  • MSI: counters, registers, decoders, 4:1 MUX, 8:1 MUX
  • LSI: 16-bit ALUs, Intel 4004
  • VLSI: microcontrollers, digital signal processors (DSP), Intel Pentium
  • ULSI: modern CPUs and GPUs, Nvidia's RTX 50 series, Intel's Core i7 and i9
Table 1: Table describing various digital devices (1E+006 means 1 × 10⁶). Effective bits per second = clock × width × parallelism
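The caption's throughput formula can be sketched in a few lines. This is a minimal illustration of the formula as stated; the function name and the sample numbers are my own (the 740 kHz clock is the Intel 4004's, its width is 4 bits, and it had no parallelism to speak of):

```python
def effective_bits_per_second(clock_hz, width_bits, parallelism):
    """Throughput per the caption's formula: clock * width * parallelism."""
    return clock_hz * width_bits * parallelism

# LSI-era 4-bit processor (Intel 4004-like): 740 kHz, 4 bits, one lane.
print(effective_bits_per_second(740_000, 4, 1))  # 2,960,000 bits/s

# A hypothetical modern chip: 3 GHz clock, 64-bit width, 16 parallel lanes.
print(effective_bits_per_second(3_000_000_000, 64, 16))  # ~3.1 trillion bits/s
```

The point of the formula is that raw clock speed is only one of three multipliers; modern chips win mostly on width and parallelism, which is exactly what the higher integration scales buy you.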



Figure 2: Graph showing the trend in transistor count vs performance (log scale)

Not every company can work on creating the next Blackwell substitute or AMD Ryzen competitor; these are ventures that can cost as much as a hadron collider. Nvidia's CEO Jensen Huang has stated on a podcast that Blackwell cost around 10 billion USD in research and development. That is already more than Tata and PSMC are spending to build a fabrication plant in Dholera, Gujarat. No small VLSI company in India earns anything comparable, so these companies turn to different types of devices, such as application specific integrated circuits (ASICs) and field programmable gate arrays (FPGAs), which have found a foothold in indigenous companies like Tata Elxsi, MosChip Technologies, and Sasken Technologies.

Good news for future engineers of India: all the heavyweights in the industry have design centres in Bengaluru, and AMD has opened its largest design centre on the Technostar campus. India is a big attraction for design work due to easy access to good designers (like you, maybe). Every ECE student in India dreams of joining one of these companies (mostly for the paycheque, wink). The VLSI design flow is far beyond this post's scope, and frankly beyond my knowledge; I am more interested in the fabrication side, though do understand that design and fabrication are two sides of the same coin.

If you are interested in the VLSI process, here are some easy-to-follow references:
