Why is the number of transistors in a computer important?


In the context of this exercise, why is the number of transistors in a computer important?


The number of transistors in a computer is important because it correlates to a computer’s performance.

The main functionality of a transistor is to switch or to amplify electronic signals. These seemingly simple tasks are what make binary logic operations, and ultimately computing, possible. With more transistors, more of these actions can take place at the same time. This matters because computers tend to do things in parallel, so more transistors correlate with improvements in functionality and speed.
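As a rough sketch of that switching-to-logic jump (a toy model, not how real silicon is laid out), a CMOS NAND gate can be thought of as four transistors acting as switches, and every other gate can then be built out of NAND:

```python
# Toy model: each "transistor" is just a switch controlled by its gate input.
# A CMOS NAND gate uses four of them: two PMOS in parallel (pull the output
# up to 1) and two NMOS in series (pull the output down to 0).

def nand(a: int, b: int) -> int:
    pull_down = bool(a and b)          # both series NMOS switches closed -> 0
    pull_up = (not a) or (not b)       # either parallel PMOS switch closed -> 1
    assert pull_up != pull_down        # exactly one network conducts at a time
    return 0 if pull_down else 1

# NAND is universal: the other basic gates fall out of it.
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", nand(a, b), and_(a, b), or_(a, b))
```

Four transistors per NAND gate, and a CPU needs millions of gates, which is one way to see where the transistor counts go.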

As the number of transistors grows over time, this means more powerful computers, and the ability for programmers to solve more difficult and complex tasks and problems. These tasks and problems may be in fields such as health, education, science, travel, and many others that require the use of computing.


It takes six transistors to store one bit in memory, or at least it did originally in RCA CMOS RAM back in 1968. That means 48 for a single byte.

RCA COS/MOS 288-bit RAM Array 1968. Gift of Richard Ahrons

My TIMEX Sinclair had 4KB of RAM. That’s 48 times 4 times 1024 transistors.

48   == 2 ** 4  * 3
4    == 2 ** 2
1024 == 2 ** 10

48 * 4 * 1024 == 2 ** 16 * 3
              == 196 608

Nearly two hundred thousand transistors for four kilobytes of RAM.
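That back-of-the-envelope figure is easy to check, assuming the six-transistor SRAM cell throughout:

```python
TRANSISTORS_PER_BIT = 6                          # classic 6T SRAM cell
BITS_PER_BYTE = 8

per_byte = TRANSISTORS_PER_BIT * BITS_PER_BYTE   # 48
ram_4kb = per_byte * 4 * 1024
print(ram_4kb)                                   # 196608
```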

If your computer has 64 GB of RAM, that’s 48 times 64 times 2 ** 30.

48   ==  2 ** 4 * 3
64   ==  2 ** 6
1 GB ==  2 ** 30

48 * 64 * 2 ** 30 == 2 ** 40 * 3
                  == 3 298 534 883 328

And that’s just the RAM. What about screen and audio memory? Buffers? Caches?

How many transistors are involved just to keep the clock working?

Bottom line is we are in an age that could not have been predicted when this technology was newly formed. Many generations of hardware have come and gone since this CMOS was introduced. It all started with 288 bits and snowballed from there.

This level of integration would have cracked Enigma in hours or days, as opposed to the weeks or months it took in the Second World War. Imagine all those transistors as vacuum tubes back then. How many bits of memory could they even keep alive? How much space would it have taken to store one kilobyte of RAM? Integration brought things from building size to room size to refrigerator size to bread box size to toaster size to coaster size to postage stamp size to pin head size to microdot size in a matter of decades.

Now we have quantum computing, using extremely low temperatures and supercomputers to perform tasks in picoseconds that used to take months. It is all thanks to the transistor and the logic gates that emerged from that technology.


Computer makers were thinking about system overhead even back then, and probably designed the extra 32 bits into the chip for the system to use, giving us the conventional 256 bits. Not to mention how conveniently the 10, 10, 10, 10 pin configuration worked out for all the taps. That gives enough room for eight four-bit registers, or four eight-bit registers, or two sixteen-bit registers on each 256-bit block of memory. This would be most useful for addressing.
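Whether the designers intended the spare bits for registers is speculation, but the arithmetic works out: 288 bits minus the conventional 256 leaves exactly 32, which divides evenly into each of those register widths:

```python
chip_bits = 288                       # RCA COS/MOS RAM array
conventional = 256                    # power-of-two portion
spare = chip_bits - conventional
print(spare)                          # 32

for width in (4, 8, 16):
    print(f"{spare // width} registers of {width} bits")
```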

Learners today have no idea of the minutiae of the past. They have in their hands the progeny of it. We could never have conceived of this back in the day, but secretly wanted it. Hence, we have it. Also hence, ‘Careful what you wish for.’

It is because of computing ability such as this that we have GPS tracking and Amazon, Uber and Lyft, Google and Facebook, to name just the apex of the technological pyramid. Grab on and don’t let go.


Thanks sir, this was great information.


Ok, I understand how a transistor works, and that more transistors mean more calculations at the same time, but how can the system technically get access to this many transistors? That I don’t understand.


Wow, just wanted to say what a great source of info that was. :+1:


I keep reading this tidbit of info and it blows my mind equally every time. Thank you for sharing this.

The main question that entered my mind is - if the purpose of transistors is for switching and amplifying signals, how do the transistors know which signals to amplify / switch and when?


Switching transistors generally work in groups, and there are many, many transistors involved. Amplifiers don’t use very many transistors (a 1500 watt amplifier may have only a dozen or so) but, as we can see, handle a tonne of current. They take a small current at the base and use it to control a much larger current flowing between the collector and the emitter.

The circuit itself dictates the logic and flow of current. Logic circuits use gates, which amplifiers do not. They are also usually only one polarity, say +5V, where amplifiers are tapped onto both the V- and the V+ rails with the output somewhere in between in terms of amplitude (analog).


That was so informative, and amazing - being a first hand account.
I read that the increase in the number of transistors and the decrease in their size cause stability issues. How does that occur? Is it the number of transistors or their size pushing the technology toward quantum mechanics, or is quantum mechanics the solution?

I have no real knowledge of electrical engineering so cannot give you a good, or even reasonable, answer. As the scale increases, sizes and distances need to get smaller and smaller. Charge states will involve fewer and fewer electrons, else the thing would just burn up. A modern CPU will fry in milliseconds if powered up with no cooling. We could assume that heat can create instability, given thermodynamics and chaos.

The release of Apple’s M1 chip opens up a whole new world of scale. 16 billion transistors!


To get a perspective of how far we have come, see if you can track down this article…


The MOS fabrication technique can put more than 10,000 electronic components on a silicon “chip” only a few millimeters across. Its economic impact can be seen in such devices as pocket calculators.

By William C. Hittinger

Scientific American
August 1973
pp. 48–57

Found the article with pay-for-access…

Metal-Oxide-Semiconductor Technology | Scientific American

If you have any interest in the science of the past, the 99 dollar option is loaded with value. 12 new issues and full archive access back to 1845. The PDF is well priced, and worth the 8 bucks.

Call your favorite book shop and see if they keep old magazines in their warehouse.

If you live near a good library you may want to check there. I suspect you would need to submit a special request for them to retrieve something this old.


Went down a rabbit hole with this trying to understand transistor technology, miniaturization, exponential computing processes, Moore’s Law, etc., and found out about a really interesting and surprisingly mature field of study known as neuromorphic engineering. It tries to replicate the functions of the human brain, with its particularly low wattage demands and high comparative “transistor” counts (synapses, dendrites, and somas), using more gradated states that build up to a range of values rather than just on-off gating.

Super interesting stuff. :face_with_monocle:


Thanks MTF for your article. It was very interesting!



I just wanted to plug this website that I recently found, as it relates to how computers are made and how hardware and software work together. The puzzles can be tricky sometimes.


Thank you for this great piece of information!
