Recently I studied the very fundamentals of how computers work at the level of 1s and 0s, wires, and logic gates, from a book written for laypersons. I followed along with the book and built a very, very primitive computer with a CPU and RAM in a simulator by plotting different kinds of logic gates and connecting them with wires.
After this exercise I’m left wondering: how are new chips designed nowadays, considering that there are billions and billions of microscopic transistors in a modern chip? I’m assuming there are some levels of abstraction to simplify the process? I can’t imagine all those billions of transistors and wires being plotted manually, one by one, by people. Is there like a programming language of some sort where a compiler converts syntax into circuitry layouts?
Also, I don’t mean the physical manufacturing process. I think I have a good grasp of that. I’m purely talking about the design stage.
- After this exercise I’m left wondering: how are new chips designed nowadays, considering that there are billions and billions of microscopic transistors in a modern chip? - You have to use one to design one. - And the first ones would have been done using more primitive computers, and those by hand. There’s indeed a whole bootstrapping problem here where you need the tech to make the tech. You see this with other tech like compilers, where new versions of compilers are compiled using older versions of the same compilers.
 
- Check this out http://opencircuitdesign.com/magic/ 
- You might start with the now quite old but groundbreaking book “Introduction to VLSI Design Systems”[*] by Mead and Conway. It shows the chip layouts for basic switches, gates, and stuff like that, and design tools for building circuits up from those. Now imagine decades of improvements to the design tools. For the past few decades chip designs have looked like software programs, except highly parallel. The two main languages (“HDLs”) are Verilog (looks like C) and VHDL (looks like Ada). You say what circuits you want and how they should be connected, and the compiler and layout software figure out how to place that stuff on the chip (there’s a rough Verilog sketch just below this subthread). You can drop “macros” for very complex subsystems into your design, such as communication ports or even microprocessor cores, without having to worry about the insides of the macro blocks. There is a significant FOSS ecosystem of such macros too: see opencores.org for some of what’s available. It’s sort of like GitHub but for hardware. - Edit: I mis-remembered the book title, it’s “Introduction to VLSI Systems” not “Design”. See also: - https://en.wikipedia.org/wiki/Mead–Conway_VLSI_chip_design_revolution - Thanks! I’ve been collecting old computer books from resell shops. I’ll look for it. - See my edit above where I fixed the book title, which I had originally mis-remembered. It looks like the book is on archive.org: - https://archive.org/details/introductiontovl00mead/page/n1/mode/2up
- Doesn’t sound easy to find that way. It wasn’t a popular-type computer book. You might be better off looking online. 
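To give a concrete flavor of the HDL point above, here is a minimal Verilog sketch of “say what circuits you want and how they should be connected”. The `uart_core` block stands in for a reusable “macro” of the kind you might pull from somewhere like opencores.org; its name and port list are invented for illustration, not a real core.

```verilog
// Minimal sketch: a top level that declares blocks and wires them together.
// The tools, not the designer, decide where each gate physically goes.
module soc_top (
    input  wire       clk,
    input  wire       rst_n,
    input  wire       rx,
    output wire       tx,
    output reg  [7:0] led
);
    wire [7:0] rx_byte;
    wire       rx_valid;

    // Drop in the "macro" without worrying about its insides.
    uart_core u_uart (
        .clk   (clk),
        .rst_n (rst_n),
        .rx    (rx),
        .tx    (tx),
        .data  (rx_byte),
        .valid (rx_valid)
    );

    // A little glue logic of our own: latch the last received byte onto LEDs.
    always @(posedge clk or negedge rst_n) begin
        if (!rst_n)
            led <= 8'h00;
        else if (rx_valid)
            led <= rx_byte;
    end
endmodule
```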
 
 
- Is there like a programming language of some sort where a compiler converts syntax into circuitry layouts? - You are looking for something like SystemVerilog (or VHDL). - Both these languages let you describe hardware. They can both go down to the circuit and transistor level, but people won’t write that by hand. Rather, they will write code that is a description of digital hardware (flip-flops and the logic between them), and then let tools synthesize their description down to individual logic cells and simple functions. Often, chip fab houses have “standard cell libraries” that implement the most common logical functions, and the tools stitch them together based on the higher-level description. - Then there is all the verification that needs to be done, not just verification that the design is doing what it needs to do at all times, but that every individual chip is made correctly. Defects do happen, and you want to find them as early as possible in the process. Chip companies spend considerable effort on verification. - Lots and lots of expensive tools and specialized knowledge! A good middle ground is FPGAs. These are special chips with lots of generic logic building blocks, and ways to programmatically make connections between them. You can write the same VHDL or Verilog for FPGAs, but the tools map the logic to the FPGA vendor’s chip directly and “program” the chip to implement the logic. These still require tools and specialized knowledge, but they’re much cheaper than a fully custom chip. - One of the trippy things to understand about digital logic when coming from a software background is that it is massively parallel. All the logic is ticking on each clock edge, all the time. While there may be an order to how an HDL file is written, the individual blocks in it are all operating at the same time once the design is actually running in silicon. So when you write it, you need to keep all this parallelism in mind. - Thanks! I’ve heard of FPGAs in the retro gaming space. Cool stuff.
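To illustrate the “flip-flops and the logic between them” and the parallelism point, here is a rough Verilog sketch (the module name, signal names, and widths are arbitrary). The two always blocks read like sequential code but describe two pieces of hardware that both tick on every clock edge at the same time; a synthesis tool would map them onto flip-flops and gates from the fab’s standard cell library, or onto an FPGA’s generic building blocks.

```verilog
module blinker (
    input  wire clk,
    input  wire rst_n,
    output reg  led
);
    reg [23:0] count;

    // Block 1: a free-running 24-bit counter (becomes 24 flip-flops plus an adder).
    always @(posedge clk or negedge rst_n) begin
        if (!rst_n) count <= 24'd0;
        else        count <= count + 24'd1;
    end

    // Block 2: toggle the LED when the counter reaches its maximum value.
    // This block runs in parallel with Block 1 on every clock edge.
    always @(posedge clk or negedge rst_n) begin
        if (!rst_n)                  led <= 1'b0;
        else if (count == 24'hFFFFFF) led <= ~led;
    end
endmodule
```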
 
- It’d be so cool if it was actually worth learning this stuff to get a job. I’d have loved it!
- I love how OP got a lot of in-depth, detailed answers and replied to none of them - Perhaps they were at work? Or sleeping? Or having sex? Or just not on social media?
- Ah sorry I was asleep. Just now reading all the responses. - Na my bad 
 
- It took one hour for you to reply, and another for me to reply to you. OP may eventually return.
 
- What was the book and would you recommend it? - “But How Do It Know? - The Basic Principles of Computers for Everyone” by J. Clark Scott. It’s a short, self-published book aimed at laypersons, written in a very simple, sort of conversational style. Through the whole book he guides you to building a functioning computer that does some bitwise operations. I built most of the computer in a simulator program called Logisim (https://cburch.com/logisim/). - I’d recommend it if you’re looking for an easy introduction to the topic. It was a fun read for me. Get the print version; some of the diagrams in the Kindle version are badly compressed.
- It sounds awfully like “The Elements of Computing Systems”, but I still haven’t gotten around to working through it, so I don’t really know. - https://mitpress.mit.edu/9780262361002/the-elements-of-computing-systems/
- I’m also curious about this. Conceptually I understand the types of logic used for inferencing, and I have a modest high-level understanding of the material science behind the production and etching of silicon wafers. It would be interesting to finally learn how that logic is implemented into a physical logic gate though. - See my reply to jcubed above 
 
 
- When I was studying this formally, they were using VHDL and Verilog, which are programming languages for doing circuit design. - Roughly, they can take a logical description (input A goes to output B, etc.), do A LOT of math, and create the circuit layout.
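For a sense of what such a “logical description” looks like, here is a tiny Verilog example (the module and signal names are invented). You only state how the output relates to the inputs; the synthesis and place-and-route tools do the math of turning it into actual gates and a physical layout.

```verilog
// A "logical description": state the relationship between inputs and outputs,
// and let the tools pick, place, and wire the actual gates.
module majority3 (
    input  wire a,
    input  wire b,
    input  wire c,
    output wire y
);
    // y is 1 whenever at least two of the three inputs are 1.
    assign y = (a & b) | (a & c) | (b & c);
endmodule
```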
- At a high level, you don’t design them anymore. You write them, in code. The compiler turns your code into the chip masks, and it has an optimizer that will mangle the hell out of the relatively simple stuff you wrote. - At a lower level, that compilation is not really done automatically, and people will intervene in lots of places, and AFAIK, how people divide it up and interact with it are well-guarded secrets of the chip makers.
- I can’t imagine all those billions of transistors and wires being plotted manually one by one by people - They don’t really need to be, because it’s a repeating pattern. - Doesn’t matter what the pattern is, but think of it like a checkerboard. - If you were making one, you’d take the time to just draw it yourself. This would be designing a chip. - If you decided to cover a football field in a checkerboard pattern, you’d take your existing checkerboard and just “stamp” it over and over again. Maybe even strap a dozen checkerboards together and then stamp that. This would be scaling your initial chip design for production. - Once the entire design is done, you make some prototypes, test them, and fix fail points. - When you go into wide-scale production, some chips come out with too many defects, so you blank out even more of the chip and sell it for cheaper. Although that’s out of scope, it’s good to highlight that there are acceptable fail rates in chip design. If they went only for 100% success, nothing would ever make it to production.
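The “stamp the same pattern many times” idea shows up directly in the HDLs mentioned above: you describe one cell once and instantiate it as many times as needed, and the layout tools likewise reuse one physical layout for every copy. A rough Verilog sketch (module and signal names invented):

```verilog
// One cell, described once...
module bit_cell (
    input  wire clk,
    input  wire d,
    output reg  q
);
    always @(posedge clk) q <= d;
endmodule

// ...then "stamped" N times with a generate loop to build a shift register.
module shift_register #(
    parameter N = 64  // number of identical copies to stamp out
) (
    input  wire clk,
    input  wire din,
    output wire dout
);
    wire [N:0] chain;
    assign chain[0] = din;
    assign dout     = chain[N];

    genvar i;
    generate
        for (i = 0; i < N; i = i + 1) begin : g_cells
            bit_cell u_cell (
                .clk (clk),
                .d   (chain[i]),
                .q   (chain[i+1])
            );
        end
    endgenerate
endmodule
```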
- I don’t know enough to say anything concrete, but it’s for sure going to be abstractions on top of abstractions. 




