Recently I studied the very fundamentals of how computers work at the level of 1s and 0s, wires, and logic gates, from a book written for laypersons. Following along with the book, I built a very, very primitive computer with a CPU and RAM in a simulator by placing different kinds of logic gates and connecting them with wires.

After this exercise I’m left wondering: how are new chips designed nowadays, considering that there are billions and billions of microscopic transistors in a modern chip? I’m assuming there are levels of abstraction to simplify the process? I can’t imagine all those billions of transistors and wires being plotted manually one by one by people. Is there some sort of programming language where a compiler converts syntax into circuit layouts?

Also, I don’t mean the physical manufacturing process. I think I have a good grasp of that. I’m purely talking about the design stage.

  • spittingimage@lemmy.world

    After this exercise I’m left wondering: how are new chips designed nowadays, considering that there are billions and billions of microscopic transistors in a modern chip?

    You have to use one to design one.

    • False@lemmy.world

      And the first ones would have been designed using more primitive computers, and those by hand. There’s indeed a whole bootstrapping problem here, where you need the tech to make the tech. You see this with other tech like compilers, where new versions of a compiler are compiled using older versions of the same compiler.

  • solrize@lemmy.ml

    You might start with the now quite old but groundbreaking book “Introduction to VLSI Design Systems”[*] by Mead and Conway. It shows the chip layouts for basic switches, gates, and the like, and design tools for building circuits up from those. Now imagine decades of improvements to those design tools. For the past few decades, chip designs have looked like software programs, except highly parallel. The two main languages (“HDLs”) are Verilog (which looks like C) and VHDL (which looks like Ada). You say what circuits you want and how they should be connected, and the compiler and layout software figure out how to place that stuff on the chip. You can drop “macros” for very complex subsystems into your design, such as communication ports or even microprocessor cores, without having to worry about the insides of the macro blocks. There is a significant FOSS ecosystem of such macros too: see opencores.org for some of what’s available. It’s sort of like GitHub, but for hardware.
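
    To give a flavor, here is a minimal Verilog sketch (the module and signal names are my own invention, not from any real design): a 4-bit counter where you describe the behavior, and synthesis and place-and-route figure out the gates and where they sit on the chip.

      // A 4-bit counter: synthesis turns this description into gates and
      // flip-flops, and place-and-route decides where they sit on the die.
      module counter4 (
          input  wire       clk,    // clock input
          input  wire       reset,  // synchronous reset
          output reg  [3:0] count   // current count value
      );
        always @(posedge clk) begin
          if (reset)
            count <= 4'd0;         // clear on reset
          else
            count <= count + 4'd1; // otherwise increment every clock tick
        end
      endmodule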

    Edit: I misremembered the book title: it’s “Introduction to VLSI Systems”, not “Design”. See also:

    https://en.wikipedia.org/wiki/Mead–Conway_VLSI_chip_design_revolution

  • dhork@lemmy.world

    Is there some sort of programming language where a compiler converts syntax into circuit layouts?

    You are looking for something like SystemVerilog (or VHDL).

    Both these languages let you describe hardware. They can both go down to the circuit and transistor level, but people won’t write that by hand. Rather, they write code that describes digital hardware (flip-flops and the logic between them), and then let tools synthesize that description down to individual logic cells and simple functions. Often, chip fab houses have “standard cell libraries” that implement the most common logic functions, and the tools stitch them together based on the higher-level description.
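
    As a rough sketch of “flip-flops and the logic between them” (the module below is invented for illustration), this is the kind of register-transfer-level Verilog people actually write; a synthesis tool would map it onto standard cells such as AND, OR, and D flip-flop gates from the fab’s library.

      // Registered AND-OR logic: synthesis maps the combinational part and
      // the flip-flop onto standard cells (e.g. AND2, OR2, DFF).
      module and_or_reg (
          input  wire clk,
          input  wire a, b, c,
          output reg  y
      );
        wire next_y = (a & b) | c;  // the logic between the flip-flops

        always @(posedge clk)
          y <= next_y;              // flip-flop capturing the result each clock
      endmodule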

    Then there is all the verification that needs to be done: not just verifying that the design does what it needs to do at all times, but also that every individual chip is made correctly. Defects do happen, and you want to find them as early as possible in the process. Chip companies spend considerable effort on verification.
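
    On the design-verification side, the usual first step is a simulation testbench. Here is a minimal sketch (reusing the invented and_or_reg module above); real verification environments are vastly more elaborate.

      // A tiny testbench: simulation-only code that exercises a design
      // and checks its output (here, the and_or_reg sketch above).
      module tb;
        reg  clk = 0, a = 0, b = 0, c = 0;
        wire y;

        and_or_reg dut (.clk(clk), .a(a), .b(b), .c(c), .y(y));

        always #5 clk = ~clk;     // free-running clock

        initial begin
          a = 1; b = 1; c = 0;    // expect y == 1 after the clock edge
          @(posedge clk); #1;
          if (y !== 1'b1) $display("FAIL: y = %b, expected 1", y);
          else            $display("PASS");
          $finish;
        end
      endmodule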

    Lots and lots of expensive tools and specialized knowledge! A good middle ground is FPGAs. These are special chips with lots of generic logic building blocks and ways to programmatically make connections between them. You can write the same VHDL or Verilog for FPGAs, but the tools map the logic to the FPGA vendor’s chip directly and “program” the chip to implement the logic. These still require tools and specialized knowledge, but they’re much cheaper than a fully custom chip.

    One of the trippy things to understand about digital logic when coming from a software background is that it is massively parallel. All the logic ticks on each clock edge, all the time. While there may be an order to how an HDL file is written, the individual blocks in it all operate at the same time once the design is actually running in silicon. So when you write it, you need to keep all this parallelism in mind.
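
    A small sketch of what that parallelism looks like (names invented for illustration): these two always blocks appear in textual order, but in silicon they are two independent circuits updating on the same clock edge.

      // Two always blocks: ordered on the page, concurrent in hardware.
      module parallel_demo (
          input  wire       clk,
          input  wire [7:0] in_a, in_b,
          output reg  [7:0] sum, diff
      );
        always @(posedge clk)
          sum  <= in_a + in_b;   // this adder...

        always @(posedge clk)
          diff <= in_a - in_b;   // ...and this subtractor run simultaneously
      endmodule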

    • Sheridan@lemmy.worldOP

      “But How Do It Know? - The Basic Principles of Computers for Everyone” by J. Clark Scott. It’s a short, self-published book aimed at laypersons, written in a very simple, conversational style. Over the course of the book he guides you through building a functioning computer that does some bitwise operations. I built most of the computer in a simulator program called Logisim (https://cburch.com/logisim/).

      I’d recommend it if you’re looking for an easy introduction to the topic. It was a fun read for me. Get the print version; some of the diagrams in the Kindle version are badly compressed.

    • demi_demi_demi@fedia.io

      I’m also curious about this. Conceptually I understand the types of logic used for inference, and I have a modest high-level understanding of the materials science behind the production and etching of silicon wafers. It would be interesting to finally learn how that logic is implemented as a physical logic gate, though.
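
      One way to peek at that level without leaving the HDL world: Verilog has switch-level primitives (nmos, pmos), so you can describe a CMOS NAND gate as the four transistors it physically is. This is a sketch for illustration, not something a real design flow would take as input.

        // A CMOS NAND gate at the transistor level: two PMOS in parallel
        // pull the output high; two NMOS in series pull it low.
        module nand2_cmos (output out, input a, b);
          supply1 vdd;  // power rail
          supply0 gnd;  // ground rail
          wire    mid;  // node between the series NMOS pair

          pmos p1 (out, vdd, a);  // conducts when a == 0
          pmos p2 (out, vdd, b);  // conducts when b == 0
          nmos n1 (out, mid, a);  // both must conduct (a == 1 and b == 1)
          nmos n2 (mid, gnd, b);  //   to pull the output to ground
        endmodule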

  • CookieOfFortune@lemmy.world

    When I was studying this formally, they were using VHDL and Verilog, which are hardware description languages used for circuit design.

    Roughly, the tools can take a logical description (input A goes to output B, etc.), do A LOT of math, and produce the circuit layout.
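
    For a toy example of what the tools lower that logical description into (the module and gate instance names are invented): the same one-bit adder can be written structurally out of gate primitives, which is much closer to the netlist all that math produces.

      // A full adder built structurally from gate primitives, roughly the
      // netlist form that synthesis lowers a behavioral description into.
      module full_adder (input a, b, cin, output sum, cout);
        wire ab_x, ab_and, cin_and;

        xor g1 (ab_x,    a,      b);
        xor g2 (sum,     ab_x,   cin);
        and g3 (ab_and,  a,      b);
        and g4 (cin_and, ab_x,   cin);
        or  g5 (cout,    ab_and, cin_and);
      endmodule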

  • marcos@lemmy.world

    At a high level, you don’t design them anymore; you write them, in code. The compiler turns your code into the chip masks, and its optimizer will mangle the hell out of the relatively simple stuff you wrote.

    At a lower level, that compilation is not really done automatically; people intervene in lots of places, and AFAIK how they divide up the work and interact with the tools are well-guarded secrets of the chip makers.

  • givesomefucks@lemmy.world

    I can’t imagine all those billions of transistors and wires being plotted manually one by one by people

    They don’t really need to be, because it’s a repeating pattern.

    Doesn’t matter what the pattern is, but think of it like a checkerboard.

    If you were making one, you’d take the time to just draw it yourself. This would be designing a chip.

    If you decided to cover a football field in a checkerboard pattern, you’d take your existing checkerboard and just “stamp” it over and over again. Maybe even strap a dozen checkerboards together and then stamp that. This would be scaling your initial chip design for production.
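
    In HDL terms, that “stamping” is what a generate loop does. Here is a hedged sketch (cell_reg and the count of 64 are invented): one block defined once, then instantiated over and over.

      // "Stamping" a block: a generate loop instantiates 64 identical
      // copies of one cell, like tiling a checkerboard.
      module tile_array (
          input  wire        clk,
          input  wire [63:0] in_bits,
          output wire [63:0] out_bits
      );
        genvar i;
        generate
          for (i = 0; i < 64; i = i + 1) begin : tiles
            cell_reg u_cell (.clk(clk), .d(in_bits[i]), .q(out_bits[i]));
          end
        endgenerate
      endmodule

      // The repeated cell itself (hypothetical): a single flip-flop.
      module cell_reg (input wire clk, input wire d, output reg q);
        always @(posedge clk) q <= d;
      endmodule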

    Once the entire design is done, you make some prototypes, test them, and fix the failure points.

    When you go into wide-scale production, some chips come out with too many defects, so you blank out even more of the chip and sell it for cheaper. Although that’s out of scope, it’s worth highlighting that there are acceptable failure rates in design; if manufacturers aimed only for 100% success, nothing would ever make it to production.

  • Mark with a Z@suppo.fi

    I don’t know enough to say anything concrete, but it’s for sure going to be abstractions on top of abstractions.