# Printed Circuit Boards

Chips by themselves are useless. They provide their value by being powered on, interconnecting with other chips, and processing inputs and outputs according to the application requirements, and this can only be achieved by means of Printed Circuit Boards (PCBs). Printed Circuit Boards are the foundational building blocks of modern digital systems. They mechanically support and electrically connect semiconductors and other electronic components through conductive pathways, tracks, or signal traces etched from copper sheets laminated onto a non-conductive substrate. PCBs are typically made from layers of fiberglass infused with epoxy (with FR4[^73] being the most common material), with copper cladding on one or both sides. Complex boards may have multiple layers, with insulating materials separating layers of copper circuits and vias connecting them according to the circuit layout. The design of a PCB is usually carried out using specialized software, known as Electronic Design Automation (EDA) or Computer-Aided Design (CAD) tools. These tools assist in the creation of the schematic for the circuit and the layout of the PCB, defining the placement of components and the routing of electrical connections.

# Design Process

We design to solve problems. The thing is, there are multiple ways of solving a problem. Irrespective of the paths we choose, everything we engineer boils down to a combination of analysis and synthesis, decomposition and integration. This is the true nature of engineering: unpacking a problem into tractable bits to conceive a solution that is itself made of parts, which we realize on their own until we integrate them back together to obtain an integral unit. The terms analysis and synthesis come from (classical) Greek and literally mean "to loosen up" and "to put together" respectively.
These terms are used within most modern scientific disciplines to denote similar investigative procedures. In general, analysis is defined as the procedure by which we break down an intellectual or substantial whole into parts or components. Synthesis is defined as the opposite procedure: to combine separate elements or components to form a coherent whole. Careless interpretation of these definitions has sometimes led to quite misleading statements, for instance, that synthesis is "good" because it creates wholes, whereas analysis is "bad" because it reduces wholes to alienated parts. According to this view, the analytic method is regarded as reductionistic whereas synthesis is seen as a holistic perspective. Analysis and synthesis always go hand in hand; they complement one another. Every synthesis is built upon the results of a preceding analysis, and every analysis requires a subsequent synthesis to verify and correct its results. In this context, to regard one method as inherently better than the other is pointless. When we design, there can be no synthesis without analysis, and we can never validate our analyses without synthesis.

The tasks we perform while designing systems must be formulated before they are executed. There is an adage that goes: it is not the idea that counts, it's the execution. We can poorly execute an excellently conceived idea, just as we can poorly conceive something we execute perfectly. For the designs we want to achieve, it is always about thinking before doing. We cannot expect good results out of no planning and no thinking. It's like taking a bus before you know where you want to go. Whatever the problem we are trying to solve as designers, we must take time to analyze what needs to be done and describe it to others involved with us in the adventure.
Time is always a constraining factor: we cannot take forever to think about everything; we need to move into execution, otherwise nothing happens. To create complex digital systems from scratch, we must start by identifying a need, or an idea, that triggers a set of activities. Mature companies usually work from an identified need (using techniques such as market research, focus groups, customer feedback, etc.). Smaller companies may not be able to afford such a sophisticated approach and instead work on ideas based on intuition or innovation. The analysis-synthesis duality is probably the highest level of abstraction from which we can observe how we design complex things. Both sides offer a lot to discuss. As engineers, we are put on a quest where we must turn an abstract concept into a device that will perform a function that will—hopefully—realize the idea or fulfill the need that triggered it all. Think about it again: something that is a thought in our brains needs to become an actual technical artifact that must perform a job in a given environment. What type of device or system will fulfill the need is entirely our decision. Once we decide what type of artifact we must realize for the problem we ought to solve, it becomes our design subject. At this stage, it is more of a wish than anything else, or a collective wish if we are working in a team. We must then dismember it into fictional bits, chunks, or parts that our design subject is made of, without being sure whether those bits and chunks will effectively work together in the way we initially wished for. Designing requires a dose of betting as well. Eventually, the moment of bringing the bits together arrives, and that is precisely the objective of the synthesis part: connecting the pieces of the puzzle and checking—verifying—that the whole works as the analytical phase thought it would. In any project, the time comes when analysis meets synthesis, and that is when reality kicks in.
Here's one key: analysis works on *fictions* without any operative behavior. The analytical phase is rich in requirements, models, diagrams, schematics, and layouts, whereas synthesis works with a more concrete, behavioral reality (electronics, software). It is clear then that analysis and synthesis belong to different domains: the former to the intangible, the latter to the tangible and behavioral, which can be operated by a user. Moreover, the real world is full of imperfections the analytical world does not have. During the design process, analysis and synthesis feed each other with tons of information. This feedback helps us adjust our analytical and synthetic work by perceiving, interpreting, and comparing the results against our initial wishes and needs. Is the new whole performing as expected? If not, do more analysis, do more synthesis, verify, repeat. If yes, enjoy. The circular nature of the design process clearly shows a "trial and error" essence at its core. We think, we try, we measure, we compare, we adjust (with more thinking), and we try again. Engineering activity requires a constant *osmosis* between analysis and synthesis. More accurately—if we are allowed to steal terms from biology—a *symbiosis*. Both need each other and complement each other. In more classic engineering, the analysis and synthesis activities can run long distances on their own. Extended periods of analysis can take place before thinking about putting anything real together. In more agile approaches that seek to change that paradigm, both activities get closer together.

### Design lifecycle stages

The design process can be generically unpacked into a set of six stages:

- **Conceptualization**
    - Identify needs.
    - Define the problem space.
    - Characterize the solution space.
    - Explore ideas and technologies.
    - Explore feasible concepts.
    - Identify knowledge blind spots.
- **Development**
    - Capture/invent requirements.
    - Decompose and break down the work:
        - Capture architecture, perform system breakdown
        - Cost analysis
        - Work breakdown
        - Org breakdown
        - Align all the above under the Work Breakdown Structure (WBS)
    - Capture knowledge.
    - Integrate, prototype, test.
- **Production (Construction)**
    - Replicate the design into multiple instances that can reliably perform.
- **Utilization**
    - Operate the instances created.
- **Support**
    - Ensure the product's capabilities are sustained.
- **Retirement**
    - Archive or dispose of the product.

### Work Breakdown

As we said, problem-solving requires us to break things down into a variety of smaller, more manageable things. These constituent parts of a problem are not isolated from each other; they require some level of connection or linking, in a sort of network. To make things more interesting, the problems we ought to solve are seldom isolated from the rest of the world. We usually deal with sets of connected problems; what management science calls *messes*. In this decomposition process, we assign to that network of decomposed *bits* a structure that often appears to be hierarchical, where we categorize and group things. It is like forming a small cabinet in our brains with little drawers where we put stuff in and take stuff out. But do our brains work in this folder-like manner? The file cabinet approach makes it difficult or impossible to map or reuse the same piece of information in multiple drawers. Each time a change is made to any given file, it must be tracked down and updated in every location in which it exists. This leads to redundancy, with a cluttering of near-identical ideas, and significant work any time a system-wide change is required. It cannot be overstated how critical, and how ubiquitous, hierarchies are in engineering design.

> [!attention]
> It is by breaking—as in, decomposing things down—the wrong way that we can easily break (as in, destroy) projects beyond repair.
Bad breakdowns can break everything down. Not only do the activities and tasks need to be broken down, but the functional and physical organization of the device-to-be as well. Practically every technical object around us is a container: objects composed of other objects, composed of other objects. We mentioned that during analysis we work with fictional representations of what things will look like, and one key depiction is the architecture. Architectural diagrams allow for an organized sharing of both business and technical information related to a project and product. Architectural depictions are a cornerstone of the systemic approach: a system is ultimately a collection of relationships, interfaces, form, and function.

> [!attention]
> The essence of architecture is structuring, simplifying, compromising, and balancing.

Architectures are a powerful graphical tool for the representation of system aspects, capturing people's imagination and streamlining communication. Architecture diagrams are the main character and output of the analytical phase. Analysis requires us to break things down by creating a tree-like set of (parent-child) relationships of objects, which in turn expand into many branches themselves. Breaking down means applying a "divide and conquer" approach. The challenge is that this analysis phase produces many parallel breakdown structures, not just one: for example, cost analysis, bill of materials, labor, product, and even org charts. The recursive nature of the systems we design doesn't help; there are breakdowns inside the breakdowns, and so on. We don't have the luck of dealing with only one breakdown structure; we recursively deal with many of them. Trying to keep them all aligned is a continuous activity throughout the project life cycle which must never be abandoned.
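Breakdown structures are, at their core, trees, and keeping them useful means being able to roll information up and down their branches. A minimal sketch in Python of one such tree with a recursive cost roll-up; element names and costs are invented purely for illustration:

```python
# Minimal sketch of a breakdown structure as a tree, with a recursive
# cost roll-up. Names and figures are made up for illustration.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    cost: float = 0.0                       # cost carried by this element itself
    children: list["Node"] = field(default_factory=list)

    def add(self, child: "Node") -> "Node":
        self.children.append(child)
        return child

    def rolled_up_cost(self) -> float:
        # A parent's cost is its own cost plus its children's, recursively
        return self.cost + sum(c.rolled_up_cost() for c in self.children)

board = Node("Processing board")
power = board.add(Node("Power management", cost=1200.0))
power.add(Node("DC/DC converters", cost=800.0))
board.add(Node("Processor section", cost=3500.0))

print(board.rolled_up_cost())  # 5500.0
```

The same skeleton serves the cost breakdown, the work breakdown, or the product tree; the alignment problem mentioned above appears precisely because the same leaf must live, consistently, in several of these trees at once.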
### Board Design

As we discussed, any engineering design process starts with a set of drivers that quickly become the primary objectives the design must meet. These objectives may originate from customer needs, market analysis, or any other good reason to justify the effort of designing a board from scratch. In time, these high-level needs will be translated into more specialized requirements. Requirements will be flowed down into different domains: mechanical, software, thermal, EMI/EMC, and data interfaces. And, once again, hierarchy kicks in: an electronic board might be a final product in itself (for instance, sold as an OEM[^74] product), or it can be a part of another board, as is the case for [[Backplanes and Standard Form Factors#Mezzanines|mezzanines]] or [[Backplanes and Standard Form Factors#COM Express (Computer-on-Modules) and Carrier Boards|computer-on-modules]]. Or the PCB could be a slot module to be plugged into a [[Backplanes and Standard Form Factors|backplane-based]] unit, with the unit being the final product (for instance, an avionics computer or a rack-mount radar processor). Or the board could be part of a unit that belongs to a subsystem in a bigger system, for instance, attitude control in a satellite, or a server for edge computing. Still, the process of designing a board does not dramatically change depending on where in the hierarchy the board lies, although the requirements will certainly be shaped by it. Most electronic boards of reasonable complexity will contain some level of embedded software or programmable logic, and the co-development of hardware and software is a complex topic in and of itself.
In most cases, software development kicks off even before the existence of any hardware, by using prototyping means such as development kits (which are partially representative of the target system) or, lately, sophisticated software-based [[System Level Simulation|simulation tools]] that model the target system with various degrees of fidelity. In general, designing a board includes a set of typical activities, although their evolution may not always follow a straight line; some of the steps are not strictly sequential, and some might run in parallel:

- Architecture design, analysis, elicitation and flow down of requirements (software, mechanical, thermal, interface), initial sketching, prototyping, conceptual block diagram capture. This initial stage tends to include feverish engineering discussions, brainstorming, and supply chain analysis.
- Schematic Capture
- Software Development
- Layout routing
- Board Manufacturing
- PCB Assembly (PCBA)
- Bring Up
- Supply chain management for the bill of materials (BoM)
- Integration and Verification, including integration into the host system (if applicable)
- Support

The design process for boards is depicted graphically in the diagram below. The diagram aims to show which tasks happen sequentially (for instance, a board cannot be brought up before it's manufactured and assembled) and which tasks happen in parallel (software development may run concurrently while the board is being designed, by using development kits[^75] or virtual platforms). Note that there are loops all over the place in the process that are not shown in the diagram for clarity. Also, consider that some of these activities may happen inside the same organization, or they might be contracted to third-party companies.

![Board design process](image295.png)

> [!Figure]
> _Board design process_

#### AI-powered board design?
On September 19th, 2023, Zuken Inc.^[https://www.zuken.com/], a provider of EDA tools, announced its approach to AI-assisted PCB design. The new technology, called AIPR—an acronym for Autonomous Intelligent Place and Route—will be provided as part of Zuken’s CR-8000 PCB design suite. According to Zuken, "It will enable new levels of PCB design efficiency and accuracy by harnessing machine learning". How exactly does this new feature work? According to Zuken, it will comprise a new routing engine called Smart Autorouter that will leverage machine learning based on a technology Zuken calls "the Brain". The Brain will be introduced in three stages, starting with the "Basic Brain", followed by the "Dynamic Brain" and, ultimately, the "Autonomous Brain". According to Zuken, in the first stage the Basic Brain will learn from Zuken’s library of design examples and existing design expertise. By integrating it into the CR-8000 suite, the Basic Brain will enhance the user experience by routing the design with the Smart Autorouter based on learned approaches and strategies. In the second stage, Zuken’s Dynamic Brain will learn from newly created PCB designs, utilizing past design examples and integrating them into AI algorithms. Combining customers’ best practices with AI insights will "result in accelerated design iterations and notable enhancements in overall productivity, all within the framework of the CR-8000 platform". The third and final stage is the Autonomous Brain, according to Zuken: "an AI-driven powerhouse in continuous learning mode, pushing the boundaries of creativity. The Autonomous Brain’s capability to self-improve with each project will herald a new era of AI-driven innovation and is available exclusively within Zuken’s CR-8000 platform". Besides EDA tools incorporating AI capabilities, the board design process has already been impacted by the new wave of AI-esque technologies like ChatGPT.
PCB design involves a multitude of tasks, ranging from [[Printed Circuit Boards#Schematic Capture|schematic capture]] to [[Printed Circuit Boards#Board Layout|PCB layout]] and preparation for [[Printed Circuit Boards#Board Manufacturing|production]]. A more tangible take on a potential AI-driven board design process is about leveraging existing tools like generative chatbots, which one can use for:

- **Component selection or descriptions:** Accuracy here varies wildly, but plugins help provide specific information or datasheets ([see here](https://resources.altium.com/p/can-you-use-chatgpt-pcb-design)). There are millions of components, and an [AI-driven supply chain](https://octopart.com/pulse/p/ai-driven-supply-chains-when-will-it-hit-electronics) with tagged data could help with part selection.
- **Code or script generation:** Writing code or scripts is one of the classic use cases of ChatGPT, including scripts for board design such as [Altium scripts](https://resources.altium.com/p/using-chatgpt-altium-scripting).
- **Automated testing:** It's been shown how to use the generative capabilities of ChatGPT for automated testing of embedded systems; [read more here](https://resources.altium.com/p/using-chatgpt-automated-testing).
- **Basic calculations:** As designers, we don’t have all formulas memorized, so we can always query ChatGPT for a formula or an example calculation.
- **Datasheet questions using plugins:** Internet search features allow users to pass datasheet URLs to a chatbot, so you can now use information from datasheets in other prompts.

> [!note]
>
> **AI-driven auto-placement and auto-routing**
>
> Although schematic capture appears to be a more "human" and creative activity that translates requirements coming from other humans into a circuit, placement and routing look more like activities that could be great subjects for algorithmic optimization and AI-driven capabilities.
> Fun fact: auto placers and auto routers have been around for decades. I remember, 20 years ago, playing with Protel 99's (Altium Designer's ancestor) auto-placer and auto-router, leaving it working all night long, only to come in the morning and find all components and traces packed into one of the board's corners... I remember sighing and just placing and routing the board myself.
> Auto placers and auto routers have definitely improved in recent years, and AI technologies can give them a nice push to make them—finally—useful. Both placement and routing must deal with billions of combinatorial paths and possibilities, and such complexity increases with stack-up complexity. What is more, placement and routing cannot be done unaware of manufacturing and [[Physical Layer#Signal Integrity|signal integrity]] considerations. A question remains as to how the large amount of computing power needed to achieve such a feat will be handled.

##### Design From System Requirements

Another AI-driven approach, which implements code to automate front-end system-level design, comes from JITX^[https://www.jitx.com/]. JITX takes a code-based approach to developing circuits and outputs a project in Altium’s file format. While not prompt-based, it uses AI at its core to generate the front-end CAD inputs and take a designer through an important portion of the design process. In JITX, designers write code like this:

`inst PMIC : TPS65400(VOUT = [3.3 2.5 1.2 0.9] )`

And the tool generates this:

![](jitx.png)

> [!Figure]
> A layout generated from textual directives (credit: JITX)

Humans must still be in the loop, as the CAD outputs need to be inspected and verified against constraints before routing.

![](https://www.youtube.com/watch?v=OjQ5smWnabU&ab_channel=AltiumAcademy)

### Architecture Design, Requirements Analysis and Prototyping

The first steps of board design are not glamorous.
Before a single line in a schematic is drawn or a trace in a PCB is laid out, we as designers need to figure out what is needed and gather consensus in the team to agree on the direction to go. This includes creating a collection of spreadsheets, block diagrams, and sketches, and spending hours in front of whiteboards. Board design is a multidisciplinary adventure that cannot go far without the contribution of many domains like project management, supply chain management, and testing. The next sections unpack a bit of the design process to create the boards that give life to the digital systems we architect.

> [!tip] Here are some good practices for good block diagrams:
> - Avoid random color coding as much as possible (that is, giving meaning to or categorizing things according to a color). If you have no choice but to color code, clearly explain what the code means. The same goes for line width: if it means something ("the thicker, the higher the data rate", or "the thicker, the higher the power the line carries"), make it explicit. Not everybody will interpret it that way if it is not openly said.
> - Avoid putting effort into depicting things as they look in real life when drawing block diagrams. There are specialized tools for that; keep the diagrams as schematic as possible: drawing an IC to resemble how it looks in reality adds zero value. If you are drawing a network architecture, adding pictures of real computers adds very little value. In block diagrams, things can comfortably be schematic boxes; that's why they are called block diagrams.
> - Block diagrams are not slides. If you are trying to explain something that requires text or bulletizing, you are better off in PowerPoint. Block diagrams are supposed to visually represent structure or flow and be as self-explanatory as possible.
> - If choosing a visual modeling language like SysML, UML, BPML or similar, stick to the semantics specified by those languages.
It creates even more confusion when diagrams look as if they have been drawn according to a specification when they are not.

### Schematic Capture

A schematic diagram is _the_ place where the connections of the components that will be part of a board are specified in a simplified manner (hence the name). Why do we say "in a simplified manner"? Because schematics are coarse representations of the circuits. They are meant to essentially depict electrical connections, but not more than that. Schematics do not include details related to the footprint (how the component will sit on the actual board), nor do they need to bear any physical resemblance to the actual component. In schematics, components are represented by graphical symbols with input and output ports. Basic components will be represented by the usual standard symbols[^76] (resistors, capacitors, inductors, relays, diodes, switches, opamps), whereas any component with a more complex internal composition will be represented by a box with pin numbers and pin names as per the datasheet. Capturing (good) schematic diagrams is a bit of an art. Schematic design tools tend to come with rich libraries of commercial components, and these databases include not only the footprint to be used during layout but also the schematic symbol of the component. In case a component is not present in the database (and in the rare case where the manufacturer does not provide any information), it is the task of the designer to create a new symbol and populate the pins (with numbers and names) as per the device datasheet. Mind you, for complex components this is a tedious and very error-prone task which is nowadays rarely done, as most manufacturers tend to provide EDA information about their products for the most popular design tools. In some cases, the schematic diagram and the layout are done by the same person.
When the schematic designer and the layout designer are not the same human being, things may easily go south, and that is when the quality of the schematic diagram pays off the most. Bad schematics are very damaging, as they confuse everybody throughout the board's life cycle. Not only do they increase the possibility of mistakes during the layout process, but they also create confusion during bring-up, testing, and repair.

![A case of a "floating" schematic diagram. It's a good practice to never use net labels as implicit electrical connections; they make the diagram very hard to read.](image296.png)

> [!Figure]
> _A case of a "floating" schematic diagram. It's a good practice to never use net labels as implicit electrical connections; they make the diagram very hard to read._

![A somewhat eccentric schematic diagram (source: user Circlotron in EEVBlog)](image297.jpg)

> [!Figure]
> _A somewhat eccentric schematic diagram (source: user Circlotron in EEVBlog)_

![Drawing ICs the way they look adds no value and breaks the "inputs to the left, outputs to the right, voltages up, grounds down" rule.](image298.png)

> [!Figure]
> _Drawing ICs the way they look adds no value and breaks the "inputs to the left, outputs to the right, voltages up, grounds down" rule._

![](busy_schem.jpg)

> [!Figure]
> Somewhat busy, yet still readable, 1-page schematic (source: https://github.com/TinyTapeout/)

![[Pasted image 20240723103845.png]]

> [!Figure]
> Don't do this; power ports always point up, ground ports always point down

![[Pasted image 20240723104150.png]]

> [!Figure]
> Never use power ports for ground nets!

![[Pasted image 20240723104416.png]]

> [!Figure]
> Mind aesthetics and clarity when drawing schematics. Overlapping text is never a good idea; it makes reading way more difficult.
There are some unwritten rules in schematic capture good practices:

- The golden rule: ==Inputs to the left, outputs to the right== (schematic libraries must follow this)
- Voltage supply at the top, ground at the bottom. Voltage ports in an upright position. Ground port icons never facing down (see image above)
- All text upright!
- Ideally, all components upright; there is no real need for weird rotations. Only rotations that add clarity or are expected should take place; for instance, drawing a common-base amplifier would require rotating the transistor with the base pointing down.
- Designers shall use buses as much as possible.
- Nets shall be named in a descriptive and minimalistic way: UART0\_RX is fine, whereas ASX\_DRV\_UART0\_PP\_RX is painful to see, and most likely such long text will overstep onto something sooner rather than later.
- There has to be consistency between using the "hat" for active-low signals and using the '#' or "\_N" markers. They shall not be mixed.
- Designers must not add unnecessarily chatty comments.
- Designers must add enough directives for the PCB designer when it comes to power and signal integrity.
- Designers must always provide PDF versions of schematics; not everybody has EDA tools or viewers installed. This is particularly useful for older designs whose designers may not be in the organization anymore.
- Designers shall avoid, if possible, pasting photos into the schematic diagram.
- Designers must think hierarchically when drawing and use multi-sheet design extensively, except for very small boards where everything fits on an A4 page.

> [!danger]
> Designers must draw their schematics as if the layout engineer is a psychopath who knows where they live.

#### Multi-sheet Schematics

Back when schematics were originally captured on paper, it was often on a single sheet large enough to fill a big drafting table, which was reproduced on a dedicated, large-format copier.
Now schematics are captured on a PC, stored on servers, and printed (if needed) on commercial small-format laser printers. This change means that even a simple design can be more easily displayed and understood if it is presented on multiple schematic sheets. Even when the design is not particularly complicated, there can be advantages in organizing it across multiple sheets. For example, every board design has power management (where all DCDC modules and power regulators are defined, and the different voltage rails are specified). Most designs have a JTAG section for reprogramming or expansion. Maintaining these modules as individual documents allows several designers to work on a project at the same time. Breaking the design into functional modules also improves the readability of the design, an important consideration for those who need to read and interpret the schematic during pre-production reviews or later stages of the project, like verification and testing. There are two decisions to make if a schematic design is to be spread over multiple sheets:

- The structural relationship of the sheets
- The method used for electrical connectivity between the circuitry on those sheets

This choice will vary according to the size and type of each project, and sometimes according to personal preference. The schematic designer must then decide how the schematic sheets are organized and how the connectivity is established between those sheets. There are two approaches to structuring a multi-sheet design: flat or hierarchical. Both approaches are valid; each has its strengths and weaknesses. A flat design will be quicker to create but harder for others to follow signals through and interpret the functionality, especially from a printed copy. A hierarchical design will take longer to draw, as there are more steps to create the connectivity, with the reward being a design whose functionality others can more easily interpret, following the signals across the sheets.
Hierarchical design also improves design reuse.

##### Flat Schematic Design

A flat design is simply a large schematic sheet that has been cut up into several smaller sheets; in a flat design, all sheets exist on the same level; there is no hierarchy or composition. The connectivity in a flat design is created directly from any sheet to any other sheet. This type of connectivity is referred to as horizontal connectivity. The use of a top sheet is optional in a flat design. If one is included, it will have a sheet symbol for each of the sheets in the design, but it will not include any wiring. There can be any number of sheets in a flat design (see figures below).

![Flat design with top sheet (credit: Altium)](image299.png)

> [!Figure]
> _Flat design with top sheet (credit: Altium)_

![Flat design without top sheet (credit: Altium)](image300.png)

> [!Figure]
> _Flat design without top sheet (credit: Altium)_

##### Hierarchical Schematic Design

A hierarchical design follows a tree-like structure. This is typically done in tools like Altium with sheet symbols, which represent lower sheets in the design hierarchy. The symbol represents the sheet below, and the sheet entries in it represent (or connect to) the ports on the sheet below. The connectivity is established through the Sheet Entries in those Sheet Symbols, not directly from the Ports on one sheet to the Ports on another sheet. As in a flat design, the child sheet is identified by defining its filename in the sheet symbol. In a hierarchical design, that child sheet can also include sheet symbols referencing lower-level sheets, thus creating another level in the hierarchy. The image below shows a hierarchical design with 3 levels in the hierarchy.

![Hierarchical Design (credit: Altium)](image301.png)

> [!Figure]
> _Hierarchical Design (credit: Altium)_

In a hierarchical design, a signal on a child sheet leaves the sheet via a Port, which connects upward to a matching Sheet Entry on the parent sheet.
The parent sheet includes wiring that carries the child signal across to a Sheet Entry in another Sheet Symbol; the signal then travels down to a matching Port on the second child sheet, as shown in the image below. This parent-child sheet structure can be defined to any depth, and there can be any number of sheets in a hierarchical design.

![Ports are used to connect sheets. Net labels' scope is local to each sheet.](image302.png)
> [!Figure]
> _Ports are used to connect sheets. Net labels' scope is local to each sheet._

> [!attention]
> To be #expanded

#### Reference Designators (RefDes)

Depending on the complexity, boards may contain only a few components or hundreds, if not thousands in some cases. Irrespective of the number of components sitting on the board, every one of them must be uniquely named. Naming components is not only necessary to locate things inside schematics; reference designators accompany a component throughout its life cycle in the project. Designators are kept and used during the assembly process, verification, and support phases. Naming components used to be a tedious, manual task back in the day when schematics were drawn by hand; today, naming is done automatically by EDA tools. There are established conventions for naming components, although it is not so uncommon to still find schematics that do not follow them.

![Reference designators in schematics and layout (credit: Altium)](image303.png)
> [!Figure]
> _Reference designators in schematics and layout (credit: Altium)_

Reference designators are standardized by the ASME Y14.44-2008 standard titled "Reference Designations for Electrical and Electronics Parts and Equipment" and maintained by the American Society of Mechanical Engineers (ASME). This standard tackles not only component naming but also unpacks a naming convention across the system's hierarchy. The IEEE 315-1975 standard complements this by standardizing the graphical depiction of electronic symbols and component class letters.
Here's a list of some common standard designators as per IEEE 315-1975 and its updates:

- **A** - Assembly or subassembly
- **C** - Capacitor
- **D** - Diode (including LED, Zener, Schottky, and others)
- **F** - Fuse
- **G** - Generator or alternator
- **J** - Jack (connector, female)
- **K** - Relay
- **L** - Inductor or coil
- **M** - Motor
- **N** - Numeric display, like a seven-segment display
- **P** - Plug (connector, male)
- **Q** - Transistor (including BJT, FET, and others)
- **R** - Resistor
- **S** - Switch (including push, toggle, and others)
- **T** - Transformer
- **U** - Integrated circuit (IC)
- **V** - Vacuum tube
- **W** - Wire, cable, or connection
- **X** - Socket or connector (general)
- **Y** - Crystal or oscillator
- **Z** - Impedance (sometimes used for filters or network components)

This list isn't exhaustive, as there are many other designators used for specific applications or in particular contexts within the standard. Additionally, components can have prefixes for mechanical parts or for parts that don't fit into the standard electronic component categories, and these prefixes can vary between organizations or projects. The standard also allows for custom designators as needed, provided they don't conflict with the established ones.

#### Netlist & Footprints

Once a schematic diagram reaches a certain maturity, it is time to kick off the layout activity. As we discussed, schematics capture, graphically, how everything is connected to everything on a board. In addition, every single component in a schematic must include its footprint. A footprint refers to the layout and dimensions of a component on the board, representing its physical characteristics such as size, shape, and pin configuration. A footprint must define the placement and soldering points for the component on the board.
Each type of electronic component, from resistors and capacitors to integrated circuits and connectors, has its own unique footprint defined by its manufacturer.

![Silkscreen and pin 1 in yellow; pads on top layer in red; mechanical outline in purple](footprint.png)
> [!Figure]
> _Silkscreen and pin 1 in yellow; pads on top layer in red; mechanical outline in purple_

The footprint is essential for accurate component placement and soldering, both in manual and automated assembly processes. EDA design software typically includes libraries of standard footprints, although designers may also create custom footprints for non-standard components. The footprint not only influences the layout of the PCB, including component arrangement and trace routing, but also affects aspects like thermal management and signal integrity. A correctly designed footprint is essential for the reliability and functionality of the board. An incorrect footprint can lead to issues such as poor solder connections, short circuits, or mechanical failure of the component, underscoring its importance in the design and manufacturing stages of production. Note that footprints may or may not include holes. Through-hole components (those that have leads that pierce through the board) need holes in their footprints. Surface mount (SMT) components sit on the board without piercing through it (although an SMT component may have mounting holes that need to pierce through the board). SMT footprints and land patterns are standardized in IPC-7351. Through-hole component footprints used to be standardized in IPC-7251, but that standard was merged with the SMT standard some years ago. With all this in place, EDA tools export the netlist of connections and footprints into the PCB editor side of the tool. Things do not look pretty at first, and seeing the tangle of nets and footprints, the so-called "rats nest", is not for the faint of heart.
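Conceptually, the netlist handed to the PCB editor is just a mapping from net names to the component pins they connect. The sketch below is a hypothetical, minimal model in Python; the net names, reference designators, and the `pins_on_component` helper are all illustrative, not any real EDA tool's format:

```python
# Hypothetical minimal netlist model: net name -> list of (refdes, pin) pairs.
# All names and values are illustrative, not taken from any real design.
netlist = {
    "VCC_3V3": [("U1", "A4"), ("C12", "1"), ("R7", "2")],
    "GND":     [("U1", "B1"), ("C12", "2"), ("J3", "5")],
    "UART_TX": [("U1", "C7"), ("J3", "2")],
}

def pins_on_component(netlist, refdes):
    """Return every (net, pin) attached to one component, sorted by net name.
    This is the kind of lookup a cross-select feature performs when a part
    is clicked in one editor and highlighted in the other."""
    return sorted(
        (net, pin)
        for net, pins in netlist.items()
        for (ref, pin) in pins
        if ref == refdes
    )

print(pins_on_component(netlist, "U1"))
# [('GND', 'B1'), ('UART_TX', 'C7'), ('VCC_3V3', 'A4')]
```

A cross-select feature is essentially this lookup plus UI highlighting: given a clicked part, find everything electrically attached to it on the other side of the tool.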
Good EDA tools come with a "cross-probe" function which helps the user go back and forth between the schematic and the PCB editor; it acts as a GPS when finding things visually gets too challenging. Tools also tend to include a "cross-select" function which automatically selects a component in the PCB editor if we select it in the schematic editor (and vice versa).

![After exporting netlists, it's always a game of finding Waldo. Cross-probe/cross-select function in EDA tools is critical as a GPS to find yourself between schematic and PCB](image304.png)
> [!Figure]
> _After exporting netlists, it's always a game of finding Waldo. Cross-probe/cross-select function in EDA tools is critical as a GPS to find yourself between schematic and PCB_

### Board Layout

There are excellent sources with great material on PCB design, and this section by no means intends to challenge such outstanding bibliography and content. Examples are the online courses from Fedevel Academy[^77] and books like "High-Speed Digital Design: A Handbook of Black Magic" by Howard Johnson, among many others. Board layout is the activity that bridges the virtual, CAD-oriented side of digital systems design with physical reality: real electric signals, electromagnetic waves, and interconnects. What the PCB layout dictates, the board manufacturing and assembly process will try to replicate in real life. Of course, the world inside computers and CAD/EDA tools is a well-behaved world of digital perfection: there are no tolerances, no manufacturing artifacts or mistakes, no cumulative errors (besides the errors the engineer can make while designing the circuits). Inside a software tool, 10 millimeters are always exactly 10 millimeters, not a single micron more or less.
In our not-so-well-behaved physical reality, semiconductors, boards, and cables are all far from perfect, and there are artifacts produced along the manufacturing process that irremediably separate the manufactured hardware from the design. There will always be a gap between what we design and the manufactured version of it. No two semiconductors are the same, no two boards will be the same, and no two digital systems are the same.

#### Design Rules

While routing a board, the tools play an essential role in keeping the designer connected with reality as they make their design decisions. Not all routing decisions are manufacturable. That is, there are certain design rules the designer must stick to if they expect the design to pass the DFM (design for manufacturability) check at the PCB house. For this, EDA tools come with DRC (design rule check) functionality which acts like an online assistant, flagging violations of the rules. Of course, a DRC is only as effective as the rules it contains. Ideally, the design rules in an EDA tool will always match the rules of a realistic, known PCB house. In general, DFM guidelines aim to encourage designers to consider aspects such as the width and spacing of traces, the size and type of vias and pads, the layout and orientation of components, and the complexity of the PCB layers. The objective is to ensure that these elements are not only functionally appropriate but also manufacturable with high yield. Of course, in the grand scheme of things, the manufacturability of a layout does not directly imply the board will work. There are still many other places where things can go south, as we will see.

#### Placement and Board Shape and Dimensions

Board placement is not the most glamorous activity in board design, but it is immensely important due to the implications it may have in the assembly stage, during testing, and also for signal integrity and EMI compatibility.
Placement done wrong can make the difference between a working board and a dead board. Before starting placement, let's discuss for a moment the boundaries of a board: the available area for components, but also the shape this area will take. In general, PCB layout engineers would love to choose the board shape and area themselves. Of course, they would choose generously so the routing work could be simple and quick. In most cases, though, board shape and area are constrained by mechanical requirements and cannot be chosen freely. If the board is supposed to be part of a standard backplane like [[Backplanes and Standard Form Factors#CompactPCI Serial|Compact PCI Serial]] (see figure below) or an [[Backplanes and Standard Form Factors#ATX|ATX motherboard]], our board dimensions will be strictly defined by the specification, with mounting hole locations, connector locations, connector types, and all. The same applies if the board is a mezzanine—like a [[Backplanes and Standard Form Factors#PMC Module Standard|PMC]], [[Backplanes and Standard Form Factors#FMC Module Standards|FMC]], or [[Backplanes and Standard Form Factors#XMC Module Standards|XMC]]—in a plug-in module. In most cases, the board designer will be given a blank PCB with the available area for layout demarcated.

![CompactPCI Serial 3U Backplane PCB Dimensions and mounting hole locations](image305.png)
> [!Figure]
> _CompactPCI Serial 3U Backplane PCB Dimensions and mounting hole locations_

![Not all boards are square (source: https://boldport.com/blog/2014/8/14/case-study-the-nutclough-commemorative-board-for-calrec-audio)](image306.png)
> [!Figure]
> _Not all boards are square (source: https://boldport.com/blog/2014/8/14/case-study-the-nutclough-commemorative-board-for-calrec-audio)_

![Not all boards are square! (source: Reddit)](PCB_non_rect.webp)
> [!Figure]
> _Not all boards are square! (source: Reddit)_

![](nintendo.png)
> [!Figure]
> Not all boards are square!
> Here, a Nintendo Switch PCB (plus the eMMC storage, non-square mini-PCB) (source: https://www.ifixit.com/Teardown/Nintendo+Switch+Teardown/78263)

Once the real estate is known, common sense dictates that placement should start with the bulkiest components in the bill of materials. After all, these will take most of the real estate on the board and they tend to be the most complex ones, so it makes sense to give them priority when kicking off. For boards with high-end processors, FPGAs, and/or System-on-Chips, these will surely be the first components to be placed, as they typically have big packages with hundreds if not thousands of leads/balls. Next, memories would typically follow. Memories cannot be placed too close to the components they connect to, like CPUs or SoCs, because the traces between memories and these devices, differential pairs included, need space for length matching. Next, in general, comes the moment to place power supply chips and their supporting passive components like inductors and capacitors. When doing so, it is highly recommended to read the PCB layout guidelines in the device datasheet. Every decent power management chip manufacturer will provide these guidelines (see figure below).

![Example PCB layout guidelines for step-down converters (source: https://www.ti.com/lit/an/slyt614/slyt614.pdf?ts=1699878872860)](image307.png)
> [!Figure]
> _Example PCB layout guidelines for step-down converters (source: https://www.ti.com/lit/an/slyt614/slyt614.pdf?ts=1699878872860)_

A factor that tends to be underestimated when placing high-end, multi-core processors, SoCs, and FPGAs on PCBs is heat dissipation.
Although the design process will be shaped by the cooling requirements of the system the board is part of (for instance, a forced-air-cooled rack, or a conduction-cooled chassis), in general it is typical for components that will dissipate large amounts of power to sit on the same side of the board and, if the board will be horizontally oriented, to be placed on the top side.

![iMX6 Rex Computer-on-Module infrared temperatures (source: imx6rex.com)](image308.jpg)
> [!Figure]
> _iMX6 Rex Computer-on-Module infrared temperatures (source: imx6rex.com)_

Industrial digital systems are rapidly reaching their limits in terms of forced-air cooling. Some densely packed systems use so many cooling fans that factors like noise pollution are starting to be an issue. Designing boards to comply with certain standards will constrain the design process, and that includes power dissipation. For example, the VITA 48.2 standard (part of the [[Backplanes and Standard Form Factors#VPX|VPX]] family of standards) defines the mechanical requirements needed to ensure the mechanical interchangeability of conduction-cooled 3U and 6U Plug-In Modules in VPX backplanes. According to this standard, conduction-cooled Plug-In Modules should be designed to require less than 75 W per slot. Needless to say, if the board must include heatsinks, the mechanical provisions for the heatsinks must be accounted for, including mounting holes. In terms of component height, standards generally specify guidelines as well. For instance, [[Backplanes and Standard Form Factors#PC/104|PC/104]], being a stacked form factor, understandably specifies a maximum component height for the top and bottom sides of the boards.

#### Routing

Placement and routing are not sequential activities. They are somewhat symbiotic and are done more or less concurrently: the designer tries a placement idea, routes some traces, checks feasibility, and corrects as needed.
As the routing activity matures, the placement solidifies, and routing then picks up speed as it gets all of the designer's attention.

> [!note]
> It is not the intention of this section to teach how to route a PCB, but only to describe the complexity of the routing activity with a decent level of fidelity.

The routing strategy relies on the layered nature of PCB design and of PCBs in general. PCB design tools allow designers to route traces in different layers that sit on top of each other, using vias (see figure below) to conveniently connect the layers wherever needed. The designer can choose which layers to use to connect a lead from a device somewhere to the lead of a device or a connector elsewhere. For high-speed designs, a trace should cross as few layers as possible; otherwise, discontinuities along the way can cause all the headaches we discussed when we talked about [[Physical Layer#Signal Integrity|signal integrity]]. As said, to cross between layers, designers use vias. Vias are plated holes that electrically connect selected parts of the copper foils that layers are made of. Vias come in several varieties. They can be exposed or tented: tenting a via covers the via hole and annular ring with solder mask, and tenting should be set as the default method in the design workflow. There are also blind and buried vias. Blind vias connect an outer layer to one or more inner layers but not to both outer layers, and a buried via connects one or more inner layers, but not to an outer layer. This matters because these types of vias allow for denser boards and can save board real estate by not requiring any space on the component layers.
![Through hole, buried, and blind vias (credit: Altium)](image309.png)
> [!Figure]
> _Through hole, buried, and blind vias (credit: Altium)_

When it comes to routing traces to/from dense microprocessors, FPGAs, and System-on-Chips, manufacturers also provide extensive assistance with guidelines on PCB routing techniques for differential transceivers and decoupling capacitors. Examples of these are AMD's UltraScale Architecture PCB Design User Guide[^78], NXP's High-Speed Interface Layout Guide[^79], and Intel's Board Developer Center[^80], among others. What is more, most manufacturers provide reference designs that developers can use to observe a working example of the device in a real setup and use as a platform to create their own designs. In general, reference designs provide schematics, layouts, Gerbers, and a BoM, and they act as the "hello world" equivalent of example code. For configurable devices like FPGAs, and because device capacitance varies with logic cell and I/O utilization, PCB decoupling guidelines must be provided on a per-device basis, assuming very high utilization to cover the majority of use cases.

##### Escape Routing

Routing the signal traces out of BGAs is a process generally referred to as "escape routing." With the denser parts, this can be a real challenge. The designer typically has to plan ahead where the routes must go, shrink trace widths to fit, and tunnel through the board using a variety of vias. Getting all the nets successfully routed out of a BGA can end up being a serious puzzle.

#### Layout versus Schematic (Pre-production Design Verification)

In software, when you screw up, you have a quick way of fixing the problem and getting back on track: you debug the issue, hopefully find it, modify the code accordingly, recompile, and off you go. In hardware, the story is a bit different.
Hardware bugs have a considerably slower turnaround; they take longer to solve because there is a complex supply chain involved. If the board design has an issue (for instance, some net was left unconnected, or a component footprint is wrong), the board must be re-spun: a new PCB must be produced, and a new PCB assembly needs to be done. This may involve different organizations and take from days to weeks or months, depending on the quality levels involved and on how busy suppliers might be. Note that there are several levels of mistakes during board design, from coarse things (someone forgot to connect a pin in an IC to a voltage rail) to more subtle things, for instance, trace spacing not in line with the PCB manufacturer's capabilities. Today's tools help capture a big part of these mistakes during the design stage using Design Rule Checks (DRCs), footprint wizards, and the like. Twenty years ago, designing a PCB meant manually creating a myriad of new components in schematic and PCB libraries, and the process was very error-prone, as it meant an engineer manually naming pins while reading a datasheet. Today, with component vaults and manufacturers providing footprints and schematic symbols, this is luckily almost a thing of the past. Layout-versus-schematic (LVS) checks are also important for verifying that the PCB layout accurately reflects the schematic diagram. This process ensures that every connection in the schematic is correctly implemented in the layout and that there are no added or missing connections. It's a critical step for catching errors that can occur during the transition from schematic design to PCB layout. While DRC focuses on physical layout rules, tools also incorporate an Electrical Rules Check (ERC), which focuses on the electrical correctness of the design. It checks for issues like incorrect logic levels, unconnected pins, or potential power supply conflicts.
Beyond ERC, dedicated analyses can validate impedance matching, signal integrity, and other electrical characteristics critical for the proper functioning of the PCB. Signal integrity (SI) analyses check for potential issues that could affect the performance of high-speed signals. This includes analyzing reflections, crosstalk, and timing delays to ensure that the PCB design will perform reliably at its intended operating speeds. Tools for signal integrity analysis help designers optimize trace routing and termination strategies to minimize signal degradation. Note that SI checks at the design stage are highly theoretical and lack the constructive factors that manufactured PCBs bring. Before production, power integrity analyses are also performed to find issues related to the distribution of power across the PCB. They help identify potential problems such as voltage drops, ground bounce, and areas of excessive current density that could lead to overheating or device failure. This analysis is especially relevant for complex boards with high power consumption or sensitive analog components. Thermal analyses are run as well: they simulate the heat distribution across the PCB and identify hot spots that could lead to overheating or thermal stress on components. This step is vital for ensuring the long-term reliability of the PCB, especially for designs with high-power components or those intended for harsh environmental conditions. But there are more "semantic" mistakes no tool can easily catch. Picture, for instance, a designer choosing an obsolete component, or mistaking an active-low signal for an active-high one. There are still plenty of human factors involved in designing electronic boards. A new design must therefore be thoroughly checked before being sent for production. If possible, an independent review team should perform this check, with the original designers not taking part.
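The essence of an LVS check can be sketched in a few lines: flatten both the schematic netlist and the connectivity extracted from the layout into sets of (net, pin) connections, then diff them. The Python below is a toy illustration with made-up net names and components, not how any production LVS engine is implemented:

```python
# Toy LVS-style check: compare connectivity extracted from the layout
# against the schematic netlist. Data and net names are made up.
def lvs_diff(schematic, layout):
    """Each argument maps net name -> set of (refdes, pin) pairs.
    Returns (missing, extra): connections absent from the layout,
    and connections present in the layout but not in the schematic."""
    sch = {(net, conn) for net, conns in schematic.items() for conn in conns}
    lay = {(net, conn) for net, conns in layout.items() for conn in conns}
    return sorted(sch - lay), sorted(lay - sch)

schematic = {"RESET_N": {("U1", "D3"), ("R5", "1")}, "GND": {("R5", "2")}}
layout    = {"RESET_N": {("U1", "D3")},              "GND": {("R5", "2")}}

missing, extra = lvs_diff(schematic, layout)
print(missing)  # [('RESET_N', ('R5', '1'))] -- R5 pin 1 was never routed
print(extra)    # []
```

Real LVS engines must also cope with net renames, equivalent pin swaps, and design hierarchy, which is where most of their complexity lives; the diff above only captures the core idea.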
#### Gerbers and DFM Inspection at the PCB Manufacturer

Designers, in general, do not deliver full design projects to PCB houses. Instead, designers export from EDA tools a special kind of file called Gerbers, which PCB manufacturers use for fabricating the boards. A Gerber file is a standard format that represents a blueprint for constructing the PCB, detailing the layout of copper tracks, planes, vias, pads, and other features essential to the circuit's functionality. The Gerber format, originally developed by Gerber Systems Corp., has undergone several revisions to enhance its capabilities. The most widely used format today is the Extended Gerber, or RS-274X. This extended version brings significant improvements over the older standard, RS-274-D, primarily in how it manages image data. While the older format required separate aperture files to interpret the shapes and dimensions in the design, RS-274X integrates this information directly into the file, streamlining the process and reducing the risk of errors during PCB manufacturing. The Gerber X2 format (introduced in 2013) has lately gained popularity, and it is richer in information. Gerbers serve as a universal language between the designer and the fabricator. A Gerber package contains a series of images, each corresponding to a different layer of the PCB, such as the copper layers, solder mask, and silkscreen. These images are vector-based, providing precise instructions for the exact placement and size of each feature and trace on the board. PCB fabricators use Gerbers to print the films that are eventually used to etch the copper layers, apply the solder mask, and print the silkscreen. This level of precision ensures that the physical PCB matches the designer's original intent. That said, there is still a dose of reverse engineering that PCB houses must perform to understand the designer's intent. Minimizing the gap between what the designer wants and what the PCB house operator interprets is key to success.
That is to say, Gerber files should be augmented with design documentation for the PCB manufacturing house, without overwhelming them.

#### DFx: Design for Manufacturability, Fabrication, and Assembly

A PCB designer must know the limitations of the PCB fabrication and assembly processes to ensure the design and manufacturing of the board go smoothly. DFM is closely related to design for fabrication (DFF) and design for assembly (DFA). There are multiple sides to manufacturability. Because the [[Printed Circuit Boards#PCB Assembly|PCB assembly]] line is a combination of different processes linked together in a sequence, there are design measures that may improve solder paste printing, pick and place, reflow, inspection, and rework. Oftentimes, there will be conflicting sides of a design decision: what improves reflow could degrade visual inspection, or vice versa. There are no perfect designs; the idea is to strike a balance. DFM measures filter into the placement activity as well, as placement directly affects the board's manufacturability and the reliability of the assembly process. For instance, clustering large components together in one area of the board will require the board to be reflowed at higher temperatures, which could damage chip components. If a board consists of large components with substantial height, it is recommended to make the component-to-component spacing equal to the height of the largest component package. This strategy gives ample room for visual inspection and makes rework easier. To achieve a better thermal balance of the board during reflow, one should distribute the components as evenly as possible throughout the board. This ensures that no area of the board will be substantially hotter than another.
It is also recommended to avoid concentrating large components in one area of the board to help minimize bow and twist while providing a balanced thermal distribution. It is recommended to place BGAs on the top of the board to eliminate the possibility of open solder connections during a second-pass reflow. Board manufacturers might require additional steps in the assembly process if there are BGA components on both sides of the board. Designers shall avoid placing BGA and larger quad-flat package (QFP) components in the center of the PCB to prevent board bending caused by heavier parts. Not following this guideline can result in open solder connections.

![Bow and twist effect in BGA (credit: Altium)](image310.png)
> [!Figure]
> _Bow and twist effect in BGA (credit: Altium)_

If a design must have BGA components on both sides of the board, it is recommended to offset each BGA to ease rework and facilitate solder ball inspection (see figure below).

![BGA Mounting Strategy](image311.png)
> [!Figure]
> _BGA Mounting Strategy_

Test Points: When it comes to testability, the designer also has the responsibility of providing access points for verification through testing. Defining proper test points on a board layout during the design process is critical for having the board tested and verified by the manufacturer. The test points will allow the manufacturer to identify and diagnose any potential issues before the board ever leaves the processing facility. There are several general guidelines to keep in mind:

- Each net on a board should have at least one test probe point (preferably two), including the component pins connecting to that net.
- It is not recommended to use component leads as test points, as this method can result in missing and cracked solder joints.
- It is recommended to spread test points throughout the board rather than concentrate them in any one location, as this helps avoid air leaks in the vacuum sealing of the board's test fixture.

Test pads can be either vias/pads, a component pad (PTH), or a specified Test Point (TP) with its own reference designator.

![Through-hole test via (credit: Altium)](image312.png)
> [!Figure]
> _Through-hole test via (credit: Altium)_

The spacing between test pads (center-to-center) should be maintained at 0.100". This enables the use of larger probes, which are less expensive to set up and provide a more reliable reading. The smaller the spacing between the test pads, the more likely the manufacturer will have to use smaller, costlier, and less reliable probes.

Panelization: Panelization, also known as step-and-repeat, is the method of placing two or more PCBs onto one panel, which allows boards to be secured during manufacturing, shipping, and assembly. Since PCBs are priced per panel, cost will be directly impacted by how many PCBs can be fabricated on a panel. Panelization can also save time by processing multiple boards at once in bulk.

![Rectangular Circuits in a Single Panel with Tooling Holes and Breakout Tabs (credit: Altium)](image313.png)
> [!Figure]
> _Rectangular Circuits in a Single Panel with Tooling Holes and Breakout Tabs (credit: Altium)_

The PCB images on a panel can be a single design or a grouping of various designs. As the number of circuits within a panel increases, its mechanical strength weakens, which can cause the panel to bend under its own weight during assembly and reflow. While a smaller panel containing fewer boards could be stronger, it may not be the most efficient way to utilize the PCB manufacturer's standard fabrication panel sizes and will add additional costs during the assembly process.
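As a rough illustration of the cost trade-off just described, the following Python sketch estimates how many rectangular boards fit on a fabrication panel once edge rails and board-to-board gaps (for breakout tabs or routing) are subtracted. The panel size, rail width, and gap values are assumptions for the example, not any manufacturer's numbers:

```python
# Step-and-repeat sketch: how many rectangular boards fit on a panel,
# given edge rails and board-to-board spacing. All dimensions in mm;
# the default rail and gap values are illustrative assumptions.
def boards_per_panel(panel_w, panel_h, board_w, board_h, rail=10.0, gap=2.0):
    usable_w = panel_w - 2 * rail
    usable_h = panel_h - 2 * rail
    # n boards in a row occupy n*board + (n-1)*gap of the usable span
    cols = int((usable_w + gap) // (board_w + gap))
    rows = int((usable_h + gap) // (board_h + gap))
    return max(cols, 0) * max(rows, 0)

# A common 18" x 24" fabrication panel is roughly 457 mm x 610 mm.
print(boards_per_panel(457, 610, 100, 80))  # 28 boards with these assumptions
```

Since boards are priced per panel, comparing this count for the two board orientations (swapping width and height) is a quick way to spot wasted panel area before committing to a panel drawing.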
Documenting the Design: Before a designer can send a design off to manufacturing, they will need to ensure that it is properly documented to clearly communicate the design intent. While electronic files such as Gerbers provide enough basic information to fabricate a board, these files don't include all the fine details in the designer's head about how they intend to have the board produced. The documentation stage is the chance to precisely document a board layout and avoid the design-intent miscommunications that typically occur when design goals are not clearly conveyed. Creating a solid PCB documentation package that outlines all of the necessary details the designer wants to convey to the manufacturer is of paramount importance. The IPC-D-325A standard establishes the general requirements for documentation necessary to fully describe end-product printed boards, regardless of raw material, special fabrication requirements, layer count, or end-product usage. The standard includes master drawing requirements, board definition, and artwork/photo tooling.

# Board Manufacturing

Boards are like sandwiches composed of several layers etched with the circuit patterns, cleverly aligned on top of each other, separated by insulating material. Boards are largely two-dimensional artifacts that expose an area for the designers to place components on, and when the area available to place components and route traces is not large enough, designers can choose to add more layers to the sandwich and thus grow the board in the height dimension. The "sandwich" is in fact a stack of laminate cores pressed with insulating resin used to bond multiple cores together. How is this sandwich manufactured? Let's see the steps next.

- First, all layer patterns are printed on negative transparent films. All films are registered with each other using alignment holes that make sure all layers are correctly aligned when put one on top of another.
- Then come the laminates. Laminates are epoxy and fiberglass cores pre-laminated with a thin copper layer on both sides. The cores are first cleaned from all impurities to make sure the copper is clean. The clean panel is then coated with a layer of photo-sensitive film, called photoresist.
- Then, the operator will load the films for the layers that are going to be etched on each side of each core. If this were a 2-layer PCB, then this would just be the top and bottom layer, with just one laminate core. But if it were a 4-layer PCB, then two prepreg layers would be added to this core: one to bond the top layer (which will not be sitting on the core) to the second layer, which sits on the core, and another between the bottom of the core and the board's bottom layer (see figure below).
  ![4-layer stackup in KiCAD with 1 core and 2 prepreg layers](image314.png)
- For a 6-layer PCB, we would need two cores bonded together with prepreg (note the core thickness of 0.535 mm instead of 1.24 mm in the 4-layer stackup):
  ![6-layer stackup in KiCAD](image315.png)

![A 16-layer stack up (credit: https://www.nanotech-elektronik.pl)](image316.png)
> [!Figure]
> _A 16-layer stack up (credit: https://www.nanotech-elektronik.pl)_

- For more layers, the process is more or less repeated, with layer widths decreasing accordingly. For example, for a 16-layer stack up (see figure above), there would be 7 cores of about 210 micrometers each.

![Prepreg weaves and their material properties. (Credit: Isola Group)](image317.jpg)
> [!Figure]
> _Prepreg weaves and their material properties. (Credit: Isola Group)_

- To create the desired circuit patterns, the films with the negative trace pattern are placed on each side of the core. As a prerequisite, the cores have been pre-laminated on both sides with a photosensitive chemical that hardens when exposed to UV light. Then, the films and the core are exposed to UV light, and the resist that is exposed to the light hardens.
Then, the remaining unhardened photoresist is chemically removed. Each side of the core(s) shows the circuit pattern of the inner layer in the form of solidified photoresist.
- The next stage is to remove the unwanted copper—the copper that now has no photoresist covering it—using an alkaline solution that corrodes the copper away. Once this is done, no copper should exist outside of the traces the circuit design has defined.
- Next, the blue photoresist that was still on the copper is removed.
- Each core is inspected.
- Once all necessary cores are complete, it's time to align all the cores together with the top and bottom layers. For this, the layup operator adds separation layers between cores consisting of sheets of glass cloth pre-impregnated with uncured epoxy resin.
- The layup and bonding process continues by adding the top and bottom copper foils (with prepregs as well) and preparing everything to be pressed together.
- The stack is put in a bonding press which uses heated plates. The heat melts and cures the epoxy resin in the prepreg layers while the pressure bonds the cores and copper foils together. Hopefully, this bond will last for the lifetime of the board.
- Once the stack is out of the press, it's time to drill holes for leaded components (which will pierce the stack completely) and for via holes that link copper layers together. Drilling is the most expensive and time-consuming process in PCB manufacturing. There are two types of drilling technologies: mechanical and laser.
- Then, it is time for hole plating by copper deposition. This is done by adding a very thin layer of copper to the holes' walls. A good connection needs about 25 microns of copper on the walls of the holes. This thickness must be electroplated, because the holes' walls are basically made of resin and fiberglass.
- Now it is time to image the outer layers. Again, the panels are coated with a photoresist.
Films for the top and bottom layers are aligned with the rest of the stack, and everything is exposed to UV light. Again, the photoresist will harden where the copper must stay.
- The board now goes for plating, where a layer of copper is added on top of the outer layers of copper foil (designers must be aware that thicker copper foil requires wider distances between tracks). A tin plating process is also performed.
- Then, the outer layers are etched.
- Epoxy solder mask is added to protect the copper. The part of the board without solder mask remains as exposed copper.
- Now a protective layer of nickel and a thin coating of gold over the nickel are added on the exposed copper parts.
- Silk-screen with [[Printed Circuit Boards#Reference Designators (RefDes)|reference designators]] of all components is added by using a laser printer.
- An electrical test of the bare board is performed using flying probe testers. Unlike traditional bed-of-nails testers, which require a custom fixture for each PCB design, flying probe testers use movable probes to test various points on the PCB. Flying probe testers use several probes that can move independently. These probes are mounted on precise, computer-controlled arms. The number of probes can vary, but typically there are at least two, and often more to increase testing speed and efficiency. Before testing, a test program is created, usually from the PCB Gerber files. This program instructs the tester where to move the probes to test different points on the board. Sophisticated path planning algorithms are used to optimize the movement of the probes, reducing test time. The probes make contact with test points (like pads or vias) on the board. The tester then performs various electrical tests to check for manufacturing defects. These tests can include checking for open circuits, shorts, resistance measurements, capacitance measurements, and sometimes even more complex functional tests.
The tester identifies defects by comparing the actual electrical properties and connections of the board with the expected results derived from the PCB design data. Any discrepancies can indicate a potential defect, such as a short circuit, open circuit, or a component placed with incorrect polarity. While highly flexible, flying probe testing is generally slower than bed-of-nails testing, making it less suitable for high-volume production. Also, it may not be able to test as comprehensively as some fixed-fixture methods in certain scenarios, particularly for complex, multi-layer PCBs.
- As most PCB fabs panelize boards, V-cut scoring is used to separate the boards from the panels.
- A final visual inspection is performed, and the boards are vacuum-sealed and shipped to the customer.

This video summarizes the process for a 4-layer PCB (Credit: Eurocircuits)
![](https://www.youtube.com/watch?v=sIV0icM_Ujo&ab_channel=Eurocircuits)

# PCB Assembly

Assuming our PCB has been manufactured, now it's time to assemble it. This means populating the board with its components so it can continue its journey towards a functional life. As the phrase goes, the best improvement any system can have is when it goes from a non-working condition to a working condition. To achieve such a working condition, all components from the bill of materials need to be at the PCBA house for the assembly to materialize. Missing components will halt the line and increase costs. In general, bills of materials are heterogeneous lists of components of different packages and types. Although most components nowadays are surface mount, through-hole components still filter in, such as connectors or some high-power components. We will describe here largely an SMT line, and we will comment at the end on what happens with TH (through-hole) components.

## SMT Components

In surface-mount technology, the components' leads do not pass through the board thickness.
This means that surface mount components sit on top of the board's top or bottom layer. Describing SMT packages is out of the scope of this text, and we will only highlight that we can find SMT packages for basically every component type: resistors, capacitors, inductors, switches, diodes, transistors, power regulators, microprocessors, FPGAs, system-on-chips, connectors, etc. That being said, not all components in a board need to be SMT. In general, designers choose a combination of SMT and TH (through-hole) components, although a typical board will predominantly have surface mount parts. The advantages of using SMT components are known:
- Small form factor and lightweight (better use of real estate)
- No need to drill holes through the board
- Simpler automated assembly
- Sometimes, component placement errors are corrected automatically thanks to self-aligning effects
- Mounting on both sides is possible

On the disadvantages, we can say:
- High initial cost and time of setting up for production.
- Some troublesome effects during assembly (for instance, tombstoning^[https://vinatronic.com/blog/smt-tombstone-defects/])
- Difficulty in manual rework due to the very small sizes.
- Repair might be difficult, and often uneconomical

## SMT Line Equipment

SMT components must be placed on a board precisely. But they also need to be glued and soldered properly. Assembling boards with surface-mount components at scale must be done on a special production line. A typical SMT line will contain a set of stations that perform different operations on the boards using specialized equipment. Standards such as IPC-9850A specify the procedures to characterize the capability of surface-mount assembly equipment. We will briefly describe this equipment in the sections below.

### Conveyors

In an SMT line, conveyors facilitate the smooth transition of PCBs through the various stages of the assembly process.
These conveyors are integrated into the SMT line to link together different pieces of equipment, such as printers, pick-and-place machines, reflow ovens, and inspection stations, ensuring a continuous flow of production. The operation of SMT conveyors involves moving PCBs along the assembly line with precision and care to avoid damaging the boards or the components already placed on them. The conveyors use belts or chains with adjustable speeds to transport the PCBs from one station to the next. The speed and motion of the conveyor are controlled electronically, allowing for synchronization with the operation of the connected equipment to match the production flow rate and avoid bottlenecks. As PCBs exit one machine, such as a solder paste printer, the conveyor gently moves them to the next station, for example, a pick-and-place machine, where components will be mounted. After component placement, the conveyor again transports the PCBs to a reflow oven for soldering and then onward to inspection and testing stations. The conveyors are equipped with sensors and stop mechanisms to ensure that PCBs are precisely positioned for pickup by the next piece of equipment and can also be halted for manual inspection or intervention if necessary. Moreover, SMT conveyors are designed with flexibility in mind to accommodate PCBs of different sizes and shapes. They can be adjusted for the width of the PCBs being processed and are often modular, allowing for reconfiguration of the SMT line as production needs change or as additional equipment is incorporated. This modularity and adjustability make SMT conveyors a key factor in the efficiency and adaptability of the PCB assembly process. ![SMT conveyor. Credit: ICT](smt_conveyor.webp) > [!Figure] > _SMT conveyor. Credit: ICT_ ### Solder Paste Printer The Solder Paste Printer is tasked with the precise application of solder paste to the PCB where components will be placed. 
The solder paste acts as both an adhesive during component placement and a medium for electrical connection once it has been reflowed (melted and then solidified). The solder paste printer operates by first securing the PCB in place on its platform. Once the PCB is firmly positioned, a stencil is aligned over the top of it. This stencil is a thin sheet of metal or plastic that has apertures cut into it, corresponding to the locations on the PCB where solder paste is required. The stencil ensures that solder paste is applied only where it is needed and in the correct amounts. After aligning the stencil, the machine spreads solder paste over it using a squeegee. The squeegee moves across the stencil, pressing the paste through the apertures and onto the PCB's surface. The amount of solder paste deposited is controlled by the size of the apertures in the stencil and the pressure applied by the squeegee. This process requires precise control to ensure that a consistent amount of solder paste is applied, as too much or too little paste can lead to soldering defects like short circuits or insufficient solder joints. To provide a good seal between the stencil and PCB, apertures are typically 0.002 inches smaller on all sides than the PCB component pads. If the size is not correct, solder defects can occur, such as bridging and solder beads. Some apertures need a special design to deposit less paste on the inner edge of the part to reduce the possibility of "mid-chip" solder balls forming.

![A PCB stencil](stencil.jpg)
> [!Figure]
> _A PCB stencil_

When PCB designs contain a large variety of parts, including fine-pitch devices that require a thin stencil and larger parts that require a thicker stencil, a multilevel or "stepped" stencil can be employed. Step-up areas are used to increase the volume and height of the solder paste in selected areas by adding material to the stencil.
Step-down areas are created by removing material from the stencil and are used to reduce the volume and height of the solder paste deposit in selected areas. To print successfully, apertures must be designed to be a minimum distance from the step edge. The distance is dependent on the step dimensions and is known as the "keep-out area". If the PCB design has large copper pads underneath a component (typically to provide heat dissipation), this can cause issues if the entire pad has solder paste applied to it, resulting in the device being lifted and the outer leads not connecting well. This can be overcome by creating a "window effect" in the stencil aperture design to reduce the solder volume (shown below).

![Window effect added to stencil](window.jpg)
> [!Figure]
> _Window effect added to stencil_

Once the solder paste has been applied, the stencil is lifted away from the PCB, and the board is moved along the SMT line to the pick-and-place machines, where components will be placed onto the freshly applied solder paste. The solder paste application must be accurate, as it lays the groundwork for successful component placement and soldering, directly influencing the quality and reliability of the finished product.

![Solder paste applied in a PCB (also with window effect)](solder_paste.jpg)
> [!Figure]
> _Solder paste applied in a PCB (also with window effect)_

Another PCB design feature on the same type of device that can cause undesirable solder results is when there are vias within the large copper pad for heat dissipation. In this case, the stencil apertures should be specially designed to prevent the solder paste from being deposited onto the vias.

![Stencil pattern when there are vias under the chip](onto.jpg)
> [!Figure]
> _Stencil pattern when there are vias under the chip_

Solder paste printers often incorporate features like 2D solder paste inspection to verify the quality of the paste application before the PCB moves to the next stage.
This inspection can detect issues such as insufficient or excessive paste, misalignment, or the presence of smears, allowing for corrective action to be taken early in the assembly process. ![Solder Paste Printer (Credit: Yamaha)](image318.jpg) > [!Figure] > _Solder Paste Printer (Credit: Yamaha)_ ### Pick and Place A pick-and-place machine operates by picking up components such as resistors, capacitors, and integrated circuits from a feeder, to accurately place them onto a bare PCB. The machine consists of a head with multiple nozzles that can pick up different components using suction cups. Pick and place machines use a combination of cameras and machine vision to identify the components and their orientation. The head (or heads) move quickly and accurately in an XY plane to place the components at different locations on the PCB as per the design. Pick and place machines can be either fully automatic or manual/semi-automatic. Automatic machines can place thousands of components per hour, depending on machine specs. #### Automatic In automatic pick-and-place equipment, an operator ensures that components are fed into the input bays of the machine using reels of component-carrying tape (see figures below), tubes, or trays. Then, the machine automatically picks the components as per the loaded program and places them on the right coordinates. Note that the feeding orientation of the component is a relevant piece of information to ensure the machine will apply the right rotation to the component before placing it on the bare board. 
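The program loaded into the machine is essentially a list of reference designators with coordinates and rotations exported from the EDA tool. A minimal sketch of parsing such a centroid file; the column names and the sample rows are a common convention used here for illustration, not a fixed standard:

```python
import csv
import io

# Hypothetical centroid export: designator, X/Y in mm, rotation, side
CENTROID = """Designator,X,Y,Rotation,Side
R1,12.70,5.08,0,top
C3,25.40,5.08,90,top
U1,40.00,20.00,270,top
"""

def load_placements(text):
    """Parse centroid data into the placement records a machine
    program would be built from."""
    rows = csv.DictReader(io.StringIO(text))
    return [
        {
            "refdes": r["Designator"],
            "x_mm": float(r["X"]),
            "y_mm": float(r["Y"]),
            # The rotation the head applies must also account for the
            # feeding orientation of the part in the tape.
            "rotation_deg": float(r["Rotation"]) % 360,
            "side": r["Side"],
        }
        for r in rows
    ]

placements = load_placements(CENTROID)
```

In practice the feeder setup (which reel sits in which input bay) is a second table that the machine joins against this one by part number.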
![JUKI FX-3RA; 90,000 component/hr pick&place automatic machine (credit: Juki)](image319.png)
> [!Figure]
> _JUKI FX-3RA; 90,000 component/hr pick&place automatic machine (credit: Juki)_

![Component reel (credit: SparkFun Electronics)](image320.jpeg)
> [!Figure]
> _Component reel (credit: SparkFun Electronics)_

![Cover tape and dissipative carrier tape (credit: Fairchild Semiconductor)](image321.png)
> [!Figure]
> _Cover tape and dissipative carrier tape (credit: Fairchild Semiconductor)_

![Orientation of a component in the tape (in the example, SOT-23 transistor) (credit: Fairchild Semiconductor)](image322.png)
> [!Figure]
> _Orientation of a component in the tape (in the example, SOT-23 transistor) (credit: Fairchild Semiconductor)_

Complex packages like Ball Grid Arrays (BGA) are placed using standard pick-and-place equipment with a placement accuracy of ±0.10 mm. This accuracy is enough since the parts will align due to the self-centering feature of the BGA solder joint during solder reflow.

#### Semi-Automatic and Open Source Projects

For very low-volume or prototyping purposes, it is also possible to use more cost-effective, simpler semi-automatic pick and place stations where an operator manually picks components from the feeders and places them on a bare board.

![A semi-automatic pick & place station (credit: Essemtec)](image323.jpg)
> [!Figure]
> _A semi-automatic pick & place station (credit: Essemtec)_

There are also open-source alternatives like the LumenPnP^[https://www.opulo.io/products/lumenpnp] station whose design files are available. The goal of the LumenPnP is to help makers bridge the gap between prototyping and mass production.
The salient requirements of the LumenPnP design are:
- Automated - no human interaction is necessary from attaching the paste-applied board to the machine to having a board ready for reflow
- Capable of picking and placing components down to 0402 passives
- Integrated up and down vision for fiducial scanning and on-nozzle component alignment
- Automatic nozzle tip changer to support a wide range of component sizes
- Frame and motherboard design capable of future upgrades
- Expansion ports for hacking and interfacing with the machine
- Feeder implementation:
    - The system supports banks of intelligent, powered feeders
    - Feeders can handle down to 4mm pitch components
    - Feeders can handle 8mm, 12mm, 16mm, and 24mm tape, both paper and plastic
    - Feeders can hold standard 7" diameter reels of components

More information about this equipment can be found at https://github.com/opulo-inc.

![Lumen PnP Desktop Pick and Place station (credit: Opulo.io)](image324.png)
> [!Figure]
> _Lumen PnP Desktop Pick and Place station (credit: Opulo.io)_

### Reflow Ovens

A reflow oven is the equipment used for the soldering of the components onto the boards. Boards with components placed (and with solder paste applied) are fed from one side of the oven, where a conveyor belt will make the board enter the oven. Once inside, a carefully controlled heating process is initiated. The oven typically has several zones, each set to different temperatures. As the PCB passes through these zones, it experiences a thermal profile, which includes preheating, thermal soak, reflow, and cooling stages. During the preheat and soak stages, the temperature of the board is gradually increased. This slow increase prevents damage to the components and the PCB, while also activating the flux in the solder paste. The flux cleans the metal surfaces, which is essential for a good solder joint.
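The preheat, soak, reflow, and cooling stages can be thought of as a sequence of temperature ramps that must stay within limits. A minimal sketch of checking a profile against a ramp-rate limit; every number below is an illustrative placeholder, not a value from any paste datasheet:

```python
# (stage name, start temp C, end temp C, duration s) - illustrative only
PROFILE = [
    ("preheat", 25, 150, 90),
    ("soak",    150, 180, 90),
    ("reflow",  180, 245, 60),
    ("cool",    245, 50, 80),
]

# Heating the paste too fast before it melts can dry it out, so real
# paste datasheets specify a maximum ramp rate; 3 C/s is a placeholder.
MAX_RAMP_C_PER_S = 3.0

def check_profile(profile, max_ramp=MAX_RAMP_C_PER_S):
    """Return the stages whose average ramp rate exceeds the limit."""
    violations = []
    for name, start, end, seconds in profile:
        ramp = abs(end - start) / seconds
        if ramp > max_ramp:
            violations.append((name, round(ramp, 2)))
    return violations
```

This kind of check is a crude stand-in for the process development mentioned later: real profiling uses thermocouples on a sacrificial board and compares the measured curve, not just zone setpoints, against the paste vendor's window.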
The next stage is the reflow phase, where the temperature is raised to a point where the solder paste melts, or "reflows". This melting allows the solder to flow and form solder joints, which electrically connect the components to the board. Finally, the cooling stage solidifies the solder, forming a permanent bond between the components and the PCB. The cooling must be controlled to prevent thermal shock to the components and to ensure the formation of good-quality solder joints. Reflow ovens vary in size and complexity, ranging from small bench-top models for low-volume production or prototyping to large, multiple-rail conveyor-style ovens used in high-volume manufacturing. The precise control of temperature and the ability to program specific thermal profiles are key factors to ensure the success of the reflow process, yielding high-quality soldering while preventing damage to sensitive electronic components. Successful reflow cycles strike a balance among temperature, timing, and length of the cycle. Mistiming may lead to excessive flux activation, oxidation, excessive voiding, or even damage to the package. Heating the paste too hot, too quickly before it melts can also dry the paste, which leads to poor wetting. Process development is needed to optimize reflow profiles for each solder paste/flux combination. Device manufacturers specify their preferred reflow profiles. Complex packages like BGAs show good self-alignment during solder reflow if a minimum of 50% of the ball is aligned with the pad.
The 50% criterion applies in both the X and Y directions, as illustrated in the figure below.

![BGA Ball self-centering](image325.png)
> [!Figure]
> _BGA Ball self-centering_

![Preferred reflow profile using no clean paste that produces good board-level reliability results for BGA packages (credit: Texas Instruments)](image326.png)
> [!Figure]
> _Preferred reflow profile using no clean paste that produces good board-level reliability results for BGA packages (credit: Texas Instruments)_

### Inspection

Automatic Optical Inspection (AOI) stations capture high-resolution images of the PCBs by using a combination of various types of cameras and lighting systems. The cameras can be positioned above, below, or around the board to capture different angles and aspects of the components and solder joints. For the AOI system to evaluate the PCB, it relies on a set of pre-programmed patterns based on the specifications of the PCB design, including the placement, orientation, and type of components, as well as the quality of solder joints. Once the images are captured, the AOI system uses machine vision software to compare the images of the PCB with the pre-programmed patterns. The software is capable of processing and analyzing these images at high speed, much faster than human inspection. The comparison process involves checking component placement and solder quality, and ensuring there are no defects. If the AOI system identifies discrepancies between the PCB and the pre-set criteria, these are flagged as potential defects. The system can detect a wide range of issues, from simple ones like misaligned components to complex problems like insufficient solder paste or floating leads. The AOI system generates a report detailing any detected defects. This report is used by technicians to make necessary corrections to the process. In advanced setups, this feedback can be integrated directly into the manufacturing process, allowing for real-time adjustments and improvements.
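The compare-against-expected logic at the heart of AOI can be sketched as a toy model. Real systems compare images, not placement records, and the tolerance here is an arbitrary assumption; the sketch only shows the structure of the comparison:

```python
def inspect(expected, measured, pos_tol_mm=0.2):
    """Flag components whose measured placement deviates from the
    design data beyond tolerance, or which are missing entirely.
    Both inputs map refdes -> (x_mm, y_mm, rotation_deg)."""
    defects = []
    for refdes, (ex, ey, erot) in expected.items():
        if refdes not in measured:
            defects.append((refdes, "missing"))
            continue
        mx, my, mrot = measured[refdes]
        if abs(mx - ex) > pos_tol_mm or abs(my - ey) > pos_tol_mm:
            defects.append((refdes, "misaligned"))
        elif (mrot - erot) % 360 == 180:
            # A polarized part rotated 180 degrees: reversed polarity
            defects.append((refdes, "wrong polarity"))
    return defects

expected = {"C1": (10.0, 5.0, 0), "D2": (20.0, 5.0, 0)}
measured = {"C1": (10.05, 5.02, 0), "D2": (20.0, 5.0, 180)}
report = inspect(expected, measured)
```

The report produced here plays the role of the defect report described above: it is what gets routed to technicians or fed back into the line.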
AOI stations can command conveyors and feeders to separate good and bad boards. Many modern AOI systems come with machine-learning capabilities. This means they can learn from past inspections, improving their accuracy and efficiency over time. The more PCBs the system inspects, the better it becomes at detecting subtle or complex defects. AOI is often integrated with other systems in the PCBA process, such as Automated X-ray Inspection (AXI) for checking hidden solder joints under components.

![AOI station (credit: ALeader Europe)](image327.jpg)
> [!Figure]
> _AOI station (credit: ALeader Europe)_

## Configuring and Programming the SMT line

We have not yet reached a level of sophistication in manufacturing machinery where machines can read designers' minds. Therefore, there is no way an SMT line can know beforehand what components a board needs and where those components are supposed to go, with which orientation; we have to program the machines to achieve this. The PCB layout and the bill of materials are two essential pieces of information when it comes to programming an SMT line. EDA tools can export the exact location and orientation of every component with respect to an origin, and this information must be programmed into the machines. Note that bare PCBs are typically panelized, therefore the machine needs to be configured to be able to assemble all PCBs in a panel. Modern EDA tools can export this data in (more or less) standardized formats that machines can parse and process. Back in the day, there was a good dose of manual processing between EDA tools and the machines. The pick-and-place machine I used to work with used diskettes! The IPC-2581^[https://www.ipc2581.com/] is a generic standard for printed circuit board and assembly manufacturing description data and transfer methodology.
It provides a unified and comprehensive XML-based data format that includes all aspects of a PCB design, including the bill of materials (BOM), Gerber data, component placement, netlists, and other critical information needed for automated assembly processes. This standard allows for seamless communication and data transfer between different EDA tools and the manufacturing equipment.

> [!info]
> To be #expanded

## SMT Line Layout

SMT lines, as the name implies, tend to be linear arrangements of loaders, conveyors, and processing stations. Depending on the scales involved, the SMT line layout may be optimized or parallelized for higher yields.

### Single Line with Single-Rail Pick & Place

This line is a basic standard production line that includes these machines:
- Loader
- Solder Paste Printer
- Conveyor
- On-line SPI (Solder Paste Inspector)
- NG (not-good) buffer
- Single rail Pick & Place
- Conveyor
- Single Rail Reflow Oven
- Conveyor
- Vertical buffer
- Online AOI
- Reject Conveyor
- Automatic Unloader

![Single line SMT production (credit: Talk Electronics)](image328.png)
> [!Figure]
> _Single line SMT production (credit: Talk Electronics)_

1. The loader is used for loading bare PCB boards into the line.
2. The solder paste printer uses a squeegee to spread the solder paste evenly over the PCB panel in preparation for the reflow process.
3. A conveyor is used for delivering PCB boards onto the next machine.
4. Optical SPI (solder paste inspection) is used to inspect the solder paste thickness printed on PCBs.
5. The NG (Not Good) buffer is used for storing bad boards after receiving the signal from the SPI.
6. The pick and place machine places components like chips, capacitors, resistors, and diodes in the right position with high precision and speed.
7. The conveyor here is used for transporting boards between the pick and place machine and the reflow oven.
8. A single rail reflow oven is used for applying a temperature profile to the solder paste to melt it and let it cool down.
The profile is pre-loaded in software in the oven.
9. A conveyor here is used for transporting the boards out of the reflow oven to the next buffer conveyor.
10. A vertical buffer is used for storage of PCB boards as they exit the reflow oven.
11. Automatic Optical Inspection (AOI) is performed to check the finished PCB board after the reflow process. This process focuses on solder quality and component alignment quality.
12. A reject conveyor is used for rejecting bad PCB boards after the inspection process.
13. An automatic unloader is used for storing the finished PCB boards.

### Double Line with Single-Rail Pick and Place

This layout is designed for single-rail pick and place machines working with a dual-rail reflow oven. This layout achieves dual-rail production by combining two separate SMT lines into one dual-rail line. The advantage of this layout is that the dual-rail production capacity is still applicable when there is only a single-lane pick and place machine available. Factory space is saved by the use of a PCB Shuttle Conveyor.

![Double SMT Line with Single-Rail Pick and Place (credit: Talk Electronics)](image329.png)
> [!Figure]
> _Double SMT Line with Single-Rail Pick and Place (credit: Talk Electronics)_

The special equipment in this layout compared to the previous one is:
1. The stacker and magazine combination loader is used for feeding bare PCB boards. With its two loading structures, it can feed bare boards and double-sided boards at the same time.
2. The PCB Shuttle Conveyor merges boards out of the two single-rail pick & place machines into a double-rail reflow oven.
3. A dual rail reflow oven, able to bake boards on two parallel rails simultaneously.
4. The Not Good/OK double-feed unloader is used for receiving and storing the boards out of the dual rail reflow oven.
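The capacity claims behind these layouts follow from a simple bottleneck model: a serial line can only run as fast as its slowest station, so doubling one station's rails pays off only until another station becomes the limit. A minimal sketch with made-up cycle times:

```python
def line_throughput(stations):
    """Boards per hour for a serial line, limited by the slowest
    station. `stations` maps station name -> seconds per board."""
    bottleneck_s = max(stations.values())
    return 3600 / bottleneck_s

# Illustrative cycle times (seconds per board), not real figures
single = {"printer": 30, "pick_and_place": 60, "reflow": 45}

# Merging two single-rail pick & place machines into one dual-rail
# oven halves the effective pick & place time per board...
dual = {"printer": 30, "pick_and_place": 30, "reflow": 45}
# ...but the bottleneck then shifts to the reflow oven, which is why
# these layouts pair the merge with a dual-rail oven.
```

With these placeholder numbers, the merged layout improves throughput from 60 to 80 boards/hour rather than doubling it, which matches the intuition that each layout upgrades whichever station currently limits the line.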
### Double Line with Dual-Rail Pick & Place (Option A)

This layout is designed around a dual-rail pick and place machine and a dual-rail reflow oven and needs dual-rail conveyors to link the stages. The advantage of this layout is that it doubles the production capacity by adding another SMT assembly line in parallel. Also, this layout saves factory floor space compared to using two independent one-rail lines.

![Double SMT Line with Double-Rail Pick and Place (credit: Talk Electronics)](image330.png)
> [!Figure]
> _Double SMT Line with Double-Rail Pick and Place (credit: Talk Electronics)_

1. A dual rail pick and place machine is used for dual-rail production, which doubles the mounting speed.
2. A dual rail reflow oven is used for doubling the soldering capacity.
3. Dual-rail conveyors are used for delivering PCB boards from two lanes onto the next machine.
4. A PCB Shuttle Conveyor is used for redirecting flow by combining two SMT lines into one dual-rail production line.
5. Another PCB Shuttle Conveyor is used here for splitting the dual-rail production line into two single lines.
6. The Not Good/OK unloader is used for sorting good and out-of-family boards in the last step of the SMT line, after the inspection process.

### Double Line with Dual-Rail Pick and Place (Option B)

This layout is based on the dual rail layout, but feeding speeds are improved by changing the loader into a new type of stacker and magazine combination loader. Also, compared to the previous scheme, it eliminates the solder paste inspection and the NG buffer after the solder paste printer and adds another dual rail pick and place machine to reach higher placement speeds. Instead, this layout installs the PCB Shuttle Conveyor after each solder paste printer and puts a dual-rail inspection station for the quality inspection of the PCBs that are delivered from the dual-rail reflow oven.
Additionally, there is a dual-rail rejection conveyor applied to screen bad boards that come out from the AOI machine.

![Double SMT Line with Double-Rail Pick and Place, optimized for higher speed (credit: Talk Electronics)](image331.png)
> [!Figure]
> _Double SMT Line with Double-Rail Pick and Place, optimized for higher speed (credit: Talk Electronics)_

The advantage of this layout is that more factory space is saved due to a more compact use of dual-rail stations. With this, the production speed increases significantly for nearly the same length of the SMT line. The elements of this layout are:
1. A stacker and magazine combined loader.
2. Dual rail optical inspection, applied at the exit of the dual rail reflow oven to check the PCB solder quality.
3. A dual rail screening conveyor, used to store bad boards that come out from the dual-rail AOI machine.

### Triple Line with Single Rail Pick and Place

This layout is designed for combining three SMT lines into one. The special equipment applied in this layout includes a stacker and magazine combination loader, a 90-degree PCB guiding machine, a dual rail 90-degree PCB guiding machine, an intermediate section conveyor, and a dual rail reflow oven. The advantage of this layout is that it connects three SMT production lines together, increasing production capacity while minimizing operating costs and factory floor footprint.

![Triple Line with Single rail Pick and Place (credit: Talk Electronics)](image332.png)
> [!Figure]
> _Triple Line with Single rail Pick and Place (credit: Talk Electronics)_

The equipment used in this layout is:
1. The stacker and magazine combination loader combines the advantages of both loader types and is suitable for high-volume feeding and collection of PCB boards.
2. PCB guiding conveyors are used to change the direction of PCBs by 90 or 180 degrees, to deliver PCBs from two opposite directions onto one line.
3.
Dual rail PCB guiding conveyor, similar to the previous one but with one extra rail. 4. Intermediate section conveyor for transferring boards from multiple assembly lines. 5. Dual rail reflow oven ### Multi-Line, Combined Layout This layout is designed for combining multiple, complex SMT lines together, achieving maximum yield. The layout is depicted below. On the left-hand, the layout includes 7 instances of loaders, solder paste printers, conveyors, single rail pick and place machines, and another conveyor. The right-hand layout includes 4 instances of loaders, solder paste printers, SPI, a splitter conveyor, two single rail pick and places, a PCB shuttle merger, automatic inspection, and another conveyor. These 11 assembly lines, connect with an intermediate section line combiner which feeds three reflow ovens: two dual-rail reflow ovens and one single-rail reflow oven. The three reflow ovens are all connected by a horizontal PCB shuttle conveyor that interconnects all SMT lines in the layout. Each dual rail oven has its dual rail conveyor and dual-rail unloaders. The advantage of this layout is that it achieves a high-capacity SMT line in a minimal factory floor footprint. ![Multiple Line, Combined Layout (credit: Talk Electronics)](image333.png) > [!Figure] > _Multiple Line, Combined Layout (credit: Talk Electronics)_ The next video shows the board fabrication and assembly process in one single video (credit: PCB Way) ![](https://youtu.be/o8NOK1JJbgw?si=1Xog_sb3aRVLgMCx) ### Installing and Operating SMT Lines Installing and operating an SMT line requires special facilities with strict requirements in terms of air quality (dust control) and electrostatic discharge control. SMT lines will require the supply of compressed air, three-phase power, and an exhaust system capable of managing the by-products of the reflow process. 
Last but not least, operating an SMT line requires measures to ensure operator safety around machines working with high-speed heads and high temperatures.

# Embedded Software

An essential part of designing digital systems is writing the software that gives the underlying hardware its runtime behavior. Embedded software is a rare breed within the diverse software family. Unlike software written for web applications or PCs, embedded software pursues specific, almost obsessive objectives, and that is reflected in the way it is conceived and implemented. Because of this single-objective approach, embedded software usually executes a cyclic "main loop" that performs all the tasks necessary for the system to fulfill its function. This loop usually involves acquiring data from a set of devices, performing some processing on that data, and taking some action. At any time, the main loop can be interrupted by, well, an interrupt, which the software must service accordingly before returning to the main execution. A large portion of an embedded software developer's attention is dedicated to the main loop and the interrupts: their priorities, nesting, and so on.

In general, embedded software does not start from scratch; it relies on pre-existing pieces before its development kicks off. Software developers typically use startup code and libraries provided by the device manufacturers. In some cases, manufacturers provide complete build tools to create full-blown operating systems and applications, easing the process of "breaking the ice". Once this initial set of tools is working, developers are free to add their own "magic" to the software, according to the application they are pursuing.

Developing embedded software requires a good dose of datasheet and documentation consumption.
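The main-loop-plus-interrupt structure described above is often called a "superloop". A minimal bare-metal-style sketch in C follows; the ISR name, flag, and processing step are all hypothetical placeholders, and on real hardware the ISR would be registered in the device's vector table:

```c
#include <stdint.h>
#include <stdbool.h>

/* Flags shared between the ISR and the main loop. 'volatile' is
 * required because the ISR modifies them outside the compiler's
 * view of the loop. */
volatile bool data_ready = false;
volatile uint16_t latest_sample = 0;

/* Interrupt service routine (hypothetical; the real vector name is
 * target-specific). Keep ISRs short: capture data, raise a flag. */
void sample_isr(void)
{
    latest_sample = 42;   /* would read a hypothetical ADC register */
    data_ready = true;
}

/* Processing step: trivial placeholder for filtering/scaling. */
uint16_t process_sample(uint16_t raw)
{
    return raw * 2u;      /* e.g. apply a gain of 2 */
}

/* The cyclic "superloop": acquire, process, act, repeat forever. */
void superloop(void)
{
    for (;;) {
        if (data_ready) {
            data_ready = false;
            uint16_t value = process_sample(latest_sample);
            (void)value;  /* would drive an output here */
        }
        /* optionally enter a low-power wait-for-interrupt state */
    }
}
```

Interrupt priorities and nesting decide which ISR may preempt which; the flag-and-defer pattern above keeps the time spent inside ISRs to a minimum.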
Technical reference manuals are the "bibles" developers must rely on to understand the inner workings of the device they are developing on. These manuals are typically thick and very detailed, so engineers must focus on the area of interest at the moment and filter out the rest; no one reads a TRM from cover to cover. Developers also participate in online forums and wikis to ask questions and pick up valuable lessons learned from others working on similar devices.

Embedded software can run bare metal or on a real-time operating system (RTOS). We will dive a bit into each kind.

## Bare Metal

In bare-metal scenarios, the software's main loop is just a collection of functions manually crafted by the developer, along with the pre-existing startup code and compiled libraries the manufacturer may require for the device to work. Bare-metal designs suit small-footprint applications of limited complexity.

## RTOS

> [!warning]
> To be #expanded

## Software/Hardware Co-Design

Developing embedded software for boards must take into account that the hardware is not always available when software development takes place.

> [!warning]
> To be #expanded

## Bootloader

A bootloader is a program whose sole function is to fetch a piece of executable code, typically application code, from some location or through some data interface (for example, a serial interface) and place it where the microprocessor can find and run it. The bootloader is invoked during the power-on sequence of the processor, and it usually loads and executes the same application code, over and over. Under special circumstances, the bootloader might be commanded to load another image from some other location or to take a new image through some interface. Mind that bootloaders can also create some headaches.
For example, if something odd happens while the bootloader is loading an application into memory and the image gets corrupted, the application will never execute because it is only partially present. This is typically, and somewhat colloquially, called "bricking" the device, because it becomes as useful as a brick. You can brick your smart TV, your Wi-Fi router, or your phone while updating firmware. The key is to ensure the bootloader can always recover from a failed flashing procedure. It goes without saying that bootloaders should not be able to overwrite themselves, at least not their most critical part (there can be a chain of bootloaders, one calling the next).

## Board Support Package (BSP)

> [!warning]
> This section is under #development

## Embedded Software Integration

> [!warning]
> This section is under #development

## Integration to Host System

> [!warning]
> This section is under #development

[^73]: https://resources.altium.com/p/fr4
[^74]: Note that in the context of electronics design, OEM (Original Equipment Manufacturer) refers to bare boards sold "as is" to be integrated into bigger designs.
[^75]: A development kit (or dev kit) is a generic board made by the chip manufacturer whose main goal is to provide the user with familiarity with the device and easy access to its resources.
[^76]: These are specified in a set of standards such as IEC 60617 and
[^77]: https://www.fedevel.com/courses
[^78]: https://docs.xilinx.com/r/en-US/ug583-ultrascale-pcb-design/UltraScale-Architecture-PCB-Design-User-Guide
[^79]: https://community.nxp.com/pwmxy87654/attachments/pwmxy87654/qoriq-grl/9543/1/High_speed_interfaces_layout_spraar7f.pdf
[^80]: https://www.intel.com/content/www/us/en/support/programmable/support-resources/design-guidance/board-developer.html

# Bring-Up Process

PCBA houses will most likely not apply any power to the boards they assemble, nor run any "live" tests on a customer's board, principally because they lack the information needed to power things up safely, and often even the means (such as a harness) for doing so. Usually, the assembly house delivers the assembled board to the customer, and it is up to the board's owner to perform the first-ever bring-up. In a nutshell, performing a board bring-up means doing what other engineering disciplines more formally call "commissioning": a set of activities performed on a system to bring it from a non-working state into a working state.

It is important to note that we are discussing the first-ever bring-up of a board that has been recently designed and is landing in the designer's hands for the first time. Once the design is mature and enough iterations have passed (complex boards are seldom right the first time), the bring-up process can be automated to some extent. Needless to say, there are different levels of commissioning in a digital system: from board-level bring-up, to backplane and plug-in board bring-up, to unit, rack, and system-level commissioning. In this section we will strictly cover board-level commissioning, assuming a single board that contains all components, with no computer-on-module or mezzanine on top.
We are also assuming the board contains at least one programmable device such as a processor, a microcontroller, an FPGA, or a System-on-Chip.

The first-ever bring-up process involves the people who designed the board, for they have a level of insight about it that cannot easily be found in anyone who has not been part of the schematic capture and layout activity. Although handing the first bring-up to someone who did not design the board might sound attractive under the pretense of "independence", it tends to make the process longer and will inevitably require the designers' time and attention anyway, so it is highly advisable to make the designer or designers responsible for the bring-up. The bring-up process is typically the sequence of steps described next.

## Initial Inspection

Before powering up, the PCBA is visually inspected for manufacturing defects such as short circuits, missing or misplaced components, or soldering issues. This step can also involve tools like a magnifying glass or special microscopes^[https://www.visioneng.us/products/stereo-microscopes/mantis/].

## Power Tree Verification

More often than not, powering up a new board results in smoke. To avoid this, and for boards with complex power trees, the different voltage rails on the PCBA must be tested one by one. This involves applying power to the different branches of the power tree and measuring the voltage levels at various points to ensure not only that they match the design specifications, but also that they are stable and regulate properly. Ideally, the voltage rails are first brought up without loads, and they are incrementally loaded as the bring-up matures and the tester gains confidence that things work as expected.
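The rail-by-rail checks above can be captured in a simple pass/fail rule: compare each measured voltage against its nominal value and tolerance band. A minimal sketch in C, where the rail structure and all names and values are made up for illustration:

```c
#include <math.h>
#include <stdbool.h>

/* One entry per branch of the power tree. The fields and the
 * example values used below are illustrative, not taken from
 * any particular design. */
struct rail {
    const char *name;
    double nominal_v;   /* design-specified voltage */
    double tol_pct;     /* allowed deviation, in percent */
};

/* True when a measured voltage sits within the rail's tolerance band. */
bool rail_ok(const struct rail *r, double measured_v)
{
    double limit = r->nominal_v * r->tol_pct / 100.0;
    return fabs(measured_v - r->nominal_v) <= limit;
}
```

During bring-up, a table of such entries can be walked rail by rail, powering each branch, measuring at the test points, and logging pass/fail before any load is attached.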
## Firmware/Software Loading

Because the PCBA most likely includes one or more programmable devices (a processor, microcontroller, SoC, or FPGA), the next step is to verify that the board can load firmware (or multiple firmware images if there is more than one processor on board) and/or bitstreams into programmable logic. This might involve verifying JTAG interfaces and signals from the connector to the device (if the package allows it; BGAs, for example, will not), or other programming interfaces. If the programming interface works, a typical next step is to flash a [[Printed Circuit Boards#Bootloader|bootloader]]. Problems loading software may indicate issues in the processors/FPGAs/microcontrollers with clocks, JTAG emulation (if present), or missing or shorted components such as crystals.

## Zone Testing

PCBs are typically divided into subsystems or functional zones (communication interfaces, memory, input/output, sensing, and so on). Each subsystem is tested individually to validate its operation. This might require loading specific test firmware that writes test patterns to memories or transfers data through interfaces for verification purposes.

## Breakout Boards

In some PCBA types, access to signals might be tricky. For instance, when the PCBA is a plug-in module meant to be connected to a backplane, we might need to expose the signals destined for the backplane for testing before the real backplane exists, since it may follow a different manufacturing schedule. For this, we may need to design supporting PCBs such as breakout boards: test boards designed strictly to support the bring-up of a fellow board. Another case for using a breakout is when two boards need to communicate; adding a breakout between them can give better physical access to the signals for oscilloscopes or logic analyzers.
Another case is for systems that are too small and densely packed to be easily accessible during development, for example small consumer electronics such as smart watches (see figure below).

![](wrist.jpeg)

> [!Figure]
> Apple Watch S2 development board. Credit: User dosdude1 in Twitter(X) https://twitter.com/dosdude1

## Interface and Connectivity Testing

If the PCB is designed to communicate with other systems (such as sensors, displays, or external processors), these interfaces are tested for proper connectivity and data exchange. If the board includes [[High-Speed Standard Serial Interfaces|high-speed interfaces]], this is where eye patterns must be measured, along with bit error rates (BER), jitter, and the like.

## Performance Testing

This step involves testing the PCBA under various operational conditions to ensure it meets performance requirements, including different temperatures, voltages, and loads. It may also include speed tests, which will require eye-pattern measurements on different channels, and time-domain reflectometry (TDR).

## Environmental and Stress Testing

The PCB may undergo testing in environmental chambers that simulate extreme temperatures, humidity, and other stress factors it might encounter in its operational environment. Some of the tests listed above might be repeated inside the chamber.

## Final Verification and Documentation

Once all tests pass, a final verification ensures the PCB meets all design specifications. Documenting the testing process, the issues found, and how they were resolved is also important. Failed boards are sent for deeper debugging and rework, depending on the issue found. Note that the bring-up process feeds considerable information back to the designers, since some of the issues found might relate to the manufacturability or testability of the design.
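To give a sense of how long a BER measurement takes: a standard rule of thumb says that demonstrating a target BER at roughly 95% confidence with zero observed errors requires about 3/BER error-free bits, since (1 - p)^n <= 0.05 implies n >= -ln(0.05)/p, which is approximately 3/p. A minimal sketch (function names are ours, not from any test-equipment API):

```c
/* Bits that must pass error-free to claim the link meets ber_target
 * at ~95% confidence with zero observed errors: n >= 3 / BER. */
double bits_for_95pct_confidence(double ber_target)
{
    return 3.0 / ber_target;
}

/* Test duration in seconds at a given line rate in bits per second. */
double ber_test_seconds(double ber_target, double line_rate_bps)
{
    return bits_for_95pct_confidence(ber_target) / line_rate_bps;
}
```

For example, confirming a BER of 1e-12 on a 10 Gb/s lane requires about 3e12 bits, i.e. roughly 300 seconds of error-free traffic; tighter BER targets or lower line rates lengthen the test proportionally.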
## Iterative Improvements

As said, based on the findings from the bring-up process, the PCB design might be revised and improved for better performance or to fix issues. Needless to say, issues found during bring-up must be documented so they are taken into account in the new iteration. The process is then repeated for the next iteration.