The Computational Design Era: Engineering at the speed of light
Written by Bradley Rothenberg | CEO and Founder at nTop
Published on June 21, 2024
And, of course, the world today is changing much faster than it did in 500 BC. Among the many changes, a few have a significant impact on engineering and engineering software:
Computing today is more powerful and less costly than ever before, and technologies like cloud computing are democratizing access to this power. Most CAD/CAE/CAM software was developed in the 1980s, before accessible, massively parallel compute. Thousands of compute cores are now accessible beneath our fingertips1 in enormously powerful GPUs. The end of Moore’s law sparked the scale-up of new GPU architectures and opened paths for new engineering software tools.
AI & Machine Learning is enabling humans to extend what's possible. Applying AI-based systems to product design will enable engineering teams to accelerate innovation by generating better-performing product designs, providing more realistic predictions and performance simulations, and ensuring these designs can be manufactured economically.
Digital Manufacturing is industrializing, leading to orders-of-magnitude gains in the performance of manufactured parts that can be sourced faster and more economically than ever. nTop customers are now designing everything from FDA-certified 3D-printed medical parts to FAA process-qualified, safety-critical metal 3D-printed aircraft parts. Additionally, traditional manufacturing processes are becoming more digital, automated, and connected, further decreasing costs and increasing throughput2.
These changes are fueling a new paradigm of engineering tools in which human-computer collaboration, enabled by cloud + GPUs + AI/ML, will fundamentally restructure the engineering world and create new winners.
Paradigms of engineering tools
Engineering is an iterative process of exploring a design space and looking for a solution that’s optimal3. Just as getting more high-quality shots for your best players in basketball tends to lead to better outcomes, empowering top engineers to iterate more leads to better results – so improving the number and pace of iterations is critical for delivering better designs and more innovation.
The changes in engineering over the last few decades can loosely be categorized into four eras. Each era marks a step change in the speed of iteration, enabling new types of components and systems that were not previously possible.
The Paper Era: The engineer makes a drawing, on paper. In rare cases, some computer-based analysis is performed. The engineer studies the results of the analysis, and decides whether or not the design needs to change. If so, the drawing will need to be modified or recreated. Iteration happens at the speed of a pen and an eraser.
The 1980s CAD Era: The process here is essentially the same as in the paper era, except that the drawing is now a digital CAD file. You draw lines by clicking with a mouse instead of using a pen, and the lines are green instead of black. Iteration happens at the speed of the mouse – fast compared to redrawing an entire design on paper.
The Parametric Era: CAD models developed bits of intelligence in their construction – with Pro/ENGINEER, PTC introduced the “replay” capability, allowing CAD models to sequentially update themselves when design changes are made. To modify a design, you just change a few numbers4 and wait for the model to rebuild sequentially, which typically takes several minutes to hours. If a feature fails to rebuild, which is common, you’ll need to spend time figuring out why and finding a workaround. Analysis – via Finite Element Analysis (FEA) and/or Computational Fluid Dynamics (CFD) – happens after a given design is generated and might take several hours to days. Iteration now happens at the speed of sequentially rebuilding a CAD model (or at the pace of the FEA/CFD analysis included in the loop) – exponentially faster than having to mouse-click every line. Simple parametric parts, like bolts, valves, etc., can be re-used from design to design, though most new designs start from a blank CAD file.
The Computational Design (CD) Era: In this emerging era, the design itself is a program that captures an entire design space5 and compute is used to explore that space. While it’s an engineer that defines design intent and constructs this program, it’s a computer that’s deployed to explore the best input parameters to achieve the optimal outcome. Analysis, now baked into the program itself, drives key parameters while utilizing simulation + AI/ML to further accelerate the process. Iteration happens literally at the speed of electrons flowing through chips6 – exponentially faster than sequential CAD model rebuilds. Because design intent is baked into the program, computational models are easily version-controlled, reusable, and extensible to solve similar design problems.
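To make the idea concrete, here is a minimal sketch in Python of what “the design is a program” can look like. The function name and the formulas inside it are invented stand-ins, not nTop’s API; the point is that each design-space point is just a function evaluation, which a computer can perform as often as it likes.

```python
# A toy illustration (not nTop's API): the "design" is a program that maps
# input parameters to measured properties, so a computer can evaluate any
# point in the design space on demand.

def bracket_design(thickness_mm: float, rib_count: int) -> dict:
    """Hypothetical parametric bracket: returns derived properties of one
    design-space point instead of a single frozen geometry."""
    # Stand-in formulas for mass and stiffness; a real model would call
    # geometry and simulation routines here.
    mass = 120.0 * thickness_mm + 15.0 * rib_count                    # grams
    stiffness = 900.0 * thickness_mm ** 0.8 * (1 + 0.1 * rib_count)   # N/mm
    return {"mass_g": mass, "stiffness_n_per_mm": stiffness}

# Because the design is just a function, a computer can sweep it directly.
candidates = [bracket_design(t, r) for t in (1.0, 1.5, 2.0) for r in (2, 4, 6)]
print(min(candidates, key=lambda c: c["mass_g"]))
```

Because every candidate is just a function call, the same program can be handed to a sweep, an optimizer, or an ML pipeline without change.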
Lightspeed iteration enables engineers to incorporate manufacturability and process considerations earlier in design. In the end, this means engineers can find better solutions that balance production economics and quality – and dramatically reduce time-to-market.
The computational design era brings many of the benefits seen with increased iteration speed – the key question is whether current software architectures can effectively support computational design. What capabilities do those architectures need to provide?
Capabilities needed for Computational Design
Computational design requires rich computer models capable of responding intelligently to parameter changes, regardless of whether those changes come from an engineer or computer. The models must be:
Fast: Either an engineer or a computer is iterating to improve the design. In general, being fast enables more design iterations, which leads to better designs. Features like automation and real-time model updates promote faster iteration and enable engineers to visually assess design changes more quickly.
Flexible: The models need to have a high degree of morphability, so that they capture a large design space. In particular, they need to support broad changes in shape and topology as well as changes in material distribution and composition. In past decades, material composition could be regarded as constant within a part, but this has changed with the industrialization of composites and 3D printing.
Reliable: The iterations described above are based on modifying inputs and recomputing the model. If computations are unreliable, the model will fail to update7. For engineers, failure is annoying, because they have to diagnose the problem and hunt for workarounds. For a computer, failure kills a design exploration – in fact, more time goes into writing rules to prevent model failures than into building the model itself when trying to run automated design studies on top of CAD8.
Closed Loop: For a computational model to be useful for engineers, the physics must be integrated into the model, and the physics should be wired up to facilitate closed-loop optimization. For example, results from a structural analysis might indicate the stress value at each point in a design – these stress values form a scalar field that can be used to automatically modify the design, adding material to strengthen high-stress areas while removing unnecessary material from low-stress areas to reduce weight (see the sketch after this list).
Differentiable: Suppose an engineer or a program is trying to minimize some property of a design, like its manufacturing cost or its weight. To find a minimum, the “driver” (engineer or computer) needs to know which direction is “downhill”, and identifying downhill directions requires mathematical differentiation. Derivatives tell you how changes in parameters will affect the design and, in particular, which changes will improve it – so having derivatives makes the design search more effective and efficient.
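The last two capabilities can be shown together in a toy Python sketch. The “stress” and “mass” formulas below are invented stand-ins for real simulation results, and the loop is plain gradient descent with a finite-difference derivative rather than any particular nTop capability; it simply illustrates how a physics-driven objective plus a derivative lets a computer walk a design “downhill” on its own.

```python
# Toy closed-loop, derivative-driven sizing of a single wall thickness.
# The stress and mass formulas are invented stand-ins for real simulation
# results; the point is the loop structure, not the physics.

LOAD_N = 5000.0
ALLOWABLE_MPA = 200.0

def stress_mpa(t_mm: float) -> float:
    return LOAD_N / (10.0 * t_mm)          # stress falls as the wall thickens

def objective(t_mm: float) -> float:
    mass = 7.85e-3 * 10.0 * 100.0 * t_mm   # grams, steel wall 10 x 100 mm
    overstress = max(0.0, stress_mpa(t_mm) - ALLOWABLE_MPA)
    return mass + 50.0 * overstress        # penalize violating the stress limit

def d_objective(t_mm: float, h: float = 1e-6) -> float:
    return (objective(t_mm + h) - objective(t_mm - h)) / (2 * h)  # central difference

t = 1.0                                     # initial thickness, mm
for _ in range(200):
    t -= 1e-5 * d_objective(t)              # step "downhill" along the derivative
print(f"thickness {t:.2f} mm, stress {stress_mpa(t):.0f} MPa")
```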
Comparing Modeling Technologies for Computational Design
From first principles, computational design starts at the core: a shape model capable of robust, lightning-fast updates.
Boundary representations (B-reps) are the geometric modeling technology used by all mainstream CAD systems. As its name implies, a B-rep describes a shape by modeling its boundary – the outer skin of the part is wrapped in a collection of faces joined together by edges.
Though they continue to be the most widely used and successful representation for geometry, B-rep systems have some significant flaws that make them unsuitable for computational design. First, their architecture has remained mostly unchanged since the 1980s, so they are not suited to parallel computing, especially on GPUs. In a typical B-rep system, the GPU is relegated to rendering triangles spit out from a geometry kernel running single-threaded on one core of a CPU. More importantly, B-rep calculations can fail for a variety of reasons. Heroic efforts over the past four decades have improved reliability, but it’s still unsatisfactory, and it’s not likely to get much better.
In implicit modeling, the approach is completely different: the shape of an object is described by a mathematical function that returns the distance to the closest point on its surface. The function is constructed so that it is negative inside the object, positive outside, and zero on its surface, so it's called a Signed Distance Function (SDF).
A key step in any modeling algorithm is deciding whether a given point P is inside or outside an object. If you have an SDF, say F, this is easy – you just have to check the sign of F(P). And this calculation is obviously easy to parallelize, since the results for different points are independent of each other. This means that implicit modeling calculations are blazingly fast on GPUs, delivering real-time interaction even for the most complex shapes.
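As a concrete illustration, here is a small NumPy sketch using the textbook SDF for a sphere and a pointwise-minimum union. This is generic implicit-modeling math rather than nTop code, but it shows why the inside/outside test is embarrassingly parallel.

```python
import numpy as np

# A sphere of radius r centered at c, expressed as a signed distance function:
# negative inside, zero on the surface, positive outside.
def sphere_sdf(points: np.ndarray, center, radius: float) -> np.ndarray:
    return np.linalg.norm(points - np.asarray(center), axis=-1) - radius

# Booleans become simple per-point math: union is a pointwise minimum.
def union(d1: np.ndarray, d2: np.ndarray) -> np.ndarray:
    return np.minimum(d1, d2)

# Inside/outside queries for a million points are independent of one another,
# so they vectorize (and, on a GPU, parallelize) trivially.
pts = np.random.uniform(-2.0, 2.0, size=(1_000_000, 3))
d = union(sphere_sdf(pts, (0.0, 0.0, 0.0), 1.0),
          sphere_sdf(pts, (1.0, 0.0, 0.0), 1.0))
inside = d < 0.0
print(f"{inside.mean():.1%} of sampled points are inside the union")
```

Every one of the million distance evaluations is independent, which is exactly the kind of workload that maps well onto GPUs.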
Implicit modeling systems are also more reliable, because they can avoid doing the fragile types of calculations that lead to failures in B-rep systems.
The following table summarizes how well the two modeling technologies support the system capabilities we listed earlier:
Capability | B-reps | Implicit (SDFs)
Fast | No. Designed for 1980s computers; doesn’t take advantage of GPUs or multi-core CPUs. | Yes. Designed for and well suited to modern computers with multiple cores and powerful GPUs.
Flexible | Not really. Large topology changes cause trouble. No knowledge of interior of parts. | Yes. Support any topology or no topology. Explicit representation of interior space.
Reliable | Inherently fragile. Still fail too often, even after four decades of improvement efforts. | Geometric reliability achieved by avoiding the computations that make B-reps fragile.
Closed loop | No. The engineer studies analysis results, decides how to modify the design, and makes changes manually. | Yes. Analysis results (fields) can be used to directly modify the design, enabling automated optimization based on design goals defined by engineers.
Differentiable | Difficult. Little optimization guidance for human users or programs. | Yes. Just a mathematical formula, essentially, so well suited to automatic differentiation.
Additionally, integration with simulation is necessary to drive the design parameters of a computational model – thus a tight connection between the modeling technology and the simulation tools is necessary. Faster and more robust modeling technology enables a more tightly integrated geometry-and-analysis roundtrip workflow.
Computational Design at nTop
Based on the lightning speed at which nTop customers iterate through vast and complex design spaces, implicit models (using SDFs) are the right core tech for computational design. In fact, two nTop customers, Ocado and Siemens Energy, are pioneering this new approach in production. Ocado Technology has reduced the development time of its Series 600 Bot9 from months to weeks10, and Siemens Energy has built computational models for the design of advanced heat exchangers and components of its gas turbine engines, enabling the transition away from fossil fuels. These are early days – just the start of what’s possible in the computational design era.
We’ve learned a great deal from our customers over the last 5+ years of implicit modeling use – from the early days on site at the US Air Force's first metal 3D printing lab to visiting Sikorsky and seeing their computational models of safety-critical helicopter parts – and that experience has put our core modeling tech through the wringer. Over 400 companies using nTop, plus more than 25,000 students, have exported roughly 300,000 computational models in the last year alone.
We call our core modeling tech for nTop 5.0 “Sequoia” because its basic data structure is a tree – technically an “Abstract Semantic Graph” (ASG). Unlike traditional CAD modeling kernels such as Parasolid, ACIS, or CGM, a Sequoia model is a program, in the form of an ASG, that describes the relationships between the elements of a design. The model itself contains all possible designs, and evaluations can be executed in parallel to better understand different aspects of a design.
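To give a flavor of what “the model is a program” means in practice, here is a deliberately simplified, hypothetical sketch in Python of a tiny expression graph. It is not Sequoia’s data structure or API; it only illustrates how a graph of nodes plus a set of parameters can stand for every design in a design space at once.

```python
from dataclasses import dataclass
import math

# Hypothetical illustration only: a tiny expression graph whose nodes describe
# a design as a program over parameters, evaluated lazily at any point.
# This is not Sequoia's actual data structure or API.

@dataclass
class Sphere:
    radius_param: str
    def distance(self, x, y, z, params):
        return math.sqrt(x*x + y*y + z*z) - params[self.radius_param]

@dataclass
class Shell:
    child: object
    thickness_param: str
    def distance(self, x, y, z, params):
        d = self.child.distance(x, y, z, params)
        return abs(d) - params[self.thickness_param] / 2.0   # hollow wall around the child surface

# One graph captures every design reachable by changing the parameters.
model = Shell(Sphere("radius"), "wall")
print(model.distance(1.2, 0.0, 0.0, {"radius": 1.0, "wall": 0.1}))   # one variant
print(model.distance(1.2, 0.0, 0.0, {"radius": 1.5, "wall": 0.25}))  # another variant
```

Changing the parameter dictionary selects a different design without rebuilding anything, which is what makes this kind of model cheap to version, reuse, and evaluate in parallel.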
Having a programmatic representation of a design enables shape derivatives to be computed faster than with our previous-generation core modeling technology, and it is one key enabler for optimization and for integrating geometry into AI/ML pipelines.
Having the right model is foundational to computational design. It enables the human:computer collaboration necessary to engineer at the speed of light, but it’s not enough. The application stack built on top of the model is how engineers interface with the model to design components.
1. nTop is for engineers to run locally for interactive part design. Its purpose is to make it easy for engineers to build computational models capturing design spaces. This includes setting up geometric relationships, deploying geometric features to regions, etc.
2. nTop Automate executes nTop workflows at scale, with a range of inputs – it’s a robotic nTop wingman to help engineers understand massive and complex design spaces.
3. nTop Core is a lightweight library and format built for software developers at partner companies and researchers to bring nTop implicit models into their engineering tools. It’s already integrated into EOSPrint, enabling mesh-free, direct-to-print workflows on EOS metal & polymer 3D printers. Several incumbents in the CAD, CAE, and AM domains, including Autodesk, Hexagon, and Materialise, are in the process of building out product integrations.
This application stack enables engineers to collaborate with computers:
1. Engineers build computational models in nTop – rather than drafting manually, engineers create a model by assembling blocks and interacting with geometry to establish the parametric relationships that define a design space. This workflow is a “living document” – goals are defined via custom objective functions and user-defined constraints.
2. Computers explore the design space captured in a computational model: nTop “robots” (compute) run through variations of the design to help engineers understand what’s possible or to train a surrogate model.
3. Engineers collaborate with computers to find the optimal design by defining and measuring the fitness of the different design points – a minimal sketch of this loop follows below.
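The sketch below, in Python with invented stand-in metrics in place of a real computational model, shows the shape of that loop. The evaluate function and its formulas are hypothetical; the structure is the point: compute sweeps the space, an engineer-defined fitness ranks the results, and a cheap surrogate summarizes the trend.

```python
import numpy as np

# Hypothetical stand-in for running the real computational model on one
# design point (here parameterized by a single lattice cell size).
def evaluate(lattice_cell_mm: float) -> dict:
    mass = 50.0 / lattice_cell_mm                      # bigger cells -> lighter
    deflection = 0.02 * lattice_cell_mm ** 1.5         # bigger cells -> more flex
    return {"mass_g": mass, "deflection_mm": deflection}

# Engineer-defined goal: the lightest design that stays stiff enough.
def fitness(result: dict) -> float:
    penalty = 1e3 if result["deflection_mm"] > 0.2 else 0.0
    return result["mass_g"] + penalty

cells = np.linspace(1.0, 8.0, 50)                      # compute sweeps the variants
results = [evaluate(c) for c in cells]
best = min(zip(cells, results), key=lambda cr: fitness(cr[1]))
print(f"best cell size: {best[0]:.2f} mm -> {best[1]}")

# A cheap surrogate (here a quadratic fit of deflection vs. cell size) can
# stand in for the full model when exploring nearby designs.
coeffs = np.polyfit(cells, [r["deflection_mm"] for r in results], deg=2)
```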
Foundational implicit modeling tech, and the application infrastructure built around it, power engineer-to-computer collaboration, sparking lightning-fast design iteration in the computational design era.
What is needed to move into the CD era?
1. Foster an implicit ecosystem:
a. Implicit models must integrate seamlessly with existing tools. For people to adopt new tools more rapidly, those tools need to fit within their existing ways of working.
b. Make it easy and seamless for new tools to be built on implicits to foster an open ecosystem of new implicit-first engineering software: simulation and machining are two critical components of engineering workflows that nTop probably will not solve alone.
2. Implicits must be easy for engineers to use: operations need to be repeatable, measurable (defined by engineering coordinates), and robust – this is not “freeform” modeling, but precise and interactive definition of an engineering model.
3. Simulation / analysis should not be a bottleneck, whether in meshing or in solving. There is still a lot of work to be done here, and we are working with a number of partners, from industry leaders like ANSYS and Hexagon to emerging technologies like Intact Solutions and cloudfluid.
Summary
The CD Era is one of highly optimized designs generated with lightning speed, driven by human:computer collaboration (fueled by Compute/GPUs + AI/ML + Digital Manufacturing). It’s iteration at the speed of light, and implicits serve as the new foundation. In short, nTop is foundational tech that provides the application layer necessary to exponentially speed up engineering and manufacturing, bringing forth the era of computational design.
Special thanks to George Allen for his invaluable help in drafting this post.
Footnotes
1 Quite literally for me as I type this
2 e.g. El Segundo startup Rangeview for investment castings, Hadrian for CNC machined parts, Machina Labs for sheet metal forming
3 or, at least, one that’s feasible and “good enough”
4 parameters
5 A computational model captures all possible designs in a design space instead of just one version of a design
6 almost
7 The dreaded “replay error” of all parametric CAD tools
8 See Lockheed Martin Overview of the AFRL EXPEDITE Program by Clifton C. Davies
9 Ocado Series 600 YouTube video here
10 design iteration time from weeks to < a day – nTop Robots crank through hundreds of design options overnight to find the optimal solid model to make the Series 600 light and easy to assemble
Bradley Rothenberg
CEO and Founder at nTop
Bradley Rothenberg is the CEO and founder of nTop, an engineering design software company based in New York City. Since its founding in 2015, nTop has served the aerospace, automotive, medical, and consumer products industries with advanced engineering software that enables users to design, test, and iterate faster on highly complex parts for production with additive manufacturing. Bradley has been developing computational design tools for additive manufacturing for more than 15 years. He actively works to advance the industry, often speaking at industry events around the world including Develop3DLive, Talk3D and formnext. He is often quoted in trade publications, interviewed on industry podcasts, and has been included in Forbes Magazine. He studied architecture at Pratt Institute in Brooklyn, New York.