The automation of design

Physical and biological design are about to get much more digital, says Autodesk’s CTO.


A titanium chair designed through iterative generation and optimization by Autodesk software. Photo courtesy of Autodesk and The Living.

One of the core ideas behind our Solid Conference is that software can replace physical complexity, and that it’s getting easier for it to do so because the relationship between the physical and virtual worlds is becoming more fluid. Input tools like 3D scanners and computer vision software, and output tools like CNC machines and 3D printers, are essentially translators between digital and physical. They make it possible to extract information from physical objects, compute on it, transform it, combine it with other data, and then “rematerialize” it.

I recently spoke with Autodesk CTO Jeff Kowalski about this convergence between physical and digital, and its impact on design. In his view, computers are about to go from mere drafting tables to full partners in the design process. They’ll automate the tedious cycle of trial and error, and leave designers to guide aesthetics and experience. “Decades ago, someone came up with the term ‘computer-aided design,’ but what we’ve had up to now is really computer-aided documentation,” he says. “Design has been accomplished solely in the head of the designer, and then the computer is used to document the outcome.”

In Kowalski’s telling, we’re approaching the final stage of a decades-long process that’s separated information from its physical embodiment. First, pure information — newspapers, movies, music — went digital. Then more abstract network relationships and communications went digital. Services like Airbnb and Uber have broken certain kinds of transactions and relationships free of human intermediaries, and now the information contained in physical objects is undergoing a similar transformation.

“The last thing to get really digitized is the design that is inherent in the physical objects that surround us,” says Kowalski. “There is information embodied in every physical thing around us, and we’re just on this very exciting threshold of making all of those things computable.”

That will transform the process of physical design, which involves repeated translation: an object starts as an idea, becomes a digital design, and then becomes a physical prototype. Refining the prototype means carrying it back to ideation, making changes to the design, and then producing a new prototype. Each iteration entails a trial-and-error process of testing against constraints, imagining solutions, and looking for ways to execute them.

That process is slow, and as it progresses, new constraints get introduced. Perhaps a manufacturer isn’t able to produce a desired shape at a workable cost, or the product ends up too heavy to ship. “Design right now, as it’s been put on the earth — whether it’s buildings or cars or hardware — is pretty much the first thing that worked, as opposed to the best one that could be found,” says Kowalski.

If designers were able to keep the development process digital, as software developers can, the process could run faster — no need to constantly translate between digital and physical — and parts of it could be automated and optimized.

Autodesk’s automated design software works through three layers. Generators populate physical space with matter, creating thousands of design suggestions in a split second. Analyzers test each design against constraints developed by the designer: does it fit in the desired shape? Does it satisfy strength, thermal, material, and mechanical requirements? Above those, an optimization system finds the best designs, returns them to the starting point, and refines them repeatedly. “What we’re offering is to use the computer as a way to explore options,” says Kowalski. “It’s getting us closer to the optimal designs, the ones we wish we had.”
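To make the three-layer loop concrete, here is a minimal sketch of a generate/analyze/optimize cycle. This is not Autodesk’s implementation — the "designs" are just hypothetical vectors of strut thicknesses, and the constraint and scoring functions stand in for real structural analysis — but the shape of the loop (generate candidates, filter against constraints, rank, refine the survivors, repeat) mirrors the architecture described above:

```python
import random

def generate(n_designs, n_params):
    """Generator layer: populate the design space with random candidates."""
    return [[random.uniform(1.0, 10.0) for _ in range(n_params)]
            for _ in range(n_designs)]

def satisfies_constraints(design, max_total=30.0):
    """Analyzer layer: reject designs that exceed a material budget."""
    return sum(design) <= max_total

def score(design):
    """Analyzer layer: stand-in for strength analysis (higher is better).
    The weakest strut limits the whole structure."""
    return min(design)

def mutate(design, scale=0.5):
    """Optimizer layer: perturb a surviving design to explore nearby options."""
    return [max(1.0, d + random.uniform(-scale, scale)) for d in design]

def optimize(generations=20, population=200, n_params=5, keep=10):
    """Run the full loop: generate, filter, rank, refine, repeat."""
    designs = generate(population, n_params)
    for _ in range(generations):
        feasible = [d for d in designs if satisfies_constraints(d)]
        feasible.sort(key=score, reverse=True)
        survivors = feasible[:keep]
        # Refill the population from mutated survivors plus fresh candidates.
        designs = survivors + [mutate(random.choice(survivors))
                               for _ in range(population - 2 * keep)]
        designs += generate(keep, n_params)
    # Return the best design that still meets the constraints.
    feasible = [d for d in designs if satisfies_constraints(d)]
    return max(feasible, key=score)

best = optimize()
```

In a real system the analyzers would be physics simulations (finite-element stress, thermal, manufacturability checks) and the optimizer far more sophisticated, but the division of labor — the computer exhausting the search space while the designer supplies the constraints — is the same.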

That optimization engine can go all the way down to individual toolpaths in the manufacturing process, thanks to closer linkages between software and industrial tools, so a designer might ask the software to minimize waste or balance cost against environmental impact from materials.

Rather than spend their time trying out different technical solutions, designers can summon an optimized solution and focus on aspects of design that are harder to quantify, like aesthetics and human experience. Kowalski thinks the change will make design much more broadly accessible. “Think of the human capital and monetary capital required to bootstrap these things,” he says. “To the extent we’re able to reduce or limit those, we make design more available to more interested parties.”

In the past, sophisticated designs were only available to big companies with a lot of resources; now, they’re accessible to small-scale entrepreneurs and people who don’t necessarily have design or technical expertise. “They don’t even need a computer; they just need a Chromebook,” says Kowalski, “and that follows exactly the trajectory of people who create music, videos, and so forth on the web.”

Once there’s a framework for encapsulating and refining physical designs in digital terms, the next step is biology, where the challenge is similar: experimentation in vitro is slower than experimentation in silico, so doing as much of the work as possible in digital terms speeds up the search for outcomes that meet constraints. A computer can develop and test a practically unlimited number of solutions across any number of factors that need to be optimized.

Biological processes can then become ways to translate between physical and virtual, with processes designed digitally and executed biologically. In order to, say, clean up an oil spill, a biological designer might specify a particular chemical transformation and then let software find the best way to bring it about. “In the same way that, when I start designing a chair, I don’t know if I’m targeting 3D printing or CNC milling, I can start designing a metabolic pathway without knowing if I’m targeting yeast or E. coli,” says Kowalski.

As far-fetched as these design mechanisms sound, they’re well-defined and accessible, and just require development work of the sort we’ve seen in earlier big efforts to build digital models of the physical world. “Twenty-five years ago, if you asked people whether a computer could help you navigate around the city, they’d have said that’s ridiculous — you can’t digitize and encapsulate all of those streets in a single system,” says Kowalski. “But all of that simply takes work, and it’s available to us now.”

This post is part of a collaboration between O’Reilly and Autodesk exploring the convergence of hardware and software. See our statement of editorial independence.

