bnew

Veteran
Joined
Nov 1, 2015
Messages
43,685
Reputation
7,332
Daps
133,105

If it can be designed on a computer, it can be built by robots​

Powerful new software rewrites the rules of mass production​

image: shira inbar
Aug 9th 2023 | FORT MILL, SOUTH CAROLINA, AND DEVENS, MASSACHUSETTS

In a factory on the Carolinas’ border, Stanley Black & Decker is assembling cordless electric drills. As part-finished drills travel in boxes along a conveyor belt, a robotic arm photographs and scans them for defects. Another robot nestles electric motors into the drills’ casings. A third one places and tightens screws. A single piece of software oversees the entire production line, which is capable of pumping out 130 cordless power tools every hour under the supervision of just seven humans. The assembly line it replaced in China needed up to 40 workers and rarely produced more than 100 an hour.

“Thirty years from now we will laugh at our generation of humans, putting products together by hand,” predicts Lior Susan, the boss of Bright Machines, a San Francisco-based company that installed the plant’s software. It is not that the design of the electric drills, or the various steps involved in making them, has changed. Rather, it is that the automated machines doing the work are driven by instructions encoded in software, in effect copied from the brains of the Chinese factory workers who previously did the job mostly by hand.

Making things this way resembles a model used by the semiconductor industry, where chips are designed using software that directly links to the automated hardware which fabricates them. For the Fort Mill plant, and other firms starting to employ such software-defined manufacturing systems, it promises to transform the factory of the future by allowing more-sophisticated products to be designed and put into production more quickly. All of which promises big cost savings.

Make this please​

To understand why, consider a simplified version of how a new power tool is made. A team of designers come up with a fresh feature, say a longer-lasting battery. They map out every element of the new product, from the battery compartment to the circuitry, that needs to be changed as a result. It is complex work, not least because a small change to one component can have a big impact on another, and so on.

The design is then “thrown over the wall” to the people responsible for making it. Sometimes that is a third-party factory, often in China. Engineers, designers and production staff exchange information and meet up, constantly tweaking the design in response to the various successes or failures involved in making a series of prototypes. Little things, such as a screw that cannot be tightened correctly because it is hard to reach with an electric screwdriver, might result in a return to the drawing board—which nowadays is mostly a computer-aided-design (CAD) program.

Eventually, all the kinks are ironed out (hopefully) and the new product is ready for production. The finer details of how all this was achieved, however, are likely to remain locked up in the minds of the workers assembling the prototypes. Humans are, after all, incredibly flexible and often come up with workarounds.

This process has been employed for decades, yet it is inherently uncertain and messy. Designers cannot predict with any confidence what the factory can or cannot easily accommodate. As a consequence, the design team may purposely leave some features a bit vague, and be put off innovative ideas for fear of being told they cannot be made or are impossibly costly.

When the hardware is controlled by software, rather than by humans, all this changes. Designers can dream up new products with far greater certainty that they are manufacturable. This is because the constraints of the production line—even fiddly details like the positioning of screws—are encoded in their CAD programs. Those programs, in turn, are directly connected to the software which controls the machines in the factory. So, if a design works in a digital simulation, there is a good chance it will also “run” on the production line.
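The idea of encoding a production line's constraints into design software can be sketched in a few lines. This is a minimal illustration, not Bright Machines' actual system: the rule values, field names and the `is_manufacturable` function are all hypothetical.

```python
# Hypothetical production-line rules, encoded as data the design
# software can check against before anything reaches the factory.
MIN_TOOL_CLEARANCE_MM = 12.0   # room an electric screwdriver needs
MAX_SCREWS_PER_STATION = 6     # what one fastening robot can handle

def is_manufacturable(design):
    """Check a design dict against the line's rules.

    Returns (ok, problems): ok is True when no rule is violated,
    problems lists each violation in plain English.
    """
    problems = []
    for screw in design["screws"]:
        if screw["clearance_mm"] < MIN_TOOL_CLEARANCE_MM:
            problems.append(
                f"screw {screw['id']}: only {screw['clearance_mm']} mm of tool clearance"
            )
    if len(design["screws"]) > MAX_SCREWS_PER_STATION:
        problems.append("too many screws for one fastening station")
    return (len(problems) == 0, problems)

# A toy drill design: one screw is too hard for the robot to reach.
drill = {"screws": [{"id": "S1", "clearance_mm": 15.0},
                    {"id": "S2", "clearance_mm": 8.5}]}
ok, problems = is_manufacturable(drill)
```

Because checks like this run inside the simulation, the "return to the drawing board" happens before a single prototype is built.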

An illustration of two robot hands with a conveyor belt and laptop between them.

This tight integration of manufacturing hardware and CAD software has been a boon in semiconductor manufacturing, where vast machines etch circuits into silicon just a few nanometres (billionths of a metre) wide. Chip designers with firms such as Apple, Nvidia or Qualcomm use specialised programs, largely produced by two companies, Cadence and Synopsys, to sketch out circuits. The design files are then sent directly to silicon foundries, such as TSMC, in Taiwan, for production.

“Until the advent of those tools, people were laying out integrated circuits by hand,” says Willy Shih of Harvard Business School. Mr Shih imagines the impossibility of attempting to do that today with, for instance, Apple’s M1 Ultra chip, which contains 114bn transistors. Producing such complexity is only possible in a system where software allows humans to ignore the detail and focus on function.

Stanley Black & Decker has not yet turned its CAD tools loose on Bright Machines’ system to design new products. But the idea is that it soon will. “What Cadence and Synopsys did to semiconductors is what we will do to product design,” says Bright Machines’ Mr Susan.

Layer by layer​

Some companies have already started designing products this way. VulcanForms is a foundry, but one that makes metal components rather than chips. It operates out of a former aircraft hangar in northern Massachusetts, where its vast computer-controlled machines focus 100,000 watts of invisible laser light onto a bed of powdered metal. The powder melts and fuses into intricate patterns, layer by layer, until a component with dimensions specified to within a hundredth of a centimetre emerges. It could be part of the engine in a military drone, or a perfectly formed hip-replacement joint. This is a type of additive manufacturing, more popularly known as 3D printing. VulcanForms’ machines are driven by CAD software and can produce any metal component with a diameter up to about half a metre.
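The layer-by-layer arithmetic behind this kind of machine is simple to sketch. The 50-micrometre layer thickness below is an assumption (powder-bed machines typically use layers of a few tens of micrometres; VulcanForms does not publish its figure), and both function names are hypothetical.

```python
import math

LAYER_UM = 50  # assumed powder-layer thickness, in micrometres

def layers_needed(height_mm):
    """Powder layers required to build a part of the given height."""
    # Work in micrometres so the division stays exact for typical sizes.
    return math.ceil(height_mm * 1000 / LAYER_UM)

def bed_layers(part_heights_mm):
    """Layers to run when several parts are built from one powder bed.

    The recoater spreads a full layer across the whole bed each pass,
    so the build runs to the tallest part's layer count.
    """
    return max(layers_needed(h) for h in part_heights_mm)
```

A 50 mm part at this layer thickness needs 1,000 passes of the recoater, which is why build time is dominated by part height, not by how many parts share the bed.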

“When I became familiar with what VulcanForms was doing, I could see predictable patterns that mirrored some of the learning with semiconductors,” says Ray Stata, the founder of Analog Devices, an American chipmaker, and a member of the foundry’s board. In chipmaking, he says, the software linking designer and manufacturer has produced huge gains in efficiency and economies of scale.

VulcanForms uses software made by nTopology, which lets people without the specialised skills needed to operate its lasers design objects for production by the foundry. It can result in components with previously unmatched levels of performance, because they can be produced as complex geometric structures which are impossible to manufacture any other way, says John Hart, chief technology officer of VulcanForms. Objects can be created in high volumes, such as 1,000 spinal implants produced from a single powder bed. With additive manufacturing, products can also be produced in one go, as single components, rather than being assembled from individual parts. This reduces the amount of material required, as the parts tend to be lighter. It also cuts down on assembly costs.

Software-defined manufacturing has an impact on some of the big trade and political challenges faced by companies. For firms that are increasingly uncomfortable with relying on Chinese manufacturers, it can make reshoring production a more viable option. Mr Susan puts it in martial terms: “Manufacturing is a weapon. When we give design files to China, we give the source code of that weapon to our enemy.”

There will be implications for manufacturing jobs. Although automation usually means a reduction in the number of people assembling things on the shop floor, it also creates some jobs. Technicians are required to program and maintain production systems, and in offices successful companies are likely to boost the numbers working in design, marketing and sales. These jobs, though, require different skills so retraining will be necessary.

Mr Shih also notes that factories themselves, not just the machine tools and processes within them, are coming under the thrall of software. He cites Tecnomatix, a subsidiary of Siemens, a German industrial giant, whose software lets designers lay out an entire factory so that the making of new products can be simulated in a virtual environment, known as a digital twin, before manufacture begins in its physical counterpart.

If the future of manufacturing is following semiconductors, then there is still some way to go. Producing mechanical objects is not the same as etching elaborate circuits that have no moving parts. For a start, things are far less standardised, with components having all sorts of end uses. “We’re just at the beginning with mechanical structures,” says Mr Stata. “The whole process of putting materials together in an additive method is in its very early stages. The flexibility and possibility that opens up is mind-boggling.”

Yet some of the implications are becoming apparent. Products could reach a level of performance and precision which is simply unachievable when their production is limited by human hands. Laying out a factory floor in two dimensions to accommodate human workers will become a thing of the past. Factories designed by software will be denser, much more complex three-dimensional places, full of clusters of highly productive, highly automated machinery.

These factories of the future may be almost deserted places, attended to by a handful of technicians. But with software also taking care of the intricacies of production, they will be easier to use by people developing and designing new products. That should free their imaginations to soar to new levels.
 


Instant Evolution: AI Designs New Robot from Scratch in Seconds​

First AI capable of intelligently designing new robots that work in the real world​


Video: Northwestern Engineering’s Sam Kriegman reveals an "instant-evolution" algorithm, the first AI program capable of designing new robots that work in the real world.

Oct 3, 2023
Amanda Morris

A team led by Northwestern Engineering researchers has developed the first artificial intelligence (AI) capable of intelligently designing robots from scratch.

To test the new AI, the researchers gave the system a simple prompt: Design a robot that can walk across a flat surface. While it took nature billions of years to evolve the first walking species, the new algorithm compressed evolution to lightning speed — designing a successfully walking robot in mere seconds.

But the AI program is not just fast. It also runs on a lightweight personal computer and designs wholly novel structures from scratch. This stands in sharp contrast to other AI systems, which often require energy-hungry supercomputers and colossally large datasets. And even after crunching all that data, those systems are tethered to the constraints of human creativity — only mimicking humans’ past works without an ability to generate new ideas.

The study was published Oct. 3 in the Proceedings of the National Academy of Sciences.

“We discovered a very fast AI-driven design algorithm that bypasses the traffic jams of evolution, without falling back on the bias of human designers,” said Northwestern’s Sam Kriegman, who led the work. “We told the AI that we wanted a robot that could walk across land. Then we simply pressed a button and presto! It generated a blueprint for a robot in the blink of an eye that looks nothing like any animal that has ever walked the earth. I call this process ‘instant evolution.’”

Kriegman is an assistant professor of computer science, mechanical engineering, and chemical and biological engineering at McCormick School of Engineering, where he is a member of the Center for Robotics and Biosystems. David Matthews, a scientist in Kriegman’s laboratory, is the paper’s first author. Kriegman and Matthews worked closely with co-authors Andrew Spielberg and Daniela Rus (Massachusetts Institute of Technology) and Josh Bongard (University of Vermont) for several years before their breakthrough discovery.

From xenobots to new organisms​

In early 2020, Kriegman garnered widespread media attention for developing xenobots, the first living robots made entirely from biological cells. Now, Kriegman and his team view their new AI as the next advance in their quest to explore the potential of artificial life. The robot itself is unassuming — small, squishy, and misshapen. And, for now, it is made of inorganic materials.

But Kriegman says it represents the first step in a new era of AI-designed tools that, like animals, can act directly on the world.
“When people look at this robot, they might see a useless gadget,” Kriegman said. “I see the birth of a brand-new organism.”

Zero to walking within seconds​

While the AI program can start with any prompt, Kriegman and his team began with a simple request to design a physical machine capable of walking on land. That’s where the researchers’ input ended and the AI took over.

The computer started with a block about the size of a bar of soap. It could jiggle but definitely not walk. Knowing that it had not yet achieved its goal, the AI quickly iterated on the design. With each iteration, the AI assessed its design, identified flaws, and whittled away at the simulated block to update its structure. Eventually, the simulated robot could bounce in place, then hop forward and then shuffle. Finally, after just nine tries, it generated a robot that could walk half its body length per second — about half the speed of an average human stride.
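The assess-tweak-keep loop described above can be sketched as a toy optimizer. To be clear about what is assumed: the actual study used gradient-based optimization through a differentiable physics simulation, whereas this sketch substitutes a random hill-climb over a made-up fitness function, and every name in it (`walking_speed`, `instant_evolution`) is hypothetical.

```python
import random

def walking_speed(body):
    # Toy stand-in for a physics simulation: rewards cells that are
    # partly carved away (values strictly between 0 and 1), loosely
    # mimicking the useful porosity the real AI discovered.
    return sum(v * (1 - v) for v in body) / len(body)

def instant_evolution(n_cells=8, iterations=9, seed=0):
    """Hill-climb sketch of the design loop: score a body, carve one
    cell, keep the change only if the simulated score improves."""
    rng = random.Random(seed)
    body = [1.0] * n_cells          # start as a solid block
    best = walking_speed(body)      # a solid block scores zero
    for _ in range(iterations):
        candidate = body[:]
        i = rng.randrange(n_cells)
        candidate[i] = rng.random()  # whittle material from one cell
        score = walking_speed(candidate)
        if score >= best:            # evolution with foresight: only
            body, best = candidate, score  # keep improvements
    return body, best

body, best = instant_evolution()
```

The key difference from natural evolution is the `if score >= best` line: the loop evaluates every change before committing to it, which is the "removed blindfold" Kriegman describes.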

The entire design process — from a shapeless block with zero movement to a full-on walking robot — took just 26 seconds on a laptop.
Holes

AI punched holes throughout the robot’s body in seemingly random places, and Kriegman hypothesizes that porosity removes weight and adds flexibility, enabling the robot to bend its legs for walking.
Muscles

The inside of the robot contains "air muscles," as shown on the left.
Molding the robot

Using the AI-designed blueprint, a 3D printer prints molds for the robots.
Holding the robot

Sam Kriegman holds one of the robots.
Pumping air

David Matthews pumps air into a robot, causing it to walk.

“Now anyone can watch evolution in action as AI generates better and better robot bodies in real time,” Kriegman said. “Evolving robots previously required weeks of trial and error on a supercomputer, and of course before any animals could run, swim, or fly around our world, there were billions upon billions of years of trial and error. This is because evolution has no foresight. It cannot see into the future to know if a specific mutation will be beneficial or catastrophic. We found a way to remove this blindfold, thereby compressing billions of years of evolution into an instant.”

Rediscovering legs​

All on its own, the AI surprisingly came up with the same solution for walking as nature: legs. But unlike nature’s decidedly symmetrical designs, the AI took a different approach. The resulting robot has three legs, fins along its back and a flat face, and is riddled with holes.

“It’s interesting because we didn’t tell the AI that a robot should have legs,” Kriegman said. “It rediscovered that legs are a good way to move around on land. Legged locomotion is, in fact, the most efficient form of terrestrial movement.”

To see if the simulated robot could work in real life, Kriegman and his team used the AI-designed robot as a blueprint. First, they 3D printed a mold of the negative space around the robot’s body. Then, they filled the mold with liquid silicone rubber and let it cure for a couple of hours. When the team popped the solidified silicone out of the mold, it was squishy and flexible.
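"Printing the negative space" is just a set inversion over the design's volume. A minimal 2-D sketch, assuming the body is stored as a voxel grid (the real pipeline works in 3-D, and `mold_from` is a hypothetical name):

```python
def mold_from(body, pad=1):
    """Invert a 2-D voxel grid: the mold is a padded bounding box
    minus the body (1 = solid, 0 = cavity). Silicone poured into the
    cavity cures into the body's shape."""
    rows, cols = len(body), len(body[0])
    # Start with a solid block one voxel larger on every side.
    mold = [[1] * (cols + 2 * pad) for _ in range(rows + 2 * pad)]
    # Hollow out each voxel the robot's body occupies.
    for r in range(rows):
        for c in range(cols):
            if body[r][c]:
                mold[r + pad][c + pad] = 0
    return mold

# A tiny made-up robot cross-section: 1 = body material.
robot = [[0, 1, 0],
         [1, 1, 1]]
mold = mold_from(robot)
```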

Now, it was time to see if the robot’s simulated behavior — walking — was retained in the physical world. The researchers filled the rubber robot body with air, making its three legs expand. When the air deflated from the robot’s body, the legs contracted. By continually pumping air into the robot, it repeatedly expanded then contracted — causing slow but steady locomotion.
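The expand-contract gait can be captured in a toy model. The asymmetry between the two strokes is what produces net motion; the specific fractions below are invented for illustration (the article says only that the robot covers about half a body length per second), and both function names are hypothetical.

```python
def pump_cycle(position, stride=0.5):
    """One inflate/deflate cycle, in body lengths.

    Inflation makes the legs expand and push the body forward;
    deflation contracts them and drags it back, but less far, so
    each cycle leaves a small net gain. Toy model: the real robot's
    gait emerges from its geometry, not from an explicit program.
    """
    position += stride * 0.8   # expansion stroke pushes forward
    position -= stride * 0.3   # contraction stroke slips back
    return position

def walk(cycles, stride=0.5):
    """Net distance, in body lengths, after repeated pumping."""
    pos = 0.0
    for _ in range(cycles):
        pos = pump_cycle(pos, stride)
    return pos
```

With these made-up fractions each cycle nets a quarter of a body length, so four pump cycles move the robot one body length: slow, but steady.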


Unfamiliar design​

While the evolution of legs makes sense, the holes are a curious addition. AI punched holes throughout the robot’s body in seemingly random places. Kriegman hypothesizes that porosity removes weight and adds flexibility, enabling the robot to bend its legs for walking.

“We don’t really know what these holes do, but we know that they are important,” he said. “Because when we take them away, the robot either can’t walk anymore or can’t walk as well.”

Overall, Kriegman is surprised and fascinated by the robot’s design, noting that most human-designed robots look like humans, dogs or hockey pucks.

“When humans design robots, we tend to design them to look like familiar objects,” Kriegman said. “But AI can create new possibilities and new paths forward that humans have never even considered. It could help us think and dream differently. And this might help us solve some of the most difficult problems we face.”

Potential future applications​

Although the AI’s first robot can do little more than shuffle forward, Kriegman imagines a world of possibilities for tools designed by the same program. Someday, similar robots might be able to navigate the rubble of a collapsed building, following thermal and vibrational signatures to search for trapped people and animals, or they might traverse sewer systems to diagnose problems, unclog pipes and repair damage. The AI also might be able to design nano-robots that enter the human body and steer through the blood stream to unclog arteries, diagnose illnesses or kill cancer cells.

“The only thing standing in our way of these new tools and therapies is that we have no idea how to design them,” Kriegman said. “Lucky for us, AI has ideas of its own.”
 