Car Buyers, Choose Your Engines: How Assembly Lines Are Gearing Up for Made-to-Order Vehicles


Startup arculus is pioneering modular vehicle assembly using autonomous machines.

Buying a car isn’t as simple as it used to be.

Beyond cloth or leather seats and deciding on a color, purchasers face a myriad of choices, from trim lines and engine types to various kinds of safety and driver assistance systems.

By the time you’re ready to put money down, you’ve created a vehicle that’s as unique as you are. But the conventional assembly line wasn’t designed to handle so many variations — and production efficiency suffers as a result.

arculus, a Germany-based startup and member of the NVIDIA Inception program, is tackling this by pioneering a modular production model. At the heart of it are driverless transport systems, powered by the NVIDIA Jetson TX2 supercomputer on a module for AI at the edge, that can navigate independently between vehicle production stations.

Replacing the Traditional Assembly Line

Built-to-order vehicles may improve customer satisfaction, but they pose production challenges.

In serial assembly lines, every vehicle visits every stop along the way, no matter its specific configuration. This results in delays and inefficiencies.

As consumer choices expand, the problem is only going to worsen — especially with an increasing number of cars being kitted out with autonomous technology. According to the Boston Consulting Group, around one-quarter of cars in the U.S. will make use of automated driving technologies by 2030.

Manufacturers that convert to a modular approach can increase efficiency by at least 17 percent, according to arculus. They’ll also reduce operational costs, leaving more to invest in new technologies and mobility concepts.

arculus’s modular model employs assembly stations served by autonomous machines. This makes the production line more flexible: each station is equipped to perform any number of tasks, and the machines don’t have to form an orderly queue or stop at every station.

Instead, they can skip stations that are irrelevant to the car they’re transporting. And, if one station is busy, they can simply bypass it and head to one that is free.

A central control platform coordinates these machines, which the company calls arculees. The platform uses AI to find the most efficient route for each vehicle through the production chain.
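To make the routing idea concrete, here is a minimal Python sketch of how a central controller might send a vehicle only to the stations its build actually requires, bypassing busy ones. The station names, task lists and queue lengths are invented for illustration; this is a sketch of the concept, not arculus’s actual control logic.

```python
from dataclasses import dataclass
from typing import List, Set

@dataclass
class Station:
    name: str
    tasks: Set[str]           # tasks this station can perform
    queue_length: int = 0     # vehicles currently waiting here

def plan_route(required_tasks: List[str], stations: List[Station]) -> List[str]:
    """For each task the build needs, pick a capable station with the shortest
    queue, skipping stations that are irrelevant to this vehicle."""
    route = []
    for task in required_tasks:
        candidates = [s for s in stations if task in s.tasks]
        if not candidates:
            raise ValueError(f"no station can perform {task!r}")
        best = min(candidates, key=lambda s: s.queue_length)
        best.queue_length += 1        # reserve a slot at the chosen station
        route.append(best.name)
    return route

if __name__ == "__main__":
    floor = [
        Station("chassis-A", {"marry_powertrain"}, queue_length=2),
        Station("chassis-B", {"marry_powertrain"}),
        Station("adas",      {"fit_sensors"}, queue_length=1),
        Station("interior",  {"fit_leather", "fit_cloth"}),
    ]
    # A build that needs a powertrain, ADAS sensors and leather seats:
    print(plan_route(["marry_powertrain", "fit_sensors", "fit_leather"], floor))
    # -> ['chassis-B', 'adas', 'interior']  (the busy chassis-A is bypassed)
```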

Autonomous “arculees” travel between the modular production islands using object detection algorithms.

Using object detection algorithms, each member of the autonomous fleet follows this route to its target station. The integrated Jetson TX2 module processes video data captured by the machine’s network of cameras, as well as lidar, encoder and IMU data, in real time, enabling an arculee to recognize any obstacles in its path and navigate safely around them.
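A per-frame control loop of this kind might look roughly like the sketch below. The sensor readers and the detector are stubs standing in for the camera, lidar and detection models that actually run on the Jetson TX2, and the stop distance and field of view are assumed values chosen for the example.

```python
import math
from typing import List, Tuple

STOP_DISTANCE_M = 1.0   # assumed safety margin in front of the arculee

def read_lidar_scan() -> List[Tuple[float, float]]:
    """Stand-in for a lidar driver: returns (angle_rad, range_m) pairs."""
    return [(math.radians(a), 5.0) for a in range(-90, 91, 5)]

def detect_objects(frame) -> List[dict]:
    """Stand-in for a camera-based object detector returning bearing estimates."""
    return []   # e.g. [{"label": "person", "bearing_rad": 0.1}]

def obstacle_ahead(scan, detections, half_fov_rad=math.radians(30)) -> bool:
    """Fuse lidar ranges with detector bearings: flag anything that is close
    and roughly in the direction of travel."""
    for angle, dist in scan:
        if abs(angle) < half_fov_rad and dist < STOP_DISTANCE_M:
            return True
    return any(abs(d["bearing_rad"]) < half_fov_rad for d in detections)

def control_step(camera_frame) -> str:
    scan = read_lidar_scan()
    detections = detect_objects(camera_frame)
    return "stop_and_replan" if obstacle_ahead(scan, detections) else "follow_route"

if __name__ == "__main__":
    print(control_step(camera_frame=None))   # -> "follow_route" with the stub data
```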

“One of the main challenges we faced when developing our autonomous machines was ensuring that they had enough computing power to implement the decisions made by the central control platform,” explains Max Stähr, CTO Robotics at arculus. “We chose to equip our arculees with NVIDIA Jetson TX2 modules due to their ability to process significantly more data than industrial PCs, without consuming tons of energy. Plus, the ability to scale is of huge benefit to us.”

Next in Line

arculus is already working on developing the next stage of its modular production model.

By moving to NVIDIA Jetson AGX Xavier, the AI platform for autonomous machines, the company is training its systems to perform simultaneous localization and mapping (SLAM).

This enables the machines to identify and avoid obstacles as well as create 2D and 3D models of their environment from data supplied by stereo cameras and image processing systems.
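As a rough picture of what the mapping side of SLAM produces, the toy example below converts a handful of range readings into occupied cells of a 2D grid map. The grid size, resolution and robot pose are arbitrary assumptions, and a real SLAM pipeline would also estimate the arculee’s pose rather than take it as given.

```python
import math

RESOLUTION_M = 0.1           # each grid cell covers 10 cm x 10 cm
GRID_SIZE = 40               # a 4 m x 4 m map
grid = [[0 for _ in range(GRID_SIZE)] for _ in range(GRID_SIZE)]

def mark_hit(robot_x, robot_y, bearing_rad, range_m):
    """Convert one range reading to map coordinates and mark that cell occupied."""
    hit_x = robot_x + range_m * math.cos(bearing_rad)
    hit_y = robot_y + range_m * math.sin(bearing_rad)
    col = int(hit_x / RESOLUTION_M)
    row = int(hit_y / RESOLUTION_M)
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        grid[row][col] = 1

# Robot assumed at the map centre, sweeping a few readings from a stubbed depth sensor:
for bearing_deg, rng in [(-30, 1.2), (0, 0.8), (30, 1.5)]:
    mark_hit(2.0, 2.0, math.radians(bearing_deg), rng)

print(sum(sum(row) for row in grid), "occupied cells")   # -> 3 occupied cells
```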
