
How to use AI-powered machine vision to detect errors in manual assembly processes

Have you ever wondered how Artificial Intelligence (AI) is used for quality inspection? How it can help extract cycle times in real time and support the optimization of workstation utilization? Errors in manual assembly tasks, particularly those recognized late in the production line, not only affect product quality but also reduce a company’s profit margins through scrap and repair rates. It therefore makes sense to detect these errors as early as possible, with a process that does not rely on manual supervision, in order to reduce scrap rates and optimize the assembly process. Let’s take a typical packaging process in a factory, such as the one at the Digital Capability Center (DCC) in Aachen. Here, wristbands are produced and equipped with an RFID chip (see Figure 1).

Figure 1: ANTICIPATE's AI-powered inspection of the manual assembly process at the Digital Capability Center Aachen

Each assembly step is made up of specific, chronological hand motions. The packaging process runs through the following steps:

  1. The operator takes the folded shipping box and unfolds it on the table

  2. They then take a new wristband and scan the RFID chip

  3. The quality of the wristband is inspected

  4. They proceed to pack it into a single paper box and drop this into a shipping box

  5. The next step involves printing the shipping label and sticking it onto the box

  6. And finally, the shipping box is closed and handed over.

Each of these steps is highly error-prone because it is performed manually rather than by a machine. Having someone stand next to the operator to supervise every step would be a costly and complex endeavor. On top of this, errors at this stage occur so late in the production line that they lead to high costs.

This is where AI comes in handy. The ANTICIPATE team developed a tailored motion classification process for this assembly process. To collect the insights mentioned above, the solution had to automatically separate video segments, recognize hand coordinates and movements, and collect and visualize this data in a quality dashboard.

Figure 2: Motion classification process at the packaging station (hand movement detection)

How did they do this? First, they placed a camera above the operator, as shown in Figure 2, ensuring a complete view of the operator's hands. The video was split into individual frames, and a state-of-the-art machine learning model extracted the hand coordinates from each frame, turning the video into a continuous hand coordinate stream. Motion classes could then be detected from this coordinate stream. These motion-class labels can be used for all sorts of scenarios, such as real-time cycle time extraction, quality inspection, optimized station utilization and many more (drop us a message if this sparks your interest). A minimal sketch of the coordinate-extraction step is shown below.
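The post does not name the specific tools used, so the following sketch assumes OpenCV for frame capture and MediaPipe Hands as the hand-landmark model. Treat it as an illustration of the coordinate-extraction step, not the actual ANTICIPATE implementation.

```python
# Minimal sketch: turn a video into a hand coordinate stream.
# Assumptions: OpenCV reads the video, MediaPipe Hands provides the landmarks.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def extract_hand_coordinate_stream(video_path):
    """Yield (frame_index, [(x, y), ...]) with normalized hand landmark coordinates per frame."""
    cap = cv2.VideoCapture(video_path)
    with mp_hands.Hands(static_image_mode=False,
                        max_num_hands=2,
                        min_detection_confidence=0.5) as hands:
        frame_index = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV delivers BGR frames.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            coords = []
            if results.multi_hand_landmarks:
                for hand in results.multi_hand_landmarks:
                    # 21 normalized landmarks per detected hand.
                    coords.extend((lm.x, lm.y) for lm in hand.landmark)
            yield frame_index, coords
            frame_index += 1
    cap.release()
```

Downstream, a motion classifier would consume sliding windows of this coordinate stream and assign a motion-class label to each video segment.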


So what’s the result? Whenever the operator makes a mistake, the dashboard raises a real-time alert about the missing or otherwise faulty action. Using this solution, the process saw a cost reduction of 20% through early fault detection, and catching defects before shipment reduced the number of returns by 40%. Additionally, the improved worker guidance increased throughput by 15%. A simplified sketch of such a sequence check follows below.
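To illustrate how such alerts could be derived from the motion-class labels, here is a minimal sketch that compares one observed packaging cycle against the expected step order. The step names and the simple comparison logic are assumptions for illustration; the actual solution feeds its results into the quality dashboard rather than printing them.

```python
# Minimal sketch: flag missing or unexpected actions in one packaging cycle.
# The motion-class names below are illustrative, not the real label set.
EXPECTED_SEQUENCE = [
    "unfold_shipping_box",
    "scan_rfid_chip",
    "inspect_wristband",
    "pack_into_paper_box",
    "apply_shipping_label",
    "close_shipping_box",
]

def check_cycle(observed_labels):
    """Compare observed motion-class labels against the expected step order and return alerts."""
    alerts = []
    position = 0
    for label in observed_labels:
        if label in EXPECTED_SEQUENCE[position:]:
            next_position = EXPECTED_SEQUENCE.index(label, position)
            # Any expected step that was skipped over is reported as missing.
            for step in EXPECTED_SEQUENCE[position:next_position]:
                alerts.append(f"Missing action: {step}")
            position = next_position + 1
        else:
            alerts.append(f"Unexpected or repeated action: {label}")
    for step in EXPECTED_SEQUENCE[position:]:
        alerts.append(f"Missing action: {step}")
    return alerts

# Example: the operator forgot to scan the RFID chip.
print(check_cycle([
    "unfold_shipping_box",
    "inspect_wristband",
    "pack_into_paper_box",
    "apply_shipping_label",
    "close_shipping_box",
]))  # -> ['Missing action: scan_rfid_chip']
```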


All this was done within a week: in as little as five days, the team had trained the solution on the specific equipment, including the time required for the on-premise assembly of the hardware, the implementation of the Artificial Neural Network (ANN), and the video acquisition and labeling.


Keen to hear more? Recognize any of these problems as a typical pain point in your factory? We had a great time implementing this and are always looking for new opportunities to improve assembly processes. Reach out to us through the contact form on our homepage.
