

Using machine vision to sense falling objects

August 2007 Motion Control & Drives

Challenged by John Deere, V I Engineering designed and developed a machine vision system that counts and measures falling objects smaller than 1 mm at more than 450 objects per second. The system measures the time and XY position of each object as it falls through a sensing plane. Line scan cameras and back illumination units are used in the system, and special algorithms were developed to distinguish objects in a clump.

There is strong demand for systems that accurately measure the counts, times, and positions of objects falling at high speed and at high rates in industries that make ball bearings, chemical pellets, seeds, pharmaceuticals, and other products. Such systems serve both as tools for improving manufacturing processes and for quality control in these industries.

Previously developed techniques such as grease belt systems and LED/photodetector grids have been used to measure the distribution of falling objects. The limitation of the grease belt technique is that it does not take real-time measurements and requires extensive post-measurement processing. The LED/photodetector grid addresses these limitations by providing real-time, high-speed measurement, but it suffers from some critical shortcomings, such as poor spatial resolution, which restricts the technology to measuring objects larger than 4 mm. This technique is also unable to resolve multiple objects when they appear too close to each other as a clump.

A machine vision-based system using one line scan camera demonstrated better results than the grease belt and LED/photodetector grid methods, but the one-camera design did not solve the problem of distinguishing object clumps, that is, multiple objects so close together that they appear to be one object.

Vision system design

V I Engineering designed a machine vision system based on two line scan cameras and two linear backlighting units. One camera and one back illumination unit are centred on the region of interest, the area the objects fall through. Another camera and backlight pair is oriented perpendicular to the first and aligned so that the scan lines of both cameras and the centre lines of both backlights lie in the same plane.

System specifications and machine vision were improved through use of PXI Express technology

With the backlighting, each falling object appears as a black particle against a white background, regardless of the surface condition, brightness, and colour of the object. This means the vision algorithm does not need to be adjusted for the appearance of different objects. The two identical camera-and-backlight pairs are placed orthogonally so that, with a specially designed algorithm, they can measure the XY coordinates of the objects as they fall through the image plane. The image plane is a virtual plane constructed by the two cameras' sensor lines and the linear backlights when they are properly aligned. A purpose-designed alignment fixture is used to adjust each camera sensor line's position and angle so that the two line scan sensors and the two linear backlights lie in a single image plane.

The orthogonally oriented cameras provide another advantage over the one-camera configuration: they allow the software to distinguish clumped objects. When multiple objects are very close together spatially, they may appear as one single object in one camera image. From the perpendicular angle, however, the other camera most likely sees these objects as separate particles. With a specially designed algorithm, the system is able to match and identify all the objects in the two images and separate the clumped objects.
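
The pairing logic can be illustrated with a short sketch. This is not the production algorithm; the helper names, the simple run-length segmentation and the one-object-per-scan-line pairing are assumptions made for illustration only.

```python
# Illustrative sketch: pairing silhouettes from two orthogonally mounted,
# commonly triggered line scan cameras to recover XY coordinates.
# detect_dark_runs() and the data layout are assumptions, not the
# published system.

def detect_dark_runs(scan_line, threshold=128):
    """Return (centre_pixel, width_pixels) for each run of below-threshold pixels."""
    runs, start = [], None
    for i, value in enumerate(scan_line):
        if value < threshold and start is None:
            start = i
        elif value >= threshold and start is not None:
            runs.append(((start + i - 1) / 2.0, i - start))
            start = None
    if start is not None:
        runs.append(((start + len(scan_line) - 1) / 2.0, len(scan_line) - start))
    return runs

def pair_detections(runs_cam_x, runs_cam_y, mm_per_pixel=0.15):
    """Combine detections from the same scan line of both cameras.

    Camera X reports an object's position along the X axis; camera Y,
    mounted at 90 degrees, reports its position along Y.  With one object
    per scan line the pairing is direct; clumps are handled by the
    cross-checking step described later in the article.
    """
    coords = []
    for (px, _), (py, _) in zip(runs_cam_x, runs_cam_y):
        coords.append((px * mm_per_pixel, py * mm_per_pixel))
    return coords
```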

Two IEEE 1394 line scan cameras with a resolution of 1024 pixels were selected for this application. Matched with the two linear backlighting units, the cameras provide object resolution of better than 1 mm in a 150 mm x 150 mm image area where all the falling objects are measured. The bright backlighting creates a challenge when detecting smaller objects, since they may not appear dark enough in the image: an edge diffraction effect reduces the contrast of small objects. To overcome this, the threshold value for smaller objects was adjusted, and the system was then able to measure objects 1 mm or less in diameter.
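
A minimal sketch of such a size-dependent threshold is shown below, assuming 8-bit pixels and a bright backlit background; the actual threshold values used in the system are not published.

```python
# Minimal sketch of a size-dependent threshold.  The numeric values are
# placeholders, not the values used in the deployed system.

def pick_threshold(expected_diameter_mm):
    """Raise the threshold for small objects whose silhouettes are washed
    out by edge diffraction, so faint grey pixels still count as 'object'."""
    if expected_diameter_mm <= 1.0:
        return 200   # accept fairly light grey pixels as object
    return 128       # standard mid-grey threshold for larger objects

def segment(scan_line, expected_diameter_mm):
    """Return a binary mask (1 = object, 0 = background) for one scan line."""
    t = pick_threshold(expected_diameter_mm)
    return [1 if p < t else 0 for p in scan_line]
```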

The synchronisation of the two cameras is critical to accurately counting and measuring falling objects. The cameras are externally triggered using a pulse train signal from an NI PCI-6601 counter/timer card. With this single-source triggering of the scanning lines and the precise physical alignment of the two cameras, an object appears in both camera images at the same vertical position.
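
For readers who want to reproduce this kind of single-source triggering, the sketch below generates a continuous counter-output pulse train using the present-day nidaqmx Python package. The original system was built on the NI driver tools of the time; the device name and the 10 kHz rate here are assumptions for illustration.

```python
# Hedged sketch: continuous trigger pulse train on a counter output,
# assuming the modern nidaqmx Python package.  "Dev1" and 10 kHz are
# placeholder values, not taken from the article.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    # One counter output drives the external trigger input of both cameras,
    # so every scan line in both images corresponds to the same instant.
    task.co_channels.add_co_pulse_chan_freq("Dev1/ctr0", freq=10_000.0, duty_cycle=0.5)
    task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    input("Pulse train running; press Enter to stop.")
```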

One PC was used to acquire and process images and to run the object classification algorithm, as well as to display, generate, and report results. An NI PCI-8252 IEEE 1394 interface card plugs into the PC and connects the two line scan cameras.

Calibration

The project's budget constraint prevented the use of telecentric lenses, so low-cost machine vision lenses were used instead. Due to the system's space limitation, the camera lenses were close to the inspection region, causing the images from the cameras to appear severely distorted. The main sources of this distortion are lens distortion at the edge of the field of view and perspective error caused by the lens' proximity to the objects. Lens distortion causes an object's image to change size and shape as it moves towards the edge of the field of view. Perspective error causes the image to change size when the object is at a different distance from the lens. Both distortions can cause time and position measurement errors and a miscount of falling objects.

To overcome these distortions, a novel calibration method was developed using a calibration target to mimic a calibration grid. A thin cylindrical target is placed and moved through a uniformly spaced grid of positions. Images are acquired and the position of the target in both cameras is measured at each grid position. This yields a field calibration map that is essentially a distorted grid image. The calibration function in the NI vision library was then used to convert all images to a uniform, undistorted image, with all pixels converted to real-world coordinates in millimetres. The size of the object in the image is also calibrated against its distance from the lens. After the calibration process, all object coordinates and sizes are corrected in the measurement results.
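
The idea of the field calibration map can be illustrated with a generic interpolation sketch. It does not reproduce the NI vision library call used in the system; the grid spacing, noise and scale factor below are hypothetical.

```python
# Illustrative sketch of a grid-based field calibration map.
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Hypothetical calibration data: the target was placed on a 4 x 4 grid with
# 10 mm spacing (known_mm) and its silhouette was located in the image at
# measured_px (distorted pixel coordinates, simulated here with noise).
known_mm = np.array([(x, y) for y in range(0, 40, 10) for x in range(0, 40, 10)], float)
measured_px = known_mm / 0.15 + np.random.default_rng(0).normal(0.0, 2.0, known_mm.shape)

# Interpolating within the measured grid maps any pixel position back to
# real-world millimetres, which is the essence of the field calibration map.
pixel_to_mm = LinearNDInterpolator(measured_px, known_mm)
print(pixel_to_mm([[70.0, 70.0]]))   # approximate real-world position in mm
```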

Clump objects

When a large number of objects fall at a high rate, some objects may appear as one clump. The two-camera approach largely overcomes this. Two or more objects that appear connected in one camera image will most likely show up as separate individual objects in the image from the other camera, which looks from a perpendicular angle. By counting and cross-checking the objects between the two camera images at the same vertical locations in the synchronised images, the algorithm can identify and distinguish objects in clumps. In the rare situation where multiple objects appear as a single object in both cameras, the clump has a larger apparent size, and the number of objects can be approximated from the size of the clump. Because this situation has a very low probability of occurring, the approximation has proven to have little effect on counting accuracy.
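
The cross-checking step can be sketched as follows. The function name, the nominal silhouette width and the simple size-based approximation are illustrative assumptions, not the published algorithm.

```python
# Minimal sketch of the two-camera cross-check on one synchronised scan line.
NOMINAL_WIDTH_PX = 7   # assumed silhouette width of a single object, in pixels

def count_objects(runs_cam_a, runs_cam_b):
    """Estimate the object count on one scan line.

    runs_cam_a / runs_cam_b: lists of (centre_pixel, width_pixels) blobs
    found on the same scan line by the two perpendicular cameras.
    """
    # A clump seen by one camera is usually resolved by the other,
    # so take the larger of the two blob counts.
    count = max(len(runs_cam_a), len(runs_cam_b))
    # Rare case: both cameras see a single oversized blob, so approximate
    # the number of objects from the blob size.
    if count == 1 and runs_cam_a and runs_cam_b:
        widest = max(runs_cam_a[0][1], runs_cam_b[0][1])
        count = max(1, round(widest / NOMINAL_WIDTH_PX))
    return count
```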

System performance

Spatial resolution and accuracy

The spatial resolution of the system is determined by the camera resolution, lens quality, lighting conditions, scanning rate, and the physical dimensions of the field of view. The camera selected for this application has a resolution of 1024 pixels and covers a field of view of more than 150 mm, so each pixel covers about 150 micrometres, which equates to a spatial resolution of roughly 150 micrometres. Ball bearings 1 mm in diameter were used to test the minimum detectable object size, and the system counted and measured them easily.
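
The per-pixel coverage follows directly from the quoted figures:

```python
# Quick check of the numbers quoted above (values taken from the article).
field_of_view_mm = 150.0
pixels = 1024
print(field_of_view_mm / pixels)   # ~0.146 mm, i.e. about 150 micrometres per pixel
```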

Time resolution and accuracy

The time resolution of the system is determined by the line scan rate of the camera and the speed of the image processing. In this system, the camera has a maximum line rate of 10 kHz, which translates to 100 microseconds between scan lines. Using faster line scan cameras available on the market, the time resolution of the system could easily be improved.
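
Converting a scan-line index into a timestamp is then a matter of dividing by the trigger rate, as in this small sketch:

```python
# Sketch: mapping a scan-line index to a timestamp, assuming the 10 kHz
# external trigger rate quoted above (timestamps are relative to the start
# of the trigger pulse train).
LINE_RATE_HZ = 10_000.0

def line_index_to_time_s(line_index):
    # Each externally triggered scan line is 1/10 000 s = 100 microseconds apart.
    return line_index / LINE_RATE_HZ

print(line_index_to_time_s(4500))   # 0.45 s after acquisition start
```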

Because the cameras' line scans are triggered by an external precision pulse signal, the accuracy of the object timing measurement is mainly determined by the time resolution of that pulse, estimated to be 200 microseconds.

Counting accuracy

With the selected equipment, the system is able to count objects of varying sizes at up to 450 objects per second with 99% accuracy. At rates below 200 objects per second, the system achieves 99,5% counting accuracy.

For more information contact National Instruments South Africa, 0800 203 199, [email protected], www.ni.com




