The evolution of machine vision systems is allowing manufacturers to learn more about their production processes and how to maximise their efficiency, writes Greg Pitcher.
Human beings have long been using their innate expertise to create things even more skilful than themselves. In the 21st century this means using machines for most aspects of daily life, and quality control is no different.
Technological progression means that just as we don’t use our hands to manufacture complex pharmaceutical products, nor do we use our eyes to judge those products – or even to evaluate the processes making them.
Machine vision systems not only make split-second verdicts on the suitability of finished items to go out to market, they increasingly give highly detailed information that can be used to assess the plant itself and maximise the efficiency of the production line.
Vision sensors now produce more useful information and allow it to be shared and interpreted
Neil Sandhu, national product manager – imaging, Sick UK
“Sensors of every type are, in some way, the eyes of operating machinery in the manufacturing environment,” says Neil Sandhu, national product manager – imaging, measurement, ranging and systems at sensor manufacturer Sick UK.
Vision sensors – often containing software and data transfer capabilities – are used in manufacturing settings to undertake repetitive inspection tasks at levels of speed and accuracy far beyond the human eye.
With their performance on these key attributes now at such high levels, a key differentiating factor when specifying smart vision systems is what they can tell you about the process itself.
“Vision sensors now produce more useful information and allow it to be shared and interpreted,” says Sandhu.
Data demand
This means smart systems using vision sensors need to be able to create outputs beyond ejecting an incorrectly filled tube from a conveyor belt.
“Machines need to produce data – for example, in the last shift there were X number of fails, one bad label, one bad cap, for X or Y reasons.
“If the machine can tell you what the reasons are for failures and what specific inspection occurred, the operator has access to much more flexible information than a simple ‘pass or fail’.
“Now the operator can look at why things are going well, and where they are going wrong.”
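The kind of shift report Sandhu describes can be sketched as a simple aggregation over per-item inspection results. The record fields and reason codes below are hypothetical, for illustration only, and do not represent any real sensor's output format.

```python
from collections import Counter

def shift_summary(results):
    """Aggregate per-item inspection results into a shift report.

    `results` is a list of dicts with a boolean 'passed' flag and,
    for failures, a 'reason' code (e.g. 'bad_label', 'bad_cap').
    Field names are illustrative, not a real sensor format.
    """
    fails = [r for r in results if not r["passed"]]
    return {
        "inspected": len(results),
        "failed": len(fails),
        "by_reason": Counter(r["reason"] for r in fails),
    }

# Example shift: four items inspected, two failures for different reasons
report = shift_summary([
    {"passed": True},
    {"passed": False, "reason": "bad_label"},
    {"passed": False, "reason": "bad_cap"},
    {"passed": True},
])
print(report["failed"])     # 2
```

This is the difference Sandhu draws between a bare pass/fail signal and actionable data: the per-reason breakdown is what lets an operator see where the line is going wrong.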
This all means that, if they are used correctly, machine vision systems can help prevent failures as well as identify them when they occur.
Preventing downtime and wastage in this way is hugely valuable to manufacturers. So, of course, is the core function of any quality control system – ensuring faulty products don’t go out to market.
“In some cases [poor] quality control, for example in food packaging or pharmaceuticals, could have regulatory or legal consequences,” says Sandhu. “Failure to detect problems before a product reaches the customer could lead to costly recalls and supplier fines, not to mention loss of reputation.”
Now the operator can look at why things are going well, and where they are going wrong
Fortunately, alongside such strong incentives to maximise quality control standards and process efficiency in a competitive marketplace during an economic squeeze, there have been huge advances in technology.
These have been driven by two key developments. Chip miniaturisation has allowed for more processing power to be embedded in the vision sensors themselves, while developments in connectivity, as we approach Industry 4.0, have broadened the opportunity to access and share the data that sensors provide.
Ultimately this all means using machine vision has become easier and more accessible. “For a long time machine vision, and especially 3D vision, was the domain of a few, requiring expert programming skills and complex set up,” says Sandhu.
“Now intelligent or ‘smart’ sensors offer all-in-one vision solutions that have opened up the opportunity for imaging technologies to be applied cost-effectively to many more applications.
“Product engineers and machine builders can configure and commission vision systems quickly, simply and cost-effectively without huge amounts of specialist skill.”
TriSpector calls
Earlier this year, Sick launched its TriSpector 1000, which combines smart camera technology and processing software to deliver real-time quality inspection.
The self-contained device can check presence or absence, position, labels, contents, dimensions and height, orientation and fill levels, and is tolerant of variations in product positioning on the conveying line. The company says the TriSpector’s “intuitive, graphical interface” guides purchasers through installation.
The march of smartphones and tablets has inevitably had an influence on the way people choose to analyse and use data from machine vision sensors.
Apps can be used to allow interaction between vision sensors, plant and managers.
“We can custom make the application to control the inspection and then get the data out to monitor how the machine is working,” says Sandhu.
Sick this year launched AppSpace, which it describes as an ‘ecosystem’ encompassing app development, implementation and management.
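The sensor-to-app flow Sandhu describes can be sketched as a simple publish/subscribe hook: the sensor side publishes each inspection event and a monitoring "app" subscribes to the stream. The class, the event shape and the toy counter are all invented for this sketch; Sick AppSpace's actual programming model is not shown here.

```python
class InspectionBus:
    """Minimal pub/sub: a sensor publishes events, apps subscribe.

    Purely illustrative - not a real sensor or AppSpace API.
    """
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, event):
        for callback in self.subscribers:
            callback(event)

fail_count = 0

def monitoring_app(event):
    """Toy 'app' that counts the failures it is notified about."""
    global fail_count
    if not event["passed"]:
        fail_count += 1

bus = InspectionBus()
bus.subscribe(monitoring_app)
bus.publish({"passed": True})
bus.publish({"passed": False, "reason": "bad_cap"})
print(fail_count)   # 1
```

The design point is the decoupling: the inspection side does not need to know which apps are listening, so new monitoring tools can be added without touching the line.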
The technology comes in forms to suit a range of needs, though. Smart vision systems are available in 1D, 2D and 3D, with varying fields of view and capabilities.
Sandhu cites a few examples, including deodorant containers being mass produced in a wide range of colours and configurations before being shipped to international destinations to be filled.
The embedded system is free to evolve with minimum changes to the whole solution and relatively low investment
Ivan Klimkovic, key account manager, Ximea
“Despite the packaging lines herding the containers a metre abreast to the bulk shipping packs, the Sick IVC-3D Smart Vision uses its advanced algorithms to check for damage and verify that the right number and type of can is being packed for each destination.”
Another example is the use of LED machine vision technology to detect fluorescent components of high strength adhesives used in car factories.
These adhesives and coatings can then be inspected by the sensors to ensure they have been used in the right places in the correct way.
Cross-border uniformity can be promoted by using standardised vision sensors rather than local eyes and minds.
“Maintaining exactly the same dimensions to ensure that the product made in one continent appears identical to any other is a critical factor,” says Sandhu.
He expects use of advanced vision sensors to become more common. “In particular this democratisation of vision will mean more shared data and intelligence across industry,” he predicts.
“As time moves on, environments such as Sick AppSpace will make vision system set up as easy as downloading an app from your mobile phone.”
One firm that is exploring new options for smart vision is camera maker Ximea.
“The idea is simple,” says key account manager Ivan Klimkovic. “Instead of using a smart camera with limited specification and a restricted set of components, embedded vision [pictured below] allows you to be modular and grow the requirements.”
The company previously used sensors and a processor set in housing to provide machine vision.
“It was almost a perfect solution for certain applications for that period of time from 2010 to around 2015,” says Klimkovic.
“The problem was that the whole system was basically set in stone – it was suitable with its processing power and cost effectiveness for some projects, but too complex and inflexible for most.”
The firm has moved to a system it calls embedded vision, essentially using a processor with peripherals connected to a camera with embedded qualities. It believes this gives customers valuable flexibility to change and grow the system to fit their changing needs.
“In such a case you are free, down the line, to use a more powerful processor, different camera sensors, several cameras or a different interface,” says Klimkovic.
“The embedded system is free to evolve with minimum changes to the whole solution and relatively low investment. It is getting more and more popular.”
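Klimkovic's point about swapping processors, sensors or interfaces without rewriting the whole solution can be illustrated with a minimal modular pipeline: the processing stage depends only on an abstract camera interface, so the concrete device behind it can change freely. The class names and the trivial "processing" step are invented for the sketch and are not Ximea's actual software.

```python
from abc import ABC, abstractmethod

class Camera(ABC):
    """Interchangeable image source - swap implementations freely."""
    @abstractmethod
    def grab(self):
        """Return a frame as a 2D list of pixel intensities."""

class StubCamera(Camera):
    """Stand-in for a real sensor driver; returns a fixed 2x2 frame."""
    def grab(self):
        return [[0, 255], [255, 0]]

class Pipeline:
    """Processing stage decoupled from the camera behind it."""
    def __init__(self, camera):
        self.camera = camera

    def mean_intensity(self):
        frame = self.camera.grab()
        pixels = [p for row in frame for p in row]
        return sum(pixels) / len(pixels)

# Upgrading to a different Camera subclass needs no change to Pipeline
print(Pipeline(StubCamera()).mean_intensity())   # 127.5
```

A fixed smart camera bakes the equivalent of `StubCamera` and `Pipeline` into one sealed unit; the embedded-vision approach Klimkovic describes keeps the seam between them open.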