E-Book: Conveyors and Other Plant Equipment
We've packaged together some stories about the selection of conveyors, along with other plant equipment, for this free e-book.
Technology is getting closer to replicating human sight, although it still has a ways to go. Similarly, multi-axis machine motion offers new options in manufacturing, though it’s still no match for hyper-flexible people.
On the other hand, vision systems never blink, and robots seldom take a day off. Robots, vision systems and robots with vision may never duplicate multifunctional workers, but the trade-offs are shrinking and counterbalanced by other capabilities.
Synchronized LED lighting helps improve pixel resolution necessary to get the most out of multiple cameras and lasers in Key’s sortation machines. Photo: Key Technology
Replacing human eyes to perform high-speed product inspections was the logical starting point for vision technology. These applications arguably are the most industrially hardened and are where the latest advancements are first implemented. An example is a new digital sorting platform from Key Technology Inc., Walla Walla, Wash. After two years of development and beta-site shakedowns, Key recently presided over the commercial introduction of Veryx, described as a “multi-sensor pixel fusion” platform that can be incorporated in both chute and belt sorters.
Greater versatility, higher throughput and fewer rejections of good product are the key deliverables to manufacturers, but what’s under the hood also is notable. Signals from up to four camera and eight laser-scanner channels are joined simultaneously for sub-millimeter pixel level accept/reject analysis. The prior art required layering feedback from cameras and scanners, resulting in false rejects of good product suspected of being bad. Synchronized LED lighting and a powerful processor support the high-resolution digital cameras and lasers with up to four wavelengths.
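Key has not published the Veryx internals, but the idea of fusing sensor channels at the pixel level, rather than layering separate camera and laser passes, can be sketched roughly as follows. The channel counts match the article; the normalization, averaging rule and threshold are invented for illustration.

```python
# Illustrative sketch only: the fusion rule and threshold here are
# assumptions, not Key Technology's actual algorithm.
import numpy as np

def fuse_and_classify(camera_channels, laser_channels, defect_threshold=0.5):
    """Fuse per-pixel sensor channels into one accept/reject map.

    camera_channels, laser_channels: lists of 2-D arrays (one per sensor),
    each normalized to [0, 1], where higher values suggest a defect.
    Returns a boolean array: True marks pixels flagged for rejection.
    """
    # Stack every channel along a new axis so each pixel carries the full
    # sensor signature, instead of layering independent camera/laser passes.
    stacked = np.stack(camera_channels + laser_channels, axis=0)
    # One fused score per pixel; a single noisy channel no longer
    # forces a false reject of good product on its own.
    fused = stacked.mean(axis=0)
    return fused > defect_threshold

# Four camera channels and eight laser channels, as on the Veryx platform.
cams = [np.random.rand(64, 64) * 0.3 for _ in range(4)]    # mostly "good"
lasers = [np.random.rand(64, 64) * 0.3 for _ in range(8)]
reject_map = fuse_and_classify(cams, lasers)
```

The point of the joint decision is in the `mean` step: evidence from all twelve channels is weighed together before any pixel is condemned, which is why fused analysis can cut false rejects relative to sequential layering.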
Food companies that process potatoes, dried fruit, nuts and other products may not know or care about Key’s vision advancements, which Steve Johnson, senior director-sales and marketing, calls “one of the most significant the company has made in sorting technology.”
Depending on the application, they may opt for fewer cameras and lasers or even stick with older vision systems on existing belt and chute sorters — though that would deprive them of what Johnson refers to as “cool time processing” advances. These include simplified controls, self-diagnostics and a degree of artificial intelligence that allows the system to react to color changes that occur because of raw-material variability.
Four cameras are enough to provide a 360-degree view of a nut or other product as it drops down a chute. As with Key’s Tegra machine, one camera captures an image from a bottom view. Unlike Tegra, the sensor has a vertical window positioned at a low angle, where dirt and moisture are less likely to build up and obscure the view. “The machine was designed around the location of the bottom sensor,” Johnson asserts.
Vision guidance is expanding robotics beyond point-to-point applications and giving machines the ability to work with the irregularities of food products. Photo: ABB Robotics
Vision is an option or a standard offering on a growing number of robots. Auburn Hills, Mich.-based ABB Robotics began integrating vision-guided technology in some machines two years ago, using array-style cameras from Cognex in new ones and as a retrofit in late-model robots. Motion is slower than with a point-to-point machine, acknowledges Rick Tallian, manager-packaging products and applications, but functionality is expanded.
Integrated vision is useful in automated palletizing to verify the correct shipping label is affixed and is in the correct orientation. It’s even more useful for depalletizing. “A pallet stack is never perfect,” he says, “so when you pick off of a pallet, the machine needs help identifying where the material really is.” Less mechanical fixturing is required, helping to shrink the cost differential.
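Correcting a taught pick point with a camera-reported offset, as Tallian describes for imperfect pallet stacks, can be sketched in a few lines. The pose format, offset convention and tolerance value below are hypothetical, not ABB's interface.

```python
# Hypothetical sketch of vision-corrected depalletizing; the pose tuple
# (x, y, theta) and the 50 mm plausibility limit are illustrative.
import math

def corrected_pick(taught_pose, vision_offset, max_offset_mm=50.0):
    """taught_pose: (x, y, theta) from the original robot program.
    vision_offset: (dx, dy, dtheta) the camera reports for where the
    case actually sits. Implausible offsets are rejected rather than
    sending the arm somewhere unexpected."""
    dx, dy, dtheta = vision_offset
    if math.hypot(dx, dy) > max_offset_mm:
        raise ValueError("offset outside expected pallet tolerance")
    x, y, theta = taught_pose
    return (x + dx, y + dy, theta + dtheta)

# The case sits 5 mm right and 3 mm short of the taught point,
# rotated slightly; the pick pose shifts to match.
pose = corrected_pick((100.0, 200.0, 0.0), (5.0, -3.0, 0.1))
```

Because the camera supplies the correction, less mechanical fixturing is needed to force every case into a known position, which is the cost advantage the article notes.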
“There is a tremendous amount of work going on with 3D cameras,” Tallian adds. “I’ve been involved in vision guidance since the early 1990s, and it’s a really exciting time.” Plummeting camera costs, the ability to extract and analyze huge amounts of controls data and technical advances like structured-light 3D scanners are pushing vision well beyond simple shape recognition. “It’s going to change a lot of things,” he believes.
A wink and a nod
ABB’s April acquisition of Gomtec GmbH is expected to rejuvenate its efforts in the area of human-robot collaborative automation technologies, with a new generation of “safe by design” machines expected to debut early next year. ABB will be playing catch-up with firms like Rethink Robotics Inc. in Boston. Still a modest $100 million segment of the global robotics market, collaborative robot sales are expected to grow 50 percent this year.
Animated eyes and eyebrows provide a visual cue to nearby workers of where a collaborative robot’s likely movement will be. Photo: Rethink Robotics
Founded by the inventor of the Roomba vacuum, Rethink began shipping machines in early 2013 and numbers American Licorice Co. in La Porte, Ind., among its end-users. Two Cognex cameras are on board: one in proximity to the end-of-arm tool and the other above a video screen with illustrated eyes and eyebrows that impart a human-like look.
“We wanted to make people working near the robot feel comfortable and be drawn in,” Jim Lawton, chief product and marketing officer, says of the video screen. “If you see the ‘eyes’ move, you look in that direction to see what they’re looking at and might do next.” Think of telegraphed motion as an added safety feature.
Seven series elastic actuators are built into the robot’s joints, part of the safe-by-design engineering. Light curtains, motion sensors and other safety devices are absent. “If the machine shuts down, it is not collaborative robotics,” says Lawton. Risk is a function of the application, he adds, and that’s best addressed at set-up with a rigorous risk assessment.
Rethink’s robots are made affordable thanks to consumer electronics and off-the-shelf components. A standard Dell PC controls the unit, and Lawton envisions a time when “a portion of the brain will be resident in the cloud.” In time, images of millions of objects will be archived and accessed by robots, adding another dimension to collaboration.
“By deploying a robot in the environment you already have and allowing it to apply more logic and more and more smarts, it will become a candidate for tasks that couldn’t be considered before,” he says. “The Internet of Things is more hype than business value, and a lot of work remains to be done, but ultimately robots’ latent computational power will be shifted to the cloud. Interconnectedness will allow them to learn from other robots and share experiences to improve performance.”
Safe torque-off, safe motion and other terms convey the safe-by-design concept. Robotics is an excellent application, although safe motion can be beneficial when operating any machine in motion, points out Joaquin Ocampo, product manager-electric drives & controls at Bosch Rexroth Corp., Hoffman Estates, Ill.
Safe motion mode is a must when humans interact with high-speed machines. Photo: Bosch Rexroth
If motion is monitored in each drive, a machine can respond to a safety input and switch to safe mode, he says. A second safety signal might bring the machine to a full stop. A person might then interact with the machine to clear a jam, with the controls allowing incremental motion. Only after the person leaves the immediate vicinity would full motion be restored.
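The sequence Ocampo describes, from full motion through limited safe motion to a monitored stop with incremental jogging, amounts to a small state machine. The state names and method names below are invented for illustration; they are not Rexroth's actual drive API.

```python
# Hypothetical sketch of the drive-level safety sequence described above;
# states and transitions are illustrative, not a vendor implementation.
from enum import Enum, auto

class MotionState(Enum):
    FULL_MOTION = auto()   # normal high-speed operation
    SAFE_MOTION = auto()   # safely limited speed, person nearby
    SAFE_STOP = auto()     # standstill; incremental jog permitted

class SafeMotionController:
    def __init__(self):
        self.state = MotionState.FULL_MOTION

    def on_safety_input(self):
        """First safety signal (e.g. a light curtain): limit speed."""
        if self.state is MotionState.FULL_MOTION:
            self.state = MotionState.SAFE_MOTION

    def on_second_safety_input(self):
        """Second signal (person at the machine): bring to a full stop."""
        self.state = MotionState.SAFE_STOP

    def jog_allowed(self):
        """Incremental motion to clear a jam; only permitted at safe stop."""
        return self.state is MotionState.SAFE_STOP

    def on_area_clear(self):
        """Person has left the immediate vicinity: restore full motion."""
        self.state = MotionState.FULL_MOTION

ctrl = SafeMotionController()
ctrl.on_safety_input()          # person approaches -> limited speed
ctrl.on_second_safety_input()   # person at machine -> safe stop
assert ctrl.jog_allowed()       # operator may inch the mechanism to clear a jam
ctrl.on_area_clear()            # area clear -> full motion restored
```

The practical difference from Safe Torque Off is visible in the middle state: the drive keeps monitoring and limiting motion rather than simply cutting power, so production resumes without a full restart.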
Slow vs. stop
Not every OEM is comfortable with safe motion, so they still power down equipment or use drives with Safe Torque Off. Others are beginning to accept the redundant controls of safe motion instead, “and I think that trend will continue,” Ocampo says.
Rexroth’s drives are most likely found in the packaging department of food and beverage plants, in both servo-motion machines without vision and pick-and-place Delta robots with vision. Conceivably, they could be used with a collaborative robot, with a pressure sensor in the robot’s arm providing the stop-state signal.
Torque or force measurement in the arm of a collaborative robot is the signal that lets the controller know it is bumping into someone or something. In the safe motion scenario, a signal from a camera might trigger the safe-mode state, and a pressure sensor in the arm signals a safe stop.
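At its simplest, that torque-based contact signal is a comparison of measured joint torque against what the motion controller expected to command. The joint values and the 5 Nm limit below are made up; production systems use model-based expected-torque estimates and certified safety channels.

```python
# Simplified illustration of contact detection in a collaborative arm.
# The torque limit and sample values are assumptions, not vendor specs.

def detect_contact(measured_torques, expected_torques, limit_nm=5.0):
    """Flag a collision when any joint's measured torque deviates from
    the expected (commanded) torque by more than the safety limit."""
    return any(abs(m - e) > limit_nm
               for m, e in zip(measured_torques, expected_torques))

# Normal motion: measured torque tracks the model closely -> no stop.
assert not detect_contact([10.2, 4.9, 1.1], [10.0, 5.0, 1.0])
# A bump (the "pat on the shoulder"): one joint spikes -> safe stop.
assert detect_contact([10.2, 12.5, 1.1], [10.0, 5.0, 1.0])
```

Keeping the check per-joint rather than summed means a light touch anywhere along the arm registers, which is what makes the stop-on-contact behavior reported later in the article possible.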
Not all collaborative robots are equipped with vision; machines from Universal Robots USA Inc., Ann Arbor, Mich., do without it. Nonetheless, more than 4,000 are operating in industrial environments, including food plants. While safe operation is a must, durability also is a concern. Universal’s track record suggests collaborative robots are sufficiently hardened to withstand the manufacturing environment.
Orkla Foods installed Universal’s UR10 machine in its Kumla, Sweden, facility three years ago. The machine, which can handle a payload of 10 kg (22 lbs.), picks pillow packs of vanilla cream and other products off a conveyor and places them in a carton, in coordination with a carton erector and sealer.
The plant has operated conventional articulated arm robots for palletizing since 1999. Fencing is necessary with those machines, and light curtains originally were installed around the UR10 unit. “Now we’re at the point where we don’t think we need them,” says Johan Linné, plant manager.
An operator can safely work in the same cell, though the current application doesn’t involve human-and-machine interaction on an ongoing basis. “There is slow to medium arm movement, and no one has been struck by it,” reports Linné. If motion needs to be interrupted, “we give it a pat on the shoulder, and it stops,” he adds.
A stop button is the recommended way to pause the unit. Shoulder taps may have been the root cause of the only breakdown to date. Replacement of wear parts was necessary, but other than that and routine lubrication, “virtually no maintenance has been done on it,” he says.
The end-of-arm tooling can lose control of pouches, particularly if the product is very viscous, but the robot itself has proven to be robust. “In the future, we might use these types of robots to handle different products in the same work cell with an operator,” Linné concludes.
Just as vision systems are improving automatic inspection, collaborative robots are adding manufacturing flexibility. Regardless of whether both technologies are present in a machine or separately deployed, the advancements are beneficial for food processors.