With the advent of easily obtainable individual LEDs, Artemis Vision can now purchase as few as four and produce an evaluation solution. If that is successful, the lighting can be mounted in a housing and, perhaps, turned into a standard part type that the company can offer.
"It helps a lot in terms of cost and lead time," Brennan said of the new lighting approach.
The lighting technology advances have
been complemented by improvements
in the capabilities of sensors, processors
and algorithms. These vision system components have also seen price drops that have made it possible for smaller machine vision solution suppliers to keep systems on hand strictly for testing.
When asked about vision innovations,
Eric Jalufka, product manager for vision hardware and software at National
Instruments Co. of Austin, Texas, began
by discussing sensor progress, such as increases in available resolution. Whereas a
few years ago a 5-MP resolution would be
state of the art, today sensors are moving
into the 20+ MP range for area scan cameras. More pixels and higher resolution
make it easier to detect fine details.
Further, sensors offering higher
dynamic range with very low noise are
appearing. Those sensors can also handle
higher frame rate imaging, meaning that
they can capture events that take place in
a shorter time period than was possible before.
“Traditionally, it was hard to find those
qualities all in one sensor. But now we’re
starting to see these nice benefits offered
in a single sensor and it’s something that’s
available to the machine vision market.
It’s not just constrained to a high performance, scientific lab camera,” Jalufka said.
The combination of characteristics can
be advantageous in opening up new mass
market applications. For instance, cars
routinely travel from bright sunshine to a
dark tunnel or vice versa in a fraction of a
second. They are also driven day and night. A driver-assist system must cope with these swings in illumination, and a high-dynamic-range sensor helps because the camera continues to perform well when moving from light to dark or back again.
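A sensor's dynamic range is commonly expressed as the ratio of its full-well capacity to its read noise, in decibels. The sketch below illustrates that arithmetic; the full-well and noise figures are illustrative assumptions, not specifications from any camera mentioned in the article.

```python
import math

# Dynamic range from full-well capacity and read noise.
# Both figures below are assumed for illustration only.
full_well_e = 30000      # electrons a pixel can hold before saturating
read_noise_e = 2.0       # electrons RMS, a "very low noise" sensor

dr_db = 20 * math.log10(full_well_e / read_noise_e)
print(f"{dr_db:.0f} dB")  # roughly 84 dB for these assumed figures
```

A higher ratio means the same exposure can resolve detail in both the sunlit road and the tunnel interior.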
More bits, more challenges
Such sensor innovations are not an
unmitigated benefit. A higher dynamic
range, a greater frame rate and higher
resolution all mean that the sensor produces more bits in a given period of time.
Those bits have to be transmitted. For that
reason, Jalufka sees sensor technology
driving the adoption of higher bandwidth
communication standards, like USB 3.0.
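The pressure on the interface is easy to quantify: data rate is simply resolution times bit depth times frame rate. The numbers below are illustrative assumptions in the range the article describes, not measurements from a specific camera.

```python
# Rough data-rate estimate for a modern area-scan sensor.
# All three figures are assumed for illustration.
resolution_px = 20_000_000   # a 20-MP sensor
bits_per_pixel = 12          # higher-dynamic-range readout
frames_per_second = 30

bits_per_second = resolution_px * bits_per_pixel * frames_per_second
gbps = bits_per_second / 1e9
print(f"{gbps:.1f} Gb/s")    # 7.2 Gb/s for these assumptions
```

At 7.2 Gb/s, such a stream already exceeds the roughly 5 Gb/s raw signaling rate of USB 3.0, which is why sensor advances keep pushing adoption of faster interconnects.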
The increased number of bits also puts
a strain on processors, he added. That data
has to be run through calculations and
algorithms, and as the number of bits goes
up that burden increases.
One way to address this problem is
to use higher performance processors.
Another approach, which is increasingly
employed, is heterogeneous processing.
Here, a traditional processor handles part of the workload while a graphics processing unit (GPU) or a field-programmable gate array (FPGA) handles the rest. The key is to know which algorithms (the mathematical methods that turn image data into numbers and actionable information) should go through which calculation engine.
“We can put the algorithms that are
best suited for the FPGA on the FPGA
and the ones that are better suited to the
CPU [central processing unit] on the CPU.
Those two elements can work together to
increase overall throughput so you can
process that data faster. You can make decisions faster and increase your throughput,” Jalufka said.
Some tasks, like thresholding for particle analysis, work well on an FPGA, he
said. On the other hand, pattern matching
and more advanced algorithms are more
efficiently dealt with by a processor.
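Thresholding is a good FPGA candidate precisely because it is the same independent comparison at every pixel. A minimal sketch, using a tiny synthetic image rather than real sensor data:

```python
# Thresholding: the per-pixel operation the article notes maps well to an FPGA.
# The 3x4 "image" below is synthetic, for illustration only.
image = [
    [ 10, 200,  30, 220],
    [ 15, 210,  25,  40],
    [240,  20, 250,  35],
]

THRESH = 128
# Every pixel is tested independently, so the work parallelizes trivially.
mask = [[1 if px > THRESH else 0 for px in row] for row in image]
bright_pixels = sum(sum(row) for row in mask)
print(bright_pixels)  # 5 pixels survive the threshold in this synthetic image
```

The resulting binary mask is what a downstream particle-analysis step would group into blobs and count.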
Figure: Vision innovations based on lower-cost components make it possible to tackle an inspection station (a), a pit crew helmet with mounted cameras (b, c), or other lower-volume applications.