Learn how an Allied Vision Mako camera can control your LED light source

camera as controller

In this article we discuss when and why one might want to strobe a light instead of using continuous lighting. While strobing traditionally required a dedicated controller, CCS and AVT have published an Application Note showing how the Allied Vision Mako camera can serve as the controller itself!

While LED lights are often used for continuous lighting, since that’s an easy mode of deployment, sometimes an application is best served with a well-timed strobe effect. This might be for one or more of the following reasons:

  • to “freeze motion” via light timing rather than shutter control alone
  • to avoid the heat buildup from continuously-on lights
  • to overwhelm ambient lighting
  • to maximize lamp lifetime
Effilux LED lights

Let’s suppose you’ve already decided that you require strobe lighting in your application. You’re past “whether” and on to “how to”.

Since you are moving into the realm of tight timing tolerances, it’s clear that the following will need to be coordinated and controlled:

  • the strobe light start and stop timing, possibly including any ramp-up delays to full intensity
  • the camera shutter or exposure timing, including any signal delays to start and stop
  • possibly the physical position of real-world objects or actuators, or the sensors detecting them

Traditionally, one used an external controller – an additional, dedicated device – to control both the camera and the lighting. It can be programmed to manage the logical control signals and the appropriate power, in the sequence required. This remains a common approach today: buy the right controller, configure it, and tune parameters through calculations and empirical testing.

Effilux pulse controller: controls up to 4 lights; output current can reach up to 1A @ 30V in continuous and 10A @ 200V in strobe mode – courtesy Effilux

Call us if you want help designing your application and choosing a controller matched to your camera and lighting requirements.

But wait! Sometimes, thanks to feature-rich lighting equipment and cameras, with the right set of input/output (I/O) connections, and corresponding firmware-supported functionality, one can achieve the necessary control – without a separate controller. That’s attractive if it can reduce the number of components one needs to purchase. Even better, it can reduce the number of manuals one has to read, the number of cables to connect, and the overall complexity of the application.

Let’s look at examples of “controller free” applications, or more accurately, cameras and lights that can effect the necessary controls – without a separate device.

Consider the following timing diagram, which shows the behavior of the Effi-Ring when used in auto-strobe mode. That doesn’t mean it strobes randomly at times of its own choosing! Rather, when triggered, it strobes at 300% of continuous intensity until the trigger pulse falls low again OR 2 seconds elapse, whichever comes first. Then it steps down to continuous mode at 100% intensity. This 2-second cap, far longer than most strobe applications require, is a design feature to prevent overheating.

Courtesy Allied Vision Technologies
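To make the timing coordination concrete, here is a minimal back-of-the-envelope sketch in Python. The exposure time and LED ramp-up delay below are illustrative assumptions, not values from the Application Note; only the 2-second auto-strobe cap comes from the Effi-Ring behavior described above.

    # Sketch: check that a trigger pulse covers the exposure and stays inside
    # the light's auto-strobe window. Numbers are illustrative assumptions.
    EXPOSURE_S        = 0.002    # camera exposure time (2 ms), assumed
    LED_RAMP_UP_S     = 0.0001   # LED ramp-up to full intensity (100 us), assumed
    AUTO_STROBE_MAX_S = 2.0      # Effi-Ring steps back to 100% after 2 seconds

    def strobe_pulse_width(exposure_s, ramp_up_s):
        """The trigger must rise early enough for the LED to reach full
        intensity and stay high until the exposure ends."""
        return ramp_up_s + exposure_s

    width = strobe_pulse_width(EXPOSURE_S, LED_RAMP_UP_S)
    assert width < AUTO_STROBE_MAX_S, "pulse too long: light would drop to 100% mid-exposure"
    print(f"Required trigger pulse width: {width * 1e3:.2f} ms (cap: {AUTO_STROBE_MAX_S} s)")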

OK, cool. So where to obtain that nice square wave trigger pulse? Well, one could use a controller as discussed above. But in the illustration below, where’s the controller?!? All we see are the host computer, an Allied Vision Mako GigE Vision camera, an Effilux LED, a power supply, and some cabling.

Camera exposure signal controls strobe light – courtesy Allied Vision Technologies

How is this achieved without a controller? In this example, the AVT Mako camera and the Effilux light are “smart enough” to create the necessary control. While neither device is “smart” in the sense of so-called smart cameras that eliminate the host computer for certain imaging tasks, the Mako is equipped with opto-isolated general purpose input output (GPIO) connections. These GPIOs are programmable along with many other camera features such as shutter (exposure), gain, binning, and so forth. By knowing the desired relationship between start of exposure, start of lighting, and end of exposure, and the status signals generated for such events, one can configure the camera to provide the trigger pulse to the light, so that both are in perfect synchronization.

Note: During application implementation, it can be helpful to use an oscilloscope to monitor and tune the timing and duration of the triggers and status signals.
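To illustrate how this might look in practice, here is a minimal sketch using Allied Vision’s Vimba Python API. The feature names (SyncOutSelector, SyncOutSource, ExposureTimeAbs) follow AVT’s GigE camera naming but vary by model and firmware, so treat them as assumptions and confirm against the Application Note and your camera’s feature reference.

    # Minimal sketch: route the Mako's "exposing" status signal to a sync output
    # so the camera's own exposure drives the LED trigger pulse.
    # Feature names are assumptions; verify against your camera's documentation.
    from vimba import Vimba

    with Vimba.get_instance() as vimba:
        cam = vimba.get_all_cameras()[0]
        with cam:
            # Select the opto-isolated output wired to the light's trigger input
            cam.get_feature_by_name('SyncOutSelector').set('SyncOut1')
            # Drive that output high while the sensor is exposing
            cam.get_feature_by_name('SyncOutSource').set('Exposing')
            # Exposure is configured as usual, e.g. 2000 microseconds
            cam.get_feature_by_name('ExposureTimeAbs').set(2000.0)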

Whether your particular application is best served with a separate controller, or with a camera that doubles as a controller, depends on the application and the camera options available. 1stVision carries a wide range of Effilux LED lights in bar, ring, backlight, and dome configurations, all of which can be used in continuous or strobe mode.

1st Vision’s sales engineers have over 100 years of combined experience to assist in your camera and components selection. With a large portfolio of lenses, cables, NIC cards, and industrial computers, we can provide a full vision solution!

Machine vision lights as important as sensors and optics

Lighting matters as much as or more than camera (sensor) selection and optics (lensing). A sensor and lens that are “good enough”, when used with good lighting, are often all one needs. Conversely, a superior sensor and lens, with poor lighting, can underperform. Read further for clear examples of why machine vision lights are as important as sensors and optics!

Assorted white and color LED lights – courtesy of Advanced Illumination

Why is lighting so important? Contrast is essential for human vision and machine vision alike. Nighttime hiking isn’t very popular – for a reason – it’s not safe and it’s no fun if one can’t see rocks, roots, or vistas. In machine vision, for the software to interpret the image, one first has to obtain a good image. And a good image is one with maximum contrast – such that pixel intensities corresponding to real-world features are saturated, not saturated, or in between, with the best spread of intensity achievable.

Only with contrast can one detect edges, identify features, and effectively interpret an image. Choosing a camera with a good sensor is important. So is an appropriately matched lens. But just as important is good lighting, well-aligned – to set up your application for success.
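One common way to put a number on contrast is the Michelson contrast, (Imax - Imin) / (Imax + Imin), computed over the region of interest. The short sketch below is a generic NumPy illustration, not tied to any particular camera or vendor.

    import numpy as np

    def michelson_contrast(img):
        """(Imax - Imin) / (Imax + Imin); values near 1.0 indicate full use of the range."""
        img = img.astype(np.float64)
        i_max, i_min = img.max(), img.min()
        return (i_max - i_min) / (i_max + i_min) if (i_max + i_min) > 0 else 0.0

    def rms_contrast(img):
        """Standard deviation of intensities normalized by the mean (one common definition)."""
        img = img.astype(np.float64)
        return img.std() / img.mean() if img.mean() > 0 else 0.0

    # Example: a synthetic 8-bit image of dark bars on a bright field
    img = np.full((100, 100), 220, dtype=np.uint8)
    img[:, ::10] = 15                       # dark bars every 10 pixels
    print(michelson_contrast(img), rms_contrast(img))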

What’s the best light source? Unless you can count on the sun or ambient lighting, or simply have no other option, one may choose from several types of light:

  • Fluorescent
  • Quartz Halogen – Fiber Optics
  • LED – Light Emitting Diode
  • Metal Halide (Mercury)
  • Xenon (Strobe)
Courtesy of Advanced Illumination

By far the most popular light source is LED, as it is affordable, available in diverse wavelengths and shapes (bar lights, ring lights, etc.), stable, long-life, and checks most of the key boxes.

The other light types each have their place, but those places are more specialized. For comprehensive treatment of the topics summarized here, see “A Practical Guide to Machine Vision Lighting” in our Knowledgebase, courtesy of Advanced Illumination.

Download whitepaper

Lighting geometry and techniques: There’s a tendency among newcomers to machine vision to underestimate lighting design for an application. Buying an LED and lighting up the target may fill the sensor’s pixel wells, but not all images are equally useful. Consider images (b) and (c) below – the bar code in (c) shows high contrast between the black bars and the white field. Image (b) is somewhere between unusable and marginally usable, with reflection obscuring portions of the target and portions of the (should-be) white field appearing more grey than white.

Courtesy of Advanced Illumination

As shown in diagram (a) of Figure 22 above, understanding bright field vs. dark field concepts, as well as the specular qualities of the surface being imaged, can lead to radically different outcomes. A little lighting theory, together with some experimentation and tuning, is well worth the effort.

Now for a more complex example – below we could characterize images (a), (b), (c), and (d) as poor, marginal, good, and superior, respectively. Component cost is invariant, but the outcomes sure are different!

Courtesy of Advanced Illumination

To learn more, download the whitepaper or call us at (978) 474-0044.

Contact us

Color light – above we showed monochrome examples – black and white… and grey levels in between. Many machine vision applications are in fact best addressed in the monochrome space, with no benefit from using color. But understanding what surfaces will reflect or absorb certain wavelengths is crucial to optimizing outcomes – regardless of whether working in monochrome, color, infrared (IR), or ultraviolet (UV).

Beating the same drum throughout, it’s about maximizing contrast. Consider the color wheel shown below. The most contrast is generated by taking advantage of opposing colors on the wheel. For example, green light best suppresses red reflection.

Courtesy of Advanced Illumination
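To make the opposing-colors idea concrete, the following generic NumPy sketch compares how a red feature on a grey background registers in each color channel; the channel where the feature differs most from the background corresponds to the complementary illumination or filter choice. The pixel values are made-up examples.

    import numpy as np

    # Synthetic scene: a red feature (R,G,B = 200, 40, 40) on a grey background (128,128,128)
    scene = np.full((50, 50, 3), 128.0)
    scene[20:30, 20:30] = (200.0, 40.0, 40.0)      # the red feature

    feature    = scene[20:30, 20:30].reshape(-1, 3).mean(axis=0)
    background = np.array([128.0, 128.0, 128.0])

    # Per-channel Michelson-style contrast between feature and background
    for name, f, b in zip("RGB", feature, background):
        print(f"{name} channel contrast: {abs(f - b) / (f + b):.2f}")
    # The green and blue channels show the larger difference, which is why
    # illumination opposite red on the color wheel darkens a red feature most.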

One can use actual color light sources, or white light together with well-chosen wavelength “pass” or “block” filters. This is nicely illustrated in Fig. 36 below. Take a moment to correlate the configurations used for each of images (a) – (f) with the color wheel above. Depending on one’s application goals, there are sometimes several possible combinations of sensor, lighting, and filters that achieve the desired result.

Courtesy of Advanced Illumination

Filters can help. Consider images (a) and (b) in Fig. 63 below. The same plastic 6-pack holder is shown in both images, but only image (b) reveals stress fields that, were the product to be shipped, might lead to dropped product and reduced consumer confidence in one’s brand. By designing in polarizing filters, this can become the basis for a value-added application, automating quality control in a way that might not otherwise have been achievable – or not at such low cost.

Courtesy of Advanced Illumination

For more comprehensive treatment of filter applications, see either or both Knowledgebase documents:


Powering the lights – should they be voltage-driven or current-driven? How are LEDs powered? When should one strobe vs. run in continuous mode? How does one integrate a light controller with the camera and software? These are all worth understanding – or worth having someone on your team, whether in-house or a trusted partner, who does.

For comprehensive treatment of the topics summarized here, see Advanced Illumination’s “A Practical Guide to Machine Vision Lighting” in our Knowledgebase:

Download whitepaper

This blog is intended to whet the appetite for interest in lighting – but it only skims the surface. Machine vision lights are as important as sensors and optics. Please download the guide linked just above to deepen your knowledge. Or, if you want help with a specific application, draw on the experience of our sales engineers and trusted partners.

1st Vision’s sales engineers have over 100 years of combined experience to assist in your camera and components selection. With a large portfolio of lenses, cables, NIC cards, and industrial computers, we can provide a full vision solution!

What can you do with 3D from Automation Technology?

Automation Technology GmbH C6 Laser Sensor

When new technologies or product offerings are introduced, it can help get the creative juices flowing to see example applications. In this case, 3D laser triangulation isn’t new, and Automation Technology (AT) has more than 20 years’ experience developing and supporting their products. But 1stVision has now been appointed by AT as their North American distributor – a strategic partnership for both organizations, bringing new opportunities to joint customers.

Laser Triangulation overview – courtesy Automation Technology

The short video above provides a nice overview of how laser triangulation provides the basis for 3D imaging in Automation Technology GmbH’s C6 series of 3D imagers.
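For intuition about the geometry, here is a simplified sketch of the triangulation relationship: with the laser line projected straight down and the camera viewing it at an angle θ, a height change Δz shifts the line on the sensor by roughly Δp ≈ m · Δz · sin(θ), where m is the optical magnification. The numbers below are illustrative assumptions, and the model ignores perspective, sensor tilt, and lens distortion; it is a teaching sketch, not how the C6 computes its profiles.

    import math

    # Simplified laser-triangulation height sketch; all values are assumptions,
    # not Automation Technology C6 specifications.
    THETA_DEG    = 30.0     # angle between laser plane and camera axis (assumed)
    FOCAL_MM     = 16.0     # lens focal length (assumed)
    WORK_DIST_MM = 300.0    # working distance (assumed)
    PIXEL_MM     = 0.0055   # sensor pixel pitch, 5.5 um (assumed)

    MAGNIFICATION = FOCAL_MM / WORK_DIST_MM     # thin-lens approximation

    def height_from_shift(pixel_shift):
        """Convert a laser-line shift in pixels to a height change in mm."""
        dp_mm = pixel_shift * PIXEL_MM
        return dp_mm / (MAGNIFICATION * math.sin(math.radians(THETA_DEG)))

    print(f"A 10-pixel line shift corresponds to ~{height_from_shift(10):.2f} mm of height change")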

With no ranking implied by the order, we highlight applications of 3D imaging using Automation Technology products in each of the following areas:


Weld inspection

Weld inspection is essential for quality control, whether proactively for customer assurance and materials optimization, or to create an archive against potential litigation.

Weld inspection – courtesy of Automation Technology
  • 3D inspections provide robust, reliable, reproducible measured data, largely independent of ambient light effects, reflection, and the exact positioning of the part to be tested
  • High resolution, continuous inspection of height, width and volume
  • Control of shape and position of weld seams
  • Surface / substrate shine has no influence on the measurement

Optionally combine with an IR inspection system for identification of surface imperfections and geometric defects.


Rail tracks and train wheels

Drive-by 3D maintenance inspection of train wheel components and track condition:

  • Detect missing, loose, or deformed items
  • Precision to 1mm
  • Speeds up to 250km/hr
Train components and rail images – courtesy Automation Technology

Rolling 3D scan of railway tracks:

  • Measure rail condition relative to norms
  • Log image data to GPS position for maintenance scheduling and safety compliance
  • Precision to 1mm
  • Speeds up to 120km/hr

Additional rail industry applications: Tunnel wall inspection; catenary wire inspection.


Adhesive glue beads

Similar in many ways to the weld inspection segment above, automated glue bead application also seeks to document that quality standards are met, optimize materials usage, and maximize effective application rates.

Glue bead – courtesy of Automation Technology

Noteworthy characteristics of 3D inspection and control of glue bead application include:

  • Control shape and position of adhesive bead on the supporting surface
  • Inspect height, width and volume
  • Control both inner and outer contour
  • Application continuity check
  • Volumetric control of dispensing system
  • Delivers robust, reliable, reproducible measured data largely independent of ambient light effects, reflection and exact positioning of the items being tested

Automation Technology C6 3D sensor

1st Vision’s sales engineers have over 100 years of combined experience to assist in your camera and components selection. With a large portfolio of lenses, cables, NIC cards, and industrial computers, we can provide a full vision solution!

New IDS XLS cameras – tiny cameras – low-price category

IDS XLS board level cameras

The smallest board-level cameras in the IDS portfolio, the uEye XLS cameras have very low power consumption and heat generation. They are ideal for embedded applications and device engineering. Sensors are available for monochrome, color, and NIR.

XLS board-level with no lens mount; with S-mount; with C-mount – courtesy of IDS

The “S” in the name means “small”, as the series is a compact version of the uEye XLE series – as small as 29 x 29 x 7 mm! Each USB3 camera in the series is Vision Standard compliant, has a Micro-B connector, and offers a choice of C/CS lens mount, S-mount, or no-mount DIY.

IDS uEye XLS camera family – courtesy of IDS

Positioned in the low-price portfolio, the XLS cameras are most likely to be adopted by customers requiring high volumes for which basic – but still impressive – functions are sufficient. The XLS launch family of sensors includes the ON Semi AR0234, ON Semi AR0521, ON Semi AR0522, Sony IMX415, and Sony IMX412. These span a wide range of resolutions, framerates, and spectral responses. Each sensor appears in three board-level variants, with the last digit of the part number indicating the mount: 1 = S-mount, 2 = no mount, 4 = C/CS-mount.

Sensor | Resolution | Framerate | Monochrome | Color | NIR
ON Semi AR0234 | 1920 x 1200 | 102 fps | U3-356(1/2/4)XLS-M | U3-356(1/2/4)XLS-C | –
ON Semi AR0521 | 2592 x 1944 | 48 fps | U3-368(1/2/4)XLS-M | U3-368(1/2/4)XLS-C | –
ON Semi AR0522 | 2592 x 1944 | 48 fps | – | – | U3-368(1/2/4)XLS-NIR
Sony IMX415 | 3864 x 2176 | 25 fps | U3-38J(1/2/4)XLS-M | U3-38J(1/2/4)XLS-C | –
Sony IMX412 | 4056 x 3040 | 18 fps | – | U3-38L(1/2/4)XLS-C | –
XLS family spans 5 sensors covering a range of requirements
XLS dimensions, mounts, and connections – courtesy of IDS

Uses are wide-ranging, skewing towards high-volume embedded applications:

Example applications for XLS board-level cameras – courtesy of IDS

In a nutshell, the uEye XLS cameras are small, cost-optimized, easy to integrate with IDS or industry-standard software, and equipped with the fundamental functions for high-quality image evaluation.

1st Vision’s sales engineers have over 100 years of combined experience to assist in your camera and components selection. With a large portfolio of lenses, cables, NIC cards, and industrial computers, we can provide a full vision solution!