www.vision-systems.com

Smart camera OCR success through font classification library experimentation

Novel thermal imagers help fill gap in autonomous vehicle sensor suite

WORLDWIDE INDUSTRIAL CAMERA DIRECTORY
INTRODUCING the Xtium2™ PCIe Gen3 Series
High Performance, Multi-Interface Frame Grabbers

The Xtium2 series leverages the PCIe Gen3 x8 platform to deliver high-speed image transfers to host memory without CPU overhead. Featuring CoaXPress® 12, Camera Link®, and Camera Link HS® interface standards, Xtium2 supports Active Optical Cable (AOC) and SFP+ modules.

XTIUM2 FEATURES
» Long cable lengths at maximum image acquisition rates
» Data forwarding rate for distributed image processing
» Fully supported by Sapera Vision Software SDKs

Models shown: Xtium2 CLHS PX8 (AOC ready); Xtium2 CLHS FX8 (quad port SFP+ connector)
November 2018 | VOL. 23 | NO. 10
Vision and Automation Solutions for Engineers and Integrators Worldwide
A PennWell Publication
features
WORLDWIDE INDUSTRIAL CAMERA DIRECTORY
MARKET SURVEY
Yakov Shaharabani
17 The makings of a successful imaging lens. Part three: testing and metrology, ensuring you get what you asked for.

departments
3 Inside Vision
4 Snapshots
"… to create the world's first affordable, character-rich robot capable of surprising and delighting humans," said Boris Sofman, CEO and co-founder. … communicate through a unique sound palette. Useful or fun features touted by Anki include the ability of the robot to dance when …
In addition to the Blue ROV2, the team also deployed the hyperspectral camera on a drone. This was the first time that the team deployed ROVs and drones simultaneously during night-time missions.

"We did some revolutionary stuff during this trial, we also flew the 900g hyperspectral camera under our large aerial drone off our research vessel (RV) Cape Ferguson, over a coral transect on John Brewer Reef, which is one of our long-term monitoring sites," said Olsen.

Technologies such as robots and hyperspectral cameras, according to Olsen, help the team stay competitive and improve their research endeavors.

"We want to remain globally competitive and so we are boosting our technological capabilities. Robotics helps us to monitor larger and new sections of the reef in areas that would otherwise be dangerous to divers." She added, "These robots will soon be helping to free up our marine science researchers to do the important work of looking at how to help support these reefs."

Additionally, the robots enable the monitoring of aspects of coral reefs the team has not been able to accomplish previously, while also keeping human divers out of harm's way, in the form of crocodiles, marine stingers, or sharks.

Using these technologies during the two-week trial showed that the team was able to perform missions at night, while also allowing them to go deeper, according to Olsen.

Related: In the October issue, an article detailing the use of underwater robots for the monitoring and protection of the Great Barrier Reef was also described (http://bit.ly/VSD-SFA). In this application, local researchers used an underwater vision-guided robot called the RangerBot to identify and destroy crown-of-thorns starfish, which destroy coral in the Great Barrier Reef.
Industrial camera
survey highlights
an embedded trend
Manufacturers and users review industrial camera
market status and future trends.
[Figure 1: Survey chart of industrial camera application areas, including automotive, medical technology, robotics, security/surveillance, logistics, printing, food and packaging, pharmaceutical, electronics, science, agriculture, smart city/infrastructure, and measurement.]
Camera pricing
While 2017 saw a stabilization in the cost of mid- and high-priced
cameras, responses from manufacturers and users suggest that prices
are falling again. Sixty-two percent of users say they would be prepared
to spend $350 or less on a camera, compared to 19% the previous year
(Figure 2). Thirty-one percent of respondents would pay between $350
and $1000 for a camera, a fall of seven percentage points from 2017,
while 8% of users would invest over $1000 in their camera, a fall of
34 percentage points.
The price segment below $150 increased by just three percentage
points, while the losses in the over-$1000 segment accrued largely to
the benefit of the mid-price segment, which increased from 33% to
44%. This shift can likely be explained through a combination of
the altered participant sampling, and the increased competition from
Asian manufacturers.
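Restated numerically, the 2017 shares can be back-computed from the 2018 figures and the percentage-point changes quoted above, which serves as a quick consistency check on the survey numbers:

```python
# The year-over-year shifts described above, restated as a small table.
# 2017 figures are back-computed from the percentage-point changes in the text.
segments = {
    "$350 or less": {"2018": 62, "change": +43},   # "compared to 19% the previous year"
    "$350-$1000":   {"2018": 31, "change": -7},
    "over $1000":   {"2018": 8,  "change": -34},
}
for name, s in segments.items():
    # 2017 share = 2018 share minus the change in percentage points
    print(f"{name:>14}: 2017 {s['2018'] - s['change']:>3}% -> 2018 {s['2018']}%")
```

The back-computed 2017 shares (19%, 38%, 42%) sum to 99%, consistent with a rounded survey distribution.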
the discrepancy is becoming clearer. Sixteen percent of users still use VGA (640 x 480) resolutions, with 62% of all sensors fitted having between one and five megapixels. In contrast, 21% of image sensors already have resolutions of between five and twenty megapixels. This statistic is consistent with the forecasts of last year. Respondents predict that the next two years will see only a slight rise in resolution. Despite this prediction, resolutions starting from five megapixels will continue to be used for almost a third of all applications.

Seventy-five percent of all users use an image sensor format between 1/3 and 2/3 of an inch, unchanged from last year. Manufacturers show stability and a trend towards miniaturization. Depending on the area of application, they generally choose either very large sensors measuring over one inch, which achieve a share of 33%, or small image sensors measuring up to 1/2 an inch, which have risen to 41% because of embedded vision. Both users and manufacturers use global shutter scanning image sensors.

Figure 2: To which pricing sector can your industrial cameras be allocated? (User / Manufacturer)
<$150: 4% / 2%
$150–350: 58% / 28%
$351–650: 18% / 23%
$651–1000: 13% / 21%
$1001–3000: 4% / 13%
>$3000: 4% / 13%
While 62% of users indicated that they would be willing to pay $350 or less for an industrial camera, only 8% of users indicated that they would be willing to pay $1,000 or more.

Image rates, speed and interfaces
Image rates, as with resolutions, have reached the next-highest group of speeds, although the expected massive increases towards mega-speeds have failed to materialize. The smallest class, up to 25 fps, suffered losses among both users and manufacturers of eight and seven percentage points, respectively. The segment from 25 fps to 100 fps is growing by ten percentage points among users, and twenty percentage points among manufacturers. The classes over 100 fps are forecast to grow over the next two years among both users and manufacturers.

GigE Vision is by far the leading transmission standard among both manufacturers and users (42% and 43%, respectively) (Figure 3). Analog connections are still used by some users. LVDS and HDMI are additionally used transmission standards, while the use of USB is still evident with a rate of 8%. Both users and manufacturers mainly select the 10GigE and USB 3.1 standards for bandwidths above 5 Gb/s.
Figure 3: Which interfaces do you use today? (User / Manufacturer)
GigE (Gigabit Ethernet): 43% / 42%
USB 3.0: 3% / 27%
USB 2.0: 5% / 15%
Firewire: 1% / 8%
Ethernet: 1% / 5%
Camera Link: 0% / 1%
Dual GigE: 0% / 1%
Other: 46% / 1%

Looking forward
The 2018 survey surfaces various trends among users and manufacturers. Users have greater requirements, including faster and higher-performing image sensors, standard interfaces, simple integration, lower prices, and high on-board processing power. Another item of note brought to light in this year's survey is the trend towards embedded vision and modularity. Manufacturers see great potential in embedded vision solutions for automotive and infrastructure applications. Manufacturers, however, also see themselves faced with a change that they must confront together with their customers, both in terms of machine vision and innovative embedded vision applications in the industrial and consumer sectors.
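As a back-of-envelope illustration of why bandwidth drives interface choice, a camera's raw data rate can be compared against nominal link throughput. The resolution, frame rate, and throughput figures below are illustrative assumptions, not survey data:

```python
# Back-of-envelope bandwidth check: does a camera's raw data rate fit an
# interface? Link figures are nominal approximations, not measured throughput.
INTERFACES_MBPS = {"GigE": 125, "USB 3.0": 400, "10GigE": 1250}  # MB/s, approximate

def data_rate_mbps(width, height, fps, bytes_per_pixel=1):
    # Raw (uncompressed) payload in MB/s, ignoring protocol overhead
    return width * height * fps * bytes_per_pixel / 1e6

rate = data_rate_mbps(2048, 1536, 60)   # hypothetical 3.1 MP @ 60 fps, 8-bit mono
for name, cap in INTERFACES_MBPS.items():
    print(name, "OK" if rate <= cap else "too slow", round(rate, 1))
```

For this hypothetical camera the raw rate (about 189 MB/s) exceeds Gigabit Ethernet but fits comfortably within USB 3.0 or 10GigE, mirroring the survey's observation that higher-bandwidth standards are chosen above 5 Gb/s.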
Maximizing smart
camera optical character
recognition success
Font classification library experimentation leads to higher read rates
Alan L. Lockard
…ers and OCR cameras produced a bump in accuracy, but the read rate was always short of 100%.

Since the laser marking equipment was deriving the OCR-A font from the Microsoft Windows operating system controlling the laser marking system, an experiment using the same source was performed. A Microsoft Word document using the OCR-A font set was created, and a string of alphanumeric characters used in the batch marking was typed. The OCR-A font set was limited to only those characters used in the alphanumeric batch number.

Including the entire alphabet, even characters that may never be encountered, increased the chance of misclassifying one of those never-to-be-used characters included in the font library. Further experimentation found a font size that produced 2 mm-high characters, matching the laser marking system. The snipping tool was used to save the character string as a .jpeg file, and from there, the image could be opened in the smart camera environment to train the pristine OCR-A natives into the OCR classification library. Previously-trained characters were deleted, and after training the MS Word-generated characters, the system had a working set consisting of a single trained artifact for each character (Figure 2). The resulting font classification library pro-
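The restricted-library idea can be sketched with a toy nearest-template classifier: shrinking the library to only the characters that actually appear in the batch code removes never-used characters as candidate misclassifications. The 3 x 5 glyphs below are invented stand-ins for trained OCR-A artifacts, not real OCR-A data:

```python
import numpy as np

# Hypothetical 3x5 binary glyph "templates" standing in for trained OCR artifacts.
GLYPHS = {
    "0": ["111", "101", "101", "101", "111"],
    "1": ["010", "110", "010", "010", "111"],
    "7": ["111", "001", "010", "010", "010"],
    "A": ["010", "101", "111", "101", "101"],
}

def to_array(rows):
    return np.array([[int(c) for c in r] for r in rows], dtype=np.uint8)

def classify(glyph, library):
    # Nearest-template classification: pick the library character whose
    # template differs from the observed glyph in the fewest pixels.
    arr = to_array(glyph)
    scores = {ch: int(np.sum(arr != to_array(t))) for ch, t in library.items()}
    return min(scores, key=scores.get)

# A degraded "1" with one flipped pixel in the bottom row.
noisy_one = ["010", "110", "010", "010", "011"]

restricted = {ch: GLYPHS[ch] for ch in "017"}  # only characters used in the batch code
print(classify(noisy_one, restricted))
```

With a restricted library the noisy glyph still resolves to "1"; every character removed from the library is one fewer way for a degraded mark to be misread.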
…cost down, but this only results in lower-resolution sensors that cannot provide the coverage needed for Level-5 autonomy.

The added challenge of night driving
To compensate for the weaknesses of the currently-used sensors, many automakers use multiple sensor types, creating a redundant network of sensing solutions. In this practice, where one sensor may fail at detection, it is backed up by the other(s).

But even with several sensors working together, today's AVs still cannot achieve Level-5 autonomy. The problem is primarily one of classification. Together, radar, LiDAR, and cameras may be able to sufficiently detect all objects in the vehicle's surroundings, but they might not properly classify the objects. When you add in the challenge of driving at night, this issue becomes even more serious—and even more gravely dangerous.

Consider the Uber crash. According to a report from the National Transportation Safety Board (http://bit.ly/VSD-NTSB), the vehicle detected the pedestrian six seconds before the accident, but the autonomous driving system classified the pedestrian as an unidentified object, first as a car and then as a bicycle. In other words, the vehicle's sensors detected the victim, but its software wrongly determined that she wasn't in danger and that no evasive action was required.

This is called a false positive, which is when an AV successfully detects an object, but wrongly classifies it. The software in autonomous vehicles is programmed to ignore certain objects, like an errant plastic bag or newspaper flicking across the street. These accommodations must be made for autonomous vehicles to drive smoothly, especially on high-speed roads. However, Uber's fatal incident proves that autonomous vehicle software is still challenged by false positives—to perilous results. It need not be stated that, with pedestrians coexisting on roads with AVs, there is no room for error in classification of objects, as any error is extremely dangerous and, possibly, fatal.

Until the detection and classification process of current sensing solutions improves, autonomous vehicles will not be able to operate safely and reliably amongst pedestrians at night, and we may never see the deployment of Level-5 AVs. To safely detect and classify every pedestrian, autonomous vehicles require a new perception solution: thermal sensors.

The only solution to safe night driving is thermal sensors
A new type of sensor using far infrared (FIR) technology can provide the complete and reliable coverage needed to make AVs safe and functional in any environment—in day or night. Unlike radar and LiDAR sensors that must transmit and receive signals, an FIR camera senses signals from objects radiating heat, making it a "passive" technology. Because it scans the infrared spectrum just above visible light, a far-infrared camera generates a new layer of information, detecting objects that…
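The passive-sensing idea can be illustrated with synthetic data: because a pedestrian radiates heat well above a cool night background, even a plain temperature threshold separates the warm region. The temperatures, scene size, and threshold below are invented for illustration:

```python
import numpy as np

# Toy illustration of passive FIR imaging: warm objects stand out against a
# cooler background, so a simple threshold flags a pedestrian-sized warm
# region. All values are synthetic, in degrees C.
rng = np.random.default_rng(0)
scene = rng.normal(15.0, 1.0, size=(120, 160))              # cool night background
scene[40:80, 70:85] = rng.normal(33.0, 0.5, size=(40, 15))  # warm "pedestrian"

mask = scene > 25.0            # radiated-heat threshold, far above background noise
ys, xs = np.nonzero(mask)
print("warm pixels:", int(mask.sum()),
      "bounding box:", ys.min(), ys.max(), xs.min(), xs.max())
```

The threshold sits many standard deviations from both distributions, so the warm region is recovered exactly; real FIR classification is far harder, but the day/night invariance of the heat signal is what this sketch is meant to show.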
The makings of a
successful imaging lens
Part Three: Testing and metrology, ensuring you get what you asked for
Greg Hollows
…make your final lens not meet the desired performance criteria. Ultimately, the testing of the completed lens assembly measures how sensitive the design is to the combination of all component tolerances. As mentioned before, it is critical to fully define the final acceptance criteria. Additionally, correlating all parties' testing methodology and techniques can take time, creating considerable issues if it is rushed at the end of the process. There are a number of system-level tests often used to evaluate the final lens, including modulation transfer function (MTF), depth of field, distortion, and stray light.

[Figure 2: The raw material used to make individual lens components must meet certain requirements for homogeneity, refractive index, dispersion, and other properties.]

[Figure 3: Interferometry is used to quantify the surface accuracy of individual glass components.]

MTF testing
MTF testing is performed on every lens assembled by nearly all manufacturers. This test quantifies how much contrast is present at different levels of resolution. Basically, it describes the crispness and clearness of an image. The first step is understanding where in the field of view (FOV) the testing should be done and how many field points should be reviewed to ensure conformance. Additionally, the wavelengths of light used can have a significant influence on the test. Different light sources have varying spectral content and will produce mixed MTF results on the same lens.

The most basic, and widely-used, version of MTF testing is reverse projection (Figure 4). This is done by back illuminating a precision target that is placed at the imaging plane of the lens (the sensor location). This creates a high-contrast object in that location, which is then projected through the lens onto a wall or screen, creating a projected image in the location of the object the lens is designed to look at. This is running the lens in reverse to see its performance. The advantage of this technique is that it gives a great deal of information across the entire FOV at the same time and has high-volume throughput. The downside is that it is fundamentally a subjective pass/fail test and will not produce absolute detail. As resolutions improve, it becomes more difficult to use this technique.

[Figure 4: Reverse projection is the most common MTF testing method.]

The next method of MTF testing utilizes cameras or sensors, tightly-aligned targets and objects, and software to produce more accurate detail at any given position of interest (Figure 5). This form of testing is more precise than reverse projection but much slower, since it looks at discrete field points one at a time. Automation has increased throughput, but this can add cost. Additionally, it can require a more sophisticated operator than reverse projection.

[Figure 5: The Trioptics ImageMaster® is a camera-based MTF measurement technique that is more precise but more time-consuming than reverse projection.]

Systems that can perform these types of tests are available, but many optical companies have also developed their own in-house versions of these systems. However, these systems all use different software, hardware, light sources, and algorithms to perform their tests, which can make it difficult to correlate one system to another. This is especially true when there are complex requirements relating resolution in different areas across the image and at different working distances. This is where outbound testing from the supplier and inbound testing at the customer site can get complicated if not sorted out well before parts are delivered. For extremely high-volume production, streamlined, fully automated MTF systems can be developed to get the benefits of speed and fidelity, but they are usually highly-tailored systems working with a narrow group or specific product.

Depth of field testing
Testing depth of field is done by evaluating the MTF at different distances above and below best focus. This can be done with many, if not all, of the techniques described in the MTF section, but it is not always a designed-in
capability of the test equipment. Typically, a specific set of criteria that matches the limitations of the test equipment needs to be developed to get accurate results. Understanding how to align with your supplier on MTF testing will make defining the depth of field testing much easier to achieve. Custom test setups may need to be developed for this test.

Distortion
Testing distortion usually requires automated systems, as visual systems that work like reverse projection struggle to create an easy pass/fail distortion condition. Many software packages are available that can be used to do this testing, but careful alignment of the sensor to the optics and mechanical alignment to the target are required to achieve proper results. Additionally, some MTF testing systems can produce distortion tests if configured correctly. Testing at the correct positions in the field is critical to ensuring that accurate results are obtained. It is easy to fail good product if this is not specified correctly.

Stray light
Stray light is the amount of unwanted light that makes it to the sensor. It is present in all lens systems and manifests itself in a couple of different ways.

It can occur when light enters the lens at angles outside the lens' FOV and scatters off a glass or metal surface/edge onto the sensor. The result is a soft spot or a bright hot spot across the image. Reduction of this issue is done by blackening the edges of lenses, oversizing lens elements, adding sophisticated lens coatings, and inserting baffles. The trade-off will be size, weight, and likely some amount of cost. Another form of stray light can be ghost images, in which light intended to go through the lens as part of the FOV reflects between individual lens elements and eventually arrives at the sensor plane. For most machine vision applications, an intensity of 10⁻⁴ will not create ghost images, but applications with high intensity lights, laser-based illumination, or the presence of the Sun can experience detectable ghost images that would not affect most other applications.

Correcting for ghost images requires desensitizing the lens design to these issues. This is a best practice, but it reduces the degrees of freedom the optical designer can use to improve other specifications. Additionally, it will require tightly controlling antireflection coatings on the lens elements, which increases costs.

Understanding your desired end result, of course, is required to construct the proper lens specification. Testing and validation is critical to success and should be discussed up front in the project development, and at the latest, before parts are ordered. Ensure that you and your supplier are aligned on your joint test strategy. This is the best way to shorten lead times, reduce headaches, and achieve the best final results.
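The contrast quantity behind MTF testing, and its use in through-focus depth of field evaluation, can be sketched numerically. A sinusoidal target with perfect contrast is blurred by a box kernel standing in for the lens point spread function; the surviving modulation (Imax - Imin)/(Imax + Imin) is the contrast at that spatial frequency. The kernel, frequencies, defocus model, and contrast floor are all invented for illustration, not taken from any real test bench:

```python
import numpy as np

# Contrast (modulation) of a blurred sinusoidal target, as in MTF testing.
N = 2000
X = np.linspace(0.0, 1.0, N)

def modulation(freq_cycles, kernel_width):
    target = 0.5 + 0.5 * np.sin(2 * np.pi * freq_cycles * X)  # perfect contrast = 1.0
    kernel = np.ones(kernel_width) / kernel_width             # crude PSF stand-in
    image = np.convolve(target, kernel, mode="same")
    core = image[200:-200]                                    # trim edge artifacts
    return (core.max() - core.min()) / (core.max() + core.min())

# 1) MTF curve: for a fixed blur, contrast falls as spatial frequency rises.
mtf = {f: modulation(f, 151) for f in (5, 20, 50)}

# 2) Through-focus sweep: blur grows with defocus; a contrast floor of 0.3
#    at a fixed frequency defines the usable depth of field.
def contrast_at(defocus_mm):
    return modulation(30, 1 + int(400 * abs(defocus_mm)))

positions = [round(p, 2) for p in np.arange(-0.5, 0.51, 0.1)]
in_spec = [p for p in positions if contrast_at(p) >= 0.3]
print({f: round(m, 3) for f, m in mtf.items()})
print("usable focus range:", in_spec[0], "to", in_spec[-1], "mm")
```

The same modulation measurement serves both tests, which is why the article notes that depth of field can be evaluated with most of the MTF techniques: only the sweep variable changes, from spatial frequency to focus position.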
Extruded Enclosure
Use with popular 29 mm camera models and with Teledyne DALSA's Genie Nano camera models.
• Able to change and adjust focusing of the optics
while mounted.
• MIDOPT Polycarbonate – AR coated window.
• Large variety of options: Mounts, Visors,
Air Curtains, Windows & Filters.
• Custom Sizes available.
Light Weight,
Only 160 g.
MATRIX 120
Smallest in size, Giant in performance.
This table is published as a service. The publisher does not assume liabilities for errors or omissions.
Please visit www.vision-systems.com for the comprehensive online Camera Directory listings.
WWW.LUMENERA.COM
MASTERMIND
Visit us in Hall 1, Stand E12
Need Depth?
Go the distance with the Trilogy
series of 3D video cameras by
Munro Design & Technologies
Non-stereoscopic Non-scanning
136 mm
+ 640 x 480 x 12 µm + IP & Analog Video + 5x Continuous Zoom
+ Compact + 15-75 mm Lens + 60 Hz
Finding weakness in strength. The new Polarsens sensor technology in our XCG-CP510 polarising camera reveals tell-tale clues that can lurk unseen – from subtle imperfections to dangerous defects.
8 mm · 12 mm · 16 mm · 25 mm · 35 mm · 50 mm · 75 mm

Full 5-Mega-Pixel resolution optimized for both 1/1.2" and 2/3" imagers. The industry's most compact* body and leading-edge optical performance employing a φ29 mm diameter lens. Four newly-released additional models expand the current lens portfolio for 1/1.2" imagers to seven lenses.

1/1.2" 8mm F/2.4 (Model M112FM08) NEW
1/1.2" 12mm F/2.0 (Model M112FM12)
1/1.2" 16mm F/2.0 (Model M112FM16)
1/1.2" 25mm F/1.8 (Model M112FM25)
1/1.2" 35mm F/2.1 (Model M112FM35) NEW
1/1.2" 50mm F/2.8 (Model M112FM50) NEW

Compatible with IMX174/IMX249 (1/1.2" 2.4-Mega-Pixel) and IMX250/IMX264 (2/3" 5-Mega-Pixel).
*Among machine-vision lenses that are compatible with a 1/1.2" imager and 5-Mega-Pixel resolution (survey by Tamron, effective as of May, 2017)
www.tamron-usa.com
SUBSCRIBE NOW AT
WWW.VSD-SUBSCRIBE.COM
CAMERA DISTRIBUTORS

Product | Sensor type | Color/Mono | Scan type | Image format | Spectrum digitized | Interface | Frames/s | Data rate

1stVision Inc Andover, MA, USA; 978-474-0044; www.1stvision.com; info@1stvision.com
Allied Vision Goldeye SWIR Series | InGaAs | M | Area array | 320 x 256, 640 x 480 | SWIR | Digital I/O, Ethernet, GigE | 344 | 106 MB/s
Allied Vision Mako Series | CMOS | C+M | Area array | VGA–5 Mpixels | NIR, VIS | Digital I/O, Ethernet, GigE, USB3 | ≥1000 | 110 MB/s
IDS Imaging RE/FA IP65/67 GigE Series | CCD & CMOS versions | C+M | Area array | — | NIR, VIS | GigE | 205 | 110 MB/s
Allied Vision Manta Series | CCD & CMOS versions | C+M | Area array | — | NIR, VIS | Digital I/O, Ethernet, GigE | ≥1000 | 110 MB/s
Teledyne Dalsa Linea Line Scan Series | CMOS | C+M | Linescan | 2048 x 1–16,384 x 1 | NIR, VIS | Camera Link, Camera Link Full, Camera Link HS, Digital I/O, GigE | ≥80 kHz | —
Teledyne Dalsa Genie Nano Series | CMOS | C+M | Area array | VGA–25 Mpixels | NIR, VIS | Digital I/O, GigE | ≥1000 | 110 MB/s
IDS Imaging LE/ML Series | CMOS | C+M | Area array | — | NIR, VIS | Digital I/O, GigE, USB 1.0/2.0, USB3 | ≥1000 | 400 MB/s
IDS Imaging uEye CP Series | CMOS | C+M | Area array | VGA–18 Mpixels | NIR, VIS | Digital I/O, USB3 | ≥1000 | 400 MB/s
Teledyne Dalsa Nano 5GigE Series | CMOS | C+M | Area array | — | NIR, VIS | 5GigE | 190 | 985 MB/s
Teledyne Dalsa Calibir LWIR Camera Series | Microbolometer | M | Area array | — | LWIR | Analog, Digital I/O, GigE, RS-170 | 50 | 16 MB/s
Teledyne Dalsa Genie Nano XL Series | CMOS | C+M | Area array | — | NIR, VIS | GigE | 7.1 | 225 MB/s
JAI Spark Series | CMOS | C+M | Area array | — | NIR, VIS | Camera Link, Camera Link Full, GigE | 253 | 590 MB/s
JAI GO Series | CMOS | C+M | — | — | NIR, VIS | Camera Link Full | 107 | 535 MB/s
Allied Vision Prosilica GT Series | CCD & CMOS versions | C+M | Area array | — | NIR, VIS | GigE | 57 | 110 MB/s
Allied Vision Bonito Pro CoaXPress Series | CMOS | C+M | Area array | — | NIR, VIS | CoaXPress | 142.6 | 2.3 GB/s

A&B Software New London, CT, USA; 860-823-8301; www.ab-soft.com; sales@ab-soft.com
Amazon GigE Vision series | CCD | C+M | Area array | — | VIS | Ethernet, GigE | — | —
Prosilica GC series | CCD | C+M | Area array | — | VIS | GigE | — | —
Prosilica GE series | CCD | C+M | Area array | — | VIS | GigE | — | —
Prosilica GX series | CCD | C+M | Area array | — | VIS | GigE | — | —
Prosilica GT series | CCD | C+M | Area array | — | VIS | GigE | — | —
Manta GigE series | CCD | C+M | Area array | — | — | GigE | — | —
Pearl 1394 series | CCD | C+M | Area array | — | VIS | FireWire | — | —
Nile 1394 series | CCD | C+M | Area array | — | VIS | FireWire | — | —
Guppy 1394 series | CMOS | C+M | Area array | — | — | FireWire | — | —
Pike 1394b series | CCD | C+M | Area array | — | VIS | FireWire | — | —

ATO Automation Los Angeles, CA, USA; 800-585-1519; www.ato.com; sales@ato.com
GigE Vision Industrial Camera | CMOS | C+M | Linescan | — | Hyperspectral, IR, LWIR, Multispectral, MWIR, SWIR, NIR, X-ray, VIS | GigE | — | —

FRAMOS Taufkirchen/Munich, Germany; 49-89-710667-0; www.framos.com; info@framos.com
Tattile S200 Smart Camera HYP | CMOS | M | Linescan | — | Hyperspectral | GigE | 180 | —
[Figure: Optical filter transmission (%) versus wavelength, 560–700 nm.]