Glossary
1080p
1080p is a short form designating a set of HDTV video modes characterized by 1080 horizontal lines of resolution and progressive scan. In contrast to the 1080i standard, the image in 1080p is not interlaced. In most cases the term also implies the widescreen aspect ratio of 16:9, which corresponds to a horizontal resolution of 1,920 pixels. The supported frame rate can be appended to the term, e.g. 1080p24. 1080p is sometimes referred to as Full HD.
5x5-Debayering
Debayering algorithms estimate the missing colors at each pixel by drawing on the color values of neighboring pixels. The dimension of this analysis depends on the number of surrounding pixels reviewed for each pixel: an analysis based on a 2x2 block of neighboring pixels is referred to as a 2x2 environment, one based on a 4x4 block as a 4x4 environment, and so on. The more neighboring pixels are taken into account, the more precise the color rendering in an image becomes, preventing color errors. The PGI debayering algorithm works with a 5x5 environment.
For technical reasons, a 4x4 debayering is used for some sensors / models.
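The idea of estimating missing colors from a pixel neighborhood can be sketched in a few lines of NumPy. This is a simplified bilinear sketch using a 3x3 environment, not Basler's PGI algorithm; the function names and the choice of a BG Bayer pattern are illustrative assumptions.

```python
import numpy as np

def bayer_bg_mosaic(rgb):
    """Simulate a BG Bayer sensor: keep one color value per pixel.
    Rows alternate blue/green and green/red (BG pattern)."""
    h, w, _ = rgb.shape
    mask = np.zeros((h, w, 3), dtype=bool)
    mask[0::2, 0::2, 2] = True   # blue
    mask[0::2, 1::2, 1] = True   # green
    mask[1::2, 0::2, 1] = True   # green
    mask[1::2, 1::2, 0] = True   # red
    mosaic = np.zeros((h, w))
    for c in range(3):
        mosaic[mask[:, :, c]] = rgb[:, :, c][mask[:, :, c]]
    return mosaic, mask

def demosaic_3x3(mosaic, mask):
    """Estimate each pixel's missing colors by averaging the known
    same-color samples inside its 3x3 neighborhood."""
    h, w = mosaic.shape
    padded = np.pad(mosaic, 1)
    pmask = np.pad(mask, ((1, 1), (1, 1), (0, 0)))
    out = np.zeros((h, w, 3))
    for c in range(3):
        vals = np.where(pmask[:, :, c], padded, 0.0)
        cnt = pmask[:, :, c].astype(float)
        # 3x3 box sums of the known same-color samples and their count
        ksum = sum(vals[i:i + h, j:j + w] for i in range(3) for j in range(3))
        kcnt = sum(cnt[i:i + h, j:j + w] for i in range(3) for j in range(3))
        out[:, :, c] = ksum / np.maximum(kcnt, 1)
    return out
```

A larger environment (e.g. 5x5) simply widens the window from which same-color samples are gathered, which is what makes the color estimate more robust against local errors.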
720p
720p describes the lowest resolution defined in the television standard for High Definition TV. "720" stands for the vertical resolution in lines (or rows of pixels). The letter p indicates transmission and display via progressive scan (in contrast to interlaced scan). Assuming an aspect ratio of 16:9, the horizontal resolution of 720p is 1,280 pixels.
ADC | Analog-to-digital converter
Analog-to-digital converter (ADC) - an electronic device that converts a voltage to a proportional digital number.
API
Application Programming Interface (API) - a set of specifications allowing application software to communicate with the camera.
Area Scan Camera
Area Scan cameras contain a rectangular sensor with more than one line of pixels, which are exposed simultaneously.
Basler design-in sample | design-in sample
Basler design-in samples are early versions of products. They can be used to evaluate a product’s basic performance and to assess whether it is suitable for integration into an application.
Until series production of the product starts, the software can change; in rare cases, the hardware may change as well.
Once series products are available, Basler ships only those products.
The number of available design-in samples is limited, so there are no set delivery times.
Basler design-in samples carry a 1-year warranty instead of the 3-year warranty.
In some manuals the word “prototype” is used for design-in samples.
Bayer BG 12 / Bayer GB 12
Both “Bayer BG 12” and “Bayer GB 12” are formats for transmitting image data. When a color camera is configured for the Bayer BG 12 / Bayer GB 12 pixel data format, 2 bytes (16 bits) of data are transmitted per pixel, of which 12 bits are effective. The brightness information is carried in the 12 least significant bits, while the 4 most significant bits are filled with zeros. The pixel data is neither processed nor interpolated in the Bayer BG 12 format, i.e. the raw data is used.
The sequence of the data in the Bayer BG 12 format follows the arrangement of the color pixels on the sensor in the form of the Bayer pattern. Within the Bayer pattern, the rows alternate between lines of blue and green pixels and lines of green and red pixels. In the Bayer BG 12 format this means, for example, that one 12-bit value is transmitted for each blue and green pixel along a row, followed by a 12-bit value for each green and red pixel in the next row, and so on, until all rows of the color sensor have been read out.
The sole difference between the Bayer BG 12 format and the Bayer GB 12 format is that the data transferred in the Bayer BG 12 format starts with a blue pixel, while the Bayer GB 12 format starts with a green one.
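Because the 12 effective bits sit in the low half of each 16-bit word, extracting or rescaling the values is a simple bit operation. A small sketch (the sample values are illustrative):

```python
import numpy as np

# In the Bayer BG 12 format each pixel arrives as a 16-bit word:
# the 4 most significant bits are zero, the 12 least significant
# bits carry the raw sensor value.
raw = np.array([0x0FFF, 0x0800, 0x0001], dtype=np.uint16)

# Mask off the (already zero) top nibble to get the 12-bit values.
values = raw & 0x0FFF                       # range 0..4095

# For an 8-bit display, drop the 4 least significant bits.
display = (values >> 4).astype(np.uint8)    # range 0..255
```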
Bayer BG 12 Packed / Bayer BG 12 p
“Bayer BG 12 Packed” (Bayer BG 12 p) is a data format for transmitting image data. When a color camera is configured for the Bayer BG 12 Packed pixel data format, 12 bits of data are transmitted per pixel; the data for 2 pixels is packed into 3 bytes. This saves 4 bits of bandwidth per pixel compared with the unpacked Bayer BG 12 format. In the Bayer BG 12 Packed format, the pixel data is neither processed nor interpolated, meaning the raw data is used.
The sequence of the data in the Bayer BG 12 Packed format follows the arrangement of the color pixels on the sensor in the form of the Bayer pattern. Within the Bayer pattern, the rows alternate between lines of blue and green pixels and lines of green and red pixels. In the Bayer BG 12 Packed format this means, for example, that one 12-bit value is transmitted for each blue and green pixel along a row, followed by a 12-bit value for each green and red pixel in the next row, and so on, until all rows of the color sensor have been read out.
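The 2-pixels-in-3-bytes packing can be illustrated with a short unpacking routine. Note that the exact byte layout differs between standards; the sketch below assumes the GenICam PFNC "12p" layout (pixel 0 in byte 0 plus the low nibble of byte 1, pixel 1 in the high nibble of byte 1 plus byte 2), so treat it as an illustration rather than a description of any specific camera.

```python
def unpack_12p(buf):
    """Unpack 12-bit pixels stored two-per-three-bytes
    (GenICam PFNC '12p' layout assumed)."""
    pixels = []
    for i in range(0, len(buf), 3):
        b0, b1, b2 = buf[i], buf[i + 1], buf[i + 2]
        p0 = b0 | ((b1 & 0x0F) << 8)   # byte 0 + low nibble of byte 1
        p1 = (b1 >> 4) | (b2 << 4)     # high nibble of byte 1 + byte 2
        pixels.extend([p0, p1])
    return pixels

# Two 12-bit values (0xABC and 0x123) packed into three bytes:
packed = bytes([0xBC, 0x3A, 0x12])
```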
Bayer BG 8 / Bayer GB 8
Both “Bayer BG 8” and “Bayer GB 8” are formats for transmitting image data. When a color camera is configured for the Bayer BG 8 / Bayer GB 8 pixel data format, 8 bits of data are transmitted per pixel. The sequence of the data follows the arrangement of the color pixels on the sensor in the form of the Bayer pattern. Within the Bayer pattern, the rows alternate between lines of blue and green pixels and lines of green and red pixels. In the Bayer BG 8 format this means, for example, that one 8-bit value is transmitted for each blue and green pixel along a row, followed by an 8-bit value for each green and red pixel in the next row, and so on, until all rows of the color sensor have been read out. The pixel data is neither processed nor interpolated. As such, the Bayer BG 8 and Bayer GB 8 formats are so-called raw data formats.
The sole difference between the Bayer BG 8 format and the Bayer GB 8 format is that the data transferred in the Bayer BG 8 format starts with a blue pixel, while the Bayer GB 8 format starts with a green one.
Bayer pattern
A regular array of red, green and blue color filters covering a sensor's pixels. Each pixel is covered by a color filter of one color and the closest neighbors are covered by color filters of the other colors. For example, the nearest neighbors of a "green" pixel are "red" and "blue" pixels. See also color creation and color interpolation.
BCON for LVDS Interface
The BCON for LVDS interface also includes camera control via I²C and various trigger options.
Binning
Binning describes the technique of combining several neighboring pixels on the sensor of an industrial camera into a single block of pixels, with the goal of improving the signal-to-noise ratio.
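A minimal sketch of 2x2 binning on a NumPy image array: summing a block of four pixels quadruples the signal, while uncorrelated noise grows only by about a factor of two, so the signal-to-noise ratio roughly doubles (at the cost of spatial resolution). The function name is illustrative.

```python
import numpy as np

def bin2x2(img):
    """Sum each 2x2 block of pixels into a single output pixel.
    Height and width must be even; resolution halves per axis."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
```

For example, `bin2x2(np.arange(16).reshape(4, 4))` collapses the 4x4 input into a 2x2 array of block sums.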
Blooming
An artifact occurring in images acquired by a CCD image sensor: when a pixel is saturated, excess charge overflows into adjacent pixels, which then appear bright as well. See also full well capacity.
Bright field | bright field
The term bright field is used to describe the installation position of the illumination: The light shines directly on the object, almost parallel to the optical axis of the camera.
Camera Link
Camera Link is an extremely robust and powerful interface designed for industrial cameras in all performance categories. The bandwidth available with Camera Link can accommodate very small cameras as well as cameras with resolutions of several megapixels and speeds often reaching several hundred frames per second. Camera Link is currently the recommended standard interface for data rates from 100 MB/s to about 800 MB/s.
Because Camera Link was specifically designed for use with industrial cameras, it can handle large amounts of data easily and securely. All components in a Camera Link solution must meet the Camera Link standard.
Camera Link HS
Camera Link HS was released as the most recent of the high-speed standards. It defines both the CX4 and the SFP+ connector types as supported interfaces. The cable length of the optical connection can reach several hundred meters. Especially for line scan camera applications, this results in highly accurate image acquisition and control.
Carrier Board
A carrier board is the baseboard that hosts a processing module, such as a system on module (SoM), in an embedded (vision) system, providing it with power and the physical connectors and interfaces.
CCD
A charge-coupled device (CCD) is a device for movement of electrical charge.
It is often integrated with an image sensor, so that a two-dimensional picture can be captured.
CCD Sensor
The basic job of CCD and CMOS sensors is to convert light (photons) into electronic signals (electrons). In general, the low noise level, high fill factor, and good signal-to-noise ratio exhibited by CCD sensors result in very high quality images. These characteristics make cameras based on CCD sensors a good fit for machine vision applications. For more information, have a look at our white paper "Small Differences Along the Way from Light to a Signal".
CCT+
The Basler Camera Configuration Tool Plus (CCT+) is a Windows-based tool for configuring Basler cameras that adhere to the Camera Link standard. The CCT+ tool is available for 32-bit and 64-bit operating systems and frame grabbers. For Basler's newer Camera Link cameras, the CCT+ tool has been superseded by the Basler pylon software.
Charge-Coupled Device
A charge-coupled device (CCD) is a device for the movement of electrical charge. It is often integrated with an image sensor, so that a two-dimensional picture can be captured.
CMOS
Complementary metal-oxide-semiconductor (CMOS) is a technology for constructing integrated circuits. CMOS technology is used in microprocessors, microcontrollers, static RAM, and other digital logic circuits. CMOS technology is also used for several analog circuits such as image sensors (CMOS sensor), data converters, and highly integrated transceivers for many types of communication. Frank Wanlass patented CMOS in 1967 (US patent 3,356,858).
CMOS Sensor
The basic job of CCD and CMOS (complementary metal oxide semiconductor) sensors is to convert light (photons) into electronic signals (electrons).
CMOS sensors have made inroads into machine vision based largely on their advantages in speed (frame rate) and resolution (number of pixels) compared to CCD imagers. Improvements in CMOS technology and demand from high-volume users such as the automotive market are making CMOS image sensors more and more attractive for machine vision applications. For more information, have a look at our white paper "Small Differences Along the Way from Light to a Signal".
C-Mount
A lens mount for screw-in lenses. The thread has a nominal diameter of 25.4 mm (1 inch) and 32 turns per inch. The lens mount employs a flange-to-image-plane distance of 17.526 mm (0.69 inch).
CoaXPress | CXP
The CoaXPress (CXP) standard was originally launched by various companies in the industrial image processing sector. The goal was to develop a fast data interface that could also carry large data volumes across greater distances. The first CoaXPress interfaces were introduced at "Vision", the leading trade fair for industrial image processing, in Stuttgart in 2008. After three more years of development, CXP 1.0 was officially released as a standard in 2011 and has since established itself in industrial image processing. The standard was later developed further into CoaXPress 2.0. An interface with the CoaXPress 1.0/1.1 standard supports data rates as high as 6.25 Gbps. The transmission speed of the CoaXPress 2.0 standard is twice as high, at up to 12.5 Gbps. This allows for even higher resolutions and frame rates compared to other efficient standards. In contrast to the preceding version, the new standard needs only half as many cables to transfer the same amount of data.
Thanks to the combined triggering and power supply (power over CXP), only one CoaXPress cable is needed, which can have a maximum cable length of 40 meters – another benefit of this update.
Cobot
A cobot is a newer type of smaller, cost-effective robot that can work side by side with humans. Cobots are especially advantageous for smaller companies because they are much easier to install, program and maintain. Vision guidance - control via optical sensors - is particularly important in this context for reliably detecting objects.
Color-Anti-Aliasing
Color errors, especially on sharp edges, are a common side effect of less effective debayering algorithms. They become much more frequent and prominent in sections where an image has high spatial frequencies. These arise for example when the distances between black and white lines are so tight that they are placed in neighboring pixels, or at sharp edges with one light and one dark side. PGI Color-Anti-Aliasing analyzes and corrects discolorations for all potential frequencies below the theoretical limit, the Nyquist frequency.
Color Creation
An image sensor is only capable of delivering grey values. To obtain color information, each pixel of the sensor is covered by a color filter allowing the pixel to deliver the gray value for the color of its color filter (primary colors). To obtain full color information, color filters of different colors (e.g. red, green, blue) are used. They are assigned to the sensor's pixels in a regular pattern (e.g. forming the Bayer pattern) such that neighboring pixels are covered by color filters of different primary colors. For example, the nearest neighbors of a "green" pixel are "red" and "blue" pixels. The gray values for the primary colors not measured can then be interpolated for each pixel from the gray values delivered by the neighboring pixels. As an alternative to using color filters of different colors on a single sensor, different sensors can be used where each one is covered by a color filter of only one color.
Color filter
A colored transparent cover for a pixel. Only light of the color filter's color can strike the pixel, and accordingly only the grey value of the color filter's color is measured. See also "color creation".
Color Interpolation
Refers to the method of calculating full color information for a pixel from the color information measured by the pixel and from the color information delivered by its neighboring pixels. Also called demosaicing.
CS-Mount
A lens mount for screw-in lenses. It is identical to the C-mount except for the shorter flange-to-image-plane distance of 12.526 mm (0.493 inch). C-mount lenses can also be used on CS-mount cameras with the addition of a 5 mm spacer ring (C-CS adapter).
Dark field | dark field
The term dark field describes the installation position of the illumination: the light strikes the object at a flat angle, so that mainly light scattered by surface structures reaches the camera. This makes scratches, edges, marks or notches on surfaces visible, among other applications.
Day/Night Functionality
True day/night functionality via an automatically retractable IR-cut filter provides a high image quality color mode for daylight applications and a black and white mode for night and low light conditions.
Debayering
A standard image sensor initially produces only black-and-white images. Color images require the use of a color filter matrix; the most well-known and frequently used one is the Bayer pattern. The sensor then delivers a mosaic of color pixels whose color information is incomplete: each pixel carries only one color value. To produce a proper color image, the missing color values are reconstructed using a sophisticated algorithm. This process is known as demosaicing or, named after the Bayer pattern, debayering.
Debayering can be performed either directly in the camera's firmware or afterwards in post-processing using the raw data. For industrial applications, debayering via firmware is recommended.
Deep Learning
Deep learning refers to machine learning with so-called artificial neural networks (ANNs). To process image data, an ANN is structured in different layers; if it consists of more than three layers, it is called "deep". The most common ANNs are Convolutional Neural Networks (CNNs), which are based on mathematical convolution operations.
Deep learning enables many new applications that were previously not possible, such as cell classification in medicine & life sciences. Another example is the increase of precision and robustness in industrial applications.
Denoising
Noise is an unavoidable part of any image and originates from several sources, such as photon shot noise, image sensor noise, or local noise. Typically, many calculation steps are required between a raw image and a finished color image, and each of these steps can reduce or, in most cases, amplify the noise. PGI denoising avoids this amplification by executing these operations in parallel with one another.
DirectShow
Image editing software with a DirectShow interface can receive data from any DirectShow-compliant image capture device.
DNS
Domain Name System (DNS) - the naming system that translates human-readable host names into IP addresses.
Dynamic Range
Dynamic Range describes the ratio of the largest signal to the smallest signal (that can be distinguished from noise) in an image. Dynamic range is the ability of a camera to produce an image of an area that includes both very low light (shadowed) and full light situations simultaneously with minimum noise or interference.
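As a worked example with hypothetical sensor figures, dynamic range is commonly quoted as the ratio of the full well capacity to the temporal dark noise, expressed in decibels or in f-stops:

```python
import math

# Hypothetical figures: 10,000 e- full well capacity,
# 5 e- temporal dark noise (smallest distinguishable signal).
full_well = 10_000
dark_noise = 5

ratio = full_well / dark_noise      # 2000:1
dr_db = 20 * math.log10(ratio)      # about 66 dB
dr_stops = math.log2(ratio)         # about 11 stops
```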
Embedded System
An embedded system is a combination of processing hardware and software integrated into a larger device, where it handles particular tasks for its associated device.
Embedded Vision System
An embedded vision system combines a camera with compact processing hardware. Technically, it can be based on a single board computer, a system on module (SoM), or an individually adapted processing board, each of which is supplemented with additional camera technology.
EMVA
The European Machine Vision Association (EMVA) is an association of companies and corporations involved in machine vision and standardization in that field. The EMVA has, for instance, defined and released the EMVA1288 and the GenICam standard.
EMVA1288
The EMVA1288 standard describes a unified method to measure, compute, and present the specification parameters of cameras and image sensors used in machine vision applications. The EMVA 1288 standard includes a well-defined method for measuring the most common noise sources. It also includes a mandatory and detailed description of measurement setups, environmental conditions, and test requirements.
A downloadable version of the EMVA 1288 standard is available from the EMVA web site at www.standard1288.org.
Basler is leading the effort to standardize image quality and sensitivity measurement for machine vision cameras and sensors. All measurements done by Basler will be in 100% compliance with the European Machine Vision Association "EMVA 1288" standard. Basler is giving this standard the strongest support.
Ethernet
Ethernet is a family of computer networking technologies for local area networks (LANs) commercially introduced in 1980. Standardized in IEEE 802.3, Ethernet has largely replaced competing wired LAN technologies.
Systems communicating over Ethernet divide a stream of data into individual packets called frames. Each frame contains source and destination addresses and error-checking data so that damaged data can be detected and re-transmitted.
The standards define several wiring and signaling variants. The original 10BASE5 Ethernet used coaxial cable as a shared medium. Later, the coaxial cables were replaced by twisted pair and fiber optic links in conjunction with hubs or switches. Data rates were periodically increased, from the original 10 megabits per second to 100 gigabits per second.
Since its commercial release, Ethernet has retained a good degree of compatibility. Features such as the 48-bit MAC address and Ethernet frame format have influenced other networking protocols.
European Machine Vision Association
The European Machine Vision Association (EMVA) is an association of companies and corporations involved in machine vision and standardization in that field. The EMVA has, for instance, defined and released the EMVA1288 and the GenICam standard.
Event Reporting
"Event Reporting" is a feature used by the camera under certain conditions to produce an event while controlling image capture. A corresponding event report is then sent to the PC.
Events can include "overtriggering," for example. The condition for that event would be that the camera receives a trigger signal to record an image but is not yet ready to capture the image. In this case the camera sends a corresponding event notification to the PC.
More about Event Reporting -
Exposure Active Signal
The "Exposure Active Signal" is a digital signal issued via the camera's output port to control the image capture.
The Exposure Active Signal is issued (generally speaking: set to "High") once the exposure time for the image capture starts. The signal is deactivated again when the exposure time ends. The Exposure Active Signal can for example be used as the trigger for a flash and is also helpful if used with a system where the camera or the object is moved between two shots. As a measure against motion blur, the signal can be evaluated to avoid movement during exposure.
Field Programmable Gate Array (FPGA)
A Field Programmable Gate Array (FPGA) is an integrated circuit whose logic can be configured by the user after manufacturing. This makes it possible to optimize the FPGA for particular applications, which makes it particularly suitable for vision systems.
Fixed Box Camera
A fixed camera, e.g. for video surveillance. Basler's IP fixed box cameras are characterized by exceptional image quality and high sensitivity, but also by the high frame rates they can achieve using different compression formats.
Fixed Dome Camera
Fixed camera with dome housing. Basler IP Fixed Dome Cameras have vandal-resistant aluminum housings, allowing video surveillance applications outdoors and under tough indoor conditions. Basler IP Fixed Dome Cameras can be easily mounted to a wall or ceiling, and an internal three axis camera bracket allows complete flexibility when aiming the camera.
Flash Window Signal
The “Flash Window Signal” is a digital signal sent by the camera to control external light sources. The Flash Window Signal indicates the state of image capture for a camera using a rolling shutter. The signal marks the time window within which a light pulse (such as from a flash device) must be triggered to be effective for capturing the image on the entire sensor.
The signal becomes active when the time window for the light pulse begins and switches back to inactive when it ends. With a rolling shutter, flash-style lighting can help avoid the rolling shutter effect on moving objects.
F-Mount
A bayonet type lens mount introduced by Nikon. The lens mount employs a flange to image plane distance of 46.5 mm (1.83 inch).
FPS
Unit of the frame rate. The frame rate describes the frequency at which a video stream is updated; it is measured in fps (frames per second). A higher frame rate is an advantage when the video stream shows movement, because it provides continuously high image quality.
Frame Grabber
An electronic device installed in a PC for interfacing to a camera. The frame grabber (FG) accepts video streams transmitted from a camera, can manipulate them and makes them available to the PC. Control signals to and from the camera are also passed by the frame grabber. Used for cameras that adhere to the Camera Link standard.
Frame Rate
The number of frames per second that are acquired or transmitted.
Frames per Second
Unit of the frame rate. The frame rate describes the frequency at which a video stream is updated; it is measured in fps (frames per second). A higher frame rate is an advantage when the video stream shows movement, because it provides continuously high image quality.
Frame Start Trigger Delay
"Frame Start Trigger Delay" is a feature used to control the image capture. Users can deploy this feature to define a time interval by which a camera's image capture is delayed of microseconds. After receipt of the trigger signal by the camera, the camera delays by the defined period before recording the picture. This allows for the image area on a camera recording images of a conveyor belt to be adjusted for better results, instead of trying to adjust the band's existing trigger signal.
More about Frame Start Trigger Delay -
Full HD
Full HD or 1080p is a shorthand term for a set of HDTV video modes characterized by 1,080 horizontal lines of resolution and progressive scan. In contrast to the 1080i standard, the 1080p image is not interlaced. In most cases, the term 1080p also implies the widescreen aspect ratio of 16:9, which corresponds to a horizontal resolution of 1,920 pixels. The supported frame rate can be appended to the term, e.g. 1080p24.
More about Full HD -
Full well capacity
This quantity refers to the largest charge a pixel can hold before charge overflows into adjacent pixels within the sensor, causing so-called blooming. Full well capacity and dark noise together determine the dynamic range of a sensor or camera.
More about Full well capacity -
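The relationship between full well capacity, dark noise, and dynamic range can be sketched numerically; the values below are hypothetical and do not describe any particular sensor:

```python
import math

def dynamic_range_db(full_well_capacity_e, dark_noise_e):
    """Dynamic range in dB from full well capacity and dark (read) noise,
    both given in electrons."""
    return 20 * math.log10(full_well_capacity_e / dark_noise_e)

# Hypothetical sensor: 10,000 e- full well, 2 e- dark noise
print(round(dynamic_range_db(10_000, 2), 1))  # 74.0 (dB)
```

A larger full well or a lower dark noise both widen the dynamic range.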
GenApi
An API for configuring machine vision cameras. The API adheres to the GenICam standard.
More about GenApi -
GenICam Explorer
The GenICam Explorer is integrated into the microDisplay X software. The tool discovers connected cameras automatically and provides direct access to the camera's GenICam interface. Via a graphical user interface (GUI), the camera connection, the link topology, and the camera itself, along with the frame grabber firmware, can be configured and controlled, and the settings can be saved. The GUI options are an alternative to configuration via the SDK.
The GenICam Explorer is available for the camera interfaces GigE Vision, CoaXPress and Camera Link HS and can be used for all prevalent camera models.
More about GenICam Explorer -
GenICam™
GenICam is a standard of the European Machine Vision Association (EMVA) that allows machine vision cameras to be controlled via a generic programming interface, independent of the interface used for data transmission (e.g. GigE Vision, FireWire, Camera Link, USB), the camera type, and the image format. This approach makes it easy to connect cameras compliant with the GenICam standard without the need for camera-specific configurations.
The core of GenICam is a description of the camera's properties in an XML Descriptor File. Using this file, a translator directly generates an Application Programming Interface called GenAPI or the elements of a Graphical User Interface (GUI). This lets the user easily access the features and functions available on the camera. GenTL, a module of GenICam, provides a unified mechanism for grabbing and streaming images from the camera. The GigE Vision standard requires that cameras with a GigE interface provide the XML Descriptor File. A Descriptor File for Basler’s IEEE 1394 compliant cameras is available as well.
You can find more information about GenICam at www.GenICam.org.
More about GenICam™ -
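A heavily simplified, hypothetical sketch of what a feature node in such an XML Descriptor File can look like (the element names follow the GenApi conventions, but the register name and limits here are invented for illustration):

```xml
<Integer Name="ExposureTime">
  <ToolTip>Exposure time in microseconds</ToolTip>
  <pValue>ExposureTimeReg</pValue>
  <Min>10</Min>
  <Max>1000000</Max>
</Integer>
```

A GenApi-based translator turns nodes like this into camera features that the application can read and write by name, without knowing the underlying register layout.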
GenTL
A module of GenICam providing a unified mechanism for grabbing and streaming images from the camera.
More about GenTL -
Gigabit Ethernet
Gigabit Ethernet (GigE) defines a wired protocol for data transfer based on the widespread Ethernet standard. Gigabit Ethernet currently offers the widest technological flexibility with regard to bandwidth, cable length, and multi-camera functionality. It allows for cable lengths of up to 100 m, power supply via the data cable (PoE), and a data rate of 1000 Mbit/s. Gigabit Ethernet is the fastest growing interface for digital cameras in the field of industrial image processing. It is a universally applicable digital interface, which for the first time provides the potential to produce cameras that can replace analog devices in almost every application.
In addition to the physical interface defined by the Gigabit Ethernet standard (among others), the especially clear and logical implementation of the GigE Vision Standard supports easy integration in all image processing programs via the use of software “libraries”. Also, because the exchange of GigE Vision compatible cameras can be performed without changing the application software, new investments and follow-up costs can be estimated and well-planned with cameras based on the GigE Vision Standard. It is the right choice for data rates of up to 100 MByte/s and makes complex setups with several cameras very simple.
More about Gigabit Ethernet -
GigE Vision
GigE Vision is an interface standard based on Gigabit Ethernet that allows industrial cameras to integrate seamlessly into existing network systems.
More about GigE Vision -
H.264
H.264 is a standard for video compression. H.264 compression has become mainstream and is offered on almost every IP camera and recording server today. H.264 has bridged the large gap between MJPEG (high quality, high bandwidth, and high storage consumption) and MPEG-4 (lower quality, low bandwidth, and low storage consumption). H.264 also has a second “layer” of options known as “profiles”. These profiles offer different codec efficiencies and will affect the overall quality of the transmitted video as well as the bandwidth and storage consumption. Today, different H.264 profiles are available. Only three of these are common in surveillance applications: Baseline profile (BP), Main profile (MP), and High profile (HiP). Each manufacturer must decide which of these profiles will be offered with their cameras.
More about H.264 -
HD
HD or 720p describes the lowest resolution defined in the television standard for High Definition TV. "720" stands for the vertical resolution in lines (or rows of pixels). The letter p indicates transmission and display via progressive scan (in contrast to interlaced scan). Assuming an aspect ratio of 16:9, the horizontal resolution of 720p is 1,280 pixels.
More about HD -
High Profile
A specific H.264 profile used in high definition Blu-Ray movies. It is much more efficient than either Baseline or Main profile. High profile will produce very robust video and fewer compression artifacts than either of the other two profiles. The processing requirements for the video encoder in the camera are higher than the other profiles, but the bandwidth and storage savings are very high.
More about High Profile -
I2C
This is a serial data bus inside devices used for communication between different modules.
More about I2C -
ICX618 Replacement Camera
A replacement camera lets you replace an older camera in the system or end application without necessarily requiring further hardware changes or major adaptations in the configuration or imaging geometry.
To make this possible, the sensor size and resolution of the replacement camera must correspond to those of the old camera. Additionally, the replacement camera must generate nearly identical image data under the same imaging conditions as the camera it replaces. Of course, the mechanical interfaces of the replacement camera must also be compatible with those of the camera to be replaced.
The Basler ace U camera acA640-121gm is the first 1:1 replacement camera on the market with the same image properties as cameras with the discontinued Sony CCD sensor ICX618.
More about ICX618 Replacement Camera -
IEEE1394 (FireWire)
Cameras with FireWire or IEEE1394 ports have been on the market since the 1990s and are used in many machine vision applications. FireWire is actually Apple's trademarked name for the standard. The cameras were originally based on IEEE1394a. This standard set a bandwidth limit of 32 MB/s, enough for smooth transfer of roughly 30 images per second at 1 megapixel of resolution. The current IEEE1394b standard works with twice that bandwidth. Cables and hardware are standardized and widely available, so using IEEE1394 interfaces can in many cases reduce system costs.
In the medium term IEEE1394 is expected to lose its importance for the PC market, and compliant hardware will correspondingly become more difficult to find.
More about IEEE1394 (FireWire) -
IEEE1588 (PTP)
PTP allows for greater precision in timekeeping — down to the nanosecond level when implemented as hardware and down to several microseconds when implemented as software.
More about IEEE1588 (PTP) -
IIoT
IIoT describes interconnected and communicating sensors, instruments, or other network devices in industrial applications. Through this network, data can be collected and evaluated, thus increasing productivity. Under the term IIoT, Basler combines all vision-relevant sensors that generate and share information within a factory automation environment. IIoT sensors will support Industry 4.0 environments.
More about IIoT -
Image Sensor
An electronic device containing a large number of small light sensitive areas (pixels), where photons generate electric charges that are transformed to an electric signal.
More about Image Sensor -
Improved Sharpness
Image sharpness can be further improved by applying a supplemental sharpening factor. The Improved Sharpness feature is particularly helpful in applications that work with OCR (Optical Character Recognition).
More about Improved Sharpness -
Industry 4.0
Industry 4.0 stands for the intelligent networking of machines and processes in industrial production. In terms of vision solutions, this means that cameras and PC/embedded boards can actively communicate with other components along the production line. Standards such as OPC UA and Time-Sensitive Networking (TSN), as well as the appropriate combination of vision hardware (cameras, smart lighting, etc.), image processing software, communication standards, and cloud or edge functions, are important to ensure smooth operations.
More about Industry 4.0 -
Interlaced scan
Readout of an image sensor or update of an image frame occurs in an alternating fashion for odd-numbered and even-numbered horizontal lines of pixels.
More about Interlaced scan -
Internet Protocol
The Internet Protocol (IP) is the principal communications protocol used for relaying datagrams (packets) across an internetwork using the Internet Protocol Suite. Responsible for routing packets across network boundaries, it is the primary protocol that establishes the Internet.
More about Internet Protocol -
Internet Protocol address
Each device on a network is assigned a unique address, known as an Internet Protocol (IP) address, that works in a manner similar to a telephone number.
More about Internet Protocol address -
IoT - Internet of Things
IoT is a collective term for various technologies that enable networking between electronic systems. In the Internet of Things, fixed, physical objects (things) are linked and connected with each other and thus get a virtual representation.
The industrial application of IoT is called IIoT (Industrial Internet of Things).
More about IoT - Internet of Things -
IR-cut filter
An infrared cut filter (IR-cut filter) is used to block light with wavelengths longer than visible light while transmitting visible light. IR-cut filters can operate by either reflecting or absorbing the light to be blocked. They are often used in solid state (CCD or CMOS) video cameras to block infrared light, which would otherwise lower the contrast due to the high sensitivity of many camera sensors to near-infrared light. IR-cut filters used for this purpose mostly operate by reflecting the IR portion of the light.
More about IR-cut filter -
Line Scan
Line Scan cameras have a sensor consisting of 1 to 3 pixel lines.
More about Line Scan -
LVDS
Low-Voltage Differential Signaling
More about LVDS -
MIPI Alliance
More about MIPI Alliance -
MIPI Camera Serial Interface 2 (MIPI CSI-2)
More about MIPI Camera Serial Interface 2 (MIPI CSI-2) -
Mono 12
“Mono 12” is a data format for the transmission of image data from monochrome cameras. When a monochrome camera is configured for the Mono 12 format, 2 bytes (16 bits) of brightness data are transmitted per pixel, of which 12 bits are effective. The brightness information is contained in the 12 least significant bits, while the 4 most significant bits are filled with zeros. The pixel data is not processed or interpolated in the Mono 12 format, i.e. the raw data is used.
More about Mono 12 -
Mono 12 Packed / Mono 12 p
“Mono 12 Packed / Mono 12 p” is a data format for the transmission of image data from monochrome cameras. When a monochrome camera is configured for the Mono 12 Packed format, 12 bits of brightness data are transmitted per pixel. The data for 2 pixels is packed into 3 bytes, saving 4 bits of bandwidth per pixel compared with the unpacked Mono 12 format. In the Mono 12 Packed format, the pixel data is neither processed nor interpolated, meaning the raw data is used.
The data in the Mono 12 Packed format is output line by line in the sequence of the pixels on the camera.
More about Mono 12 Packed / Mono 12 p -
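The packing can be illustrated in code. This sketch assumes the lsb-first layout used by the USB3 Vision Mono12p format (the GigE Mono 12 Packed layout arranges the nibbles differently), so treat it as an illustration rather than a universal decoder:

```python
def unpack_mono12p(data):
    """Unpack Mono12p bytes: every 3 bytes hold 2 twelve-bit pixel values.
    Assumed lsb-first layout: byte0 = P0[7:0],
    byte1 = P1[3:0] << 4 | P0[11:8], byte2 = P1[11:4]."""
    pixels = []
    for i in range(0, len(data) - 2, 3):
        b0, b1, b2 = data[i], data[i + 1], data[i + 2]
        pixels.append(b0 | (b1 & 0x0F) << 8)  # pixel 0
        pixels.append(b1 >> 4 | b2 << 4)      # pixel 1
    return pixels

# Two pixels (0xABC, 0x123) packed into three bytes:
print([hex(p) for p in unpack_mono12p(bytes([0xBC, 0x3A, 0x12]))])
# ['0xabc', '0x123']
```

Packing two 12-bit values into three bytes is exactly where the 4-bits-per-pixel bandwidth saving over the unpacked Mono 12 format comes from.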
Multicast
Multicasting allows an IP camera to simultaneously transmit images to multiple devices attached to the network while only using the bandwidth that is normally required to transmit to a single device.
More about Multicast -
Multi-Streaming and Multi-Encoding
Multi-streaming and multi-encoding provide up to four image streams using any encoder type combination, e.g., one stream with H.264 compression, another one with MJPEG compression, and a third and fourth stream with MPEG-4 compression. It is also possible to encode up to four streams using the same encoder type, such as H.264.
More about Multi-Streaming and Multi-Encoding -
OEM
An original equipment manufacturer (OEM) manufactures products or components. These products are then purchased by a company and retailed under that purchasing company's brand name. OEM refers to the company that originally manufactured the product.
More about OEM -
Original Equipment Manufacturer
An original equipment manufacturer (OEM) manufactures products or components. These products are then purchased by a company and retailed under that purchasing company's brand name. OEM refers to the company that originally manufactured the product.
More about Original Equipment Manufacturer -
PGI
Basler’s PGI feature set is an in-camera image optimization for color cameras based on a unique combination of 5x5 Debayering (for some models 4x4 Debayering), Color-Anti-Aliasing, Denoising and Improved Sharpness.
PGI works directly inside the camera and is designed to get the best possible image quality without any loss of speed or increase of processor load.
More about PGI -
PoE | Power over Ethernet
PoE (Power over Ethernet) technology describes a system to pass low-voltage electrical power safely, along with data, over Ethernet cabling. The IEEE standard for PoE requires category 5 cable or higher for high power levels, but can operate with category 3 cable for low power levels. Power is supplied in common mode over two or more of the differential pairs of wires found in Ethernet cables and either comes from a power supply within a PoE-enabled networking device, such as an Ethernet switch, or is injected into a cable run with a midspan power supply.
More about PoE -
Primary Colors
A set of colors from which other colors can be derived. For example, if red and green light (as primary colors) is mixed at equal intensities, yellow light results (a secondary color).
More about Primary Colors -
PTP
PTP allows for greater precision in timekeeping — down to the nanosecond level when implemented as hardware and down to several microseconds when implemented as software.
More about PTP -
RGB-D (Red/Green/Blue-Depth)
By merging both data, color 3D point clouds are created that can be useful for image processing and interpretation in a number of applications.
More about RGB-D (Red/Green/Blue-Depth) -
RoHS
Collective term for Directive 2002/95/EC (European Union) and its transposition into national law. RoHS (Restriction of Hazardous Substances) is intended to prevent the use of environmentally hazardous substances in products. Such substances include lead, cadmium, hexavalent chromium, mercury, polybrominated biphenyls, and polybrominated diphenyl ethers.
More about RoHS -
ROI | Region of Interest
Region of Interest (ROI), formerly known as AOI: a portion of the camera's sensor. Its position on the sensor and the number of pixel rows and columns included can be specified. During operation, only the pixel information from within the specified portion is transmitted out of the camera. A smaller ROI generally allows higher maximum frame rates.
More about ROI -
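As a rough, hypothetical model of this effect (real cameras add per-frame overheads, so actual gains are camera-specific), the frame-rate gain from a smaller ROI can be estimated by assuming readout time scales with the number of sensor rows read:

```python
def estimated_fps(full_fps, full_rows, roi_rows):
    """Very rough ROI frame-rate estimate: readout time assumed to be
    proportional to the number of rows read out."""
    return full_fps * full_rows / roi_rows

# Hypothetical camera: 60 fps at 1080 rows; halving the ROI height
print(estimated_fps(60, 1080, 540))  # 120.0
```

Reducing only the ROI width typically helps less, because many sensors read out full rows regardless of the horizontal ROI.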
ROI-Level 1
ROI integration level that supports certain resolutions (e.g. CIF, PAL etc.).
More about ROI-Level 1 -
ROI-Level 2
ROI integration level that supports any resolution. Configuration possible via WebInterface only.
More about ROI-Level 2 -
ROI-Level 3
ROI integration level that supports any resolution. Configuration possible via WebInterface or video management software.
More about ROI-Level 3 -
ROI-Level 4
ROI integration level that supports digital pan/tilt by moving an ROI.
More about ROI-Level 4 -
ROI-Level 5
ROI integration level that supports output scaling configuration via video management software.
More about ROI-Level 5 -
SDK
Software Development Kit
More about SDK -
Sensor
An electronic device containing a large number of small light sensitive areas (pixels), where photons generate electric charges that are transformed to an electric signal.
More about Sensor -
Sequencer
While the camera takes a series of images, its configuration changes between shots based on predefined sequence sets. The sequencer function can be used to react to a broad range of specifications and conditions during image capture, such as different lighting situations or areas of interest (AOI).
The preconfiguration of the various image settings allows for significantly improved frame rates.
More about Sequencer -
Signal-to-Noise Ratio
Signal-to-noise ratio (often abbreviated SNR or S/N) is a measure used in science and engineering that compares the level of a desired signal to the level of background noise. It is defined as the ratio of signal power to the noise power. A ratio higher than 1:1 indicates more signal than noise. While SNR is commonly quoted for electrical signals, it can be applied to any form of signal (such as isotope levels in an ice core or biochemical signaling between cells).
More about Signal-to-Noise Ratio -
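In decibels, this ratio is usually expressed as 10·log10 of the power ratio (the numbers below are illustrative):

```python
import math

def snr_db(signal_power, noise_power):
    """Signal-to-noise ratio in decibels, computed from power levels."""
    return 10 * math.log10(signal_power / noise_power)

# A signal 100x stronger than the noise floor:
print(snr_db(100.0, 1.0))  # 20.0 (dB)
```

For amplitude (rather than power) quantities, the factor 10 becomes 20, since power scales with the square of amplitude.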
Single Board Computer (SBC)
More about Single Board Computer (SBC) -
Smart Factory
More about Smart Factory -
SoC (System on Chip)
In addition to main processor(s), the SoC contains many integrated circuits for specific functions, such as interface or I/O functions on a chip.
More about SoC (System on Chip) -
SoM (System on Module)
It is available as a complete module and can be adapted to the individual user’s needs by means of a carrier board, which the SoM is plugged into.
More about SoM (System on Module) -
TCP
Transmission Control Protocol
More about TCP -
TWAIN
Image editing software with a TWAIN interface can draw data from any TWAIN-compliant image capture device.
More about TWAIN -
UDP
User Datagram Protocol
More about UDP -
Unicast
Unicasting is the common capability that lets an IP camera transmit images directly to a single device attached to the network.
More about Unicast -
Vandal resistance
Ability to withstand deliberate attempts of damage. Basler IP Fixed Dome Cameras have vandal-resistant aluminum housings, allowing video surveillance applications outdoors and under tough indoor conditions.
More about Vandal resistance -
VBR
Variable Bit Rate
More about VBR -
VGA | Video Graphics Array
A display standard for graphics hardware. VGA (Video Graphics Array) defines certain combinations of image resolution, color depth, and refresh rate. Among others, VGA comprises a mode with 640x480 pixels.
More about VGA -
VoIP
Voice over IP
More about VoIP -
White Balance
White balance helps to adjust a color camera to the lighting conditions.
More about White Balance
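The principle can be sketched as per-channel gains chosen so that a neutral (gray) reference comes out with equal R, G, and B values; the reference readings below are invented for illustration:

```python
def white_balance_gains(gray_r, gray_g, gray_b):
    """Compute per-channel gains that make a gray reference neutral,
    normalizing to the green channel (a common convention)."""
    return gray_g / gray_r, 1.0, gray_g / gray_b

def apply_gains(pixel, gains):
    """Apply per-channel gains to one (R, G, B) pixel."""
    return tuple(v * g for v, g in zip(pixel, gains))

# A gray card recorded as (80, 100, 120) under warm light:
gains = white_balance_gains(80, 100, 120)
print(apply_gains((80, 100, 120), gains))  # approx (100.0, 100.0, 100.0)
```

In practice, cameras apply such gains in hardware, either from a one-shot measurement on a gray reference or continuously via an auto white balance algorithm.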