Frame Rate in Industrial Cameras

Frame rate is the frequency at which an industrial camera captures and transmits distinct images, measured in frames per second (FPS). When specifying frame rate in industrial cameras, you are defining the absolute speed limit of a production line. If a conveyor moves 50 parts past the lens every second, the system must reliably acquire, transfer, and process at least 50 frames in that same window without dropping data. Pushing for higher throughput requires engineers to balance the sensor's resolution against the hard physical limits of interface bandwidth and exposure time.

The physics of camera speed

Frame rate is not simply a target number you type into a software slider; it is a physical constraint governed by three sequential processes: exposure, sensor readout, and data transfer.

The total time required to produce a single frame is primarily dictated by the exposure time (how long the image sensor is exposed to light) and the readout time (how long it takes the CMOS sensor to convert the accumulated charge into a digital signal). If an inspection task is light-starved and requires a 20-millisecond exposure to achieve an acceptable signal-to-noise ratio, the theoretical maximum frame rate cannot exceed 50 FPS, regardless of how fast the sensor architecture or the cable might be.
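The arithmetic above can be sketched directly. This is a minimal model that assumes exposure and readout happen sequentially; many modern sensors pipeline the two (exposing frame N+1 while reading out frame N), so treat the result as a worst-case ceiling rather than a datasheet value.

```python
def max_fps(exposure_ms: float, readout_ms: float = 0.0) -> float:
    """Theoretical frame-rate ceiling when exposure and readout run back to back."""
    frame_time_ms = exposure_ms + readout_ms
    return 1000.0 / frame_time_ms

# A light-starved task needing a 20 ms exposure caps out at 50 FPS,
# before readout is even accounted for.
print(max_fps(20.0))        # 50.0
print(max_fps(20.0, 5.0))   # 40.0 once a 5 ms readout is added
```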

The interface bandwidth bottleneck

Once the image data leaves the sensor, it must travel to the host PC or embedded vision board. In high-resolution applications, the hardware interface is almost always the primary bottleneck.

A 20-megapixel sensor generates a massive amount of data per frame. If the interface cannot push that data across the cable fast enough, the camera must slow down its capture rate to prevent the internal buffer from overflowing.

| Interface Standard | Practical Bandwidth | Impact on Frame Rate |
| --- | --- | --- |
| GigE Vision (1 GigE) | ~115 MB/s | Limits high-resolution cameras to lower frame rates. Excellent for long cable runs. |
| USB3 Vision | ~400 MB/s | The standard workhorse. Supports high frame rates for medium-to-high resolutions. |
| MIPI CSI-2 | Scales by lane (e.g., ~1.25 GB/s on 4 lanes) | Direct board-to-board connection. Bypasses standard cabling for maximum FPS in embedded systems. |
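The bandwidth ceiling is easy to estimate: divide the link's practical throughput by the size of one frame. The sketch below uses a 5472 × 3648 geometry as a representative 20 MP sensor (an assumption, not a figure from this article) and the nominal link rates from the table.

```python
def bandwidth_limited_fps(width: int, height: int,
                          bytes_per_pixel: int, link_mb_per_s: float) -> float:
    """Frame-rate ceiling imposed by interface bandwidth alone."""
    frame_mb = width * height * bytes_per_pixel / 1e6
    return link_mb_per_s / frame_mb

# A ~20 MP mono 8-bit camera on 1 GigE (~115 MB/s) is limited to single-digit FPS:
print(round(bandwidth_limited_fps(5472, 3648, 1, 115), 2))   # 5.76
# The same sensor on USB3 Vision (~400 MB/s) gets roughly 3.5x the headroom:
print(round(bandwidth_limited_fps(5472, 3648, 1, 400), 2))   # 20.04
```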

How to increase frame rate without changing hardware

If your current camera is hitting a bandwidth limit but your production line needs to move faster, you have two primary software-driven methods to increase the FPS without upgrading the hardware:

1. Region of Interest (ROI)

Instead of reading the entire sensor, you can configure the camera to only read a specific subset of pixels. If you are tracking a small barcode in the center of the frame, cropping a 5 MP sensor down to a 1000 x 1000 pixel ROI drastically reduces the data payload. The interface clears the smaller frame much faster, allowing the camera's frame rate to skyrocket.
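The payload reduction from an ROI crop can be estimated as the ratio of pixel counts. The 2448 × 2048 dimensions below are a hypothetical ~5 MP geometry, and the model assumes frame time scales with total pixels read; on many CMOS sensors only the vertical (row) extent of the ROI shortens readout, so treat this as an upper bound.

```python
def roi_speedup(full_w: int, full_h: int, roi_w: int, roi_h: int) -> float:
    """Approximate frame-rate gain from cropping readout to a region of interest,
    assuming frame time is proportional to the number of pixels read."""
    return (full_w * full_h) / (roi_w * roi_h)

# Cropping a ~5 MP sensor (2448 x 2048) to a 1000 x 1000 barcode window:
print(round(roi_speedup(2448, 2048, 1000, 1000), 1))  # 5.0 – roughly 5x the FPS
```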

2. Binning / Decimation

If you need to maintain the full optical field of view but want higher speeds, you can use binning (combining adjacent pixels) or decimation (skipping pixels). This reduces the total resolution of the transmitted image, lowering the bandwidth requirement and increasing the frame rate, though at a direct cost to spatial resolution.
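Binning's effect on the payload follows directly from the pixel count: N × N binning divides the transmitted pixels, and thus the bandwidth requirement, by N². The sensor dimensions below are the same hypothetical ~5 MP geometry used above.

```python
def binned_frame_bytes(width: int, height: int,
                       bytes_per_pixel: int, bin_factor: int = 2) -> int:
    """Payload per frame after N x N binning (or decimation):
    pixel count drops by bin_factor squared."""
    return (width // bin_factor) * (height // bin_factor) * bytes_per_pixel

full   = 2448 * 2048 * 1                        # ~5.0 MB per 8-bit frame
binned = binned_frame_bytes(2448, 2048, 1, 2)   # ~1.25 MB after 2x2 binning
print(full / binned)  # 4.0 – four times the frame rate on the same link
```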

Frame rate vs. exposure time: What is the tradeoff?

To increase your frame rate, you must often decrease your exposure time. There is no way around this physical tradeoff.

Shorter exposures mean fewer photons hit the photodiode. If you double your frame rate from 60 FPS to 120 FPS, you cut the maximum possible exposure time in half, which halves the photon count collected per frame, darkens the image, and degrades your signal-to-noise ratio. To maintain a sharp, high-contrast image at high frame rates, you must flood the inspection area with high-intensity industrial illumination or switch to a synchronized strobe light.
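The exposure budget at a given frame rate is just the inverse of the frame period, minus any non-overlapped readout. A minimal sketch:

```python
def max_exposure_ms(fps: float, readout_ms: float = 0.0) -> float:
    """Longest exposure that still fits inside one frame period,
    assuming readout_ms of the period is unavailable for exposure."""
    return 1000.0 / fps - readout_ms

print(round(max_exposure_ms(60), 1))   # 16.7 ms available at 60 FPS
print(round(max_exposure_ms(120), 1))  # 8.3 ms – half the light per frame at 120 FPS
```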

Frequently asked questions

Why do dropped frames happen?

Dropped frames occur when the camera generates data faster than the host system can process it. This is rarely a camera failure. It typically happens when the interface bandwidth is saturated, the network switch is overloaded (in GigE setups), or the host PC's CPU cannot process the incoming image buffers fast enough.

Do color cameras lower the maximum frame rate?

It depends on where the processing happens. If a color camera outputs raw Bayer data (typically 8-bit), the bandwidth requirement is exactly the same as a monochrome camera. However, if you configure the camera to perform the debayering on-board and output a processed RGB24 format, you are tripling the data size per pixel over the cable. That bandwidth spike will severely lower your maximum frame rate.
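The tripling is visible in a quick payload comparison. The 2448 × 2048 sensor geometry and the ~400 MB/s USB3 Vision figure below are illustrative assumptions:

```python
def frame_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    """Per-frame payload in megabytes."""
    return width * height * bytes_per_pixel / 1e6

w, h = 2448, 2048
raw_bayer = frame_mb(w, h, 1)   # 8-bit raw Bayer: ~5.0 MB per frame
rgb24     = frame_mb(w, h, 3)   # on-board debayered RGB24: ~15.0 MB per frame
print(rgb24 / raw_bayer)        # 3.0 – triple the load on the cable

# FPS ceilings on a ~400 MB/s USB3 Vision link:
print(round(400 / raw_bayer))   # ~80 FPS with raw Bayer
print(round(400 / rgb24))       # ~27 FPS with RGB24
```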

Does a faster camera guarantee faster inspection?

Not always. A USB 3.1 camera might successfully capture and transmit 200 FPS, but if your machine vision software takes 15 milliseconds to run a complex algorithm on a single frame, without hardware pipelining or multi-threaded buffering, your effective throughput is capped at ~66 FPS. Your camera's capture rate must ultimately align with your system's processing reality.
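That cap is simply the slower of the capture rate and the processing rate. A minimal sketch, assuming frames are processed strictly one at a time:

```python
def effective_fps(camera_fps: float, processing_ms: float) -> float:
    """Sustained throughput when each frame must be fully processed
    before the next is handled (no pipelining or multi-threaded buffering)."""
    processing_fps = 1000.0 / processing_ms
    return min(camera_fps, processing_fps)

# A 200 FPS camera behind a 15 ms algorithm delivers ~66.7 FPS end to end:
print(round(effective_fps(200, 15), 1))  # 66.7
# A 50 FPS camera behind the same algorithm remains camera-limited:
print(effective_fps(50, 15))             # 50
```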

Glossary