Other Information

Measuring Principles

<Basic Principle>

The RANGE7/5 uses the light-sectioning method: a horizontal stripe of laser light is emitted through a cylindrical lens onto the object. The light reflected from the object is received by the CMOS sensor and converted into distance information by triangulation. This process is repeated while the stripe is scanned vertically across the object surface by a Galvano mirror, yielding 3D image data of the object.

[Figure: measurement optical path — laser beam, emitting lens, Galvano mirror, object, light-receiving lens, CMOS sensor]
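The conversion from stripe position to distance is ordinary active triangulation: the Galvano mirror's projection angle and the point where the stripe lands on the sensor define two rays whose intersection gives the range. The sketch below illustrates that geometric relation only; the parameter names (pixel_offset_mm, baseline_mm, and so on) are hypothetical and this is not the instrument's actual calibration model.

```python
import math

def triangulate_depth(pixel_offset_mm: float, focal_length_mm: float,
                      baseline_mm: float, mirror_angle_rad: float) -> float:
    """Estimate object range (mm) from the stripe position on the sensor.

    Illustrative parameters (assumed, not the instrument's calibration):
      pixel_offset_mm  - stripe image displacement from the optical axis
      focal_length_mm  - focal length of the light-receiving lens
      baseline_mm      - separation of the emitting and receiving axes
      mirror_angle_rad - current Galvano mirror (projection) angle

    Both angles are measured from the direction perpendicular to the baseline.
    """
    # Angle at which the receiving lens sees the stripe
    receive_angle = math.atan2(pixel_offset_mm, focal_length_mm)
    # Intersection of the projection ray and the reception ray:
    # z = baseline / (tan(projection angle) + tan(reception angle))
    denom = math.tan(mirror_angle_rad) + math.tan(receive_angle)
    if abs(denom) < 1e-9:
        raise ValueError("rays are (nearly) parallel; no intersection")
    return baseline_mm / denom
```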

<High-Speed Image Processing Circuit>

The stripe light is scanned across the CMOS image plane at one horizontal line per frame, and the CMOS sensor is driven so that the block readout start position shifts by one line per frame. Approximately 1400 frames are acquired.

Frame rate: 600 frames/sec.

Block readout: 350 lines

The output signal from the CMOS sensor is converted to a digital signal and subjected to digital signal processing; the processed data is then transferred to the computer via the USB interface.
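To make the readout scheme concrete, the following sketch models the shifting block readout in a few lines of Python. The constants come from the values quoted above and the one-line-per-frame shift comes from the description; the function and variable names are illustrative only, not part of the instrument's firmware.

```python
# A minimal sketch (not the actual firmware) of the shifting block readout
# that keeps the 350-line window following the stripe as it sweeps the sensor.

FRAME_RATE_HZ = 600        # frames per second (from the values above)
BLOCK_LINES = 350          # lines read out per frame
NUM_FRAMES = 1400          # approximate number of frames per scan

def readout_window(frame_index: int) -> range:
    """Sensor lines read in the given frame.

    The start line advances by one line per frame, tracking the stripe
    as the Galvano mirror moves it down the image plane.
    """
    start = frame_index            # shifted by one line per frame
    return range(start, start + BLOCK_LINES)

# Acquisition time implied by the numbers above: ~1400 / 600 ≈ 2.3 s
scan_time_s = NUM_FRAMES / FRAME_RATE_HZ
print(f"approximate acquisition time: {scan_time_s:.2f} s")
print(f"frame 0 reads lines {readout_window(0).start}..{readout_window(0).stop - 1}")
```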

[Block diagram: CMOS sensor → A/D → FPGA with frame memory → USB driver → USB; laser diode and Galvano scanner with their driver; CPU connected to the blocks via the CPU data bus]

<Time Center of Gravity and Space Center of Gravity>

This instrument obtains 3D images by calculating the time center of gravity for each pixel of the CMOS sensor. Compared with the space center of gravity, this method is less affected by sensitivity variations among the CMOS sensor pixels and by variations in object brightness.
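As a rough illustration of why the time center of gravity is insensitive to pixel gain and surface brightness, the sketch below computes it for a stack of acquired frames. The array layout and function name are assumptions made for this example, not the instrument's actual processing.

```python
import numpy as np

def time_centroid(frames: np.ndarray) -> np.ndarray:
    """Per-pixel time center of gravity of a frame stack.

    frames: array of shape (num_frames, height, width) holding the intensity
            each pixel recorded in each frame (assumed data layout).
    Returns an array of shape (height, width) giving the sub-frame time at
    which the stripe crossed each pixel, computed as sum(t * I) / sum(I).
    Because this is a ratio of intensities at the SAME pixel, the pixel's
    gain and the local surface brightness cancel out -- the advantage over
    the space center of gravity noted above.
    """
    t = np.arange(frames.shape[0], dtype=np.float64)[:, None, None]
    total = frames.sum(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        centroid = (t * frames).sum(axis=0) / total
    return np.where(total > 0, centroid, np.nan)   # NaN where no light was seen
```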
