Appendix D: MATLAB Sample Code


MATLAB sample code to read calibration data from the HDL-64E output.

fileFilter = '*.pcap';
[File_name,Directory] = uigetfile(fileFilter,'Open a .pcap file');
Filename = [Directory File_name];

tic;

fid = fopen(Filename);
ttc = fread(fid,40);                            % skip the pcap global header (24 bytes) plus the first record header (16 bytes)
ttc = fread(fid,42);                            % skip the Ethernet/IP/UDP headers (42 bytes) of the first packet
ttc = fread(fid,inf,'1206*uint8=>uint8',58);    % read each 1206-byte payload, skipping the 58 header bytes between payloads
%ttch = dec2hex(ttc);

% Determine how many data packets.
Packet = numel(ttc)/1206;

% Convert data to single precision.
S1 = single(ttc(2,:))*256 + single(ttc(1,:));
S2 = single(ttc(102,:))*256 + single(ttc(101,:));
S3 = single(ttc(202,:))*256 + single(ttc(201,:));
S4 = single(ttc(302,:))*256 + single(ttc(301,:));

% Packets loop: collect the status type/value bytes (the last two bytes of each packet).
for i = 0:Packet-1
    status(i+1) = ttc(1205+i*1206);
    value(i+1)  = ttc(1206+i*1206);
end

a = [85 78 73 84 35];                           % ASCII codes for 'UNIT#', the start marker of the calibration data
fclose(fid);
toc;

Ind = strfind(value,a);

% Loop through 64 lasers.
for i = 1:64
    temp  = single(value(Ind(1)+64*(i-1)+16 : Ind(1)+64*(i-1)+16+7));
    temp1 = single(value(Ind(1)+64*(i-1)+32 : Ind(1)+64*(i-1)+32+7));
    temp2 = single(value(Ind(1)+64*(i-1)+48 : Ind(1)+64*(i-1)+48+7));
    temp3 = single(value(Ind(1)+64*(i-1)+64 : Ind(1)+64*(i-1)+64+7));
    LaserId(i) = temp(1);

    % Add the high and low bytes of the Vertical Correction Factor together and
    % check whether the correction factor is positive or negative.
    VerticalCorr(i) = temp(3)*256 + temp(2);
    if VerticalCorr(i) > 32768
        VerticalCorr(i) = VerticalCorr(i) - 65536;
    end

    % Scale the Vertical Correction Factor by dividing by 100.
    VerticalCorr(i) = VerticalCorr(i)/100;
end
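The vertical correction recovered above is the per-laser elevation angle, in degrees, applied when projecting returns into Cartesian coordinates. The sketch below is a minimal illustration of that projection, assuming a simplified spherical-to-Cartesian model and ignoring the rotational, offset, and distance corrections that the full calibration record also carries; the laser index, distance, and azimuth values are illustrative, not taken from real data.

% Minimal sketch: convert one return to Cartesian coordinates using the
% vertical correction angle recovered above. Simplified model only; the
% rotational correction, distance correction, and horizontal/vertical
% offsets from the full calibration record are ignored here.
laser   = 5;                              % illustrative laser index (1..64)
distM   = 25.0;                           % illustrative measured distance, meters
azimDeg = 135.7;                          % illustrative rotational position, degrees
vertDeg = VerticalCorr(laser);            % vertical correction angle, degrees

x = distM * cosd(vertDeg) * sind(azimDeg);
y = distM * cosd(vertDeg) * cosd(azimDeg);
z = distM * sind(vertDeg);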


HDL-64E S2 and HDL-64E S2.1 Specifications

The Velodyne Acoustics HDL-64E S2.1 and HDL-64E S2 represent cutting-edge advancements in Lidar technology, specifically designed for autonomous vehicle navigation and mapping applications. These high-definition lidar sensors are acclaimed for their precision, reliability, and robustness, making them indispensable tools in various industries, from robotics to transportation.

One of the defining features of the HDL-64E series is its 64 laser channels, which allow high-resolution 3D mapping of the environment. This multi-channel design significantly improves the sensor's ability to capture fine detail in the surrounding area, providing the dense spatial representation needed for autonomous driving. The HDL-64E S2.1 and S2 generate point clouds of over 1.3 million points per second, supporting real-time data acquisition and processing.
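As a rough cross-check of the quoted point rate against the packet structure parsed in the code above, each 1206-byte data packet carries 12 firing blocks of 32 returns; the packet rate in the sketch below is inferred from the quoted point rate rather than taken from a specification table.

% Rough cross-check: point rate vs. the 1206-byte packet structure.
returnsPerPacket = 12 * 32;                        % 12 firing blocks of 32 returns = 384 returns per packet
pointRate        = 1.33e6;                         % approximate quoted points per second
packetRate       = pointRate / returnsPerPacket;   % roughly 3470 data packets per second
fprintf('~%.0f packets/s carrying %d returns each\n', packetRate, returnsPerPacket);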

The HDL-64E series pairs a 360-degree horizontal field of view with a vertical field of view of approximately 26.9 degrees (+2.0 to -24.9 degrees), allowing the sensors to cover the area around the vehicle comprehensively. This coverage is crucial for the safety and efficacy of autonomous vehicles, as it lets them perceive their surroundings from every direction.

In terms of accuracy, the HDL-64E models offer a measurement range of up to 120 meters with a typical distance accuracy of better than 2 centimeters. This level of precision lets autonomous systems make decisions based on reliable data, which is essential for avoiding obstacles and navigating complex environments.
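For a sense of the units involved, the sketch below converts a raw two-byte distance count to meters, assuming a 0.2 cm (2 mm) granularity per count; that granularity and the example value are assumptions for illustration, not figures taken from this section.

rawDist    = single(41250);                             % hypothetical raw distance count (two bytes, little-endian)
distMeters = rawDist * 0.002;                           % assumed 0.2 cm per count -> 82.5 m
inRange    = (distMeters > 0) && (distMeters <= 120);   % checked against the 120 m maximum range quoted above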

The sensors are designed to operate effectively in a range of environmental conditions. With IP67-rated waterproofing and robustness against dust and debris, the HDL-64E S2.1 and S2 are built to withstand challenging operating environments, thus ensuring continuous, dependable performance.

Integration of the HDL-64E series into existing systems is straightforward thanks to its 100 Mbps Ethernet interface, which streams measurement and status data as UDP packets. This makes it easy for developers to incorporate the lidar data into existing software frameworks and enhances the usability of the sensor across applications.
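For live capture rather than .pcap playback, the data stream can be read directly from the network. The sketch below is a minimal example using MATLAB's udpport interface (R2020b or later), assuming the sensor's factory-default data port of 2368; the port number and the udpport usage should be verified against your MATLAB version and network configuration.

% Minimal sketch: capture one live data packet over UDP instead of reading a .pcap file.
u = udpport("datagram","LocalPort",2368);   % listen for sensor datagrams on the assumed default data port
d = read(u,1,"uint8");                      % wait for one datagram from the sensor
payload = d(1).Data;                        % 1206-byte data packet payload
statusType  = payload(1205);                % same status bytes parsed in the calibration code above
statusValue = payload(1206);
clear u                                     % release the port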

In summary, the Velodyne Acoustics HDL-64E S2.1 and HDL-64E S2 represent a significant leap forward in Lidar technology, featuring high-resolution mapping, advanced detection capabilities, and rugged design. These characteristics make them an ideal choice for companies looking to implement reliable and precise sensing solutions in their autonomous systems.