From Stationary to Mobile: Unleashing the Full Potential of Terrestrial LiDAR through Sensor Integration

Abstract This paper discusses a comprehensive methodology for transforming a static LiDAR (Light Detection and Ranging) system into a mobile mapping system. The initial step involves integrating various sensors, such as GNSS (Global Navigation Satellite System), IMU (Inertial Measurement Unit), and odometry sensors, into the system’s architecture to accurately estimate the position and orientation of the LiDAR sensor at any given time. Specialized algorithms compensate for the vehicle’s motion, and procedural requirements must be followed to ensure safe operation. Hardware integration is critical, requiring proper calibration and validation. Data processing workflows, including algorithms and software tools, play a crucial role in mobile LiDAR mapping systems. The quality and accuracy of the data depend on various factors, and practical considerations such as cost and time are also important. The proposed methodology provides a clear and effective way to transform a terrestrial LiDAR sensor into a mobile mapping system, increasing its flexibility and usability for various applications. However, the detailed analysis, system validation, and data processing details fall outside this paper’s scope and will be discussed in future publications.

These sensors play a pivotal role in accurately and consistently estimating the position and orientation of the LiDAR sensor at any given time. By harnessing the data these sensors provide, it becomes possible to model the position and orientation of the vehicle and the LiDAR sensor within a unified 3D object-space coordinate system. This integration process is the foundation for establishing a comprehensive and dynamic mapping system enabling mobile LiDAR data acquisition. The successful execution of these steps represents a significant advancement in LiDAR mapping, with far-reaching implications for diverse applications such as urban planning, environmental monitoring, and infrastructure management (Haala et al. 2008). Transforming a static LiDAR system into a mobile LiDAR mapping system requires meticulous attention to detail, rigorous experimentation, and a deep understanding of sensor integration techniques, paving the way for innovative advancements in the field of LiDAR technology.
Once the necessary sensors have been integrated, specialized algorithms such as those of Qin et al. (2022) and Wang et al. (2021) can compensate for the vehicle motion, which can otherwise introduce errors into the collected data. These algorithms correct the data for motion distortion, ensuring that the collected data is accurate and reliable. In addition to these technical requirements, several procedural requirements must be followed to successfully transform a static LiDAR system into a mobile LiDAR mapping system. These may include developing a custom software layer to handle data collection and processing, and developing procedures for safely mounting and operating the system onboard a moving vehicle.
Sensor integration is critical for the mobile LiDAR mapping system: it provides essential data on the position and orientation angles of the vehicle, along with real-time synchronization signals, during data acquisition. Integrating the LiDAR with the GNSS/INS/DMI components, collectively known as the POS (Position and Orientation System), requires proper calibration and validation to ensure that the data collected by each component is accurate and consistent (Hutton et al. 2008).
Data processing workflows also play a crucial role in building mobile LiDAR mapping systems. They include the algorithms and software tools used to process the collected data and the procedures for integrating it into a final 3D map or model. Another important consideration is the quality and accuracy of the data. The data quality depends on various factors, such as the resolution and accuracy of the LiDAR sensor, the quality of the sensor integration, and the data processing algorithms (Wang et al. 2019). It is essential to select suitable sensors, perform proper calibration and validation of the sensors, and use data processing algorithms that remove noise and errors from the data to ensure the highest quality of the mobile data.
There are also practical considerations when building a mobile LiDAR mapping system. One primary consideration is the cost of the system, which can be significant depending on the type of sensor, the data processing algorithms, and the hardware required. Another is the time and resources required to collect and process the data. Collecting data in small, incremental chunks may be more efficient than collecting all the data in one shot (Abdelwahab et al. 2019).
Our research demonstrates the stepwise, generic integration of GNSS and IMU sensors into static LiDAR systems, enabling their conversion to mobile systems. We believe our work contributes to the research community in the following aspects:
• Comprehensive integration process: This article details the process required to transform a static LiDAR system into a mobile one. We thoroughly explain the relevant steps, including GNSS and IMU sensor integration, calibration procedures, and synchronization techniques. This level of detail is often lacking in the existing literature, making our article a valuable resource for researchers and practitioners seeking practical guidance on similar sensor integration efforts.
• Overcoming sensor limitations: The LiDAR sensors used in our study were initially designed for static applications and had no option for mobility. Successful integration of GNSS and IMU sensors removes this limitation and extends the capabilities of LiDAR systems for use in mobile scenarios. This sensor integration methodology opens up new possibilities for researchers and industry professionals considering the use of static LiDAR sensors in dynamic environments and expands the potential applications of such sensors.
• Practical relevance: Our research has practical implications in areas as diverse as self-driving cars, robotics, and environmental monitoring. Mobile LiDAR systems are essential in these areas, providing accurate and efficient data collection for mapping, navigation, and object detection. By enabling static LiDAR systems to become mobile LiDAR systems, our approach provides a cost-effective solution that leverages existing LiDAR sensors, improves accessibility, and reduces the need for dedicated mobile systems.
• Methodological contributions: Beyond the actual integration process, our article contributes to sensor integration methodology. We tackle challenges related to data synchronization and data fusion techniques, providing insights and best practices to overcome these hurdles. This knowledge will serve as a foundation for future research efforts in sensor integration and facilitate progress in this field.
It is worth mentioning that this paper also covers identifying the system requirements and integrating the system; however, the detailed analysis and validation of the system's data fall outside the scope of this paper and will be presented in a separate publication.

LiDAR types
In recent years, LiDAR, short for light detection and ranging, has experienced a remarkable evolution. Initially utilized as a valuable measurement technique for studying atmospheric aerosols and conducting aerial mapping, LiDAR has become a highly sought-after gem within the optomechanical engineering and optoelectronics domains. The comprehensive exploration of LiDAR technology in mapping and autonomous systems is extensively elucidated in the notable work of Khader and Cherian (2020), providing a solid introduction to this groundbreaking technology.
LiDAR systems in the market use different types of scanners to direct the laser beam and generate 3D point clouds to map the environment. The method used to direct the beam can affect the system's accuracy, speed, FOV (field of view), object detection capabilities, and resolution. These systems can be broadly classified into two categories based on the method of beam orientation: mechanical LiDAR and solid-state LiDAR (Agunbiade and Zuva 2018).
Mechanical LiDARs provide a wide field of view, using expensive optics and rotating or galvanometric mirrors or prisms attached to mechanical actuators. One variant of mechanical LiDAR contains a light source and a detector and rotates the optical assembly about a mechanical axis, with multiple detectors arranged in parallel along the axis of rotation (Royo and Ballesta-Garcia 2019). Rotating the optical setup is usually the preferred scanning option because it provides straight and parallel scan lines at a constant scan speed over a wide field of view (Yoo et al. 2018).
The mechanical type also permits a high SNR (Signal-to-Noise Ratio) over a wide field of view. However, it has several disadvantages, including bulkiness, high cost, reliability issues, and high sensitivity to shock and vibration (Holmström et al. 2014). It usually uses a pulsed laser source, characterized by high power consumption and limited scanning frequency due to rotating inertia. However, mechanical LiDAR is very effective for long-range sensing and is valuable in most mapping applications (Agunbiade and Zuva 2018).
Solid-state LiDAR systems differ from mechanical systems in that they have no moving mechanical components, which reduces the FOV. However, these systems can increase the FOV by combining multiple sensors (N. Li et al. 2022). They are higher in resolution, faster, more powerful, and less expensive than mechanical LiDAR, and are also physically smaller and lighter. One type of solid-state LiDAR is based on MEMS (Micro-Electro-Mechanical System) scanners (Nam and Gon-Woo 2021).
A MEMS mirror allows programmatic control of the laser beam using small mirrors whose angle is determined by an applied stimulus, directing the beam to a specific location (Khader and Cherian 2020). MEMS LiDAR systems feature a lightweight design, compact size, and low power consumption. However, the receiver aperture in MEMS systems, which determines the receiver SNR, is often relatively small, in the range of a few millimeters, which constrains the laser power; hence, long ranges are hard to achieve (Lopac et al. 2022).
Another type of solid-state LiDAR is the OPA (Optical Phased Array) technology, which has no moving components but uses optical phase modulators to steer the laser beam. It allows for stable and rapid beam steering, and the OPA structure is highly compact and robust, with the potential for very high measurement speeds and a wide field of view. However, the OPA system does have the disadvantage of a significant loss in laser output power, limiting its ability to scan at longer distances (Lopac et al. 2022).

LiDAR systems design parameters
The LiDAR system has several parameters that must be considered in any system's design, including accuracy, precision, field of view, angular resolution, eye safety, sensitivity to ambient light, maximum range, power consumption, and system cost (Behroozpour et al. 2017). Precision is the repeatability of measurements taken for a target at a fixed distance and is primarily influenced by the distance and the target reflectivity (Lv et al. 2020). Range resolution is the system's ability to distinguish between two or more closely spaced objects in the axial direction and is determined by the detected object's type and size, the transmitted pulse's width, and the receiver's efficiency (Royo and Ballesta-Garcia 2019).
The FOV can vary depending on the structure and technology of the LiDAR system. It is defined by horizontal and vertical angles around the front of the sensor. Angular resolution is the ability to distinguish between two adjacent points in the FOV (Behroozpour et al. 2017). Furthermore, higher power is desired for detection at greater distances, but it is limited to prevent damage to the human eye. An important parameter is the MPE (Maximum Permissible Exposure), which for a given laser beam is determined by its wavelength, diameter, and duration of exposure for continuous-mode lasers, or by its pulse width and frequency for pulsed lasers. Additionally, the range of a LiDAR system is limited by the transmitter's power and the receiver's sensitivity (Lopac et al. 2022).

Position and Orientation System (POS)
Positioning sensors, including GNSS receivers, IMUs, and DMIs (Distance Measurement Instruments), are commonly used in modern MMS (Mobile Mapping Systems) to provide accurate 3D mapping information. These sensors often work together through a sensor fusion algorithm to improve accuracy, although the individual measurement accuracy of each sensor is still essential (Bobkowka et al. 2017). A summary of the benefits and limitations of GNSS, IMU, and DMI is given in Elhashash et al. (2022). The following sections explain the three primary positioning sensors in more detail.
A GNSS receiver uses positioning satellite constellations to determine its absolute 3D position and velocity with reference to a global coordinate system like WGS84. It receives these signals passively and uses trilateration to calculate its real-time position (Samama 2008). The raw observations from the GNSS receiver's chipset and its solver can provide an autonomous positional error at the meter level, depending on the chipset and antenna used (Joubert et al. 2020). Some MMSs use augmented GNSS solutions, like D-GNSS (Differential GNSS) or RTK-GNSS (Real Time Kinematic GNSS), to improve the accuracy of the GNSS receiver to the decimeter or centimeter level (Hofmann-Wellenhof et al. 2008).
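The trilateration step can be sketched as an iterative least-squares solve. The toy example below is a deliberate simplification of what a real GNSS solver does (noise-free ranges, no receiver clock bias, illustrative satellite coordinates):

```python
import numpy as np

def trilaterate(sat_positions, ranges, x0=None, iters=10):
    """Estimate a 3D receiver position from satellite positions and
    measured ranges via Gauss-Newton least squares (toy model:
    no clock bias, no atmospheric delays)."""
    x = np.zeros(3) if x0 is None else np.asarray(x0, float)
    for _ in range(iters):
        diffs = x - sat_positions                # (n, 3) receiver-to-satellite vectors
        dists = np.linalg.norm(diffs, axis=1)    # predicted ranges
        J = diffs / dists[:, None]               # Jacobian of range w.r.t. position
        r = ranges - dists                       # range residuals
        dx, *_ = np.linalg.lstsq(J, r, rcond=None)
        x = x + dx
    return x

# Toy geometry: four "satellites" and exact ranges to a known ground point.
sats = np.array([[20e6, 0, 0], [0, 20e6, 0],
                 [0, 0, 20e6], [12e6, 12e6, 12e6]], float)
truth = np.array([1000.0, 2000.0, 3000.0])
rho = np.linalg.norm(sats - truth, axis=1)
est = trilaterate(sats, rho, x0=[0.0, 0.0, 0.0])
```

With noise-free ranges the iteration recovers the true position; a real solver additionally estimates the receiver clock bias as a fourth unknown, which is why at least four satellites are needed.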
D-GNSS and RTK-GNSS depend on a network of reference stations near the surveyed points to apply corrections and eliminate errors such as ionospheric and tropospheric delays. They can achieve centimeter-level accuracy in three dimensions with advanced processing algorithms (Rustamov and Hashimov 2018). However, these accuracy levels are only achievable in open areas with good satellite geometry; in dense urban areas with tall buildings, the GNSS signal can be heavily impacted by obstructions, resulting in inaccurate measurements. In such cases, additional sensors may be required to achieve accurate results (Shi et al. 2021).
An IMU is a sensor that records the host platform's angular velocity and linear acceleration, and an onboard computer can leverage these data to calculate the position and orientation relative to the system's initial pose (Yan et al. 2018). However, IMUs suffer from accumulation errors, which can cause significant drift from the true position (Valtonen Ornhag et al. 2022).
The IMU quality can be determined primarily by the type of gyroscope used. Most consumer-grade IMUs use MEMS gyroscopes, which are inexpensive but have poor precision and significant drift errors (Martin et al. 2013). Higher-grade IMUs for precise navigation use more accurate gyroscopes, such as ring laser or fiber optic gyroscopes, with drift errors of less than 1 degree per hour (Ahmed and Tahir 2017). IMUs can be used outdoors, indoors, in tunnels, and in other GPS-denied environments, but their standalone measurements will only be accurate for a short period.
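The drift behavior described above follows directly from integrating sensor biases: a constant gyroscope bias produces a heading error that grows linearly with time, while a constant accelerometer bias produces a position error that grows quadratically. A back-of-the-envelope sketch (the bias values are illustrative, not vendor specifications):

```python
def heading_drift_deg(bias_deg_per_hr: float, minutes: float) -> float:
    """Heading error from integrating a constant gyro bias once."""
    return bias_deg_per_hr * minutes / 60.0

def position_drift_m(accel_bias_mps2: float, seconds: float) -> float:
    """Position error from integrating a constant accelerometer bias twice."""
    return 0.5 * accel_bias_mps2 * seconds**2

# Illustrative comparison after a 30-minute GNSS-free run:
mems_drift = heading_drift_deg(100.0, 30)  # consumer MEMS gyro, ~100 deg/hr bias
fog_drift = heading_drift_deg(1.0, 30)     # fiber-optic gyro, ~1 deg/hr bias
pos_drift = position_drift_m(0.01, 60)     # 0.01 m/s^2 accel bias over one minute
```

Even a small accelerometer bias of 0.01 m/s² accumulates to roughly 18 m of position error after a single minute, which is why standalone inertial navigation degrades so quickly without GNSS aiding.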
A GNSS sensor provides accurate measurements in open areas. It does not accumulate errors as the platform moves, so it is often combined with IMUs for additional observations in complex open and occluded environments, providing more accurate positional information (Falco et al. 2017).
Distance Measurement Instruments (DMIs) measure the distance traveled by the platform. In MMS literature, DMIs are sometimes called odometers or wheel sensors. They calculate the distance traveled based on the number of the vehicle's wheel rotations. They are often used with GNSS or IMUs to reduce accumulated errors and constrain drift from IMUs in GNSS-denied environments like tunnels. DMIs must be calibrated before use and can measure distance, velocity, and acceleration (Bobkowka et al. 2017).

Time synchronization
A GNSS receiver provides precise positioning and timing information to various applications, including aviation, maritime navigation, land surveying, and mobile device location services. One key aspect of GNSS receivers is the ability to synchronize the timing of devices using a PPS (pulse-per-second) signal and NMEA (National Marine Electronics Association) messages (Vyskocil and Sebesta 2009).
PPS time synchronization is a method of synchronizing the timing of multiple devices or systems using a standard reference signal. The PPS signal is a pulse transmitted at a precise interval, typically once per second. By synchronizing to the PPS signal, devices or systems can accurately maintain the same time reference, even if they are geographically separated (Niu et al. 2015).
One crucial aspect of PPS is latency, which refers to the time delay between the transmission of the PPS signal and its receipt by the device or system that is synchronizing with it. Various factors, including signal propagation delays, processing delays, and network congestion, can cause latency (Bowen and Danya 2014; Stebbins and Lanson 1961).
NMEA messages are a standardized format for exchanging GNSS data between devices. The messages contain information about the GNSS system, such as the 3D position, timing data, and receiver status. NMEA messages can be used to synchronize the time of multiple sensors, including LiDARs and cameras, allowing them to work together to provide more accurate positioning and timing information (Lee et al. 2019).
There are several NMEA messages, each with a specific purpose and format (Vyskocil and Sebesta 2009). Some examples of NMEA messages include:
• GGA (Global Positioning System Fix Data): contains information about the GNSS system, including the time of the GNSS measurement, the satellite positions, and the receiver status.
• RMC (Recommended Minimum Specific GNSS Data): contains the time of the GNSS measurement, the position, velocity, and heading of the receiver, and the receiver status.
• ZDA (Time and Date): contains the time, date, and local time zone offset.
• GSA (GNSS DOP and Active Satellites): contains information about the satellite configuration and the dilution of precision (DOP) of the GNSS measurement.
• GSV (GNSS Satellites in View): contains information about the positions and signal strengths of the satellites visible to the GNSS receiver.
• VTG (Track Made Good and Ground Speed): contains the GNSS receiver's track made good (TMG) and ground speed.
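NMEA sentences are plain ASCII with a trailing XOR checksum, so they are straightforward to validate and parse. A minimal sketch (the ZDA field layout used here is the standard one, but the receiver manual remains authoritative):

```python
def nmea_checksum_ok(sentence: str) -> bool:
    """Validate an NMEA sentence of the form $<body>*<HH>,
    where HH is the hex XOR of all body characters."""
    body, _, given = sentence.strip().lstrip('$').partition('*')
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return given != '' and calc == int(given, 16)

def parse_zda(sentence: str):
    """Extract UTC time (hh, mm, ss) and date (day, month, year)
    from a ZDA sentence."""
    fields = sentence.split('*')[0].split(',')
    hhmmss = fields[1]
    return (hhmmss[0:2], hhmmss[2:4], hhmmss[4:6],
            fields[2], fields[3], fields[4])

# Build a sample ZDA sentence with a checksum computed on the fly.
body = "GPZDA,201530.00,04,07,2002,00,00"
cs = 0
for ch in body:
    cs ^= ord(ch)
msg = "$%s*%02X" % (body, cs)
```

Validating the checksum before trusting a timestamp is important in practice, since a corrupted time field would silently desynchronize the downstream sensors.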

Methodology and system components
Building a mobile LiDAR mapping system requires careful consideration of various factors. The following methodology covers all the essential aspects of the process, from determining the requirements to system testing and qualification. A summary of the research methodology is depicted in Figure 1. Each step in the methodology is explained as follows:
1. Determining requirements: This work assumes that a survey-grade level of accuracy is necessary for this mobile mapping system. Survey-grade accuracy refers to precision and accuracy at the sub-centimeter level. This level of accuracy is imperative in professional surveying and mapping applications where repeatability is essential. Furthermore, the study assumed other crucial factors such as a high scanning speed (at least 20 Hz), a wide FOV (at least a 100-degree vertical angle), a PRF (Pulse Repetition Frequency) of at least 100 kHz, and an extended detection range of at least 100 meters.
2. Selecting sensors: Factors that must be considered include the field of view, range, resolution, eye safety, and laser wavelength of the sensors. In addition to the LiDAR sensors, other hardware components are also needed, such as the POS and an onboard computer. Based on an examination of the sensor technologies available within the RSIL (Remote Sensing Innovation Lab) at Toronto Metropolitan University, it was determined that the Teledyne Optech Polaris LiDAR sensor was the most suitable option to meet the specified requirements.
A similar methodology was employed in selecting the POS. The RSIL possesses a variety of GNSS/INS components, including the Applanix APX15. However, upon evaluation, it was determined that this system did not meet the specified accuracy requirements, particularly regarding the accuracy of its orientation angles. As a result, the lab engaged Applanix and temporarily procured a POS LV 520 system to be deployed in this system.
The POS LV 520 is widely acknowledged to be of high precision and accuracy and is utilized to achieve survey-grade accuracy that conforms to the established requirements. In addition, data integration protocols must be evaluated to ensure all data is appropriately stored for post-processing.

Teledyne Optech Polaris
The Teledyne Optech Polaris TLS (Terrestrial Laser Scanner) is an eye-safe instrument for collecting 3D point cloud data. One of the critical features of the Polaris system is its exceptional long-range capability, allowing it to operate effectively at distances of up to 2000 meters. Additionally, the system offers high-speed data acquisition of 2 million data points per second. It is equipped with a versatile field of view, allowing users to select both horizontal and vertical angles. The horizontal field of view spans 360 degrees, while the vertical field extends to 120 degrees. These features make the Polaris system an adaptable solution for various applications. In this research experiment, the Polaris API (Application Programming Interface) was utilized to lock the sensor's pan angle in a fixed position, enabling exclusive vertical scanning within a 120-degree range throughout the entirety of the data collection. It can also capture multiple LiDAR returns, making it highly efficient and capable of capturing detailed 3D data. It has weatherproof housing, internal data storage, hot-swappable batteries, and flexible external powering options.
Polaris can capture data up to 2000 meters at its lowest PRF of 50 kHz. Polaris is categorized as survey-grade LiDAR equipment due to the high accuracy and precision of the produced measurements. Teledyne Optech, the manufacturer of Polaris, reports the instrument range accuracy as 5 mm (1 sigma) for a target at 100 meters and the measurement precision as 4 mm (1 sigma) for a target placed 100 meters away from the sensor (Polaris | Teledyne Geospatial 2013).

Applanix POS LV520
The Applanix POS LV 520 system is designed to enhance navigational accuracy by optimally integrating inertial and GNSS data. The Applanix POS LV system is a high-precision system for determining a land vehicle's position, orientation, and velocity. It features dual GNSS receivers for determining the vehicle's position and velocity, as well as the true heading using the GAMS (GNSS Azimuth Measurement System) technology.
With typical GNSS coverage, the POS LV 520 provides a positional RMSE (Root Mean Square Error) of 3 cm horizontally and 5 cm vertically, 0.008 degrees in the leveling angles (roll and pitch), and 0.02 degrees in heading using a DGNSS processing mode. In contrast, during a 60-second GNSS outage, the positional RMSE deteriorates to 42 cm horizontally and 53 cm vertically, while the orientation RMSE values are not significantly affected (Trimble Applanix 2023).
The Applanix POS LV system accurately estimates errors in the inertial navigator using a Kalman filter, as shown in Figure 2, to process the differences between the position determined by the GNSS and that of the IMU, typically at a frequency of 1 Hz (Hutton et al. 2008).
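The GNSS/IMU fusion idea can be illustrated with a heavily simplified one-dimensional Kalman filter, where the prediction step stands in for inertial mechanization and GNSS positions arrive at 1 Hz. This is a conceptual sketch, not the Applanix implementation:

```python
import numpy as np

def kalman_1d(z_gnss, dt=1.0, q=0.1, r=0.05**2):
    """Minimal 1D position/velocity Kalman filter.
    z_gnss: GNSS position fixes arriving every dt seconds.
    q: process-noise intensity (stands in for IMU error growth).
    r: GNSS measurement variance (here a 5 cm standard deviation)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])                 # constant-velocity model
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])                   # process noise
    H = np.array([[1.0, 0.0]])                            # we observe position only
    x = np.zeros(2)                                       # state: [position, velocity]
    P = np.eye(2) * 10.0                                  # initial uncertainty
    estimates = []
    for z in z_gnss:
        # Predict (inertial propagation between GNSS fixes)
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the GNSS position (innovation = GNSS minus prediction)
        y = z - H @ x
        S = H @ P @ H.T + r
        K = P @ H.T / S
        x = x + (K * y).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return estimates

# Track a platform moving at a constant 1 m/s for 50 seconds.
fixes = [float(t) for t in range(1, 51)]
track = kalman_1d(fixes)
```

The real POS LV filter works in the error-state domain with many more states (position, velocity, attitude, sensor biases), but the predict/update structure is the same.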

Sensor mounting on the vehicle
A custom mounting plate was designed and fabricated to fit the specific dimensions and mounting hardware of the LiDAR and POS sensors and securely attach them to the vehicle. This research used a Toyota Sienna as the sensor platform, and the mounting plate was tailored to fit the vehicle's dimensions, shape, and roof rack.
The system consists of several components on the car rooftop: Polaris on its custom mount, an IMU, and two GNSS antennas. These components were put together on the vehicle using a wood base plate. The base plate was constructed using two layers of plywood, each measuring 2 feet by 4 feet. The LiDAR mount and IMU were directly mounted on this base plate, while the two GNSS antennas were fixed to the vehicle's roof using suction cup mounts.
The wood plate was secured to the vehicle roof rails using U-shaped fasteners and assorted mechanical bolts. This setup allows the LiDAR, IMU, and GNSS antennas to be appropriately positioned and oriented in the most straightforward configuration, as shown in Figure 3.
Only one aiding sensor, the DMI, was not mounted on top of the vehicle; it was instead mounted on the rear wheel.
The DMI outputs pulses representing partial revolutions of the car wheel, and the POS LV converts these pulses into incremental distance measurements. These measurements are then accumulated to give the total linear distance traveled by the vehicle from its initial position.
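The pulse-to-distance conversion is simple arithmetic: one full wheel revolution of 4096 pulses covers one wheel circumference. A minimal sketch (the wheel diameter used below is a hypothetical value):

```python
import math

PULSES_PER_REV = 4096  # pulses per complete wheel revolution (from the text)

def incremental_distance_m(pulses: int, wheel_diameter_m: float) -> float:
    """Convert a DMI pulse count into travelled distance:
    one revolution covers the wheel circumference pi * d."""
    return pulses / PULSES_PER_REV * math.pi * wheel_diameter_m

# e.g. 4096 pulses on a hypothetical 0.65 m wheel is exactly one circumference
one_rev = incremental_distance_m(4096, 0.65)
half_rev = incremental_distance_m(2048, 0.65)
```

Accumulating these increments over a drive reproduces the total travelled distance, which is exactly the quantity the POS LV uses to constrain IMU drift during GNSS outages.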

System frames of references and installation parameters
To ensure optimal performance of the POS system, various parameters related to distance and orientation must be measured. Four measurements were required for the POS system: mounting angles, lever arms, scale factors, and antenna separation. These measurements should be recorded and loaded into the POS LV system using the LV-POSView software during the first power-on for a new installation (Puente et al. 2013). Some parameters must be measured and declared before the POS LV can navigate and start producing a POS solution. However, before taking the measurements, another critical aspect was considered: the definition of the reference frames of the system. The developed system defines five body frames: IMU, Vehicle, Reference, DMI, and LiDAR. The IMU Frame is a coordinate system with its origin at the center of the IMU. The axes of the IMU Frame are fixed and are labeled on the IMU enclosure, along with any offsets required to identify the exact location of the IMU origin.
The Vehicle Frame is a right-handed coordinate system rigidly attached to the body of the vehicle, used to describe the position and orientation of the vehicle. In this research, the Vehicle Frame is colocated with the IMU Frame. The x-axis points forward, the y-axis points right, and the z-axis points down.
The Reference Frame is a coordinate system with an origin at an arbitrary point the user chooses. This research colocates the Reference Frame with the IMU Frame for simplicity. The Reference Frame is typically oriented like the Vehicle Frame, with the x-axis pointing forward, the y-axis pointing right, and the z-axis pointing down.
The DMI Frame is a coordinate system at the contact point between the road surface and the wheel where the DMI is mounted. The orientation of the DMI Frame's axes aligns with the Vehicle Frame, with the x-axis pointing toward the front of the vehicle, the y-axis pointing toward the right, and the z-axis pointing down.
The LiDAR Frame, shown in Figure 4, is a coordinate system positioned at the optical reference point of the LiDAR sensor. Its origin and axis directions are defined by the manufacturer, in this case Teledyne Optech. The Z-axis is oriented toward the upward direction of the sensor, the Y-axis points in the direction of the laser window at the home position, and the right-hand rule determines the X-axis.
In order for the POS LV to operate correctly, a set of installation parameters, depicted in Figure 5, has to be measured as accurately as possible, including the following:
• Lever arm: Reference to IMU.
• Lever arm: Reference to DMI.
• Lever arm: Reference to GNSS primary antenna.
• Mounting angles: Reference Frame to Vehicle Frame.
• Mounting angles: IMU Frame to Reference Frame.
• Primary-to-secondary GNSS antenna separation.
• DMI scale factor, calculated as

DMI scale factor = 4096 / (π · d),

where the DMI generates 4096 pulses every complete wheel revolution and d represents the wheel diameter.
Also, in order to georeference the LiDAR range measurements in post-processing, the following installation parameters have to be roughly estimated, since they will be fine-tuned in post-processing:
• Lever arm: Reference to LiDAR Frame.
• Mounting angles: Reference to LiDAR Frame.
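Conceptually, georeferencing a LiDAR point into the Reference Frame applies the mounting-angle rotation followed by the lever-arm offset. The sketch below uses hypothetical values and a Z-Y-X (yaw-pitch-roll) rotation convention; the actual convention should be taken from the POS LV and post-processing documentation:

```python
import numpy as np

def rot_zyx(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Rotation matrix from roll/pitch/yaw in radians, Z-Y-X convention."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def lidar_to_reference(p_lidar, mounting_angles_rad, lever_arm_m):
    """Rotate a point by the mounting angles, then add the lever arm."""
    R = rot_zyx(*mounting_angles_rad)
    return R @ np.asarray(p_lidar, float) + np.asarray(lever_arm_m, float)

# Hypothetical installation: LiDAR 0.30 m ahead of and 0.50 m above the
# reference point (z down), rotated 90 degrees in yaw.
p_ref = lidar_to_reference(p_lidar=[10.0, 0.0, 0.0],
                           mounting_angles_rad=(0.0, 0.0, np.pi / 2),
                           lever_arm_m=[0.30, 0.0, -0.50])
```

In the full georeferencing chain, the same pattern repeats once more: the Reference Frame point is rotated by the POS-derived attitude and translated by the POS-derived position to land in the mapping frame.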
All the installation parameters were measured to the best achievable accuracy using steel tapes, with a standard deviation figure recorded to indicate the confidence level in the measurements. All the installation parameters are reported in Table 1, where LA refers to the Lever Arm and MA refers to the Mounting Angle.
The installation parameters are stored in the Applanix POSView software. This software interfaces with the POS PCS and uses the installation parameters as initial values for the POS system calibration. The detailed procedure for calibrating the installation parameters of the POS LV is out of the scope of this document. However, all the steps required to initialize the POS LV system with its GAMS system are comprehensively furnished in the POS LV V5 Installation and Operation Guide (2022).

POS LV and Polaris synchronization
Teledyne Polaris has a DB9 serial port and can optionally be configured to receive a PPS signal and an NMEA message with a predefined specification. NMEA messages must be transmitted to Polaris over the serial communication link at a baud rate of 38400, i.e., 38400 bits per second. This is a commonly used baud rate for NMEA messages and allows efficient data transmission between devices (Polaris | Teledyne Geospatial 2013).
In addition to the baud rate, NMEA messages have other specifications that define the format of the message. For example, Teledyne Polaris expects to receive messages with 8 data bits, meaning each character in the message is encoded with 8 data bits. It also uses one stop bit, an extra bit transmitted after the data bits to signal the end of each character. The NMEA messages expected by Polaris are transmitted without parity, meaning no additional bit is added for error checking. This allows a simpler frame format and faster transmission but is less reliable when errors are likely to occur. To configure these requirements on the POS system, POSView, the Applanix software interface for the POS LV 520, was used to configure serial port number 3, which provides an RS232 serial link to transmit the PPS and NMEA messages with the configuration required by Polaris.
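The practical effect of this 8-N-1 framing can be checked with a little arithmetic: each transmitted byte costs one start bit, eight data bits, no parity bit, and one stop bit, i.e. ten bits on the wire:

```python
# Serial framing arithmetic for the Polaris link: 38400 baud, 8-N-1.
BAUD = 38400
START_BITS = 1
DATA_BITS = 8
PARITY_BITS = 0   # "no parity"
STOP_BITS = 1

# Bits on the wire per transmitted byte.
FRAME_BITS = START_BITS + DATA_BITS + PARITY_BITS + STOP_BITS

# Effective throughput in bytes per second at 8-N-1.
BYTES_PER_SECOND = BAUD / FRAME_BITS

# A roughly 70-character NMEA sentence (an assumed typical length)
# therefore takes about 18 ms to transmit, comfortably inside the
# one-second PPS interval.
SENTENCE_MS = 70 * FRAME_BITS / BAUD * 1000.0
```

This margin is why a once-per-second timing message can be delivered reliably even at a modest baud rate.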
Teledyne Optech Polaris accepts PPS, DPS, or ZDA NMEA messages in ASCII (American Standard Code for Information Interchange) on its serial interface; in this research, the ASCII PPS format was used and configured on the POS LV 520.
Example of a message received on the Polaris serial port:
$GPPPS,215946.0000,3,2128,18.00,3279*75
According to the PPS message format, the UTC (Coordinated Universal Time) of this PPS pulse is 21:59:46.0000.
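Extracting the UTC time of day from such a message is a matter of splitting on commas. The field layout below is inferred from the sample message only; the receiver documentation remains the authoritative source:

```python
def pps_utc(sentence: str):
    """Pull the UTC time-of-day field (hhmmss.ssss) out of a PPS-style
    sentence. Assumes the time is the second comma-separated field,
    as in the sample message above."""
    fields = sentence.split(',')
    t = fields[1]
    return int(t[0:2]), int(t[2:4]), float(t[4:])

msg = "$GPPPS,215946.0000,3,2128,18.00,3279*75"
hours, minutes, seconds = pps_utc(msg)
```

Pairing this decoded timestamp with the electrical PPS edge is what lets the LiDAR stamp each range measurement on the GNSS time scale.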

Polaris and POS LV hardware integration
In an Applanix POS LV 520 system, the PCS (POS Computer System) is the core component responsible for processing and controlling various aspects of the system's operation. The functions of the PCS include:
• Receiving and processing data from the GNSS receivers, IMU, and DMI.
• Computing the navigation solution, including position and orientation.
• Providing interfaces for communication and data transfer with external devices.
• Providing power and timing signals to other components of the system.
The PCS is typically a standalone unit connected to the other components of the POS LV system through various connectors and cables.The POS LV PCS-86 rear panel has ten connectors, as shown in Figure 6.Most of these connectors can only be used with one particular cable, but the I/O and COM connectors can be used with a special multi-connector breakout cable.
The PCS-86 has A1 and A2 ports on the rear panel, which connect the dual GNSS antennas to the GNSS receiver. The A1 port is for the primary GNSS antenna, and the A2 port is for the secondary GNSS antenna. The signals from the GNSS antennas are transmitted to the GNSS receiver through these ports. It also has a single DMI connector with two digital ports that serve as interfaces for DMI sensors.
The PCS-86 has an Ethernet interface that enables communication with a host computer for monitoring or control purposes. The Ethernet port can also transmit POS LV data to the host computer through a network switch, allowing for real-time processing or data logging and post-processing. The PCS-86 is also equipped with a USB port on its front panel, which allows it to store navigation data on a USB flash drive. This capability was used in the research work described here.
The PCS-86 has one COM connector that supports a multi-connector breakout cable, which provides access to two independent 4-wire RS232 serial communication ports. The COM connector is not used in this research since Polaris does not require 4-wire communication. The PCS-86 also has two I/O (input/output) connectors to interface with external peripherals and devices.
The IMU connector connects the IMU to the POS LV system, allowing the PCS-86 to receive and process IMU data and use it in the navigation solution.
The power connector has three pins: one for the DC voltage supply and another for ground. The PCS-86 requires a voltage range of 8-34 V DC; the maximum current draw, at 8 V, is 7 A. The power cable of the PCS was connected to the car battery, which provides 12 V DC. It was essential to ensure that the power supply and cables could deliver sufficient current and maintain a stable voltage within the specified range. As an extra layer of protection, a 7 A fuse was used to break the circuit in case of an overcurrent event.
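As a quick sanity check of the power budget, the worst-case draw quoted above (7 A at 8 V, i.e., 56 W) can be converted into the expected current at the 12 V battery. The 25% safety margin below is an illustrative assumption for fuse sizing, not an Applanix figure:

```python
def min_fuse_rating(max_power_w, supply_v, margin=1.25):
    """Smallest fuse rating (A) covering the worst-case draw plus a safety margin.
    The 25% margin is an illustrative assumption, not a manufacturer figure."""
    return max_power_w / supply_v * margin

max_power = 7.0 * 8.0                      # worst-case PCS-86 draw: 7 A at 8 V = 56 W
draw_at_12v = max_power / 12.0             # expected current from the 12 V battery
print(round(draw_at_12v, 2))               # 4.67 A
print(round(min_fuse_rating(max_power, 12.0), 1))  # 5.8 A, so the 7 A fuse fits
```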
Teledyne Polaris has a DB9 serial interface port at the back of the instrument. This port is internally programmed to receive both a PPS signal and an RS232 GNSS timestamp using one of the NMEA messages supported by Polaris.
The Applanix POS LV comes with a breakout cable, a specialized cable designed to split the POS LV I/O connector into multiple serial ports, such as RS232 and RS422. In addition to these serial ports, the cable also features event and PPS signal outputs.
The RS232 and RS422 connectors on the breakout cable have different pinouts, so it is essential to use the correct pins to avoid malfunctions in the POS LV system or Polaris. The event signal is not used in this research work; however, the PPS is crucial for synchronizing the LiDAR range measurements with the GNSS timestamp.

POS GAMS system adjustment
The GAMS in POS LV can determine the true heading by analyzing the positions of its dual antennas. To function correctly, the GAMS requires knowledge of the baseline vector between the phase centers of the primary and secondary antennas, along with the expected accuracy (standard deviation) of this measured baseline vector.
The GAMS needs data from at least five satellites with a PDOP (Positional Dilution of Precision) of 3 or lower to properly calibrate its antenna installation. The antenna installation calibration is best performed under open sky, free of obstacles and multipath, where the satellite geometry is favorable, and in an area where the vehicle can move around freely, such as a large parking lot.
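The two prerequisites can be captured in a small helper; a sketch, with the thresholds taken from the requirement stated above:

```python
def gams_calibration_ready(num_satellites: int, pdop: float) -> bool:
    """GAMS antenna-installation calibration prerequisites as stated in the text:
    at least five satellites and a PDOP of 3 or lower."""
    return num_satellites >= 5 and pdop <= 3.0

print(gams_calibration_ready(7, 1.8))  # True
print(gams_calibration_ready(4, 1.2))  # False: too few satellites
print(gams_calibration_ready(8, 4.5))  # False: PDOP too high
```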
Standing still for 5 minutes enables the GAMS to resolve the carrier phase ambiguities; POSView then displays the status "Ready Offline", indicating that the system is ready for the dynamic calibration procedure.
The vehicle should then be driven in a straight line, accelerating to at least 60 km/h, and then braked as quickly as possible; this pattern of acceleration and hard stopping is repeated multiple times. The next step is to switch to circular or figure-8 maneuvers while avoiding multipath signals. These maneuvers are repeated until the required GAMS heading accuracy (0.3 degrees) is observed on the POSView status screen. The last step is to run the GAMS calibration engine from POSView. Once it completes successfully, a status indicator shows that the GAMS solution is calibrated and ready for data collection (POS LV V5 Installation and Operation Guide, 2022).

System software layer and data flow
The software layer is a vital component of the proposed system software architecture shown in Figure 7, responsible for the tasks that collect, preprocess, calibrate, and produce the final data product. In this section, the four primary system software tools are explained.
The first tool is Applanix POSView, responsible for configuring and calibrating the POS hardware and collecting the raw POS measurement data. Proper calibration of the POS hardware is essential for ensuring the accuracy and reliability of the system's measurements. POSView outputs the raw POS sensor data as part of the data products shown in Figure 7.
The second essential tool in the pipeline is POSPac, which is used to post-process raw POS system measurements and produce a smoothed best estimate trajectory. This is accomplished using a tightly coupled Kalman filter algorithm, which combines raw data from the GNSS, accelerometer, gyroscope, and other aiding sensors to produce a high-accuracy estimate of the system's position and orientation. POSPac includes other features, such as correcting atmospheric effects and handling errors and outliers in the measurement data.
The third tool is the Teledyne Optech API (Application Programming Interface), which collects raw range measurements from the Polaris terrestrial LiDAR. The API interfaces with the Polaris via an Ethernet port and can save the data on the equipment or on external computer storage. The raw LiDAR measurements collected by the API are an essential input for producing the final data product of the system, which is typically a georeferenced point cloud.
The fourth and final tool is the Teledyne Optech Processing API, which combines the adjusted trajectory produced by POSPac with the raw LiDAR measurements collected by the Teledyne Optech API. This process results in the production of the final georeferenced point cloud, which is the primary data product of the system.
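Conceptually, the final georeferencing step combines the trajectory with each raw LiDAR return. Below is a simplified Python sketch of the standard direct-georeferencing composition; it ignores time interpolation and Earth-curvature effects, and the frames, lever arm, and boresight matrix are illustrative, not the actual interface of the Teledyne Optech Processing API:

```python
import math

def matvec(R, v):
    """Multiply a 3x3 rotation matrix (nested lists) by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def rot_z(yaw):
    """Rotation about the vertical axis by the given yaw angle (radians)."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def georeference(p_nav, R_body_to_map, lever_arm, R_boresight, x_lidar):
    """Map-frame coordinates of one LiDAR return:
    X = p(t) + R_body_to_map(t) @ (lever_arm + R_boresight @ x_lidar)."""
    v = matvec(R_boresight, x_lidar)            # LiDAR frame -> body frame
    v = [v[i] + lever_arm[i] for i in range(3)]  # shift by LiDAR-to-IMU lever arm
    v = matvec(R_body_to_map, v)                # body frame -> map frame
    return [p_nav[i] + v[i] for i in range(3)]  # translate by platform position

# A return 10 m ahead of the scanner, platform heading rotated 90 degrees.
I = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
pt = georeference([100.0, 200.0, 50.0], rot_z(math.pi / 2),
                  [0.5, 0.0, -0.8], I, [10.0, 0.0, 0.0])
print([round(c, 3) for c in pt])  # [100.0, 210.5, 49.2]
```

The design point is that trajectory errors (position, orientation) and calibration errors (lever arm, boresight) both enter this composition directly, which is why the calibration and trajectory post-processing steps described above dominate the accuracy of the final point cloud.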

System test data collection
To ensure the functionality of the integrated system, a test mission was conducted with the system mounted on the vehicle's roof. The mission started from a parking lot close to Toronto Metropolitan University, selected for its good sky visibility and an established GNSS solution.
The POS was configured to collect both GNSS and INS data, at 200 Hz for inertial data and 1 Hz for GNSS data. The standard operating procedure for aligning the IMU with the navigation solution, as outlined in the Applanix POS LV manual, was strictly followed until a heading accuracy of 0.03 degrees was achieved, considered a good figure for initiating the POS calibration drive.
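Since the GNSS fixes arrive at 1 Hz while the inertial data arrives at 200 Hz, post-processing has to relate the two rates. A minimal sketch of the idea using linear interpolation follows; the actual POSPac Kalman smoother is far more sophisticated than this:

```python
def interpolate_position(t, gnss_epochs):
    """Linearly interpolate a 1 Hz GNSS position to an arbitrary IMU timestamp.
    gnss_epochs: sorted list of (time_s, (x, y, z)) tuples."""
    for (t0, p0), (t1, p1) in zip(gnss_epochs, gnss_epochs[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return tuple(p0[i] + a * (p1[i] - p0[i]) for i in range(3))
    raise ValueError("timestamp outside GNSS coverage")

# Two consecutive 1 Hz fixes; query one 200 Hz step (5 ms) after the first.
epochs = [(0.0, (0.0, 0.0, 0.0)), (1.0, (2.0, 4.0, 0.2))]
print(interpolate_position(0.005, epochs))
```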
The vehicle was then driven in various linear and circular patterns, incorporating varied accelerations to align the IMU and improve the POS angular accuracy. Based on an API provided by Teledyne Optech, the data acquisition tool shown in Figure 8 was used to communicate with the LiDAR sensor over an Ethernet connection and verify its ability to read PPS and NMEA messages from the POS system. Additionally, this tool was used to configure the laser PRF and the horizontal and vertical scanning angles, and to trigger the LiDAR sensor to start and stop scanning.
The investigation with the API tool revealed that the PPS and NMEA messages were successfully received by the LiDAR sensor, indicating that the wiring and configuration of the POS were correct. Additionally, the LiDAR sensor was exercised at various scanning angles to verify its functionality and its ability to conform to the specified configuration parameters, including the PRF and both scanning angles.
A series of LiDAR data strips was acquired to confirm the proper storage of both POS and LiDAR data in the internal storage of the sensor. Subsequently, a thorough examination of the raw data was conducted to confirm that all necessary data required for post-processing had been collected and that the system could be utilized in actual data collection. Additionally, the raw data was verified to ensure that it could be used for georeferencing the LiDAR raw observations after the data collection.

Initial system quality report and point cloud validation
The system test data collection was conducted at TMU in downtown Toronto, which served as an urban canyon environment characterized by numerous high-rise buildings. Consequently, the navigation solution was anticipated to encounter challenges due to the GNSS signal blockage and the multipath effect prevalent in such settings. Additionally, due to the absence of a dedicated close-range GNSS base station, the closest available public GNSS base station, located in Saint Catharines, ON, approximately 50 km from the acquisition site, was utilized. This choice facilitated the implementation of the previously described DGNSS method.
The Applanix POSPac software was utilized to evaluate solution quality, both with and without the use of a base station. In this context, the solution without a base station is denoted as the "real-time solution," while the one incorporating the base station is known as the "post-processing solution." The post-processing solution leverages the differential GNSS method.
Differential correction, a key component of the post-processing solution, involves comparing the position calculated by a reference station with its precisely surveyed known location. This comparison helps identify errors in the GNSS signals and correct them, thereby enhancing the accuracy of the positioning and orientation data. The results obtained from the trajectory post-processing software, Applanix POSPac, demonstrate noteworthy improvements in the accuracy of the North, East, and Down (vertical) positions compared to the real-time solutions. Specifically, the post-processed North position achieved an RMSE better than 16 cm, whereas the real-time North position reported an RMSE better than 3.2 meters. The post-processed East position RMSE improved to better than 13 cm, compared to the real-time RMSE of approximately 5 meters. Lastly, the post-processed Down (vertical) position exhibited an RMSE better than 23 cm, while the real-time Down position reported an RMSE better than 2.2 meters.
Another aspect of the solution's quality is the estimation of the orientation angles, namely Roll, Pitch, and Heading. The post-processing solution demonstrated improved accuracy, with a Roll RMSE better than 0.3 arc minutes as opposed to the real-time RMSE of 1.15 arc minutes. Similarly, the Pitch RMSE was better than 0.32 arc minutes, compared to the real-time value of 1.15 arc minutes. However, the Heading RMSE did not significantly improve after post-processing and remained better than 1.5 arc minutes. Figures 9 and 10 illustrate the positional enhancement in East, North, and Down and the trajectory angle enhancement in Roll, Pitch, and Heading around the X, Y, and Z axes, respectively.
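The RMSE figures quoted above follow the usual definition, the square root of the mean squared residual. A minimal sketch with hypothetical residuals (not the actual campaign data):

```python
import math

def rmse(errors):
    """Root Mean Square Error over a list of residuals."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Hypothetical residuals (m) between post-processed and reference positions.
residuals = [0.12, -0.09, 0.15, -0.11, 0.08]
print(round(rmse(residuals), 3))  # 0.113
```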
Furthermore, part of the benefit of the Kalman filter in post-processing is that it also calibrates most of the installation parameters, such as the lever arms measured from the primary GNSS antenna to the IMU frame, which were measured only roughly during system mounting. Figure 11 presents the adjusted lever arm values, typically reported in the X, Y, and Z directions. Each chart depicts the evolution of these values, starting with the initial measurements recorded at the start of the data acquisition. Subsequently, the lever arms undergo adjustment, and the adjusted values are depicted in the latter part of the curve, where the curve flattens out.
For instance, the X lever arm was initially measured at −1 meters, the Y lever arm at 0.38 meters, and the Z lever arm at −0.76 meters. Following the adjustment process, these values were refined to X, Y, and Z lever arms of approximately −0.97 meters, 0.35 meters, and −0.08 meters, respectively. The adjusted values demonstrate improvements compared to the initial measurements, as indicated by the nearly flat portion of the curve toward the end.
In order to achieve comprehensive validation and readiness for actual data collection, stringent measures were taken. Firstly, the collected data strips from the system test data collection were subjected to georeferencing using the API developed by Teledyne Optech. Furthermore, test mission point clouds were generated and visualized using a specialized point cloud visualization tool called Quick Terrain Modeler to evaluate the quality and interpretability of the collected data. This tool facilitated a thorough examination of the point cloud data, enabling a visual assessment of its consistency and point density. The observed sample strips within the generated point clouds exhibited commendable characteristics. Notably, they demonstrated consistent point cloud patterns, indicative of reliable data acquisition, as shown in Figure 12. Moreover, the point density achieved in these strips was deemed sufficient for the practical interpretation of the campus buildings and prominent features along the streets. These indicators contribute to the assurance of reliable data collection for subsequent scientific analyses and investigations of the developed system.
The georeferenced mobile data and the static data gathered from various stations were compared to assess the system's initial performance. To evaluate the accuracy of the mobile data georeferenced by the navigation sensors, linear features, such as window dimensions and traffic light height, were measured in both mobile and benchmark static data obtained through a static LiDAR survey (Figure 13).
The data collected in static mode from three stations near the TMU Monetary Building was compared to the mobile data georeferenced by the navigation sensors to assess accuracy. The mobile and static data were used to measure approximately 15 different features, and the discrepancies were calculated for each feature. Computing the RMSE (Root Mean Square Error) over these discrepancies yielded an overall accuracy of 11 cm. This assessment demonstrates the preliminary effectiveness and dependability of the proposed methodology in enabling mobile LiDAR systems and highlights the system's capacity to generate georeferenced data in a timely fashion compared to the classical static approach.
Upon comparing the mobile data with the static benchmark data, it becomes apparent that the static data demonstrates superior consistency and density. This is primarily due to its acquisition from a fixed station utilizing the sensor panning mechanism, characterized by an exceptionally small angular step of 20 micro-radians. In contrast, the mobile system disables the panning rotation, instead relying on longitudinal scanning achieved by maneuvering the vehicle back and forth to steer the laser beam along the direction of motion.
The density of the mobile system data fluctuates and is contingent on both the speed of the vehicle and the scanner speed. As depicted in Figure 14, the mobile data appears sparse at positions where the car speed is increased, occasionally reaching a density comparable to that of the static system when the vehicle moves slowly. Notably, Figure 14 highlights that the majority of building facades in the mobile system lack density, ranging from 2,000 to 10,000 points per square meter, owing to the high speed of the vehicle, whereas the same buildings exhibit a consistent density of almost 12,000 points per square meter across the entire static strip.
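The speed dependence of the facade density can be roughed out from first principles: the along-track spacing between scan lines grows with vehicle speed, while the in-profile point spacing depends on range and on points per scan line. The parameter values below are purely illustrative, not Polaris specifications:

```python
import math

def facade_point_density(speed_mps, scan_line_hz, prf_hz, fov_rad, range_m):
    """Rough points-per-square-metre on a facade for a profiling scanner:
    along-track line spacing = v / scan_line_rate;
    in-profile spacing at range R = R * (fov / points_per_line)."""
    along_track = speed_mps / scan_line_hz
    in_profile = range_m * fov_rad / (prf_hz / scan_line_hz)
    return 1.0 / (along_track * in_profile)

# Illustrative numbers only: 50 Hz scan lines, 500 kHz PRF, 120-degree
# profile, facades at 20 m range, vehicle at 1 m/s versus 15 m/s.
slow = facade_point_density(1.0, 50.0, 500_000.0, math.radians(120), 20.0)
fast = facade_point_density(15.0, 50.0, 500_000.0, math.radians(120), 20.0)
print(round(slow), round(fast))  # density drops in proportion to vehicle speed
```

Under this simple model, density is inversely proportional to vehicle speed, which matches the qualitative behavior observed in Figure 14.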
The Monetary Times Building, situated at the far left of Figure 14, exhibits a density that appears remarkably consistent with the static data, as the vehicle almost came to a stop while crossing the intersection. It is imperative to emphasize that the observed uniformity and density in the static data can be ascribed to its meticulous acquisition methodology. The static data was gathered with great care from a fixed station, demanding a significant amount of time to capture multiple stations from various viewing angles of the buildings. In contrast, the mobile data is collected while the vehicle is in motion, requiring considerably less time than the static configuration but resulting in lower point density.

Discussion on the sensor integration obstacles
There are common challenges and obstacles in the development of any mobile LiDAR system comparable to the one presented in this investigation; the following is a summary of those obstacles:
1. Synchronization: Ensuring precise time synchronization among the sensors is essential for proper data fusion and alignment. Insufficient synchronization introduces errors into the georeferencing process and degrades the quality of the final 3D data product. In particular, because of the delay along the PPS signal path from the GNSS receiver to the LiDAR sensor, a fraction-of-a-second delay has to be accounted for to keep the internal LiDAR clock aligned with the external GNSS reference clock (Ding et al. 2005).
2. Sensor Calibration: Each sensor must be calibrated both independently and jointly to guarantee a reliable final 3D point cloud. Calibrating the LiDAR, IMU, and GNSS sensors for their intrinsic parameters, such as internal biases and drifts, and for their mounting arrangement, such as the boresight angles and lever arms between sensors, is crucial for accurate data fusion (Tedaldi et al. 2014). It is worth mentioning that in this research the individual sensor calibrations were performed by the manufacturer and preloaded on each sensor.
3. Sensor Obstruction (Shadowing): The positioning and installation of sensors on a mobile platform should be carefully planned to avoid obstruction. Occlusion by objects in the environment or by the platform itself can result in lost or distorted data. For instance, the GNSS antennas must be mounted at the highest possible points on the vehicle so that the other sensors do not obstruct the GNSS signals (Jianghui et al. 2020). The LiDAR sensor, in turn, has to be mounted so that the laser beam never hits the rooftop of the vehicle; otherwise, the rooftop will appear in all the collected data.
4. Nature of the Area: Urban canyons, characterized by tall buildings and narrow streets, present obstacles that impede the line of sight between the GNSS antenna and the satellite constellation, which degrades the positional accuracy (Haala et al. 2008). Urban canyons also limit the visibility of the LiDAR sensor to the target surfaces; the usable scanning angles become restricted, leading to reduced data coverage and increased occlusions. Conversely, open-sky areas, devoid of significant obstructions, allow a wider field of view and better scanning angles, resulting in more comprehensive data capture and a better trajectory solution, which is strongly influenced by the geometry of the GNSS satellites in the sky (Kabir et al. 2022).
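The PPS path-delay issue in item 1 amounts to pinning the LiDAR's internal clock to the GNSS second marked by the last PPS edge, optionally corrected for a measured signal-path delay. A minimal sketch (the numeric values are illustrative):

```python
def to_gnss_time(internal_t, pps_internal_t, pps_gnss_t, path_delay_s=0.0):
    """Convert a LiDAR internal-clock timestamp to GNSS time using the most
    recent PPS edge: the edge's internal time is pinned to the whole-second
    GNSS time it marks, minus any measured PPS signal-path delay."""
    return pps_gnss_t + (internal_t - pps_internal_t) - path_delay_s

# A return stamped 0.345 s after the last PPS edge, which marked GNSS second 100000.
print(to_gnss_time(12.345, pps_internal_t=12.0, pps_gnss_t=100000.0))
```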
Understanding and mitigating these environmental challenges are critical for harnessing the full potential of LiDAR mobile mapping technology in various urban and rural applications.
By addressing these potential obstacles, researchers and engineers can optimize the integration of LiDAR, IMU, and GNSS sensors in a mobile mapping system, enabling accurate and reliable data collection for a variety of applications.

Conclusion
In this paper, a comprehensive methodology for the transformation of a terrestrial LiDAR sensor into a mobile mapping system was presented. Using a mobile mapping system offers several advantages over traditional stationary LiDAR systems, such as increased efficiency, flexibility, and the ability to cover large areas quickly. The integration of the Teledyne Polaris LiDAR sensor and the Applanix POS LV system was discussed in detail, describing the sensors' requirements, selection, and mounting. The installation measurements for linear and angular misalignment were conducted to ensure the accuracy of the collected data. The connections required for delivering PPS and NMEA messages to the LiDAR sensor for timestamping were also explained, providing an understanding of the communication between the different components of the system.
The system's functionality was tested to ensure its readiness for data collection and to confirm that it provided the necessary raw data for post-processing. The tests checked different LiDAR scanning angles to ensure the sensor could be controlled and followed the configuration parameters passed to the API. A series of LiDAR data strips was acquired to confirm the proper storage of both POS and LiDAR data in the internal storage of the sensor. Moreover, the raw data were examined to confirm that all data required for post-processing had been collected and that the system could be used for actual data collection.
In conclusion, the methodology outlined in this paper presents a comprehensive and generalizable approach for transforming any static LiDAR sensor into a mobile mapping system. While the Teledyne Polaris LiDAR sensor and the Applanix POS LV system were used as specific examples in this study, the methodology can be adapted to any other type of POS or LiDAR sensor. This study also provides a step-by-step approach to ensure the data is complete and ready for further post-processing. Sensor requirements, selection, mounting, installation measurements, wiring, and system testing were all discussed to give a comprehensive understanding of the process. This study provides a powerful tool for integrating any LiDAR sensor with a POS to enable data collection in kinematic mode. It enables researchers and practitioners to convert LiDAR sensors into mobile mapping systems, offering new opportunities while ensuring data quality, controlling cost, and expanding the scope of mapping applications.
3. Sensor Integration and Mounting: Integrating GNSS, IMU, and DMI involves combining sensor data to estimate position and orientation. One of the main challenges of integrating these sensors is synchronizing the data they provide. A GNSS receiver can use a common time base or a synchronization signal, such as the PPS (pulse per second), for data synchronization. The PPS signal is a hardware trigger used to synchronize LiDAR sensors; it provides precise arrival times for the LiDAR measurements, which is essential for georeferencing the raw LiDAR observations. Regarding sensor mounting on vehicles, it is essential to place the sensor where it has an unobstructed view of the environment and is protected from physical damage. A common approach is to mount the LiDAR sensor on the vehicle's roof to allow a clear view in all directions. Additional considerations such as safety, vibration isolation, and temperature control may be needed to ensure the sensor functions properly under varying weather conditions.
4. System Calibration: Calibration aims to accurately determine the sensors' relative position, orientation, and alignment with respect to each other so that the resulting LiDAR data can be accurately related to real-world coordinates. The calibration of a mobile LiDAR mapping system typically includes two major parts: calibration of the POS and calibration of the LiDAR sensor. Calibration of the POS involves determining the linear and angular misalignments between all of the POS's frames of reference. Calibration of the LiDAR sensor involves determining its intrinsic parameters (reported by the manufacturer) and its extrinsic parameters with respect to the POS reference frame, along with any errors or biases (Z. Li et al. 2019).
5. System Software: The main software components in a mobile LiDAR mapping system must include:
- A POS software GUI (Graphical User Interface) that receives the raw observations from the GNSS receiver and the IMU and provides the position and orientation of the platform's reference frame.
- A LiDAR software driver that communicates with the sensor to control its operation and collect data.

Figure 1. Methodology of transforming a terrestrial LiDAR into a Mobile Mapping System.

Figure 3. The vehicle rooftop sensor mounting and orientation.

Figure 5. Sensors lever arms and separation measurements.

Figure 6. The required hardware integration of Applanix POS LV and Teledyne Optech Polaris.

Figure 7. System Hardware, Software, and the different data products.

Figure 8. Polaris API for Sensor Configuration and Monitoring (Courtesy of Teledyne Optech).

Figure 9. Positional RMSE reported in real-time versus post-processed solution.

Figure 10. Angular RMSE reported in real-time versus post-processing solution.

Figure 11. Adjustment of the primary GNSS to IMU lever arms.

Figure 12. Generated point cloud from the test data collection at Toronto Metropolitan University.

Figure 13. Example of a linear feature measured in the mobile data versus the benchmark data (static).

Figure 14. The density comparison between the static and the system mobile data.

- Data storage software that stores the LiDAR data on disk for later processing and analysis.
- Trajectory post-processing software that processes the data from the POS system to generate a precise trajectory of the vehicle.
- A LiDAR georeferencing tool that aligns the LiDAR data with the platform's trajectory and georeferences it in real-world coordinates.
7. Performance and Functional Test: The performance and functional capabilities of the POS and LiDAR were tested. The primary purpose of these tests is to ensure that all system components work during the test task and that all signals and data-integration protocols operate as intended. The test includes verification of the trajectory solution and of the LiDAR and POS data quality.

Table 1. POS LV initial installation parameters.

According to the PPS message format, the day-of-week value 3 corresponds to Wednesday in GPS week 2128, which is equivalent to October 21, 2020. The value 18 represents the offset between the GNSS time and UTC at this epoch. Finally, the value 75 is a checksum, a form of error detection used to verify the integrity of data transmitted between devices.