DJI Zenmuse L1 Lidar Review

DJI’s long-anticipated L1 sensor is finally here. We’ve flown dozens of test flights and spent hundreds of hours analyzing the data to create this real-world, field-tested review of the newest DJI L1 sensor.

The DJI Zenmuse L1 integrates a Livox Lidar module, a high-accuracy IMU, and a 20MP camera with a 1-inch CMOS sensor and a mechanical shutter on a 3-axis stabilized gimbal. The L1 is compatible with the M300 RTK.

A few caveats, though. While this is already going to be a pretty long and detailed article, we couldn’t possibly cover everything we know so far about this system here, so stay tuned for more updates as we refine field data collection workflows, recommended settings in different environments and data processing best practices. Sign up for our mailing list below to be the first to know about these things.

Summary

Overall, the DJI L1 sensor and the M300 platform make for an excellent UAV lidar platform. We would recommend it to any surveyor that regularly needs to get topo data in vegetated areas.

Pros:

  • Vegetation Penetration - the L1 can get ground data even in moderate vegetation. But be aware that it isn’t perfect.

  • Photogrammetry & Lidar - the L1 has a good photogrammetry camera, too, allowing it to collect both lidar and photogrammetry data simultaneously.

Cons:

  • Processing Time - You have to process both lidar and photogrammetry datasets and then merge them to get the best of both worlds. Lidar data is considerably less accurate than photogrammetry data on hardscape.

  • Cost - Especially when compared to something like a Phantom 4 RTK, the M300 with an L1 sensor is 3-4x more expensive.

  • Ease of Use - The M300 system is large, loud, heavy, and difficult to transport and use. It is not nearly as versatile as a Phantom 4 RTK.

Flying and Data Collection

Anyone familiar with flying DJI autopilot missions for surveying, such as the Phantom 4 RTK, will immediately be able to use the software to fly the L1. The flight planning software is fairly straightforward and easy to use, and really nothing special at all in this day and age of drone surveying. In that sense, the actual “flight” aspect of the M300 and the L1 is extremely easy, exactly what you would want. There are a handful of cool things, like visualizing live point clouds, that have been heavily promoted in marketing materials. In our opinion, these are little more than a gimmick. They are cool but don’t actually provide any useful data, which is what we primarily care about.

While mission planning and flying are trivially easy, the setup process, unfortunately, is not. The M300 is an extremely large, loud, heavy, and complex system. Just bringing it out into the field requires a minimum of 4 cases (for the aircraft, L1 sensor, batteries, and base station). Setting up the drone is cumbersome, too, because you need to remove and attach the legs individually, then unfold and lock in place each arm, and ensure none of the propeller chocks are attached. There are dozens of dust covers on each of the ports and attachment points and batteries, each of which can be easily lost. No one item is particularly difficult, but they all add up to quite a lot of complexity. As a pilot, if I didn’t absolutely need the lidar data, I would prefer to fly something simple like the Phantom 4 RTK.

However, it might be unfair to compare this to a Phantom 4 RTK because the lidar system is simply so much more capable in terms of vegetation penetration. And comparing the L1 to other lidar systems on the market - there is simply no comparison. The L1’s ability to integrate reliably into the entire DJI hardware ecosystem is unmatched. The L1 and M300 system is far easier to use than other drone lidar systems on the market.

Data Processing Overview

Processing a blend of lidar and photogrammetry data is extremely complex, with countless ways you could approach it. So rather than explain exactly why we process the data the way we do, we will simply give you a brief overview of how we process it right now.

For starters, you must use DJI Terra to turn the raw data from the L1 sensor into a point cloud. There are no other options. DJI Terra is also extremely limited in terms of its functionality for surveyors, so we recommend using DJI Terra only as a pre-processing tool to create LAS point clouds. We then use a different set of point cloud processing software to translate the point cloud to our local project datum (which DJI Terra doesn’t support), clean out noisy points, classify point types, and vectorize the data into useful deliverables.
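The post-processing steps above can be sketched at a small scale. This is a minimal, illustrative Python example using synthetic points and standard ASPRS class codes (2 = ground, 5 = high vegetation, 7 = low noise); the constant datum shift is a stand-in for a real geoid/projection transform, and a production workflow would run on full LAS files with dedicated point cloud tools.

```python
import numpy as np

# Synthetic point cloud: columns are X, Y, Z, with ASPRS class codes
# (2 = ground, 5 = high vegetation, 7 = low noise). Values are
# illustrative only.
points = np.array([
    [1000.0, 2000.0, 50.2],
    [1001.0, 2001.0, 53.6],   # vegetation return
    [1002.0, 2002.0, 75.0],   # outlier classified as noise
    [1003.0, 2003.0, 50.3],
])
classes = np.array([2, 5, 7, 2])

# 1. Translate from the processing datum to a local project datum.
#    A constant shift is shown; real transforms use a geoid model
#    and a projection library.
local_shift = np.array([-1000.0, -2000.0, -0.5])
points_local = points + local_shift

# 2. Drop points classified as noise.
keep = classes != 7
cleaned = points_local[keep]

# 3. Keep only ground points for surface and contour generation.
ground = cleaned[classes[keep] == 2]

print(ground.shape)  # (2, 3): two ground points survive
```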

Separately, we process the imagery and ground control in a photogrammetry workflow to create an orthophoto as well as a photogrammetry-derived point cloud. On hardscape, the photogrammetry point cloud is more accurate than the lidar data. Further, the orthophoto is far more accurate in X and Y than the lidar data for extracting planimetric features, so a blended vectorization workflow is required.

Ultimately our goal, and the goal of most surveyors, is to get a clean set of data into CAD, and we do that by extracting the best data from both the lidar and photogrammetry datasets.

Vegetation Penetration Capabilities

The key benefit of the L1 sensor is its ability to penetrate vegetation. So to test this out, we went to an incredibly difficult-to-survey, highly vegetated creek bed in Southern California. Rather than test the L1 in generous conditions, we wanted to throw it at real-world, highly challenging conditions, which is where this type of technology can really shine if it works.

A ground-level view of the project site.

A view from within the creek bed showing the density of vegetation.

Many areas are entirely impassable without a machete to cut through the vegetation.

A view of the creek bed as a point cloud in DJI Terra.

So how well is the L1 able to penetrate this vegetation? The answer is - pretty well, but not perfect. Way better than photogrammetry, at least.

The above image is a profile view of the creek bed. The points in green come from a photogrammetry-derived point cloud, in this case captured with a Phantom 4 RTK. The data in blue comes from the L1 sensor. As you can see, the L1 sensor does a considerably better job of gathering data along the floor of the creek bed than photogrammetry, by a VERY wide margin. However, it is not perfect: there are meaningful gaps where even the lidar is not able to gather ground data accurately.

Some areas are worse than others. The above screenshot is a profile view of the creek where the points have been manually classified, and the ground data is shown in brown. As you can see from this profile slice, there are even more gaps than before.

One major challenge with all lidar sensors is that processing lidar data is very time-consuming. Classifying lidar point clouds requires a challenging mix of a skilled hand, a powerful computer, and a good understanding of the capabilities and limitations of automatic classification algorithms. At Aerotas, we have all of these things, and it still takes a good amount of time to get it right over an entire project. The L1 is no different in this regard; processing L1 data is similarly time-consuming to all other lidar sensors we’ve worked with.
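To make the role of automatic classification concrete, here is a deliberately simplified ground filter: bin points into a grid, treat each cell's lowest return as provisional ground, and label points within a tolerance of it. Real classifiers (progressive TIN densification, cloth simulation, and the like) are far more sophisticated, which is exactly why a skilled hand is still needed to review their output. The cell size and tolerance below are arbitrary illustration values.

```python
import numpy as np

# Toy ground classification: per grid cell, take the minimum Z as
# provisional ground and flag points within a tolerance of it.
pts = np.array([
    [0.2, 0.3, 50.0],  # ground
    [0.4, 0.6, 53.5],  # vegetation return
    [1.1, 0.2, 50.2],  # ground
    [1.3, 0.8, 55.0],  # canopy
])
cell_size, tol = 1.0, 0.3
cells = np.floor(pts[:, :2] / cell_size).astype(int)

ground = np.zeros(len(pts), dtype=bool)
for c in {tuple(row) for row in cells}:
    in_cell = np.all(cells == c, axis=1)
    zmin = pts[in_cell, 2].min()
    ground |= in_cell & (pts[:, 2] <= zmin + tol)

print(ground)  # [ True False  True False]
```

A filter this naive would happily call a dense shrub "ground" if no true ground return landed in its cell, which mirrors the gaps seen in the creek bed profiles above.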

But overall, the conclusion is that the L1 has a good, but not perfect, ability to extract ground data even through dense vegetation. It is no magic wand, and the data still requires significant processing and analysis, but it is far better than any photogrammetry solution could provide.

Accuracy

The L1 sensor can penetrate vegetation pretty well, but that isn’t of much use if the resulting data isn’t accurate. Because we work with surveyors and engineers, accuracy means everything. Without reliable, measurable accuracy, the point cloud is little more than a pretty picture.

To measure accuracy with the L1 sensor, we used a project that was mostly hardscape so that specific features could be identified and measured in X, Y, and Z. We utilized 3 control points to localize the project and 45 independent checkpoints to measure accuracy in accordance with ASPRS positional accuracy standards. All points were measured with a survey-grade, dual-band GNSS base-and-rover system with a theoretical precision of 8mm (0.026’) horizontal and 15mm (0.049’) vertical.
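The math behind an ASPRS-style checkpoint test is straightforward to show. The residuals below are made up for illustration; a real test compares dozens of independently surveyed checkpoints against the point cloud, as we did with 45 checkpoints here.

```python
import numpy as np

# Hypothetical checkpoint residuals (surveyed minus lidar-derived
# elevations), in feet.
dz = np.array([0.05, -0.09, 0.11, -0.04, 0.07, -0.10, 0.03, -0.06])

# ASPRS vertical accuracy: RMSEz of the checkpoint residuals, and the
# 95%-confidence non-vegetated vertical accuracy NVA = 1.96 * RMSEz.
rmse_z = np.sqrt(np.mean(dz ** 2))
nva_95 = 1.96 * rmse_z
print(round(rmse_z, 3), round(nva_95, 3))
```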

Accuracy Results

When flying the L1 sensor at an altitude of 200’ above ground, we calculated a vertical RMSE of 0.08’. When flying at 400’, this increased to 0.13’. This is actually better than we expected given DJI’s published specifications for the lidar system; however, it is not quite as good as a photogrammetry system. The Phantom 4 RTK, for example, produced a vertical RMSE of 0.07’ at 200’ and 0.11’ at 400’ above ground. Slightly better than the lidar system, but not by much. Overall, achieving 0.08’ vertical accuracy at 200’ AGL is very good for this system.

Vertical Accuracy (RMSE)

              Phantom 4 RTK    L1 Sensor
  200’ AGL    0.07’            0.08’
  400’ AGL    0.11’            0.13’

But getting to 0.08’ takes some work because the L1, like most lidar sensors, produces some noise and artifacting. The above screenshot is a profile view of a road with curbs on either side, colorized by the scan angle of the lidar sensor (Z values have been exaggerated for visualization). What is clear from this image is that the outer edges of a single lidar swath carry more error relative to the full point cloud. To get the best quality lidar results, a good data processor needs to monitor things like excessive scan angles and the inconsistency from one swath to the next and filter the data accordingly. This type of noise simply does not occur in photogrammetry processing.
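Filtering edge-of-swath returns is one such cleanup step. A minimal sketch, assuming the point cloud carries a per-point scan angle and using a hypothetical 15-degree cutoff (the right threshold depends on the sensor, flight plan, and project tolerances):

```python
import numpy as np

# Synthetic swath: per-point Z and scan angle in degrees. Edge-of-swath
# returns (large |angle|) tend to carry more vertical error, so a common
# cleanup step is to discard them.
z = np.array([50.00, 50.02, 49.98, 50.25, 49.70])
scan_angle = np.array([2.0, -5.0, 8.0, 21.0, -24.0])

max_angle = 15.0  # hypothetical cutoff, not a DJI-recommended value
keep = np.abs(scan_angle) <= max_angle
z_filtered = z[keep]

print(len(z_filtered))  # 3 near-nadir points remain
```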

While the vertical accuracy of the lidar data is very good, the same cannot be said for horizontal accuracy. DJI’s specifications list a horizontal accuracy for the L1 system of 0.3’ at a 160’ flight altitude. However, in our testing, the lidar point cloud produced by DJI Terra showed significant ghosting and artifacting.

In the image above, note the multiple parallel curb lines and how each of the paint stripes is represented multiple times. This ghosting of features in X and Y was present across numerous flights with a number of different settings. While there may be flight settings that reduce or eliminate this type of artifacting, we have not yet found them. So overall, the horizontal accuracy of the L1 sensor is actually somewhere in the 0.5’ to 1’ range, which is far below survey grade.

The above photo is a 2D orthophoto produced from the imagery captured by the L1’s 20MP camera during our lidar test flights.

Thankfully, the L1 sensor has a photogrammetry camera that takes photos while collecting lidar data. When processing the photogrammetry data from the L1 sensor, we achieved a horizontal accuracy of 0.06’, which is completely in line with results from the Phantom 4 RTK at the same flight altitude. That means that, when processed correctly with photogrammetry data for X and Y values and lidar data for Z values, the L1 is capable of getting fantastic results in nearly all circumstances, so long as you have the right workflow to process the data in this way.
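The blended workflow, taking X and Y from the photogrammetry products and Z from the lidar ground points, can be sketched like this. The nearest-neighbor lookup is a simplification for illustration; a real pipeline would sample a TIN or gridded ground surface, and all coordinates below are made up.

```python
import numpy as np

# Hypothetical blended vectorization: planimetric (X, Y) positions come
# from the orthophoto, while Z is borrowed from the lidar ground cloud.
lidar_ground = np.array([
    [10.0, 10.0, 50.10],
    [12.0, 10.0, 50.15],
    [10.0, 12.0, 50.05],
])
ortho_xy = np.array([[10.1, 10.2], [11.9, 9.9]])  # e.g. curb points

def nearest_z(xy, cloud):
    # Nearest-neighbor lookup in XY; real pipelines interpolate a
    # surface instead of snapping to the closest point.
    d2 = np.sum((cloud[:, :2] - xy) ** 2, axis=1)
    return cloud[np.argmin(d2), 2]

blended = np.array([[x, y, nearest_z(np.array([x, y]), lidar_ground)]
                    for x, y in ortho_xy])
print(blended[:, 2])  # Z values borrowed from the lidar cloud
```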

Overall, the data collected with the photogrammetry camera of the L1 sensor alone is nearly identical to the data that comes from the Phantom 4 RTK. This was expected since their specifications are nearly identical to one another. So the benefit of the L1 is that it can gather all the same data that the Phantom 4 RTK can, and a lidar point cloud in addition to that data to enhance accuracy in vegetated areas.

Accuracy Summary

The L1 is fully capable of producing survey-quality accuracy, better than 0.1’ in X, Y, and Z, but the processing workflow matters enormously. Using the raw LAS point cloud produced by DJI Terra would introduce significant error and should be avoided unless accuracy tolerances are very loose. Good accuracy is possible with the L1 sensor. Still, it requires a good post-processing workflow and the ability to merge data from both the photogrammetry sensor and the lidar sensor to produce final results.

Conclusion

Is the L1 sensor worth the time and money?

If you do a lot of projects in vegetated areas, then yes, it just might be. If you don’t have many vegetated projects, then the data quality is no better than a Phantom 4 RTK, and you are probably better off with a system like that. But ultimately, the most important thing to realize is that the data processing workflow is of the utmost importance in extracting valuable data from the L1 sensor. Without powerful point cloud processing software and experience, the value of the L1 sensor deteriorates significantly. And Aerotas would love to be your processing partner in helping you extract the highest quality data from your L1 sensor.

Support More Articles like This

All testing was done by Aerotas personnel on Aerotas owned equipment. This review is not sponsored or subsidized by DJI or any third party in any way. We’ve seen too many “reviews” out there that are actually sponsored by DJI - we are proud to provide you truly independent analysis of the hardware. If you want to consider supporting this type of analysis, the best way is to work with us.

Aerotas provides data processing services to land surveyors and civil engineers throughout the United States. You fly the drone, and Aerotas delivers CAD files created from your data. You’ll get a clean surface, contours, and dozens of other features like curblines, paint striping, utilities, and more. Thousands of high-performing surveyors already use Aerotas to process their drone data. Many surveyors we work with have the ability to process drone data in-house. Because Aerotas offers per-project pricing with no commitment and no contract, we add to your options rather than replacing your in-house processing. Scale up and down based on need.

Please contact us below to learn more!


INCREASED FLEXIBILITY | SAVE TIME | BETTER DELIVERABLES

Contact us anytime at (949) 335-4323, or support@aerotas.com