From LIDAR point cloud to archaeological DSM

Kristina Georgieva
Towards Data Science
6 min read · Nov 24, 2020


In my previous article I gave an introduction to LIDAR for archaeology. This article aims to take you one step further. It shows an example of processing real world LIDAR point clouds for archaeology, in order to create a digital surface model (DSM).

In today’s example we will be looking at LIDAR data for Tikal, a Mayan city located in Guatemala. This city flourished between 200 and 850 A.D. and became home to Temple IV, the tallest pre-Columbian structure in the Americas.

We will be going from site, to point cloud, to digital surface model (DSM) in a few steps. Well, today we will skip the first step of actually generating a point cloud as we would need some equipment as well as the ability to go outside for that, but you get the point (pun intended).

Three images represent the process: the first a photo of Tikal, the second the point cloud, the third the resulting DSM
Tikal is mapped to a point cloud using LIDAR, then the LIDAR point cloud is converted to a Digital Surface Model (DSM)

Preparation

In order to get started, you will need to set up LAStools (and wine, if you are on a Mac) and have the LIDAR data you want to process downloaded.

Some things to take note of before you start:

  • You only need wine if you are on a Mac. If you are on Windows then you don’t need wine.
  • If you are on Windows, references to the environment variables use opening and closing ‘%’ instead of a leading ‘$’.
  • If you are on macOS Catalina, “wine” will be “wine64” instead.
  • LAS_DIR refers to the directory where you have downloaded LAStools to.
  • HIVE_DIR is the directory where your data is (see the example just after this list).
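
For example, on a Mac or Linux shell you might set these two variables as follows (the paths are placeholders; substitute your own):

export LAS_DIR=/path/to/LAStools/bin    # placeholder: wherever you unpacked the LAStools binaries
export HIVE_DIR=/path/to/your/workspace # placeholder: wherever your data lives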

Processing

You are now ready to start. The first thing that we need to do is convert the .e57 file to a .las (or .laz) file for us to be able to process it further. We can do this as follows:

wine $LAS_DIR/e572las -v -i $HIVE_DIR/data/tikal/Tikal.e57 -o $HIVE_DIR/data/tikal/Tikal.laz

If we want to look at the point cloud we can use lasview to do so:

wine $LAS_DIR/lasview -i $HIVE_DIR/data/tikal/Tikal.laz

In the case of the Tikal data, some colouring has been added to the points to make the items identifiable. We can see this in the screenshot below; however, other sources would likely not have such colouring.

3D model of Tikal represented in dots. Green dots represent trees and grass, beige ones represent buildings.
Point cloud visualisation

You can find out information about the point cloud by running lasinfo.

wine $LAS_DIR/lasinfo -i $HIVE_DIR/data/tikal/Tikal.laz

And if you want to save the information to a file, you can do so as follows:

wine $LAS_DIR/lasinfo -i $HIVE_DIR/data/tikal/Tikal.laz -odix _info -otxt

This will output all kinds of information about your point cloud, including the number of returns, classifications, angles, etc. You can run this command at various points in your pipeline to analyse the changes each step introduces.

Screenshot of terminal showing information about the point cloud such as what classes exist in it, number of returns, etc
Example output of lasinfo

We can see in the above information that none of the points have been classified. In order for us to clean up the point cloud to have a nice DSM at the end, we need to know what kind of surface each point represents, and therefore need to classify these. The first step to classification is identifying which points form part of the ground, which we can do as follows:

wine $LAS_DIR/lasground -i $HIVE_DIR/data/tikal/Tikal.laz -archeology -o $HIVE_DIR/data/tikal/Tikal_ground.laz

Note that we use the “archeology” option. This applies default parameter values that in general work well for archaeology. The lasground command has other options that allow you to tune the process further, should your use case need it. Next, we need to calculate the height of every other point relative to the ground using lasheight.

wine $LAS_DIR/lasheight -i $HIVE_DIR/data/tikal/Tikal_ground.laz -o $HIVE_DIR/data/tikal/Tikal_height.laz

And now, we are finally ready to classify! This can be as easy as the previous steps if we use lasclassify.

wine $LAS_DIR/lasclassify -i $HIVE_DIR/data/tikal/Tikal_height.laz -small_buildings -small_trees -step 1.0 -o $HIVE_DIR/data/tikal/Tikal_classify.laz

The extra parameters “small_buildings” and “small_trees” tell the process to not exclude small buildings and trees from the classification. Without these options, the algorithm would leave them as “unclassified”. The option “step” refers to the step size in meters, which affects the thresholds used by the classification algorithm and is tuned based on how dense your cloud is.

Now that the points are classified, we are ready to start working on the DSM. Before we go further with these steps, let’s see what a DSM would look like without cleaning any of the data.
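
A DSM like the one below can be produced by pointing blast2dem, the same tool we will use at the end of this post, directly at the classified file, along these lines (a sketch; the output name is hypothetical):

wine $LAS_DIR/blast2dem -i $HIVE_DIR/data/tikal/Tikal_classify.laz -hillshade -o $HIVE_DIR/data/tikal/Tikal_dem_uncleaned.asc # hypothetical output name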

Greyscale DSM before cleaning. It looks grainy, vegetation covers multiple buildings, and the pyramids at each side are only more or less visible
Uncleaned DSM

That’s not very nice. It is grainy, only the large buildings are more or less identifiable, and it reveals nothing that we couldn’t see in the point cloud version of the same site. We would have a better time with a high-definition satellite image than with this DSM.

The first step to fixing this is to remove noise. We can think of noise as points classified as one thing within a large group of points classified as another. We do this cleaning as follows:

wine $LAS_DIR/lasnoise -i $HIVE_DIR/data/tikal/Tikal_classify.laz -isolated 15 -step 2.0 -remove_noise -o $HIVE_DIR/data/tikal/Tikal_classify_no_noise.laz

The parameter “isolated” states how many points need to be alone within the “step” size of 2 meters in order to be considered noise. For denser clouds, or larger step sizes, this value would likely be larger than for sparser ones. The parameter “remove_noise” removes the points identified as noise from the point cloud. Without the “remove_noise” parameter, the resulting point cloud would still contain the noise, simply classified as such.

In order to create the DSM, we want to reduce the number of points that are being drawn. This helps us produce a smoother and less grainy image. To do this, we use the thinning process, tuning the step size to smooth it further.

wine $LAS_DIR/lasthin -i $HIVE_DIR/data/tikal/Tikal_classify_no_noise.laz -step 0.2 -o $HIVE_DIR/data/tikal/Tikal_classify_thin.laz

In our use case, we want to keep only points classified as building and ground. Therefore, we need to drop classes 5 and 1, which are “vegetation” and “unclassified” respectively. Of course, depending on your site, you might need to drop other classes too. You can see what types of points exist in your laz file by running lasinfo after lasclassify. The classes are numeric codes, and you can see what each one means in the ASPRS LAS specification.
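
For example, rerunning lasinfo on the classified file (the same command as before, just with a different input) shows the class counts:

wine $LAS_DIR/lasinfo -i $HIVE_DIR/data/tikal/Tikal_classify.laz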

wine $LAS_DIR/las2las -i $HIVE_DIR/data/tikal/Tikal_classify_thin.laz -drop_class 5 1 -o $HIVE_DIR/data/tikal/Tikal_clean_class.laz

Finally, we create our digital surface model using blast2dem:

wine $LAS_DIR/blast2dem -i $HIVE_DIR/data/tikal/Tikal_clean_class.laz -hillshade -scale 2.0 -step 0.3 -kill 10 -o $HIVE_DIR/data/tikal/Tikal_dem2.asc -v

The parameter “hillshade” adds shading to the resulting image, “kill” excludes triangles with edges longer than 10 in this case, and “scale” multiplies all elevation values by the given amount. All of these parameters are tuned by experimenting until you are satisfied with the resulting DSM, which in our case we can see below.

Cleaned DSM. Looks smooth, clear, pyramids well visible, vegetation removed, additional stairs and buildings revealed
Clean DSM

This is a much more pleasant and smooth DSM. It reveals parts of buildings previously covered by vegetation, new staircases at the top, and smaller buildings that were previously not visible next to the right hand side pyramid.
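
Before wrapping up, here is the full sequence of commands in one place, the same ones used above, so you can follow the pipeline end to end (adjust the paths, the wine prefix, and the parameters to your own setup):

wine $LAS_DIR/e572las -v -i $HIVE_DIR/data/tikal/Tikal.e57 -o $HIVE_DIR/data/tikal/Tikal.laz
wine $LAS_DIR/lasground -i $HIVE_DIR/data/tikal/Tikal.laz -archeology -o $HIVE_DIR/data/tikal/Tikal_ground.laz
wine $LAS_DIR/lasheight -i $HIVE_DIR/data/tikal/Tikal_ground.laz -o $HIVE_DIR/data/tikal/Tikal_height.laz
wine $LAS_DIR/lasclassify -i $HIVE_DIR/data/tikal/Tikal_height.laz -small_buildings -small_trees -step 1.0 -o $HIVE_DIR/data/tikal/Tikal_classify.laz
wine $LAS_DIR/lasnoise -i $HIVE_DIR/data/tikal/Tikal_classify.laz -isolated 15 -step 2.0 -remove_noise -o $HIVE_DIR/data/tikal/Tikal_classify_no_noise.laz
wine $LAS_DIR/lasthin -i $HIVE_DIR/data/tikal/Tikal_classify_no_noise.laz -step 0.2 -o $HIVE_DIR/data/tikal/Tikal_classify_thin.laz
wine $LAS_DIR/las2las -i $HIVE_DIR/data/tikal/Tikal_classify_thin.laz -drop_class 5 1 -o $HIVE_DIR/data/tikal/Tikal_clean_class.laz
wine $LAS_DIR/blast2dem -i $HIVE_DIR/data/tikal/Tikal_clean_class.laz -hillshade -scale 2.0 -step 0.3 -kill 10 -o $HIVE_DIR/data/tikal/Tikal_dem2.asc -v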

That’s it, folks: that’s how you can get started with LAStools to process archaeological LIDAR data. I am sure that professionals in this field can get an even nicer DSM, but I hope that you have learned at least some basics through this post. If you want to go further, there are other point cloud processing tools worth exploring as well.

Happy geospatial digging!

Originally published at http://computationalarchaeology.wordpress.com on November 24, 2020.

