M3C2 with large datasets


Post by ebash »

I am working with large point clouds from UAV imagery (approximately 230M points). I want to use the M3C2 plugin to look at small changes between point clouds when different ground control points are used to orient the imagery. When I try to run the tool, however, it is still stuck on the calculation after a day and I have to force quit. Am I working with point clouds that are simply too large? Is there a way to change the settings to make the plugin run more efficiently? Or to split the clouds into chunks and then merge the results?
Thank you,
Eleanor

My machine:
Intel i7-10750H
16 GB RAM
NVIDIA GeForce GTX 1650
Re: M3C2 with large datasets

Post by daniel »

The issue may lie more with the parameters: did you use the 'guess' button to get approximate values for them? And does your cloud already have normals, or are you letting the plugin compute them? Last but not least, it's generally a good idea to test the parameters on a sub-sampled version of the cloud first (see the 'core points' option).
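
If you'd rather script that kind of test, the plugin can also be driven in command-line mode. Here's a minimal, untested sketch from Python (it assumes a build with the qM3C2 plugin; "m3c2_params.txt" is a parameter file previously saved from the M3C2 dialog, and the file names and subsampling distance are placeholders to adapt):

```python
# Untested sketch: drive CloudCompare's command-line mode from Python to test
# M3C2 on spatially subsampled copies of both clouds before attempting the
# full 230M-point run. All file names are placeholders.
import subprocess

cc = "CloudCompare"  # or the full path to the executable

subprocess.run([
    cc, "-SILENT",
    "-O", "epoch1.las",          # placeholder cloud #1
    "-O", "epoch2.las",          # placeholder cloud #2
    "-SS", "SPATIAL", "0.2",     # keep ~1 point per 0.2 m (much lighter test)
    "-M3C2", "m3c2_params.txt",  # should then run on the subsampled clouds
], check=True)
```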

Don't hesitate to post snapshots of your cloud and of the parameters.
Daniel, CloudCompare admin

Re: M3C2 with large datasets

Post by ebash »

Thanks! Somehow I missed your reply... I did use the 'guess' button for the parameters, and the plugin is computing the normals. The guessed parameters already use a subsampled cloud. I will try to post some screenshots tomorrow; right now my computer is tied up with other processing and won't have enough memory to load the clouds.

Re: M3C2 with large datasets

Post by ebash »

Here is a screenshot after guessing the parameters. Is the calculation slowed down more by the number of points or by their density? The point cloud is very dense: 350 pts/m².

https://drive.google.com/file/d/0B3Ul_G ... sp=sharing
Thank you,
Eleanor


Re: M3C2 with large datasets

Post by daniel »

Wow, 400,000 points per cell at level 7?! I wonder why the plugin chose this configuration... It's obviously a glitch. You should greatly reduce the diameter (0.5 m seems more than enough, and may already be too much).
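
For scale, here's the back-of-the-envelope math (assuming a roughly uniform density): a neighborhood of diameter D on a cloud of density ρ contains about ρ·π·(D/2)² points. At your 350 pts/m², a 0.5 m diameter grabs only ~70 points, while 400,000 points implies a diameter of nearly 40 m:

```python
# Back-of-the-envelope check: expected neighbors per neighborhood of
# diameter D, for a roughly uniform planar density (numbers from this thread).
import math

density = 350.0  # pts/m^2

for diameter in (0.5, 1.0, 38.0):  # ~38 m is what 400,000 points implies here
    n = density * math.pi * (diameter / 2.0) ** 2
    print(f"D = {diameter:4.1f} m  ->  ~{n:,.0f} points per neighborhood")
```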

On my side, I'll have to investigate this issue. Is it possible for you to share the cloud with me? (If so, you can send a link to cloudcompare [at] danielgm.net.)
Daniel, CloudCompare admin

Re: M3C2 with large datasets

Post by ebash »

Daniel,
I think the problem with running the plugin lies in my data: I was able to run the plugin successfully on a different dataset of similar size. I am going to keep looking into it after working with the results from the second dataset for a while.

I wonder if there is a way, within CloudCompare, to compare the M3C2 distance to actual measured change stored in a shapefile or text file. Or to export the M3C2 distance in a format readable by ArcMap? When I try to export a .las file, the scalar fields get ignored.
Thank you,
Eleanor


Re: M3C2 with large datasets

Post by daniel »

Indeed, on my side I found that the plugin was overestimating the normal radius in some cases (when you click on the 'Guess parameters' button). I fixed it.

And there's no simple way to compare the M3C2 distances with other distances inside CloudCompare. But you could export them as a raster (with the Rasterize tool). With version 2.8 you should be able to generate the raster with the M3C2 distances as the active 'layer' and then export it to a GeoTIFF file.
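
That said, once the distances are exported it's easy to script the comparison outside CloudCompare. A minimal sketch (the file names and column layouts are assumptions to adapt to your own exports; saving the M3C2 result with 'File > Save as' in ASCII format keeps the scalar fields):

```python
# Minimal sketch: match each field measurement to its nearest M3C2 core point
# (in XY) and difference the two change values. File names and column layouts
# are assumptions; check the header of your own ASCII export.
import numpy as np
from scipy.spatial import cKDTree

# CloudCompare ASCII headers start with '//', hence the comments argument
m3c2 = np.loadtxt("m3c2_result.asc", comments="//")  # x y z ... M3C2 dist (assumed last)
measured = np.loadtxt("measured.txt")                # x y measured_change

tree = cKDTree(m3c2[:, :2])             # index the core points in XY
dist, idx = tree.query(measured[:, :2])

residual = measured[:, 2] - m3c2[idx, -1]  # measured minus M3C2 distance
print(f"n = {len(residual)}, mean = {residual.mean():.3f} m, "
      f"std = {residual.std():.3f} m")
```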

P.S.: technically, it's also possible to export any scalar field as the 'Intensity' field of a LAS file. To do this, make sure that no other scalar field in your cloud is named 'Intensity', and rename your own scalar field 'Intensity' (mind the capital 'I'). But sadly the 'Intensity' field of a LAS file is limited to positive integer values between 0 and 65535, so it's not very practical...
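
If you do try that route, the signed distances first have to be shifted and rescaled into the 0-65535 range (and the offset/scale noted so the values can be mapped back in ArcMap). A sketch of the mapping, where "m3c2_dist.txt" is a hypothetical one-column export of the scalar field:

```python
# Sketch of the rescaling the 'Intensity' trick needs: LAS intensity is an
# unsigned 16-bit integer, so signed M3C2 distances must be shifted and
# scaled before export (keep the offset/scale to invert the mapping later).
import numpy as np

distances = np.loadtxt("m3c2_dist.txt")  # hypothetical 1-column export

d_min, d_max = distances.min(), distances.max()
scale = 65535.0 / (d_max - d_min)
intensity = np.round((distances - d_min) * scale).astype(np.uint16)

print(f"to recover distances: dist = intensity / {scale:.6f} + {d_min:.6f}")
```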
Daniel, CloudCompare admin