
How to generate composite texture file for mesh from multiple scalar fields

Posted: Tue Jun 25, 2024 11:22 am
by joshua19
Hi, I am trying to make custom textures for my meshes. My goal is to make grayscale textures that label the meshes according to several properties:
  1. inclination (presumably this can come from dip/dip direction)
  2. planarity
  3. roughness
Can someone let me know how to combine these into a single metric and then create a texture that I can apply to the original mesh? Is there a way to compute geometric features directly on the mesh, instead of using a point cloud?
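
To make the combination step concrete, here is a minimal sketch of what I have in mind, assuming the three quantities already exist as per-point scalar fields (e.g. in an ASCII/CSV export from CloudCompare). The column names, weights, and file names are only placeholders:

```python
import numpy as np
import pandas as pd

# Placeholder file: an ASCII/CSV export of the point cloud with its scalar fields.
# Column names are assumptions -- adjust them to match the actual export header.
pts = pd.read_csv("cloud_with_features.csv")

def normalize(x):
    """Rescale a scalar field to [0, 1] (constant fields map to 0)."""
    x = np.asarray(x, dtype=float)
    rng = x.max() - x.min()
    return np.zeros_like(x) if rng == 0 else (x - x.min()) / rng

# Inclination taken from the dip angle (0-90 degrees); planarity and roughness
# as computed by CloudCompare's geometric features (or equivalent).
inclination = normalize(pts["Dip (degrees)"])
planarity   = normalize(pts["Planarity"])
roughness   = normalize(pts["Roughness"])

# Illustrative weights for the composite metric -- tune to taste.
w_incl, w_plan, w_rough = 0.4, 0.3, 0.3
gray = w_incl * inclination + w_plan * planarity + w_rough * roughness

# Store the composite as grayscale RGB so it can be visualized and, later,
# transferred back to the mesh (e.g. as per-vertex colors).
gray_u8 = (255 * gray).astype(np.uint8)
pts["R"], pts["G"], pts["B"] = gray_u8, gray_u8, gray_u8
pts.to_csv("cloud_composite_gray.csv", index=False)
```

The open question for me is the last step: turning those per-point grayscale values back into an actual texture file for the original mesh.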

In some cases I start from meshes that come from photogrammetry via WebODM, and in some cases I start from LiDAR point clouds, but I would like a consistent method for processing both automatically. In both cases I need to end up with a mesh that has an RGB texture representing its actual appearance, plus the custom texture I describe above.

The point clouds from WebODM (from my photogrammetry surveys) are very fuzzy compared to the point clouds from the LiDAR surveys, so it is difficult to get good results when working on them directly. I have found that it might be better to run the calculations on point clouds sampled from the meshes that WebODM exports, since those meshes look more realistic. Similarly, the LiDAR point clouds come from drone surveys and have overlap areas with twice the normal point density, which especially affects roughness and planarity. For those, I have found that it might be better to do a Poisson reconstruction, sample the resulting mesh, and then run the calculations on the resulting point cloud.
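
To be explicit about what I would compute on those derived clouds: I take planarity and roughness from the eigenvalues of each point's local covariance matrix, roughly as in the NumPy/SciPy sketch below (the search radius is a placeholder to match the point spacing, and I believe this only approximates what CloudCompare's "Compute geometric features" tool does):

```python
import numpy as np
from scipy.spatial import cKDTree

def local_features(points, radius=0.05):
    """Per-point planarity and roughness from local covariance eigenvalues.

    points : (N, 3) array; radius : neighborhood size (placeholder -- use a few
    times the mean point spacing). Planarity = (l2 - l3) / l1 with l1 >= l2 >= l3;
    roughness = distance of the point to the best-fit plane of its neighborhood.
    """
    tree = cKDTree(points)
    planarity = np.zeros(len(points))
    roughness = np.zeros(len(points))
    for i, p in enumerate(points):
        idx = tree.query_ball_point(p, radius)
        if len(idx) < 4:                      # too few neighbors for a plane fit
            continue
        nbrs = points[idx]
        centroid = nbrs.mean(axis=0)
        cov = np.cov((nbrs - centroid).T)
        evals, evecs = np.linalg.eigh(cov)    # ascending: l3 <= l2 <= l1
        l3, l2, l1 = evals
        if l1 > 0:
            planarity[i] = (l2 - l3) / l1
        normal = evecs[:, 0]                  # direction of smallest local variance
        roughness[i] = abs(np.dot(p - centroid, normal))
    return planarity, roughness
```

In practice a plain Python loop like this is slow on millions of points, so I would still rather have CloudCompare compute the features on the sampled cloud; the sketch is only to pin down the definitions I care about.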

Is it good practice/reasonable to create derived point clouds in this way? Are there downsides?

Re: How to generate composite texture file for mesh from multiple scalar fields

Posted: Tue Jun 25, 2024 8:25 pm
by daniel
So meshes are definitely not CloudCompare's forte. There's no way to create textures, for instance.

So indeed, if you want to work with CloudCompare, you'd better sample points on the mesh (at least the texture colors will be transferred to RGB colors on the cloud).
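
If you want to script that sampling step outside of CloudCompare, something along these lines should work with open3d (just an illustrative alternative; the file name and point count are placeholders, and whether texture colors, as opposed to per-vertex colors, are carried over to the samples depends on the library version, so that part is worth checking):

```python
import open3d as o3d

# Placeholder file name: a textured mesh exported by WebODM (e.g. OBJ + MTL + texture).
mesh = o3d.io.read_triangle_mesh("webodm_mesh.obj", enable_post_processing=True)

# Sample points on the mesh surface; vertex attributes (normals, colors) are
# interpolated onto the samples. The point count is a placeholder.
pcd = mesh.sample_points_uniformly(number_of_points=1_000_000)
o3d.io.write_point_cloud("sampled_from_mesh.ply", pcd)
```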

And maybe, to solve your density issue, it would be easier/faster to simply subsample the LiDAR cloud (with Edit > Subsample).
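
If you want to script that step too, a voxel-based downsample gives a similar effect to the "Space" mode of Edit > Subsample (again just an open3d sketch; the file names and voxel size are placeholders to tune to your mean point spacing):

```python
import open3d as o3d

# Placeholder file name; convert LAS/LAZ to PLY first (e.g. in CloudCompare),
# since open3d does not read LAS directly.
pcd = o3d.io.read_point_cloud("lidar_cloud.ply")

# Keep roughly one point per 5 cm cell to even out the density in overlap areas.
pcd_down = pcd.voxel_down_sample(voxel_size=0.05)
o3d.io.write_point_cloud("lidar_uniform.ply", pcd_down)
```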

Re: How to generate composite texture file for mesh from multiple scalar fields

Posted: Thu Jun 27, 2024 8:39 pm
by DA523
But Poisson reconstruction generates a new point cloud which is not identical to the original point cloud.
Also, the Poisson reconstruction plugin in CC does not color the mesh with the point cloud's colors the way the Delaunay tool does, does it?

Re: How to generate composite texture file for mesh from multiple scalar fields

Posted: Sun Jun 30, 2024 8:48 am
by daniel
I thought the original RGB colors were transferred/interpolated by PoissonRecon. Isn't that the case anymore?