Large Data Set

Posted: Wed Oct 27, 2021 7:35 pm
by TJGRO
I have a large E57 file (110 GB) and I am unable to open it in CC. I have a dedicated 6 TB SSD I am trying to use for virtual memory, but I cannot seem to get CC to see and use the additional space. Any suggestions, tips, or approaches to working with a large data set using virtual memory?

Here are the computer specs:
HP Z6 G4 Workstation
Intel Xeon Gold 6248R CPU @ 3.00 GHz, 2993 MHz, 24 cores
256 GB Physical RAM

Thanks,
TJGRO

Re: Large Data Set

Posted: Thu Oct 28, 2021 7:30 pm
by daniel
Is the error "out of memory" or is it something else?

Opening such a monster with CloudCompare is challenging anyway: even if it fits in memory, the interaction might be a little bit slow ;)

One option is to subsample the cloud via the command line first.
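For reference, CloudCompare has a headless command-line mode that can do this. A minimal sketch (file names and the 1 cm spacing are illustrative; see the "Command line mode" page on the CloudCompare wiki for the full option list):

```shell
# Sketch: spatially subsample a huge E57 from the command line.
# -SILENT        : no GUI / progress dialogs
# -O             : open the input cloud
# -C_EXPORT_FMT  : save the result as CloudCompare's compact BIN format
# -SS SPATIAL 0.01 : keep roughly one point per 1 cm
CloudCompare -SILENT -O huge_cloud.e57 \
  -C_EXPORT_FMT BIN \
  -SS SPATIAL 0.01 \
  -SAVE_CLOUDS
```

Note that the command-line process still has to stream the input file, so this avoids the GUI's memory pressure but not disk I/O; a coarser spacing (e.g. 0.05) produces a much smaller output you can then open interactively.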

Re: Large Data Set

Posted: Fri Oct 29, 2021 5:15 pm
by TJGRO
Runs out of memory.

How do you subsample via command line?

Re: Large Data Set

Posted: Mon Nov 01, 2021 8:41 pm
by daniel

Re: Large Data Set

Posted: Wed Jan 19, 2022 2:18 am
by kdgrover
Is there a way to only partially load a structured E57 file? For example, seeing a tree view of the scans and loading just the selected ones?

Kevin

Re: Large Data Set

Posted: Thu Jan 20, 2022 8:37 am
by daniel
Not in the current implementation.
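For anyone landing on this thread: outside CloudCompare, the third-party `pye57` Python library can read individual scans from a structured E57 without pulling the whole file into memory. A minimal sketch (library and file path are assumptions, not part of CloudCompare):

```python
# Sketch: load only selected scans from a structured E57 file
# using the third-party pye57 library (file path is illustrative).
import pye57

e57 = pye57.E57("large_scan.e57")
print("scans in file:", e57.scan_count)

# Read just the first scan; points come back as numpy arrays.
data = e57.read_scan(0, ignore_missing_fields=True)
x, y, z = data["cartesianX"], data["cartesianY"], data["cartesianZ"]
```

One could then save each selected scan to a smaller per-scan file and open those in CloudCompare individually.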