I am doing basic LB radiation and sunlight-hours analysis on a project for multiple days (the 21st of March, June, September, and December). The model is quite big, and the radiation study alone takes about 3 hours to run at the desired grid density.
My question is: is there a way I can save the results for future editing? In simple terms, I want to use Legend Parameters to change the display (gradient, low/high bounds, etc.) when required. I have been searching the forum but could not find anything.
What I tried is to bake the mesh and re-select it in GH, copy the sunlight-hours results for that mesh into Excel, and paste them back into a fresh panel. I then plugged the mesh and the results panel into Custom Preview with a gradient, but the preview doesn't seem to match the original results mesh (test sample attached).
So is there any way I can save my analysis results and use them in the future to process as required? I don't want to re-run the analysis every time.
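One generic workaround, independent of any Ladybug component, is to dump the per-point results to a file once and reload them later instead of re-running the study. Below is a minimal Python sketch of the idea; the file path, function names, and flat-list result format are my own assumptions for illustration, not an LB API.

```python
import csv
import os
import tempfile

def save_results(path, values):
    # Write one analysis value per row so the expensive run never repeats.
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for v in values:
            writer.writerow([v])

def load_results(path):
    # Read the values back as floats, preserving the original point order.
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f)]

# Hypothetical sunlight-hours values, one per analysis point
sunlight_hours = [4.5, 6.0, 8.25, 0.0]
path = os.path.join(tempfile.gettempdir(), "sunlight_hours.csv")
save_results(path, sunlight_hours)
print(load_results(path))  # [4.5, 6.0, 8.25, 0.0]
```

In a GhPython component you could then plug the reloaded list, together with the baked mesh, into whatever recolouring logic you use, without touching the 3-hour simulation again.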
Did you try using the native Grasshopper Data component? You can feed your results/mesh into a Data component and then internalise the data for future use (right-click the component to internalise). You can save the results of every iteration this way.
Yes, I did. I don't know if I am doing something wrong, but the default gradient doesn't seem to go away. I attached a sample image where I have a greyscale gradient, but Custom Preview still shows the original colours.
What Devang said does the job. A slightly more advanced approach is to attach the data to the mesh as UserDictionary or UserData, so you can read it back from Rhino and then use Recolor Mesh to change the colouring. We have an open issue for this but haven't had the chance to implement it yet: https://github.com/mostaphaRoudsari/ladybug/issues/53
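For the recolouring step itself, the core of what Legend Parameters does is simply a remap of each result value to a colour between the low and high bounds. A minimal, Rhino-independent Python sketch of that logic (the two-colour gradient and the clamping behaviour are assumptions for illustration):

```python
def lerp(a, b, t):
    # Linear interpolation between two numbers.
    return a + (b - a) * t

def value_to_color(value, low, high, c_low=(0, 0, 255), c_high=(255, 0, 0)):
    # Map a result value to an RGB colour between two legend bounds;
    # values outside [low, high] are clamped to the nearest bound.
    t = (value - low) / float(high - low)
    t = max(0.0, min(1.0, t))
    return tuple(int(round(lerp(c_low[i], c_high[i], t))) for i in range(3))

# Hypothetical sunlight-hours results recoloured with custom bounds of 0-8 h
results = [0.0, 4.0, 8.0, 12.0]
colors = [value_to_color(v, 0, 8) for v in results]
print(colors[0], colors[-1])  # (0, 0, 255) (255, 0, 0)
```

Inside a GhPython component you would assign one such colour per mesh vertex (e.g. via the mesh's VertexColors) instead of printing them, which gives you full control over the gradient and bounds without re-running the analysis.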
Hi Devang, the GH definition is pretty straightforward. I cannot share the model since it's a company project, but I have made something similar and internalised it in a GH file.
I copied the floors 10 times and ran the analysis. It works in this model I made from scratch, but somehow if I select, say, 15 floors in the original model, it just doesn't work.
I could not recreate the error with this file; it works as you said. Can you not simplify your analysis geometry, say to one continuous envelope (mesh or surface) per 4-meter floor height? I believe that should help.
What's troubling me is that when I select 9 floors and run, it doesn't work, but when I select three of them at a time and run in batches, it works.
Right now that is how I am managing it: saving the results bit by bit.
The total number of data points for the tower is around 150,000. I guess it's too much for GH/LB to handle at once.
Hi Abraham,
There were memory issues a couple of times, but in those cases Rhino pops up a dialog box saying it is out of memory and then closes. The machine is pretty tough (Intel Xeon, 500 GB SSD, 32 GB RAM, 4 GB NVIDIA Quadro).
Right now, during the heavy whole-tower analysis, CPU usage sometimes goes up to 95%, but memory usage stays between 8 and 9 GB. So the CPU does crank up to its maximum, I guess. How do I solve this? Any ideas?
I increased the grid size to 2000 and tried the whole tower; it still doesn't work, but if I leave out a couple of floors it does. I guess it's a quantity issue.
Recreating the geometry is one way I can solve this. It will be a bit of extra work for me, especially since the architects send a new model every now and then, but I think I will have to try it.
Unless you have a good reason to do so, I would not run the analysis in millimeters. The tolerance in Rhino is relative to the model unit and is 0.001 units by default. Running in millimeters therefore makes Rhino carry out the calculations to an accuracy of 10⁻⁶ meters, which is far more precision than you need. Adjusting the tolerance, or changing the units to meters, should boost the performance.
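To put a number on that, here is a quick sketch assuming the default relative tolerance of 0.001 units:

```python
def absolute_tolerance_m(unit_in_meters, relative_tolerance=0.001):
    # Absolute tolerance in meters for a Rhino model in the given unit:
    # 0.001 units in a mm model is a far tighter target than in a m model.
    return relative_tolerance * unit_in_meters

mm_model = absolute_tolerance_m(0.001)  # millimeter model
m_model = absolute_tolerance_m(1.0)     # meter model
print("mm model tolerance: %g m" % mm_model)
print("m model tolerance:  %g m" % m_model)
```

In other words, the millimeter model asks the intersection calculations for roughly a thousand times more precision than the meter model, for no analytical benefit at this scale.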