FC57: Very large object and 889 million object challenge
Can your memory dump analyzer handle very large objects?
DebugDiag 2.2 crashed on a dump with 500 million objects because an internal array grew too large (> 2 GB). .NET has a setting to enable very large objects in the 64-bit world. You just need to add the following to your config file:
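The switch in question is `gcAllowVeryLargeObjects`. A typical app.config entry (assuming a .NET Framework application running as a 64-bit process) looks like this:

```xml
<configuration>
  <runtime>
    <!-- Allow arrays larger than 2 GB in total size on 64-bit processes.
         The maximum number of elements per array is still limited. -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```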
Now we can run its code again. It completes the analysis after a rather long time and a lot of memory usage. I took a dump after it finished the analysis. The dump has 889 million objects:
There are 501 million ClrObject objects and 388 million free objects. The high ratio of free objects is normally a sign of not having enough Gen2 foreground GCs to really clean up memory.
Notice that the largest array is at 4 GB; this is the correct information.
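To reproduce numbers like these yourself, you can walk the heap with ClrMD (Microsoft.Diagnostics.Runtime). The sketch below is a minimal illustration, not the analyzer's actual code; it assumes the ClrMD 2.x API and a hypothetical dump path, counts free versus live objects, and tracks the largest object using 64-bit sizes:

```csharp
using System;
using Microsoft.Diagnostics.Runtime;

class HeapStats
{
    static void Main()
    {
        // Hypothetical dump path; replace with the actual .dmp file.
        using DataTarget target = DataTarget.LoadDump(@"C:\dumps\debugdiag_889m.dmp");
        ClrRuntime runtime = target.ClrVersions[0].CreateRuntime();

        long liveCount = 0, freeCount = 0;
        ulong largestSize = 0;

        foreach (ClrObject obj in runtime.Heap.EnumerateObjects())
        {
            // Free blocks show up as objects of the special "Free" type.
            if (obj.Type?.IsFree == true)
                freeCount++;
            else
                liveCount++;

            // Sizes must be tracked as 64-bit values; a 4 GB array
            // does not fit in a 32-bit int.
            ulong size = obj.Size;
            if (size > largestSize)
                largestSize = size;
        }

        Console.WriteLine($"Live objects:   {liveCount:N0}");
        Console.WriteLine($"Free objects:   {freeCount:N0}");
        Console.WriteLine($"Largest object: {largestSize:N0} bytes");
    }
}
```

Keep in mind that enumerating 889 million objects this way is itself a long, memory-hungry operation.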
Now we can use the new dump to challenge our .NET dump analyzers again. PerfView only scans 132 million objects and skips the 4 GB array, so its data is not accurate. Visual Studio seems to finish scanning the dump but reports back very little information. Maybe it failed silently.
My modified version of PerfView finished scanning the whole thing and even generated a usable .gcDump file:
Here are the type statistics in the log file:
Notice there is a mistake here: PerfView's analysis code uses a 32-bit integer to represent object sizes, so it can't handle very large objects properly. The largest value it can store is int.MaxValue.
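The effect is easy to demonstrate in isolation. The snippet below is not PerfView's code, just a small sketch of what happens when a 4 GB object size is forced through a 32-bit integer:

```csharp
using System;

class SizeOverflowDemo
{
    static void Main()
    {
        // A 4 GB array of doubles: 500 million elements * 8 bytes each.
        ulong actualSize = 500_000_000UL * 8UL;

        // Correct: 64-bit arithmetic keeps the full value.
        long asLong = (long)actualSize;

        // Wrong: squeezing the size into a 32-bit int either overflows
        // or has to be clamped to int.MaxValue (2,147,483,647).
        int asInt = unchecked((int)actualSize);
        int clamped = (int)Math.Min(actualSize, (ulong)int.MaxValue);

        Console.WriteLine($"Actual size : {actualSize:N0} bytes");
        Console.WriteLine($"As long     : {asLong:N0}");
        Console.WriteLine($"As int      : {asInt:N0}");   // garbage after overflow
        Console.WriteLine($"Clamped     : {clamped:N0}"); // capped at int.MaxValue
    }
}
```

Either way the reported size of the 4 GB array is wrong, which matches the incorrect figure in the type statistics above.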
More work is still needed in very large object handling.