The amount of memory allocated to a .NET application depends in part on whether the application is running as a 32-bit or a 64-bit process.
On a 32-bit system, every object carries 8 bytes of allocation overhead. For an object to stay alive it must be referenced from somewhere, and a reference takes another 4 bytes, which brings the minimum memory cost of an object's existence to 12 bytes.
On 64-bit systems the situation is worse: the per-object overhead grows to 16 bytes and a reference takes 8 bytes, so every object needs 24 bytes simply to exist.
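A rough way to observe this overhead yourself is to allocate a large number of empty objects and compare `GC.GetTotalMemory` before and after. The sketch below is illustrative (the class name and object count are arbitrary, and the exact figure it prints depends on the runtime version and padding):

```csharp
using System;

class Empty { }   // no fields: everything it occupies is CLR overhead

class OverheadDemo
{
    static void Main()
    {
        const int Count = 1_000_000;
        var keep = new object[Count];   // references that keep the objects alive

        long before = GC.GetTotalMemory(forceFullCollection: true);
        for (int i = 0; i < Count; i++)
            keep[i] = new Empty();
        long after = GC.GetTotalMemory(forceFullCollection: true);

        // The measured cost includes each object's header plus the
        // 4- or 8-byte reference held in the array.
        Console.WriteLine($"Approx. bytes per object (incl. reference): {(after - before) / (double)Count:F1}");
        GC.KeepAlive(keep);
    }
}
```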
If arrays are being created and destroyed a lot, the pattern of allocations and garbage collections can leave large holes in memory (heap fragmentation), which reduces the size of the largest array that can be allocated. An application can therefore gradually run out of memory even though it has no memory leaks and its memory requirements are not otherwise increasing over time.
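This is most visible on the Large Object Heap, where arrays of 85,000 bytes or more are placed and which the GC does not compact by default. One mitigation, assuming .NET Framework 4.5.1 or later (or .NET Core), is to request a one-time LOH compaction; a minimal sketch:

```csharp
using System;
using System.Runtime;

class LohCompaction
{
    static void Main()
    {
        // ... allocate and drop many large arrays here ...

        // Ask the GC to compact the Large Object Heap on the next
        // full collection, closing up the holes left by freed arrays.
        GCSettings.LargeObjectHeapCompactionMode =
            GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();   // triggers the full collection that performs the compaction
    }
}
```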
All objects created by the CLR are subject to this hidden memory cost, which can result in an application using many times more memory than expected. Reducing the number of objects kept in memory at any one time, for example by consolidating data into fewer objects with more fields or by storing bulk data in a few large data structures, is an effective way to increase the capacity and efficiency of a .NET application.
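As an illustration (the type names and count here are invented for the example), storing a million points as an array of structs pays the object header once, for the array itself, whereas a million class instances pay it once per point:

```csharp
using System;

struct PointStruct { public double X, Y; }   // value type: stored inline in arrays
class  PointClass  { public double X, Y; }   // reference type: a separate heap object

class BulkStorageDemo
{
    static void Main()
    {
        const int N = 1_000_000;

        // One heap object per point: N object headers plus N references.
        var asClasses = new PointClass[N];
        for (int i = 0; i < N; i++)
            asClasses[i] = new PointClass();

        // One heap object in total: the array stores the struct data inline,
        // so the per-object overhead is paid once rather than N times.
        var asStructs = new PointStruct[N];

        Console.WriteLine($"{asClasses.Length} class points, {asStructs.Length} struct points allocated.");
    }
}
```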
Disclaimer: The information provided here is based on reading I have done on the internet.