How to determine which value types are initialized by the CLR
This applies to C# and the .NET Framework.
Your best bet is not to explicitly initialize fields of the following types in the .NET Framework, because the CLR initializes them for you:
int, bool, and the other primitive value types,
and the same goes for setting reference-type fields to null (I believe).
For example, you don't need to do this, and some argue it is even a small performance hit (whether it matters is debatable):
int Myvar = 0;
You just need:
int Myvar;
The CLR initializes it to the default value for int.
I obviously just "know" from programming that int defaults to 0 and bool to false, and likewise that reference types default to null, since the CLR does that for you. But how can you determine what the default values of these primitive types actually are? I tried opening Reflector to look at Int32 and Boolean, but couldn't figure out how they are initialized by default.
I looked on MSDN and couldn't find it either; maybe I just missed it.
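One way to see the defaults without a debugger or Reflector is to ask the compiler directly: the default(T) expression yields exactly the zero-initialized value the CLR would give a field of type T. A minimal sketch:

```csharp
using System;

class DefaultValues
{
    static void Main()
    {
        // default(T) is the CLR's zero-initialized value for any type T
        Console.WriteLine(default(int));              // 0
        Console.WriteLine(default(bool));             // False
        Console.WriteLine(default(double));           // 0
        Console.WriteLine(default(char) == '\0');     // True
        Console.WriteLine(default(string) == null);   // True (reference type)
    }
}
```

This works for any type, including structs (every field zeroed) and generic type parameters.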
Try debugging your application and check the variable's value before you assign to it. There may be a better answer, but in order to close this thread I am just posting a quick and dirty solution. (Note that local variables, unlike fields, must be definitely assigned before they are read, so the compiler won't let you use them beforehand; you can still inspect them in the debugger.)
static void Main(string[] args)
{
    int iTest;
    string sTest;
    double dTest;
    bool bTest;
    float fTest;

    // Set a breakpoint here and inspect the variables in the debugger
    Console.WriteLine();

    iTest = 0;
    sTest = "";
    dTest = 0.0;
    bTest = false;
    fTest = 0;
}
http://www.codeproject.com/KB/dotnet/DontInitializeVariables.aspx
Here we go. Simple: value types are initialized to zero and reference types are initialized to null.
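That rule is easy to verify with fields, since the CLR zero-initializes all fields of an object before the constructor runs. A small sketch (field and class names are made up for illustration):

```csharp
using System;

class Defaults
{
    // The CLR zero-initializes every field before the constructor runs
    int number;     // becomes 0
    bool flag;      // becomes false
    double ratio;   // becomes 0.0
    string name;    // becomes null (reference type)

    static void Main()
    {
        var d = new Defaults();
        Console.WriteLine(d.number);        // 0
        Console.WriteLine(d.flag);          // False
        Console.WriteLine(d.ratio);         // 0
        Console.WriteLine(d.name == null);  // True
    }
}
```

Explicitly writing `int number = 0;` produces the same result; the compiler-generated initialization is simply redundant with what the CLR already guarantees.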