Why can't the .NET DLLs be included with the application via "Copy Local", so that an installed .NET Framework isn't needed?

Quite a simple question, and I really want to know the real reason behind it. Let's say you want to deploy a .NET application to computers that don't have .NET installed (not even version 1.1). Why can't we just ship mscorlib.dll and the other assemblies along with the application? Is it because the CLR has to be installed somehow, so that there is a JIT to turn the IL into machine code?
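To make it concrete, here is a rough sketch of what I mean (plain C#, not tied to any real project): even a trivial program records a dependency on mscorlib in its metadata, and mscorlib itself is managed code that still needs the native CLR underneath it, which "Copy Local" never copies.

    using System;
    using System.Reflection;

    class Program
    {
        static void Main()
        {
            // Even a "hello world" assembly lists mscorlib among its references.
            foreach (AssemblyName name in Assembly.GetExecutingAssembly().GetReferencedAssemblies())
            {
                Console.WriteLine(name.FullName);
            }

            // Even if mscorlib.dll sat next to the .exe, it is itself managed code;
            // something native (the installed CLR) still has to load and execute it.
        }
    }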

I know this is a fairly pointless question nowadays, since practically every system has at least .NET 2.0, but I'm still curious. =)

+2




3 answers


Installing the libraries and the CLR centrally also gives you shared assemblies. Do you really want hundreds of copies of the CLR scattered around your machine? I like knowing that certain things will be available to my application. Much better than worrying about "did I compile against runtime 1.4.5 or 1.4.6... or maybe it was even 1.2.5" (the JRE can be a pain that way).
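For example (a rough sketch; the exact version and public key token are illustrative and depend on whichever framework is actually installed), every application on the machine resolves a strong-named assembly to the same shared copy in the GAC:

    using System;
    using System.Reflection;

    class GacDemo
    {
        static void Main()
        {
            // One shared copy in the GAC serves every application on the machine,
            // so there is no need to ship a private copy with each program.
            // (Version and public key token here are illustrative.)
            Assembly data = Assembly.Load(
                "System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089");

            Console.WriteLine(data.Location);            // path inside the GAC
            Console.WriteLine(data.GlobalAssemblyCache); // True
        }
    }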



There are also many parts of the .NET Framework that are just managed wrappers over unmanaged APIs, not to mention a ton of other assemblies you use without referencing them directly (see mscoree.tlb and many others).
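To illustrate the wrapper point, this is roughly the kind of thing the Framework does internally (just a sketch; the real implementations are far more involved):

    using System;
    using System.Runtime.InteropServices;

    class WrapperSketch
    {
        // A lot of the Framework bottoms out in declarations like this one,
        // which call straight into unmanaged Win32 DLLs.
        [DllImport("kernel32.dll")]
        static extern uint GetTickCount();

        static void Main()
        {
            // The managed method is only a thin veneer; the actual work happens
            // in native code that ships with Windows, not with your application.
            Console.WriteLine("Milliseconds since boot: " + GetTickCount());
        }
    }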

+1




Well, as you said, the assemblies themselves are useless.



You need the runtime as well: the JIT compiler, the assembly and type loaders, the garbage collector, and so on.
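A rough illustration of those services at work (a sketch only; none of this machinery lives in your assembly, it is all provided by the installed CLR):

    using System;
    using System.Reflection;
    using System.Runtime.CompilerServices;

    class RuntimeServices
    {
        static int Add(int a, int b) { return a + b; }

        static void Main()
        {
            // Ask the JIT compiler, which is part of the CLR rather than of this
            // assembly, to compile Add to native code before its first call.
            MethodInfo add = typeof(RuntimeServices).GetMethod(
                "Add", BindingFlags.NonPublic | BindingFlags.Static);
            RuntimeHelpers.PrepareMethod(add.MethodHandle);

            Console.WriteLine(Add(2, 3));

            // The garbage collector is another service the runtime provides.
            Console.WriteLine("Managed heap in use: " + GC.GetTotalMemory(false) + " bytes");
        }
    }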

+5




.NET assemblies contain intermediate language (IL); they are not compiled to a native binary executable.

Therefore, you need the .NET runtime installed so that the IL can be compiled and executed at run time. Without it, your application won't do anything.
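For instance, which runtime your code actually runs on is only known at run time, reported by the very component that has to be installed (a minimal sketch):

    using System;

    class RuntimeCheck
    {
        static void Main()
        {
            // Reported by the CLR that loaded this assembly, i.e. the component
            // that must already be installed for anything to run at all.
            Console.WriteLine("CLR version: " + Environment.Version);
        }
    }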

+3








