Why is cabal being downloaded and compiled from source?

When I create a new project — say, a web app using Snap — I generate a skeleton with snap init barebones, create a new sandbox, and then install the dependencies.
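For reference, the workflow described above boils down to something like this (a sketch assuming the sandbox-era cabal of the time; exact flags may differ between cabal versions):

```shell
# Generate a barebones Snap skeleton in the current directory
snap init barebones

# Create a project-local sandbox so dependencies don't pollute ~/.cabal
cabal sandbox init

# Build and install every dependency from source -- this is the slow part
cabal install --only-dependencies -j
```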

It takes forever. Joking aside: if you've ever worked with almost any other web framework (node.js with Express, for example), the process is nearly identical, but takes very little time. I know that most node dependencies don't require compilation, but it still seems strange to me that this isn't considered a more serious problem. For example, I can never run a Yesod application on my cheap VPS, because the VPS isn't powerful enough to compile it, and I can't upload 500 MB of precompiled libraries either.

The question is: why doesn't the repository host binaries instead of source? .NET is also compiled (to bytecode), but I can use its DLLs without recompiling them.

Of course, there are drawbacks to hosting binaries: more storage space, multiple binaries per library for multiple OSes... But all of these problems seem minor compared to the huge benefits you get, for example:

  • No more compilation errors
  • Much faster setup for new projects
  • Considerably less disk space required
  • Knowing that a library doesn't support your OS before you find out the hard way

It's hard for me to understand why cabal hell exists at all. If all libraries were available for dynamic linking, wouldn't the need for recompilation disappear entirely?

Currently, one has to try very hard to stick with Haskell in this respect. It feels like the system punishes you for experimenting: if I want to add a new library to my project, I have to be prepared to wait 15-45 minutes (!!!) for it to compile. And more often than I'd like, the library doesn't compile at all. Only after going through this whole process can I find out whether the library is actually what I want, or even whether it's compatible with the rest of my project.

+3




3 answers


In a nutshell: because native code is complex.

If you want to host binaries for arbitrary systems, you need to build binaries for each system you want to support. That can mean compiling dozens of sets of binaries to cover every system the code could otherwise be compiled on.



On the other hand, you may well find that someone has already compiled the code you need: your distribution's vendor may well provide packages for the Haskell libraries you use.

+3




Because it's the easiest way to distribute everything while keeping it up to date. By offloading build costs onto users, library authors only have to provide source code.

This can be mitigated in various ways. For example, my CI setup uses CircleCI and Heroku. The nodes on both ends keep cached, pre-built sandboxes (which are actually very easy to set up). I build my project on Heroku, but there's no reason you couldn't take the finished artifacts from your CI and deploy them directly.
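The caching idea looks roughly like this in a CircleCI config (a sketch in the old circle.yml 1.0 format; directory names are the cabal sandbox defaults):

```yaml
dependencies:
  # Persist the sandbox between builds so dependencies
  # are only compiled from source the first time
  cache_directories:
    - "~/.cabal"
    - ".cabal-sandbox"
  override:
    - cabal update
    - cabal sandbox init
    - cabal install --only-dependencies -j
```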



As far as dynamic linking is concerned, it is possible to dynamically link Haskell modules, but shared libraries are more often than not a source of problems. One glance at Windows DLL hell should be enough to see this, and most commercial applications just ship the DLLs they use anyway. If a library changes, the DLLs have to be replaced anyway, and the way Cabal does things makes it easy to have the latest and greatest versions of everything.
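For what it's worth, GHC does support dynamic linking; a minimal sketch (assuming a GHC new enough to ship shared versions of the boot libraries):

```shell
# Link Main.hs against shared versions of the Haskell libraries
# instead of statically copying them into the executable
ghc -dynamic Main.hs

# Or, at the cabal level, build libraries as shared objects
# and link executables against them dynamically
cabal install --enable-shared --enable-executable-dynamic
```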

+2




First, note that on some platforms, you can install binaries. For example, on my OpenSUSE Linux system, YaST will quite happily download and install some Haskell libraries without building anything from source.

Of course, this only applies to a fairly small set of libraries, and the RPMs tend to be out of date by many months. (Not a big deal for X11; kind of a deal-breaker for something like Yesod that's under heavy development...)

I think another big part of the problem is that if you compile a Haskell library with GHC 7.6.4, you can't use that compiled library with GHC 7.8.3. So we're not just talking about one compiled binary per OS; we're talking about one compiled binary per OS + GHC combination.

Oh, and did I mention? If you compile Yesod 1.4.0 against ByteString 0.9.2.0, that compiled binary is useless if your system has ByteString 0.9.2.1 installed. So you may need one compiled binary per OS, per GHC release, and per release of every library it transitively depends on.
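You can see this pinning directly in GHC's package database: every installed package ID bakes in the exact version (and, in later GHCs, an ABI fingerprint), so rebuilding a dependency produces a different ID. A sketch (the hash portion of the output varies per machine, so the example value below is illustrative):

```shell
# Show the full ID GHC uses to identify an installed library;
# it includes the version plus a build-specific suffix, e.g.
#   bytestring-0.10.4.0-<hash>
ghc-pkg field bytestring id

# Show exactly which installed package IDs this library was built against
ghc-pkg field bytestring depends
```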

...This is partly why the Haskell Platform was invented. It's one binary download that gives you a big bunch of code that you don't need to compile from source, and where all the library versions are mutually compatible. (No dependency hell; the Haskell Platform maintainers sorted that out for you!)

I agree that binary packages would be very nice. But the issues above make them unlikely, IMHO.

+1



