Table of Contents for Programming Languages: a survey

Chapter ?: Tooling

Part of what draws people to (or repels people from) a programming language is its tooling. So if you are implementing a programming language, you may wonder: what does good tooling look like? For now, we'll settle for just surveying what tooling is out there, good or bad.

Note that many tools span multiple categories; in such cases we made a more-or-less arbitrary choice as to which category to put the tool in, so don't get too hung up on that.

Our focus here is language-specific tooling, rather than language-independent tools. However, there are many tools that, although they can in theory be used upon codebases of any language, in practice are associated with a particular language (typically the same language that the tool is written in). In some cases such tools may have a better fit with one language, in others they are just more popular in one language community than in others. In any case, we'll list many of these as well.

In addition, we sometimes list the most popular language-independent tools in each category.

Compiler features

Readable errors

IDEs and IDE API/module case studies

OmniSharp .NET IDE module

Go Oracle


Go for Visual Studio Code

"This extension adds rich language support for the Go language to VS Code, including:

    Completion Lists (using gocode)
    Signature Help (using godoc)
    Quick Info (using godef)
    Goto Definition (using godef)
    Find References (using go-find-references)
    File outline (using go-outline)
    Workspace symbol search (using go-symbols)
    Rename (using gorename)
    Build-on-save (using go build and go test)
    Format (using goreturns or goimports or gofmt)
    Add Imports (using gopkgs)
    [partially implemented] Debugging (using delve)" --

Sublime Text syntax definition format


links about kythe

opinions on Kythe

summary: a rant against the cloud-based nature of Grok as it existed at the time. It mentions some other related systems and features:

desired features:

'similar projects' to Kythe from wikipedia

another blog post with lists of similar projects (todo take notes on this; i think i read it years ago already tho):

See also [1].

Haskell ghc-mod


Supported by Microsoft (VS Code), Red Hat, Codenvy, the Rust Language Server, and others (see ).

Supports features such as (list from [2]):

an implementation for VS Code:

Swift SourceKit protocol

note: some commentators observe that SourceKit appears to push syntax highlighting across process boundaries, and say that this is too slow: [3]

Typescript Language Service API

Ternjs language service API

(list from [4])

Elixir alchemist language service API

Lisp Swank protocol (within Slime emacs mode)

IDE features


-- [5]


Code formatters

Static program analysis and transformation case studies


Debugger Case studies

Profiler case studies

Build system case studies

GNU Build System (make, etc)

make debugger: tips:

"...This does make me wonder how things went so badly with make, makemaker, autoconf, aclocal, and the rest of the Texas Toolchain Massacre." [6]

"I work on a lot of Javascript projects. The fashion in Javascript is to use build tools like Gulp or Webpack that are written and configured in Javascript. I want to talk about the merits of Make (specifically GNU Make). Make is a general-purpose build tool that has been improved upon and refined continuously since its introduction over forty years ago. Make is great at expressing build steps concisely and is not specific to Javascript projects. It is very good at incremental builds, which can save a lot of time when you rebuild after changing one or two files in a large project. Make has been around long enough to have solved problems that newer build tools are only now discovering for themselves." [7]

"I used make heavily in the 80s and 90s, but haven't much since then. Recently I started a project that had source files getting processed into PDF files, for use by humans. Since this is the 21st century, those files have spaces in their names. At a certain point, I realized that I should be managing this processing somehow, so I thought of using a simple Makefile. A little searching reveals that the consensus on using make with files with spaces in their names is simply "don't even try." In the 21st century, this is not an acceptable answer." [8]

"I think the reason make is both so controversial and also long-lived is that despite how everyone thinks of it, it isn't really a build tool. It actually doesn't know anything at all about how to build C, C++, or any other kind of code. (I know this is obvious to those of us that know make, but I often get the impression that a lot of people think of make as gradle or maven for C, which it really isn't.) It's really a workflow automation tool, and the UX for that is actually pretty close to what you would want. You can pretty trivially just copy tiresome sequences of shell commands that you started out typing manually into a Makefile and automate your workflow really easily without thinking too much. Of course that's what shell scripts are for too, but make has an understanding of file based dependencies that lets you much more naturally express the automated steps in a way that's a lot more efficient to run. A lot of more modern build tools mix up the workflow element with the build element (and in some cases with packaging and distribution as well), and so they are "better than make", but only for a specific language and a specific workflow." [9]

"> and the UX for that is actually pretty close to what you would want.

That is so not true. Make has deeply woven into it the assumption that the product of workflows are files, and that the way you can tell the state of a file is by its last modification date. That's often true for builds (which is why make works reasonably well for builds), but often not true for other kinds of workflows...Anything where the relevant state lives in a database, or is part of a config file, or is an event that doesn't leave a file behind (like sending a notification)... " [10] , [11]

" send: foo.log tail foo.log

          touch send" [12] (in response to the previous comment about 'make' not being able to record events like the sending of notifications)

"But regardless of that, a tool that makes a semantic distinction between tabs and spaces is NEVER the UX you want unless you're a masochist. " [13]

"GNU make has had an option (.RECIPEPREFIX) to change this..." [14] (in response to the previous comment regarding spaces and tabs)

" Another issue with Make is that it's not smart enough to know that intermediate files may change without those changes being important. Consider that I change the comments in foo.c or reformat for some reason. This generates a new foo.o because the foo.c timestamp is updated. Now it wants to rebuild everything that uses foo.o because foo.o is newer than those targets. Problem, foo.o didn't actually change and a check of its hash would reveal that. Make doesn't know about this. So you end up making a trivial change to a source file and could spend the afternoon rebuilding the whole system because your build system doesn't understand that nothing in the binaries are actually changing. ... With regard to my last comment (the problem with small changes in a file resulting in full-system recompilation), see Tup. It maintains a database of what's happened. So when foo.c is altered it will regenerate foo.o. But if foo.o is not changed, you can set it up to not do anything else. The database is updated to reflect that the current foo.c maps to the current foo.o, and no tasks depending on foo.o will be executed. Tup also handles the case of multiple outputs from a task. There are probably others that do this, it's the one I found that worked well for my (filesystem-based) workflows. " [15] , [16]

" The reason I don't like (((make))) is portability. Since the steps within the makefile are going to be run through a shell, it is going to behave differently on different systems.

If your makefile fixes up a file using sed and your system has GNU sed, your makefile may fail on a system with BSD sed (e.g., a Mac). If you rely on bash-isms, your makefile may not work on a Debian system where it will be run with dash instead of bash. And so on. " [17]
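The sed example in that comment is a classic case; a minimal sketch (file name illustrative):

```shell
# In-place editing is where GNU and BSD sed diverge:
#   GNU sed:      sed -i 's/foo/bar/' config.txt
#   BSD/mac sed:  sed -i '' 's/foo/bar/' config.txt   (GNU sed rejects the '')
# A recipe that avoids -i entirely works on both:
printf 'foo\n' > config.txt
sed 's/foo/bar/' config.txt > config.txt.tmp && mv config.txt.tmp config.txt
cat config.txt   # bar
```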

"Make's interface is horrible. Significant tabs. Syntax which relies on bizarre punctuation... If only whoever authored Make 40 years ago had had the design acumen of a Ken Thompson or a Dennis Ritchie!" [18]

"I've seen plenty of unmanageable Makefiles, but I haven't seen another system that would make them inherently cleaner. (I love CMake, but it's a beast, and even harder to debug than make. If it weren't for its nice cross-platform capabilities, I'm not sure it would see much use. It's also too specialized for a generic build tool. Then again, I definitely prefer it to raw Makefiles for a large C++ project.) " [19]

" "In all seriousness, what's wrong with it?"

1. Claiming a rule makes a target, but then fails to make that target, ought to be a runtime fatal error in the makefile. I can hardly even guess at how much time this one change alone would have saved people.

2. String concatenation as the fundamental composition method is a cute hack for the 1970s... no sarcasm, it really is... but there's better known ways to make "templates" nowadays. It's hard to debug template-based code, it's hard to build a non-trivial system without templates.

3. Debugging makefiles is made much more difficult than necessary by make's default expansion of every target to about 30 different extensions for specific C-based tools (many of which nobody uses anymore), so make -d output is really hard to use. Technically once you learn to read the output it tends to have all the details you need to figure out what's going wrong, but it is simply buried in piles of files that have never and will never be found in my project.

4. The distinction between runtime variables and template-time variables is really difficult and annoying.

5. I have read the description of what INTERMEDIATE does at least a dozen times and I still don't really get it. I'm pretty sure it's basically a hack on the fact the underlying model isn't rich enough to do what people want.

6. Sort of related to 2, but the only datatype being strings makes a lot of things harder than it needs to be. " [20]

" With the debugging expansion thing you're mentioning, now I'm craving a make built on some minimalist functional programming language like Racket where "expand the call tree" is a basic operation. " [21]

" I've been writing Makefiles regularly for maybe 15 years and I always end up on this page every time I need to write a new one:

$< $> $* $^ ... Not particularly explicit. You also have the very useful substitution rules, like $(SRC:.c=.o) which are probably more arcane than they ought to be. You can make similar complaints about POSIX shell syntax but at least the shell has the excuse of being used interactively so it makes sense to save on the typing I suppose.

That's my major qualm with it however, the rest of the syntax is mostly straightforward in my opinion, at least for basic Makefiles. " [22]
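For reference, the cryptic variables that comment complains about, in a minimal GNU Make sketch (file names illustrative):

```make
SRC = foo.c bar.c
OBJ = $(SRC:.c=.o)       # substitution reference: foo.o bar.o

prog: $(OBJ)
	$(CC) -o $@ $^       # $@ = the target, $^ = all prerequisites

%.o: %.c
	$(CC) -c -o $@ $<    # $< = the first prerequisite
```

($* is the stem matched by % in a pattern rule.)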

" give pmake a shot sometime.. the syntax/semantics are much more 'shell-like' imho and some things are just much more possible.. (e.g. looping rather than recursive calls to function definitions)" [23]

" ...some great features of make:

"it could be a lot worse. (see also: m4, autoconf," [25]

autoconf and automake

(do these even belong in this section?)


"an F# "Make" system...a build DSL in F# scripts." [26]




"I've seen plenty of unmanageable Makefiles, but I haven't seen another system that would make them inherently cleaner. (I love CMake, but it's a beast, and even harder to debug than make. If it weren't for its nice cross-platform capabilities, I'm not sure it would see much use. It's also too specialized for a generic build tool. Then again, I definitely prefer it to raw Makefiles for a large C++ project.) " [28]


" Cmake is more of a replacement for autotools than for make.

Advantages: Good Windows support.

Disadvantages: Dictates the directory structure, much less flexible than autotools.

If you know shell, an existing autotools project can be modified easily.

If you want to do something special in cmake, first you need to go to Stack Overflow. If you are lucky, the thing you want to do is supported (often it is not).

All in all, I feel locked in by cmake - to the point that once the build works I'm less inclined to refactor because the directory structure cannot be changed easily. " -- [33]

Used by:




"When to stick with Webpack. The job that Webpack does is quite specialized. If you are writing a frontend app and you need code bundling you should absolutely use Webpack (or a similar tool like Parcel). On the other hand if your needs are more general Make is a good go-to tool. I use Make when I am writing a client- or server-side library, or a Node app. Those are cases where I do not benefit from the specialized features in Webpack." [34]


Apache Ant

Apache Maven

"Maven, though at first rather overwhelming, turned out to have a ton of features I’d often wished for in other build/dependency management systems." --

Clojure Leiningen

"I find Leinigen a bit bloated respect to Mix. Mix is faster, lighter and integrated with Elixir. Lein is not. I find Lein a bit slow..." [35]

Clojure Boot

eg with Clojure Boot installed:

$ curl

#!/usr/bin/env boot

; this is a minimal dispatch script that fetches the latest version of
; alda from clojars (a maven repository for clojure projects) and runs the
; alda.cli/-main method, passing along any command-line arguments.

; this script will automatically update your version of alda as newer
; versions are released.

(set-env! :dependencies '[[alda "LATEST"]])

(require '[alda.cli])

(defn -main [& args]
  (apply (resolve 'alda.cli/-main) args))

$ sudo curl -o /usr/local/bin/alda
$ sudo chmod +x /usr/local/bin/alda
$ alda
Retrieving boot-2.2.0.jar from
Retrieving clojure-1.7.0.jar from
Retrieving dynapath-0.2.3.jar from
Retrieving pod-2.2.0.jar from
Retrieving shimdandy-impl-1.1.0.jar from
Retrieving core-2.2.0.jar from
...







Build system links


" Specifically, we propose that, as an experiment for Go 1.5, we add a temporary “-vendor” flag that causes the go command to add these semantics:

If there is a source directory d/vendor, then, when compiling a source file within the subtree rooted at d, import "p" is interpreted as import "d/vendor/p" if that exists.

When there are multiple possible resolutions, the most specific (longest) path wins.

The short form must always be used: no import path can contain “/vendor/” explicitly.

Import comments are ignored in vendored packages. " --!msg/golang-dev/74zjMON9glU/4lWCRDCRZg0J

0xdeadbeefbabe 10 hours ago

A vendors B and C, but C vendors D and E. According to this proposal do you lay out the files like this: (A (vendor (B C D E)))?

Edit: I guess HN tree markup is not working :)


jooon 9 hours ago

Yes. That would work. However, I also believe this will work: (A (vendor (B (C (vendor (D E)))))).


jooon 9 hours ago

and if you have both: (A (vendor (B (C (vendor (D E))) D E)))

C would pick D E in its C/vendor before D E in A/vendor before D E in $GOPATH/src

ps. I have edited these comments about 5 times to get the parentheses right. :)
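Concretely, the flattened layout from the first comment would look like this on disk (directory names taken from the comment, everything else illustrative):

```
$GOPATH/src/A/
    main.go          # within A, import "B" resolves to A/vendor/B
    vendor/
        B/
        C/
        D/
        E/
```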



for 'monorepos'


combined build system/packaging system

See also:


" iainmerrick 27 days ago


I think Xcode is what a lot of other IDEs and build systems are moving towards. Xcode is nice as long as you're working with normal code in permitted languages (C++, Obj-C, Swift) and permitted resource formats (NIBs). But if you need to do something slightly unusual, like calling a shell script to generate resources, it's horrible.

Oh, and I didn't even mention package managers! Having those tightly coupled to the other tools is horrible too.

bluetomcat 27 days ago

> But if you need to do something slightly unusual, like calling a shell script to generate resources, it's horrible.

Not quite true. Xcode provides a "Run Script" build phase that lets you enter your shell script right into the IDE. A lot of handy environment variables are also there. You can easily reach your project via $SRCROOT, or modify the resources of the output bundle via "${CONFIGURATION_BUILD_DIR}/${PRODUCT_NAME}".

iainmerrick 26 days ago

That's the sort of stuff I mean when I say "horrible". :)

It'll just run the script every time, rather than doing anything smart with dependencies. Output from the script might or might not be picked up and tracked properly by the IDE. If you accidentally mess something up nothing will detect or prevent that.

(Edit: should add that I haven't given it a proper try in recent Xcode versions. I probably should.)




"- SBT is amongst the best build tools ever available. I could rant all day about the clusterfuck of Javascript (npm, Bower, Grunt, Gulp,, etc.) or Python (easy_install, setuptools, virtualenv) or .NET (MSBuild, Nuget, dotnet) or Haskell (cabal). For all its quirks, SBT is by far the sanest dependency and build management tool I've worked with. In fact, amongst the best reasons for preferring Scala.js is being able to work with SBT and avoid Javascript's clusterfuck completely. " -- [37]

" is what I dislike about SBT:

Packaging system case studies

Node.js NPM

Well-liked [38]

apparently has a problem with making it too easy to publish dotfiles in the same directory as other code:

tjholowaychuk 9 hours ago

I'm sure I've done this in the past haha, the npm workflow isn't great at times in this regard. If you have something (to test etc) that is not checked into Git, but still in the directory, it can still make its way into a publish. That's definitely what I'd advise people to be most careful of, use npm-link and use credentials elsewhere etc.

Koa I'm curious of, I've seen almost every pull-request go in there, anyway nice post.


doublerebel 9 hours ago

Npm package "irish-pub" has definitely saved my ass a few times. (It shows a dry run of "npm publish".)


mofle 8 hours ago

There's an easy way to prevent credential leakage when publishing to npm => Explicitly list the files to include in the package through the `files` property in package.json.
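The `files` whitelist mentioned in the last comment looks like this; a minimal sketch (package name and file list are illustrative). With it, `npm publish` packs only the listed paths (plus a few always-included files such as package.json and the README), so stray dotfiles in the working directory stay out of the tarball:

```json
{
  "name": "example-lib",
  "version": "1.0.0",
  "main": "index.js",
  "files": [
    "index.js",
    "lib/"
  ]
}
```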




Javascript Yarn

Alternative to npm

Haskell Cabal

Python comparison (and read the replies, there may be some errors in that comment)

" The thing that really stands out to me though is the poor state of python tooling and the library ecosystem. Having used Rubygems and bundler pip feels like taking an enormous step back. It's much less expressive. It doesn't handle the difference between production only dependencies, regular dependencies and development only dependencies in a good way. It's difficult to differentiate between locked dependencies vs desired dependencies(the Gemfile/Gemfile.lock distinction). PyPi? and especially using private PyPi? registers is more complex than it is in Ruby. There seems to be fewer nice libraries and they seem to be spread across the web whereas Ruby centralises around GitHub?. I also find that Python libraries have lacking or hard to find documentation in many cases. " -- [39]

"gem ist superior to pip. Take a look at "pip-tools" if you haven't already, it eases some of the pain." [40]

Python easyinstall

Python setuptools

Python pip

Python Virtualenv

Python wheels

" Daniel Holth wrote a PEP for the wheel format, which allows for binary redistribution of libraries. In other words, it lets authors of packages which need a C compiler to build give their users a way to not have one." -- [41]

Python tox

Python pipenv

Python pyenv

Python pipsi

Python pyinstaller

"Freeze (package) Python programs into stand-alone executables "

Python poetry

Python pipx


Go discussion:




"GitHub? Package Registry currently supports these clients and formats":


Ruby Bundler, Javascript NPM, Rust Cargo, Javascript Yarn, and the need for determinism and lockfiles and also for supporting multiple versions of a dependency used within the same project:

Need for install from git repos and private packages:

Configuration and deployment system case studies

Language-independent configuration and deployment systems

This book isn't about this but since it's related to language-associated configuration and deployment systems, we'll briefly list some:

Python Paste

Ruby Capistrano

Lints and code quality checkers

TIOBE TICS (multi-language)


todo: add stuff from

Google error-prone (Java)



recommended by [43]

Astree (C)

recommended by [44]

Polyspace (C++)

recommended by [45]


" Recent security stories confirm that errors like buffer overflow and use-after-free can have serious, widespread consequences when they occur in critical open source software. These errors are not only serious, but notoriously difficult to find via routine code audits, even for experienced developers. That's where fuzz testing comes in. By generating random inputs to a given program, fuzzing triggers and helps uncover errors quickly and thoroughly. In recent years, several efficient general purpose fuzzing engines have been implemented (e.g. AFL and libFuzzer), and we use them to fuzz various components of the Chrome browser. These fuzzers, when combined with Sanitizers, can help find security vulnerabilities (e.g. buffer overflows, use-after-free, bad casts, integer overflows, etc), stability bugs (e.g. null dereferences, memory leaks, out-of-memory, assertion failures, etc) and sometimes even logical bugs. OSS-Fuzz's goal is to make common software infrastructure more secure and stable by combining modern fuzzing techniques with scalable distributed execution. OSS-Fuzz combines various fuzzing engines (initially, libFuzzer) with Sanitizers (initially, AddressSanitizer?) and provides a massive distributed execution environment powered by ClusterFuzz?. " -- [46]

Python Hypothesis fuzzer


Program analysis tools

Code browsers

C++ Pivot

Uses an IL (intermediate language) called IPR, which is serialized as XPR, for which an overview is given in

"Currently, the Pivot does not support an annotation language. Pivot programs can annotate IPR nodes, but there is no facility for the programmer to embed annotations in the C++ source text. Providing such a facility is easy, but once programmers starts to depend on such annotations, they have created a new special-purpose language. We want to explore how much can be done with the SELL approach, relying only on standard conforming C++ source text." -- [47]


Seems to be inactive as of 2015?


notes / todo


lbotos 15 hours ago


Current Go users, What's the state of package versioning right now? Is vendorization still the answer?


dayjah 12 hours ago


We use Godep, it is very good. As per another answer to your question: with '-copy=false' it behaves a lot like bundler.lock. Having spent a lot of time working with it we've found a few areas where you can get burned a little; particularly if you've structured your repos as a set of libraries, as seems to be the encouraged golang pattern.

When you have multiple libraries you have to be very specific about when you run godep, lest you find yourself with two libraries needing different versions of a common library, for example Main imports Foo and Bar, which both import Baz. Godep provides a mechanism for handling this: each dependency is explicitly locked into a fixed revision (e.g. commit sha, in the case of git). The pain comes about when during debugging as it can be very hard to reason which version of a library you're using.

Additionally the revision aspect is also a bit of a PITA, we use a development flow which rebases our small commits into a big commit and then merges that into our master branch; if you ran godep prior to that you're now referencing a commit that no longer exists. Given the chain of references that can exist this can go a very long way down. This same pattern also forces you into needing to push your dev branches to an origin server, as godep checks out the repos during the build, which while pretty benign a concern is a PITA if you forget and your build breaks because of it.

We're strongly considering moving to "one big repo" to help combat this issue (as well as a few others) for our internal golang repositories. Referencing "published commits" in 3rd party libraries is an acceptable level of pain. We're not entirely sold on this yet... just considering it.


leef 15 hours ago


No need to vendor. Use Godep without copying (godep save -copy=false) to create the equivalent of a bundler.lock file and check that into source.


AYBABTME 15 hours ago


There are other ways around, but I'd say the community is solidifying towards godep. Someone correct me if I'm wrong.


chimeracoder 1 hour ago


There seem to be a lot of comments here recommending godep, but just to throw my experience in: none of the projects I've interacted with use godep (other than the Heroku buildpack, which was written by the author of Godep).

It seems to be a solution for some (not all) projects that are released in binary form, but that isn't relevant to most projects out there[0]. I have never felt the need for what godep provides; vendoring myself has been sufficient for the (very rare) case in which I need specific versions of dependencies other than tip/trunk.

I asked around on #go-nuts, and (though the sample size was small), the other regular contributors who idle in the channel seemed to have the same experience.

YMMV obviously.





mimog 1 day ago


I like how it says "Rock-Solid Ecosystem", yet I have had the exact opposite experience trying to install even the most basic things with Cabal. I still can't get the Sublime text haskell plugin to work due to a dependency that fails to compile.


rhizome31 1 day ago


I had a similar experience with Cabal. On one computer I haven't been able to install Yesod whereas on another one, it finally worked after I had wiped my ~/.cabal. It gave me the impression that Cabal's dependency resolution mechanism is still a bit brittle.

Also I found that installing stuff through Cabal was pretty slow. It's probably partly because Haskell libraries tend to be kept narrow in scope so it's necessary to install a lot of small packages to get a piece of functionality (take for instance the dependency list of Aeson, which seems to be the recommended choice for working with JSON : ). Another reason is that Cabal compiles Haskell code into native code.




dbaupp 3 days ago


As others have said, having a package management system that deeply understands the language and tooling is awesome. Examples:

I'm sure all of this is possible with other systems, but it seems unlikely to be so nice to use.


seabee 2 days ago


It's very similar to Racket, and yes, it is nice to use!

Other systems can get you much of the way there (node, Python are the only ones I'm really familiar with) but I suspect you need a little language help to achieve the same kind of convenience.



pestaa 3 days ago


Curious to hear more about language-specific (though OS-agnostic!) package management systems. IMO composer is the best thing ever happened to PHP, Ruby gems are huge, Python eggs also make a very useful ecosystem.

OpenSUSE's Open Build System would be great to ship independent packages, but those are again heavily tied to Unices, hence leaving other platforms behind.


JoshTriplett 3 days ago


> Curious to hear more about language-specific (though OS-agnostic!) package management systems.

As far as I can tell, one of the main justifications for most language package management systems is "we also run on Windows/OSX, which has no package management, so we'll invent our own". As a result, users of systems that do have sane package management get stuck with multiple package management systems, one for the distro and one for every language. Even then, I find it disappointing that nobody has built a cross-platform package management system for arbitrary languages to unify those efforts.


smacktoward 3 days ago


The other justification is generally a clash of cultures: the people who maintain distro/OS package managers generally come out of the culture of sysadmins, who value stability over feature-richness, while the people working the language communities generally come out of the culture of developers, whose priorities are the exact opposite.

When languages try to hook into existing OS-level systems, the people on the language end get frustrated by the way the people on the package-manager end don't hurry to rush out bleeding-edge versions of packages the second they hit Github. To the package-manager people, that's no big deal, their orientation towards stability and predictability makes them comfortable with waiting a little for the coffee to cool. But to the developers, who want to get their hands on the Latest and Greatest Right Now!, it feels like slogging though molasses.

So the developers eventually end up blowing their stacks and stomping off yelling "Yeah? Well fine, we'll build our own package manager then! With blackjack! And hookers!"


DennisP 3 days ago


Maybe OS-level package managers should default to stable, but let the user check a box to get the latest and greatest. Developers want a stable system like everyone else, but for the stuff we're hacking on, we have a legitimate need to get the most recent, so our software isn't obsolete by the time we finish it.


djur 3 days ago


Most OS-level package managers also aren't designed to install more than one version of a package at a time. They don't tend to integrate with build systems as well, either.


mercurial 2 days ago


That's not so simple. A distro is a fine-tuned collections of packages which work more or less well together. Debian, for instance, comes in stable/testing/unstable/experimental flavours, depending on how daring you are. But even this isn't a universal solution. If you are deploying for instance a web application, you will want to deploy a locked down number of dependencies as well, regardless of what is present on the target system. And you may need deploy multiple applications side by side. Few system package managers have an answer to this.


DennisP 2 days ago


So developers end up installing later versions manually. And in many cases it's no big deal. If the distro has Julia 0.2.1 and Emacs 23, I can upgrade to Julia 0.3 and Emacs 24 and it's not likely to damage anything. It'd just be nice if I could do it with the package manager instead.

But just because I'm doing that doesn't necessarily mean I want, say, the latest unstable version of the window manager.


mercurial 2 days ago


Debian will let you do that. You can run, say, your machine on testing but get the latest Firefox from experimental if you want. This may, however, upgrade other dependencies on your system, but it's pretty much unavoidable.


yxhuvud 2 days ago


I'd be happy enough if the OS-level packagers stopped modifying the package-level packages they packaged.


kungfooguru 3 days ago


The problem with language package management systems is they've been used for installing user facing software. As a developer tool I think it is the perfect way to go.

And you should add Linux to your Windows/OSX as being an issue, which Linux package management tool would you build packages for? All of them?

The end user package management provided by the OS should be for installing end user packages and the language tool for installing and publishing libraries and dev tools.


wycats 3 days ago


> The problem with language package management systems is they've been used for installing user facing software. As a developer tool I think it is the perfect way to go.

Precisely so.


pjmlp 3 days ago


> As a result, users of systems that do have sane package management

Given the diversity of OS in the IT landscape, which systems are those?


stefantalpalaru 3 days ago


I met only one package manager that I don't need to fight in order to get what I want: Gentoo's Portage. With a local overlay and language specific functionality concentrated in eclasses it's trivial to add new packages, do version bumps, have fine grained control over installed versions, enabled features, etc.


kryptiskt 3 days ago


The distro only contains a small selection of the packages (even if there are hundreds or thousands of them) and the language package system is usually the source the distro maintainers use to find the packages anyway.


yla92 3 days ago


Lately, IMHO, Gradle in Android development (applicable to Java development as well) is a huge improvement over managing dependencies with pom.xml (ant) and linking jar files manually. Besides, you can totally customize build.gradle too.


dscrd 2 days ago


> Disappointing to see yet another language-specific package management system (Cargo), though.

As a packager in a Linux distro, I'm disappointed every time somebody tries to cram in PL-specific packages inside distro packages.


pjmlp 3 days ago


> Disappointing to see yet another language-specific package management system (Cargo), though

So what is the solution to have portable packages for:


adrusi 3 days ago


The goal of the nix project is to solve this, and every time anyone brings up a package manager on HN, someone has to mention nix. The reality is that nix is really nice, but it isn't any better than making a new package manager until it has wide adoption, so no one is using it.


steveklabnik 3 days ago


Nix was brought up during the discussion that led to Cargo, but no Windows support is a deal breaker.


pmahoney 3 days ago


I would probably make the same decision, but I hope in the end Cargo is easy to wrap with Nix, which is a breath of fresh air, particularly when needing to mix dependencies that cross language boundaries and share those build recipes with a team.

Previously, I wrote shell scripts and worried whether everyone on the team had rsync installed, or xmlstarlet, or some other less common tool. Now I wrap those scripts in a Nix package that explicitly depends on all those and distribute with confidence. It's fantastic.

Bundler and rubygems, for example, do various things that make good support within Nix rough. Two examples: 1. rubygems has no standard way of declaring dependencies on C libraries; 2. as far as I know there is no way to ask Bundler to resolve dependencies, create a Gemfile.lock, but not install any gems (I realize github gems must be downloaded to see the gemspec...)


steveklabnik 3 days ago


Cargo has the second, and there's a plan for the first.

That said, the reason that you want it to do the installation is that a lockfile is supposed to represent the way to do a build successfully. Without building everything, you can't actually be sure that the lockfile is correct. In theory, it should be...


pmahoney 3 days ago


> reason that you want it to do the installation is that a lockfile is supposed to represent the way to do a build successfully

Sure, and I'd like to do that build within Nix (and someone else might want to do it with another packager), which gives a stronger guarantee than Bundler since it incorporates C library dependencies and more. Anyway, the specifics aren't relevant to this discussion, and it seems you have a grasp of the issues, so carry on!


derefr 3 days ago


Wouldn't it still have been less effort to port Nix to Windows, than to write an entirely new package manager and then port it to every OS?


steveklabnik 3 days ago


If that were the only downside, possibly. I don't really do Windows development, so I can't tell you how difficult porting Nix would be. There's a large advantage to having a packaging system that knows your language well: it's going to have tighter integration than a generic one ever could.


pjmlp 3 days ago


It seems to be only for GNU/Linux systems; what about all the other OSes out there?


BruceM 3 days ago


I've been experimenting with Nix on Mac OS X lately and it works fine. I've heard that it works on FreeBSD as well. The big gap is Windows.

The good news is that you can integrate your language-specific tools with Nix as well, such as has been done for Haskell, node.js and other things. (I'm looking at it so that we can integrate our Dylan stuff with it.)


pjmlp 3 days ago


When these discussions happen on HN, I always see a narrow discussion of Mac OS X, GNU/Linux, Windows and, with luck, *BSD.

But the world of operating systems is so much bigger than the desktop under the desk.

Good work on Dylan by the way.


BruceM 2 days ago


I'd love to have the time and the resources to deal with more OSes. :) 20 years ago, I had to keep stuff running on Solaris and lots of other platforms. About 20 years ago, I still did some work on VMS on actual VAX hardware! It wasn't that long ago that we had the possibility of BeOS, either. Comparatively, we have quite a monoculture (of POSIX) these days, with Windows being the non-POSIX representative.

Maybe unikernels like OpenMirage will help make things interesting.

And thanks! The work on Dylan is a lot of fun and keeps me semi-sane by keeping me busy.


CMCDragonkai 3 days ago


Nix is much more than just a package manager though.


doe88 3 days ago


> Disappointing to see yet another language-specific package management system (Cargo), though.

Coming from Python, I find Cargo very smart and very well thought out so far. It is not feature-heavy, but everything has a very clear and useful purpose. For instance, today I found that if I created a .cargo/config file I could override my dependencies to make Cargo search for projects on my filesystem instead of grabbing them from GitHub; while doing development, that's a big thing, I think.
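The override described in this comment used Cargo's path-override support in .cargo/config; a sketch (the path below is hypothetical):

```toml
# .cargo/config (sketch; the path is hypothetical)
# Tells Cargo to use a local checkout of a dependency
# instead of fetching it from the registry or GitHub.
paths = ["/home/me/src/some-dependency"]
```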


Ygg2 3 days ago


> Disappointing to see yet another language-specific package management system (Cargo), though.

I don't think it is. You need support for Rust modules on various platforms: Linux/Mac/Windows (possibly Android). No single tool works on all those platforms. Cargo does, and it has minimal dependencies.

Not having to juggle three different configurations (CMake, Makefile, etc.) on different platforms is actually pretty great.




(the following are from HN, not the previous link)

" carrja99 1 day ago


I'd have to say the biggest thing that npm has over module systems found in java, ruby, python etc. is the complete isolation of transitive dependencies. It is nice to use two dependencies and not waste a day or two because:

In all the languages you mentioned it becomes a pain because you can only use one version of module C, meaning either module A or B simply will not work until you find a way around it.

reply "
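The isolation carrja99 describes comes from npm nesting each package's own dependencies under it, so A and B can each carry a private copy of C (hypothetical layout and versions):

```text
node_modules/
├── A/
│   └── node_modules/
│       └── C/        (version 1.0.1, private to A)
└── B/
    └── node_modules/
        └── C/        (version 2.0.0, private to B)
```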

" dragonwriter 1 day ago


> Semantic versioning's raison d'être is to prevent these sorts of issues.

Semver may surface them by making it very clear (assuming all involved libraries use semver) where they can occur, but, if you have a package management/loading system that only allows one version of a particular package to be loaded, it obviously can't do anything to prevent the situation where different dependencies rely on incompatible versions of the same underlying library.

Sure, with semver it won't happen if A depends on C v.1.0.1 and B depends on C v.1.4.3 (as A and B can both use C v.1.4.3), but it will still happen if A depends on C v.1.0.1 and B depends on C v.2.0.0.

To actually avoid the problem, you need to isolate dependencies so that they aren't included globally but only into the package, namespace, source file, or other scope where they are required.

reply "
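The compatibility rule dragonwriter describes can be sketched in a few lines. This is a hypothetical helper, not any real semver library's API, and it ignores semver's special treatment of 0.x versions:

```python
# Sketch of semver "caret" compatibility: a dependency on C at
# version X.Y.Z is satisfied by any installed version with the same
# major component that is at least as new.
# (Real semver treats 0.x majors specially; omitted here.)

def parse(version):
    """Turn "1.4.3" into the comparable tuple (1, 4, 3)."""
    return tuple(int(part) for part in version.split("."))

def compatible(required, installed):
    """True if `installed` can satisfy a caret-style dependency on `required`."""
    req, inst = parse(required), parse(installed)
    return req[0] == inst[0] and inst >= req

# A needs C 1.0.1 and B needs C 1.4.3: one copy of C 1.4.3 serves both.
print(compatible("1.0.1", "1.4.3"))
# But if B needs C 2.0.0, no single copy satisfies A and B.
print(compatible("1.0.1", "2.0.0"))
```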

" No1 1 day ago


NPM's way of managing dependencies can still waste a day or two (or more) of your time. For example: get a C object from B, then pass it into A (which bundles its own, different version of C).

Things are even more twisted when you have a half dozen versions of C floating around in your node_modules, and the problem isn't in your code, but a dependency of a dependency.

Another issue I've run into is patching a bug in a module, and then having to figure out how to get that patch into all of the other versions that cropped up in node_modules.

NPM is one way to solve the modules problem, but it's no panacea. "

" k3n 1 day ago


That's great, but it's not without cost. Here, the cost is that you end up with deeply-nested directories (which breaks Jenkins' ability to properly purge the directory after a job). Node modules are also extremely liberal in the number of files they create -- even a "simple" app using just a few common modules could end up with 1k+ extra files. This can produce problems in your IDE, as well as with your source control or continuous delivery systems, among other things.

So, it solves some headaches, and creates others.

reply "

npm shrinkwrap
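npm shrinkwrap pins the exact, transitive versions npm resolved, so later installs reproduce the same tree. A hypothetical npm-shrinkwrap.json fragment in the nested format of that era:

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "A": {
      "version": "1.2.3",
      "dependencies": {
        "C": { "version": "1.0.1" }
      }
    }
  }
}
```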


nobleach 1 day ago


While I like using NuGet packages with C#, I'm not really wild about how they can get magically linked into a project, and then required. I had nunit and fluent assertions become inextricable from a project I was working on even after all the tests were removed. Just a total mind-f*ck. Python with pip is a whole lot better, but I've had some issues finding things there too. Ruby... it depends. Are we talking a Rails Gemfile or "gem install $package"? Conflicting versions can become an issue. Java with Gradle has been pretty cool so far. NPM, as a whole, has just worked. Packages are referenced in ONE place (package.json); I can do an "npm install $package --save" during development and it gets included automatically. "


clintonb11 1 day ago


I agree. pip in Python is great, but has some extra overhead and difficulty, like having to set up virtual environments for each project. NPM by default installs to the local project only, and a quick --save will put it in package.json dependencies (similar to requirements.txt with pip). Node package management is awesome because it is so simple.


rhelmer 1 day ago


Virtual environments are optional though, right? You could have one big virtualenv for all projects, or simply install things into the system path (although I wouldn't recommend either)

reply "


PuercoPop 1 day ago


I should probably just say: clearly you haven't seen Common Lisp's defpackage; modules are actually first-class objects there and are completely decoupled from the file system.

But most importantly, as Barbara Liskov mentions in this video[1], we don't know what a module is exactly or how to use them yet. Which is a specific statement aligned with Alan Kay's famous "We don't know how to design systems so let's not turn it into a religion yet."[2]

tl;dr: 1) Innovation is good. 2) Javascript's module is a half-assed implementation of Common Lisp's defpackage. (Don't get me wrong, it's still way better than Python's abhorrent ninja goto: import.)

[1]: [2]:

reply "


coolsunglasses 1 day ago


You have not used a good module system. Clojure's namespace system for example is really nice.


rafekett 1 day ago


Have you used ML?

reply "

" Man, global system-wide installations that require admin rights by default? That's certainly something! Quite the stark comparison to Node.js and npm, where everything is installed locally into the current directory (under node_modules) by default, and "global" installation is actually a per-user installation. Tricking pip with virtualenv seems to get you pretty close to what you get by default with npm, albeit still somewhat more clunky. But to be fair, most other package managing solutions seem to pale in comparison to npm :-)"


phren0logy 2 days ago


Nice article, but after using Leiningen (the Clojure solution to a similar problem, based on Maven), it's really hard to go back to something like this. I really, really wish there was an equivalent in Python (really, in every language I use). "


arnarbi 2 days ago


I find it best to keep virtual envs completely away from the project (I use which puts them by default in ~/.virtualenvs). A virtualenv is completely machine-specific.

If your project is a package itself (i.e. it has a file), then use that file to specify dependencies. On a new machine I check out a copy, create a virtual env and activate it. Then in the local copy I run "pip install -e .". This installs all the requirements from in the virtualenv, and links the local copy of my project to it as well. Now your package is available in the virtual env, but fully editable.

If your python project is not a package, you can install its dependencies in a virtual env with pip. Then run "pip freeze" to generate a list of all installed packages. Save that to a text file in your repository, e.g. ``requirements.txt``. On a different machine, or a fresh venv, you can then do "pip install -r requirements.txt" to set everything up in one go.

reply "


"pip is vastly superior to easy_install for lots of reasons, and so should generally be used instead."

Unless you are using Windows, as pip doesn't support binary packages. "






"The only debug tools that we found that were better than XDebug or NuSphere were Studio and C#. Perl, Python and Ruby were all a bit rubbish in comparison." --


Elixir "Mix is like gems/bundler/rails {console, server, etc.} in one neat package done right and minus the headaches." [48]

"I find Leiningen a bit bloated with respect to Mix. Mix is faster, lighter and integrated with Elixir. Lein is not. I find Lein a bit slow..." [49]

Elixir ExUnit diffing:


ExUnit will now include diffing output every time a developer writes assert left == right in their tests. For example, the assertion:

assert "fox jumps over the lazy dog" == "brown fox jumps over the dog"

will fail with

[ExUnit diff screenshot]

such that “lazy” in “lhs” will be shown in red to denote it has been removed from “rhs” while “brown” in “rhs” will be shown in green to denote it has been added to the “rhs”. "


ocaml opinions

yodsanklai 4 days ago [-]

I've been working on a side project in OCaml and I can totally relate. I'm an OCaml old-timer and I find the amount of development that has happened recently amazing. There's a lot of ongoing development in the libs and the surrounding tools (more so than in the language). I've spent a lot of time just setting up my environment, and I had to pin several packages to their development version to make things work (jbuilder, merlin, ppx...). Moreover, a lot of these tools lack proper documentation, and it's difficult to get answers to your questions since it's a very small community.


djs55 4 days ago [-]

I'm also an OCaml old timer and I think I can relate too. I believe the recent tooling changes are going in the right direction and will eventually fix several of these problems, for example:

There's a push to remove "optional dependencies", which are the reason why opam dependencies rebuild again and again. For example, in the Mirage project we've been working on this, but it has caused some breakage here and there.

jbuilder (from Jane Street) is excellent: expressive, easy to understand, builds packages extremely quickly, is actively developed, has minimal dependencies and a lovely manual. It takes care of generating boilerplate for other tools like merlin (which, to be honest, I never got around to manually configuring). There's also work to integrate it with utop.

jbuilder also supports building multiple libraries in one big source tree, so we could switch to a package lockfile model: the author uses opam to create a solution to the package constraints and checks in the specific versions known to work; the build clones the dependency sources and jbuilder builds them all simultaneously. I'm keen to try this on one of my larger projects so that "git clone; make" just works, irrespective of where the host OCaml comes from.

PPX syntax extensions depend on specific compiler versions, so when (for example) homebrew updates to OCaml 4.05 you might find that extensions you need have not been ported yet. ocaml-migrate-parsetree aims to fix this problem

There's obviously still plenty of work to do, but I think things are improving!




platz 1 day ago [-]

The only build systems that I'm aware of that are monadic are redo, SCons and Shake-inspired build systems (including Shake itself, Jenga in OCaml, and several Haskell alternatives).

One realistic example (from the original Shake paper) is building a .tar file from the list of files contained in a file. Using Shake we can write the Action:

    -- read the list of files to archive
    contents <- readFileLines "list.txt"
    -- depend on each listed file; the dependency set is computed at
    -- build time from file contents, which is what makes this monadic
    need contents
    -- pack the listed files into the output archive
    cmd "tar -cf" [out] contents

There are at least two aspects I'm aware of that increase the power of Make:

It seems every "applicative" build system contains some mechanism for extending its power. I believe some are strictly less powerful than monadic systems, while others may turn out to be an encoding of monadic rules. However, I think that an explicitly monadic definition provides a clearer foundation.

reply "


" We don't want to remember and execute the build commands by hand (at least I don't). That's why we have build tools:

    bash scripts,

A build tool usually:

    has a list of source files,
    knows how to build each source file,
    keeps a dependency graph to rebuild only files that change,
    keeps a list of directories containing header files,
    keeps a list of external libraries to link to (static/dynamic),
    manages compiler flags (optimization level, warning level),
    knows which files to link into executables and libraries.

Some build tools offer additional features:

    program installation,
    cross platform support,
    cross compilation,
    dependency installation." [50]
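The heart of the list above (keep a dependency graph, rebuild only files that change) can be sketched in a few lines. The file names and the mtime table are hypothetical stand-ins for real filesystem timestamps:

```python
# Minimal sketch of a build tool's incremental rebuild decision:
# a target is stale if any (transitive) input is newer than it.
# Names and timestamps below are hypothetical, for illustration only.

deps = {
    "app": ["main.o", "util.o"],     # link step
    "main.o": ["main.c", "util.h"],  # compile steps
    "util.o": ["util.c", "util.h"],
}

# Fake modification times (larger = newer); sources have no deps entry.
mtime = {"main.c": 5, "util.c": 2, "util.h": 7,
         "main.o": 6, "util.o": 8, "app": 9}

def needs_rebuild(target):
    """True if any direct or transitive input is newer than `target`."""
    for dep in deps.get(target, []):
        if needs_rebuild(dep) or mtime[dep] > mtime[target]:
            return True
    return False

print(needs_rebuild("main.o"))  # util.h (7) is newer than main.o (6)
print(needs_rebuild("util.o"))  # both inputs are older than util.o (8)
print(needs_rebuild("app"))     # stale because main.o is stale
```

A real tool would then rebuild stale targets in dependency order (a topological sort of the graph), which is exactly what the "keeps a dependency graph" bullet refers to.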