Linux: "Argument list too long"

I first ran into this error many years ago whilst moving the archives of a Majordomo mailing list. This was in the days before Google (yes, there was a dark age even for the Internet), so I ended up writing a little command-line loop to pick up each file one by one and mv it.
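For the curious, that workaround amounted to something along these lines (the paths here are hypothetical, purely to illustrate the shape of it):

for f in /var/spool/lists/archive/*; do   # the glob is expanded by the shell itself, no exec involved
    mv "$f" /backup/archive/              # each mv receives exactly one filename
done

Because the loop never hands the whole file list to a single external command, it never trips the kernel's argument-length limit.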

I never gave this much thought thereafter, but recently I ran across a colleague who had encountered this problem in one of his mail spools, and I decided to get to the bottom of this irritating and, to me, obscure error.
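For anyone who has never hit it, the error looks like this (the directory simply contains more files than the shell can pass to a single command):

$ rm *
bash: /bin/rm: Argument list too long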

The solution is simple enough: use something along the lines of:

find ./ -name '*' -exec rm {} \;

where you replace rm with whatever command is giving the problem (and note the quotes around the *, which matter for reasons we shall get to shortly). The only problem with this is that he was already using find and it was still failing. An alternative formulation looks like this:

find ./ -name '*' | xargs rm
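A side note: a plain pipe like this will mangle filenames containing spaces or newlines. GNU find and xargs can pass NUL-terminated names instead, which is safe for any filename:

find ./ -name '*' -print0 | xargs -0 rm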

This version worked, but then somebody else commented that the man page for find must be wrong, because it says that -exec xx {} \; operates on each filename as it is returned, so find should have operated as expected without the pipe through xargs. (The man page, as we shall see, is quite right.)

Now, I just had to get to the bottom of this. In doing so I discovered the kernel parameter MAX_ARG_PAGES. Until quite recently on Linux, this value was set to 32 when the kernel was built. But 32 what? Well, 32 pages of 4096 bytes each, or 131072 bytes, and any command whose argument list exceeds that is refused by the kernel with the "Argument list too long" error. So, how is find exhausting this space?
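You can ask the system for the limit directly; on a kernel built with the old 32-page constant, getconf reports exactly the figure worked out above:

$ getconf ARG_MAX
131072

(From kernel 2.6.23 onwards the limit is sized dynamically from the stack limit, so a modern box will report a rather larger number.)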

The answer lies not in find at all but in the unquoted *. When one types find -name * -exec rm {} \;, the shell expands the * into the name of every file in the pwd before find is ever invoked; with enough files in the directory, the resulting command line blows past the limit and the kernel refuses to start find in the first place. It is this implicit expansion that is causing find to choke.
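Seeing is believing: echo shows exactly what find would receive in each case. Assume the pwd contains just a.txt and b.txt:

$ echo -name *
-name a.txt b.txt
$ echo -name '*'
-name *

Unquoted, find is handed the filenames themselves (with two files it complains about its syntax; with a hundred thousand it never even gets started). Quoted, it receives the pattern intact and does its own matching.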

The answer to this seems to be:

find ./ -name '*' -prune -exec rm {} \;

which, with the pattern safely quoted away from the shell, limits find's activity to the named argument directory, in this case the assumed "./".
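One last note: modern implementations of find also offer the + terminator in place of \;, which batches as many filenames into each invocation as will fit under ARG_MAX, giving xargs-like efficiency without the pipe:

find ./ -name '*' -prune -exec rm {} +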