Argument list too long

Revision as of 08:58, 11 July 2023


...in a Linux shell, often happens when you use a * somewhere in your command.


The actual reason is a little lower level: shells expand shell globs before executing a command, so e.g. cp * /backup/ might expand to a long list of files and/or very long filenames.

Either way, this may create a very large string to be handed to the exec() call.
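A quick way to see that expansion in action (a sketch with made-up file names):

```shell
# The shell, not the command, performs glob expansion: by the time echo
# runs, it has already received each matching name as a separate argument.
demo=$(mktemp -d) && cd "$demo"
touch a.txt b.txt c.txt
echo *.txt    # the shell rewrites this into: echo a.txt b.txt c.txt
              # prints: a.txt b.txt c.txt
```

The same rewriting happens for cp * /backup/; cp never sees the * at all, only the resulting (possibly huge) list of names.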


You get this error when that argument list is too long for the chunk of kernel memory reserved for passing such strings - which is hard-coded in the kernel (MAX_ARG_PAGES, usually something like 128KB).

You can argue it's a design flaw, or that it's a sensible guard against a self-DoS, but either way, that limit is in place.
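You can inspect the effective limit on your own system; note that on many modern Linux kernels it is considerably larger than 128KB. A sketch, assuming getconf and GNU xargs are available:

```shell
# ARG_MAX is the POSIX name for the limit on the combined size of the
# argument list and environment handed to exec().
getconf ARG_MAX    # e.g. 2097152 on many modern Linux systems

# GNU xargs can report the limits it will actually respect:
xargs --show-limits </dev/null
```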


There are various workable solutions:

  • if you meant 'everything in a directory', then you can often specify the directory and a flag to use recursion
  • if you're being selective, then find may be useful, and it allows doing things streaming-style, e.g.
find . -name '*.txt' -print0 | xargs -0 echo (See also find and xargs)
  • Recompiling the kernel with a larger MAX_ARG_PAGES - of course, you don't know how much you'll need, and this memory is permanently inaccessible for anything else, so just throwing a huge number at it is not ideal
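To see why the find-and-xargs route sidesteps the limit: xargs batches its input into as many command invocations as needed, each under the kernel limit. A sketch with hypothetical files, using -n to force tiny batches so the splitting is visible:

```shell
demo=$(mktemp -d) && cd "$demo"
touch file_1.txt file_2.txt file_3.txt file_4.txt file_5.txt
# -n 2 caps each echo invocation at two arguments, so the five files are
# handled in three separate runs of echo. Without -n, xargs packs as many
# arguments as fit under the system limit per invocation.
find . -name '*.txt' -print0 | xargs -0 -n 2 echo
```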


Note

  • Most of these split the set of files into smaller sets, and execute the command once for each of those sets. In some cases this significantly alters what the overall command does. You may want to think about that, and read up on xargs and its --replace option.
  • for filename in `ls`; do echo $filename; done is not a solution, and it is not at all safe against special characters in filenames.
ls | while read filename ; do echo "$filename"; done (specifically for bourne-type shells) works better, but I find it harder to remember exactly why, so I use find+xargs instead.
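As a concrete sketch of the find+xargs approach with --replace (-I here, its short form), which runs the command once per file and keeps names containing spaces intact (the file names below are made up):

```shell
demo=$(mktemp -d) && cd "$demo"
touch 'plain.txt' 'name with spaces.txt'
# -print0 / -0 pass names NUL-separated, so whitespace in names cannot
# split one name into several arguments; -I {} substitutes exactly one
# file name per echo invocation.
find . -name '*.txt' -print0 | xargs -0 -I {} echo "processing: {}"
```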