Hi GNU make maintainers,
I began work on some additions and refinements to make.texi fourteen months ago
but got
distracted. The version I started from has
@set EDITION 0.70
@set RCSID $Id: make.texi,v 1.45 2006/04/01 06:36:40 psmith Exp $
What's the most recent edition, and where can I find an up-to-date copy ?
> I've been struggling for some time now with how to write rules for
> commands that generate multiple targets
A familiar and annoying problem: make really believes in commands that
generate just one (relevant) file, and doesn't fit so well with ones
that generate several.
> The next thing to tr
Correcting myself: I missed out the $@ in
y.tab.h y.tab.c y.output: yacc.ts
tar xf $< -m $@
which ensures that each target is the only thing extracted when the
rule to generate it is executed,
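For anyone finding this in the archives, here is a minimal sketch of the
whole idiom (file contents invented; I'm assuming GNU tar's x, m and f
options): the rule that runs yacc archives everything it generated, and
the rule for each individual target extracts only $@, so only $@'s
timestamp gets refreshed.

```shell
# Sketch of the timestamp-archive idiom for multi-output commands.
tmp=$(mktemp -d) && cd "$tmp"
printf 'int yyparse(void);\n' > y.tab.c
printf '#define YYEOF 0\n'    > y.tab.h
tar cf yacc.ts y.tab.c y.tab.h   # done once, by the rule that ran yacc
rm y.tab.c y.tab.h
tar xmf yacc.ts y.tab.c          # the y.tab.c rule: tar xf $< -m $@
ls
```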
Eddy.
___
Bug-make mailing list
Bug-m
> The only downsides to this I see are:
>
> 1: The duplicate file storage. Probably not a big deal.
> 2: The extra processing time to archive and extract the files. Again,
> probably not a big deal.
> 3: The "why'd they do that?" questions coming from people unfamiliar
> with the technique.
al
> So there was a bug before 3.81, since they issued an Error in this case. And
> the bug was fixed in 3.81. Is that what you are saying?
I think this misses the objection.
> include done
> => is looking for a file named done and fails since no file named done.
>
> -include done
> => is building a
> the MSVC preprocessor doesn't support conditional compilation
> inside a macro expansion.
I take it that's
FUNCTION_LIKE_MACRO(early, args,
#ifdef SYMBOL
symbol,
#else
token,
#endif
remaining);
which is indeed not supported.
> (While I don't think the sysV syntax is *great*, I personally think
> it's a better choice than overloading the meaning of parentheses.)
+1
It also avoids the problem of having to make sense of nesting, e.g.
>>(b1 (c1 c2)): d1
Eddy.
___
>>y.tab.h y.tab.c y.output: yacc.ts
I don't actually see that y.output serves any role in this; simply
remove every reference to it and your example should be clearer.
>>y.tab.o: y.tab.c y.tab.h
I don't understand .INTERMEDIATE well enough to know why this chain
fails to lead to y.tab.o.
> It's also unnecessary - you don't need a rule for %.d at all. You can
> just generate the dependencies as a side-effect of compilation using
> -MMD or similar.
Well, if a .d file gets deleted while its .o file exists, you do need
to regenerate it - or regenerate the .o (which may cause wasteful
recompilation).
> Delete a "clean-depend" rule on sight,
I cannot agree.
If I write a rule to make something, I also write a rule to get rid of
it. It's just basic hygiene ...
> or rename it to the more accurate "break-future-builds".
If you have a sensible rule to generate .d files when needed, you
haven't broken future builds.
> If an update to new source code, that would compile just fine in a clean
> checkout, breaks the incremental build, the build system is erroneous.
I would like to agree with you, but this constraint is, in general,
incompatible with incremental building, which is too good a benefit to
throw away.
>>> The fix for that has been documented for years on Paul's webpage, and
>>> is most easily done now with gcc's -MP option.
>>
>> URL ?
> http://make.paulandlesley.org/autodep.html#norule
Thanks.
> (That explains the problem and what needs to be put in the dependency
> files to solve it, and th
>Do you have any idea on how to achieve what I want?
$(EXPORTED_HDRS): $(EXPORT_HDR_PATH)/%: ../include/%
with the same command as you're using.
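For concreteness, a self-contained sketch of that static pattern rule
(names invented; include/ stands in for ../include/, and .RECIPEPREFIX
- a GNU make 3.82+ feature - is only there to avoid literal tabs in
the here-doc):

```shell
tmp=$(mktemp -d) && cd "$tmp"
mkdir include export
echo '/* a */' > include/a.h
echo '/* b */' > include/b.h
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
EXPORT_HDR_PATH := export
EXPORTED_HDRS := export/a.h export/b.h
all: $(EXPORTED_HDRS)
# static pattern rule: each exported header comes from its include/ twin
$(EXPORTED_HDRS): $(EXPORT_HDR_PATH)/%: include/%
> cp $< $@
EOF
make
```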
Eddy.
___
> I'll think about it and check my ISO C 1989 standard (I can't remember
> whether it supports %p) when I get back to work on Tuesday.
ANSI C '89 does specify the %p formatter (taking a void*).
> Finally, it seems that some of these changes are meant to avoid variable
> names conflicting with fun
> You might read: http://make.mad-scientist.net/autodep.html
Paul: do you have any plans to integrate your pages into the manual ?
What's currently there falls some way short of best practice; I have a
more sweeping set of changes than Florian's, that I've put on hold
until I've got time to look
>> > > char *alloca ();
>> Is this the right return type though? Wouldn't it be void*?
> I have no idea. You will see that I didn't touch that line in the
> patch, precisely because I don't know what is correct and for which
> platform(s).
In GNU/Linux's gcc/glibc, it is indeed void*, but it'
> Probably there should be an effort to switch to heap for anything that
> might get large and reserve alloca() usage just for things we know for a
> fact will not get too large, but that hasn't been done.
... and anywhere you use a scanf variant, glibc is also using
alloca(), without knowing any bound on how much it will need.
> previous_var:
> echo $(VAR) > previous_var
> .PHONY: previous_var
I suggest you eliminate this .PHONY - previous_var is a real file on
disk, so not a phony target. There might be a case for it to be
declared .PHONY in an *else* clause, when PREVIOUS_VAR agrees with
VAR.
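Here's a sketch of what I mean (my reading of the suggestion, with
invented names; .RECIPEPREFIX just avoids literal tabs): previous_var
is forced, via .PHONY, only when the recorded value disagrees with VAR.

```shell
tmp=$(mktemp -d) && cd "$tmp"
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
VAR ?= hello
PREVIOUS_VAR := $(shell cat previous_var 2>/dev/null)
ifneq ($(PREVIOUS_VAR),$(VAR))
# the recorded value is stale: force the rewrite
.PHONY: previous_var
endif
previous_var:
> echo $(VAR) > previous_var
EOF
make VAR=one    # file absent: written
make VAR=one    # values agree: 'previous_var' is up to date
make VAR=two    # value changed: rewritten
cat previous_var
```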
> /source/test $
> The .PHONY forces make to update the file, even though it has no
> dependencies and would otherwise be considered up-to-date.
Then you should force the target instead; this is an abuse of .PHONY.
previous_var: FORCE
echo $(VAR) > previous_var
FORCE: # no prerequisites
# no rule
will do
>>> Make can redirect every parallelly issued shell's output to a
>>> temporary file, and output the stored output serially, as if in a
>>> serial make.
+1 for wanting this as a make feature.
We have a hack in some of our makefiles to implement essentially
the above. For reference, here's th
>>> The shell wrapper buffers the recipe output and then grabs a semaphore
>>> before writing the output to its stdout. If another recipe has
>>> completed and is in the process of outputting to the stdout then it
>>> has to wait a few microseconds.
>> The use of semaphore may impair performance
>>> This seems like quite an extreme example. stdout is line buffered by
>>> default,
on half-way decent systems - and even then, I'm not sure, it might be
limited to when writing to a TTY.
>> I use "make -j 4" to build and test gcc, the situation above is very common.
> Then it means you're get
> If my guess is not wrong, the semaphore safeguard the consistency of
> output of one command, not the order of commands.
well, with -j, commands are being run concurrently, so there *isn't* a
strict ordering of commands to "safeguard", although output shall be
delivered in roughly the order of completion.
> 2x is too much. 1.5x has been the best in my experience, any more than that
> and you're losing too much CPU to scheduling overhead instead of real work.
> Any less and you're giving up too much in idle or I/O time.
This depends a bit on whether you're using icecc or some similar
distributed c
> the output I see from make is after all macro substitutions have been
> made, which can make it virtually impossible
> to recognize as far as where it came from in the original source
This, however, is an issue with how the make file is written.
It sounds like its recipes for commands are of f
> I do a lot of cross-compilation, where the platform requires a set
> of CFLAGS to always be present, such as -march=xxx and
> -isysroot= ...
Suggestion:
Set
CROSS = -march=xxx -isysroot=yyy
where you're currently trying to hack CFLAGS, and have the makefiles
use $(CROSS) alongside $(CFLAGS).
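A sketch of the arrangement (flag values invented; -DCROSS_DEMO stands
in for the real -march/-isysroot flags, and .RECIPEPREFIX just avoids
literal tabs):

```shell
tmp=$(mktemp -d) && cd "$tmp"
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
# Mandatory platform flags live in their own variable ...
CROSS = -DCROSS_DEMO
# ... so overriding CFLAGS on the command line cannot lose them.
CFLAGS = -O2
%.o: %.c
> $(CC) $(CROSS) $(CFLAGS) -c $< -o $@
EOF
echo 'int main(void){return 0;}' > t.c
make t.o CFLAGS=-O0   # CFLAGS overridden, CROSS still applied
```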
> One wants to have a big build to make the most use of the parallelism
> that's available but also to deal with dependencies that span
> components or directories. A hierarchy of makefiles that run each
> other recursively can't represent these dependencies properly and also
> are not terribly go
> Roland, I think you overstate the seriousness of the problem.
... and I think you are understating it.
> There are not many makefiles that both define multiple pattern rules
> and rely on their order for selection.
Mine do. Not that I realized it until things didn't work as I
expected, afte
> Using a directory as a normal prerequisite is almost never what you
> want.
>
> You have two choices.
Three: wherever you currently declare a dependency on a directory,
instead declare a dependency on a .exists file *in* that directory.
Then have the rule for a .exists file create its directory and then
touch the .exists file itself.
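A sketch of the sentinel approach (names invented; .PRECIOUS stops
make deleting the sentinel as an intermediate file, and .RECIPEPREFIX
just avoids literal tabs):

```shell
tmp=$(mktemp -d) && cd "$tmp"
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
# depend on a sentinel file *in* the directory, not on the directory:
# adding other files later never changes the sentinel's timestamp
out/result: out/.exists
> echo done > $@
.PRECIOUS: %/.exists
%/.exists:
> mkdir -p $(@D)
> touch $@
EOF
make out/result
make out/result   # still up to date, however the directory changes
```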
> it appears from running make with the debug option that the
> top level make does not see the object file created by the lower level
> make in the time when it checks the dependencies for the library.
sounds a *lot* like an issue with make's caching of stat information;
if the upper make has looked at the directory before the lower make
wrote the object file, the cached answer can hide it.
> Every time the subject of cache comes up its because they are always
> wrong.
well, this *is* a *bug* list - every time anything comes up here, it's
because someone's found a bug in it ! The vast majority of the time,
when the cache works just fine and makes builds faster, that is (quite
properly) not news here.
> $(EXEC-FILE): $(SRCS:%.o=%.mod)
I think you mean $(SRCS:%.mod=%.o)
Eddy.
___
> $ echo 'a b:; echo $@' | make -f - "a b"
> echo a b
> a b
and what happens if you
echo 'a b:; echo $@' | make -f - a
? If that doesn't echo a, then you've broken all rules with more than
one target ...
I expected your escaping to require
echo 'a\\ b:; echo $@' | make -f - 'a b'
or
echo '
> This change is going to be a nightmare for us users.
Some of us noticed that the documentation made no promise, so always
coded to not assume sorting ...
> Things that have worked "forever" now break.
True. Then again, I'm looking forward to this as an opportunity to
catch bugs in our make files.
> This patch adds the option -J / --auto-jobs, which uses the
> sysconf(_SC_NPROCESSORS_ONLN) function to decide the number of job
> slots to use.
This sounds like a highly useful feature to me. I and my colleagues
use diverse clumsy rules of thumb to decide what value to give to -j
and -l; having make work that out for itself would be welcome.
> I'm trying to find an easy way to detect inside of a makefile if
> we're running as a parallel make or not.
See
5.7.3 Communicating Options to a Sub-`make'
and, particularly, the MAKEFLAGS variable.
Not sure of details, but I expect any -j option to appear in it.
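A quick way to see for yourself (the exact formatting of MAKEFLAGS
varies between make versions, so treat this as a sketch; .RECIPEPREFIX
just avoids literal tabs):

```shell
tmp=$(mktemp -d) && cd "$tmp"
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
all:
> @echo "flags: $(MAKEFLAGS)"
EOF
make -j2   # the -j (and/or jobserver) options show up in MAKEFLAGS
```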
Eddy.
__
> How about if we introduce a variable that can be set for specific
> targets (using the target-specific variable mechanism). Targets that
> set this variable to some predefined value will not have their output
> redirected, so their output will go to the screen.
IIUC, the proposal is to make the
> Perhaps it could even be mentioned in this chapter that "all" is not a
> special target:
There are quite a lot of other target names we could mention as not
being special targets !
I think the thing you need to know, to understand what *is*
documented, is that the first rule read, when parsing
> So, the circular dependency issue is because of this:
>
>> %.eps: %.pdf
>>
>> %.eps %.pdf: %.dat
Technically, not a circular dependency: in the directed graph of
dependencies, there is no cycle. If we ignore the directedness of
some edges, we get a cycle; but the edges *are* directed, and we must
respect that direction.
> Not really sure why, but the - on the beginning of the "-unknown-exe"
> seems to cause the error to be (ignored).
The command part of a make rule can optionally begin with various
characters that modify how make runs the command or responds to the
results of running it; a command line starting with - tells make to
carry on regardless of that command's exit status, which is why the
error is ignored.
I like what I see ;-)
I haven't looked at the code, but one warning: it's important to not
assume that blah/../foo/ is equivalent to foo/ - blah might be a
symlink.
> *Justification (trimpath)*: trimpath can be used to shorten the
> input strings to compilers/linkers, as well as improve readability.
> Pretty weak. If a few more include paths were added to the project it would
> still break, regardless of your patch.
When you have thousands of object files, shortening the name of each
by several bytes when invoking ar can make room for really quite a lot
more files before you run up against the command-line length limit.
I got lost in your perl script, so may have missed something; but it
*looks* as if what's happening is that your .d.cmd file records the
prior path of what was $< on your previous run; so the .o file depends
on that (as well as the newly renamed file that's $<) and this is the
problem, not the fact
> GNU make uses the standard C runtime function qsort(3) to perform its
> sorting, with a comparison function of the standard C runtime function
> strcmp().
...
> The builtin sort function DOES sort. It may not sort the way you would
> prefer, but it sorts in a standard, repeatable, well-defined way.
> I have limited sympathy for this type of situation, "multifile
> compilation" is against the general idea of make.
On the other hand, running ar once per .o file costs time (at least)
quadratic in the number of files, when I tried it. So "multifile
archiving" is perfectly standard - and it's usual practice.
>> The other thing I wonder about is the hardcoding of ASCII colorized
>> strings and the start/stop character strings (\033[...). Are there
>> other methods of colorizing?
>
> Yes.
Indeed - most obviously, anyone running a build-'bot that reports its
build logs via the web (a common solution) needs something other than
raw terminal escape sequences.
>> The contents of these files don't seem so different to me that they
>> couldn't be consolidated, perhaps with some command-line overrides
>> or similar. Or, maybe some of them are just not needed; do we
>> really have to be able to build with nmake and smake?
>
> How about moving them to a subdirectory?
>> In the presence of a version control system, even one as basic as CVS,
>> deletion isn't fundamentally worse than leaving them to bit-rot out of
>> [sight] - they can always be recovered from the version-control system
>
> Not for people who only get the release tarballs.
Good point - didn't think of that.
> Sometimes, it is possible that a code generator replaces the
> existing files in the code base with the same content. It might
> be a good option to enable content checking before make
> rebuilds the replaced file (with the same content) again.
Another approach: have the code generator run on a scratch copy, then
compare and only move the result over the real file when the content
has actually changed.
> Go and shoot the moron who wrote that script.
Easy to say, but no use to someone who needs to deploy a package whose
build infrastructure includes a badly-written script. If the author of
that script has been shot, there's probably no-one alive who understands
how to build the package. Quite p
> I think changing gmake's behavior to match cpp's will eliminate the
> need for a lot of hacky farting around to get non-recursive systems
> working smoothly.
I can sympathise. The present behaviour effectively requires one to cd
to (or pass a -C for) the directory of a make file in order to have
its include directives find their files.
> ifndef TOP
>
> include ../Makefile
>
> else
>
> SUBDIRS =
> TARGETS =
> SRCS =
>
> endif
>
> All of the complexity you allude to can be safely buried in the TOP-level
> Makefiles,
I can't help but think this is an entirely upside-down approach. You
appear to be expecting context's make-file
> ... someone, not sure who, has worked out how to solve this problem
> within the existing capabilities of GNU make. ... it would still be
> best if GNU make had (optional) native support.
... or, at the very least, its documentation included a description of
how to solve the problem within the existing capabilities.
> I don't think make can be expected to handle spaces in filenames
> because by design it relies on many other tools and scripts that
> cannot handle them or handle them in very idiosyncratic ways.
> You're in for a lot of trouble regardless of what make itself supports.
> Most Unix scripters know
> I have always wondered why we did not pick another
> character than ascii 32 to represent space in file names.
That would involve the disk driver - or all software that creates files
- mapping the space the user typed to the chosen codepoint. There are
then a whole load of other places software
> ... or VMS shell (whatever that is) ...
it was called DCL (Digital Command Language, I suspect) and the one
feature I remember clearly is its help. If you typed "help" at the
prompt, it was actually *helpful* in response.
I have not seen that since.
Eddy.
_
Eli:
>> cc fred.c -c -o fred.o
>> cc bob.c -c -o bob.o
>> error on line 20 -X
>> error on line 30 -
>> error on line 330 -
>> makefile:342: recipe for target 'fred.o' failed
>> makefile:350: recipe for target 'bob.o' failed
> You need to look in both anyway.
That is true of the very s
>> I think having this facility built into make is a win, especially as
>> parallel builds become predominant. I would be even more happy about it
>> if we can get it to the point where it can be enabled by default, and
>> users don't even have to worry about it.
> I agree with Paul. This is some
>> > In any case, there's no reason to use any of this just to check
>> > whether a file by a specific name exists in a directory.
>>
>> Are you saying you'd just walk along PATH doing a stat() or access()
>> or equivalent on /sh.exe?
>
> Exactly. Right now, the code already walks along PATH, and
>> I think "integer" is meant instead of "integral".
> Eg C99 uses "integral" as an adjective meaning "of integers",
How about using plain language and calling it a "whole number"
instead of using jargon ?
Eddy.
___
>> How about using plain language and calling it a "whole number"
>> instead of using jargon ?
>
> How about not catering to the lowest common denominator and devolving
> to baby-speech for fear that someone may be intimidated by a
> dictionary ?
Saying what you mean in the plainest terms possible is a courtesy to
the reader, not dumbing down.
> Anyway, I'll stop here on that particular bikeshed. I was just a bit
> disappointed with the tendency to want to dumb down everything...
... and I was reacting primarily against the apparently knee-jerk
reaction against plain language as automatically - ipso facto, by virtue
of being plain language - dumbed down.
> I've never understood why someone would use $(shell ...) in a recipe...
> I mean, the recipe will be run in the shell!!
I remember we once had a library where the command-line to the archiver
was too long (about a quarter megabyte, IIRC). We worked round this by
having a temporary scratch dir,
> No, that wouldn't work. It's not the individual command (between
> semicolons) that's too long, the problem is that make can't invoke the
> shell itself because the command line + environment is too large. The
> only way to work around this limitation is to avoid invoking a single
> command tha
> The idea of diffing 2 builds is truly a cool one - especially when
> they're huge - but I'd rather it was done according to keys or other
> factors e.g. target name.
This sounds like a case for wrapping all rules in a generic output
controller that can be configured (for an incremental build, ju
>> Then commit all the log files to git and use git show to find out which
>> of them have changed since the last build. (I trust you can all work
>> out the equivalent steps for *your* preferred SCM system.) This only
>> has to happen on the server that builds from clean on a regular basis.
>
>
> If a variable is set in a parent makefile, and a child makefile is
> included, is the variable also set in the child makefile. And the other way
> around, when a variable is set in the child, is it also still usable in the
> parent, or does it fall out of scope.
>
> I have figured this out with a
> It's the long one (subdir-dtc) where the problem exists.
Obvious first step: configure quiet-command to actually show the
command, or remove this wrapping from the command, so that you can see
what the sub-make actually gets invoked with. We can guess, but it may
help to be entirely clear ! On
> I'm guessing this regression is most likely to have been caused during the
> consolidation of output generators into output.c perhaps something c99 that HP
> haven't implemented correctly?
Which tool-chain did you use to compile make ?
If you used the HP-UX native compiler, then your suggestion
> Moving the mkdir into the recipe for depend.mk does not change the behavior;
> however, removing objdir from the depends of depend.mk does.
Generally, making things depend on the directory in which they exist is
A Bad Idea. Creating any file in the directory changes the directory,
hence forces a rebuild of everything that claims the directory as a
prerequisite.
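The usual cure, for completeness, is GNU make's order-only
prerequisites - everything to the right of the |: the directory must
exist before the recipe runs, but its timestamp is thereafter ignored.
A sketch (names invented; .RECIPEPREFIX just avoids literal tabs):

```shell
tmp=$(mktemp -d) && cd "$tmp"
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
# order-only prerequisite: objdir must exist before the recipe runs,
# but its ever-changing timestamp never makes depend.mk out of date
objdir/depend.mk: | objdir
> touch $@
objdir:
> mkdir -p $@
EOF
make objdir/depend.mk
touch objdir/something-else   # the directory's mtime changes ...
make objdir/depend.mk         # ... but the target stays up to date
```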
> Note that in Unix, vsnprintf() returns the TOTAL number of chars
> needed (add 1 for the null).
This is not correct. The buffer size (that you pass in) is the total
number of bytes available (and the most the function shall use,
including the terminator); but the *return* is the strlen() that would
have resulted given unlimited space - it does not count the terminator.
> I'm really not excited about the prospect of continuing to add new
> project files every year for each new version of Visual Studio. Isn't
> there any sort of backward-compatibility that allows the older files to
> work in newer Visual Studio releases?
Don't hold your breath - it might be there
> Plainly _AA_ and _$(abspath AA)_ are the same file, but make doesn't think so.
> Make seems just to use string matching (other relative paths to the same file
> are seen as different files (_subdir/../A.. for instance)).
That is correct: make targets are strings. Distinct strings are treated
as distinct targets.
> On Mon, Jan 5, 2015 at 1:23 PM, Paul Smith wrote:
>> I wrote some blog posts about eval and other metaprogramming techniques
>> in make that you might find interesting:
to which Norbert Thiebaud replied:
> For a real-life large scale use and abuse of these techniques see:
and, for a descriptio
> After reading over your mail a couple of times, I realized that I hadn't
> thought things through very well. In fact, rather than saying "hash
> instead of time", I should have said "optional additional hash check
> when timestamp has changed".
Even so, I'm unclear about why "hash" is the thing
> Luckily, if you are building C or C++ code someone has already done all
> the necessary work for you. I recommend you investigate the distcc
> package: https://code.google.com/p/distcc/
Likewise the icecream project's icecc:
https://en.opensuse.org/Icecream
https://github.com/icecc/icecream
I'
> We ran into similar things in Linux due to NFS, autofs, and NFS via
> apparmor not scaling well when 100 compilers are trying to search for
> header files through a long number of sourcedirs.
I suppose this can be mitigated by using #include "path/to/file.h" in
source, for paths relative to a small set of -I roots.
> I am a new beginner with GNU Make. In some cases, I feel it would
> help if Make could print the executing shell command even when suppressed,
> for example, to identify problems more easily.
>
> Since it seems that Make doesn't have this flag, I tried to change code.
> the code diff is in below and happ
> 4. Running under gdb I was able to get a stack trace showing the eventual seg
> fault occurring in xcalloc(). The trace is below.
This almost certainly means that, before this point, something over-ran
the end of a memory buffer and trampled malloc's data structures. So
the first line of enquiry is to find what did that over-running.
> ifeq ($(shell if [ -f /my/file ] ; then echo 0; else echo 1; fi), 0)
Why not:
ifneq ($(wildcard /my/file),)
which tests existence just as well. Of course, it does also match if
/my/file is a directory or other non-file, but it's almost always
sufficient, unless there's some actual practical reason to care about
the difference.
> My problem is that Fedora have chosen to ship a broken version of
> "ar" that always sets the timestamp of all archive members to 1970-01-01
> unless one invokes it with the U flag.
For the sake of anyone else curious about what that means, here's what
man ar told me when I looked up the detail
>> a.gz b.gz:
>> touch a
>> gzip a
>> touch b
>> gzip b
Paul explained (in detail):
> multiple explicit targets are just a shorthand way of writing multiple rules,
... and if you actually want to do two things at the same time, you can
do it, via a phony intermediate target.
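A sketch of the phony-intermediate arrangement (names invented; note
the phony prerequisite does mean the pair is regenerated on every run
- a real sentinel file avoids that, at the cost of more plumbing.
.RECIPEPREFIX just avoids literal tabs):

```shell
tmp=$(mktemp -d) && cd "$tmp"
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
all: a.gz b.gz
# both real outputs hang off one step whose recipe makes them together
a.gz b.gz: gzip-both ;
.PHONY: gzip-both
gzip-both:
> printf a > a && printf b > b
> gzip -f a b
EOF
make
ls
```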
> I was thinking to build the objects but not the dependencies. We do a lot
> of one-time only builds, where we don't need dependencies at all.
In that case, leave out the include of any .d whose .o doesn't exist;
you already know you need to build the .o, so you don't need to know
its dependencies yet.
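A sketch of that selective include (names invented; .RECIPEPREFIX just
avoids literal tabs):

```shell
tmp=$(mktemp -d) && cd "$tmp"
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
OBJS := main.o util.o
# only include the .d of each .o that already exists; a missing .o
# must be rebuilt regardless, so its dependency list isn't needed yet
DFILES := $(patsubst %.o,%.d,$(wildcard $(OBJS)))
-include $(DFILES)
show:
> @echo "including: $(DFILES)"
EOF
touch main.o main.d util.d   # util.o is absent, so util.d is skipped
make show
```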
> There's an inherent race condition in the way "make" handles timestamps. If
> you save a newer version of a file while make is running and is just compiling
> that input file, the change won't be picked up by subsequent runs of "make",
> and you'll be left with an out-of-date binary.
The traditi
> It might be interesting to have a make flag that would reverse the order
> in which dependencies are considered,
... or indeed randomise the order, so that repeated testing would
routinely catch any un-noticed dependencies.
Does anything actually specify that the order of prerequisites in a
rule is significant ?
Pooya Taherkhani wrote:
>> To prepare to use make, you must write a file called the makefile
>> that describes the relationships among files in your program, and the
>> states the commands for updating each file.
> I think "..., and the states the commands for ..." should be "..., and
> states the
David Boyce observed:
> I was unpleasantly surprised to learn while following this discussion
> that "Except by explicit request, make exports a variable only if it
> is either defined in the environment initially _or set on the command
> line_".
... and I was surprised to find (by experiment with
On Fri, 2016-09-23 at 13:24 +, Edward Welbourne wrote:
>> This does seem like an unsound choice, for the reasons David gave,
Paul Smith replied:
> Perhaps but as discussed, it's been that way in GNU make (and other
> versions of make) forever and is required by the POSIX standard.
Michael Stapelberg's patch:
diff --git a/read.c b/read.c
index b870aa8..3c67e55 100644
--- a/read.c
+++ b/read.c
@@ -1122,6 +1122,8 @@ eval (struct ebuffer *ebuf, int set_default)
one of the most common bugs found in makefiles... */
if (cmd_prefix == '\t' && strneq (
anonymous reported:
> This is a regression from GNU make 4.0. The make target
>
> clean:
> rm lib/*.{o,a}
>
> fails to remove files lib/foo.o, lib/bar.o, and lib/libfoobar.a because
> lib/*.{o,a} does not glob lib/*.o and lib/*.a, as is its intention.
This is a rule, which make hands off to the shell; {o,a} brace
expansion is a bash feature, not part of POSIX sh, so whether it works
depends on which shell ends up running the recipe.
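If bash's behaviour is what's wanted, the make-file can say so
explicitly; a sketch (names invented, /bin/bash assumed present;
.RECIPEPREFIX just avoids literal tabs):

```shell
tmp=$(mktemp -d) && cd "$tmp"
mkdir lib && touch lib/foo.o lib/bar.o lib/libfoobar.a
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
# {o,a} brace expansion is a bash-ism, not POSIX sh, so ask for bash
# explicitly rather than relying on what /bin/sh happens to be
SHELL := /bin/bash
clean:
> rm lib/*.{o,a}
EOF
make clean
ls lib   # empty now
```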
> When one wants to check that it is really GNU make that is run for a
> given Makefile
The only valid use case I can think of for this is where some particular
feature of GNU make is needed. Generally, it is better to test *that
feature* rather than the version of make in use. If another projec
> If you really need some features only supported by GNU Make, why not
> rename this makefile to "GNUmakfile"?
... which, furthermore, ensures that - if the make-file is picked up by
some other make that thinks it knows what to do with a GNU make-file -
the reader is primed with the knowledge that GNU make is required.
wu (7 March 2017 08:55):
> the software reports the following errors:
>
> ... relocation ... against `.rodata' can not be used when making a shared
> object; recompile with -fPIC
I recommend you do as it says; change your CFLAGS or CXXFLAGS to use
-fPIC (replacing -fpic, if present). If the mis
Jan Ziak (12 April 2017 20:07)
> Is this a make-4.2.1 bug, or a gcc-6.3.0 bug? The line numbers in the garbled
> error messages, in this case 79 and 84, are correct.
Given its sensitivity to the specific flags passed to gcc, it's natural
to suppose it's a gcc bug. You can test by running gcc manually,
outside make, with the same command line.
> To reproduce:
>
> $ echo -e 'all:\n\techo $(value MAKEFILE_LIST)' > /tmp/foo\$bar.mk
> $ ./make -f '/tmp/foo$bar.mk'
> echo /tmp/fooar.mk
> /tmp/fooar.mk
>
> I think this is inconsistent and contradicts the documentation, which states
> "MAKEFILE_LIST Contains the name of each makefile that is p
Jonny Grant (24 May 2017 09:40)
> $ make -f missingfile.mak
> make: missingfile.mak: No such file or directory
> make: *** No rule to make target 'missingfile.mak'. Stop.
>
> Shouldn't Make exit after the missing file?
>
> Make appears to be carrying on, and then treating the makefile as a target.
Jonny Grant (24 May 2017 12:27)
> In that successful case, would the "No such file or directory" message
> not be visible?
My guess is that it would be displayed, by the initial run of make; but
then make would re-invoke itself and do its job. As long as the rule to
make the missing make-file does succeed, the re-invoked make then
proceeds normally.
> ... the section "Communicating Variables to a Sub-`make'" ...
but I wasn't communicating with a sub-make.
No recursive make was involved.
> ... says make exports a variable to a sub-command ...
the relevant section *appears* to be saying that make exports variables
to sub-`make' invocations, though in fact it applies to every
sub-command.
> From make's perspective there's little difference between a sub-make
> and any other command. Both are invoked in precisely the same way,
> and both get precisely the same environment.
Right. This is what needs to be explained in Commands -> Execution, along
with *what* will be in that environment;