Brian Andrus via slurm-users writes:
> IIRC, slurm parses the batch file for options until it hits the first
> non-comment line, which includes blank lines.
Blank lines do not stop sbatch from parsing the file. (But commands
do.)
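For what it's worth, a minimal sketch of the actual behaviour (the
option values are made up):

  #!/bin/bash
  #SBATCH --job-name=demo

  # the blank line above does NOT stop option parsing
  #SBATCH --ntasks=1
  hostname
  # parsing stopped at 'hostname', so this directive is silently ignored
  #SBATCH --mem=4G

sbatch stops processing #SBATCH directives at the first line that is
neither a comment nor blank, so only directives placed after a command
are lost.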
--
B/H
IIRC, slurm parses the batch file for options until it hits the first
non-comment line, which includes blank lines.
You may want to double-check some of the gaps in the option section of
your batch script.
That being said, you say you removed the '&' at the end of the command,
which would [...]
Since each instance of the program is independent and you are using one
core for each, it'd be better to let Slurm deal with that and schedule
them concurrently as it sees fit. Maybe you simply need to add some
directive to allow shared jobs on the same node.
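A job array is one way to do exactly that; here is a minimal sketch,
assuming the program takes its input file as an argument ('mycode' and
the 'params_*.in' names are placeholders):

  #!/bin/bash
  #SBATCH --job-name=sweep
  #SBATCH --ntasks=1
  #SBATCH --cpus-per-task=1
  # 50 independent array tasks, one core each
  #SBATCH --array=1-50

  # each task selects its own input via its array index
  ./mycode params_${SLURM_ARRAY_TASK_ID}.in

Each array task is scheduled as a separate job, so one of them finishing
(or failing) cannot take the other 49 down with it.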
Alternatively (if at your site jobs m[...]
Dear Loris,
I just checked removing the '&'; it didn't work.
On Mon, Aug 19, 2024 at 1:43 PM Loris Bennett wrote:
> Dear Arko,
>
> Arko Roy writes:
>
> > Thanks Loris and Gareth. Here is the job submission script. If you find
> > any errors please let me know.
> > Since I am not the admin but just [...]
Dear Arko,
Arko Roy writes:
> Thanks Loris and Gareth. Here is the job submission script. If you find any
> errors please let me know.
> Since I am not the admin but just a user, I think I don't have access to the
> prolog and epilog files.
>
> If the jobs are independent, why do you want to run them all on the same
> node?
Thanks Loris and Gareth. Here is the job submission script. If you find any
errors please let me know.
Since I am not the admin but just a user, I think I don't have access to
the prolog and epilog files.

> If the jobs are independent, why do you want to run them all on the same
> node?

I am running [...]
Dear Arko,
Arko Roy via slurm-users writes:
> I want to run 50 sequential jobs (essentially 50 copies of the same code
> with different input parameters) on a particular node. However, as soon as
> one of the jobs gets executed, the other 49 jobs get killed immediately
> with exit code 9.
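An exit code of 9 most likely means the processes were killed with
signal 9 (SIGKILL). One common cause of this symptom is starting the
runs in the background with '&' and letting the batch script reach its
end, at which point Slurm cleans up the job and kills everything still
running. A minimal sketch of that failure mode and its fix ('mycode'
and the input names are placeholders):

  #!/bin/bash
  #SBATCH --ntasks=50

  for i in $(seq 1 50); do
      ./mycode params_${i}.in &   # backgrounded: the script does not wait
  done
  # without this 'wait' the script exits here and the children are killed
  wait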