the memory estimate was anything
like correct.
In the new year I may try accounting again, but with "MemLimitEnforce=no" set as
well :)
Merlin
--
Merlin Hartley
IT Systems Engineer
MRC Mitochondrial Biology Unit
Cambridge, CB2 0XY
United Kingdom
> On 15 Dec 2016, at 10:32, Uwe Sauter wrote:
Hi Cyrus
I think you should specify the memory requirement in your sbatch script - the
default is to allocate all of a node's memory, thus 'filling' the node even
with a 1-CPU job:
#SBATCH --mem=1G
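A minimal batch script using that directive might look like the following (the job name and the payload command are placeholders, not from the original message):

```shell
#!/bin/bash
#SBATCH --job-name=memtest   # hypothetical job name
#SBATCH --ntasks=1
#SBATCH --mem=1G             # request 1 GiB instead of the node's full memory
# Replace with the real workload; hostname just proves the job ran.
hostname
```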
Hope this helps!
Merlin
Hi Suprita
You just need multiple NodeName lines, something like this:
NodeName=testmaster CPUs=2
NodeName=testclient CPUs=1
PartitionName=debug Nodes=testmaster,testclient Default=YES MaxTime=INFINITE
State=UP
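A slightly fuller sketch of the same idea - the RealMemory values and State flags here are assumptions, not taken from the original message:

```
NodeName=testmaster CPUs=2 RealMemory=4000 State=UNKNOWN
NodeName=testclient CPUs=1 RealMemory=2000 State=UNKNOWN
PartitionName=debug Nodes=testmaster,testclient Default=YES MaxTime=INFINITE State=UP
```

Declaring RealMemory also lets jobs request memory explicitly with --mem.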
Hope this helps!
Merlin
#AccountingStoragePass=
#AccountingStorageUser=
#AccountingStorageType=accounting_storage/filetxt
#AccountingStorageLoc=/var/spool/slurmd/slurmdb.txt
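Uncommented, a minimal file-based accounting setup would look something like this (a sketch for Slurm versions of that era - the filetxt plugin has since been removed in favour of slurmdbd):

```
AccountingStorageType=accounting_storage/filetxt
AccountingStorageLoc=/var/spool/slurmd/slurmdb.txt
JobAcctGatherType=jobacct_gather/linux
```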
Hope this helps!
Merlin
That is the default partition, i.e. if the user doesn't specify a partition,
that is where their jobs will run. It makes more sense when you have multiple
partitions.
Hope this is useful!
M
more useful to our users...
Thanks
Merlin
> On 14 Sep 2017, at 10:00, Benjamin Redling wrote:
>
> On 14.09.2017 10:52, Taras Shapovalov wrote:
>> Hey guys!
S/gpu=160.0"
Many thanks for your time
Merlin
user who exclusively uses GPU machines (4 GPUs and 16 CPUs per machine).
Any idea what I’ve missed?
Thanks
Merlin
> On 6 Oct 2017, at 20:30, Tim Carlson wrote:
>
> Perfect! Thanks!
You could also use a simple epilog script to save the output of 'scontrol show
job' to a file or database.
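One possible sketch of such an epilog - the script path and log directory are assumptions; slurmd sets SLURM_JOB_ID in the epilog environment:

```shell
#!/bin/bash
# Hypothetical epilog, enabled in slurm.conf with:
#   Epilog=/etc/slurm/epilog.sh
LOGDIR=/var/log/slurm/jobs            # assumed location
mkdir -p "$LOGDIR"
# Record the job's final state before Slurm purges it
scontrol show job "$SLURM_JOB_ID" > "$LOGDIR/job-${SLURM_JOB_ID}.txt"
```

This only does anything useful on a node where slurmd runs it, of course.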
M
> On 15 Oct 2017, at 20:49, Ryan Richholt wrote:
>
> Is there any way to ge
A workaround is to pre-configure the future nodes in slurm.conf and mark them
as down - then when the hardware actually arrives you can simply mark them up
again.
(see the DownNodes parameter)
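In slurm.conf this could look something like the following (node names and counts are invented for illustration):

```
# Nodes 05-08 are defined now but not yet racked
NodeName=node[05-08] CPUs=16 State=UNKNOWN
DownNodes=node[05-08] State=DOWN Reason="not yet installed"
```

Once a node is physically present, 'scontrol update NodeName=node05 State=RESUME' brings it into service without a restart.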
Hope this helps!
Merlin
> On 22 Oct 2
Sounds like you would need 2 different NodeName lines - one in each partition.
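For example (all names, counts and the gres setting are invented for illustration - a GPU partition like this would also need GresTypes=gpu and a gres.conf):

```
NodeName=cpu[01-04] CPUs=16
NodeName=gpu[01-02] CPUs=16 Gres=gpu:4
PartitionName=cpu Nodes=cpu[01-04] Default=YES State=UP
PartitionName=gpu Nodes=gpu[01-02] State=UP
```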
> On 3 Nov 2017, at 15:08, Ing. Gonzalo E. Arroyo wrote:
>
> Hi Gents!
>
> I would need som
Merlin
> On 3 Nov 2017, at 15:43, Ing. Gonzalo E. Arroyo wrote:
>
> Hi Merlin! Thanks for helping. Are you sure I can put 2 lines in the