I have a scripting question regarding submitting jobs to the cluster.
There is a per-user limit of 1000 jobs.
Let's say I have 1200 tar.gz files.
I tried to submit all the jobs together, but after 1000 jobs it gave me an
error message saying the per-user limit is 1000, and after that it did not
accept any more jobs.
We've used resource quota sets to accomplish that on a per-queue or
per-project basis. I'm not sure you can limit the number of jobs in RQSs,
but you certainly can limit slots; the sge_resource_quota(5) man page has
some examples.
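For illustration, an RQS entry capping slots per user might look like this (a sketch following the man page's entry format; the rule name and the 900-slot cap are placeholders):

```
{
   name         max_slots_per_user
   description  "Cap each user at 900 slots"
   enabled      TRUE
   limit        users {*} to slots=900
}
```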
On Thu, Jun 13, 2019 at 12:32:51PM -0400, VG wrote:
> I have a scripting que
for i in *tar.gz; do
  # Wait until our running/pending job count drops below 900
  # before submitting the next one.
  while true; do
    if [ "$(qstat -u "$USER" | wc -l)" -lt 900 ]; then break; fi
    sleep 60
  done
  qsub -l h_vmem=4G -cwd -j y -b y -N tar -R y -q all.q,gpu.q "tar -xzf $i"
done
On Thu, Jun 13, 2019 at 12:39 PM Skylar Thompson wrote:
> We've used resource quota sets
By the way, un-tarring a file is an I/O bound process and it will usually
give you no benefit to run on more than about 4 machines. Fastest and best
for the network would be to log into the file server if you have access,
and do it sequentially from there.
On Thu, Jun 13, 2019 at 12:44 PM Daniel Povey wrote:
Hi Daniel,
Will give it a try. If I am not mistaken, there should be another *done* in
the code snippet.
Regards
Varun
On Thu, Jun 13, 2019 at 12:45 PM Daniel Povey wrote:
>
> for i in *tar.gz;
> do
> while true; do
> if [ $(qstat -u $USER | wc -l) -lt 900 ]; then break; fi;
> sle
Hi Daniel,
If I have 100 tar.gz files and 100 slots, submitting them in parallel would
be fast since all of them run simultaneously. I know un-tarring a file is
surely an I/O-bound process.
Regards
Varun
On Thu, Jun 13, 2019 at 12:51 PM Daniel Povey wrote:
> By the way, un-tarring a
Hello,
Why not use a task array instead of separate jobs?
You can do something like:
qsub -t 1-1000 script.sh
Inside the script, the $SGE_TASK_ID environment variable holds the task
number, and you can use it to select the input tgz file.
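A minimal sketch of such a script.sh, assuming the tar.gz paths have been collected one per line into a file beforehand (filelist.txt is an assumed name):

```shell
#!/bin/sh
# script.sh -- submit as an array job, e.g.: qsub -t 1-1200 script.sh
# Assumes filelist.txt lists one tar.gz path per line.

# Pick the line of filelist.txt matching this task's index.
tgz=$(sed -n "${SGE_TASK_ID:-1}p" filelist.txt 2>/dev/null || true)

# Extract next to the archive; skip quietly if the list is missing.
if [ -n "$tgz" ]; then
  tar -xzf "$tgz" -C "$(dirname "$tgz")"
fi
```

Each task then handles exactly one archive, and the scheduler limits the concurrency for you.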
Best Regards,
Mikhail Serkov
On Thu, Jun 13, 2019 at
Hi,
Where should I put my qsub command?
On Thu, Jun 13, 2019 at 12:54 PM VG wrote:
> Hi Daniel,
> Will give it a try. If I am not mistaken, there should be another *done* in
> the code snippet.
>
> Regards
> Varun
>
> On Thu, Jun 13, 2019 at 12:45 PM Daniel Povey wrote:
>
>>
>> for i in *tar.gz
On Thu, 13 Jun 2019 at 9:32am, VG wrote:
I have a scripting question regarding submitting jobs to the cluster.
There is a limitation per user of 1000 jobs only.
Let's say I have 1200 tar.gz files
I tried to submit all the jobs together but after 1000 jobs it gave me an
error message saying per u
Hi Joshua,
I like the array job option because essentially it will still be one job
and it will run the tasks in parallel.
I have one issue though. I can create an array script, but here I presented
a simplified problem. In reality my individual tar.gz files are under
respective directories.
For example
dir1 ha
You can write the script to first scan all the files to collect their full
path names, and then run the array job over that list.
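For instance, collecting the paths first might look like this (a sketch; filelist.txt and script.sh are assumed names):

```shell
# Collect the full path of every tar.gz sitting one level down,
# one per line, then size the array job from the line count.
find "$PWD" -mindepth 2 -maxdepth 2 -name '*.tar.gz' > filelist.txt
n=$(wc -l < filelist.txt)
echo "submit with: qsub -t 1-$n script.sh"
```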
> On Jun 13, 2019, at 1:20 PM, VG wrote:
>
> Hi Joshua,
> I like the array job option because essentially it will still be 1 job and it
> will run them in parallel.
>
> I hav
Hi Feng,
I did something like this
for i in *; do
  if [ -d "$i" ]; then
    cd "$i"
    # Glob instead of parsing ls, and handle more than one archive per dir.
    for a in *.tar.gz; do
      echo "$PWD/$a"
    done
    cd ..
  fi
done
This gave me the full paths of my tar.gz files. Should I save this to a
separate text file and then run an array script over it?
Thanks
Regards
VARUN
On Thu, Jun 13, 2019 at
Sorry, but I think this conversation shouldn't continue.
This list is for system administrators, not for users with basic questions
about bash. People will unsubscribe if it goes on much longer.
On Thu, Jun 13, 2019 at 2:49 PM VG wrote:
> Hi Feng,
> I did something like this
>
> for i in *
> do