Ncpus=1+1:ncpus=24

I’m a little confused as to why:

#PBS -l select=1:ncpus=1+1:ncpus=1

doesn’t pull an error up, while:

#PBS -l select=1:ncpus=1:ncpus=1

produces the error:

“qsub: duplicated resource within a section of a select specification, resource is: ncpus”

Can anyone explain it for me?

When using split (multi-chunk) statements, each chunk must begin with a chunk count. In your second example you are duplicating the ncpus resource within a single chunk, which is not valid syntax.

#PBS -l select=1:ncpus=1+1:ncpus=3
This request specifies that the job needs 1 chunk with 1 CPU and another chunk with 3 CPUs.

  • This might be used in case one node has 1 CPU free and another node has 3 CPUs free; a minimal job-script sketch is shown below.
    If you could let us know what you are trying to achieve, that would be helpful.
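For illustration, here is a minimal job script built around that split-chunk request. The script body, job name, and walltime are placeholders added for the sketch, not part of the original example:

#!/bin/bash
#PBS -N split_chunk_demo
#PBS -l select=1:ncpus=1+1:ncpus=3
#PBS -l walltime=00:10:00

# $PBS_NODEFILE lists the hosts assigned to the job;
# with no mpiprocs specified it typically has one entry per chunk
cat $PBS_NODEFILE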

Thanks @adarsh - I’ve just never seen the syntax before, and never encountered split chunks.

I’m not entirely sure what the ncpus=1+1 part of the syntax means.

Is the + where the chunk split is - so after that is a new chunk?

Also, please refer to section 4.3.1, Quick Summary of Requesting Resources,
in this guide: https://www.altair.com/pdfs/pbsworks/PBSUserGuide2021.1.pdf

Thank you


@olderandcolder I would definitely read the links @adarsh provided. But the quick answer is that you don’t have an “ncpus=1+1” in your job;
rather, a select statement is a collection of chunks, each separated by a plus sign. So you are selecting this
1:ncpus=1
and also another
1:ncpus=1

In many clusters this would be the same thing as writing:

select=2:ncpus=1

That is, give me two nodes, and on each of those nodes I need 1 core. But I’m sure plenty of people at Altair can cite reasons why the two different syntaxes could generate different results depending on your particular cluster’s configuration.
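As a hedged illustration of why the results can differ: PBS also takes a place directive that controls how chunks are mapped to hosts. With the default placement the scheduler is free to pack or spread the chunks, while something like the following forces them apart (exact behaviour depends on your site’s scheduler configuration):

#PBS -l select=2:ncpus=1
#PBS -l place=scatter
# place=scatter asks for each chunk on a different host;
# place=pack would instead allow both chunks on a single host if it fits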

Using the multi-chunk syntax can get interesting if your setup makes it possible to select non-homogeneous configurations, for example “give me 1 large-memory node plus two nodes that have GPUs.” Of course, writing a job script that can properly dispatch processes on a non-homogeneous set of nodes could also be an interesting exercise.
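As a purely illustrative sketch of such a non-homogeneous request (resource names like mem and ngpus, and the sizes, are site-dependent placeholders, not taken from the posts above):

#PBS -l select=1:ncpus=16:mem=512gb+2:ncpus=8:ngpus=1
# one large-memory chunk "plus" two GPU chunks;
# the job script would then need to inspect $PBS_NODEFILE
# to decide which processes to launch on which host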

Right! I now get it - thanks @adarsh & @arwild01