Setting number of cores, number of processes using environment variables

Hi,

Is there any way to set the number of cores or processes using environment variables?

For example I have the following line in my pbs file (NODES is an environment variable):

#PBS -lselect={NODES}:ncpus=1:mpiprocs=1:mem=1gb

I then submit to the queue using:

qsub -v NODES=1 template.pbs

However, I get the following error:

qsub: Illegal attribute or resource value Resource_List.select

Could someone please guide me on where I am going wrong?

Thanks!

Yes, you can create a wrapper script like the one below to accommodate your requirements:

$ cat application.sh
    #!/bin/bash
    echo $@
    echo "#PBS -l select=$1:ncpus=$2:mem=$3gb" > application.pbs
    echo "#PBS -N $4" >> application.pbs
    echo "#PBS -j oe" >> application.pbs
    echo "#PBS -V" >> application.pbs
    echo "#PBS -l software=$5" >> application.pbs
    echo 'export VERSION="2020"' >> application.pbs
    echo "export APPLICATION_HOME=/share/apps/someapplication" >> application.pbs
    echo "inputFile=$6" >> application.pbs
    echo "cd \$PBS_O_WORKDIR" >> application.pbs
    echo "\$APPLICATION_HOME/scripts/appexe \$inputFile -nt 4 -checkel NO -minlen 5000 -licwait 6 -v \$VERSION" >> application.pbs

    source /etc/pbs.conf
    $PBS_EXEC/bin/qsub application.pbs

$ ./application.sh 1 1 1 TestJob ApplicationName /path/to/my/input/file # to submit the job
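If it helps, the same wrapper can also be written with a heredoc so the generated directives are easier to read. This is only a sketch under the same assumptions (same argument order, same paths); it just generates `application.pbs`, and submission works exactly as above:

```shell
#!/bin/bash
# Sketch of the same wrapper using a heredoc (argument order assumed
# as above): make_pbs chunks ncpus mem_gb jobname software inputfile
make_pbs() {
cat > application.pbs <<EOF
#PBS -l select=$1:ncpus=$2:mem=${3}gb
#PBS -N $4
#PBS -j oe
#PBS -V
#PBS -l software=$5
export VERSION="2020"
export APPLICATION_HOME=/share/apps/someapplication
inputFile=$6
cd \$PBS_O_WORKDIR
\$APPLICATION_HOME/scripts/appexe \$inputFile -nt 4 -checkel NO -minlen 5000 -licwait 6 -v \$VERSION
EOF
}

make_pbs 1 2 4 TestJob ApplicationName /path/to/my/input/file
# source /etc/pbs.conf && $PBS_EXEC/bin/qsub application.pbs   # submit as before
```

The unescaped variables (`$1`, `$2`, ...) expand when the wrapper runs, while the escaped ones (`\$PBS_O_WORKDIR`, `\$inputFile`, ...) stay literal and expand inside the job.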

Thanks adarsh! That is very helpful

I actually made a Python wrapper, but it does not seem to be working. Do you have any idea why that may be? (It is just that I find Python easier to work with, as I use it daily.)

#!/usr/bin/env python
import csv
import subprocess

parameter_file_full_path = "/rds/general/user/fk4517/home/myfiles/job_params.csv"

with open(parameter_file_full_path, "r", newline="") as csvfile:
    reader = csv.reader(csvfile)
    for job in reader:

        print(job)

        qsub_command = "qsub -v"
        qsub_command += " NODES={},".format(int(job[0]))
        qsub_command += "CORES={},".format(job[1])
        qsub_command += "PROCESSES={},".format(job[2])
        qsub_command += "MEM={},".format(job[3])
        qsub_command += "NAME={},".format(job[4])
        qsub_command += "ROWS={},".format(job[5])
        qsub_command += "COLS={},".format(job[6])
        qsub_command += "MAX_T={},".format(job[7])
        qsub_command += "PROB={}".format(job[8])
        qsub_command += " template.sh"

        print(qsub_command)  # View the qsub command when testing

        # Comment out the following 3 lines when testing to prevent jobs from being submitted
        exit_status = subprocess.call(qsub_command, shell=True)
        if exit_status != 0:  # Check to make sure the job submitted
            print("Job {} failed to submit".format(qsub_command))
print("Done submitting jobs!")
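One possible refinement of the script above: build the `qsub` invocation as an argument list rather than a single shell string, so CSV values containing spaces or shell metacharacters stay safe, and `shell=True` is not needed. A minimal sketch, with the column order and variable names assumed from the script above:

```python
# Sketch: build the qsub command as an argument list instead of one shell
# string. Column order and variable names are assumed from the thread.
def build_qsub_args(job, script="template.sh"):
    names = ["NODES", "CORES", "PROCESSES", "MEM", "NAME",
             "ROWS", "COLS", "MAX_T", "PROB"]
    varlist = ",".join("{}={}".format(n, v) for n, v in zip(names, job))
    return ["qsub", "-v", varlist, script]

# Hypothetical example row, in the same column order as job_params.csv:
row = ["1", "4", "4", "8", "TestJob", "100", "100", "50", "0.5"]
print(build_qsub_args(row))
# Submit with: subprocess.call(build_qsub_args(row))  (no shell=True needed)
```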

Please share the debug output of your Python script and the error message or issue you see.


You can also try this method:

[pbsdata@pbspro ~]$ CHUNKS=2
[pbsdata@pbspro ~]$ CORES=2
[pbsdata@pbspro ~]$ MEM=10mb
[pbsdata@pbspro ~]$ qsub -l select=$CHUNKS:ncpus=$CORES:mem=$MEM -- /bin/sleep 10
923.pbspro
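To tie this back to the original error: variables passed with `qsub -v` arrive as ordinary environment variables when the job runs, but they are never substituted inside `#PBS` directive lines, which is why `#PBS -lselect={NODES}` fails at submission. A minimal sketch of a `template.sh` body that uses such variables (the names `NODES`/`CORES` are assumed from this thread):

```shell
#!/bin/bash
#PBS -l select=1:ncpus=1:mem=1gb
# Minimal sketch: NODES and CORES arrive via `qsub -v NODES=2,CORES=4 template.sh`
# and are plain environment variables here, but would NOT be expanded in the
# #PBS directive line above. Defaults are used if the variables are unset.
echo "Running with NODES=${NODES:-1} and CORES=${CORES:-1}"
```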

Thanks adarsh! In the end I made a Python script that writes out a pbs file and submits each job individually, based on the original bash script you wrote. Thank you for your guidance.
