Custom job directives and PBS hooks

I am interested in implementing custom directives in batch job scripts, somewhat similar to the built-in #PBS directives but using a different prefix. The purpose of these directives would be to perform some preparatory operations on the dataset before the staging and job runtime phases.

I was hoping to implement this through the PBS Pro hook API, but having read through the hooks documentation, I see that the data exposed there via the Python structures deals primarily with job, event, and similar metadata, and does not appear to include the source of the job script itself.

I’m looking for some clarification on whether what I’m trying to do is possible - namely, reading/parsing the job script contents within a hook context. I imagine the job script has to be communicated to the server node somehow, but I am not sure whether to look for it on disk on the server machine or perhaps in PostgreSQL. Some guidance would be greatly appreciated.

Thank you!

The job script is stored in PBS_HOME/mom_priv/jobs/[job ID].SC on the MoM node once the job has been assigned to it. You should be able to read it there, including any #PBS (or other) directives.
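As a concrete sketch of reading it from an execution-event hook on the MoM node (the helper names below are my own, not part of the hook API): if I recall the hook API correctly, PBS_HOME is available as pbs.pbs_conf["PBS_HOME"] and the job ID as pbs.event().job.id, but verify that against your PBS version.

```python
import os

def job_script_path(pbs_home, job_id):
    # Conventional (but unsupported) location of the staged job script
    # on the MoM node: PBS_HOME/mom_priv/jobs/<job ID>.SC
    return os.path.join(pbs_home, "mom_priv", "jobs", job_id + ".SC")

def read_job_script(pbs_home, job_id):
    # Raises FileNotFoundError if the job has not yet been staged
    # to this MoM node.
    with open(job_script_path(pbs_home, job_id)) as fh:
        return fh.read()

# Inside a hook you might call something like (assumption - check the
# Hooks Guide for your version):
#   script = read_job_script(pbs.pbs_conf["PBS_HOME"], pbs.event().job.id)
```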

If you add your own directives, I’d suggest that you NOT use the #PBS prefix, but rather something else like #FOO. Conflicts are unlikely either way, but PBS itself only interprets #PBS directives.
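To illustrate, scanning a script for such directives can be quite simple. This is a hypothetical parser (parse_custom_directives is a made-up name); the stop-at-first-command rule mirrors the usual convention that directives are only honored before the first executable line of the script.

```python
def parse_custom_directives(script_text, prefix="#FOO"):
    # Collect the arguments of custom '#FOO ...' directives, stopping at
    # the first non-comment, non-blank line (the first real command).
    directives = []
    for line in script_text.splitlines():
        stripped = line.strip()
        if not stripped:
            continue  # blank lines don't end the directive block
        if stripped.startswith(prefix):
            directives.append(stripped[len(prefix):].strip())
        elif not stripped.startswith("#"):
            break  # first executable line ends directive processing
    return directives
```

Whatever preparatory operations your directives describe could then be driven from the parsed list before the job body runs.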

The location of the job script on the MoM node is subject to change, but it hasn’t changed in 20+ years, so you’re probably safe assuming it won’t change anytime soon. That said, it’s not a “supported” interface, so use it at your own risk.