I am interested in implementing custom directives in batch job scripts, similar to the built-in #PBS directives but with a different prefix. The purpose of such directives would be to perform some preparatory operations on the dataset before the staging/job-runtime phases.
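For concreteness, here is a rough sketch of the kind of thing I have in mind; the `#PREP` prefix, the directive keywords, and the parser are all made up for illustration:

```python
import re

# A hypothetical job script with custom "#PREP" directives alongside the
# standard "#PBS" ones (the prefix and keywords are illustrative only).
SCRIPT = """\
#!/bin/bash
#PBS -l select=1:ncpus=4
#PBS -l walltime=01:00:00
#PREP checksum input.dat
#PREP decompress input.dat.gz

./run_analysis input.dat
"""

# Match lines of the form "#PREP <action> <args...>".
PREP_RE = re.compile(r"^#PREP\s+(\S+)\s*(.*)$")

def parse_prep_directives(script_text):
    """Return a list of (action, args) tuples found in the script."""
    directives = []
    for line in script_text.splitlines():
        m = PREP_RE.match(line)
        if m:
            directives.append((m.group(1), m.group(2)))
    return directives

print(parse_prep_directives(SCRIPT))
# [('checksum', 'input.dat'), ('decompress', 'input.dat.gz')]
```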
I was hoping to implement this through the PBS Pro hook API, but having read through the hooks documentation, I see that the data exposed via the Python structures deals primarily with job, event, and server metadata, and does not appear to include the source of the job script itself.
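For example, as far as I can tell from the documentation, a queuejob hook can inspect job attributes along these lines, but nothing it exposes seems to carry the script body (this is a sketch of my understanding, not a working solution):

```python
# Sketch of a queuejob hook: the pbs module exposes job/event metadata,
# but I see no attribute that carries the job script source itself.
import pbs

e = pbs.event()    # the triggering event (here: queuejob)
job = e.job        # job metadata: queue, Resource_List, etc.

pbs.logmsg(pbs.LOG_DEBUG,
           "queuejob from %s, walltime=%s"
           % (e.requestor, job.Resource_List["walltime"]))

# Something like job.script or job.script_path is what I am after,
# but nothing of the sort appears in the documented attributes.
e.accept()
```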
I’m looking for some clarification on whether what I’m trying to do is possible, namely reading/parsing the job script contents within a hook context. I would imagine the job script has to be communicated to the server host somehow, but I am not sure whether to look for it on disk on the server machine or perhaps in the PostgreSQL datastore. Some guidance would be greatly appreciated.
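If the script does end up on disk (classic PBS layouts kept per-job `*.SC` files under `$PBS_HOME/server_priv/jobs`; I have not verified whether that still holds with the PostgreSQL datastore), I imagine a hook could read it roughly like this. This is purely speculative, and presumably it could only work in an event that fires after the job has been committed (a queuejob hook may run before the script is spooled):

```python
# Speculative: read a job's script file from the server's spool,
# assuming it is stored as <jobid>.SC under server_priv/jobs
# (an assumption I have not confirmed for current PBS Pro versions).
import os
import pbs

e = pbs.event()
job = e.job

pbs_home = pbs.pbs_conf["PBS_HOME"]   # e.g. /var/spool/pbs
script_path = os.path.join(pbs_home, "server_priv", "jobs",
                           "%s.SC" % job.id)

if os.path.exists(script_path):
    with open(script_path) as f:
        script_text = f.read()
    pbs.logmsg(pbs.LOG_DEBUG,
               "read %d bytes of job script" % len(script_text))
else:
    pbs.logmsg(pbs.LOG_DEBUG, "no script file at %s" % script_path)

e.accept()
```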
Thank you!