I’ve got multiple users running simulations against the same 50GB of data. Rather than each of them attaching the same 50GB file every time they submit a job via Altair web access, is it possible to set the application to accept a URL? Or should I use a bash script to fetch the data automatically on their behalf when they submit the job? Bash sounds like a possibility to me. I guess it comes down to whether qsub will work with URLs or not.
There are a few ways to achieve this, depending on how the data is accessible:

- If the data is reachable by URL, create a bash wrapper (a start.sh or start.py file) that runs wget to fetch the file before the main batch command line.
- If you have this 50GB file on a common share, have your start.sh/py file run cp /path/toshared/50gb . before the main execution command line.
- Alternatively, keep the link (to this file or a different one) and add the copy or wget actions directly in the main script that gets submitted.

As a quick demonstration that qsub will happily run a wget inside the job:

Hope this helps
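As a rough sketch of the wrapper idea, something like the start.sh below could stage the data before the main command. The URL, share path, and file names here are placeholders for your environment, not anything from an actual setup (the demo-setup lines at the top just stand in for a mounted share so the script is runnable as-is):

```shell
#!/bin/bash
set -euo pipefail

# Demo setup only: fake a "common share" so this sketch runs anywhere.
# In a real cluster this would be an NFS path mounted on the exec host.
SHARE_PATH="$(mktemp -d)/dataset.bin"
head -c 1024 /dev/zero > "$SHARE_PATH"   # stands in for the 50GB file

DATA_URL="http://fileserver.example.com/dataset.bin"  # placeholder URL
DATA_FILE="dataset.bin"
cd "$(mktemp -d)"                        # stand-in for the job's workdir

# Prefer a cp from the common share when it is mounted on the execution
# host; otherwise fall back to fetching the file over HTTP with wget.
if [ -f "$SHARE_PATH" ]; then
    cp "$SHARE_PATH" "$DATA_FILE"
else
    wget -q -O "$DATA_FILE" "$DATA_URL"
fi

# Main execution command line would follow here, e.g.:
# ./simulation --input "$DATA_FILE"
echo "staged $DATA_FILE ($(stat -c%s "$DATA_FILE") bytes)"
```

Submitting the wrapper itself (qsub start.sh) keeps the users' submissions unchanged while the staging happens on the execution host.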
[pbsdata@pbsserver ~]$ qsub -N getdata -l select=1:ncpus=1:mem=1gb -l walltime=00:10:00 -- /bin/bash -c "wget http://127.0.0.1:8000/array.sh && ls -lh array.sh"
3005.pbsserver
[pbsdata@pbsserver ~]$ cat getdata.o3005
-rw------- 1 pbsdata pbsdata 301 Jan 18 2025 array.sh
[pbsdata@pbsserver ~]$ cat getdata.e3005
--2025-12-05 18:23:16-- http://127.0.0.1:8000/array.sh
Connecting to 127.0.0.1:8000... connected.
HTTP request sent, awaiting response... 200 OK
Length: 301 [application/x-sh]
Saving to: 'array.sh'
0K 100% 103M=0s
2025-12-05 18:23:16 (103 MB/s) - 'array.sh' saved [301/301]