# Run batch process
Set up the required packages for submitting a job:

```shell
pip install -r requirements.txt
```
Compress the data into one gzip file per month and sync the results to separate buckets, one per call type (a rough sketch of the monthly compression step follows below):

```shell
python3 compress_by_month.py --bled-path-in /PAM_Analysis/BatchDetections/Blue_D/final --bled-path-out /PAM_Analysis/BatchDetections/Blue_D/final/gzip
python3 compress_by_month.py --bled-path-in /PAM_Analysis/BatchDetections/Blue_A/final --bled-path-out /PAM_Analysis/BatchDetections/Blue_A/final/gzip
```
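The internals of `compress_by_month.py` are not shown here, but its arguments suggest it groups the per-day detection files by month and writes one gzip archive per month to the output directory. The sketch below only illustrates that idea; the `*.csv` filename pattern, the `YYYYMM` grouping key, and the helper logic are assumptions, not the actual implementation.

```python
# Hypothetical sketch of what compress_by_month.py might do: group detection
# files by a YYYYMM string parsed from their names and write one gzip archive
# per month. The file pattern and grouping rule are assumptions for illustration.
import argparse
import gzip
import re
from collections import defaultdict
from pathlib import Path


def main():
    parser = argparse.ArgumentParser(description="Compress detection files by month")
    parser.add_argument("--bled-path-in", type=Path, required=True)
    parser.add_argument("--bled-path-out", type=Path, required=True)
    args = parser.parse_args()

    args.bled_path_out.mkdir(parents=True, exist_ok=True)

    # Group input files by the first YYYYMM found in each filename (assumed pattern).
    by_month = defaultdict(list)
    for path in sorted(args.bled_path_in.glob("*.csv")):
        match = re.search(r"(20\d{2})(0[1-9]|1[0-2])", path.name)
        if match:
            by_month[match.group(0)].append(path)

    # Concatenate each month's files into a single gzip archive.
    for month, files in by_month.items():
        out_path = args.bled_path_out / f"{month}.csv.gz"
        with gzip.open(out_path, "wb") as gz:
            for path in files:
                gz.write(path.read_bytes())
        print(f"{out_path}: {len(files)} files")


if __name__ == "__main__":
    main()
```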
Note that an AWS credential profile named `mbari` was set up for the MBARI AWS account (e.g. with `aws configure --profile mbari`) to keep it separate from the OpenData account.
```shell
export AWS_PROFILE=mbari
aws s3 sync /PAM_Analysis/BatchDetections/Blue_A/final/gzip s3://901502-blue-a-bled-batch-in
aws s3 sync /PAM_Analysis/BatchDetections/Blue_D/final/gzip s3://901502-blue-d-bled-batch-in
```
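As an optional sanity check, the uploads can be verified by listing the monthly archives in each bucket. A minimal sketch with boto3, assuming the `mbari` profile from above and that boto3 is available in the environment:

```python
# Quick check that the monthly gzip archives landed in the two input buckets.
# Uses the same "mbari" credential profile as the sync commands above.
import boto3

BUCKETS = [
    "901502-blue-a-bled-batch-in",
    "901502-blue-d-bled-batch-in",
]

session = boto3.Session(profile_name="mbari")
s3 = session.client("s3")

for bucket in BUCKETS:
    response = s3.list_objects_v2(Bucket=bucket)
    keys = [obj["Key"] for obj in response.get("Contents", [])]
    print(f"{bucket}: {len(keys)} objects")
    for key in sorted(keys):
        print(f"  {key}")
```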
## Run batch
Once the data has been uploaded by month, submit jobs with:

```shell
python3 submit_job.py --call-type blue-d --start 201507 --end 202112
```
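`submit_job.py` takes a call type and an inclusive `YYYYMM` range. Its internals are not shown here; the sketch below illustrates one way such a script could expand the range into one AWS Batch job per month with boto3. The job queue name, job definition name, and environment-variable interface are placeholders for illustration only, not the values used by the real script.

```python
# Hypothetical sketch of a per-month AWS Batch submission loop. The job queue,
# job definition, and environment-variable names are placeholders, not the
# values used by the real submit_job.py.
import argparse
import boto3


def month_range(start: str, end: str):
    """Yield YYYYMM strings from start to end, inclusive."""
    year, month = int(start[:4]), int(start[4:])
    while f"{year}{month:02d}" <= end:
        yield f"{year}{month:02d}"
        month += 1
        if month > 12:
            year, month = year + 1, 1


def main():
    parser = argparse.ArgumentParser(description="Submit one batch job per month")
    parser.add_argument("--call-type", required=True, choices=["blue-a", "blue-d"])
    parser.add_argument("--start", required=True)
    parser.add_argument("--end", required=True)
    args = parser.parse_args()

    batch = boto3.Session(profile_name="mbari").client("batch")

    for month in month_range(args.start, args.end):
        response = batch.submit_job(
            jobName=f"{args.call_type}-{month}",
            jobQueue="bled-batch-queue",        # placeholder queue name
            jobDefinition="bled-batch-detect",  # placeholder job definition
            containerOverrides={
                "environment": [
                    {"name": "CALL_TYPE", "value": args.call_type},
                    {"name": "MONTH", "value": month},
                ]
            },
        )
        print(f"Submitted {response['jobName']} ({response['jobId']})")


if __name__ == "__main__":
    main()
```

To process the Blue A detections as well, the same command can presumably be run with `--call-type blue-a` and the appropriate date range.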