Channel: Active questions tagged python - Stack Overflow
Automatic Execution of Sagemaker Autopilot Generated Notebook

I'm able to upload data from my local machine to my S3 bucket, create an Autopilot job from that data, have the Autopilot job write its output files back into my S3 bucket, and download those outputs to my local machine. I'm doing all of this in Python. The only thing left is that I'd like to run the notebooks the Autopilot job generates within SageMaker, then extract the executed notebooks from my bucket. I'd like to keep doing this programmatically.
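For context, the upload-and-launch part of the flow above can be sketched with boto3. The bucket name, prefix, job name, and role ARN below are placeholder assumptions, and the request dict is built in a separate function so its shape can be checked without AWS credentials:

```python
# Sketch of the pipeline described above (names and ARN are placeholders).

def build_automl_request(job_name, bucket, prefix, target):
    """Build the create_auto_ml_job request for training data under an S3 prefix."""
    return {
        "AutoMLJobName": job_name,
        "InputDataConfig": [{
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": f"s3://{bucket}/{prefix}",
            }},
            "TargetAttributeName": target,
        }],
        "OutputDataConfig": {"S3OutputPath": f"s3://{bucket}/autopilot-output/"},
        "RoleArn": "arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder
    }

def run_pipeline(local_csv, bucket, job_name, target):
    """Upload the data and start the Autopilot job (requires AWS credentials)."""
    import boto3
    boto3.client("s3").upload_file(local_csv, bucket, "data/train.csv")
    sagemaker = boto3.client("sagemaker")
    sagemaker.create_auto_ml_job(
        **build_automl_request(job_name, bucket, "data/", target)
    )
```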

Has anyone run into a similar task and been able to accomplish it?

I was able to take the Autopilot-generated notebook that I had downloaded from my S3 bucket to my local machine and create a notebook instance for it using this lifecycle config:

```python
lifecycle_config_script = """#!/bin/bash
set -e

BUCKET_NAME="{bucket_name}"
NOTEBOOK_PATHS=({notebook_paths})

pip install papermill

for NOTEBOOK in "${{NOTEBOOK_PATHS[@]}}"
do
    aws s3 cp "s3://${{BUCKET_NAME}}/${{NOTEBOOK}}" "/home/ec2-user/SageMaker/${{NOTEBOOK}}"
    papermill "/home/ec2-user/SageMaker/${{NOTEBOOK}}" "/home/ec2-user/SageMaker/executed-${{NOTEBOOK}}"
    aws s3 cp "/home/ec2-user/SageMaker/executed-${{NOTEBOOK}}" "s3://${{BUCKET_NAME}}/executed-${{NOTEBOOK}}"
done
""".format(
    bucket_name=bucketName,
    # Join with a space so the bash array gets one quoted element per notebook
    # (joining with '' would concatenate them into a single array element).
    notebook_paths=' '.join('"' + notebook + '"' for notebook in notebook_files),
)
```
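One detail worth noting: when a lifecycle config is attached through boto3, SageMaker expects the script content base64-encoded (and capped at 16384 characters). A minimal sketch, with the config name as a placeholder assumption:

```python
import base64

def encode_lifecycle_script(script):
    """Base64-encode a lifecycle script the way the SageMaker API expects it."""
    return base64.b64encode(script.encode("utf-8")).decode("utf-8")

# Attaching it (requires AWS credentials; the config name is a placeholder):
# import boto3
# boto3.client("sagemaker").create_notebook_instance_lifecycle_config(
#     NotebookInstanceLifecycleConfigName="run-autopilot-notebooks",
#     OnStart=[{"Content": encode_lifecycle_script(lifecycle_config_script)}],
# )
```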

After the instance took longer than 5 minutes to configure, I looked at my CloudWatch logs for the LifecycleConfig and found this import error:

ImportError: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with 'OpenSSL 1.0.2k-fips 26 Jan 2017'. See: https://github.com/urllib3/urllib3/issues/2168
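That error means the instance's Python links against OpenSSL 1.0.2, which urllib3 v2 refuses to run on. A common workaround is to pin urllib3 below v2 before anything else is installed. As a sketch, the pin could be patched into the script string built above (the function name here is hypothetical):

```python
def pin_urllib3(script):
    """Insert a urllib3<2 pin ahead of the papermill install, so pip does not
    pull in urllib3 v2, which fails to import against OpenSSL 1.0.2."""
    return script.replace(
        "pip install papermill",
        'pip install "urllib3<2"\npip install papermill',
        1,
    )
```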

