Channel: Active questions tagged python - Stack Overflow

boto3 only uploads 1KB of a large video from stream

I run the following code in a microservice deployed in a Kubernetes cluster. It spawns an ffmpeg subprocess and pipes its stdout into boto3's upload_fileobj. When I run this with a 20MB video, it completes without an error. When I do the same with a 10GB video, the job finishes without any exceptions, but the uploaded file is 640B and contains only metadata. I am not sure why this happens or how to troubleshoot this behaviour.

import boto3
import ffmpeg

session = boto3.session.Session(
    aws_access_key_id=aws_access_key_id,
    aws_secret_access_key=aws_secret_access_key,
    region_name="eu-west-2",
)
client = session.client('s3')

input_url = client.generate_presigned_url(
    'get_object',
    Params={'Bucket': aws_s3_bucket, 'Key': inputFile},
)

process = (
    ffmpeg
    .input(input_url)
    .filter("v360", inputProjection, outputProjection,
            in_stereo=inputStereo, out_stereo=outputStereo)
    .output('pipe:', format=outputFormat)
    .run_async(pipe_stdout=True)
)
with process.stdout as dataStream:
    client.upload_fileobj(dataStream, aws_s3_bucket, outputFile)
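One thing worth checking (a diagnostic sketch, not part of the original code): upload_fileobj simply reads the stream until EOF and returns normally, so if the ffmpeg process dies after writing only the container header, the upload "succeeds" with a tiny object and no exception. Checking the subprocess's exit code and stderr after the upload returns would reveal that. The producer command below is a hypothetical stand-in for the ffmpeg pipeline:

```python
import subprocess

# Hypothetical stand-in for the ffmpeg pipeline: a producer that writes a
# short "header" and then exits with an error, mimicking a transcode that
# fails shortly after it starts.
proc = subprocess.Popen(
    ["python3", "-c", "import sys; sys.stdout.write('HDR'); sys.exit(1)"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)

# upload_fileobj would read this stream to EOF and return normally,
# even though the producer failed partway through.
data = proc.stdout.read()
err = proc.stderr.read()
returncode = proc.wait()

# Only the header bytes arrived, and the failure is visible solely in
# the exit code -- exactly the silent-truncation pattern described above.
print(len(data), returncode)
```

In the real pipeline this would mean calling process.wait() after upload_fileobj and raising if the return code is non-zero.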

I ran the above code in an HTTP microservice with a 20MB and a 10GB video. The 20MB job finished with no exceptions and uploaded the entire video to S3. The 10GB job uploaded only the header metadata (about 1KB) to S3 and then finished with no exceptions.

I expect either the entire file to get uploaded, or the upload to fail. The fact that only a small part of the file gets uploaded to S3 is surprising to me.
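To pin down where the bytes stop, one option (a sketch I am assuming, not something from the original post) is to wrap the stream in a small counting reader before handing it to upload_fileobj, so you can log exactly how many bytes the upload consumed:

```python
import io


class CountingReader:
    """Wrap a file-like object and count the bytes actually read.

    Hypothetical diagnostic wrapper: pass CountingReader(dataStream) to
    upload_fileobj instead of the raw pipe, then log .bytes_read after
    the upload returns.
    """

    def __init__(self, raw):
        self._raw = raw
        self.bytes_read = 0

    def read(self, size=-1):
        chunk = self._raw.read(size)
        self.bytes_read += len(chunk)
        return chunk


# Stand-in for the ffmpeg stdout pipe: 1 KiB of dummy data.
stream = CountingReader(io.BytesIO(b"x" * 1024))
while stream.read(256):
    pass
print(stream.bytes_read)  # 1024
```

If bytes_read matches the tiny object size, the stream itself ended early and the problem is on the ffmpeg side rather than in the upload.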

When I run the code on my local machine with 16GB of RAM, the job also finishes without any problem.
