I'm trying to figure out how to upload multiple large files (e.g. 4K images) to Google Cloud Storage via Django using the default admin interface.
For example, I have a model with multiple images:
```python
from django.db import models

class MyModel(models.Model):
    image_1 = models.ImageField(null=True, upload_to="myapp/images")
    image_2 = models.ImageField(null=True, upload_to="myapp/images")
```
However, if I enter data for this model in the admin interface, I get an error whenever I upload two large files whose combined size exceeds GAE's 32 MB POST limit.
I've tried using django-gcp and its BlobField, but this also causes issues: the temporary uploads overwrite each other before transferring to Google Cloud Storage, and it doesn't look like this is solvable with django-gcp out of the box.
So right now I'm wondering if it's possible to break the upload into multiple POST requests -- that way each file could be sent as a separate ImageField (if under 32 MB) or BlobField, and every individual request would stay under the limit.
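To make it concrete, here's a rough, framework-agnostic sketch of what I mean (all names here are made up for illustration): the client would split the payload so each piece fits under the per-request limit, send one piece per POST, and the server would reassemble them before writing to Cloud Storage.

```python
# Hypothetical sketch of the multi-POST idea: split an upload into
# pieces that each fit under GAE's 32 MB per-request limit, so each
# piece can travel in its own POST and be joined back server-side.

CHUNK_LIMIT = 32 * 1024 * 1024  # GAE's 32 MB POST limit

def split_into_chunks(data: bytes, limit: int = CHUNK_LIMIT):
    """Yield successive slices of `data`, each at most `limit` bytes."""
    for offset in range(0, len(data), limit):
        yield data[offset:offset + limit]

def reassemble(chunks) -> bytes:
    """Server side: join the received pieces back into the original file."""
    return b"".join(chunks)
```

The open question is how to wire something like this into the Django admin, which expects the whole model form in a single POST.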
Is there a way I can upload a model in multiple POSTs?