I am trying to upload images from Swift to my Django backend so that I can store them in Google Cloud Storage and access them as "foreign keys" in my Django database models. There seems to be a built-in library to bridge Django models and Google Cloud Storage: https://django-storages.readthedocs.io/en/latest/backends/gcloud.html.
Reference: upload image to server using Alamofire.
Before sending an image to the backend, the image is converted to JPEG data with UIImageJPEGRepresentation. How exactly does Alamofire then send the image via multipartFormData? Is it simply sending the raw JPEG data as a string?
I am asking this because I need to somehow convert the image data I receive on my backend, store it in a Django model, and simultaneously upload it to Google Cloud Storage.
From the documentation:
>>> class Resume(models.Model):
...     pdf = models.FileField(upload_to='pdfs')
...     photos = models.ImageField(upload_to='photos')
To store a local file in cloud storage and, at the same time, in a model field:
>>> from django.core.files.base import ContentFile
>>> obj1.pdf.save('django_test.txt', ContentFile('content'))
>>> obj1.pdf
<FieldFile: pdfs/django_test.txt>
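For context, the documentation's example assumes the storage backend has been wired up in settings.py; a minimal sketch, using setting names from the django-storages docs (the bucket name below is a placeholder), might look like:
# settings.py - minimal sketch; the bucket name is a placeholder
DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
GS_BUCKET_NAME = 'your-bucket-name'
# Optional: explicit credentials if you are not relying on application default credentials
# from google.oauth2 import service_account
# GS_CREDENTIALS = service_account.Credentials.from_service_account_file('path/to/key.json')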
Would something like this work for an image sent from Alamofire?
>>> obj1.photos.save(<Insert whatever was sent from Alamofire>)
If so, how do you access the data sent from Alamofire?
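For what it's worth, a minimal sketch of a Django view that could receive Alamofire's multipart upload might look like this (the part name "photo" and the view name are assumptions, not anything defined by Alamofire or the docs above):
# views.py - minimal sketch; the multipart part name "photo" is an assumption
# (it is whatever name the Swift client passes to multipartFormData.append)
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt

from .models import Resume  # the model from the documentation example above

@csrf_exempt
def upload_photo(request):
    # Alamofire's multipartFormData upload arrives as an ordinary
    # multipart/form-data POST, so the file shows up in request.FILES,
    # not as a string in the body.
    photo = request.FILES['photo']  # an UploadedFile instance
    obj = Resume()
    # Saving the field streams the bytes to the configured storage
    # (Google Cloud Storage via django-storages) and stores the file path.
    obj.photos.save(photo.name, photo, save=True)
    return JsonResponse({'path': obj.photos.name})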
Related
I'm running a web app based on Firebase Realtime Database and Firebase Storage.
I need to upload new images to the Firebase Google Cloud Storage bucket every hour via the Python google-cloud-storage library.
Here are the docs.
My code for image upload (img_src path is correct):
bucket = storage.bucket()
blob = bucket.blob(img_src)
blob.upload_from_filename(filename=img_path, content_type='image/png')
The image seems to be uploaded successfully, but when viewing it manually in Firebase Storage it doesn't load. All of the image's specs appear to match those of a manually uploaded image, which loads fine, yet the uploaded one behaves as if it were corrupted.
Thanks for the help!
Whenever you upload an image using the Firebase Console, an access token is automatically generated. However, if you upload an image using any Admin SDK or gsutil, you need to generate this access token yourself.
Here is an example of how to generate and set an access token for an image using the Admin Python SDK.
import firebase_admin
from firebase_admin import credentials
from firebase_admin import storage
# Import uuid4 to create the access token
from uuid import uuid4

cred = credentials.Certificate("path/to/your/service_account.json")
default_app = firebase_admin.initialize_app(cred, {
    'storageBucket': '<BUCKET_NAME>.appspot.com'
})

bucket = storage.bucket()
# img_src is the destination path in the bucket, as in your question
blob = bucket.blob(img_src)

# Create a new token (metadata values must be strings)
new_token = str(uuid4())
# Create a new dictionary with the metadata
metadata = {"firebaseStorageDownloadTokens": new_token}
# Set the metadata on the blob
blob.metadata = metadata

# Upload the file (img_path is the local file path, as in your question)
blob.upload_from_filename(filename=img_path, content_type='image/png')
Here is a quick explanation:
Import uuid4 to create a token: from uuid import uuid4
Create a new UUID4 and convert it to a string: new_token = str(uuid4())
Create a new dictionary with the key-value pair: metadata = {"firebaseStorageDownloadTokens": new_token}
Set it as the blob's metadata: blob.metadata = metadata
Upload the file: blob.upload_from_filename(...)
This solution can be implemented for any Admin SDK.
Firebase Support says that this is being fixed, but I think anyone having this problem should use this approach instead of waiting for Firebase to fix it.
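As a side note (not part of the original answer): this token is what appears in a Firebase Storage download URL, so if you want to build that URL yourself, a minimal sketch looks like this (it assumes the standard firebasestorage.googleapis.com URL format and reuses bucket, img_src and new_token from the example above):
# Sketch only: building the download URL from the token set above
from urllib.parse import quote

download_url = (
    "https://firebasestorage.googleapis.com/v0/b/"
    + bucket.name
    + "/o/" + quote(img_src, safe='')
    + "?alt=media&token=" + new_token
)
print(download_url)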
I am attempting to implement the ability to upload multiple images (along with some other data) to an API using data entered via a form in my Flutter application.
I am using https://pub.dev/packages/multi_image_picker, which stores all the images as a List<Asset>.
The API I am attempting to connect with says it requires the following fields:
firstname, lastname and images[].
I have started to encode the JSON body using:
var body = json.encode({"firstname": firstNameField, "lastname": lastNameField, "images": imageList});
but this failed. Does anyone have any suggestions?
Hey @Bollie, you can do this using the flutter_uploader package. It's very simple: you can post your data as form data along with multiple files/images.
Here is more info on how you can do it; hope it works for you: https://github.com/BlueChilli/flutter_uploader/issues/9
Feel free to ask any questions about this; I actually did it recently myself.
The Messenger Send API gives me back the response:
(#546) The type of file you're trying to attach isn't allowed. Please try again with a different format. error code: 546, error_subcode: 154502
However, if I host the same exact image on Google Cloud instead of Amazon S3, then the image sends fine.
My link to the AWS image:
https://s3.amazonaws.com/paloma-staging-public/files/conversation-step-56-80925.gif
My link to the google cloud image:
https://storage.googleapis.com/callparty/thumbsup.gif
Are there any special reasons why a link to an image stored on S3 would not work as an image attachment, while a link to an image stored on Google Cloud would work?
The answer was that for the AWS link the ContentType of the file was not set.
While uploading to S3 I had to manually set the ContentType of the file appropriately ("image/gif", "image/png", etc.); for Google Cloud Storage it must have been set automatically.
This is why the S3 link causes an auto-download, while the Google Cloud link displays the image in the browser.
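For illustration, a minimal sketch of setting the ContentType when uploading with boto3 (the bucket name and key here are placeholders, not the poster's actual values):
# Sketch: upload to S3 with an explicit ContentType using boto3
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    "thumbsup.gif",            # local file
    "my-bucket",               # destination bucket (placeholder)
    "files/thumbsup.gif",      # destination key (placeholder)
    ExtraArgs={"ContentType": "image/gif"},  # otherwise S3 stores it as binary/octet-stream
)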
I have a Core Data application using RestKit to retrieve data from a web service. The response I get from the web service is a JSON string like this:
"nodeRef": "workspace:\/\/SpacesStore\/b1d51831-990d-47a8-a018-f1c1bg33f594",
"type": "document",
"name": "x.jpg",
"displayName": "x.jpg",
"title": "myTestProject",
(This is some metadata about an image. If I want to retrieve the image itself, I should access it using this URL: http://x.co.uk/share/proxy/alfresco/api/node/workspace/SpacesStore/b1d51831-990d-47a8-a018-f1c1bg33f594/content/thumbnails/ipad1024?c=force - note I've changed the URL slightly so you won't be able to access it, and you also need authentication to access it.) So I was wondering how I would go about getting this image and storing it locally in my SQLite DB? I can obviously access the URL, but how exactly do I prompt it to download the image?
What I did in this situation was to do a lazy load of the image with something like:
// Synchronously fetch the image data from the URL (do this off the main thread)
image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:self.absoluteUri]]];
// JPGCompression is a CGFloat between 0.0 and 1.0 (e.g. 0.8)
NSData *imageData = UIImageJPEGRepresentation(image, JPGCompression);
Then store it, outside of RestKit. This was a while ago, but it worked for me.
Basically, extract the URL from RestKit, do a lazy load when the image is presented on screen, and then store it back to Core Data (or wherever) once you have loaded it.
Hope this helps.
If I'm using the GWT FileUpload widget and FormPanel, can someone explain how to handle uploads to the Blobstore on Google App Engine?
Take a look at gwtupload. There are examples of how to use it with the GAE Blobstore.
The Google Blobstore is specifically designed to upload and serve blobs via HTTP. The Blobstore service (obtained using BlobstoreServiceFactory.getBlobstoreService()) generates an HTTP POST action for you to use in the HTML form. By posting a file to it you upload your blob to the Blobstore. When you generate this action you provide a path to the handler (servlet) where you have access to the uploaded blob's key:
Map<String, BlobKey> blobs = blobstoreService.getUploadedBlobs(req);
BlobKey blobKey = blobs.get("data");
Note, that "data" is the file field in your form. All you have is a key to the blob (your file). From here you take control - you can save this key for later and/or immediately serve the blob on a page (using key):
BlobKey blobKey = new BlobKey(req.getParameter("blob-key"));
blobstoreService.serve(blobKey, res);
Of course, for details see Google documentation.
One nice feature of the Blobstore is that it's integrated with the Google Mapper (rudimentary map-reduce) service (a work in progress), which lets you process files uploaded as blobs line by line: http://ikaisays.com/2010/08/