I am making requests to my API to fetch images, and it returns the raw bytes of the image. I assign response.bodyBytes to a Uint8List.
Then I can display the image with an Image.memory() widget.
How do I store large numbers of images on the device, and check whether an image already exists locally before making more requests to my API?
I figured it out. I was using Flask for my API, so I switched to send_file() to return the rendered image directly; now I can use the standard cached_network_image package with a plain URL. I just had to change the route from POST to GET and pass the arguments as query params instead of headers, so I could use the imageUrl property.
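For anyone hitting the same thing, here is a minimal Flask sketch of that kind of endpoint; the route name, the id query param, and the storage lookup are illustrative, not from my actual setup:

```python
import io

from flask import Flask, abort, request, send_file

app = Flask(__name__)

def load_image_bytes(image_id):
    """Hypothetical lookup; replace with however your images are stored."""
    try:
        with open(f"images/{image_id}.png", "rb") as f:
            return f.read()
    except FileNotFoundError:
        return None

@app.route("/image", methods=["GET"])  # GET, so the client can treat it as a plain URL
def get_image():
    image_id = request.args.get("id")  # query param instead of a header
    if image_id is None:
        abort(400)
    data = load_image_bytes(image_id)
    if data is None:
        abort(404)
    # send_file returns the bytes as a normal image response
    return send_file(io.BytesIO(data), mimetype="image/png")
```

With an endpoint like that, the imageUrl property can point at e.g. /image?id=42 and cached_network_image handles the on-device caching and the "already downloaded?" check from the original question.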
Related
I am trying to send an API request from my Flutter app to filter the items it shows; I want the data filtered between two given values.
This is the URL I want:
http://3.237.223.130/careWorker/get-parttime-jobList?workingHoursFrom=02.30&workingHoursTo=03.00
but the API call that actually goes out from the app looks like this:
http://3.237.223.130/careWorker/get-parttime-jobList?workingHoursFrom=02%3A30&workingHoursTo=03%3A00
The ':' is being converted into '%3A'.
How can I send the API request in the form I want?
Both of them carry the same value: '%3A' is just the URL-encoded form of ':', so the server decodes '02%3A30' back to '02:30'.
The query parameters are simply being URL-encoded when the request is sent, which is standard behaviour for HTTP clients.
You can test URL-encoded and decoded values with any online URL encoder/decoder.
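If you want to check it offline, a quick round trip with Python's standard library shows the same thing:

```python
from urllib.parse import quote, unquote

raw = "02:30"
encoded = quote(raw, safe="")  # '02%3A30' -- what you see in the outgoing URL
decoded = unquote(encoded)     # '02:30'  -- what the server actually receives

assert decoded == raw  # both forms carry the same value
print(encoded, decoded)
```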
I am a beginner in iOS. How do I parse an image URL from a server response and display the image in an image view? Previously I stored the server URL in a String variable and converted it to an NSURL before displaying the image, but now a whole batch of images is coming in the JSON response.
The screenshot above contains only one element; 20 to 30 elements like it come from the server. I want to access the profile element and its photo key.
Please ignore the search-loop code in the screenshot.
In the screenshot I am mapping the data to a table view cell.
Can anyone suggest how to parse this JSON response using the Alamofire framework? I am also using the AlamofireImage framework to load the images.
Thanks in advance.
After converting the string to an NSURL, you need to download the image from the server. For that you can use NSData directly, or a library such as https://github.com/SDWebImage/SDWebImage or https://github.com/onevcat/Kingfisher.
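The flow itself is language-independent: decode the JSON array, read each element's profile/photo value, then download the bytes behind that URL. A minimal sketch of that flow in Python (the endpoint and the exact key layout are assumptions based on the screenshots described above):

```python
import json
from urllib.request import urlopen

RESPONSE_URL = "https://example.com/api/users"  # hypothetical endpoint

with urlopen(RESPONSE_URL) as resp:
    users = json.load(resp)  # the 20-30 element JSON array

for user in users:
    photo_url = user["profile"]["photo"]  # the nested photo key
    with urlopen(photo_url) as img:
        image_bytes = img.read()  # these bytes back a table view cell's image
    print(photo_url, len(image_bytes))
```

In the app itself, AlamofireImage or one of the libraries above replaces the manual download step and caches the results for you.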
Imagine I am creating an API that allows a user to attach an image to their profile, where the image may come either as a binary submission in the body or as a URL that the server will retrieve and process.
Assuming the API expects a PUT with binary image data to
/user/jon/image
When adding the URL functionality, which of the following would be preferable?
A:
PUT to /user/jon/image/url
passing the URL in the body
or
B:
PUT to /user/jon/image/
passing the URL in the body and setting the MIME type to tell the host whether the content is an image or a URL?
Is there a standard way of dealing with this situation? I feel that using MIME types to describe the payload is more semantically correct, but a little less discoverable.
Thanks
I once had this same problem. I solved it by first posting the image with PUT /user/jon/image/ and then posting the URL with PUT to /user/jon/image/url.
The problem is that the user may post an image and forget about the URL. I solved this by saving the image temporarily in a session; when the URL was posted, I saved the URL and the image at the same time.
The catch is that this is not RESTful, because a RESTful server doesn't keep sessions. But being 100% RESTful is almost impossible, so it's your choice.
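For what it's worth, option B can be implemented with a simple branch on the request's Content-Type. A hedged Flask sketch (the text/uri-list media type and the store_image helper are my own choices, not from the question):

```python
import urllib.request

from flask import Flask, request

app = Flask(__name__)

@app.route("/user/<name>/image", methods=["PUT"])
def put_image(name):
    ctype = request.content_type or ""
    if ctype.startswith("image/"):
        # Raw binary submission: the body *is* the image.
        image_bytes = request.get_data()
    elif ctype.startswith("text/uri-list"):
        # URL submission: the body names an image for the server to fetch.
        url = request.get_data(as_text=True).strip()
        with urllib.request.urlopen(url) as resp:
            image_bytes = resp.read()
    else:
        return "Unsupported media type", 415

    store_image(name, image_bytes)  # hypothetical persistence helper
    return "", 204

def store_image(name, data):
    with open(f"{name}.img", "wb") as f:
        f.write(data)
```

This keeps a single resource at /user/jon/image and lets the media type describe the payload, which matches the "semantically correct" instinct in the question.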
We have stored users' images in a database and exposed REST services to read them. Currently, if I invoke the REST service in a browser, the browser displays the image directly, because I send the image bytes as the raw response. Now we want to return some other attributes as well, such as whether the user has an image associated with them. So we decided to create a bean with those attributes plus a byte array containing the binary content of the image. Is it possible to achieve this, i.e. can the client reconstruct the image from the byte array?
Yes, it is definitely possible to send images through a REST service, and the client calling the service can use the image. What you have to do is convert your binary data to a Base64-encoded string, set the proper content type, and then return that in the response from your REST service.
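A small sketch of both halves in Python; the hasImage and imageData field names are made up to mirror the bean described above:

```python
import base64
import json

# --- Server side: wrap the image bytes and attributes in one JSON "bean" ---
def build_response(image_bytes):
    payload = {
        "hasImage": image_bytes is not None,
        "imageData": (
            base64.b64encode(image_bytes).decode("ascii")
            if image_bytes is not None
            else None
        ),
    }
    return json.dumps(payload)  # serve with Content-Type: application/json

# --- Client side: reconstruct the original bytes from the Base64 string ---
def extract_image(body):
    payload = json.loads(body)
    if not payload["hasImage"]:
        return None
    return base64.b64decode(payload["imageData"])

original = b"\x89PNG..."  # stand-in for real image bytes
assert extract_image(build_response(original)) == original
```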
In the app I want to post a photo and some text. I can post them when I use locally stored data from the app's resources, but when the data (in JSON format) comes from the server at run time, I am not able to post the image and text from that JSON.
Is there any way to post the data at run time, or do I have to store it on the client side? In the latter case the app would become bulky, because the data can differ from location to location.
I am not sure, but you may be asking about posting an image using a URL instead of assuming the data is local. If so, see this blog post - https://developers.facebook.com/blog/post/526/ - which introduced the ability to post an image by passing a "url" parameter through the Graph API.
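A hedged sketch of what that looks like with Python's requests library; the /me/photos edge, token, and message are placeholders, but the "url" parameter is the feature the blog post describes:

```python
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # placeholder
PHOTO_URL = "https://example.com/images/photo.jpg"  # image already hosted online

resp = requests.post(
    "https://graph.facebook.com/me/photos",
    data={
        "url": PHOTO_URL,        # let Facebook fetch the image itself
        "message": "Some text",  # the accompanying text
        "access_token": ACCESS_TOKEN,
    },
)
print(resp.status_code, resp.json())
```

Because Facebook fetches the image from the URL, the app never has to store the server-provided data locally.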