I have an application which uploads files from an iPhone to a web server.
The problem is that I want to give users a control from which they can select a file/photo on the device, which is then uploaded to the server.
Can anyone help me?
You can get the UIImage content as a JPEG or PNG wrapped in NSData:
NSData *data = UIImageJPEGRepresentation(image, 1);
if (data) { // nil may be returned if encoding fails
    // do stuff here.
    // see: link to other answer on SO below.
    // if you want to write to file, try:
    [data writeToFile:filePath atomically:YES];
}
File Upload to HTTP server in iphone programming
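If it helps, here is a rough sketch of attaching that NSData to a multipart/form-data POST. The URL, field name, and filename are placeholders, not anything from your server:

// Sketch only: the upload URL, field name and filename below are made up.
NSData *imageData = UIImageJPEGRepresentation(image, 0.8);
NSString *boundary = @"----MyUploadBoundary";

NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://example.com/upload"]];
[request setHTTPMethod:@"POST"];
[request setValue:[NSString stringWithFormat:@"multipart/form-data; boundary=%@", boundary] forHTTPHeaderField:@"Content-Type"];

NSMutableData *body = [NSMutableData data];
[body appendData:[[NSString stringWithFormat:@"--%@\r\n", boundary] dataUsingEncoding:NSUTF8StringEncoding]];
[body appendData:[@"Content-Disposition: form-data; name=\"file\"; filename=\"photo.jpg\"\r\n" dataUsingEncoding:NSUTF8StringEncoding]];
[body appendData:[@"Content-Type: image/jpeg\r\n\r\n" dataUsingEncoding:NSUTF8StringEncoding]];
[body appendData:imageData];
[body appendData:[[NSString stringWithFormat:@"\r\n--%@--\r\n", boundary] dataUsingEncoding:NSUTF8StringEncoding]];
[request setHTTPBody:body];

// Fire it off asynchronously (iOS 5+).
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *responseData, NSError *error) {
    NSLog(@"Upload finished, error: %@", error);
}];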
I think you want to use a UIImagePickerController; that is the system component that lets a user choose a photo.
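For what it's worth, a bare-bones sketch of presenting the picker and grabbing the chosen image might look like this (assuming ARC and a view controller that adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate):

// Present the system photo picker (sketch).
- (void)choosePhoto {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}

// Delegate callback: grab the selected UIImage, then convert/upload it as in the answer above.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    [picker dismissViewControllerAnimated:YES completion:nil];
    // ...UIImageJPEGRepresentation(image, 1) and POST it to your server...
}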
Related
I am working on an iPhone application.
The idea of the application is to allow the user to download a picture and save it on their phone so they can open it later (a normal caching system).
What I am doing now is getting the UIImage from the server and then saving it on the phone using:
UIImageWriteToSavedPhotosAlbum
And then I use the imagePickerController to allow the user to load the image.
My question is:
Is there a way I can save the image in the application folder? The idea is that I don't want the user to be able to delete the image from outside the application (I have a "clear images" button). Is there a place I can cache the images other than the Photo Library, maybe in the application bundle or folder?
Thank you
Use either of the following to save the UIImage into your app's Documents directory:
[UIImageJPEGRepresentation(image, JPEG_COMPRESSION_QUALITY) writeToFile:photoPath atomically:YES];
or
[UIImagePNGRepresentation(image) writeToFile:photoPath atomically:YES];
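Neither snippet shows where photoPath comes from; here is a sketch of building it in the Documents directory (the file name and the 0.8 quality are just examples):

// Build a path in the app's Documents directory (file name is an example).
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *photoPath = [documentsDir stringByAppendingPathComponent:@"cachedImage.jpg"];

// Save the image...
[UIImageJPEGRepresentation(image, 0.8) writeToFile:photoPath atomically:YES];

// ...and load it back later:
UIImage *cached = [UIImage imageWithContentsOfFile:photoPath];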
I currently need to upload large files from an iDevice to a server API. For this I'm attempting to use the ASIHTTPRequest library as it automatically supports uploading large files in queued chunks (before this I simply created an NSData instance with the bytes for the entire file at once and attached this to the POST message, but this causes the application to crash on larger files due to an excessive amount of RAM usage).
The problem is that when you need to upload files and add a file to the HTTP Post message, this is the syntax you need to use:
[theUploadRequest setFile:@"" forKey:@"videoupload"];
setFile requires a file path as a string. The problem I'm currently having is that it does not seem like you are allowed to simply take the file path of a file that is not in your application's sandbox, and I need to upload a file which is not in my application but outside of it, in the standard camera roll.
I made this quick test to see if I could create an NSData object and fill it with data from a file in the camera roll, providing a path to it like this:
NSData *testData = [NSData dataWithContentsOfURL:theContent.defaultRepresentation.url];
NSLog(@"THE SIZE OF THE TEST DATA: %lu", (unsigned long)testData.length);
Note that "theContent" is an instance of an ALAsset, and is a file retrieved from the cameraroll. The result of this is simply a length of 0, which I suppose means you can't simply do that.
Is there any way around this? Or would I have to somehow import the video file into the application's sandbox?
So if I understand correctly, you want to upload stuff straight from the camera roll? Based on the docs I'd say the important piece of code you need is:
NSString *filePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];
The info dictionary is passed to your media picker's delegate:
- (void) imagePickerController: (UIImagePickerController *) picker
didFinishPickingMediaWithInfo: (NSDictionary *) info;
You should then be able to use that filePath in your call to setFile.
Take a look here; you'll probably find something useful.
Hope this helps.
EDIT:
For something simpler, you can use the bit of code posted in this answer.
This implies that you need to copy the file into your own sandbox before uploading it, especially if it is a big one; a sketch of that follows below.
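A rough sketch of that copy step, assuming you already have the UIImagePickerControllerMediaURL from the picker delegate and that theUploadRequest is the request from the question (the destination file name is arbitrary):

// Copy the picked video into our own tmp directory so the upload request
// can read it from a path we are allowed to access (sketch only).
NSURL *mediaURL = [info objectForKey:UIImagePickerControllerMediaURL];
NSString *sourcePath = [mediaURL path];
NSString *destinationPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"upload.mov"];

NSError *copyError = nil;
[[NSFileManager defaultManager] removeItemAtPath:destinationPath error:nil]; // ignore "file not found"
if ([[NSFileManager defaultManager] copyItemAtPath:sourcePath toPath:destinationPath error:&copyError]) {
    [theUploadRequest setFile:destinationPath forKey:@"videoupload"];
} else {
    NSLog(@"Could not copy picked file: %@", copyError);
}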
I am working on an app that uses the photos and videos in the Assets Library, and I am trying to determine once and for all whether there is any way around asking the user for permission to access location data in order to get these assets. I understand that the EXIF data includes GPS information, and that makes enough sense to me.
Note: I have searched through StackOverflow and I have found similar questions, and I am not making this post simply to add one more to the list. I am asking specifically about one (apparent) counterexample.
When using Instagram for the first time, I am able to browse through my photo album, select photos, edit them and share them all without ever being prompted about location services. I am only prompted when I choose to click on the button labeled "Enable Geotagging." Checking the settings tab, it looks like if I don't ever click that button, Instagram doesn't even come up in the Location Services section of my Settings.
My question is, how did Instagram get away with this? Anyone have any ideas? I'd like to figure out if I can mimic their implementation somehow so my users aren't shut out from getting their camera assets if they say no to this prompt.
The explanation is quite simple: Instagram uses UIImagePickerController. UIImagePickerController works without Location Services enabled, but you don't get EXIF data using this method.
UIImagePickerController can retrieve metadata (including GPS) only through UIImagePickerControllerReferenceURL, which you then have to pass to AssetsLibrary methods, and those again require Location Services to be enabled.
Cheers,
Hendrik
As holtmann mentioned, reading the UIImagePickerControllerReferenceURL triggers the Location Services prompt. If you only need the image data, not the metadata, you can get that from the UIImagePickerControllerOriginalImage and UIImagePickerControllerEditedImage keys instead (I'm checking EditedImage first and then falling back to OriginalImage if EditedImage is nil). This doesn't require the Assets Library and doesn't require location access.
Here's how I'm using this in my app, including saving a local copy of the image for further editing:
- (void)imagePickerController:(UIImagePickerController *)controller didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // get the selected photo as a UIImage
    UIImage *photo = [info objectForKey:@"UIImagePickerControllerEditedImage"];
    if (!photo) {
        photo = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    }

    // save the photo to the app's Documents folder
    if (photo) {
        NSString *extension = @"jpg";
        NSString *filename = [NSString stringWithFormat:@"%@.%@", self.defaultTitle, extension]; // self.defaultTitle is defined elsewhere in my app
        NSString *path = [[NSHomeDirectory() stringByAppendingPathComponent:@"Documents"] stringByAppendingPathComponent:filename];
        [UIImageJPEGRepresentation(photo, 0.8) writeToFile:path atomically:YES];
    }
}
On the iPhone I saved an image to the Photo Library with modifications (appending data after the image code), using the Assets Library to save the NSData directly to the Photo Library. Now I want to read the image back from the Photo Library as NSData. If I read it as a UIImage, it changes the data inside the image.
How can I read the photo from the photo library as NSData? I tried looking into the reference URL in 4.1 but no luck.
Edit: I edited to explain why I'm saving it as NSData. The URL method in the answers works if you just want pure NSData, but it does not help in the context I am asking about.
Edit 2: The answer with getBytes was the one that did it for me. The code I used was:
NSError *error = nil;
uint8_t *bytes = malloc([representation size]);
NSUInteger length = [representation getBytes:bytes fromOffset:0 length:[representation size] error:&error];
This was able to get me everything inside the file which gave me the image code PLUS what I added in NSData form.
Edit 3:
Is there a way to do this now in iOS 10 with PHPhotoLibrary, which replaces the Assets Library? I need the image as NSData, i.e. the original RAW image with no modifications.
You should look into the getBytes method of ALAssetRepresentation.
This gives you the original RAW image data of your image and also makes it possible to use a buffer to process the image instead of reading the image into memory all at once.
And of course you can always generate NSData from the getBytes method.
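A sketch of that buffered approach, reading the representation in fixed-size chunks into an NSMutableData (the 128 KB chunk size is arbitrary, and asset is assumed to be your ALAsset):

// Read the ALAssetRepresentation in chunks instead of all at once (sketch).
ALAssetRepresentation *representation = [asset defaultRepresentation];
long long totalSize = [representation size];
NSMutableData *assetData = [NSMutableData dataWithCapacity:(NSUInteger)totalSize];

NSUInteger chunkSize = 128 * 1024; // arbitrary chunk size
uint8_t *buffer = malloc(chunkSize);
long long offset = 0;
NSError *error = nil;

while (offset < totalSize) {
    NSUInteger bytesRead = [representation getBytes:buffer fromOffset:offset length:chunkSize error:&error];
    if (bytesRead == 0) {
        NSLog(@"Failed to read asset bytes: %@", error);
        break;
    }
    [assetData appendBytes:buffer length:bytesRead];
    offset += bytesRead;
}
free(buffer);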
Cheers,
Hendrik
If you already have your ALAsset, you should be able to get an NSData of the default representation using the following:
NSData *data = [NSData dataWithContentsOfURL:[[asset defaultRepresentation] url]];
Playing a YouTube video in the app is easy and well documented all around.
There are two problems with that:
after closing the YouTube player, if the user wants to play the video again, they have to wait for it to stream online again
it can't be played offline (e.g. load a video at home to watch on the road)
Does anyone have code to:
download a YouTube video to the Documents folder and show the progress of the download
play the downloaded video by loading the file from the Documents folder (meaning even when not connected to the internet)
To download the video from YouTube:
Get the URL to download from, via the YouTube API or whatever other method.
Create an NSOutputStream or NSFileHandle opened on a temporary file (in NSTemporaryDirectory() or a temp-named file in your Documents directory).
Set up your progress bar and whatever else you need to do.
Allocate and start an NSURLConnection to fetch the file from the URL. Do not use sendSynchronousRequest:returningResponse:error:, of course.
In the connection:didReceiveResponse: delegate method, read out the length of data to be downloaded for proper updating of the progress bar.
In the connection:didReceiveData: delegate method, write the data to the output stream/file handle and update the progress bar as necessary.
In connectionDidFinishLoading: or connection:didFailWithError:, close the output stream/file handle and rename or delete the temporary file as appropriate.
To play it back, just use NSURL's fileURLWithPath: to create a URL pointing to the local file in the Documents directory and play it as you would any remote video.
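A condensed sketch of those delegate methods; videoURL, destinationPath, fileHandle, connection, expectedLength, receivedLength and progressView are assumed properties (not names from the steps above), and the playback part needs MediaPlayer.framework:

// Sketch of the delegate-based download described above, assuming ARC.
- (void)startDownload {
    [[NSFileManager defaultManager] createFileAtPath:self.destinationPath contents:nil attributes:nil];
    self.fileHandle = [NSFileHandle fileHandleForWritingAtPath:self.destinationPath];
    self.receivedLength = 0;

    NSURLRequest *request = [NSURLRequest requestWithURL:self.videoURL];
    self.connection = [[NSURLConnection alloc] initWithRequest:request delegate:self];
}

- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
    // Total length for the progress bar (may be NSURLResponseUnknownLength).
    self.expectedLength = [response expectedContentLength];
}

- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    [self.fileHandle writeData:data];
    self.receivedLength += data.length;
    if (self.expectedLength > 0) {
        self.progressView.progress = (float)self.receivedLength / (float)self.expectedLength;
    }
}

- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
    [self.fileHandle closeFile];
    // Play back from disk, even with no network connection.
    NSURL *localURL = [NSURL fileURLWithPath:self.destinationPath];
    MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc] initWithContentURL:localURL];
    [self presentMoviePlayerViewControllerAnimated:player];
}

- (void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error {
    [self.fileHandle closeFile];
    [[NSFileManager defaultManager] removeItemAtPath:self.destinationPath error:nil];
    NSLog(@"Download failed: %@", error);
}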
I've used classes from this project: https://github.com/larcus94/LBYouTubeView
It works fine for me; I can download YouTube videos.
I used this code:
LBYouTubeExtractor *extractor = [[[LBYouTubeExtractor alloc] initWithURL:[NSURL URLWithString:[NSString stringWithFormat:@"http://www.youtube.com/watch?v=%@", self.videoID]] quality:LBYouTubeVideoQualityLarge] autorelease];

[extractor extractVideoURLWithCompletionBlock:^(NSURL *videoURL, NSError *error) {
    if (!error) {
        NSLog(@"Did extract video URL using completion block: %@", videoURL);
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSData *data = [NSData dataWithContentsOfURL:videoURL];
            NSString *pathToDocs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
            NSString *filename = [NSString stringWithFormat:@"video_%@.mp4", self.videoID];
            [data writeToFile:[pathToDocs stringByAppendingPathComponent:filename] atomically:YES];
            NSLog(@"File %@ successfully saved", filename);
        });
    } else {
        NSLog(@"Failed extracting video URL using block due to error: %@", error);
    }
}];
You can show the progress of the download using the technique described in the posts above.
Here is my example: https://github.com/comonitos/youtube_video
I used the PSYouTubeExtractor.h class by Peter Steinberger. It can get the YouTube MP4 video URL, and then downloading and viewing is not a problem:
NSURLConnection + NSNotificationCenter + PSYouTubeExtractor + NSMutableData
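That list is terse, so here is one way those pieces might fit together; the notification name and the videoData/videoID properties are invented for this sketch, and PSYouTubeExtractor's own API is not shown:

// Sketch: accumulate the download in an NSMutableData and notify the rest of
// the app via NSNotificationCenter once the file has been written to disk.
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    [self.videoData appendData:data]; // self.videoData is an NSMutableData
}

- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *path = [docs stringByAppendingPathComponent:[NSString stringWithFormat:@"video_%@.mp4", self.videoID]];
    [self.videoData writeToFile:path atomically:YES];

    [[NSNotificationCenter defaultCenter] postNotificationName:@"VideoDownloadFinished"
                                                        object:self
                                                      userInfo:@{@"path": path}];
}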
Check these projects:
https://github.com/iosdeveloper/MyTube
https://github.com/pvinis/mytube
These will definitely help you!
I don't personally know how to download YouTube videos (and the code is too big to put in an answer here).
However, here's a complete YouTube downloading example.
It's an open-source YouTube downloader called LoadTube: here's a link to the source code.
I would play the video and figure out where the temp file is being stored. If you can get access to it, copy it into some document folder for offline viewing.