I'm using this tutorial for capturing images with AVFoundation. I'm trying to get the captured image from CaptureSessionManager (an NSObject class) back to the AROverlayViewController. How do I go about doing this?
I tried putting @property (nonatomic, retain) UIImage *finalImage; in AROverlayViewController.h and then setting arController.finalImage = image; in CaptureSessionManager.m, but the log says that finalImage is null.
Any thoughts on how best to do this? Thanks!
There is a property on the CaptureSessionManager class,
@property (nonatomic, retain) UIImage *stillImage;
which you can read like this:
UIImage *final = [captureManager stillImage];
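If stillImage is still nil when you read it (the same symptom as finalImage in the question), it probably just has not been set yet, since the capture completes asynchronously. A minimal sketch, assuming the view controller keeps its CaptureSessionManager in a captureManager property and that stillImage is assigned through its setter, using key-value observing:
// Sketch only -- assumes self.captureManager is the tutorial's CaptureSessionManager
// and that it sets stillImage via the property setter when the capture finishes.
- (void)viewDidLoad {
    [super viewDidLoad];
    [self.captureManager addObserver:self
                          forKeyPath:@"stillImage"
                             options:NSKeyValueObservingOptionNew
                             context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"stillImage"]) {
        // The capture has finished; the image is now safe to read.
        UIImage *final = [self.captureManager stillImage];
        self.finalImage = final;
    }
}

- (void)dealloc {
    [self.captureManager removeObserver:self forKeyPath:@"stillImage"];
    [super dealloc]; // manual retain/release, matching the question's code
}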
I finally got it - add UIImage *test = [captureManager stillImage]; in AROverlayViewController.m. God, that took me forever.
NSData *imageData;

- (void)imagePickerController:(UIImagePickerController *)reader
 didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self performSelectorInBackground:@selector(showLoading) withObject:nil];

    // Grab the original image and keep its PNG data around (manual retain/release).
    UIImage *capturedImage = [info objectForKey:UIImagePickerControllerOriginalImage];
    imageData = [UIImagePNGRepresentation(capturedImage) retain];

    [reader dismissModalViewControllerAnimated:YES];
}
You can pass this NSData to the other view and access the image from there.
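For example, a minimal sketch of the hand-off (the DetailViewController class and its imageData/imageView properties here are hypothetical, just to show the idea):
// In the picker's view controller, after the delegate method above:
DetailViewController *detail = [[DetailViewController alloc] initWithNibName:@"DetailViewController" bundle:nil];
detail.imageData = imageData;                 // hand the data to the next view (hypothetical property)
[self presentModalViewController:detail animated:YES];
[detail release];

// In DetailViewController's viewDidLoad, rebuild the image:
self.imageView.image = [UIImage imageWithData:self.imageData];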
I have an iOS app that uses RestKit to pull JSON-formatted data from a server into a Core Data entity. One of the attributes in the data is a URL for the image associated with the particular article. I'm trying to load that image into my collection view controller. This is what I've been trying, with no success.
From within
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
I am trying this
NSManagedObject *object = [self.fetchedResultsController objectAtIndexPath:indexPath];
NSURL *photoURL = [object valueForKey:@"imageUrl"];
NSData *photoData = [NSData dataWithContentsOfURL:photoURL];
cell.cellimg = [[UIImage alloc] initWithData:photoData];
[cell.title setText:[object valueForKey:@"title"]];
If I comment out the attempt to grab and load the image, the view loads fine with a default image and the titles of the articles received from the JSON data. Based on that, I know everything is coming in correctly.
I have also determined that photoURL is being assigned the correct URL string. But each time I hit the NSData line, the console prints "-[__NSCFString isFileURL]: unrecognized selector sent to instance...."
I honestly don't know if this is even the correct way to do this, or if it will work at all. I am pretty new at this, so any help would be great. Given that I am new, some small code examples would really help, as well as an explanation of the proper way to approach this.
Just in case, here is the header file where I define cellimg and title:
#import <UIKit/UIKit.h>
@interface GlobismNewsCollectionViewCell : UICollectionViewCell
@property (strong, nonatomic) IBOutlet UIImage *cellimg;
@property (weak, nonatomic) IBOutlet UILabel *title;
@property (weak, nonatomic) NSURL *imageUrl;
@end
I think you are using an NSString where an NSURL is expected. Try this instead:
NSURL *photoURL = [NSURL URLWithString:(NSString *)[object valueForKey:@"imageUrl"]];
Thanks to Dharmbir I had this issue wrapped up within 25 minutes of asking the question!
To clarify for anyone else new like me, I changed the line Dharmbir suggested, so the block now looks like this:
NSManagedObject *object = [self.fetchedResultsController objectAtIndexPath:indexPath];
NSURL *photoURL = [NSURL URLWithString:(NSString *)[object valueForKey:@"imageUrl"]];
NSData *photoData = [NSData dataWithContentsOfURL:photoURL];
[cell.cellimg setImage:[[UIImage alloc] initWithData:photoData]];
[cell.title setText:[object valueForKey:@"title"]];
Notice the 4th line in the block also changed from this
cell.cellimg = [[UIImage alloc] initWithData:photoData];
to this
[cell.cellimg setImage:[[UIImage alloc] initWithData:photoData]];
I also had to change my cellimg property to this
@property (strong, nonatomic) IBOutlet UIImageView *cellimg;
from this
@property (strong, nonatomic) IBOutlet UIImage *cellimg;
I had to switch to the UIImageView class because the view in my storyboard cell is a UIImageView. Hope this helps someone in need. Thanks again, Dharmbir!
I am integrating Instagram into my application using the instagram-ios-sdk. I can successfully log in to Instagram and obtain an access token, but when I then try to post a picture taken with UIImagePickerController using UIDocumentInteractionController, the image is not posted. The code to send a picture is given below:
- (void)_startUpload:(UIImage *)image {
    NSLog(@"Image Object = %@", NSStringFromCGSize(image.size));
    NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Test.igo"];
    [UIImageJPEGRepresentation(image, 1.0) writeToFile:jpgPath atomically:YES];
    NSLog(@"file url %@", jpgPath);
    NSURL *igImageHookFile = [[NSURL alloc] init];
    igImageHookFile = [NSURL fileURLWithPath:jpgPath];
    NSLog(@"File Url = %@", igImageHookFile);
    documentInteractionController.UTI = @"com.instagram.photo";
    [UIDocumentInteractionController interactionControllerWithURL:igImageHookFile];
    [self setupControllerWithURL:igImageHookFile usingDelegate:self];
    [documentInteractionController presentOpenInMenuFromRect:CGRectZero inView:self.view animated:YES];
}
- (UIDocumentInteractionController *)setupControllerWithURL:(NSURL *)fileURL usingDelegate:(id <UIDocumentInteractionControllerDelegate>)interactionDelegate {
    NSLog(@"%@", fileURL);
    UIDocumentInteractionController *interactionController =
        [UIDocumentInteractionController interactionControllerWithURL:fileURL];
    interactionController.delegate = interactionDelegate;
    return interactionController;
}
I saved the image with the .igo extension, at a resolution of 612 × 612. But the image is still not posted to Instagram. Am I missing something? Can anyone help me with this?
Thanks
First off, in your code you're not assigning the return value of setupControllerWithURL:usingDelegate: to anything, so that method isn't actually accomplishing anything; it just creates a new instance of UIDocumentInteractionController and discards it.
Secondly, from the documentation:
"Note that the caller of this method needs to retain the returned object."
You're not retaining the document controller (or assigning it to a strongly referenced property in the case of ARC) from what I can tell.
Try this -
In your @interface:
@property (nonatomic, strong) UIDocumentInteractionController *documentController;
In your @implementation:
self.documentController = [UIDocumentInteractionController interactionControllerWithURL:igImageHookFile];
self.documentController.delegate = self;
self.documentController.UTI = @"com.instagram.photo";
[self.documentController presentOpenInMenuFromRect:CGRectZero inView:self.view animated:YES];
Also, the line NSURL *igImageHookFile = [[NSURL alloc] init]; is unnecessary because in the next line igImageHookFile = [NSURL fileURLWithPath:jpgPath]; you're creating a new instance and discarding the first one. Just use NSURL *igImageHookFile = [NSURL fileURLWithPath:jpgPath];
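Putting those pieces together, a sketch of _startUpload: with the changes applied (still assuming the documentController property declared above):
- (void)_startUpload:(UIImage *)image {
    // Write the JPEG into Documents with the extension the Instagram hook expects.
    NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Test.igo"];
    [UIImageJPEGRepresentation(image, 1.0) writeToFile:jpgPath atomically:YES];

    NSURL *igImageHookFile = [NSURL fileURLWithPath:jpgPath];

    // Keep a strong reference so the controller survives until the menu is shown.
    self.documentController = [UIDocumentInteractionController interactionControllerWithURL:igImageHookFile];
    self.documentController.delegate = self;
    self.documentController.UTI = @"com.instagram.photo";
    [self.documentController presentOpenInMenuFromRect:CGRectZero inView:self.view animated:YES];
}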
I am loading images from a local XML file like this:
<?xml version="1.0" encoding="UTF-8"?>
<Books>
<book id="1">
<img>/Users/admin/imagefolder/Main_pic.png</img>
</book>
</Books>
I am parsing the file, and it parses the image path from the XML. In the NSLog output I can see
<UIImage: 0x71883d0>
but when I view it on the iPhone in a UIImageView, no image is shown. I set the image in the viewDidLoad method like this:
- (void)viewDidLoad {
    [super viewDidLoad];
    app = [[UIApplication sharedApplication] delegate];
    UIImage *image = [UIImage imageWithContentsOfFile:@"/Users/admin/imagefolder/Main_pic.png"];
    [theLists setImg:image];
}
Where am I going wrong? How do I get the image to show in a UIImageView on the iPhone?
Edited:
"theLists" is a object of the file where i had defined the UIImage Object say i have a file "list.h" like this, where i define the image
@interface list : NSObject
@property (nonatomic, retain) UIImage *img;
Now in the app delegate (appdelegate.m) I import "list.h" and declare it like this:
@property (nonatomic, retain) list *theLists;
and this theLists is what I use in viewDidLoad.
Edited-2
My parsing code is here,
NSString *path = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"sample.xml"];
NSData *data = [[NSData alloc] initWithContentsOfFile:path];
NSXMLParser *xmlParser = [[NSXMLParser alloc] initWithData:data];
egsParser *theParser =[[egsParser alloc] initParser];
[xmlParser setDelegate:theParser];
If your images are stored in an array, you should use something like this code:
UIImage *image = [UIImage imageWithContentsOfFile:[NSString stringWithFormat:@"%@", ((list *)[yourImagesArray objectAtIndex:i]).img]];
to access your data if you are storing it as an NSString (i is the index of the image in the array),
or
UIImage *image = (((list *)[yourImagesArray objectAtIndex:i]).img);
if you are getting it as a UIImage.
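In either case, the parsed image still has to end up on a UIImageView that is on screen. A minimal sketch, where imageView is an assumed IBOutlet on the view controller (not from the original answer):
UIImage *image = ((list *)[yourImagesArray objectAtIndex:i]).img;
if (image) {
    [self.imageView setImage:image];   // imageView is a hypothetical IBOutlet
} else {
    NSLog(@"No image could be loaded from the path in the XML");
}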
I have an array of image URLs. I want to show the images in a UIImageView.
Right now I convert each URL to NSData, convert that into a UIImage, and then load it into the UIImageView.
But this takes a lot of time.
Is there a better, faster way to load the images?
Despite all of the answers on here telling you to do this in one line of code, it will sadly make no difference to the URL connection speed OR the data/image decoding. If you want a faster way to TYPE the code then fine, but I would use a category added to UIImageView....
@interface UIImageView (URL)
- (void)loadFromUrl:(NSString *)aUrl;
@end

@implementation UIImageView (URL)
- (void)loadFromUrl:(NSString *)aUrl {
    NSURL *url = [NSURL URLWithString:aUrl];
    NSData *data = [NSData dataWithContentsOfURL:url];
    UIImage *image = [UIImage imageWithData:data];
    if (image != nil) {
        [self setImage:image];
    }
}
@end
Now you can include the header and do...
[myImageView loadFromUrl:@"http://myurl.com/image.jpg"];
For more categories (I will be adding this one to my list!) check here. Those are all my useful ones, you may find them useful too! :)
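As a side note (not part of the original answer), if the real problem is that the synchronous download blocks the main thread, a hedged variation of the same category could push the work onto a background queue with GCD:
- (void)loadFromUrlAsync:(NSString *)aUrl {
    // Download and decode off the main thread...
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:aUrl]];
        UIImage *image = [UIImage imageWithData:data];
        // ...then hop back to the main thread for the UIKit work.
        dispatch_async(dispatch_get_main_queue(), ^{
            if (image != nil) {
                [self setImage:image];
            }
        });
    });
}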
Use everything in a single statement.
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:MyURL]]];
Or, using the same [UIImage imageWithData:[NSData dataWithContentsOfURL:]] pattern with an existing NSURL:
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:imageUrl]];
Try this:
image is a UIImage property and imageView is a UIImageView
NSData *receivedData = [NSData dataWithContentsOfURL:[NSURL URLWithString:@"yoururl"]];
self.image = nil;
UIImage *img = [[UIImage alloc] initWithData:receivedData];
self.image = img;
[img release];
[self.imageView setImage:self.image];
I am trying to pick an image from the photo library or from the camera. The delegate method:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)editingInfo
gives me the UIImage object. I need to find the size of the image in bytes for my application.
Is there any way I can get the file type of the image and also the size in the bytes?
Any kind of help would be highly appreciated.
Thanks in advance
Try the following code:
NSData *imageData = [[NSData alloc] initWithData:UIImageJPEGRepresentation(image, 1.0)];
NSUInteger imageSize = imageData.length;
NSLog(@"SIZE OF IMAGE: %lu", (unsigned long)imageSize);
I know this is an old question, but creating an NSData object just to get the byte size of an image can be a really expensive operation. An image can be over 20 MB, and you would be creating an equally sized object just to get the size of the first one...
I tend to use this category:
UIImage+CalculatedSize.h
#import <UIKit/UIKit.h>
@interface UIImage (CalculatedSize)
- (NSUInteger)calculatedSize;
@end
UIImage+CalculatedSize.m
#import "UIImage+CalculatedSize.h"
@implementation UIImage (CalculatedSize)
- (NSUInteger)calculatedSize
{
    // Height × bytes-per-row of the backing CGImage: the decompressed,
    // in-memory footprint, not the compressed file size.
    return CGImageGetHeight(self.CGImage) * CGImageGetBytesPerRow(self.CGImage);
}
@end
You simply import the UIImage+CalculatedSize.h and use it like this:
NSLog (#"myImage size is: %u",myImage.calculatedSize);
Or, if you want to avoid using categories:
NSUInteger imgSize = CGImageGetHeight(anImage.CGImage) * CGImageGetBytesPerRow(anImage.CGImage);
EDIT:
This calculation of course has nothing to do with JPEG/PNG compression. It relates to the underlying CGImage:
A bitmap (or sampled) image is a rectangular array of pixels, with
each pixel representing a single sample or data point in a source
image.
In a way, a size retrieved like this gives you worst-case information without actually creating an expensive additional object.
From @fbrereto's answer:
The underlying data of a UIImage can vary, so for the same "image" one can have varying sizes of data. One thing you can do is use UIImagePNGRepresentation or UIImageJPEGRepresentation to get the equivalent NSData constructs for either, then check the size of that.
From @Meet's answer:
UIImage *img = [UIImage imageNamed:@"sample.png"];
NSData *imgData = UIImageJPEGRepresentation(img, 1.0);
NSLog(@"Size of Image (bytes): %lu", (unsigned long)[imgData length]);
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)editInfo
{
    UIImage *image = [editInfo valueForKey:UIImagePickerControllerOriginalImage];
    NSURL *imageURL = [editInfo valueForKey:UIImagePickerControllerReferenceURL];

    __block long long realSize;

    // Look the asset up in the library and read its on-disk size.
    ALAssetsLibraryAssetForURLResultBlock resultBlock = ^(ALAsset *asset)
    {
        ALAssetRepresentation *representation = [asset defaultRepresentation];
        realSize = [representation size];
    };

    ALAssetsLibraryAccessFailureBlock failureBlock = ^(NSError *error)
    {
        NSLog(@"%@", [error localizedDescription]);
    };

    if (imageURL)
    {
        ALAssetsLibrary *assetsLibrary = [[[ALAssetsLibrary alloc] init] autorelease];
        [assetsLibrary assetForURL:imageURL resultBlock:resultBlock failureBlock:failureBlock];
    }
}