I want to generate screenshots of an EAGLView using the Everyplay SDK, but I am unable to do it. The changelog on Everyplay's GitHub page says we can call [[[Everyplay sharedInstance] capture] takeThumbnail]; as many times as we want, but I don't see any way to obtain an image from that.
Also, the EveryplayCapture class provides a "thumbnailTextureId" property (I guess it refers to an OpenGL texture), but it is always 0 or nil.
I don't want to use EAGLView's -snapshot method because the resulting image is not good enough to use (it contains rough white borders around the sprites).
Is there any way to obtain a screenshot image from Everyplay?
You are close.
In your class (the one you pass to the Everyplay init method), implement this callback method:
- (void)everyplayThumbnailReadyAtFilePath:(NSString *)thumbnailFilePath;
Now if you call
[[[Everyplay sharedInstance] capture] takeThumbnail];
Everyplay will call the method above to inform you that the thumbnail is ready; from there you can read it as a file.
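For illustration, here is a minimal sketch of that flow in a hypothetical view controller. I'm assuming the SDK's delegate protocol is named EveryplayDelegate and that this class was passed to Everyplay's init method; only takeThumbnail and the everyplayThumbnailReadyAtFilePath: callback come from the answer above, the rest is my own glue.

#import <UIKit/UIKit.h>
#import "Everyplay.h" // Everyplay SDK header

// Hypothetical view controller; assumed to have been passed to
// Everyplay's init method so it receives the delegate callbacks.
@interface GameViewController : UIViewController <EveryplayDelegate>
@property (nonatomic, strong) UIImage *latestThumbnail;
@end

@implementation GameViewController

- (void)grabScreenshot {
    // Asynchronous: the result arrives via the delegate callback below.
    [[[Everyplay sharedInstance] capture] takeThumbnail];
}

- (void)everyplayThumbnailReadyAtFilePath:(NSString *)thumbnailFilePath {
    // Load the file Everyplay wrote into a UIImage for further use.
    self.latestThumbnail = [UIImage imageWithContentsOfFile:thumbnailFilePath];
}

@end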
Can I use the ZXing library to scan a QR code in the background of my iPhone app? I do not want the camera overlay with the square that looks for the QR code and the cancel button (as shown in the ScanTest example). What I need is for pressing the scan button to activate the reading of the QR code and, once the QR code is read, for the text to be returned to my application so I can display it in a UILabel on the screen.
Can anyone show some example code in Objective-C for this? Thanks.
I did something similar and can provide you with some guidance, but I can't share source code.
Take a look at the ZXingWidgetController.mm/.h files. This is a fully functioning QR-code-scanning app that you can compile, so it can be reverse engineered down to just the backend code. Edit the .h so the class extends NSObject instead of UIViewController, then delete any class properties and instance variables that are GUI objects.
That will cause Xcode to find and mark all the methods and variables that you no longer need with warnings/errors in the .mm file (viewWillAppear:, etc.). Most of this code can be deleted, but be mindful to move allocations/deallocations to constructors/destructors.
In your view controller you can then create an instance of this class and call it to start scanning. You need to modify didDecodeImage in the ZXingWidgetController.mm file to do what you want when it successfully gets a result from the QR code. One possibility is to modify the constructor to take your parent view controller as a parameter, store it in a __weak instance variable as a delegate, and use that to call one of its methods from didDecodeImage (a sketch follows below). Other people might pass the data back to your main code using notifications.
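As a rough sketch of that delegate idea (names like HeadlessScanner and scanner:didDecodeText: are made up for illustration; only ZXingWidgetController and didDecodeImage come from ZXing, and ARC is assumed for __weak):

@class HeadlessScanner;

// Protocol the parent view controller adopts to receive decode results.
@protocol HeadlessScannerDelegate <NSObject>
- (void)scanner:(HeadlessScanner *)scanner didDecodeText:(NSString *)text;
@end

// The stripped-down, NSObject-based scanner described above.
@interface HeadlessScanner : NSObject
// __weak avoids a retain cycle between the scanner and its owner.
@property (nonatomic, weak) id<HeadlessScannerDelegate> delegate;
- (void)startScanning; // starts frame capture; calls the delegate on success
@end

Then in your view controller:

- (void)scanner:(HeadlessScanner *)scanner didDecodeText:(NSString *)text {
    self.resultLabel.text = text; // show the decoded QR payload in a UILabel
}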
Hope this helps!
There is a set of classes in the zxing objc directory that operates at the CA (Core Animation) level rather than the UIView level; those might be easier to modify than the widget, which operates at the UIViewController level.
This would still take a little tweaking, though, because the core capture code tracks whether the view is on screen in order to automatically start and stop the capturing of frames.
Does anyone know how to do this?
Is anyone able to provide an example? I believe this is out of NDA now, as it was available in version 4.0.
Take a look at the AVFoundation framework, especially AVCaptureSession, AVCaptureDeviceInput, and the related AVCapture* classes. You can find some listings in the iPhone Dev Center forums: search for "AVCapture".
AVFoundation is the framework you want to use to record, modify raw frames, show them, and of course add some overlay (a minimal session sketch follows below).
If you only want to do an overlay, then UIImagePickerController should be enough.
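For reference, a minimal capture-session sketch (error handling trimmed; it assumes the surrounding class adopts AVCaptureVideoDataOutputSampleBufferDelegate):

#import <AVFoundation/AVFoundation.h>

- (AVCaptureSession *)startCaptureSession {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Wire the default camera into the session.
    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) [session addInput:input];

    // Raw frames are delivered to the delegate's
    // captureOutput:didOutputSampleBuffer:fromConnection: method.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    dispatch_queue_t queue = dispatch_queue_create("frames", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    [session addOutput:output];

    [session startRunning];
    return session;
}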
If I understand what you are trying to do correctly, you have set up video capture something like Apple suggests in this Q&A: http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html
From there it shouldn't be much of a problem to use the method described in this answer (blend two UIImages based on the alpha/transparency of the top image) to blend the preview with your overlay, provided you draw the overlay into a UIImage first. Feed the resulting images to a buffer and save them accordingly.
This method almost certainly won't give you a lot of frames per second, though.
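A minimal sketch of that blend step, assuming both images are the same size (the overlay's alpha channel controls the transparency):

- (UIImage *)blendFrame:(UIImage *)frame withOverlay:(UIImage *)overlay {
    UIGraphicsBeginImageContextWithOptions(frame.size, YES, frame.scale);
    CGRect rect = CGRectMake(0, 0, frame.size.width, frame.size.height);
    [frame drawInRect:rect];   // camera frame on the bottom
    [overlay drawInRect:rect]; // overlay on top, respecting its alpha
    UIImage *blended = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return blended;
}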
I'm currently developing a collage application for iPhone, and I'm now at the point of saving the image, so I need the correct path for the images to show up in the photo library.
Does anyone know that path?
BR,
drisse
You can't write to that directly; you have to have an in-memory UIImage and then call UIImageWriteToSavedPhotosAlbum() (a plain function defined in UIKit) to send it to the library.
You can pass arguments to be notified when the save completes (you might want to show an indicator while it runs), but you don't have to use them (pass in nil).
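For example (if you use the completion callback, the selector must have exactly this signature):

- (void)saveCollage:(UIImage *)collage {
    UIImageWriteToSavedPhotosAlbum(collage,
                                   self,
                                   @selector(image:didFinishSavingWithError:contextInfo:),
                                   NULL);
}

// Called by UIKit when the save completes; error is nil on success.
- (void)image:(UIImage *)image
        didFinishSavingWithError:(NSError *)error
        contextInfo:(void *)contextInfo {
    if (error) NSLog(@"Save failed: %@", error);
}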
I am creating an app in which, as soon as the UIImagePickerController loads (i.e. the camera view), it should start taking pictures without any tap and store the images in an array. How can I do this without tapping the "shoot" button?
In the reference library, UIImagePickerController has an instance method, -takePicture. Can somebody tell me if this method will do the trick if I call it from a timer?
Thanks in advance.
-takePicture should indeed do the trick. You have to provide a custom UI for the camera controls, because otherwise (for me) it doesn't work. Check the developer documentation in Xcode and search for takePicture; the method description has everything you need.
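A rough sketch of the timer approach (the picker, timer, and capturedImages properties are assumptions of mine; real code should also wait until the camera is ready before the first shot, and set a cameraOverlayView if you want your own controls):

- (void)startTimedCapture {
    self.picker = [[UIImagePickerController alloc] init];
    self.picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    self.picker.showsCameraControls = NO; // required before calling -takePicture
    self.picker.delegate = self; // adopts UIImagePickerControllerDelegate
                                 // and UINavigationControllerDelegate
    [self presentViewController:self.picker animated:YES completion:nil];

    // Fire a shot every two seconds without any tap.
    self.timer = [NSTimer scheduledTimerWithTimeInterval:2.0
                                                  target:self
                                                selector:@selector(snap:)
                                                 repeats:YES];
}

- (void)snap:(NSTimer *)timer {
    [self.picker takePicture];
}

// Each picture lands here; stash it in the array.
- (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *shot = [info objectForKey:UIImagePickerControllerOriginalImage];
    [self.capturedImages addObject:shot];
}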
It seems the built-in UIImagePickerController cannot accept sources other than the device's camera roll.
I would like to have pick-and-enlarge picture functionality within my own app. I would also like to let the user select pictures and save them to their camera roll (so they can later use one as wallpaper).
1) What is the recommended way of building a custom UIImagePickerController that supports what I need? Is there another built-in controller I am missing?
3) Is there a way to take a UIImage and save it directly as the device's desktop background? Or is it a two-step process: first save it to the camera roll, then have the user load the picture from there and set it as wallpaper?
Thanks in advance!!
At this point, it seems that the only way to build custom UIImagePickerController functionality is to subclass it and then muck with the view hierarchy directly. This lets you move, hide, and replace UI elements and access the non-public classes that control the operation of the camera, but as you probably gather, this technique is both unsupported (it may, and probably will, break with future updates) and not recommended (it may, and probably will, get your app rejected from the App Store if detected).
As for your second point (somehow numbered 3), John is right: there is no call in the public SDK to accomplish this. You could probably hack something together if you're clever, but remember the warnings in my first paragraph...
Regarding #3): there is no public call in the SDK to save a UIImage to the desktop.
I don't know about the rest of your questions though.
Just today I started an open-source UIImagePickerController clone. It is not perfect, but it works quite OK. Feel free to fork http://github.com/jeena/JPImagePickerController