Is there any way to test the iPhone camera in the simulator without having to deploy on a device? This seems awfully tedious.
There are a number of device-specific features that you have to test on the device, but it's no harder than using the simulator. Just build a debug target for the device and leave it attached to the computer.
List of actions that require an actual device:
the actual phone
the camera
the accelerometer
real GPS data
the compass
vibration
push notifications...
I needed to test some custom overlays for photos. The overlays needed to be adjusted based on the size/resolution of the image.
I approached this in a way similar to the suggestion from Stefan: I decided to code up a "dummy" camera response.
When the simulator is running, I execute this dummy code instead of the standard "captureStillImageAsynchronouslyFromConnection".
In this dummy code, I build up a "black photo" of the necessary resolution and then send it through the pipeline to be treated like a normal photo, essentially providing the feel of a very fast camera.
// Pick a dummy size matching the current orientation (8 MP, like most devices).
CGSize sz = UIDeviceOrientationIsPortrait([[UIDevice currentDevice] orientation]) ? CGSizeMake(2448, 3264) : CGSizeMake(3264, 2448);
// Draw a plain black image at that size (scale 1 so the pixel dimensions are exact).
UIGraphicsBeginImageContextWithOptions(sz, YES, 1);
[[UIColor blackColor] setFill];
UIRectFill(CGRectMake(0, 0, sz.width, sz.height));
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// JPEG data, the same kind of payload the real capture callback would hand back.
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
The image above is equivalent to the 8 MP photos that most current devices produce. Obviously, to test other resolutions you would change the size.
I never tried it, but you can give it a try: iCimulator.
Nope (unless they've added a way to do it in 3.2, haven't checked yet).
I wrote a replacement view to use in debug mode. It implements the same API and makes the same delegate callbacks. In my case I made it return a random image from my test set. Pretty trivial to write.
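As a rough illustration of that approach, here is a minimal sketch. The DebugCameraView class, its delegate protocol, and the test image names are all made up for this example (none of them are real APIs); it mimics a capture entry point and hands a random bundled test image to the same delegate callback the real view would use.

#import <UIKit/UIKit.h>

// Hypothetical debug-only stand-in for the real camera view.
@protocol DebugCameraViewDelegate <NSObject>
- (void)cameraView:(id)cameraView didCaptureImage:(UIImage *)image;
@end

@interface DebugCameraView : UIView
@property (nonatomic, weak) id<DebugCameraViewDelegate> delegate;
- (void)captureImage;   // mirrors the real view's capture entry point
@end

@implementation DebugCameraView
- (void)captureImage {
    // Pick a random image from a small bundled test set (names are placeholders).
    NSArray *testImages = @[@"test1.jpg", @"test2.jpg", @"test3.jpg"];
    NSString *name = testImages[arc4random_uniform((uint32_t)testImages.count)];
    UIImage *image = [UIImage imageNamed:name];
    [self.delegate cameraView:self didCaptureImage:image];
}
@end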
A common reason for needing camera access is to take screenshots for the App Store.
Since the camera is not available in the simulator, a good trick (the only one I know) is to resize your view to the size you need, just long enough to take the screenshots. You crop them later.
Of course, you need to have a device with the bigger screen available.
The iPad is perfect for testing layouts and taking snapshots for all devices.
Screenshots for the iPhone 6+ will have to be stretched a little (scaled by 1.078125, not a big deal…).
A good quick reference for iOS device resolutions: http://www.iosres.com/
Edit: In a recent project that uses a custom camera view controller, I replaced the AV preview with a UIImageView in a target that I only use to run in the simulator. This way I can automate screenshots for iTunes Connect upload. Note that the camera control buttons are not in an overlay, but in a view over the camera preview.
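A minimal sketch of that swap, assuming the view controller owns an AVCaptureVideoPreviewLayer in the normal build; the previewContainer and captureSession properties and the placeholder asset name are assumptions for this example.

#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// In the simulator-only target, show a static UIImageView where the
// AVCaptureVideoPreviewLayer would normally sit, so the overlay controls
// can be screenshotted without a camera.
- (void)setUpPreview {
#if TARGET_IPHONE_SIMULATOR
    UIImageView *fakePreview = [[UIImageView alloc] initWithFrame:self.previewContainer.bounds];
    fakePreview.image = [UIImage imageNamed:@"placeholderPreview"];   // assumed asset name
    fakePreview.contentMode = UIViewContentModeScaleAspectFill;
    [self.previewContainer addSubview:fakePreview];
#else
    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    previewLayer.frame = self.previewContainer.bounds;
    [self.previewContainer.layer addSublayer:previewLayer];
#endif
}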
The @Craig answer below describes another method that I found quite smart. It also works with a camera overlay, unlike mine.
Just found a repo on GitHub that helps simulate camera functions on the iOS Simulator with images, videos, or your MacBook camera.
I'm working with on-device ML for Flutter, which requires a UIImage to feed into the model. The requirement is to use a live stream to detect objects in near real time.
I use the Flutter camera with the startImageStream function and get a CameraImage from the stream. I ask the camera to return ImageFormatGroup.bgra8888 on iOS; no change is needed for Android since it already works fine.
I convert the bgra8888 data to a JPEG image in a spawned Isolate (using the Image library), send the image bytes to Swift via a Flutter method channel, rebuild a UIImage from those bytes, and feed it into the model. I feed an image every 0.5 seconds (not every frame from the camera stream, since that would be too much data for the model).
Everything seemed to work fine until I tested on older devices: the iPhone 6s and iPhone 7 Plus. The iPhone X is fine; the model responds in around 0.3 seconds, which is faster than I feed it, while the iPhone 6s and iPhone 7 Plus take around 1.5 to 2 seconds.
I tested the model natively by creating a camera view and feeding the UIImage from didOutputSampleBuffer, along the lines of the sample code below. On my iPhone 6s and iPhone 7 Plus the response takes around 0.5 to 0.6 seconds, which is a lot faster.
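(The original native sample isn't reproduced here; a rough sketch of that kind of capture path, with feedModelWithImage: standing in for the actual model call, might look like this.)

// Sketch only: convert each captured frame to a UIImage and hand it to the model.
// feedModelWithImage: is a placeholder, not a real API; in real code you would
// also reuse a single CIContext instead of creating one per frame.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext context];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    [self feedModelWithImage:frame];
}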
After doing some research, I found that Flutter's camera plugin (https://github.com/flutter/plugins/blob/main/packages/camera/camera_avfoundation/ios/Classes/FLTCam.m) actually has the same camera stream on the iOS side; it creates RGBA data and sends it to Flutter:
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
  if (output == _captureVideoOutput) {
    CVPixelBufferRef newBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CFRetain(newBuffer);
    // ...
If I could get the CMSampleBufferRef there, I could create a UIImage and feed it to the model directly, without sending data back and forth between Flutter and iOS and without the slower image conversion in Flutter. So if I can get the CMSampleBufferRef directly from the camera, I believe the iPhone 6s and iPhone 7 Plus should run faster.
The question is: is it possible to get the CMSampleBufferRef directly, without having to go through Flutter? I've found that FLTCam.h has a function called copyPixelBuffer. I debugged it via Xcode and it returns the image that I want, but I can't find a way to use it.
I also found that FlutterTexture mentions that a texture can be shared with Flutter (https://api.flutter.dev/objcdoc/Protocols/FlutterTexture.html), but I have no idea how to get at that shared texture.
Does anyone have any idea how I can access the image before the Flutter camera sends it to Flutter?
Another option is to clone their camera plugin and expose copyPixelBuffer publicly so I can access it (a rough sketch of that idea is below). I haven't tried it yet, and I want it to be a last resort, since other developers would then have to maintain two camera versions in the app.
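(Untested sketch of that last-resort idea: it assumes you keep a reference to the plugin's FLTCam instance and that copyPixelBuffer has been made public in a fork, neither of which the stock plugin provides.)

// `camera` is an assumed reference to the forked plugin's FLTCam instance.
// copyPixelBuffer comes from the FlutterTexture protocol and returns a
// retained buffer, so the caller is responsible for releasing it.
CVPixelBufferRef pixelBuffer = [camera copyPixelBuffer];
if (pixelBuffer) {
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // ...convert to UIImage and feed the model, as in the native sample above...
    CVPixelBufferRelease(pixelBuffer);
}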
I'm sorry for my English.
I'm new to iPhone development, and something strange is happening to me.
I have a set of JPG images to show in a table view. When I test the app in the iPhone Simulator everything is fine and works properly, but when I build and run the same code on my iPhone test device, the same images aren't displayed.
Another strange behavior: a set of PNG images used instead of the JPGs is shown perfectly both in the simulator and on the iPhone test device.
Can anyone suggest a solution?
I get the name of the image to load from a JSON file. This is the code I use to show the image:
UIImageView *immaginePiadina = [[UIImageView alloc] initWithFrame:immagineValueRect];
[immaginePiadina setImage:[UIImage imageNamed:[item objectForKey:@"immagine"]]];
where [item objectForKey:@"immagine"] is an element of my JSON file like this:
"nome": "66",
"immagine": "pianetapiadabufalavesuviani",
"prezzo": "€ 7,50",
"nomeingrediente": [
"bufala",
"vesuviani"
]
As you can see, I refer to the image with only the file name, without the extension. I did it this way to load the retina image; is that wrong?
I can rule out a case-sensitivity mistake in the name, because the PNG set works properly.
Thanks a lot!
There are possibly two separate issues that need to be untangled; here is how to solve your problem.
First, write a method using NSFileManager that, given a file path, verifies the file exists and has a size greater than 0. Call it everywhere you fail to open an image. If you use "imageNamed", get the path to the bundle and build the full path from it. If this method fails to find an image, fix that first.
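A minimal sketch of such a check might look like this (the method name is arbitrary):

// Returns YES only if a file exists at `path` and has a non-zero size.
- (BOOL)imageFileExistsAtPath:(NSString *)path {
    NSFileManager *fm = [NSFileManager defaultManager];
    if (![fm fileExistsAtPath:path]) {
        return NO;
    }
    NSDictionary *attrs = [fm attributesOfItemAtPath:path error:NULL];
    return [attrs fileSize] > 0;
}

// For imageNamed-style lookups, build the bundle path first, e.g.:
// NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"jpg"];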
Second, image decoding is different in the simulator and on the device: the simulator uses the full OS X libraries. So take one image that fails to open and copy it somewhere. Open it in Preview and export it under the same name but in PNG (with alpha) format. Change your code to expect a PNG instead of a JPG, and retest. Make sure you still use the first method to ensure the file is there.
Once you get one success you can try other options, such as Preview-exported JPEGs. The reason this should work is that all of these image formats allow a huge range of options, and iOS does not support them all.
Given that the current format is the problem, you can script the changes using the "sips" program, which is what Preview uses.
I solved my problem using the following code:
NSString *path = [[NSBundle mainBundle] pathForResource:[item objectForKey:@"immagine"] ofType:@"jpg"];
NSLog(@"%@", path);
UIImage *theImage = [[UIImage alloc] initWithContentsOfFile:path];
I don't know the reason, but this way I have no problem displaying JPGs on my iPhone test device. Maybe it's because with NSBundle I also specify the file extension.
Thanks @David-h for pointing me in the right direction.
Check the extensions of your images. If you write .PNG it is OK in the simulator but not on the device, because the device's file system is case-sensitive while the simulator's usually is not.
Please help me with my question.
Is there any way to get image from camera without UIImagePickerController?
I need to render the current image (from the camera) into an image on my view and update it with a timer.
Maybe AVCaptureStillImageOutput? I didn't find any examples.
Any ideas?
Yes, you can do it easily using the AVCamCaptureManager and AVCamRecorder classes. Apple has a demo program on its developer site, named AVCam. In simple words, when you tap to open the camera, it calls the classes and methods responsible for opening the iPhone's camera and recording video or capturing audio. These are the same classes that UIImagePickerController calls.
I hope it helps.
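For reference, a bare-bones version of the AVCaptureStillImageOutput route the question guesses at might look roughly like this. It's only a sketch (pre-iOS 10 API), error handling is omitted, and session and stillImageOutput are assumed to be properties of your controller.

#import <AVFoundation/AVFoundation.h>

// Set up a capture session with a still image output.
- (void)setUpCaptureSession {
    self.session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
    [self.session addInput:input];

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [self.session addOutput:self.stillImageOutput];
    [self.session startRunning];
}

// Grab a frame on demand, e.g. from an NSTimer, and update your image view.
- (void)captureFrame {
    AVCaptureConnection *connection =
        [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
        completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
            NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
            UIImage *image = [UIImage imageWithData:data];
            // assign `image` to the UIImageView on your view here
        }];
}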
Hey, I'm trying to record the gameplay of my game so that I can upload a video of it to YouTube from the device itself. I'm trying to do the same thing as the Talking Tom Cat app for iPhone: recording the video, then playing it back, etc.
I'm using glReadPixels() to read the framebuffer data and then writing it to a video with the help of AVAssetWriter from the AVFoundation framework. But reading the data on every draw drops the FPS from around 30-35 to only 2-3 while glReadPixels is in use.
I think Talking Tom Cat is also made with OpenGL ES; it also has a video recording facility, but it doesn't slow down while reading each frame. Any ideas?
In case someone wants to implement the same thing, I figured it out myself.
First of all, to my surprise I found that Talking Tom Cat is not a 3D game app: it uses frame animations for all movements. If someone wants to capture that kind of view, they can use the following code:
// Render the view hierarchy into an image context and save the result.
UIGraphicsBeginImageContext(self.view.bounds.size); // or self.view.window.frame.size
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);
Then use AVAssetWriter to create the video from those frames; of course you can find code for that in other posts (a rough sketch follows). For me it's not useful, as I have to capture the 3D part.
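(A rough sketch of the writer setup, not the exact code from any particular post; the output path, frame size, and the writer/writerInput/adaptor properties are placeholders.)

#import <AVFoundation/AVFoundation.h>

- (void)setUpAssetWriter {
    // Placeholder output location and video settings.
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"];
    self.writer = [AVAssetWriter assetWriterWithURL:[NSURL fileURLWithPath:path]
                                           fileType:AVFileTypeQuickTimeMovie
                                              error:nil];
    NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                AVVideoWidthKey  : @320,
                                AVVideoHeightKey : @480 };
    self.writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                           outputSettings:settings];
    self.adaptor = [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:self.writerInput
                                   sourcePixelBufferAttributes:nil];
    [self.writer addInput:self.writerInput];
    [self.writer startWriting];
    [self.writer startSessionAtSourceTime:kCMTimeZero];

    // For each captured UIImage, convert it to a CVPixelBufferRef and append it:
    // [self.adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
}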
Cheers
The question isn't new, but I thought I'd pitch in:
We provide an SDK called "Everyplay" that allows you to do exactly what you're looking for. It's free to use and lightweight.
We provide out-of-the-box integrations for Unity3D, cocos2d (1.x, 2.x), and cocos2d-x, and you can of course integrate it into a custom OpenGL-based game engine.
The documentation is available at https://developers.everyplay.com/doc
The documentation contains an example app key to use when developing, but you can of course sign up for your own client key at https://developers.everyplay.com/
I want to create an application that will process images (mostly photographs from the mobile camera).
So first I need to provide a way to open an image file from the phone's memory.
How can I do that?
Sorry for such a stupid question, but I am a newbie in iPhone programming (yet :)).
P.S. I use Xcode and Cocoa.
You can use UIImagePickerController to get a standard interface for selecting a photo from the user's library or camera roll. Depending on the configuration, it can also be used to capture a new image with the camera. Note, however, that you get a UIImage back, not a file; if you want to upload it somewhere, you will first have to create a file or NSData object from the image, using either UIImagePNGRepresentation() or UIImageJPEGRepresentation().
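A minimal sketch of that flow (the presenting controller is assumed to adopt UIImagePickerControllerDelegate and UINavigationControllerDelegate):

#import <UIKit/UIKit.h>

// Present the standard picker for the photo library (or the camera).
- (void)pickImage {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;   // or ...Camera
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}

// You get a UIImage here, not a file; create NSData if you need to upload it.
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.9);
    // upload or save jpegData here
    [picker dismissViewControllerAnimated:YES completion:nil];
}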