I am developing a QR code scanner using the ZBar SDK. I can successfully read all the QR codes and parse them into meaningful information. While looking at some examples, I found that ZBarReaderViewController can place an image over the scan area while scanning, mimicking a viewfinder. They use the cameraOverlayView feature for this; however, I cannot find the same for ZBarReaderView. How can I put an image on the scan surface?
I could have used ZBarReaderViewController, but my app is designed in such a way that presenting it as a modal view controller causes problems, so I badly need a camera overlay with ZBarReaderView. Below is the link where I learned about the cameraOverlayView feature of ZBarReaderViewController.
Is it possible to put a square bracket for the focus when the camera appears in the ZBar SDK?
I made it work with the code below, after initializing the ZBarReaderView:
UIImage *ivg = [UIImage imageNamed:@"green.png"];
// note: CGRectMake takes (x, y, width, height)
UIImageView *bh = [[UIImageView alloc] initWithFrame:CGRectMake(x, y, width, height)];
bh.image = ivg;
[self.view addSubview:bh];
I want to apply a blur effect to video. To play the video I use MPMoviePlayerViewController, but I don't know how to apply the blur to the movie itself.
I tried overlaying a blurred image on top of the movie, but that doesn't work: as the video plays, the blur has to change in real time. I found libraries that blur still images, but I can't find one that blurs video in real time. I am using an mp4 file. I tried the GPUImage framework, but I couldn't get it to run correctly.
How can I implement that?
Please help me.
Thanks.
I believe GPUImage from Brad Larson is the best way to achieve this.
The following is an excerpt from the GPUImage GitHub page, on filtering live video from an iOS device's camera:
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
GPUImageFilter *customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"CustomShader"];
GPUImageView *filteredVideoView = [[GPUImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, viewWidth, viewHeight)];
// Add the view somewhere so it's visible
[videoCamera addTarget:customFilter];
[customFilter addTarget:filteredVideoView];
[videoCamera startCameraCapture];
Since GPUImage is open source, you can open up the GPUImageVideoCamera class and study the portion where it performs the live filtering of video data. Then use Apple's Core Video framework to grab the frames of a movie being played and feed them through the same GPUImage pipeline.
Sometimes boilerplate code isn't available, but there's always a way out. Good luck. :)
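For the mp4 case specifically, GPUImage also ships a GPUImageMovie input that plugs into the same target chain as the camera example above. A rough sketch (the file name is an assumption, and the blur-radius property name has varied across GPUImage versions):

```objc
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"mp4"];
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];

GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
blurFilter.blurRadiusInPixels = 8.0; // larger = blurrier

GPUImageView *blurredView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:blurredView];

// movie -> blur -> on-screen view: same pipeline shape as live video
[movieFile addTarget:blurFilter];
[blurFilter addTarget:blurredView];
[movieFile startProcessing];
```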
I am having trouble reading small QR codes with ZBar on an iPod. It seems I cannot get it to scan 7 mm × 7 mm codes. Is this a configuration issue or a limitation of the ZBar library? I can scan these codes with other apps such as pic2shop.
So please tell me where I am going wrong.
Thank you.
ZBarReaderViewController *reader = [ZBarReaderViewController new];
reader.readerDelegate = self;
[reader.scanner setSymbology: ZBAR_I25
                      config: ZBAR_CFG_ENABLE
                          to: 0];
reader.showsZBarControls = NO;
reader.cameraOverlayView.frame = CGRectMake(0, 0, 320, 460);
reader.readerView.allowsPinchZoom = YES;
//reader.readerView.alpha = 1.0;
reader.readerView.backgroundColor = [UIColor clearColor];
reader.wantsFullScreenLayout = YES;
reader.readerView.zoom = 1.0;
[reader.readerView setScanCrop:(CGRect){ { 0.125, 0 }, { 0.75, 1 } }];
[self.view addSubview:reader.view];
ZBarCaptureReader *cap = [[ZBarCaptureReader alloc] init];
cap.scanCrop = CGRectMake(10, 20, 100, 460);
[self presentModalViewController: reader animated: YES];
[reader release];
Two years later: there are issues with the image size ZBar can work with. In particular, per the documentation you need a minimum of 3 pixels per module (black or white stripe), so the image needs to be large enough to scan properly. I had this issue on my iPad 2, which can read the barcode with the front camera but not with the rear camera (which also produces grainy photos; this is contrary to the published camera specs for the iPad 2, which list the rear camera as the better of the two. I believe the specs were switched at some point). With large QR codes I need an iPhone 4S or iPhone 5 to scan properly, and videoQuality needs to be set to high instead of the default VGA.
To remedy the problem above:
I would first try removing the setScanCrop line and start with the default (full frame), capturing as much of the barcode as possible. The CGRect values are origin (x, y) and size (width, height). As I understand it, specifying {0, 0}, {1, 1} defaults to the full frame ((0, 0), (320, 480)) on the iPhone 3GS, 4, 4S and the old iPod touch gen 3-4.
The next thing I would try is the ZBarReaderView class, which takes manual still shots at a higher resolution than the other class, which grabs its images from the video feed. The relevant parameters in this case are maxScanSize, zoom, and maxZoom, which are active in this mode but not in the other class.
I am using the new Retina display feature of iOS 4.0 in my iPhone application.
I added the higher-resolution images to my existing image folder using the naming convention image@2x.png.
For example, I add the image in the following way:
UIImageView *toolBarBg = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 88)];
NSString *toolBarBgImgPath = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"bg_btm_bar.png"];
[toolBarBg setImage:[UIImage imageWithContentsOfFile:toolBarBgImgPath]];
I also have the image named "bg_btm_bar@2x.png" in my image folder.
But when I run my application, it does not pick up the higher-resolution image.
I don't understand how to make the application use the higher-resolution image.
Please help me out!
Use imageNamed: instead of imageWithContentsOfFile:. imageNamed: looks up the @2x variant automatically on Retina displays; a hard-coded file path does not.
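Applied to the code in the question, the lookup becomes (imageNamed: searches the main bundle, so no full path is needed):

```objc
UIImageView *toolBarBg = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 88)];
// imageNamed: finds bg_btm_bar.png in the bundle and automatically
// substitutes bg_btm_bar@2x.png on a Retina display.
toolBarBg.image = [UIImage imageNamed:@"bg_btm_bar.png"];
```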
I know there is an MPMoviePlayerController on the iPhone to play movies with. But it isn't customizable in the real sense. Also, playing movies in a non-fullscreen mode isn't possible today.
I wanted to ask if anyone knows the future roadmap of the iPhone SDK: is a customizable video player upcoming?
Also, is it possible to write a new video player from scratch (maybe including the codecs)? In that case, which part of the SDK should I refer to, and how?
Some help in this area will be really appreciated.
Thanks,
Saurabh
I don't know of any future plans for a customizable video player. However, if there are just a few videos you want to play in non-fullscreen mode, there is an option: UIImageView supports animation, so if you can export your videos as images, you can display them in a UIImageView.
For example,
// load your images here
yourImageView.animationImages = [NSArray arrayWithObjects:[UIImage imageNamed:@"1.png"],
                                                          [UIImage imageNamed:@"2.png"],
                                                          [UIImage imageNamed:@"3.png"],
                                                          [UIImage imageNamed:@"4.png"],
                                                          [UIImage imageNamed:@"5.png"],
                                                          nil];
yourImageView.animationDuration = 1; // duration of one cycle, in seconds; can be any value
[yourImageView startAnimating];
This blog post http://www.nightirion.com/2010/01/scaling-a-movie-on-the-iphone/ mentions a method that will allow you to play non-fullscreen video. However, I'm not sure if this method will be approved by the app store verification process.
Is there any way to test the iPhone camera in the simulator without having to deploy on a device? This seems awfully tedious.
There are a number of device specific features that you have to test on the device, but it's no harder than using the simulator. Just build a debug target for the device and leave it attached to the computer.
List of actions that require an actual device:
the actual phone
the camera
the accelerometer
real GPS data
the compass
vibration
push notifications...
I needed to test some custom overlays for photos. The overlays needed to be adjusted based on the size/resolution of the image.
I approached this in a way similar to Stefan's suggestion: I decided to code up a "dummy" camera response.
When the simulator is running, I execute this dummy code instead of the standard captureStillImageAsynchronouslyFromConnection:.
In this dummy code, I build a "black photo" at the necessary resolution and then send it through the pipeline to be treated like a normal photo, essentially providing the feel of a very fast camera.
CGSize sz = UIDeviceOrientationIsPortrait([[UIDevice currentDevice] orientation]) ? CGSizeMake(2448, 3264) : CGSizeMake(3264, 2448);
UIGraphicsBeginImageContextWithOptions(sz, YES, 1);
[[UIColor blackColor] setFill];
UIRectFill(CGRectMake(0, 0, sz.width, sz.height));
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
The image above is equivalent to the 8 MP photos that most current devices produce. Obviously, to test other resolutions you would change the size.
I never tried it, but you can give it a try!
iCimulator
Nope (unless they've added a way to do it in 3.2, haven't checked yet).
I wrote a replacement view to use in debug mode. It implements the same API and makes the same delegate callbacks. In my case I made it return a random image from my test set. Pretty trivial to write.
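A minimal sketch of such a stand-in, under stated assumptions: the delegate callback name readerView:didReadImage: and the bundled images test_1.png through test_3.png are hypothetical, and the real replacement should mirror whatever delegate protocol your camera view actually uses:

```objc
#if TARGET_IPHONE_SIMULATOR
// Debug-only stand-in for the camera view: same shape as the real
// reader, but hands back a random canned image from the test set.
@interface FakeReaderView : UIView
@property (nonatomic, assign) id delegate; // hypothetical delegate
- (void)start;
@end

@implementation FakeReaderView
- (void)start {
    NSUInteger i = arc4random_uniform(3) + 1; // pick test_1 … test_3
    UIImage *fake = [UIImage imageNamed:
        [NSString stringWithFormat:@"test_%lu.png", (unsigned long)i]];
    // Make the same callback the real view would make (name assumed).
    [self.delegate performSelector:@selector(readerView:didReadImage:)
                        withObject:self
                        withObject:fake];
}
@end
#endif
```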
A common reason for the need of accessing the camera is to make screenshots for the AppStore.
Since the camera is not available in the simulator, a good trick (the only one I know) is to resize your view to the size you need, just long enough to take the screenshots. You will crop them later.
Of course, you need the device with the bigger screen available.
The iPad is perfect for testing layouts and making snapshots for all devices.
Screenshots for the iPhone 6+ will have to be stretched a little (scaled by 1.078125; not a big deal…).
Good link to a iOS Devices resolutions quick ref : http://www.iosres.com/
Edit: In a recent project that uses a custom camera view controller, I replaced the AV preview with a UIImageView in a target that I only run in the simulator. This way I can automate screenshots for iTunes Connect upload. Note that the camera control buttons are not in an overlay, but in a view over the camera preview.
The answer from @Craig below describes another method that I found quite smart; it also works with a camera overlay, unlike mine.
I just found a repo on GitHub that helps simulate camera functions on the iOS Simulator with images, videos, or your MacBook camera.
Repo