Captured Image Squeezed After Taking a Picture : Swift

Short and simple: my captured image gets squeezed from both sides when I load it in the preview.
Long version: I am currently using the CameraManager pod found at https://github.com/imaginary-cloud/CameraManager/blob/master/camera/CameraManager.swift. My problem is that after I take a picture, the image is squeezed a little on both sides; the width is not what I saw through the camera. The bounds of my preview (what I see through the camera) are view.frame.bounds, and the bounds of my imageView (what I just captured) are preview.frame.bounds. If you need more information, I can post an image of what's happening.
If you need the code, just comment and I can show you. Any feedback is greatly appreciated.

The problem was that I wasn't setting the contentMode of my image view. All I had to do was add this line:
imageView.contentMode = UIViewContentMode.ScaleAspectFill
Duplicate of:
How to manage UIImageView content mode?
and
capture image is stretched using avcapture session
Sorry for posting a question without doing thorough research. I wasn't Googling the right query.
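For reference, a minimal sketch of the fix in context (the class and view names are assumptions, not from the question; the dot-syntax is the modern spelling of the UIViewContentMode value used in the answer):

```swift
import UIKit

final class CameraViewController: UIViewController {
    // imageView displays the captured photo, as in the question.
    let imageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.frame = view.bounds
        // Without an explicit contentMode the image is stretched to fill the
        // view's bounds in both directions, which distorts ("squeezes") it.
        // .scaleAspectFill preserves the aspect ratio and crops the overflow,
        // matching what the live camera preview shows.
        imageView.contentMode = .scaleAspectFill
        imageView.clipsToBounds = true
        view.addSubview(imageView)
    }
}
```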

Related

How to take UIView screenshot in which AVPlayer is playing?

I am trying to generate a video from screenshots taken of a specific UIView. This view has an AVPlayer subview and also a canvas subview (in which we can draw shapes). I tried the normal UIGraphicsGetImageFromCurrentImageContext approach, and the screenshot was black. I think it's because the video is rendered on the GPU.
So I tried taking a screenshot of a GLKView to which the above two subviews were added, but it still gives only the GLKView background colour as the image. I referred to the two links below for this purpose.
Tried adding glkview as in this post.
Taking screenshot from a GLKView
Am I going the right way? If yes, please help me understand what I am missing.
OR
Is there any other, better way to do this? AVAssetImageGenerator is not at all suitable for my requirement, so I can't use it either.
AVAssetImageGenerator is not at all suitable for my requirement, so
I can't use it either.
I don't know your reason for this, but using AVAssetImageGenerator it is possible to take a snapshot of the video, and drawing that snapshot over the screenshot image is easy, since you know the frame of the black part.
See the post below on drawing one UIImage over another.
Draw another image on a UIImage
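Following that suggestion, a hedged sketch of compositing a video-frame snapshot over a screenshot (the function name and the idea of passing the black area's rect are assumptions; UIGraphicsImageRenderer is the stock UIKit drawing API):

```swift
import UIKit

/// Draws `overlay` on top of `base` inside `frame` and returns the composite.
/// `frame` is in the coordinate space of `base` (here: the black video area
/// of the screenshot, whose rect the asker already knows).
func composite(base: UIImage, overlay: UIImage, at frame: CGRect) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: base.size)
    return renderer.image { _ in
        base.draw(at: .zero)      // the screenshot with the black region
        overlay.draw(in: frame)   // the AVAssetImageGenerator snapshot
    }
}
```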

Image aspect ratio swift ios

So, I'm learning iOS Swift. I'm going through some tutorials, and I'm stuck at image positioning.
I'm trying to figure out how an uploaded picture is rescaled and positioned. Since I cannot post the screenshot: an image that should be shown whole in my simulator only shows about 25% of the picture. Should I change something in the Attributes or Size Inspector?
What I did so far (clearly wrong) was set the Intrinsic Size field -> select Placeholder, w/h = 320.
Then, I pinned and selected Aspect Ratio.
Any help, please?
Thanks.
If you're asking how to make sure an image fills a UIImageView:
myImageView.contentMode = UIViewContentMode.ScaleAspectFill
You may also need to set constraints if you placed your image view in Interface Builder.
It's not entirely clear from your question how you want it positioned, but hopefully these get you on the right path.
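If the image view is created in code rather than Interface Builder, the two suggestions above can be sketched together like this (the function and view names are assumptions, and the constraints simply pin the image view to its container):

```swift
import UIKit

// myImageView and containerView are assumed names, not from the question.
func pinImageView(_ myImageView: UIImageView, in containerView: UIView) {
    myImageView.contentMode = .scaleAspectFill   // modern spelling of ScaleAspectFill
    myImageView.clipsToBounds = true             // crop the aspect-fill overflow
    myImageView.translatesAutoresizingMaskIntoConstraints = false
    containerView.addSubview(myImageView)
    // Pin all four edges so the image view always matches its container.
    NSLayoutConstraint.activate([
        myImageView.leadingAnchor.constraint(equalTo: containerView.leadingAnchor),
        myImageView.trailingAnchor.constraint(equalTo: containerView.trailingAnchor),
        myImageView.topAnchor.constraint(equalTo: containerView.topAnchor),
        myImageView.bottomAnchor.constraint(equalTo: containerView.bottomAnchor),
    ])
}
```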

UIImage resize, crop and rotate with frame

I'm searching for a library, or just a code snippet, that provides functions like resize, crop with a frame, and rotate, after the user picks an image from the image picker. Something like in this video: http://www.youtube.com/watch?v=Gb6xncXg1PY (0:55). Maybe someone has had the same issue?
This one could help you regarding the images. It's for iPad, but that shouldn't make much difference; the principles should be the same. (icodeblog.com is a really good site; there may be much more there relating to this issue.)
http://icodeblog.com/2010/10/14/working-with-uigesturerecognizers/
But there is also plenty you can find via Google. I had a good one but I can't find it anymore.
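In the absence of a library, crop and rotate can be sketched with stock UIKit/CoreGraphics calls. A rough sketch, not the polished editor in the linked video (UIGraphicsImageRenderer requires iOS 10+; the method names are assumptions):

```swift
import UIKit

extension UIImage {
    /// Returns the portion of the image inside `rect` (in point coordinates).
    func cropped(to rect: CGRect) -> UIImage? {
        // CGImage works in pixels, so convert the point rect using the scale.
        let pixelRect = CGRect(x: rect.origin.x * scale, y: rect.origin.y * scale,
                               width: rect.width * scale, height: rect.height * scale)
        guard let cg = cgImage?.cropping(to: pixelRect) else { return nil }
        return UIImage(cgImage: cg, scale: scale, orientation: imageOrientation)
    }

    /// Returns the image rotated by `radians` around its center.
    func rotated(by radians: CGFloat) -> UIImage {
        // Size of the bounding box that fits the rotated image.
        let newSize = CGRect(origin: .zero, size: size)
            .applying(CGAffineTransform(rotationAngle: radians)).size
        let renderer = UIGraphicsImageRenderer(size: newSize)
        return renderer.image { ctx in
            ctx.cgContext.translateBy(x: newSize.width / 2, y: newSize.height / 2)
            ctx.cgContext.rotate(by: radians)
            draw(in: CGRect(x: -size.width / 2, y: -size.height / 2,
                            width: size.width, height: size.height))
        }
    }
}
```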

Image "frame/window" on iPhone

I have numerous photos in portrait and landscape format. I want to be able to display a "crop" of the photo in a UIImageView on the iPhone. If the picture is in landscape format, I want it to resize to fit the frame; if it is in portrait format, I need it to resize to fit the width and then be cropped at the top and bottom. As if there is a "window" over the image, so it looks like a landscape picture.
It would be even better if this could be a UIButton - although I am aware that I can use touchesBegan and such on images to make it behave like a button if needs be.
Thanks for any help you can give me.
Tom
In an ImageView, you can change the View Mode to Aspect Fill. It will give you the right scaling/cropping you want.
For interactions, you can use a custom button with no drawing at all (no title, no image, no background) with the same size as your ImageView. That would be safe with regard to the image aspect. I've tried similar stuff using the button's Background or Image properties; it can have some undesired effects (I ran into a resizing issue on iOS 3.1.3, for instance).
1°) To crop the image, try adding a method to the UIImage class to do it (you can Google it without problem, or even find it on Stack Overflow).
2°) To add a "window" over your image, just add a UIImageView with transparency over your image. It should work ;-)
3°) To know when an image is touched, you can use "touchesBegan" to detect which image was selected. But I think that's the last of your problems ^^
What you want to achieve isn't so hard, just do it step by step!
If you want more help on one step, say so. But I can't code it all for you ;-)
Good luck
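The two accepted suggestions (Aspect Fill for the "window" effect, plus an invisible custom button for interaction) can be sketched together like this; the class name, frame, asset name, and selector are all assumptions for illustration:

```swift
import UIKit

final class GalleryViewController: UIViewController {  // hypothetical class
    override func viewDidLoad() {
        super.viewDidLoad()
        let windowView = UIImageView(frame: CGRect(x: 0, y: 0, width: 300, height: 180))
        // Aspect Fill: landscape photos scale to fit; portrait photos scale to
        // the width and are cropped top and bottom -- the "window" effect.
        windowView.contentMode = .scaleAspectFill
        windowView.clipsToBounds = true
        windowView.image = UIImage(named: "photo")  // placeholder asset name
        view.addSubview(windowView)

        // A custom button with no title, image, or background, laid over the
        // image view, keeps tap handling separate from the image rendering.
        let button = UIButton(type: .custom)
        button.frame = windowView.frame
        button.addTarget(self, action: #selector(photoTapped), for: .touchUpInside)
        view.addSubview(button)
    }

    @objc private func photoTapped() {
        // respond to the tap
    }
}
```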

Captured UIView image has distorted colors (iPhone SDK)

I'm using the UIGraphicsGetImageFromCurrentImageContext() function to capture the screen contents into a UIImage object (previously rendered into an image context). This works great on both the simulator and a real device, but on the latter the resulting image has a few pixels with distorted colors, as seen here:
http://img300.imageshack.us/img300/2788/screencap.png
Please notice the few fuchsia pixels in the top navigation bar, on both sides of the search field and to the right of the button. There are also such pixels to the right of the bottom-left button.
The code I'm using to capture the screen view into an UIImage object is pretty straightforward:
UIGraphicsBeginImageContext(self.view.window.frame.size);
[self.view.window.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *anImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
One thing to note is that all the graphics that get distorted belong to custom PNG files, used to draw the search field background as well as the buttons background.
Does anyone know what could possibly be causing this strange color distortion?
Best regards,
I just checked my own code, which does the same thing you are doing. Yours is nearly identical to mine, except that I am asking the view's layer to render instead of the window's, i.e.:
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
I don't know why that would make a difference, but it's worth a try.
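On current SDKs the same capture can be written in Swift with UIGraphicsImageRenderer; a sketch equivalent to the Objective-C snippet above, rendering the view's layer rather than the window's (the function name is an assumption):

```swift
import UIKit

func snapshot(of view: UIView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    return renderer.image { ctx in
        // Renders the view's layer, as suggested above. For live video or
        // GL content, drawHierarchy(in:afterScreenUpdates:) may capture more.
        view.layer.render(in: ctx.cgContext)
    }
}
```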
Solved it by using the just-approved private function UIGetScreenImage().
For more info, please check http://www.tuaw.com/2009/12/15/apple-relents-and-is-now-allowing-uigetscreenimage-for-app-st/ and https://devforums.apple.com/message/149553
Regards,
This article explains the issue with image corruption (caused by partially transparent pixels) and provides a workaround that corroborates Chris's comment:
http://www.kaoma.net/iphone/?p=9
UIGetScreenImage() is quite annoying when you just want to capture a view.
I found a nice trick: just re-save all your PNG images in TIFF format using Preview.app :)