Reset CoreImage filter in Swift

I am developing a simple Core Image filter project in Swift. I am trying to implement a function that resets the filtered image back to the original. I am using the following code for the SepiaTone filter and trying to reset the filter using the CIColorControls filter, but I get a fatal crash. I am wondering whether there is another way to reset the image.
@IBOutlet weak var originalImage: UIImageView!
@IBAction func SepiaToneFilter(sender: AnyObject) {
let mySepiaFilter = CIFilter(name: "CISepiaTone")
mySepiaFilter!.setValue(CIImage(image: originalImage.image!), forKey: kCIInputImageKey)
let myOutputImage : CIImage = mySepiaFilter!.outputImage!
originalImage.image = UIImage(CIImage: myOutputImage)
}
@IBAction func ResetFilter(sender: AnyObject) {
let currentFilter = CIFilter(name: "CIColorControls")
let beginImage = CIImage(image: originalImage.image!)
currentFilter!.setValue(beginImage, forKey: kCIInputImageKey)
let output = currentFilter!.outputImage
// The next line crashes with:
// CreateWrappedSurface() failed for a dataprovider-backed CGImageRef.
// fatal error: unexpectedly found nil while unwrapping an Optional value
let cgimg = context.createCGImage(output!, fromRect: output!.extent)
let processedImage = UIImage(CGImage: cgimg)
originalImage.image = processedImage
}
//Pick Image Process
func imagePickerController(picker: UIImagePickerController, didFinishPickingImage image: UIImage!, editingInfo: [NSObject : AnyObject]!) {
originalImage.image = image
self.dismissViewControllerAnimated(true, completion: nil);
}
I am not sure how to make this work inside my ResetFilter(UIButton) action.
Thanks in advance.

Why not just hold a reference to the original image? For example:
@IBOutlet weak var originalImage: UIImageView!
var userImage: UIImage?
@IBAction func SepiaToneFilter(sender: AnyObject) {
let mySepiaFilter = CIFilter(name: "CISepiaTone")
if let image = self.userImage {
mySepiaFilter!.setValue(CIImage(image: image), forKey: kCIInputImageKey)
let myOutputImage : CIImage = mySepiaFilter!.outputImage!
originalImage.image = UIImage(CIImage: myOutputImage)
}
}
@IBAction func ResetFilter(sender: AnyObject) {
if let image = self.userImage {
self.originalImage.image = image
}
}
func imagePickerController(picker: UIImagePickerController, didFinishPickingImage image: UIImage!, editingInfo: [NSObject : AnyObject]!) {
self.userImage = image
originalImage.image = image
self.dismissViewControllerAnimated(true, completion: nil);
}
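As a side note, assigning UIImage(CIImage:) back to the image view leaves it holding an image with no bitmap (CGImage) backing, which is most likely why the original reset path crashed when it tried to build a new CIImage from it. Here is a minimal sketch of the sepia action that instead renders through a CIContext, in the question's Swift 2-era syntax; the shared ciContext property is my addition, not part of the original code:

let ciContext = CIContext(options: nil) // create once and reuse for every render

@IBAction func SepiaToneFilter(sender: AnyObject) {
    guard let image = self.userImage else { return }
    guard let input = CIImage(image: image) else { return }
    let sepiaFilter = CIFilter(name: "CISepiaTone")!
    sepiaFilter.setValue(input, forKey: kCIInputImageKey)
    if let output = sepiaFilter.outputImage {
        // Render to a bitmap-backed UIImage so the view never holds a bare CIImage.
        let cgImage = ciContext.createCGImage(output, fromRect: output.extent)
        originalImage.image = UIImage(CGImage: cgImage)
    }
}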

Related

CIFilter can't be applied to SCNMaterial [duplicate]

Any CIFilter works fine when it's applied to a UIImageView.
import UIKit
import CoreImage
@IBOutlet var imageView: UIImageView!
let ciBlurFilter = CIFilter(name: "CIGaussianBlur")!
func gaussianBlur() -> UIImage? {
let uiImage = UIImage(named: "texture.png")!
let ciImage = CIImage(image: uiImage)
ciBlurFilter.setValue(ciImage, forKey: "inputImage")
let resultedImage = ciBlurFilter.value(forKey: "outputImage") as! CIImage
let blurredImage = UIImage(ciImage: resultedImage)
return blurredImage
}
override func viewDidLoad() {
super.viewDidLoad()
imageView.image = self.gaussianBlur()
}
But it doesn't work if it's applied to SceneKit's material:
import SceneKit
@IBOutlet var sceneView: SCNView!
let ciBlurFilter = CIFilter(name: "CIGaussianBlur")!
func gaussianBlur() -> UIImage? {
let uiImage = UIImage(named: "texture.png")!
let ciImage = CIImage(image: uiImage)
ciBlurFilter.setValue(ciImage, forKey: "inputImage")
let resultedImage = ciBlurFilter.value(forKey: "outputImage") as! CIImage
let blurredImage = UIImage(ciImage: resultedImage)
return blurredImage
}
override func viewDidLoad() {
super.viewDidLoad()
sceneView.scene = SCNScene()
let sphereNode = SCNNode(geometry: SCNSphere(radius: 0.1))
sphereNode.geometry?.firstMaterial?.diffuse.contents = self.gaussianBlur()
sceneView.scene?.rootNode.addChildNode(sphereNode)
}
Why is the SCNMaterial invisible when the CIFilter result is applied (even though it supports UIImages)?
What's the matter?
The UIImage you create with that constructor is not actually rendered at that moment. The receiver of the image needs to know that the image needs to be rendered before use, which is seemingly not handled by SceneKit.
Please see my answer here for details.
Here's how you render the CIImage in Swift:
// ideally you create this once and re-use it;
// you should not create a new context for every draw call
let ciContext = CIContext()
let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent)
let uiImage = cgImage.flatMap({ UIImage.init(cgImage: $0) })
You can pass the CGImage to the material or wrap it into a UIImage, both should work.
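For example, the question's viewDidLoad could be adapted along these lines (a sketch only; it assumes the ciBlurFilter property and the sphereNode from the question are in scope):

let ciContext = CIContext() // create once, reuse

if let ciImage = CIImage(image: UIImage(named: "texture.png")!) {
    ciBlurFilter.setValue(ciImage, forKey: "inputImage")
    if let output = ciBlurFilter.outputImage,
       let cgImage = ciContext.createCGImage(output, from: output.extent) {
        // A CGImage (or a UIImage wrapping it) is a fully rendered bitmap,
        // so SceneKit can display it as the material's contents.
        sphereNode.geometry?.firstMaterial?.diffuse.contents = cgImage
    }
}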

Image Classifier does not show results in Classification Label in Xcode

I tried to make an image classification app. For some reason, the classification label doesn't show any results. Below is my code; I would appreciate your help.
import UIKit
import CoreML
import Vision
class ViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
@IBOutlet weak var myImageView: UIImageView!
let picker = UIImagePickerController()
@IBAction func cameraButton(_ sender: UIBarButtonItem) {
let vc = UIImagePickerController()
vc.sourceType = .camera
vc.allowsEditing = false
vc.delegate = self
present(vc, animated: true)
}
@IBAction func photoButton(_ sender: UIBarButtonItem) {
picker.allowsEditing = false
picker.sourceType = .photoLibrary
picker.mediaTypes = UIImagePickerController.availableMediaTypes(for: .photoLibrary)!
present(picker, animated: true, completion: nil)
}
@IBOutlet weak var classificationLabel: UILabel!
/// Image classification
lazy var classificationRequest: VNCoreMLRequest = {
do {
let model = try VNCoreMLModel(for: AnimalClassifier().model)
let request = VNCoreMLRequest(model: model, completionHandler: { [weak self] request, error in
self?.processClassifications(for: request, error: error)
})
request.imageCropAndScaleOption = .centerCrop
return request
} catch {
fatalError("Failed to load Vision ML model: \(error)")
}
}()
func updateClassifications(for Image: UIImage) {
classificationLabel.text = "Classifying..."
let orientation = CGImagePropertyOrientation(Image.imageOrientation)
guard let ciImage = CIImage(image: Image) else { fatalError("Unable to create \(CIImage.self) from \(Image).") }
DispatchQueue.global(qos: .userInitiated).async {
let handler = VNImageRequestHandler(ciImage: ciImage, orientation: orientation)
do {
try handler.perform([self.classificationRequest])
} catch {
print("Failed to perform classification.\n\(error.localizedDescription)")
}
}
}
func processClassifications(for request: VNRequest, error: Error?) {
DispatchQueue.main.async {
guard let results = request.results else {
self.classificationLabel.text = "Unable to classify image.\n\(error!.localizedDescription)"
return
}
let classifications = results as! [VNClassificationObservation]
if classifications.isEmpty {
self.classificationLabel.text = "Nothing recognized."
} else {
// Display top classifications ranked by confidence in the UI.
let topClassifications = classifications.prefix(2)
let descriptions = topClassifications.map { classification in
// Formats the classification for display; e.g. "(0.37) cliff, drop, drop-off".
return String(format: " (%.2f) %@", classification.confidence, classification.identifier)
}
self.classificationLabel.text = "Classification:\n" + descriptions.joined(separator: "\n")
}
}
}
override func viewDidLoad() {
super.viewDidLoad()
picker.delegate = self
}
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
var Image: UIImage
if let possibleImage = info[.editedImage] as? UIImage {
Image = possibleImage
} else if let possibleImage = info[.originalImage] as? UIImage {
Image = possibleImage
} else {
return
}
myImageView.image = Image
dismiss(animated: true)
updateClassifications(for: Image)
}
func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
dismiss(animated: true, completion: nil)
}
}
To make your label support multiple lines, you need to set its numberOfLines property to 0. For instance, in viewDidLoad:
classificationLabel.numberOfLines = 0
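In the question's view controller that can sit next to the existing delegate setup, for example:

override func viewDidLoad() {
    super.viewDidLoad()
    picker.delegate = self
    // Let the "Classification:\n..." text wrap onto as many lines as it needs.
    classificationLabel.numberOfLines = 0
}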

Allow the user to choose the profile picture he wants and change it whenever he wants

I need help, please. I am trying to start with an empty UIImageView so the user can choose whatever picture he wants as his profile picture. It is important that the user-selected image stays the same even once the user has completely signed out of the app, and that he can change the picture displayed in the profile image view (UIImage) whenever he wants.
Swift 4
class ViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
@IBOutlet var imageView: UIImageView!
@IBOutlet var chooseBuuton: UIButton!
var imagePicker = UIImagePickerController()
func imagePickerController(picker: UIImagePickerController!, didFinishPickingImage image: UIImage!, editingInfo: NSDictionary!){
self.dismiss(animated: true, completion: { () -> Void in
})
imageView.image = image
}
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view, typically from a nib.
}
func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
dismiss(animated: true, completion: nil)
}
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
let image = info[UIImagePickerControllerOriginalImage] as! UIImage
imageView.image = image
imageView.contentMode = .scaleAspectFill
dismiss(animated: true, completion: nil)
}
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
let controller = UIImagePickerController()
controller.delegate = self
controller.sourceType = .photoLibrary
present(controller, animated: true, completion: nil)
}
}
If I am not wrong, you want the image to be preserved even after the user signs out of the app.
You can store the image in the Documents directory:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
if let image = info[UIImagePickerControllerOriginalImage] as? UIImage {
let path = try! FileManager.default.url(for: FileManager.SearchPathDirectory.documentDirectory, in: FileManager.SearchPathDomainMask.userDomainMask, appropriateFor: nil, create: false)
let newPath = path.appendingPathComponent("image.jpg") //Possibly you can Use the UserName to fetch easily User-wise
let jpgImageData = UIImageJPEGRepresentation(image, 1.0)
do {
try jpgImageData!.write(to: newPath)
} catch {
print(error)
}
}
}
To fetch the image back:
let nsDocumentDirectory = FileManager.SearchPathDirectory.documentDirectory
let nsUserDomainMask = FileManager.SearchPathDomainMask.userDomainMask
let paths = NSSearchPathForDirectoriesInDomains(nsDocumentDirectory, nsUserDomainMask, true)
if let dirPath = paths.first
{
let imageURL = URL(fileURLWithPath: dirPath).appendingPathComponent("image.jpg") // must match the name used when saving
let image = UIImage(contentsOfFile: imageURL.path)
// Do whatever you want with the image
}
Hope this helps you. You can use a scenario like this:
1. Initially the image view is empty.
2. The user selects an image using the picker view.
3. The image is stored in the Documents directory.
4. The user logs out of the app.
5. The user logs back in; at that point, check whether an image exists for that user's name, and if so fetch it from the Documents directory and show it directly, otherwise leave it blank.
6. The user can change the profile picture, and steps 1-4 are repeated.
Edit: integrated into your code:
class ViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
@IBOutlet var imageView: UIImageView!
@IBOutlet var chooseBuuton: UIButton!
var imagePicker = UIImagePickerController()
override func viewDidLoad() {
super.viewDidLoad()
//Check if an image exists in the Documents directory
let nsDocumentDirectory = FileManager.SearchPathDirectory.documentDirectory
let nsUserDomainMask = FileManager.SearchPathDomainMask.userDomainMask
let paths = NSSearchPathForDirectoriesInDomains(nsDocumentDirectory, nsUserDomainMask, true)
if let dirPath = paths.first
{
let imageURL = URL(fileURLWithPath: dirPath).appendingPathComponent("image.jpg") // same name and extension as used when saving
let image = UIImage(contentsOfFile: imageURL.path)
imageView.image = image
}else{
// Image not present
// Do whatever you want to do here
}
}
func imagePickerController(picker: UIImagePickerController!, didFinishPickingImage image: UIImage!, editingInfo: NSDictionary!){
self.dismiss(animated: true, completion: { () -> Void in
})
imageView.image = image
}
func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
dismiss(animated: true, completion: nil)
}
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
if let image = info[UIImagePickerControllerOriginalImage] as? UIImage {
imageView.image = image
imageView.contentMode = .scaleAspectFill
let path = try! FileManager.default.url(for: FileManager.SearchPathDirectory.documentDirectory, in: FileManager.SearchPathDomainMask.userDomainMask, appropriateFor: nil, create: false)
let newPath = path.appendingPathComponent("image.jpg") //Possibly you can Use the UserName to fetch easily User-wise
let jpgImageData = UIImageJPEGRepresentation(image, 1.0)
do {
try jpgImageData!.write(to: newPath)
} catch {
print(error)
}
}
dismiss(animated: true, completion: nil)
}
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
let controller = UIImagePickerController()
controller.delegate = self
controller.sourceType = .photoLibrary
present(controller, animated: true, completion: nil)
}}
Hopefully this helps you. Let me know if it works.
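One detail worth pulling out of the code above: the save and load paths must use the same file name and extension, otherwise the lookup will quietly come back empty. A small sketch of helpers that keep both in one place (the helper names and the "profile.jpg" file name are just illustrative, not part of the original code):

func profileImageURL() -> URL {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    return documents.appendingPathComponent("profile.jpg")
}

func saveProfileImage(_ image: UIImage) {
    // JPEG at full quality; the same URL is used for saving and loading.
    if let data = UIImageJPEGRepresentation(image, 1.0) {
        try? data.write(to: profileImageURL())
    }
}

func loadProfileImage() -> UIImage? {
    return UIImage(contentsOfFile: profileImageURL().path)
}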

Swift: Core Data loads my (portrait) image rotated 90 degrees

When I save my image to Core Data and then re-open it, every image that was taken in portrait comes back in landscape orientation.
I found lots of previous questions about this, but all of them are in Objective-C, not Swift.
How can I fix the problem?
This is my code (it is a test application; once it works I will add it to my project).
The test app has two image views: one for loading from the library and one for loading from Core Data.
class ViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate{
var monimage: String!
let imagePicker = UIImagePickerController()
@IBOutlet weak var MaPhoto: UIImageView? = UIImageView()
@IBOutlet weak var maPhoto2: UIImageView! = UIImageView()
var cameraUI:UIImagePickerController = UIImagePickerController()
var yourContacts:NSMutableArray = NSMutableArray()
override func viewDidLoad() {
imagePicker.delegate = self
super.viewDidLoad()
}
@IBAction func LabraryImage(sender: AnyObject) {
imagePicker.delegate = self
imagePicker.sourceType = .PhotoLibrary
imagePicker.allowsEditing = true
presentViewController(imagePicker, animated: true, completion: nil)
}
func imagePickerControllerDidCancel(picker: UIImagePickerController) {
dismissViewControllerAnimated(true, completion: nil)
}
@IBAction func takePhoto(sender: UIButton) {
if (UIImagePickerController.isSourceTypeAvailable(.Camera)){
cameraUI = UIImagePickerController()
cameraUI.delegate = self
cameraUI.sourceType = UIImagePickerControllerSourceType.Camera
cameraUI.allowsEditing = true
self.presentViewController(cameraUI, animated: true, completion: nil)
}else{
//no camera available
let alert = UIAlertController(title: "Error", message: "There is no camera available", preferredStyle: .Alert)
alert.addAction(UIAlertAction(title: "Okay", style: .Default, handler: {(alertAction)in
alert.dismissViewControllerAnimated (true, completion: nil)
}))
}
}
func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [NSObject : AnyObject]) {
if let pickedImage = info[UIImagePickerControllerOriginalImage] as? UIImage {
MaPhoto!.contentMode = .ScaleAspectFit
MaPhoto!.image = pickedImage
}
dismissViewControllerAnimated(true, completion: nil)
}
@IBAction func btnSavePressed(sender : AnyObject) {
let appDel:AppDelegate = UIApplication.sharedApplication().delegate as! AppDelegate
let context:NSManagedObjectContext = appDel.managedObjectContext!
let ent = NSEntityDescription.entityForName("ImageData", inManagedObjectContext: context)
var newUser = ImageData (entity: ent!, insertIntoManagedObjectContext: context)
let contactImageData:NSData = UIImagePNGRepresentation(MaPhoto!.image)
newUser.monimage = contactImageData
context.save(nil)
self.navigationController?.popViewControllerAnimated(true)
}
@IBAction func loadImage(sender: AnyObject){
let appDel:AppDelegate = UIApplication.sharedApplication().delegate as! AppDelegate
let context:NSManagedObjectContext = appDel.managedObjectContext!
let request2 = NSFetchRequest (entityName: "ImageData")
request2.returnsObjectsAsFaults = false;
var results2:NSArray = context.executeFetchRequest(request2, error: nil)!
if results2.count > 0 {
for user in results2{
var thisUser2 = user as! ImageData
let profileImage:UIImage = UIImage(data: thisUser2.monimage)!
maPhoto2.image = profileImage
}
}
}
}
I am also trying to get the image square; that is why allowsEditing is set to true.
Thanks for your help!
This is the answer to my question:
@IBAction func btnSavePressed(sender : AnyObject) {
let appDelegate = UIApplication.sharedApplication().delegate as! AppDelegate
let managedContext = appDelegate.managedObjectContext!
let entity = NSEntityDescription.entityForName("ImageData",
inManagedObjectContext: managedContext)
let options = NSManagedObject(entity: entity!,
insertIntoManagedObjectContext:managedContext)
var newImageData = UIImageJPEGRepresentation(MaPhoto!.image,1)
options.setValue(newImageData, forKey: "monimage")
var error: NSError?
managedContext.save(&error)
}
JPGs are great for photos. However, saving as JPEG may lose some quality, whereas PNG excels here because it uses a lossless compression format.
It's just a matter of preference and what you need at the moment. If you don't want to convert to JPG, you can call this method and then convert to pngData to preserve the orientation before saving to Core Data. :)
extension UIImage {
func rotatedCopy() -> UIImage {
if self.imageOrientation == UIImage.Orientation.up {
return self
}
UIGraphicsBeginImageContext(size)
//draws the image in the current context, respecting orientation
draw(in: CGRect(origin: CGPoint.zero, size: size))
let copy = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return copy!
}
}
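For example, the save action could make the upright copy before encoding (a sketch only; it assumes rotatedCopy() has been added in a UIImage extension as above and uses the modern pngData() API mentioned in this answer):

if let photo = MaPhoto?.image {
    let upright = photo.rotatedCopy()  // orientation is now baked into the pixels
    let imageData = upright.pngData()  // lossless PNG, safe to round-trip through Core Data
    options.setValue(imageData, forKey: "monimage")
    // ...then save the managed object context as before
}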

Swift images get stretched

So I am totally new to programming and Swift; this is my second week of trying to code. A lot of fun, but a lot of errors as well. I want to make an app where the user can choose a photo from their gallery or take a photo with their camera, and, after pressing a button, this image gets pixelated (using Core Image).
The problem is that whenever I press the button, the image seems to get stretched, and I can't figure out why. After browsing a picture:
After pressing the button:
Thanks for any answers!
My code is as follows:
import UIKit
class ViewController: UIViewController,UIImagePickerControllerDelegate,UINavigationControllerDelegate {
@IBOutlet weak var myImageView: UIImageView!
let picker = UIImagePickerController()
func noCamera(){
let alertVC = UIAlertController(title: "No Camera", message: "Don't try it on a computer Dumbass!", preferredStyle: .Alert)
let okAction = UIAlertAction(title: "Sorry about that :(", style:.Default, handler: nil)
alertVC.addAction(okAction)
presentViewController(alertVC, animated: true, completion: nil)
}
@IBAction func photofromLibrary(sender: UIBarButtonItem) {
picker.allowsEditing = false //2
picker.sourceType = .PhotoLibrary //3
picker.modalPresentationStyle = .Popover
presentViewController(picker, animated: true, completion: nil)//4
picker.popoverPresentationController?.barButtonItem = sender
}
@IBAction func shootPhoto(sender: UIButton) {
if UIImagePickerController.availableCaptureModesForCameraDevice(.Rear) != nil {
picker.allowsEditing = false
picker.sourceType = UIImagePickerControllerSourceType.Camera
picker.cameraCaptureMode = .Photo
presentViewController(picker, animated: true, completion: nil)
} else {
noCamera()
}
}
@IBAction func pixelise(sender: UIButton) {
// 1
let ciImage = CIImage(image: myImageView.image)
// 2
var filter = CIFilter(name: "CIPixellate")
filter.setDefaults()
filter.setValue(ciImage, forKey: kCIInputImageKey)
myImageView.contentMode = .ScaleAspectFit
// 3
var outputImage = filter.outputImage
var newImage = UIImage(CIImage: outputImage)
myImageView.image = newImage
}
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view, typically from a nib.
picker.delegate = self
}
//MARK: Delegates
func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [NSObject : AnyObject]) {
var chosenImage = info[UIImagePickerControllerOriginalImage] as! UIImage //2
myImageView.contentMode = .ScaleAspectFit //3
myImageView.image = chosenImage //4
dismissViewControllerAnimated(true, completion: nil) //5
}
func imagePickerControllerDidCancel(picker: UIImagePickerController) {
dismissViewControllerAnimated(true, completion: nil)
}
override func didReceiveMemoryWarning() {
super.didReceiveMemoryWarning()
// Dispose of any resources that can be recreated.
}
}
The process of converting CIImage to UIImage consists of creating a CIContext, then creating a CGImage using that context, and then creating a UIImage from that:
// 1
let ciImage = CIImage(image: image)
// 2
let filter = CIFilter(name: "CIPixellate")
filter.setDefaults()
filter.setValue(ciImage, forKey: kCIInputImageKey)
// 3
let context = CIContext(options: nil)
let cgImage = context.createCGImage(filter.outputImage, fromRect: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
let outputImage = UIImage(CGImage: cgImage)
That yields the correctly pixelated image without stretching.
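To tie it together, the question's pixelise action could be rewritten along these lines (a sketch in the question's Swift version; the context is kept as a property so it is not recreated on every tap):

let context = CIContext(options: nil) // property, created once

@IBAction func pixelise(sender: UIButton) {
    if let image = myImageView.image {
        let ciImage = CIImage(image: image)
        let filter = CIFilter(name: "CIPixellate")
        filter.setDefaults()
        filter.setValue(ciImage, forKey: kCIInputImageKey)
        // Rendering through the context produces a real bitmap with the original
        // pixel dimensions, which the image view then scales like any other image.
        let cgImage = context.createCGImage(filter.outputImage,
            fromRect: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
        myImageView.image = UIImage(CGImage: cgImage)
    }
}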