UITapGesture not recognised on the simulator when tapping a UIImageView - swift

I am trying to display a new image when the user taps the image view. I have added a UITapGestureRecognizer on top of the UIImageView, which is connected to the view controller as an outlet named "displayPhoto".
@IBOutlet weak var displayPhoto: UIImageView!

@IBAction func changeImage(_ sender: UITapGestureRecognizer) {
    displayPhoto.image = UIImage(named: "myimage")
}
The sent action is listed as follows:
[Screenshot: the gesture recognizer's sent actions]
When I run the app and click on the image, nothing happens. I've even tried to make the first line of the IBAction function a fatal error but nothing happens. What am I missing?
Thanks in advance for any help.
[Screenshot: the full view and connections, showing the Tap Gesture Recognizer window]

First, try cleaning your project:
Command+Option+Shift+K
Also, the image view cannot start out empty; you should assign an initial picture, for example in viewDidLoad:
override func viewDidLoad() {
    super.viewDidLoad()
    displayPhoto.image = UIImage(named: "firsCase")
}
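One more thing worth checking (not part of the answer above, but consistent with the related question below): UIImageView has user interaction disabled by default, so a gesture recognizer attached to it never fires until interaction is enabled, either in the Attributes inspector or in code. A minimal sketch combining this with the viewDidLoad above:

override func viewDidLoad() {
    super.viewDidLoad()
    // UIImageView.isUserInteractionEnabled defaults to false; without this,
    // a tap gesture attached to the image view is never delivered.
    displayPhoto.isUserInteractionEnabled = true
    displayPhoto.image = UIImage(named: "firsCase")
}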

Related

Similar UITapGestureRecognizer Actions for Many ImageViews

I am relatively new to Swift.
I have many images (though right now I am testing with four) that I am trying to hide when they are tapped (only temporarily, to confirm the foundational code works; eventually I want to insert an image under the tapped one).
I have created an array of ImageViews which I plan to expand once I have working code. I tried to add UITapGestureRecognizers to each ImageView using a for loop in addGestures() and then have selectImage() hide the tapped ImageView. The code compiles without error, but crashes with an uncaught NSException when one of these images is tapped. Any tips on how to do this effectively without too much manual coding for each image?
Code attached
Please check the code below:
func addGestureRecognizer() {
    for imageView in imageArray {
        // Image views ignore touches unless user interaction is enabled
        imageView.isUserInteractionEnabled = true
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(gesture:)))
        imageView.addGestureRecognizer(tap)
    }
}

@objc func handleTap(gesture: UITapGestureRecognizer) {
    // The gesture's view is the image view that was tapped
    let imageView = gesture.view
    imageView?.isHidden = true
}
In your code you're passing the imageView instead of the gesture to the selector; you can get the imageView from gesture.view instead.
Also, there is no need for a separate function just to enable user interaction.
Here is what you can try
class GestureStackVC: UIViewController {

    /// Image outlets
    @IBOutlet weak var img1: UIImageView!
    @IBOutlet weak var img2: UIImageView!
    @IBOutlet weak var img3: UIImageView!

    /// ImageView array
    var imagesArray: [UIImageView]?

    /// Image gesture
    var imageTapGesture: UITapGestureRecognizer?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.

        /// Fill the array with the image view outlets
        imagesArray = [img1, img2, img3]

        /// Add a tap gesture recognizer to each image view
        for imageView in imagesArray! {
            let tap = UITapGestureRecognizer(target: self, action: #selector(imageTapHandler(_:)))
            imageView.isUserInteractionEnabled = true
            imageView.addGestureRecognizer(tap)
        }
    }

    /// Tap handler
    @objc func imageTapHandler(_ sender: UITapGestureRecognizer) {
        /// Hide the tapped view
        sender.view?.isHidden = true
    }
}
[Screenshot: initial output with 3 images]
[Screenshot: after tapping ImageView1, it is hidden]
[Screenshot: after tapping ImageView2, it is hidden]

Adding views with IBAction to an NSStackView crashes the application

I want to use an NSStackView to stack views on top of each other. I also want them to be able to expand, so I can't use an NSCollectionView, if I understood it correctly.
So, in the storyboard, I've created an NSStackView (embedded in a scroll view) in the main view controller, and a view controller that I want to fill it with:
The button will fill the stack view with ten views:
@IBOutlet weak var stackView: NSStackView!

@IBAction func redrawStackView(_ sender: Any) {
    for i in 0..<10 {
        let stackViewItemVC = storyboard?.instantiateController(withIdentifier: "StackViewItemVC") as! StackViewItemViewController
        stackViewItemVC.id = i
        stackView.addArrangedSubview(stackViewItemVC.view)
    }
}
And the ViewController on the right simply looks like this:
class StackViewItemViewController: NSViewController {

    var id: Int = -1

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do view setup here.
    }

    @IBAction func buttonPressed(_ sender: Any) {
        debugPrint("StackViewItemViewController " + id.description + " pressed")
    }
}
Running this small application works fine; every time I press the button, ten more stack view items appear. But when I have the audacity to press one of the buttons on the right, the application crashes:
Where am I going wrong?
I have tried working around the IBAction to verify that this is what breaks: the application will not crash if I subclass the button and make a "buttonDelegate" protocol with a function called from mouseUp.
I guess the problem is that the view controller objects you create in the loop are released immediately.
Even though the view is attached to the stackView, its view controller is destroyed.
You can fix this issue by keeping a reference to each view controller.
You can do this by creating a new property

var itemViewControllers = [StackViewItemViewController]()

and then adding each newly created view controller to it:

itemViewControllers.append(stackViewItemVC)
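Putting that together, a minimal sketch of how the button action from the question could look with the reference kept (same outlet and storyboard identifier as above):

var itemViewControllers = [StackViewItemViewController]()

@IBAction func redrawStackView(_ sender: Any) {
    for i in 0..<10 {
        let stackViewItemVC = storyboard?.instantiateController(withIdentifier: "StackViewItemVC") as! StackViewItemViewController
        stackViewItemVC.id = i
        // Keep a strong reference so the controller (and its IBAction target) stays alive
        itemViewControllers.append(stackViewItemVC)
        stackView.addArrangedSubview(stackViewItemVC.view)
    }
}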

UIButton inside table cell not changing attributes

I have a UIButton inside my cell together with an image and a text label. I manage to change the image and label programmatically, but the UIButton does not seem to respond to anything except isHidden.
This is my code, the button that is not changing is followButton:
import UIKit

class ProfileTableCell: UITableViewCell {

    @IBOutlet weak var name: UILabel!
    @IBOutlet weak var profileImage: UIImageView!
    @IBOutlet weak var followButton: UIButton!

    override func awakeFromNib() {
        super.awakeFromNib()
        self.profileImage.layer.borderWidth = 0.0
        self.profileImage.layer.cornerRadius = self.profileImage.frame.size.width / 2
        self.profileImage.clipsToBounds = true
        self.profileImage.image = UIImage(named: "belt")
        self.name.text = "Bar Refaeli"

        self.followButton.layer.borderColor = UIColor.black.cgColor
        self.followButton.layer.borderWidth = 3.0
        self.followButton.layer.cornerRadius = self.frame.size.width / 4
        self.followButton.backgroundColor = UIColor.black
    }

    func setCell(image: UIImage, name: String) {
    }

    override func setSelected(_ selected: Bool, animated: Bool) {
        super.setSelected(selected, animated: animated)
        // Configure the view for the selected state
    }
}
The profileImage and name outlets change the appearance fine, as mentioned above.
I also tried removing the button and adding it back, cleaning the Xcode project, and removing the outlet reference and reconnecting it. Pretty frustrated by now.
I also tried changing the background color of the button through the storyboard, just for testing, and it does not change! What does change is the titleLabel and the text color.
awakeFromNib() prepares the receiver for service after it has been loaded from an Interface Builder archive, or nib file.
Given that, move your styling code to a method that runs once the view is set up, such as viewDidLoad or viewDidAppear(_:).
Child objects that are attributes, like the titleLabel, act differently than child view objects.
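As an illustration only (not from the original answer), one way to follow that advice is to style the button from the owning view controller once the cell has been dequeued; the reuse identifier here is a hypothetical one:

func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    // "ProfileCell" is a placeholder reuse identifier for this sketch
    let cell = tableView.dequeueReusableCell(withIdentifier: "ProfileCell", for: indexPath) as! ProfileTableCell
    cell.followButton.layer.borderColor = UIColor.black.cgColor
    cell.followButton.layer.borderWidth = 3.0
    cell.followButton.layer.cornerRadius = cell.frame.size.width / 4
    cell.followButton.backgroundColor = .black
    return cell
}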
Eventually I actually solved this by throwing the table view away and implementing the same thing with a collection view; there was no problem there.

Swift UINavigationItem button display menu labels

I'm new to Swift and I would like to know how to do this.
When I touch the rightBarButtonItem, I would like the following to appear:
The Test and Test2 text should display in the same view controller.
If I don't touch the rightBarButtonItem, Test and Test2 should not display (their isHidden will be true).
Is this possible or do I need another way?
I have been searching on the internet for a long time, but I have not been able to find anything. Please help or give me some ideas of how to achieve this.
This is possible: you can add Test and Test2 to a view or a stackView, then change the isHidden property of that view.
But as Matthew said, Apple prefers tab bars for this.
Set the view's isHidden property to true in viewDidLoad:
override func viewDidLoad() {
    super.viewDidLoad()
    customView.isHidden = true
}

@IBOutlet weak var customView: UIView!

@IBAction func rightBarButtonClick(_ sender: UIBarButtonItem) {
    customView.isHidden = !customView.isHidden
}
You can also use the SWReveal pod, or you can create it yourself in Swift using this raywenderlich.com tutorial.

Swift - Cropping images *outside* of allowsEditing

I have a very very simple project set up that allows you to click a "browse photo" button. The user then selects a photo from their photo gallery, and it's displayed on a programmatically created UIImageView.
Works like a charm. However - I am missing key functionality that is required.
I need the user to be able to scale the image (via pinching and dragging) after it is displayed within the UIImageView. allowsEditing = true lets the user crop beforehand; I need similar functionality, but allowing them to edit once it's on the main UI.
Help is appreciated. Please and thank you!!
import UIKit

class ViewController: UIViewController {

    @IBOutlet weak var imageView: UIImageView!

    var imageViewLayer: CALayer {
        return imageView.layer
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        imageViewLayer.contents = UIImage(named: "ss3.jpg")?.cgImage
        imageViewLayer.contentsGravity = kCAGravityResizeAspect
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    @IBAction func newGesture(sender: AnyObject) {
        imageViewLayer.transform = CATransform3DMakeScale(sender.scale, sender.scale, 1)
    }
}
I did something similar a while back. I added the image to the UIImageView's layer property, added gesture recognizers to the view, and implemented the gesture callbacks so that they modify the layer and not the view. Adding the image to the UIImageView's layer did the trick. As a side note, every UIView is backed by the CALayer class, which has a lot of methods and properties that help to dynamically change the view; in your case this will be driven by gestures.
As an alternative, you can also use CALayer's hitTest method instead of implementing the callbacks for gesture recognizers.
EDIT: Sample Code
You could do something like this:
@IBOutlet weak var imageView: UIImageView!

var imageViewLayer: CALayer {
    return imageView.layer
}
In viewDidLoad, set up the image:
imageViewLayer.contents = UIImage(named: "CoreDataDemoApp")?.cgImage
imageViewLayer.contentsGravity = kCAGravityResizeAspect
Add a pinch gesture to the image view in the storyboard (or programmatically), and in its callback you could do something like this:
@IBAction func pinchGestureRecognized(sender: AnyObject) {
    imageViewLayer.transform = CATransform3DMakeScale(sender.scale, sender.scale, 1)
}
Again this is just to give you an idea of how it could work and it is not the complete code. Hope this helps!
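For completeness, here is a separate, self-contained sketch (not from the original answer) of wiring up pinch and pan recognizers programmatically so the user can scale and drag the image after it is displayed. For brevity it sets the image view's transform directly rather than the layer's, and the class and property names are made up for the example:

import UIKit

class CroppingViewController: UIViewController {

    @IBOutlet weak var imageView: UIImageView!

    // Accumulated gesture state (assumed names for this sketch)
    private var currentScale: CGFloat = 1.0
    private var currentOffset: CGPoint = .zero

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.isUserInteractionEnabled = true
        imageView.addGestureRecognizer(UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:))))
        imageView.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:))))
    }

    @objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        currentScale *= gesture.scale
        gesture.scale = 1.0              // reset so scale reports incremental changes
        applyTransform()
    }

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        let translation = gesture.translation(in: view)
        currentOffset.x += translation.x
        currentOffset.y += translation.y
        gesture.setTranslation(.zero, in: view)   // reset so translation is incremental
        applyTransform()
    }

    private func applyTransform() {
        // Combine the accumulated drag and scale into a single affine transform
        imageView.transform = CGAffineTransform(translationX: currentOffset.x, y: currentOffset.y)
            .scaledBy(x: currentScale, y: currentScale)
    }
}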
This is another way of doing it:
[Stack Overflow link to a related question]