In Swift, how do you find UILayoutGuide objects in a view controller? - swift

I'm creating a Sudoku-style Swift app using 81 buttons (organized in stack views). I'm referencing the buttons by their tag values, 0 to 80, which align with the arrays I'm using to store the values. But the first button is causing problems, I think because there are UILayoutGuide objects that also have tag 0.
This solution was working, but I've just upgraded to Xcode Version 8.0 (8A218a) and converted to Swift 3.
I added the following code to the viewDidLoad function of the view controller:
// Get the subviews of the view
let subviews = view.subviews
// Return if there are no subviews
if subviews.isEmpty {
    return
}
for subview: AnyObject in subviews {
    // Do what you want to do with the subview
    print("\(subview.tag) - \(subview)")
}
This produces the following log:
Optional(200) - UIImageView: 0x7ff3e240f3b0; frame = (-4 0; 383 667); autoresize = RM+BM; userInteractionEnabled = NO; tag = 200; layer =
Optional(200) - UIStackView: 0x7ff3e240f590; frame = (7 103; 360 360); opaque = NO; autoresize = RM+BM; tag = 200; layer = CATransformLayer: 0x6100002316a0
Optional(200) - UIImageView: 0x7ff3e24111b0; frame = (0 47; 375 436); autoresize = RM+BM; userInteractionEnabled = NO; tag = 200; layer = CALayer: 0x6100002344e0
Optional(200) - UIToolbar: 0x7ff3e2637ff0; frame = (0 623; 375 44); opaque = NO; autoresize = RM+BM; tag = 200; layer = CALayer: 0x6000000343e0
Optional(200) - UIImageView: 0x7ff3e26015f0; frame = (63 228; 248 211); autoresize = RM+BM; userInteractionEnabled = NO; tag = 200; layer = CALayer: 0x600000034ea0
Optional(200) - UILabel: 0x7ff3e263e2e0; frame = (117 313; 141 41); text = 'Nice Work'; opaque = NO; autoresize = RM+BM; userInteractionEnabled = NO; tag = 200; layer = _UILabelLayer: 0x600000093060
Optional(200) - UIStackView: 0x7ff3e263e750; frame = (127 491; 120 120); opaque = NO; autoresize = RM+BM; tag = 200; layer = CATransformLayer: 0x600000035520
Optional(0) - _UILayoutGuide: 0x7ff3e263ee80; frame = (0 0; 0 0); hidden = YES; layer = CALayer: 0x600000035bc0
Optional(0) - _UILayoutGuide: 0x7ff3e263f030; frame = (0 0; 0 0); hidden = YES; layer = CALayer: 0x600000035c60
So the last two output lines are my suspected UILayoutGuide objects. I didn't create them and they are not objects in my view hierarchy. How do I see them, or get rid of them?

If you just want to prevent the loop from processing the _UILayoutGuide views (which conform to UILayoutSupport), do this:
// Get the subviews of the view
let subviews = view.subviews
// Return if there are no subviews
if subviews.isEmpty {
    return
}
for subview: AnyObject in subviews {
    // Do what you want to do with the subview
    print("\(subview.tag) - \(subview)")
    // Check whether it is a layout guide or not
    if subview is UILayoutSupport {
        // This branch runs when a _UILayoutGuide comes into play
        print("Don't add it to the stack")
    } else {
        // Write your code in this section
    }
}
You can also print your stack, check for the UILayoutGuide objects in it, and remove them.
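Alternatively, you can filter the layout guides out before looping. A minimal Swift 3 sketch, assuming you only care about the real subviews:

```swift
// Drop the hidden _UILayoutGuide views, which conform to UILayoutSupport.
let realSubviews = view.subviews.filter { !($0 is UILayoutSupport) }
for subview in realSubviews {
    print("\(subview.tag) - \(subview)")
}
```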

The simple answer is that the UILayoutGuides don't have tags. A view's tag defaults to 0, so printing subview.tag shows 0 when no tag was ever set. So the issue with button 0 is not related to the UILayoutGuide.
I didn't figure out what the issue was with the tag 0 button, but I recoded to avoid using tag 0. Not exactly a fix, but the workaround works.
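Since tag defaults to 0, one common workaround is to tag the buttons 1 through 81 and subtract 1 when indexing the arrays. A minimal sketch of that index math (the helper names are mine, not from the original code):

```swift
// Buttons are tagged 1...81; the backing arrays are indexed 0...80.
func arrayIndex(forButtonTag tag: Int) -> Int {
    return tag - 1                 // tag 1 -> index 0, tag 81 -> index 80
}

// Map an array index to its row and column in the 9x9 Sudoku grid.
func rowAndColumn(forArrayIndex index: Int) -> (row: Int, column: Int) {
    return (index / 9, index % 9)
}
```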

Related

How to know the view that is receiving the current gesture

I have a UIButton that only reacts to my tap gestures when I tap on a very small sub-area within it. I'm assuming that another view may be swallowing my gestures when I tap elsewhere within the button.
The question is: is there a way to know which UIView exactly is receiving the taps? I know I can brute-force it and attach gesture handlers to all of my views, but that seems like too much work. Any ideas?
update:
Ahmed's answer below worked perfectly. I also added UIView's recursiveDescription to it to find the offending view quickly,
i.e. if I put a breakpoint on the NSLog line:
- (void)sendEvent:(UIEvent *)event
{
    [super sendEvent:event];
    UIView *touchReceipientView = ((UITouch *)[event.allTouches anyObject]).view;
    NSLog(@"View = %@", touchReceipientView);
}
I get this output (as an example):
(lldb) po [touchReceipientView recursiveDescription]
<UIView: 0x1fd16470; frame = (0 430; 320 40); layer = <CALayer: 0x1fd164d0>>
| <UIButton: 0x1fd13030; frame = (297 0; 18 19); opaque = NO; tag = 11; layer = <CALayer: 0x1fd130f0>>
| | <UIImageView: 0x1fd13190; frame = (-41 -40.5; 100 100); alpha = 0; opaque = NO; userInteractionEnabled = NO; tag = 1886548836; layer = <CALayer: 0x1fd131f0>>
| | <UIImageView: 0x1fd14d00; frame = (1 2; 16 15); clipsToBounds = YES; opaque = NO; userInteractionEnabled = NO; layer = <CALayer: 0x1fdc8e50>>
which made me identify the offending UIView immediately!
You can subclass UIApplication and see who is receiving the events.
In main.m:
@interface MyApplication : UIApplication
@end

@implementation MyApplication
- (void)sendEvent:(UIEvent *)event
{
    [super sendEvent:event];
    NSLog(@"View = %@", ((UITouch *)[event.allTouches anyObject]).view);
}
@end

int main(int argc, char *argv[])
{
    @autoreleasepool {
        return UIApplicationMain(argc, argv, NSStringFromClass([MyApplication class]), NSStringFromClass([MyAppDelegate class]));
    }
}
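For a Swift project, the same trick might look like this (a sketch; the class name is mine, and you would pass it as the principal class to UIApplicationMain yourself instead of relying on @UIApplicationMain):

```swift
import UIKit

// Sketch: a UIApplication subclass that logs the view under each touch.
class LoggingApplication: UIApplication {
    override func sendEvent(_ event: UIEvent) {
        super.sendEvent(event)
        if let view = event.allTouches?.first?.view {
            print("View = \(view)")
        }
    }
}
```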

How do I set an lldb watchpoint on a property of self.view?

I want to trace when something changes the size of self.view. What's the correct format?
(lldb) po self.view
(UIView *) $1 = 0x0a8aba20 <UIView: 0xa8aba20; frame = (0 0; 480 864); autoresize = W+TM+BM; layer = <CALayer: 0xa8aba50>>
(lldb) watch set variable self.view.frame.size.width
error: "self" is a pointer and . was used to attempt to access "view". Did you mean "self->view.frame.size.width"?
(lldb) watch set variable self->view
error: "view" is not a member of "(PlayViewController *) self"
(lldb) watch set variable self->view.frame.size.width
error: "view" is not a member of "(PlayViewController *) self"
I've tried the documentation and other lldb watchpoint questions but can't find anything for this specific case.
Thanks for your help.
The view controller references its view from its _view instance variable.
The view doesn't store its frame directly. It just returns its layer's frame.
The view references its layer from its _layer instance variable.
The layer doesn't store the frame either. It computes its frame from its bounds, position, anchorPoint, and transform. The size is part of bounds.
The layer doesn't store its bounds directly in an instance variable. Instead, its layer instance variable references an instance of a private C++ class, CA::Layer. The member layout of this class is undocumented.
In other words, you can go self->_view->_layer->layer to get to the CA::Layer instance, but then you're stuck because you don't know where in the CA::Layer to find the bounds.
So, trying to use a watchpoint to detect changes to the view's size is rather difficult.
It is easier to put a breakpoint on -[CALayer setBounds:].
On the simulator
Remember to use the layer address in the breakpoint condition, not the view address.
(lldb) po self.view
(UIView *) $1 = 0x0a034690 <UIView: 0xa034690; frame = (0 20; 768 1004); autoresize = W+H; layer = <CALayer: 0xa034780>>
(lldb) break set -F '-[CALayer setBounds:]' -c '((int*)$esp)[1] == 0xa034780'
Breakpoint created: 2: name = '-[CALayer setBounds:]', locations = 1, resolved = 1
When the breakpoint is hit, the CALayer instance is referenced by ((int *)$esp)[1], and the new bounds is *(CGRect *)($esp+12):
(lldb) po ((int*)$esp)[1]
(int) $8 = 167987072 <CALayer:0xa034780; position = CGPoint (384 480); bounds = CGRect (0 0; 768 1004); delegate = <UIView: 0xa034690; frame = (0 -22; 768 1004); autoresize = W+H; layer = <CALayer: 0xa034780>>; sublayers = (<CALayer: 0xa033010>); backgroundColor = <CGColor 0xa034960> [<CGColorSpace 0xa02b3b0> (kCGColorSpaceDeviceRGB)] ( 1 1 1 1 )>
(lldb) p *(CGRect*)($esp+12)
(CGRect) $9 = origin=(x=0, y=0) size=(width=768, height=960)
(lldb) finish
(lldb) po 0xa034780
(int) $10 = 167987072 <CALayer:0xa034780; position = CGPoint (384 480); bounds = CGRect (0 0; 768 960); delegate = <UIView: 0xa034690; frame = (0 0; 768 960); autoresize = W+H; layer = <CALayer: 0xa034780>>; sublayers = (<CALayer: 0xa033010>); backgroundColor = <CGColor 0xa034960> [<CGColorSpace 0xa02b3b0> (kCGColorSpaceDeviceRGB)] ( 1 1 1 1 )>
On the device
Remember to use the layer address in the breakpoint condition, not the view address.
(lldb) po self.view
(UIView *) $0 = 0x1f031a10 <UIView: 0x1f031a10; frame = (0 20; 768 1004); autoresize = W+H; layer = <CALayer: 0x1f031b00>>
(lldb) break set -F '-[CALayer setBounds:]' -c '$r0 == 0x1f031b00'
Breakpoint created: 2: name = '-[CALayer setBounds:]', locations = 1, resolved = 1
When the breakpoint is hit, the CALayer instance is referenced by $r0, the new X origin is in $r2, the new Y origin is in $r3, and the new size is *(CGSize *)$sp:
(lldb) po $r0
(unsigned int) $7 = 520297216 <CALayer:0x1f031b00; position = CGPoint (384 480); bounds = CGRect (0 0; 768 1004); delegate = <UIView: 0x1f031a10; frame = (0 -22; 768 1004); autoresize = W+H; layer = <CALayer: 0x1f031b00>>; sublayers = (<CALayer: 0x1f030840>); backgroundColor = <CGColor 0x1f031ce0> [<CGColorSpace 0x1e530ad0> (kCGColorSpaceDeviceRGB)] ( 1 1 1 1 )>
(lldb) p/f $r2
(unsigned int) $14 = 0
(lldb) p/f $r3
(unsigned int) $15 = 0
(lldb) p *(CGSize *)$sp
(CGSize) $16 = (width=768, height=960)
(lldb) finish
(lldb) po 0x1f031b00
(int) $17 = 520297216 <CALayer:0x1f031b00; position = CGPoint (384 480); bounds = CGRect (0 0; 768 960); delegate = <UIView: 0x1f031a10; frame = (0 0; 768 960); autoresize = W+H; layer = <CALayer: 0x1f031b00>>; sublayers = (<CALayer: 0x1f030840>); backgroundColor = <CGColor 0x1f031ce0> [<CGColorSpace 0x1e530ad0> (kCGColorSpaceDeviceRGB)] ( 1 1 1 1 )>
On 64 bit simulators the break command should be:
break set -F '-[CALayer setBounds:]' -c '$rdi == 0x...'

Text from UITextFieldTextDidChangeNotification

I have a UITextField with this NSNotification:
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(textFieldDidChange:) name:@"UITextFieldTextDidChangeNotification" object:_searchTextField];
- (void)textFieldDidChange:(NSNotification *)notif
{
    //
}
This is the NSLog output when I type in r:
NSConcreteNotification 0x193c20 {name = UITextFieldTextDidChangeNotification; object = <UITextField: 0x15a980; frame = (20 20; 280 31); text = 'r'; clipsToBounds = YES; opaque = NO; autoresize = RM+BM; layer = <CALayer: 0x15aad0>>}
How do I get the text r out of the notif object?
The notification's object property stores the text field whose text changed, so cast notif.object to UITextField and read its text property to get "r".
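In Swift the same extraction would look roughly like this (a sketch, assuming a textFieldDidChange(_:) observer method):

```swift
import UIKit

func textFieldDidChange(_ notification: Notification) {
    // The notification's object is the UITextField whose text changed.
    guard let textField = notification.object as? UITextField else { return }
    print(textField.text ?? "")   // "r" in the example above
}
```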

iPhone, the best way to change the background color of the UIAlertView , and button properties?

As the topic says, I want a simple way to change the color of a UIAlertView, but after googling I found it is troublesome. Anyway, can anyone give a tip on this problem?
I also traced the UIAlertView; the structure is as follows:
CAlertView: 0x6463050; baseClass = UIAlertView; frame = (3.8 175.1; 312.4 129.8); transform = [1.1, 0, 0, 1.1, 0, 0]; opaque = NO; tag = -1; animations = { transform=<CABasicAnimation: 0x616cfb0>; opacity=<CABasicAnimation: 0x616d050>; }; layer = <CALayer: 0x6407f90>>
Above is the base view; its subviews are as follows:
"<UIImageView: 0x6470dc0; frame = (0 0; 284 118); opaque = NO; autoresize = W+H; userInteractionEnabled = NO; layer = <CALayer: 0x6470df0>>",
"<UILabel: 0x64649c0; frame = (12 15; 260 0); text = ''; clipsToBounds = YES; opaque = NO; userInteractionEnabled = NO; layer = <CALayer: 0x6444490>>",
"<UILabel: 0x6454180; frame = (12 22; 260 21); text = 'Submit successfully!'; clipsToBounds = YES; opaque = NO; userInteractionEnabled = NO; layer = <CALayer: 0x6453660>>",
"<UIThreePartButton: 0x64690f0; frame = (11 59; 262 43); opaque = NO; tag = 1; layer = <CALayer: 0x6468800>>"
I think the UIImageView is the background view. I tried to change the color of that view, but it does not affect the color of the UIAlertView...
Can anyone share a good and simple way to change the color of the alert view?
Thanks for your time.
Regards
Is this app going to be submitted to the App Store? If so, I can almost guarantee that Apple will reject it for violating the Human Interface Guidelines.
You would probably be better off creating your own custom version of a UIAlertView and showing that instead (you would also have a lot more control over its appearance and behaviour).

hitTest returns wrong UIView

I have a view hierarchy which contains smaller views on a scroll view. Each view can have subviews in it such as buttons etc.
For some reason, buttons on the view aren't receiving taps; exploring this further showed that while the scroll view receives the touchesBegan event, the button does not. Calling hitTest:withEvent: shows that the button is not returned, even though the point is within its bounds.
I've included log output describing the touch's location in the scroll view, the item returned from hitTest:withEvent:, the touch's location from calling locationInView: on the expected item, and the hierarchy of the expected item (with frames printed). From this output I deduce that the button should have been hit...
Can anyone explain this? Am I missing something?
touched ({451, 309}) on <VCViewContainersView: 0x4b31ee0; frame = (0 0; 748 1024); transform = [0, 1, -1, 0, 0, 0]; autoresize = W+H; layer = <CALayer: 0x4b32130>> (location in expected item: {17, 7.5})
expected touched item is:
view: <UIButtonLabel: 0x482b920; frame = (32 5; 36 19); text = 'Click'; clipsToBounds = YES; opaque = NO; userInteractionEnabled = NO; layer = <CALayer: 0x4831370>>, layer transform: [1, 0, 0, 1, 0, 0]
view: <UIRoundedRectButton: 0x482c100; frame = (50 50; 100 30); opaque = NO; layer = <CALayer: 0x482c450>>, layer transform: [1, 0, 0, 1, 0, 0]
view: <UIImageView: 0x480f290; frame = (0 0; 320 255); opaque = NO; userInteractionEnabled = NO; layer = <CALayer: 0x480e840>>, layer transform: [1, 0, 0, 1, 0, 0]
view: <VCViewContainer: 0x4b333c0; frame = (352 246.5; 320 471.75); layer = <CALayer: 0x4b33d50>>, layer transform: [1, 0, 0, 1, 0, 0]
view: <UIScrollView: 0x4b32600; frame = (0 0; 1024 748); clipsToBounds = YES; autoresize = W+H; userInteractionEnabled = NO; layer = <CALayer: 0x4b32780>>, layer transform: [1, 0, 0, 1, 0, 0]
view: <VCViewsContainerView: 0x4b31ee0; frame = (0 0; 748 1024); transform = [0, 1, -1, 0, 0, 0]; autoresize = W+H; layer = <CALayer: 0x4b32130>>, layer transform: [0, 1, -1, 0, 0, 0]
view: <UIWindow: 0x4b1d590; frame = (0 0; 768 1024); opaque = NO; autoresize = RM+BM; layer = <CALayer: 0x4b1d6d0>>, layer transform: [1, 0, 0, 1, 0, 0]
Update: Other than the UIWindow and VCViewsContainerView, all views are created programmatically using initWithFrame: or in the case of the button, buttonWithType:. The VCViewContainer is initialized using CGRectZero and when the UIImageView is created, its frame is set to the image's size + additional space for labels on the bottom of it.
Update 2: When calling [self.layer hitTest:location] with the same location, I get the layer of the correct view! What's going on here...?
hitTest:withEvent: starts at the window. Each view tests its subviews before testing itself, and so on, recursively. If a view's userInteractionEnabled is NO, however, it returns nil from hitTest:withEvent:, without testing its subviews. Such a view is certainly hit-tested, but it immediately replies that neither it nor any of its subviews is the hit view.
Your UIScrollView has its userInteractionEnabled set to NO. Thus, when the VCViewContainersView tests its subview the UIScrollView, and the UIScrollView returns nil because its userInteractionEnabled is NO, the VCViewContainersView uses pointInside:withEvent: on itself, finds that the touch is within itself, and returns itself as the hit view (and the search ends). This explains the result you are getting.
The reason this doesn't happen when you do the hit-test by way of the layers is that layers are not touchable and know nothing about the rules for touches, so they ignore userInteractionEnabled, which is a view feature, not a layer feature. Layer hit-testing is sort of a kludge, intended only for when a view contains a whole layer hierarchy (without a view hierarchy) and you want to simulate that a particular layer is touchable. The docs do tell you that the logic is different for a layer's hitTest: than it is for a view's hitTest:withEvent:, though they fail to explain exactly how.
I don't know why you have set your scroll view to be non-touchable, but if that's important to you, you can override hitTest:withEvent: in a UIScrollView subclass so that it tests its subviews but returns nil if all of them return nil.
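A minimal Swift sketch of that override (the class name is hypothetical): it hit-tests the subviews front to back itself, sidestepping the scroll view's own userInteractionEnabled check, and returns nil when no subview is hit:

```swift
import UIKit

// Sketch: a scroll view that lets its subviews be hit even though the
// scroll view itself has isUserInteractionEnabled == false. It tests
// subviews front to back and returns nil when none of them is hit.
class SubviewHitTestingScrollView: UIScrollView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        for subview in subviews.reversed() {
            let converted = convert(point, to: subview)
            if let hit = subview.hitTest(converted, with: event) {
                return hit
            }
        }
        return nil
    }
}
```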
You might want to try subclassing your UIScrollView to receive touches on objects within it.
By default your touches will be received by your UIScrollView, so if you want to receive touches on objects inside of it, you need to set that up explicitly.
Here's a working subclass to do just that. Thanks to the SO community for this; I believe I originally found something like this here, but I can't dig up the original post now.
Here's the .h:
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>

@interface TouchScroller : UIScrollView
{
}
@end
And the .m:
#import "TouchScroller.h"

@implementation TouchScroller

- (id)initWithFrame:(CGRect)frame
{
    return [super initWithFrame:frame];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // If not dragging, send event to next responder
    if (!self.dragging)
        [self.nextResponder touchesEnded:touches withEvent:event];
    else
        [super touchesEnded:touches withEvent:event];
}

@end
That should do it.
If I understand the view stack correctly, your button is a subview of some UIImageView, which has its userInteractionEnabled property set to NO (the default for image views), so the image view and all its subviews won't receive any touch events.
Setting the image view's userInteractionEnabled property to YES should solve the problem.
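Assuming `imageView` references the image view in question, the one-line fix in Swift would be:

```swift
// UIImageView sets isUserInteractionEnabled to false by default,
// which swallows touches for it and all of its subviews.
imageView.isUserInteractionEnabled = true
```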