SKView.presentScene with transition not working if another SKView has been shown - sprite-kit

Update - see my own answer, it was a retain cycle with very indirect results.
I have a very weird bug, apparently some kind of side-effect of having shown an SKView in a different UIView. It seems to be a manifestation of the old iOS 9 bug presentScene not working with transition.
Background
Touchgram is an iMessage app extension which lets you create and play interactive messages. It's like a cross between PowerPoint and a game editor, on top of messages. The interface is built in UIKit and SpriteKit (for playback).
Touchgram's engine creates SKScene objects and can present them with or without transitions, in a multi-page message.
The editing interface starts with a tree view and shows detail editor screens to allow editing things like touch detection, page change actions, playing sounds etc.
Some of our editing screens for details also show an SKView. Specifically, the Touch editor has a mode where an SKView is shown to record either a bounding area or gesture to match.
The Text Element editor shows an SKView as a preview of how styling affects a small piece of text. It renders the text element on the main background of the page, which may be a solid colour or image.
The Problem
What works: launching the app (actually an app extension inside iMessage) and playing a saved message runs fine, including scene transitions.
However, for one editing screen, if you just go into that screen, without it even drawing anything on its embedded SKView, it breaks playback. (This is the Text Element screen mentioned above).
When you then play any of the messages that have transitions, they get stuck with the nodes from the previous scene still showing.
I've narrowed it down to just the literal presence of that SKView on the errant screen. If the only change I make is to remove that SKView, the problem does not occur. There's (now) not even an outlet bound to the SKView and no code interacting with it.
A different screen (Touch editor) has an SKView on it and does not cause this side-effect.
I've spent days narrowing this down to realise it's this side-effect. It wasn't until I read the SO post linked above that I realised the apparent inconsistency of the bug was because some messages had transitions between scenes and others didn't.
This is blocking the first release of the product after over a year of core engine work.
Good screen - Touch Editor
Portion of the xib showing the nesting of views:
<objects>
<view opaque="NO" contentMode="scaleToFill" id="Mia-xp-RE9">
<rect key="frame" x="0.0" y="0.0" width="414" height="736"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMaxY="YES"/>
<subviews>
...
<view hidden="YES" contentMode="scaleToFill" translatesAutoresizingMaskIntoConstraints="NO" id="rlf-kc-zvh" userLabel="RecordingOverlay">
<rect key="frame" x="0.0" y="0.0" width="414" height="736"/>
<subviews>
...
<skView contentMode="scaleToFill" translatesAutoresizingMaskIntoConstraints="NO" id="AnC-CG-Aqq" userLabel="RecordingArea">
<rect key="frame" x="57" y="191" width="300" height="543"/>
<constraints>
<constraint firstAttribute="width" secondItem="AnC-CG-Aqq" secondAttribute="height" multiplier="414:750" priority="950" id="zMg-Iv-5nu"/>
</constraints>
</skView>
</subviews>
...
</view>
</subviews>
...
</view>
Problem screen - Text Editor
Portion of the xib showing the nesting of views:
<objects>
<view opaque="NO" contentMode="scaleToFill" id="upw-ro-x1m">
<rect key="frame" x="0.0" y="0.0" width="375" height="812"/>
<autoresizingMask key="autoresizingMask" flexibleMaxX="YES" flexibleMaxY="YES"/>
<subviews>
<view contentMode="scaleToFill" translatesAutoresizingMaskIntoConstraints="NO" id="KTl-Xs-TIo" userLabel="MainContentView">
<rect key="frame" x="0.0" y="44" width="375" height="734"/>
<subviews>
...
<skView contentMode="scaleToFill" translatesAutoresizingMaskIntoConstraints="NO" id="Wch-3x-CLz" userLabel="preview">
<rect key="frame" x="91" y="307.33333333333326" width="193" height="349.66666666666674"/>
<constraints>
<constraint firstAttribute="width" secondItem="Wch-3x-CLz" secondAttribute="height" multiplier="414:750" priority="950" id="jeK-un-SXq"/>
</constraints>
</skView>
...
</subviews>
...
</view>
</subviews>
...
</view>
Things to Explore
Trying to work out the differences between these screens and what may trigger this UIKit/SpriteKit behaviour. This is visible brainstorming - basically looking for side-effects of what's different between the Working and Problem screens, without a theory in advance. These are mostly things which feel highly unlikely to have an effect, but I'm just trying to identify differences.
Working - adjacent/overlapping scrollView: the screen which works has a scrollView and one other view, visible depending on different modes.
NO - Problem screen has a UIViewController subclass (and, when bound, the SKView was bound to an outlet on that parent).
NO - Problem screen is an SKSceneDelegate, assigned as the delegate in viewDidLoad; noticed that the way it is cleared as delegate is through an optional, which may result in it not being cleared.
NO - Other VC delegation mixed in: the Problem screen implements delegate protocols for 3rd-party RichText font picking as well as being a UIText
NO - Nesting the skView: the Working screen now has deeper nesting than the Problem screen. Moving the skView element up to be a sibling of "MainContentView" makes no difference.
Deallocation - added a deinit in both the Working and Problem screens to confirm both are deallocated after returning to the main tree view, prior to playing. FINALLY FOUND A DIFFERENCE! - the Problem screen never calls deinit.
Irrelevancies
Things which probably have no effect.
Being bound or not - at the start of debugging, the SKView on the problem screen was bound to an outlet. The problem still occurs with it unbound.
Being visible - in both cases the SKView starts as hidden and is shown by code. The problem occurred regardless of whether it was shown.
Showing something in the SKView - with it unbound to an outlet and no code interacting with it, just the presence of the SKView on the xib causes the side-effect.

So, as spelled out in the course of my meandering question above:
How to diagnose
Adding a deinit to do logging let me zero in on the difference between the two screens - one gets cleaned up and the other doesn't.
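That diagnostic can be sketched in a few lines of plain Swift (the class name here is hypothetical, not Touchgram's actual type): give the suspect class a logging deinit, hold the instance only through a weak reference, and check that reference after the screen should have gone away.

```swift
// Hypothetical stand-in for a screen's view controller; the point is
// the logging deinit, which fires if and only if the object is freed.
final class ProblemScreen {
    deinit { print("ProblemScreen deinit") }
}

// Hold the screen only weakly so we can observe its deallocation.
weak var probe: ProblemScreen?
do {
    let screen = ProblemScreen()
    probe = screen
    // ... user edits, then returns to the main tree view ...
}
// A leaked screen (e.g. one caught in a retain cycle) would still be non-nil here.
print(probe == nil ? "screen deallocated" : "screen leaked")
```

If the deinit line never appears in the console after dismissing the real screen, something is still retaining the controller.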
The Cause
The problem screen had this line, in a method called from viewWillAppear (it calls a simple drawing kit generated by PaintCode):
// strong capture of self - this closure is the culprit
self.thicknessSwatch.drawer = { self.hasThickness ? LinePickerKit.drawEdgeThickness() : LinePickerKit.drawNoEdge() }
The fix, which is horribly obvious in retrospect:
self.thicknessSwatch.drawer = { [weak self] in (self?.hasThickness ?? false) ? LinePickerKit.drawEdgeThickness() : LinePickerKit.drawNoEdge() }
i.e. my closure had a retain cycle on the UIViewController. That retained the SKView, even though it was not used, not bound to an outlet, and only loaded from the xib.
The confusing behaviour was that the presence of that floating, semi-live SKView had no effect on most SpriteKit behaviour, until there was a presentScene(_:transition:) call.
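The cycle itself is easy to reproduce in isolation. A self-contained sketch (class names are invented, not Touchgram's real types) showing that a stored closure capturing self keeps its owner alive, while [weak self] breaks the cycle:

```swift
// Swatch stores a closure, the way a PaintCode-style drawing view
// stores its `drawer`.
final class Swatch {
    var drawer: (() -> String)?
}

final class Editor {
    let swatch = Swatch()
    var hasThickness = true
    deinit { print("Editor deinit") }

    func bindStrongly() {
        // Retain cycle: self -> swatch -> drawer -> self
        swatch.drawer = { self.hasThickness ? "thick" : "none" }
    }

    func bindWeakly() {
        // No cycle: the closure holds self only weakly
        swatch.drawer = { [weak self] in (self?.hasThickness ?? false) ? "thick" : "none" }
    }
}

weak var strongProbe: Editor?
weak var weakProbe: Editor?

do {
    let editor = Editor()
    editor.bindStrongly()
    strongProbe = editor
}
print(strongProbe == nil ? "strong capture: released" : "strong capture: LEAKED")

do {
    let editor = Editor()
    editor.bindWeakly()
    weakProbe = editor
}
print(weakProbe == nil ? "weak capture: released" : "weak capture: LEAKED")
```

The strongly-bound editor is never deallocated; only the weakly-bound one prints its deinit.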

Related

Open window in macOS application in modal mode [duplicate]

I have an NSStatusItem that is properly displaying in the MenuBar. One of the items (when clicked) displays a modal NSWindow from my application, which is meant to perform a one-off task, then disappear. (Eg. the user enters a small bit of text, clicks "Save", and the modal NSWindow goes away.)
The issue occurs when the application is running in the background. The modal window properly appears above whatever application is running in the foreground, but when the user clicks the "Save" button, the rest of the application's windows also are made active. This is undesirable, as the user then has to click back to whatever app they were using. (Destroying the convenience of the NSStatusItem.) I'm displaying the modal window using:
[myWindow setFrame:finalRect display:YES animate:NO];
[myWindow setLevel:NSPopUpMenuWindowLevel];
[NSApp runModalForWindow:myWindow];
Is there any way to prevent clicks/events in my popup window from causing the rest of the application to become active? Or a way to let NSApp know that this particular panel shouldn't automatically activate the rest of the app? Thanks!
Instead of creating an NSWindow, create an NSPanel with the style NSNonactivatingPanelMask. You can then do the usual makeKeyAndOrderFront: and orderOut: to show/hide panel as needed.
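A rough Swift sketch of that approach (untested here, and assumes you keep a strong reference to the panel somewhere, e.g. in your status item controller):

```swift
import AppKit

// A non-activating panel shown from a status item action.
// .nonactivatingPanel means clicks in the panel do not activate the app.
let panel = NSPanel(
    contentRect: NSRect(x: 0, y: 0, width: 300, height: 120),
    styleMask: [.titled, .nonactivatingPanel],
    backing: .buffered,
    defer: false
)
panel.level = .popUpMenu          // float above other apps, like a menu
panel.makeKeyAndOrderFront(nil)   // show without runModalForWindow:

// later, when the one-off task is done:
panel.orderOut(nil)
```

Because the panel never activates the app, dismissing it leaves the foreground application untouched.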
NSApp's beginModalSessionForWindow:, runModalSession:, and endModalSession: are the methods you need.
Have a look here for example how to use it:
Creating a fully customized NSAlert
A solution by Ken Thomases on the cocoa-dev list a couple years ago looks applicable here too:
[[NSApplication sharedApplication] hide:self];
[[NSApplication sharedApplication] performSelector:@selector(unhideWithoutActivation)
                                        withObject:nil
                                        afterDelay:0.05];
In theory, this tells the application to hide itself and then unhide at the bottom of the window stack.
You could also intercept the mouse click event and use [NSApp preventWindowOrdering].
You can try something like:
...
if ([NSApp isHidden])
    [myWindow makeKeyAndOrderFront:self];
else
    [NSApp runModalForWindow:myWindow];
...
and when finished:
...
if ([NSApp isHidden])
    [myWindow orderOut:self];
else
    [NSApp stopModal];
...

VoiceOver ignores visible views, and speaks accessibilityLabels of hidden views

I have a UIView that can contain one of two views. When I removeFromSuperview the first view and addSubview the second view, I can still hear the accessibilityLabel of the hidden view. Only after 1-2 seconds do I hear the correct accessibilityLabel.
It seems to be a common situation: when the hidden state of a view is changed, accessibility can get confused, still speaking hidden views and not noticing visible ones.
Also, if a UIButton in a UITableViewCell is hidden and then its hidden state changes to NO, VoiceOver ignores it as if it were still hidden. Only a manual implementation of the UIAccessibilityContainer protocol for the cell resolves this problem.
No notifications solve this issue. Even playing with accessibilityElementsHidden did not help. I have been struggling with this for several days.
Can you recommend any way to tell Accessibility that the hierarchy of views has changed?
You can post a UIAccessibilityScreenChangedNotification or UIAccessibilityLayoutChangedNotification to alert UIAccessibility that the view changed. Since you didn't post any code, I can only give you a generic example, e.g.:
UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, accessibilityElement)
...where "accessibilityElement" would be a button, text field, or other accessibility element that VoiceOver should switch to next.
Reference: UIKit Reference
Just ran into this myself with a third party side menu library and had to use accessibilityElementsHidden to fix it. I first tried leveraging the accessibilityViewIsModal property, but that only works on sibling views.
#pragma mark - IIViewDeckControllerDelegate

- (void)viewDeckController:(IIViewDeckController *)viewDeckController didOpenViewSide:(IIViewDeckSide)viewDeckSide animated:(BOOL)animated
{
    if (viewDeckSide == IIViewDeckLeftSide) {
        [self.topViewController.view endEditing:YES];
        self.viewDeckController.leftController.view.accessibilityElementsHidden = NO;
    }
}

- (void)viewDeckController:(IIViewDeckController *)viewDeckController didCloseViewSide:(IIViewDeckSide)viewDeckSide animated:(BOOL)animated
{
    self.viewDeckController.leftController.view.accessibilityElementsHidden = YES;
}

UI Automation: which delegate methods are called when scrolling in a scrollview

I'm new to UI Automation, introduced in iOS 4. I'm scripting a test which requires scrolling in a scrollview.
So UIAScrollView has the following methods:
scrollUp
scrollDown
scrollLeft
scrollRight
scrollToElementWithName
scrollToElementWithPredicate
scrollToElementWithValueForKey
I want to know which UIScrollView delegate methods are invoked in the app when using these functions in my script.
UI Automation has nearly nothing to do with your application; in fact, you can even run automation for apps that aren't yours (you just have to know the accessibility label for each element). Therefore it should trigger the normal UIScrollViewDelegate methods, which include:
Responding to Scrolling and Dragging
– scrollViewDidScroll:
– scrollViewWillBeginDragging:
– scrollViewDidEndDragging:willDecelerate:
– scrollViewShouldScrollToTop:
– scrollViewDidScrollToTop:
– scrollViewWillBeginDecelerating:
– scrollViewDidEndDecelerating:
Managing Zooming
– viewForZoomingInScrollView:
– scrollViewWillBeginZooming:withView:
– scrollViewDidEndZooming:withView:atScale:
– scrollViewDidZoom:
Responding to Scrolling Animations
– scrollViewDidEndScrollingAnimation:
Simply implement them all in your application and NSLog() which ones get called.
Not sure if that's what you wanted to know.

becomeFirstResponder without hiding Keyboard

I have a view that supports copy and shows the edit menu using the following code:
if ([self becomeFirstResponder]) {
    // bring up edit menu.
    UIMenuController *theMenu = [UIMenuController sharedMenuController];
    [theMenu setTargetRect:[self _textRect] inView:self];
    [theMenu setMenuVisible:YES animated:YES];
}
The problem is that when becomeFirstResponder gets called, the keyboard gets hidden. A good example of the correct behavior is in the SMS app: double-tap a message while the reply box is visible and the reply box loses focus, but the keyboard stays in place. Also, when the bubble is deselected, the reply box regains focus.
Unfortunately, Apple can do a lot of things that are not available to third-party apps.
I believe what you want is possible in iOS 3.2+ if you make the view that is to become the first responder accept keyboard input. You do that by having your view class adopt the UIKeyInput protocol:
A subclass of UIResponder can adopt this protocol to implement simple text entry. When instances of this subclass are the first responder, the system keyboard is displayed.
The protocol consists of 3 required methods that you have to implement. In your case, you would probably apply the inputs you receive in these methods to your text field and make it the first responder again. I haven't tried this but it should work.
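Sketched in Swift under the same untested assumption (the class and the forwardTo property are hypothetical; hasText, insertText(_:), and deleteBackward() are UIKeyInput's three required members):

```swift
import UIKit

// Hypothetical copyable view that keeps the keyboard up by accepting
// key input itself and forwarding it to the text field that had focus.
class CopyableLabel: UIView, UIKeyInput {
    weak var forwardTo: UITextField?   // assumed: the field to receive typing

    override var canBecomeFirstResponder: Bool { true }

    // UIKeyInput - adopting this means the system keyboard stays on
    // screen while this view is first responder.
    var hasText: Bool { forwardTo?.hasText ?? false }
    func insertText(_ text: String) {
        forwardTo?.insertText(text)    // pass typing back to the real field
    }
    func deleteBackward() {
        forwardTo?.deleteBackward()
    }
}
```

The idea is that becoming first responder no longer dismisses the keyboard, and any keystrokes are relayed to the original field.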

hold-to-copy in uiwebview subclass

For my app, I subclassed UIWebView (the method described here http://ryan-brubaker.blogspot.com/2009/01/iphone-sdk-uiwebview.html).
I did this so that I could intercept touch events; when I detect certain types of taps, I perform the corresponding custom action, and then pass the event along to the underlying UIWebView.
So for example I can doubletap the view to make a toolbar appear/disappear, but a single tap on a link works the same as a regular UIWebView.
Under 3.0, everything works just the same as it did under 2.2.1 (my doubletap + the standard single-tap and scroll actions), but hold-to-copy does not.
I thought perhaps there was something new in UIResponder that I had to override, but as far as I can tell it's the same.
Any clues?
You shouldn't have to do anything special, as long as you're passing all the touch events through. It's certainly possible to disable that functionality using -webkit-user-select: none; in your CSS file.
<style>
    body { -webkit-user-select: none; }
</style>