SwiftUI - Making a View focusable and triggering an action on macOS

How is it possible to make a View focusable on macOS? I tried it like below, but the action is never triggered. I know that for an NSView you have to override acceptsFirstResponder to return true, but I can't find anything similar in SwiftUI.
Is this still a beta-related bug, or missing functionality on macOS?
struct FocusableView: View {
    var body: some View {
        Group {
            Text("Hello World!")
                .padding(20)
                .background(Color.blue.cornerRadius(8))
                .focusable(true) { isFocused in
                    print("Focused", isFocused)
                }
        }
    }
}
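For comparison, the AppKit mechanism mentioned in the question (acceptsFirstResponder) can be reproduced by dropping down to NSViewRepresentable. This is only a sketch of that fallback, with made-up type names (FocusableNSView, KeyView), not a SwiftUI-native answer:
import SwiftUI

struct FocusableNSView: NSViewRepresentable {
    // KeyView is an illustrative NSView that opts into first-responder status.
    final class KeyView: NSView {
        override var acceptsFirstResponder: Bool { true }

        override func becomeFirstResponder() -> Bool {
            print("Focused", true)
            return super.becomeFirstResponder()
        }

        override func resignFirstResponder() -> Bool {
            print("Focused", false)
            return super.resignFirstResponder()
        }
    }

    func makeNSView(context: Context) -> KeyView { KeyView() }
    func updateNSView(_ nsView: KeyView, context: Context) {}
}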

I think I found a workaround for the problem of shifting focus to a SwiftUI view.
I am working on macOS Catalina 10.15, Swift 5, Xcode 11.1.
The problem is shifting focus to a SwiftUI view. I can imagine that this has not been fully implemented yet.
The onFocusChange closure of a focusable view is only called, and the view only becomes focused, once focus is shifted to that view from within the SwiftUI framework.
My intention was: when I click into the SwiftUI view, I want the onFocusChange closure to execute and the view to become focused (for subsequent paste commands).
My SwiftUI view is built into the Window using an NSHostingView subclass - ImageHostingView.
When I add this ImageHostingView to the view hierarchy I define its previous key view:
theImageHostingView.superview?.nextKeyView = theImageHostingView
I added a mouseDown handler to the ImageHostingView:
override dynamic func mouseDown(with event: NSEvent) {
    if let theWindow = self.window, let theView = self.previousValidKeyView {
        theWindow.selectKeyView(following: theView)
    }
}
Calling the NSWindow's selectKeyView(following:) triggers a focus change within the SwiftUI framework to the desired view (the root view of ImageHostingView).
This is clearly a workaround, and it only works for final views that are represented by an NSHostingView. But it highlights the problem (shifting focus to a SwiftUI view upon a certain action), and it might be helpful in many cases.
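Putting the pieces together, a minimal sketch of such an NSHostingView subclass could look like this (the AnyView root type and the viewDidMoveToSuperview placement are assumptions, not taken from the original project):
final class ImageHostingView: NSHostingView<AnyView> {
    // The answer wires up the key-view loop at the point where the view is
    // added to the hierarchy; doing it here is one convenient place for that.
    override func viewDidMoveToSuperview() {
        super.viewDidMoveToSuperview()
        superview?.nextKeyView = self
    }

    // Clicking into the view asks the window to move key focus to it, which
    // makes SwiftUI call the focusable root view's onFocusChange closure.
    override dynamic func mouseDown(with event: NSEvent) {
        if let theWindow = self.window, let theView = self.previousValidKeyView {
            theWindow.selectKeyView(following: theView)
        }
    }
}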

The only way I've been able to get this to work is by checking the "Use keyboard navigation to move focus between controls" checkbox in the Keyboard pane of System Preferences.

Related

Pass through clicks/taps to underlying UIViewRepresentable

I have a UIViewRepresentable that I am using to blend a project created with UIViews into SwiftUI.
My views are full screen and my issue is that I cannot find a way to pass touches through the transparent UINavigationBar at the top of the screen to my underlying UIView. That means that the top 100 or so pixels don't respond to touch even though they're visible. I have seen this post and solution, but it does not work in my case. I believe it's because I'm using UIViewRepresentable.
For my content view I have:
struct ContentView: View {
    let values: [String] = ViewType.allCases.map { "\($0.rawValue)" }

    var body: some View {
        NavigationView {
            List {
                ForEach(values, id: \.self) { value in
                    NavigationLink(value, destination: CustomView(viewType: value).ignoresSafeArea())
                }
                .navigationTitle("Nav Title")
            }
        }
        .accentColor(.white)
    }
}
struct CustomView: UIViewRepresentable {
    let viewType: String

    // CustomUIView stands in here for the project's underlying UIKit view,
    // which also happens to be called CustomView in the original code.
    func makeUIView(context: Context) -> CustomUIView {
        CustomUIView(view: viewType)
    }

    func updateUIView(_ view: CustomUIView, context: Context) {
        view.setNeedsDisplay() // rather than calling draw(_:) directly
    }
}
Here is an image of my view hierarchy that shows the UINavigationBar blocking the top 100 or so pixels of my underlying UIView and preventing touches from being registered by the view.
I know I can disable the navigation bar and that does work, but I do need the back button. Ideally, I'd like to understand how I can apply the solution I linked to above, which passes touches through unless there is a UIControl in the way.
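Solutions of that kind typically override hitTest(_:with:) on a UINavigationBar subclass so that touches landing on the empty bar fall through while controls such as the back button still receive them. Below is only a sketch of that idea with an illustrative class name; how to install such a subclass underneath SwiftUI's NavigationView is exactly the part that remains open here.
import UIKit

class PassThroughNavigationBar: UINavigationBar {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        guard let hitView = super.hitTest(point, with: event) else { return nil }
        // Keep touches that land on a UIControl (e.g. the back button);
        // let everything else fall through to the views underneath the bar.
        return hitView is UIControl ? hitView : nil
    }
}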
UPDATE
I tried Asperi's suggestion of using transparency and then disabling the navigation bar, and it occurred to me to check the view debugger with it disabled.
It turns out that the back button is still there in the view hierarchy, but it is not visible in the app when the nav bar is disabled. If there were a way to keep the nav bar disabled but enable the button, that would be the ideal situation.

macOS: Commit TextField when clicking outside

From a UX point, the goal is to commit the input when people click anywhere outside the TextField.
For example, if the input is for renaming an item, then when clicked outside, we store the input as the new item name, and replace the TextField with a Text to show the new item name.
I assume it is expected standard behavior, but please let me know if it goes against Apple's macOS standards.
Is there a standard / conventional way to achieve it with SwiftUI, or with some AppKit workaround?
As I see it, onCommit is only triggered when we hit the Return key, not when we click outside the TextField. So what I think I need to figure out is how to detect clicking outside.
What I considered:
1. Some built-in view modifier on the TextField which activates this behavior, but I couldn't find any.
2. Detect focus loss or editingChanged of the TextField, but by default, when we click on the background / a button / a text, the TextField doesn't lose focus, nor is the editingChanged event triggered.
3. If the TextField is focused, add an overlay on the ContentView with an onTapGesture modifier, but then it would take two taps to trigger a button while a TextField is focused.
4. Add an onTapGesture modifier on the ContentView, which calls a focusOut method, but it doesn't receive the event when people tap on a child view that also has an onTapGesture on it.
5. Improving on 4, also call focusOut from the onTapGesture callback of all child views. So far this is the only viable option I see, but I'm not sure it is a good pattern to put extra code in every onTapGesture just to customize TextField behavior.
Example code:
import SwiftUI

@main
struct app: App {
    @State private var inputText: String = ""

    var body: some Scene {
        WindowGroup {
            ZStack {
                Color.secondary.onTapGesture { print("Background tapped") }
                VStack {
                    Text("Some label")
                    Button("Some button") { print("Button clicked") }
                    TextField(
                        "Item rename input",
                        text: $inputText,
                        onCommit: { print("Item rename commit") }
                    )
                }
            }
        }
    }
}
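One way to experiment with options 2 and 4 combined (an assumption, not an established standard): a tap on the background forces the field to resign first responder, and the focus loss reported by onEditingChanged is then treated as the commit. The RenameField and committedName names and the makeFirstResponder(nil) call are my own choices, not from the question.
import AppKit
import SwiftUI

struct RenameField: View {
    @State private var inputText: String = ""
    @State private var committedName: String = "Untitled"

    var body: some View {
        ZStack {
            Color.secondary
                .onTapGesture {
                    // Force whatever is currently editing to give up focus.
                    _ = NSApp.keyWindow?.makeFirstResponder(nil)
                }
            VStack {
                Text("Current name: \(committedName)")
                TextField(
                    "Item rename input",
                    text: $inputText,
                    onEditingChanged: { editing in
                        if !editing {
                            // Loss of focus: store the input as the new item name.
                            committedName = inputText
                        }
                    }
                )
            }
        }
    }
}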

How to disable lazy loading in NSTabViewController?

I am designing a SwiftUI wrapper for NSTabViewController with the toolbar style. I want it to be a drop-in replacement for TabView. TabView uses a modifier tabItem(_:) to specify the tab name and icon. So I designed a similar modifier for my own ToolbarTabView:
extension View {
    func toolbarTabItem(_ label: LocalizedStringKey, nsImage: NSImage? = nil, tooltip: LocalizedStringKey? = nil) -> some View {
        self.preference(key: ToolbarTabItemPreferenceKey.self,
                        value: ToolbarTabItemPreference(label: label, nsImage: nsImage, tooltip: tooltip))
    }
}
I wrap each View in an NSHostingController and create an NSTabViewItem. Then I use onPreferenceChange to set the NSTabViewItem's label and image properties. Finally, I have an NSViewControllerRepresentable that passes my array of NSTabViewItems to an NSTabViewController. This all works well except for the following issue.
By design, NSTabViewController only loads its first tab. That loads the first NSHostingController, which lays out the first View, which in turn calls onPreferenceChange and sets the label for the first tab. However, the remaining tabs are not loaded, and therefore their labels remain unset.
I know that I can redesign my API to pass in the labels and images explicitly, and that works, but then how does Apple implement TabView? They must have the same issue with lazily loaded views, because the macOS implementation of TabView looks like an NSTabViewController.
I think a workaround would be to force all the tabs to load, which is the title of this question, but I am open to other ideas as well.
Reference:
https://github.com/utmapp/UTM/blob/dev/Platform/macOS/ToolbarTabView.swift
https://github.com/utmapp/UTM/blob/dev/Platform/macOS/ToolbarTabViewController.swift
Here is the dumb workaround I came up with:
public class UTMTabViewController: NSTabViewController {
    public override func viewDidAppear() {
        super.viewDidAppear()
        for i in self.tabViewItems.indices {
            self.selectedTabViewItemIndex = i
        }
        self.selectedTabViewItemIndex = 0
    }
}
Basically I force-load every tab once the view appears. I really hope there's a better answer than this, but I'll leave it here just in case.
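A possibly gentler variant of the same idea (an untested assumption, not a verified fix): instead of cycling the selection, touch each tab's view so its controller loads without visibly switching tabs.
public class UTMTabViewController: NSTabViewController {
    public override func viewDidAppear() {
        super.viewDidAppear()
        for item in tabViewItems {
            // Accessing .view forces loadView() on any not-yet-loaded
            // NSHostingController, which should let its SwiftUI content
            // publish the toolbarTabItem preference for the label/image.
            _ = item.viewController?.view
        }
    }
}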

SwiftUI onMoveCommand actions aren't executed

I'm making a macOS app with SwiftUI, and I would like to offset a SwiftUI view using the arrow keys on the built-in keyboard.
I couldn't find many resources online, but onMoveCommand() appears to be the event handler I need. Upon trying it out, I discovered that the action I specified for onMoveCommand() does not appear to be executed. Here's some code I wrote just to test it out:
struct ContentView: View {
    var body: some View {
        Text("Hello")
            .onAppear {
                print("Appeared!")
            }
            .onMoveCommand { direction in
                print("Moved!")
            }
            .onTapGesture {
                print("Tapped!")
            }
    }
}
onMoveCommand() does not print "Moved!" when I press the arrow keys; instead the error alert sound plays and nothing is printed. onAppear() successfully prints "Appeared!" when the view appears, and onTapGesture() prints "Tapped!" whenever I click the text. This seems to tell me that the basic syntax for these view events is correct, but that I implemented onMoveCommand() incorrectly.
For now I only want my app to print something to the Xcode console when the arrow keys are pressed, and to be able to distinguish which arrow key was pressed. Can someone please explain what I did wrong?
Keyboard events are handled only by the view in focus, so the fix is:
var body: some View {
    Text("Hello")
        .focusable() // << here !!
        .onAppear {
            print("Appeared!")
        }
        .onMoveCommand { direction in
            print("Moved!")
        }
        .onTapGesture {
            print("Tapped!")
        }
}
Tested with Xcode 11.4 / macOS 10.15.4. Make sure you have turned on keyboard navigation in System Preferences.
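Since the question also asks how to tell which arrow key was pressed, here is a small sketch (the ArrowKeyView name is illustrative) that switches over the MoveCommandDirection value passed to onMoveCommand:
import SwiftUI

struct ArrowKeyView: View {
    var body: some View {
        Text("Hello")
            .focusable()
            .onMoveCommand { direction in
                switch direction {
                case .up:    print("Up arrow")
                case .down:  print("Down arrow")
                case .left:  print("Left arrow")
                case .right: print("Right arrow")
                default:     break // future directions, if any
                }
            }
    }
}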

NSSearchField and NSSegmentedControl inside an NSToolbar

The setup:
I'm writing a document-based app targeting OS X 10.11 using storyboards. The main window has an NSToolbar with a 3-segment NSSegmentedControl. When the segmented control is clicked, it should toggle the collapsed state of an NSSplitViewItem in a horizontal or vertical NSSplitView. The behavior I'm trying to achieve is the same as in Xcode 7, where a segmented control in the toolbar shows/hides the Navigator/Debug Area/Utilities views.
Currently the segmented control sends an action to the first responder. The action method is implemented by an NSSplitViewController subclass, which then toggles its NSSplitViewItem's collapsed state.
The problem:
The issue is that the toolbar also contains an NSSearchField. If the NSSearchField has focus, or even if the segmented control itself has focus, clicking on the NSSegmentedControl with the cursor does not result in the action method correctly making its way up the responder chain to the NSSplitViewController subclass.
Attempted solutions:
Previously I worked around this issue using notifications instead of target/action, but it ended up being too convoluted. Another idea is to send the message to the window controller, which would then pass it to its content view controller, which would pass it to the vertical split view controller, which would then send the message (if needed) to the horizontal split view controller. While I know this would work, it seems like an ugly solution to add code to two additional files that merely pass a message along; I thought this was exactly what using the responder chain avoided.
Any insights would be greatly appreciated.
Final solution:
I realized that wiring the segmented control's action up to the first responder only makes sense if the key view context matters. In this case the segmented control should toggle the collapsed state of split view items in multiple nested split views, regardless of which view is key.
Define an enumeration to represent areas of the split view:
enum SplitViewArea: Int {
    // The raw values must match the order of the segmented control
    case left, top, right
}
Define a protocol to communicate that a split view area should be toggled:
protocol SplitViewTogglable {
    func toggleSplitViewItem(matching area: SplitViewArea)
}
Implement the segmented control action method in the window controller:
@IBAction func segmentedControlSelectionStateDidChange(_ sender: Any) {
    guard let segmentedControl = sender as? NSSegmentedControl else { return }
    guard let area = SplitViewArea(rawValue: segmentedControl.selectedSegment) else { return }
    guard let togglable = contentViewController as? SplitViewTogglable else { return }
    togglable.toggleSplitViewItem(matching: area)
}
Implement the SplitViewTogglable protocol's method in the NSSplitViewController subclass:
func toggleSplitViewItem(matching area: SplitViewArea) {
    switch area {
    case .left:
        leftSplitViewItem.isCollapsed = !leftSplitViewItem.isCollapsed
    case .top:
        // Nested NSSplitViewController that adopts SplitViewTogglable
        if let togglable = centerSplitViewItem.viewController as? SplitViewTogglable {
            togglable.toggleSplitViewItem(matching: area)
        }
    case .right:
        rightSplitViewItem.isCollapsed = !rightSplitViewItem.isCollapsed
    }
}
Is the NSSplitViewController set as the contentViewController of the window?
As part of the responder chain search for an action target, the window will consider its contentViewController as a supplemental target if it responds to the action selector.
When the search field has key focus the responder chain does not go through the normal content area, and instead goes through the toolbar to the window. So the only way the NSSplitViewController could be a part of that search is to be the contentViewController.
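If the split view controller is reachable that way, nothing more is needed; a quick sanity check in the window controller can confirm the wiring (the MainWindowController name is illustrative, not from the question):
import AppKit

class MainWindowController: NSWindowController {
    override func windowDidLoad() {
        super.windowDidLoad()
        // Verify the split view controller really is the contentViewController,
        // so it can serve as the window's supplemental action target.
        assert(window?.contentViewController is NSSplitViewController,
               "Actions sent to the first responder will not reach the split view controller")
    }
}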