Suggestion chips for display devices - actions-on-google

We're developing an Actions on Google project for smart display devices, and we're finding the behavior of suggestion chips rather unintuitive. They fade away after some period of time, which I think could confuse users. On a phone the chips stick around. Are there any settings that can alter this behavior?

No, there is no way for a developer to change how long a suggestion chip stays visible. The behavior on a given surface is fixed and cannot be overridden.

Related

Pixel picking in Swift

I am trying to create a feature (for a macOS 11+ application) where I pick a pixel from the screen (e.g. like Digital Colour Meter or any other color picker). However, I am having a hard time finding a way to do that.
I tried to "reverse engineer" some applications from the App Store, and when I go to pick a pixel, the apps ask me for permission to record the screen. So I guess they are "recording the screen" and extracting the pixel from a movie/picture that they capture on the spot. However, this solution seems a bit overkill and I think there should be a better way (a good example is Digital Colour Meter, which does not ask for permissions).
Do you think this is achievable in a somewhat easy and clean manner? If not, is my guess right about "recording the screen" and capturing a pixel from within the clip (in real time, of course)?
NSColorSampler was introduced in macOS 10.15 (Catalina). You can use it to sample a color from the screen using the system's built-in color picking interface. It does not require screen-recording permissions.
It's mentioned around the 5-minute mark of WWDC 2019 Session 210, "What's New in AppKit for macOS".
I would probably wrap it, and take a hybrid approach where I use old methods (with the screen permission prompt) on old versions, and NSColorSampler for new versions.
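For reference, here is a minimal sketch of how NSColorSampler can be used (assuming macOS 10.15+ and AppKit; the wrapper class and callback names are just illustrative):

```swift
import AppKit

// Minimal sketch: NSColorSampler shows the system magnifier loupe and hands
// back the picked color, without a screen-recording permission prompt.
final class PixelPicker {
    // Keep a strong reference while the sampler is active.
    private let sampler = NSColorSampler()

    func pickPixelColor(completion: @escaping (NSColor?) -> Void) {
        sampler.show { selectedColor in
            // selectedColor is nil if the user cancels (e.g. presses Esc).
            completion(selectedColor)
        }
    }
}

// Usage, e.g. from a button action in a view controller:
// picker.pickPixelColor { color in
//     guard let color = color else { return }
//     print("Picked:", color)
// }
```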

Built game looks duller than what is seen in the editor

I built my Unity game through Xcode onto an iOS device for testing, and the colors look duller and more pastel-like than in the Game view in the editor. The exact same thing has also happened on a modern Android device. How can I make the colors in the built game better reflect the colors in the editor?
EDIT: I have sent a file to my phone and found out that the phone perceives colors differently than my monitor. I'm sorry if I'm asking for quite a bit here, but... is there any way to make the game look as I intend it to? The fact that I never see exactly how my game will turn out seems kinda... awkward. Thanks.
I'm not sure if you've already tried this, but you might want to use Unity Remote with your iOS device so you can check the color differences in real time instead of having to build the project every time.
Other than what the comments said, it sounds like you should try adjusting your monitor's display settings so the colors in the editor correlate better with those in the built game.

Is pinch for font size changes in tableview ok re usability?

It seems like a good idea from my perspective, but I'm wondering whether users would find it OK, or whether it breaks an iPhone best practice.
I read through the Human Interface Guidelines for table views and didn't see anything explicit there. In the Direct Manipulation section it says:
people can use the pinch gestures to directly expand or contract an area of content
If the user is pinching to change all of the text in a table view, that doesn't really seem like proper "direct manipulation" to me. It seems more like a global settings change triggered by a generic gesture. I think the real decision comes down to why you are planning to offer this gesture to the user. Is there a common use case in your app where the user would want to adjust the font size often? Will they change it more than once per session?
I know it is nice to offer lots of features, but extra features tend to obscure the important ones. It may confuse users if they accidentally pinch, the text size changes, and they don't make the connection.
Overall, I don't think it breaks an explicit rule, but I would be really careful about deciding to add this "feature." If there is a really good reason to do it, I would say go for it; otherwise, it probably isn't worth the risk of getting rejected from the store and/or confusing the user.
I vote no if it's key to the functionality of your app. You're having to train people to do something that may not be intuitive for them to realize is there. If you look at the official Twitter app, they do something similar to expand the content in a cell, but you could totally live without that capability if you didn't know it was there.
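If you do decide to try it, a minimal sketch of the idea (assuming a UITableViewController whose cells read a fontSize property; the class name, row content, and size limits are just illustrative):

```swift
import UIKit

// Illustrative sketch: adjust the table view's font size with a pinch gesture.
final class ArticleListViewController: UITableViewController {
    private var fontSize: CGFloat = 17

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.register(UITableViewCell.self, forCellReuseIdentifier: "Cell")
        let pinch = UIPinchGestureRecognizer(target: self,
                                             action: #selector(handlePinch(_:)))
        tableView.addGestureRecognizer(pinch)
    }

    @objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        guard gesture.state == .changed || gesture.state == .ended else { return }
        // Clamp the size so an accidental pinch can't make the text unreadable.
        fontSize = min(max(fontSize * gesture.scale, 12), 32)
        gesture.scale = 1 // reset so each callback applies an incremental change
        tableView.reloadData()
    }

    override func tableView(_ tableView: UITableView,
                            numberOfRowsInSection section: Int) -> Int { 20 }

    override func tableView(_ tableView: UITableView,
                            cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
        cell.textLabel?.font = .systemFont(ofSize: fontSize)
        cell.textLabel?.text = "Row \(indexPath.row)"
        return cell
    }
}
```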

User-friendly way to ask about locking the iPhone screen

I have an app where it will be a setting whether or not to allow the screen to auto-lock during idle periods.
My question: what's the user-friendly (and user-understandable) way to present the option? It'll be in a UITableView with a UISwitch, so it'll look somewhat like:
What's the best language to use?
Keep screen on
Prevent auto-lock
Is there a common practice to adhere to?
Something else?
I certainly like "Disable Auto-Lock" with a UISwitch, like you have in your screenshot. I've seen this used in a number of apps, and it makes sense to me.
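For reference, the setting itself is usually implemented with UIApplication's isIdleTimerDisabled flag. A minimal sketch (assuming a UISwitch labeled "Disable Auto-Lock"; the class, defaults key, and method names are just illustrative):

```swift
import UIKit

// Illustrative sketch: a "Disable Auto-Lock" switch in a settings screen.
final class SettingsViewController: UIViewController {
    private let autoLockSwitch = UISwitch()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Restore the saved preference (the key name is just an example).
        autoLockSwitch.isOn = UserDefaults.standard.bool(forKey: "disableAutoLock")
        autoLockSwitch.addTarget(self, action: #selector(toggled(_:)),
                                 for: .valueChanged)
        applySetting()
    }

    @objc private func toggled(_ sender: UISwitch) {
        UserDefaults.standard.set(sender.isOn, forKey: "disableAutoLock")
        applySetting()
    }

    private func applySetting() {
        // When true, iOS will not auto-lock the screen while the app is frontmost.
        UIApplication.shared.isIdleTimerDisabled = autoLockSwitch.isOn
    }
}
```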

Initial iPhone virtual keyboard display is slow for a UITextField. Is this hack around required?

I have an app with a UITextField, amongst other things. When the user first taps on the text field, there is a noticeable delay before the virtual keyboard appears. On a 3GS it isn't too obvious, but on an older iPhone the delay can be around 1 second. After that the keyboard always pops up instantly. The delay is only the first time the keyboard pops up after app startup.
It looks like the initial UIKeyboard instantiation takes some time (quite a bit...) but is kept around after that.
I found very little information about this, which surprised me. However I did find this write up of the issue along with a hack-around solution.
http://blog.weareuproar.com/preloading-the-uikeyboard
My question is: is this hack-around the only available solution? Is there a way to signal the framework (e.g. via the Info.plist?) to instantiate the keyboard on startup?
No, there is no other (documented) way to do that, and even Apple's built-in apps (such as Maps) suffer from the same problem. You can either go with the hack you linked to, or follow Apple's advice not to load things before you really need them. By the way, this isn't much of an issue anymore on the iPhone 3GS and the new iPod touch; the newer, faster devices load the keyboard almost instantly.
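For completeness, the linked hack-around boils down to briefly making a throwaway text field the first responder at launch so UIKit builds the keyboard machinery early. A minimal sketch of that idea (shown in modern Swift with illustrative names; the original post targets older Objective-C):

```swift
import UIKit

// Illustrative sketch: force UIKit to create the keyboard at launch so the
// first real tap on a text field doesn't pay the instantiation cost.
func preloadKeyboard(in window: UIWindow) {
    let dummyField = UITextField(frame: .zero)
    window.addSubview(dummyField)
    // Becoming first responder instantiates the keyboard...
    dummyField.becomeFirstResponder()
    // ...and resigning immediately hides it again before the user sees it.
    dummyField.resignFirstResponder()
    dummyField.removeFromSuperview()
}

// Called from the app delegate, e.g.:
// func application(_ application: UIApplication,
//                  didFinishLaunchingWithOptions launchOptions:
//                      [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
//     if let window = window { preloadKeyboard(in: window) }
//     return true
// }
```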