Difference between Custom Intents and App Intents - swift

In an iOS app built with SwiftUI in Xcode, can someone explain the difference between custom intents and App Intents, and point me to documentation for the type of intent needed to create a shortcut from an installed app with the same functionality as the manual "repeat text" shortcut I created on my phone, which listens for speech and repeats it back through Siri?
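For context, the App Intents framework (iOS 16+) is the modern, Swift-only successor to custom SiriKit intents, which were defined in an .intentdefinition file. A minimal sketch of an App Intent that speaks text back through Siri; the type names, parameter, and phrase below are illustrative, not taken from any documentation:

```swift
import AppIntents

// A minimal App Intent (iOS 16+). Siri speaks the returned dialog aloud.
struct RepeatTextIntent: AppIntent {
    static var title: LocalizedStringResource = "Repeat Text"

    @Parameter(title: "Text")
    var text: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Returning a dialog makes Siri read the text back to the user.
        return .result(dialog: "\(text)")
    }
}

// Exposing the intent as an App Shortcut makes it appear in the
// Shortcuts app automatically, without the user building it manually.
struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: RepeatTextIntent(),
            phrases: ["Repeat text in \(.applicationName)"]
        )
    }
}
```

Note that listening for speech is a separate concern: the intent receives its text as a parameter (Siri prompts for it), so continuous listening would still need the Speech framework.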

Related

How do I tell when Siri opened my watchOS app without using Siri Intents?

Basically the title. It's possible to open apps just by telling Siri "Open [app name]" without giving your app an entitlement or adding intents, so I was wondering whether the app can detect that it has been opened by Siri, using something like onContinueUserActivity.
I've looked all over the internet to see what types of activities open the app and trigger onContinueUserActivity, but documentation is scarce.
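One thing worth knowing while experimenting: onContinueUserActivity only fires for activity types you declare under the NSUserActivityTypes key in Info.plist, and there is no wildcard, so each candidate type has to be listed explicitly. A sketch for probing what (if anything) arrives on launch; the activity type string is a placeholder:

```swift
import SwiftUI

struct ContentView: View {
    var body: some View { Text("Hello") }
}

@main
struct MyApp: App {
    @Environment(\.scenePhase) private var scenePhase

    var body: some Scene {
        WindowGroup {
            ContentView()
                // Fires only for types listed in Info.plist under
                // NSUserActivityTypes; there is no catch-all.
                .onContinueUserActivity("com.example.open") { activity in
                    print("Launched via activity:", activity.activityType)
                }
        }
        .onChange(of: scenePhase) { phase in
            // A plain "Hey Siri, open <app>" may deliver no activity at
            // all, in which case this is the only hook that fires.
            print("Scene phase:", phase)
        }
    }
}
```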

Implement SiriKit/Google Assistant in Flutter App

I haven't been able to find any resources on integrating SiriKit in a Flutter app. I checked out the "Writing custom platform-specific code" guide, but I'm not sure how to apply it to SiriKit. I understand the concept, but do I need to create custom intents, or just provide platform channels (Android/iOS)?
What I want to do is have the user use Siri or Google Assistant to open the app, which will then invoke an action (starting a video call). Is this possible?
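If platform channels turn out to be the answer, the iOS half is ordinary Swift in the Flutter app delegate. A sketch of that native side; the channel and method names are made up for illustration, and the Dart side would create a matching MethodChannel:

```swift
import Flutter
import UIKit

// iOS side of a Flutter method channel. The Dart side would call
// MethodChannel("app/call").invokeMethod("startCall") to reach this.
@main
@objc class AppDelegate: FlutterAppDelegate {
    override func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
    ) -> Bool {
        let controller = window?.rootViewController as! FlutterViewController
        let channel = FlutterMethodChannel(
            name: "app/call",
            binaryMessenger: controller.binaryMessenger
        )
        channel.setMethodCallHandler { call, result in
            if call.method == "startCall" {
                // Hand off to native code here; conversely, native code
                // (e.g. an intent handler) can invoke the channel toward Dart.
                result(nil)
            } else {
                result(FlutterMethodNotImplemented)
            }
        }
        GeneratedPluginRegistrant.register(with: self)
        return super.application(application, didFinishLaunchingWithOptions: launchOptions)
    }
}
```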

Compare app actions with conversational actions

I'm a developer trying to learn how apps interact with the Google Assistant.
I noticed that as developers we can use app actions (actions.xml or shortcuts.xml) to define how we want the Google Assistant to communicate with the app. There are also conversational actions, which can do a similar job.
I wonder which one Google prefers and what the differences between them are. Are the apps developed by Google using conversational actions or app actions? And finally, how can I tell whether an app is using either of them, or both?
Broadly speaking, app actions are a way to launch an Android app (possibly into a specific Android Intent and with specific information) from the Google Assistant running on Android, while conversational actions are a way to interact with a webhook-based app through the Google Assistant, typically over multiple turns.
While the two are similar, in that they both work through the Assistant, they are rather different.
App Actions
Only work on Android devices, not everywhere that the Assistant runs.
Are generally used to launch an already installed app, possibly providing specific information to a deep link in that app.
Can also be used (in some cases, with features that are coming soon) to provide widgets from the app into the Assistant.
Once the app is launched, you are (usually) no longer interacting with the Assistant - you're in the app, and have the UI from that app, which is generally not voice-driven.
Conversational Actions
Work across all platforms where the Assistant runs - from smart speakers to Android devices.
Do not require an app to be "installed" - you can invoke one by name just like you open a web page.
Primarily use voice interaction for all of the work - there does not need to be a visual component.
Code runs "in the cloud", not on the device, which acts more like a web browser.
Google doesn't "prefer" either, and it develops both types. (For example, anything that works on a smart speaker is a conversational action, while apps like Google Maps include app action support.) It depends on your use case and what you already have available:
If you have an existing Android app, then app actions may be a reasonable approach.
If you are starting from scratch, then you may want to look at conversational actions.

Is it possible to let Siri push button in swift app?

Is it possible to ask Siri to push buttons in the app with the text to speech?
I mean, I have a calculator, and after pushing the "dictation" button, Siri should be able to understand that it should push the + button instead of just writing the dictated text into the label.
Thank you!
Siri cannot be used inside apps. Moreover, you can only use Siri for handling intents that are part of the SiriKit framework, which is quite limited at the moment.
VoiceOver is perfectly capable of what you need to do. It was designed for navigating through an app with voice commands as part of the Accessibility framework.
The Speech framework as suggested by others is not available in watchOS and wasn't really designed for voice navigation.
You will need to use the Speech framework for that; it would not be possible with Siri. You can go through Apple's documentation for that.
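On iOS (though not watchOS), a Speech-framework dictation loop looks roughly like the sketch below. The class and callback names are illustrative, and the app needs NSSpeechRecognitionUsageDescription and NSMicrophoneUsageDescription entries in Info.plist:

```swift
import Speech
import AVFoundation

// Minimal dictation sketch: streams microphone audio into a speech
// recognizer and reports each partial transcription.
final class DictationListener {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start(onText: @escaping (String) -> Void) throws {
        SFSpeechRecognizer.requestAuthorization { _ in }

        let request = SFSpeechAudioBufferRecognitionRequest()
        self.request = request

        // Feed microphone buffers into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            if let text = result?.bestTranscription.formattedString {
                // The app would map spoken words like "plus" to button
                // actions itself, e.g. onText calling the + handler.
                onText(text)
            }
        }
    }
}
```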

Does tvOS provide a standard alert for force-updating app?

Sometimes users have to update the app to stay compatible with a server API, so I need to show a must-update-now alert. Is there some built-in message, localized for all supported languages, in tvOS, or do I have to provide such a message myself for all languages?
And how do I go to the App Store page for my app on tvOS?
Update: I see that there is the Harpy/Siren library for Obj-C/Swift, so it looks like I will have to translate the message myself.
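As far as I know there is no system-provided, pre-localized force-update alert on tvOS, so the strings come from your own Localizable.strings (which is essentially what Siren bundles for you). A sketch, assuming an appID you already have; the apps.apple.com URL form is an assumption I have not verified on tvOS:

```swift
import UIKit

// Presents a blocking "update required" alert. The localization keys
// are illustrative; the strings must exist in your own Localizable.strings.
func showForceUpdateAlert(from presenter: UIViewController, appID: String) {
    let alert = UIAlertController(
        title: NSLocalizedString("update.title", comment: "Update Required"),
        message: NSLocalizedString("update.message",
                                   comment: "Please update to continue."),
        preferredStyle: .alert
    )
    alert.addAction(UIAlertAction(
        title: NSLocalizedString("update.button", comment: "Update"),
        style: .default
    ) { _ in
        // Assumed URL form for the app's store page; verify on tvOS.
        if let url = URL(string: "https://apps.apple.com/app/id\(appID)") {
            UIApplication.shared.open(url)
        }
    })
    presenter.present(alert, animated: true)
}
```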