I need to automatically navigate between my app's pages and features while in kiosk mode. The usage context is an exhibition, where I'd like to put a smartphone on a stand and show people all the features in an infinite cycle.
The idea is to use the kiosk_mode package to put the device in kiosk mode; then I need automation that navigates between pages, taps buttons, and shows each feature for some time.
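As a minimal sketch of that plan, assuming the kiosk_mode package exposes a startKioskMode() entry point (verify the exact name and signature against the package docs; DemoApp is a placeholder):

```dart
import 'package:flutter/material.dart';
import 'package:kiosk_mode/kiosk_mode.dart';

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  // Pin the app to the foreground: screen pinning on Android,
  // Guided Access on iOS. startKioskMode() is an assumption based
  // on the kiosk_mode package docs.
  await startKioskMode();
  runApp(const DemoApp()); // DemoApp stands in for the exhibition app
}

class DemoApp extends StatelessWidget {
  const DemoApp({super.key});

  @override
  Widget build(BuildContext context) =>
      const MaterialApp(home: Scaffold(body: Center(child: Text('Demo'))));
}
```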
So now I need to script the navigation and the button taps, much like a UI test, for example:
Navigate to page 1
Wait 5 seconds
Navigate to page 2
Tap button X and button Y
Wait 5 seconds
etc.
Is there any package that automates that kind of flow with a fluent API? Is there any recommended pattern?
Otherwise, the idea is to simply create an async routine that starts up and runs all the steps in a custom way.
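If no package with a fluent API turns up, the hand-rolled async routine is perfectly workable. Below is a minimal sketch of that idea; the route names, the button keys, and the timings are hypothetical and need to be adapted to the real app:

```dart
import 'package:flutter/gestures.dart';
import 'package:flutter/material.dart';

// Hypothetical names: adapt the routes and keys to the real app.
final navigatorKey = GlobalKey<NavigatorState>();
final buttonXKey = GlobalKey();
final buttonYKey = GlobalKey();

/// Simulates a real tap by injecting pointer events at the center of
/// the widget that owns [key], so ripples and handlers fire normally.
Future<void> tapWidget(GlobalKey key) async {
  final box = key.currentContext!.findRenderObject()! as RenderBox;
  final center = box.localToGlobal(box.size.center(Offset.zero));
  WidgetsBinding.instance
      .handlePointerEvent(PointerDownEvent(position: center));
  await Future<void>.delayed(const Duration(milliseconds: 100));
  WidgetsBinding.instance
      .handlePointerEvent(PointerUpEvent(position: center));
}

/// The scripted demo: cycles through the features forever.
Future<void> runDemoLoop() async {
  while (true) {
    final nav = navigatorKey.currentState!;
    nav.pushNamedAndRemoveUntil('/page1', (_) => false);
    await Future<void>.delayed(const Duration(seconds: 5));
    nav.pushNamedAndRemoveUntil('/page2', (_) => false);
    // Give the new page a moment to build before tapping into it.
    await Future<void>.delayed(const Duration(seconds: 1));
    await tapWidget(buttonXKey);
    await tapWidget(buttonYKey);
    await Future<void>.delayed(const Duration(seconds: 5));
  }
}
```

Wire navigatorKey into MaterialApp(navigatorKey: navigatorKey) and kick the loop off once the first frame is up, e.g. with WidgetsBinding.instance.addPostFrameCallback((_) => runDemoLoop()).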
Related
I went through the official documentation and the autocomplete options in Xcode but couldn't find anything.
UI Tests are not supposed to give you access to the application code. You test the app as a user would (black-box testing).
On the current screen:
find a button with the label "Get Started",
tap on it,
swipe up, etc.
You exercise the app just as a user would: tap on A, wait for B to appear on screen, and so on.
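If the app under test is a Flutter app (as in the question above), the same black-box style is available through Flutter's integration_test package. A sketch, where the my_app import path, the "Get Started" label, and the ListView being swiped are all assumptions:

```dart
import 'package:flutter_test/flutter_test.dart';
import 'package:integration_test/integration_test.dart';

import 'package:my_app/main.dart' as app; // hypothetical app entry point

void main() {
  IntegrationTestWidgetsFlutterBinding.ensureInitialized();

  testWidgets('drives the app like a user', (tester) async {
    app.main();
    await tester.pumpAndSettle();

    // On the current screen, find a button with the label "Get Started"
    // and tap on it, just as a user would.
    await tester.tap(find.text('Get Started'));
    await tester.pumpAndSettle();

    // Swipe up on the list that appears.
    await tester.fling(find.byType(ListView), const Offset(0, -300), 1000);
    await tester.pumpAndSettle();
  });
}
```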
I'm making an application that I want to behave like a status bar app, in that it doesn't take over the menu bar (so if it's open over, say, Firefox, the menu bar still shows Firefox's menus), but I also want people to be able to Cmd-Tab to it for keyboard-navigation purposes. I know about Application is Agent, but that flips both of those behaviors at once. Is there some more granular control elsewhere that lets me change these aspects individually?
The problem I'm asking about is how to have a page (call it SetupPage) that is shown only until the user presses a button on the screen. After that, the screen should show the next page (call it HomePage), and every time the app is opened after the button press, the initial page should be HomePage.
I've seen this general idea in many apps (a sign-in screen before the home page, entering a phone number or school before the main page, etc.) and I would like to have it in my app! I'm thinking of using Navigator routes and having the stack start out as:
(HomePage, SetupPage)
and then once the button is pressed I can pop SetupPage. But I’m not sure how to implement this.
I already have the SetupPage and HomePage classes made. I'm not doing a sign-in in SetupPage or anything like that. I'm not using Firebase in this app either.
You can use the Flutter package shared_preferences to save a Boolean that tells you, at app startup, whether the button has been pressed.
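A minimal sketch of how this combines with the (HomePage, SetupPage) stack idea; the 'setupDone' key name is arbitrary. Because Flutter treats initialRoute as a deep link, starting at '/setup' pushes '/' first, so the stack is exactly (HomePage, SetupPage) and popping SetupPage reveals HomePage:

```dart
import 'package:flutter/material.dart';
import 'package:shared_preferences/shared_preferences.dart';

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  final prefs = await SharedPreferences.getInstance();
  // false on the first launch, true once the user has pressed the button.
  final setupDone = prefs.getBool('setupDone') ?? false;
  runApp(MyApp(setupDone: setupDone));
}

class MyApp extends StatelessWidget {
  const MyApp({super.key, required this.setupDone});
  final bool setupDone;

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      routes: {
        '/': (_) => const HomePage(),
        '/setup': (_) => const SetupPage(),
      },
      // '/setup' is treated as a deep link: '/' (HomePage) is pushed
      // first, then '/setup' (SetupPage) on top of it.
      initialRoute: setupDone ? '/' : '/setup',
    );
  }
}

class SetupPage extends StatelessWidget {
  const SetupPage({super.key});

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Center(
        child: ElevatedButton(
          onPressed: () async {
            final prefs = await SharedPreferences.getInstance();
            await prefs.setBool('setupDone', true); // persist the choice
            if (context.mounted) {
              Navigator.of(context).pop(); // reveal HomePage underneath
            }
          },
          child: const Text('Continue'),
        ),
      ),
    );
  }
}

class HomePage extends StatelessWidget {
  const HomePage({super.key});

  @override
  Widget build(BuildContext context) =>
      const Scaffold(body: Center(child: Text('Home')));
}
```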
I want to create a button that can be moved to any position, like the iPhone-style virtual home button, in Flutter, but I don't know of any package or library that can do it. Moreover, I'd like the button to be able to show on top of a specific app, e.g. the dial-up UI. If I deploy to desktop, will its behavior be the same as on mobile?
(Screenshot: iOS virtual home button)
(Screenshot: button shown along with the dial-up UI)
I hope you can suggest ways to create it.
I have tried unicorndial, floating_bubble, popup windows, etc.,
but all of them are widgets inside the Flutter app rather than system-wide.
Sorry, you can't really create anything like that outside of your app's context. Apple is very strict about this kind of thing.
You can implement this sort of feature on Android with a floating service (a system overlay window, which requires the SYSTEM_ALERT_WINDOW permission). But on Apple platforms it's a no-no.
I'm a newbie at UI Automation using Instruments, and I have the following question:
- the application starts and I get the mainWindow screen (which contains Sign In and Register buttons)
- I've managed to write the JavaScript code to tap one of the two buttons
- after tapping one of the buttons, another screen is displayed, let's say the Sign In screen, which contains two fields: username and password
In this case, how can I tell Instruments that this is a different screen containing different elements that should be retrieved in order to fill in the fields and tap the Sign In button?
I only know how to retrieve the mainWindow. I don't know how to write the code for the next screen.
Did you try capturing what you do with the small record button in the JavaScript editor window? That way you can find out how Instruments refers to the elements you use.
See also this thread:
UI Automation - how to capture - record using javascript editor