How to find out if the virtual keypad is shown in BB Cascades - blackberry-10

I would like to know whether the virtual keypad is shown or not on BB10 devices (full touch).
I'm using C++ / QML.

There is a GitHub BlackBerry Community project called smart signals that provides a link between BlackBerry Platform Services (BPS) and Cascades. In that project they define a VirtualKeyboardServices class that emits signals when the virtual keyboard is shown or hidden and gives you its height. That is where I would start.

Related

Unity Build targets Mixed Reality but does not appear in apps

I have created a project using the Windows XR Plugin and XR Plugin Management. I am NOT using the XR Interaction Toolkit; I have created my own tools, including my own XRRig using the Tracked Pose Driver.
My project is NOT built with UWP, since I need file access that UWP does not provide or makes too tricky to provide (after many attempts and work with a senior developer here, I just gave up). So I'm using standard Unity and my build target is "PC, Mac & Linux Standalone". My Player settings in XR Plug-in Management are "Windows Mixed Reality". I am using a few UWP functions for file access within the app.
Now, once the build is built, I obviously don't see it within Steam, but it also does not appear in the Mixed Reality list of applications. I have to start it manually by clicking on the icon on the desktop. It works great, but wtf...?
I know that for applications to appear in Mixed Reality they must be built with UWP, but if this build is neither UWP nor Steam, what is it then? How do I add it (or sideload it) to the Mixed Reality applications that the Windows menu brings up within the Cliff House, for example?
To answer the first question, about what type of application is being built when neither UWP nor Steam is the target in Unity: it is built as a standard PC desktop application.
To answer the second question, about how to access this application inside the Mixed Reality Cliff House shell when it is not a UWP application: it can be launched via the "Classic Apps" pin inside the Cliff House shell.
Here is more information on the "Classic Apps" pin:
https://learn.microsoft.com/en-us/windows/mixed-reality/whats-new/release-notes-may-2019#how-to-launch
That should answer the question of how to launch the app from inside the VR experience itself.

AOSP difference between multi-window and multi-display

Can anyone tell me what the difference is between multi-window and multi-display in the AOSP environment?
multi-window
Devices can display multiple apps simultaneously using multi-window. Android supports three multi-window configurations:
Split-screen is the default multi-window implementation, which provides two activity panes where users can place apps.
Freeform allows users to dynamically resize the activity panes and have more than two apps visible on their screen.
Picture-in-picture (PIP) allows Android devices to play video content in a small window while the user interacts with other apps.
https://source.android.com/devices/tech/display/multi-window
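From an app's point of view, multi-window support surfaces in the Activity API. As a rough illustration (not from the AOSP page above; it assumes API level 24+ and a hypothetical activity name), an app can check and track whether it is currently shown in split-screen or freeform mode:

```java
import android.app.Activity;
import android.os.Bundle;
import android.util.Log;

public class MainActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // True when the activity is currently in split-screen or freeform mode.
        Log.d("MultiWindow", "In multi-window: " + isInMultiWindowMode());
    }

    @Override
    public void onMultiWindowModeChanged(boolean isInMultiWindowMode) {
        super.onMultiWindowModeChanged(isInMultiWindowMode);
        // Called whenever the user drags the activity into or out of multi-window.
        Log.d("MultiWindow", "Multi-window changed: " + isInMultiWindowMode);
    }
}
```

On the app side, the android:resizeableActivity attribute in the manifest controls whether the activity may be placed into these modes at all.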
multi-display
Devices can now have multiple physical displays and inputs. Example: two people can use the same Android device at the same time via two different displays. This is a new feature in Android Q and still comes with a lot of limitations.
https://source.android.com/devices/tech/display/multi_display
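To make the contrast with multi-window concrete: with multi-display, whole activities can be placed on different physical displays. A minimal sketch of launching onto a secondary display (assuming API level 26+; the helper class name is made up, and whether the launch actually lands on the other display depends on the device's multi-display configuration):

```java
import android.app.ActivityOptions;
import android.content.Context;
import android.content.Intent;
import android.hardware.display.DisplayManager;
import android.view.Display;

public final class SecondaryDisplayLauncher {

    // Launches the given intent on the first non-default display, if one is attached.
    public static void launchOnSecondaryDisplay(Context context, Intent intent) {
        DisplayManager dm =
                (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
        for (Display display : dm.getDisplays()) {
            if (display.getDisplayId() != Display.DEFAULT_DISPLAY) {
                ActivityOptions options = ActivityOptions.makeBasic();
                options.setLaunchDisplayId(display.getDisplayId());
                context.startActivity(intent, options.toBundle());
                return;
            }
        }
        // No secondary display found; fall back to the default display.
        context.startActivity(intent);
    }
}
```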

One frontend for iPhone/iPad, Android and Multi-Touch displays

I have built a multi-touch application based on a Java EE backend connected via BlazeDS to an Adobe Flex frontend. The application runs on a DIY multi-touch display that I built myself. Now I want to use another solution. The Adobe Flex frontend (with a multi-touch library) and the BlazeDS adapter should be replaced by a solution that covers iPhone/iPad, Android and commercial multi-touch displays.
The problem is the iPad/iPhone: there is no Flash Player runtime (except on jailbroken devices), but the application should also run on those devices. So neither Adobe Flex nor a Java frontend (there is no JVM on the iPhone/iPad) is possible anymore. When I started developing the application it was not necessary to run it on Apple's mobile devices, but this has changed :-(
So what can I do? Use HTML5, so it works on Android and iPhone/iPad? But I also want to be able to use it on a commercial multi-touch display and on a normal display with a mouse (I only need gestures for one finger; two-finger gestures are not necessary). Are there any frameworks that allow this? I do not want to create several frontends (an Objective-C app for iPhone/iPad and an Adobe Flex app for all other devices); it would be great if I could build one frontend for all devices.
Does anyone know how I could realize this?
Best Regards Tim
Take a look at Sencha Touch; it's an HTML5 framework dedicated to iOS and Android devices, which should make it relatively easy to build web apps that feel like native apps on those devices.
The Adobe Flex frontend (with a multitouch library) and the BlazeDS adapter should be replaced by a solution which covers iPhone/iPad, Android and commercial Multi-Touch displays.
Android 2.2 and higher supports the full Flash Player, and therefore also supports Flex applications. However, Adobe has spoken of improving such support for the next Flex release, expected out early next year.
Apple has pretty specifically said they want to prevent the type of cross-platform development you're trying to accomplish.
Without knowing, or seeing, your application it is hard to say whether HTML5 will cover your needs. But, yes, many parts of HTML5 should work across multiple browsers.
You may want to investigate Elips Studio, which brings ActionScript applications to multiple devices, including Apple devices.

Interface Builder System Media Library Empty in iPhone 3.0 SDK

My System library in Interface Builder contains no image or sound resources, etc.
I am currently using the iPhone 3.0 SDK. If you have any idea what I can do to get the default media for Interface Builder, it would be appreciated.
I am using the Snow Leopard Developer Preview, in case this makes a difference.
I do not believe that there are any system media elements exposed through IB for the iPhone, only for desktop Mac OS X. There are obviously several things (like system button images, backward and forward arrows, etc.) that would be useful, but only a small number of them are exposed, and where they are, they are exposed through constants you pass to particular classes, not as images and sounds you can use directly.
I recommend filing an enhancement request through bugreporter.

Netbeans, mobile development and screen size

I'm looking at prototyping with a HTC Advantage, which runs Windows Mobile 5 and has a screen resolution of 640x480 (or the other way if in portrait).
Before anyone jumps in and suggests developing as a native Windows Mobile app, we're prototyping as a Java MIDlet because we also want to find out what restrictions/limitations/design considerations there are if we decide to then take the code to other mobile platforms: Java gives us the largest mobile base with the fewest code changes.
I'm using NetBeans 6.8 for the development, and I can't see any way to change the "Device Screen" view of a MIDlet from a typical mobile-phone-sized screen, nor change the view from portrait to landscape; similarly, the emulator doesn't have a large-resolution device.
I'm using the default mobile device, ClamshellCldcPhone1. I've looked at some of the other device profiles, but none of them seem to be targeting larger-screen devices from what I can see. And I can't find any documentation that tells me the difference between, say, ClamshellCldcPhone1 and DefaultCldcPhone2.
Has anyone any experience of this? Most of the existing things I've read have said to design for the smaller resolution and use anchoring to ensure controls stay in place; however as I've got a screen that's twice the resolution, I want to write for that resolution (given this is currently in prototype world). I can copy the code over to the HTC device to test, but this will (probably) get painful, especially during the early stages.
Any advice welcome :-)
What you need is a new emulator configuration for your handset form factor.
The emulator in NetBeans is the same one as in the J2ME SDK (formerly the Wireless Toolkit, hence the WTK acronym) from Sun.
You can make a copy of the ClamshellCldcPhone1 folder, which is presumably located in C:\Program Files\NetBeans 6.8\mobility8\WTK2.5.2\wtklib\devices, and modify the images and .properties file in your new configuration to match the device you want to emulate.
You can add/remove physical keys, resize the screen and make it touchscreen that way.
This should all be explained in the J2ME SDK documentation.
It's been a very long time since I did any of this, but I think you can just copy one of the existing profiles, rename it, and change the settings to what you want.
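Whichever emulator skin you end up with, it can also help to have the MIDlet report the canvas size it actually receives at run time, so you can confirm you really are getting 640x480 both in the emulator and on the HTC. A minimal sketch (MIDP 2.0 assumed; the class name is made up):

```java
import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Display;
import javax.microedition.lcdui.Graphics;
import javax.microedition.midlet.MIDlet;

public class ScreenProbeMIDlet extends MIDlet {

    protected void startApp() {
        // A throwaway Canvas that just paints the resolution it was given.
        Display.getDisplay(this).setCurrent(new Canvas() {
            protected void paint(Graphics g) {
                g.setColor(0xFFFFFF);
                g.fillRect(0, 0, getWidth(), getHeight());
                g.setColor(0x000000);
                // getWidth()/getHeight() reflect the emulator skin or real device,
                // so anchoring logic can work from these values instead of a guess.
                g.drawString(getWidth() + " x " + getHeight(),
                        0, 0, Graphics.TOP | Graphics.LEFT);
            }
        });
    }

    protected void pauseApp() {}

    protected void destroyApp(boolean unconditional) {}
}
```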