I am trying to get an Uno project running on a Raspberry Pi. I can get the basic project running but there are issues:
1. Use the VS 2019 Uno extension to create a cross-platform application.
2. Add an Image to MainPage.xaml.
3. Build.
4. Publish the Project.Skia.Gtk project to a folder on the Pi.
5. Run Project.Skia.Gtk from that folder.
MainPage comes up and text objects on the page are rendered. However, Image objects are not rendered.
In the Linux command window there is an error:
Windows.UI.Composition.CompositionObject[0]
The member void CompositionObject.Dispose() is not implemented in Uno.
Everything looks right on the UWP project.
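For reference, a minimal code-behind equivalent of the Image setup would look something like this (a sketch only; the element name and asset path are placeholders, assuming an <Image x:Name="DemoImage"/> in MainPage.xaml and an asset at Assets/test.png with its build action set to Content):

    // MainPage.xaml.cs - sketch; DemoImage and Assets/test.png are placeholder names.
    using Windows.UI.Xaml.Controls;
    using Windows.UI.Xaml.Media.Imaging;

    public sealed partial class MainPage : Page
    {
        public MainPage()
        {
            this.InitializeComponent();

            // ms-appx resolves against the app package (the publish folder on the Skia.Gtk head).
            DemoImage.Source = new BitmapImage(new System.Uri("ms-appx:///Assets/test.png"));
        }
    }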
Absolute VR dev newbie here.
As stated in the title, how can I "build and run" a VR application developed in Unity on the HP Omnicept Reverb G2 headset? In other words, how can I turn my Unity project into an .exe or an .apk (I'm not sure which is the correct file format)? Then how can I load this file onto my HP headset and run it?
I have experience developing VR applications for the Quest 2, and I remember building my Unity project into an .apk and then loading it onto the Quest 2 through SideQuest. However, I have no idea how to do the same thing with the HP headset... Please help! Thanks.
In Unity, you can choose to build for the Universal Windows Platform; please refer to Set up a new OpenXR project with MRTK - Mixed Reality | Microsoft Learn. Then refer to Using Visual Studio to deploy and debug - Mixed Reality | Microsoft Learn to deploy the application to the PC via Visual Studio, and run it via Windows Mixed Reality on the PC. This document may also be useful to you: Unity development for VR - Mixed Reality | Microsoft Learn.
It turns out that to run the application, you simply double-click the .exe file on your desktop... I wasn't able to run the .exe file due to a problem with my own PC. I switched to another PC and everything worked fine.
I installed code-server on a Pixelbook Go (x86 Intel i3 with 8 GB RAM, 64 GB SSD). I installed it as a regular user, not as root; I'm not sure whether that's a problem. The code-server service is enabled and starts when I start the developer Linux VM (penguin). As far as I know I'm upgraded to the latest Crostini (104.0.5112.83), and the VM is Debian Bullseye; I installed it not long ago.
Outside of the VM I started the Chrome browser, and I can view the GUI portion of code-server by visiting http://localhost:8080 (as per .config/code-server/config.yaml). I installed the Flutter and Dart extensions in the GUI. I also opened the root folder of my Flutter project (a regular project, https://github.com/TrackMyIndoorWorkout/TrackMyIndoorWorkout, with no embedded or multi-module setup).
I connected my phone (OnePlus Nord, Android 12); it displayed the connection bottom sheet and I allowed file transfer. Crostini detected the connection and popped up a notification about it. I used that notification bubble to jump to the connected USB devices settings and enabled the Nord device.
Here is where things start to differ from the "regular" VS Code (installed into the penguin VM from a package): VS Code shows a device selector in the footer, which lists target devices such as the Pixelbook Go and also the Nord (AC2003) - see the screenshot below. However, the code-server GUI doesn't have a device selector. When I try to initiate a debug session, it displays an error message:
Unable to launch Flutter project in a Dart-only workspace. Please open a folder close to your Flutter project root or increase the value of the "dart.projectSearchDepth" setting.
Here is how it should look in the "regular" VS Code:
I opened the root folder of the Flutter project just as I do when I use the "regular" VS Code, so this project search depth doesn't make any sense to me; the Dart files are there, including main.dart in the lib folder. The "regular" VS Code kind of works, except that the build hangs, possibly due to low resources, which is why I'm trying to get code-server working: so I can migrate at least the VS Code GUI portion out of the anemic VM. I suspect the main problem might not be the Flutter / Dart settings; first I have to figure out why the GUI doesn't show a device selector. Then I can worry about this error message if it still persists.
How to proceed?
I'm trying to build a very simple UWP app (for PC desktop only) with a single image target and camera, using Vuforia in Unity.
Everything works fine in the Editor, but after I build and run the app I get the following error on a black screen:
I don't even know where to start debugging this, as the only log I get is this error and nothing else. I also tried delayed initialization, but the same behavior happens when I manually call VuforiaApplication.Instance.Initialize(); after it's initialized and before it's started.
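For context, the delayed-initialization attempt followed the usual Vuforia 10 pattern, roughly like this (a sketch, not the exact project code; the log messages are only illustrative):

    // Sketch of delayed Vuforia initialization (Vuforia Engine 10.x API),
    // assuming delayed initialization is enabled in the Vuforia configuration.
    using UnityEngine;
    using Vuforia;

    public class DelayedVuforiaInit : MonoBehaviour
    {
        void Start()
        {
            VuforiaApplication.Instance.OnVuforiaInitialized += OnVuforiaInitialized;
            VuforiaApplication.Instance.OnVuforiaStarted += OnVuforiaStarted;

            // Manual initialization instead of the automatic one.
            VuforiaApplication.Instance.Initialize();
        }

        void OnVuforiaInitialized(VuforiaInitError error)
        {
            Debug.Log($"Vuforia initialized, error code: {error}");
        }

        void OnVuforiaStarted()
        {
            Debug.Log("Vuforia started");
        }
    }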
Here are my build settings:
I'm using Unity 2020.3.30f1 and Vuforia 10.2.5. The only target SDK installed on my system is 10.0.19041.0, and I also have Visual Studio 2019 Community installed, if that info helps.
Fixed by disabling my laptop's webcam in Device Manager and connecting an external USB camera.
I'm making a multiplayer first-person shooter. In the editor I can play fine, but when I build it, it just shows a gray screen and nothing happens. Unity doesn't show any errors after building.
I'm using Unity 2020.1.6f1 and HDRP + DXR.
My PC specs are:
AMD Ryzen 5 2600 Six-Core Processor
Nvidia GeForce GT 1030
This is the main menu in the editor:
This is the build; it should show the main menu:
You can try enabling Development Build in the build settings to access the development console in the standalone application. Alternatively, you can check the application's log files for any errors that only happen in the build.
Based on the current documentation, you can find the log files at:
Windows: %USERPROFILE%\AppData\LocalLow\CompanyName\ProductName\Player.log
Mac: ~/Library/Logs/Company Name/Product Name/Player.log
Linux: ~/.config/unity3d/CompanyName/ProductName/Player.log
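If opening those files is inconvenient, the same information can also be surfaced inside the build with a small script along these lines (a sketch; the class name is arbitrary):

    // Sketch: print the player log path and show runtime errors on screen,
    // so build-only problems are visible without opening the log file.
    using UnityEngine;

    public class BuildErrorOverlay : MonoBehaviour
    {
        string lastError = "";

        void OnEnable()
        {
            Debug.Log($"Player log path: {Application.consoleLogPath}");
            Application.logMessageReceived += HandleLog;
        }

        void OnDisable()
        {
            Application.logMessageReceived -= HandleLog;
        }

        void HandleLog(string condition, string stackTrace, LogType type)
        {
            if (type == LogType.Error || type == LogType.Exception)
                lastError = condition;
        }

        void OnGUI()
        {
            if (!string.IsNullOrEmpty(lastError))
                GUI.Label(new Rect(10, 10, Screen.width - 20, 80), lastError);
        }
    }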
I'm trying to create an MQTT client to establish communication between a HoloLens and an MQTT broker. I created a script inside the Assets folder and tried to write an MQTT client using the following libraries.
using uPLibrary.Networking.M2Mqtt;
using uPLibrary.Networking.M2Mqtt.Messages;
But how do I import these libraries into the project? I'm new to C#, so I'm not aware of the build tools that can be used.
I'd appreciate any advice in this regard.
You need to distinguish three cases, covered by two different DLLs:
1. Running on HoloLens
2. The native HoloLens emulator
3. Unity editor simulation, i.e. Holographic Emulation or the simulator (see Introducing Holographic Emulation)
Cases 1 and 2 are based on UWP, while the Unity editor (case 3) uses the .NET Framework (4.x nowadays). To access the uPLibrary namespace from UWP, copy M2Mqtt.WinRT.dll to somewhere under your Assets directory, select it, and edit its import settings like this:
To make things work in the Unity simulator too, copy M2Mqtt.Net.dll and edit its import settings so that Any Platform is checked and WSAPlayer is excluded.
You don't have to make any distinctions in code and can access MqttClient and the other classes platform-independently, just like for iOS or Android.
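As a concrete illustration, a minimal M2Mqtt client in a Unity script might look like this (a sketch; the broker host and topic are placeholders):

    // Minimal M2Mqtt usage sketch; broker address and topic name are placeholders.
    using System;
    using System.Text;
    using UnityEngine;
    using uPLibrary.Networking.M2Mqtt;
    using uPLibrary.Networking.M2Mqtt.Messages;

    public class MqttExample : MonoBehaviour
    {
        MqttClient client;

        void Start()
        {
            // Connect to the broker (hostname is a placeholder).
            client = new MqttClient("test.mosquitto.org");
            client.Connect(Guid.NewGuid().ToString());

            // Log incoming messages.
            client.MqttMsgPublishReceived += (sender, e) =>
                Debug.Log($"{e.Topic}: {Encoding.UTF8.GetString(e.Message)}");

            // Subscribe and publish on a placeholder topic.
            client.Subscribe(new[] { "hololens/demo" },
                             new[] { MqttMsgBase.QOS_LEVEL_AT_MOST_ONCE });
            client.Publish("hololens/demo", Encoding.UTF8.GetBytes("hello"));
        }

        void OnDestroy()
        {
            if (client != null && client.IsConnected)
                client.Disconnect();
        }
    }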