AndEngine project status

I am in the process of starting with Android development and thought of using AndEngine.
But looking at GitHub, it seems like the project has not been very active for the last six months.
Any comments on this?

It is working well, actively maintained, and awesome. If you want to start with it, just go!


HoloLens 2 Unity app rarely renders anything to the screen

I think this question is the same as this one, but the solution for that problem was downgrading to Unity 2017 LTS, which is incompatible with my current project.
Basically, whenever I build, there's maybe a 10% chance that the app runs on the HoloLens. Most of the time, however, no floating-balls animation pops up, no Unity splash screen appears, and I don't see any of my app content. Strangely enough, the app does ask for microphone permissions, but that's all. It should ask for eye tracking too, but it doesn't. (Not sure if that's related, but I'm out of ideas.) On a proper run, I get the floating balls, the Unity splash, and all permission prompts before the content. The most frustrating part is that it sometimes works, and there has been no pattern at all as to what helps and what doesn't.
I've tried rebuilding in Unity, redeploying in VS, reinstalling the appx, updating VS, updating Windows, and checking every forum post I could find, but I can't figure this out. Does anyone have any ideas?
I thought it might be an OpenXR / Holographic remoting problem, but those have both been dead ends. Really not sure what to do at this point. Thanks in advance.
Unity 2020.3.36f1, VS 2022, MRTK v2, HoloLens 2
Thanks for the help, everyone! For some reason, my solution is just as confusing as the problem, but it seems to work reliably.
The problem has something to do with Holographic Remoting for OpenXR: simply enabling and then disabling that feature in the editor right before building seems to fix a buggy build. I've tested this a few times now and it works pretty consistently. I hope this helps someone and that it eventually gets patched.
EDIT: The previous answer didn't work. Now I'm thinking the problem was actually caused by a toggle in the Project Settings. For a successful build, it seems that XR Plug-in Management > Windows Mixed Reality > UWP > Use Primary Window should be checked. That seems odd since I'm using OpenXR anyway, but this might help someone, so I thought I'd update. It works most of the time now and needs more testing.
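Since the flakiness seems to track these editor toggles, one way to narrow it down is a small build preprocessor that logs which OpenXR features are enabled for UWP right before each build, so good and bad builds can be compared against the feature set they were built with. This is only a sketch, assuming the Unity OpenXR plugin's editor API; the class name is made up:

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.Build;
using UnityEditor.Build.Reporting;
using UnityEngine;
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features;

// Hypothetical helper: before every build, log which OpenXR features are
// enabled for the UWP (WSA) build target, so a flaky build can be compared
// against the feature set it was built with.
class LogOpenXRFeaturesBeforeBuild : IPreprocessBuildWithReport
{
    public int callbackOrder => 0;

    public void OnPreprocessBuild(BuildReport report)
    {
        var settings = OpenXRSettings.GetSettingsForBuildTargetGroup(BuildTargetGroup.WSA);
        if (settings == null)
        {
            Debug.LogWarning("No OpenXR settings found for the WSA build target.");
            return;
        }

        foreach (OpenXRFeature feature in settings.GetFeatures<OpenXRFeature>())
            Debug.Log($"OpenXR feature '{feature.name}' enabled: {feature.enabled}");
    }
}
#endif
```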
There are two solutions:
Use "Windows Mixed Reality" plug-in provider.
Use "OpenXR plugin". For this case the order of installation of packages plays important role:
When you create new project, manually install OpenXR plugin using Package Manager of Unity.
Using "Mixed Reality Feature Tool", install other features of MRTK2 including "Mixed Reality Open XR plugin" in "Platform Support".
(Screenshots: the XR Plug-in Management settings and the enabled OpenXR features)

Hands not responding as expected on Oculus Quest

I'm trying to develop an app for the Oculus Quest,
but I am encountering many issues along the way.
The app basically consists of a room and teleportation.
The issue I encountered happens on both Unity 2018.3.12 and 2019.1.8.
I created an app that previously worked as expected on the Quest using Oculus Integration v1.35.
However, when re-exported and installed on a brand new Quest device:
Teleportation that worked before refuses to work.
Pressing a button on one controller hides the other one from view.
Hand movement is limited.
Even though the target device is set to Quest, I still (on v1.35) see the Oculus Go controller.
It only happens when exporting to Quest; on the Rift it works just fine.
From these posts on the Oculus Forum:
https://forums.oculusvr.com/developer/discussion/comment/702108#Comment_702108
https://forums.oculusvr.com/developer/discussion/79144/hands-not-showing-up-with-localavator-unity#latest
it looks like there is a firmware issue (but then you should be encountering it too, no?).
Things I tried:
Started a new project from scratch:
I followed the standard tutorials and documentation, like these:
https://www.youtube.com/watch?v=qiJpjnzW-mw&t=1s
https://developer.oculus.com/documentation/quest/latest/concepts/book-unity-gsg/
I could not see the hands at all, and no teleportation was implemented; it is said that there is a bug in the current v1.39:
https://forums.oculusvr.com/developer/discussion/79144/hands-not-showing-up-with-localavator-unity
I tried to use both the Unity OVR assets and the following plugins:
https://assetstore.unity.com/packages/tools/input-management/vr-movement-system-for-oculus-47292
https://assetstore.unity.com/packages/tools/input-management/vr-arc-teleporter-61561
and in all three I encountered the same issue.
Did anyone encounter an issue similar to what's described?
As said, I expect to see the hands and controllers, and the code attached to the trigger press executed. None of this happened.
As a test (since I do not have another Quest),
I exported a build and am sharing it here.
Please comment whether you have tried it and it worked fine on your Quest or not:
https://www.dropbox.com/s/uvcmhyar2qljb19/k14.apk?dl=0
From what you wrote, it seems like an update issue with your device.
Check the references against the new device.
When I publish, I use 2018.4.1f LTS,
since I encountered a lot of issues with Android builds on the 2019 version.
Contact the company and ask what changed with the new versions.
Thanks to all the helpers.
Update 1:
If hands don't appear at all, it's because Oculus Integration 1.39 is buggy;
revert to 1.38:
https://developer.oculus.com/downloads/package/unity-integration-archive/
Second:
The problem is that the latest firmware update added Go controller support and made it the default,
so Quest controllers began appearing as Go controllers.
The trick was to copy the AndroidManifest to Assets/Plugins and edit it to explicitly define the correct settings.
You can see the recommended changes here, and a rough sketch of the result below the link:
https://developer.oculus.com/documentation/quest/latest/concepts/mobile-native-manifest/?fbclid=IwAR3AgasGPJFVGsz7lyzfJNfuTB8R1FOg88Quq8YZz67eQlwEFvgEMDGjSdo
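This is a minimal sketch of the kind of manifest that ends up in Assets/Plugins, assuming the standard Unity player activity; the package name is a placeholder, and the exact entries should be taken from the Oculus documentation linked above:

```xml
<!-- Sketch only: copy the real values from the manifest Unity generates and
     from the Oculus mobile manifest documentation linked above. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.myquestapp">
  <!-- Quest-specific: require 6DoF head tracking so the runtime does not
       treat the build as a 3DoF (Go) title with Go controllers. -->
  <uses-feature android:name="android.hardware.vr.headtracking"
                android:version="1"
                android:required="true" />
  <application>
    <activity android:name="com.unity3d.player.UnityPlayerActivity">
      <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <!-- VR apps are launched from the VR shell rather than the 2D launcher. -->
        <category android:name="android.intent.category.INFO" />
      </intent-filter>
    </activity>
  </application>
</manifest>
```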

Unity3d Prime31 Google Play Game Services Tutorial

Does anyone know of any good resources on the web to get the Prime31 Google Play Game Services plugin to work with Unity3D?
It's been 3 days, and I just don't seem to be making any progress testing Play Game Services in my game.
I am using the SHA1 from ~/.android/debug.keystore, which I believe is the right SHA1 that the unsigned version uses. I also tried using the published build in order to get the Game Center to come up.
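For reference, that SHA1 can be printed with keytool (assuming the default debug alias and password):

```
keytool -list -v -keystore ~/.android/debug.keystore -alias androiddebugkey -storepass android -keypass android
```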
I tried the Prime31 documentation, but that doesn't seem to be helping.
I also believe my app is correctly set up in the Game Services area of the Developer Console.
Any information would be great.
Thank you.
It turned out that after importing the Game Center package, I ended up with two manifest files. I already had the Prime[31] AdMob plugin in the project, which comes with a manifest file.
By the way, this is normal.
After talking to their support service, they directed me to use "Generate AndroidManifest.xml file..." under the Prime[31] drop-down menu inside Unity3D. This merges the two files into a single manifest. That solved the problem.
I also had to upgrade to the latest files by downloading them manually from the Prime[31] website.
After these two steps I was able to get the Game Center to work properly.
I hope this helps; I am posting my interaction with Prime[31] support.

Working with Multiple Views in Xcode 4?

I'm doing some tutorials on iPhone development and I am working with window-based applications to create an app with multiple views (the tutorial I'm using is TheNewBoston 21-26). I'm currently using Xcode 4 to make these apps, but I'm having trouble because Bucky uses Xcode 3 and there seem to be some differences. Every time I try to make this app, I end up with a blank white screen. I followed the instructions correctly and watched the videos multiple times, but still nothing happens. Even when I make my own simple window-based application, it does not seem to work. I think it may have something to do with connecting the views with the MainWindow. If you have somehow gotten this to work using only the instructions from the video, can you please give me a link to a website or video that has a simple multi-view tutorial? If Xcode 4 no longer allows this method of working, then can you please explain how to get around it? Thanks so much, I really appreciate you helping me. I know I'm new and this is a really basic question, but it has been giving me a hard time. Thanks again!
Welcome to iOS development! :) I haven't worked through the specific tutorial you mention, but there are certainly many others around the web you could refer to.
Here are just a couple I found on Google on handling multiple views through a NavigationController; they look detailed enough, with good guidelines:
http://www.icodeblog.com/2008/08/03/iphone-programming-tutorial-transitioning-between-views/
http://fuelyourcoding.com/iphone-view-switching-tutorial/
If you're totally new, I would recommend getting your hands on an iOS development book (it's worth the investment!); there are many, and they outline the process in good detail from start to finish. If you're unfamiliar with the whole process, there are many minor things that can go wrong and cause frustration.
Happy coding! :)

Paging UIScrollView doesn't work :( (iPhone, cocos2d)

I found this:
http://www.cocos2d-iphone.org/forum/topic/9417
but the project doesn't work with iOS 4.x and cocos2d 0.99.5.
Can someone share a working project, or get it to work?
Thank you very much for any help.
PS: I will upvote any helpful answer, and whoever shares a working project gets the accepted answer.
Hi,
I don't have time right now, but I will try to use the one you have linked and post a working project, because I will need this in the near future.
I used http://brandonreynolds.com/blog/tag/angry-birds-menu/ once.
Edit:
If you read that topic on the cocos2d forum, a person named "blackmouth" has posted a working project.
This is the link: https://github.com/blackmouth/shapes-panels