I am new to Unity. I made a character walk animation in 3ds Max and imported it into Unity. I created an Xcode project for iOS through Unity, and the animation works as expected.
I want to add some UI button controls on screen and play this animation in my iOS app only when a button is clicked. How do I code these UI controls and events? Do I need to add these controls and events in the Xcode project (the one created by Unity), or can I do all of this in Unity itself?
Please advise!
Thank you!
Getsy.
You can do all of the UI in Unity itself. You have these options:
Old Unity GUI system. It is pretty easy to program, but it is awful in terms of performance and usability for designers: it's created completely from code, with no editor support. It's almost never used in commercial products except for debugging and developer tools. However, it's still a good option for prototypes.
Use another GUI package. There are a lot of 2D and UI packages of all kinds for Unity. Currently the most popular one is NGUI, which is paid, but it also has an evaluation version.
Create your own UI framework (still in Unity). Just wanted to mention that this is a viable option, but it's obviously the worst one for your case.
Wait for the new 2D/GUI Unity framework. It's supposed to ship in version 4.3, which is just around the corner; more than that, the original NGUI author is working on it.
In your place, I'd create basic prototype controls with the built-in Unity GUI (a minimal sketch follows below), and by the time I needed to create something more presentable, the new Unity GUI would hopefully already be there.
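For illustration, here's a rough sketch of that built-in (immediate-mode) GUI approach, assuming your imported character has a legacy Animation component with a clip named "walk"; the class and field names are just examples, so adjust them to your setup.

    using UnityEngine;

    public class WalkButton : MonoBehaviour
    {
        // Drag the character's Animation component onto this field in the Inspector.
        public Animation characterAnimation;

        void OnGUI()
        {
            // Immediate-mode GUI: the button is drawn and checked every frame.
            if (GUI.Button(new Rect(20, 20, 160, 60), "Walk"))
            {
                characterAnimation.Play("walk");
            }
        }
    }

Attach this to any GameObject in the scene; no changes in the generated Xcode project are needed, since the Unity player handles touch input for OnGUI buttons on iOS.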
Related
I'm having a weird issue; it may be a simple fix.
I've got a UI-only "game" using the new UI Toolkit. It's a little drawing program of sorts. I've got a draw area in the middle with tool buttons on the sides. Everything works fine with mouse, pen, and touch when drawing (using scripts I can access all the pointer types), but for some reason touch doesn't work on the UI buttons specifically.
What's even weirder is that touch on the UI buttons works when testing directly in Unity Play mode (I've got a touch-screen laptop), but doesn't work when I make a build.
In my Project Settings -> Input System Package, I've got Pen, Mouse, and Touchscreen active under "Supported Devices"
The new UI Toolkit is so new that I can't find any help or similar issues online.
If it's still relevant:
I had the same issue and used "Standalone Input Module" instead of "Input System UI Input Module" on the EventSystem. It says it's the old option, but it works for me :D
I added the touch screen there and it works now.
Just a follow-up, since I ended up finding my answer somewhere else.
In the "Input System UI Input Module" component on the EventSystem, I changed "Pointer Behavior" to "Single Unified Pointer" and that fixed it. Not sure if that's just a work-around, but it works great now.
I am working in Unity on creating an application for the Quest VR system (Quest 2) and added the Oculus Integration package to my project (while following an online course).
When I try to use any of the custom hands provided in the package, or run any of the example scenes that use them, the finger movements follow my controller grip presses (hand trigger) perfectly, making a semi-fist with the bottom three fingers, but index trigger presses do not seem to cause any animation. Neither does a thumb press.
I have added a crude log to show that the triggers are being picked up by the system, but the animation is not showing.
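(The log was something along these lines; the exact script below is just an illustrative sketch using OVRInput, not the original code.)

    using UnityEngine;

    public class TriggerLog : MonoBehaviour
    {
        void Update()
        {
            // Dump the raw right-controller trigger values every frame.
            float index = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);
            float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);
            Debug.Log("index: " + index.ToString("F2") + "  grip: " + grip.ToString("F2"));
        }
    }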
What makes this very vexing is that I shared my project with the course provider I was following and it worked perfectly for him.
So the problem is on my system somewhere.
Implementing the same thing from scratch using the XR Interaction Toolkit works perfectly, by the way. But the Oculus-provided system does not.
If anyone has any ideas on where to even begin looking for the problem I would appreciate it greatly.
Happy to share system details but not even sure which details would be relevant :)
I built my Unity game through Xcode onto an iOS device for testing, and the colors look duller and more pastel-like than in the Game view in the editor. The exact same thing has also happened on a modern Android device. How can I make the colors in the built game better reflect the colors seen in the editor?
EDIT: I have sent a file to my phone and found out that the phone displays colors differently than my monitor does. I'm sorry if I'm asking for quite a bit here, but... is there any way to make the game look as I intend it to? The fact that I never see exactly how my game will turn out seems kinda... awkward. Thanks.
I'm not sure if you already tried this, but you might want to use Unity Remote for your iOS device, so you can check the color differences in real time instead of having to build it every time.
But other than what the comments said, it sounds like you should try adjusting your monitor's display settings for a better color match between the built game and the one in the editor.
I'm currently developing an AR+VR app using GoogleVR and Vuforia on Unity iOS. Everything works fine, but at some point (I don't remember since when!) Google's native UI layer (vertical alignment line, back button, settings button, ...) went missing, and now I can't change the viewer profile.
Versions are: Gvr 1.10 + Vuforia 6.2
So I tried to manually call ShowSettingsDialog() in GvrViewer.cs, but that doesn't work either.
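(For reference, the manual call was roughly like this; a hypothetical snippet assuming GvrViewer's usual singleton Instance and that the method is reachable from user code:)

    using UnityEngine;

    public class ShowGvrSettings : MonoBehaviour
    {
        void Update()
        {
            // Tap/click anywhere to try to bring up the viewer-profile settings dialog.
            if (Input.GetMouseButtonDown(0))
            {
                GvrViewer.Instance.ShowSettingsDialog();
            }
        }
    }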
A new project with a fresh SDK has no problem, so it must be something specific to my project. I doubt the native UI layer would be affected by Unity's camera settings, layers, Canvas settings, and so on.
I can't figure out what might be causing this kind of problem, so any suggestions to narrow down the cause would be appreciated.
Problem solved.
I also use a custom SDK, OpenCVForUnity, and its plugin OpenCVForUnityAppController.mm causes the problem.
It also attaches a RenderDelegate, sets the GraphicsDevice, etc.
It looks like it overrides GoogleVR's iOSDevice, so GVR couldn't figure out what device it was running on.
I've already created an iPhone app using UIKit. However, for one part of my app I will need to render relatively heavy graphics (particles, lots of moving images). I can't rewrite the entire app in cocos2d due to my deadline. It's just for graphical purposes and I won't need user input via cocos2d. Another option I was considering was plain OpenGL, but I'm quite sure cocos2d will be easier to learn.
In short: how can I use cocos2d in a small part (say, one UIView) of my app without rewriting it? (User interaction is not necessary.)
Edit: I found the answer after some more searching. For the record, in case someone else needs it in the future:
http://www.cocos2d-iphone.org/forum/topic/4708
http://www.cocos2d-iphone.org/forum/topic/9239
First, integrate cocos2d into your project.
Copy the cocos2d directory into your project from the downloaded cocos2d-iphone.
In cocos2d/ccConfig.h, set #define CC_FONT_LABEL_SUPPORT 0.
Copy fps_images.png into your project from the downloaded cocos2d-iphone/Resources/Fonts if you want to show the FPS counter that is built into cocos2d.
Next, implement your cocos2d usage following cocos2d-iphone/tests/attachDemo/attachDemo.m.