I'm developing a game and I want to use the TexturePacker extension, but I can't get it to work. I don't know how to add it to my project, and I guess it's because my project is GLES1.
@Shikima is correct. There is no support for TexturePacker in GLES1, and there likely never will be, as it isn't in development and hasn't been for quite some time. Best to refactor your project so you can use live code.
Is it possible in Unity to upload and download files while in a game? I want to make a Minecraft-style building game and allow the user to import and use their own models (.obj files, etc.) while in the game. I've been using PlayFab for a backend and Photon for online capabilities, but as far as I can tell it will only work with image files (in PlayFab). Is there a way to accomplish this, and what would I need to use?
Yes, the way to go is to use Asset Bundles
Asset Bundles are platform-specific assets that you create in Unity but don't put in your build. Instead, you can download them later, whenever you want.
The process is a bit long, but not that difficult. The Unity manual is actually really good, and it should be easy to follow. Here is the link:
Asset Bundles documentation
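To give a rough idea of the runtime side, here is a minimal sketch (assuming a Unity version that has UnityWebRequestAssetBundle; the URL and the asset name "house_prefab" are just placeholders for a bundle you build and host yourself, e.g. via BuildPipeline.BuildAssetBundles in the editor):

```
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Downloads an AssetBundle at runtime and instantiates a prefab from it.
public class BundleLoader : MonoBehaviour
{
    IEnumerator Start()
    {
        // Placeholder URL - point this at wherever you host your bundles.
        string url = "https://example.com/bundles/houses";

        using (UnityWebRequest request = UnityWebRequestAssetBundle.GetAssetBundle(url))
        {
            yield return request.SendWebRequest();

            if (request.isNetworkError || request.isHttpError)
            {
                Debug.LogError("Bundle download failed: " + request.error);
                yield break;
            }

            AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(request);
            GameObject prefab = bundle.LoadAsset<GameObject>("house_prefab");
            Instantiate(prefab);
            bundle.Unload(false); // free the bundle data but keep the loaded objects
        }
    }
}
```

Keep in mind that a bundle contains assets imported and serialized by the Unity editor, so raw .obj files uploaded by players would still need to be converted into a bundle (or parsed with a runtime .obj importer) on your side first.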
I am developing an Augmented Reality app to be integrated into a website using Unity. I need to output to WebGL. I am using Vuforia to create the AR experience. Since Vuforia is not supported with WebGL, I am not able to build. Please suggest an alternate method, or how to do Augmented Reality in Unity for the web. Is there any alternative to Vuforia?
The good news is yes, you definitely can build an AR experience on the web!
The bad news is that none of the current libraries built for doing so offer a Unity plugin, meaning you'll either have to create a wrapper, do some complicated RPC call to talk to the JS library from Unity, or scrap Unity altogether and use only the library. To my knowledge, the best browser-based AR library is AR.js. I know this isn't the answer you were hoping for, but I hope you're able to achieve your goals. Good luck!
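For what it's worth, here is a rough sketch of what the "wrapper" approach could look like on the Unity side: C# calls a function you implement yourself in a small .jslib plugin, and that JavaScript code talks to the browser AR library (AR.js or similar) and reports back via SendMessage. StartARTracking and OnMarkerFound are made-up names, not part of any library:

```
using System.Runtime.InteropServices;
using UnityEngine;

// Bridge between Unity (C#) and a hand-written .jslib plugin that wraps a
// browser AR library. The JavaScript side is yours to write.
public class WebARBridge : MonoBehaviour
{
#if UNITY_WEBGL && !UNITY_EDITOR
    [DllImport("__Internal")]
    private static extern void StartARTracking(string gameObjectName);
#else
    private static void StartARTracking(string gameObjectName) { } // editor stub
#endif

    void Start()
    {
        // The .jslib side can report back with:
        //   SendMessage(gameObjectName, "OnMarkerFound", markerId);
        StartARTracking(gameObject.name);
    }

    // Called from JavaScript via SendMessage when the JS AR library detects a marker.
    void OnMarkerFound(string markerId)
    {
        Debug.Log("Marker detected in the browser: " + markerId);
    }
}
```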
This is probably a bit late in the thread, but I'd like to add an option which might help. You can definitely build your AR app for the web using WebGL as the output, and there is an easy way to integrate it with a website too. SLAM-based AR like Google ARCore is a great example of how to do it.
There are two options:
You can build such an app from scratch, which will obviously take more time, because apart from development, setting up the hosting infrastructure is a challenge.
Otherwise, if you want to scale such AR web app development with low or no code and cloud-ready hosting, you can use a SaaS platform called Marvin XR.
You can login and try it out for FREE: https://www.marvinxr.com:8443
Hope this helps the other folks who stumble upon this thread.
Can we create an augmented reality desktop application using Unity which can convert all images from a school textbook into 3D objects? If yes, then what would the procedure be, and what other tools do we need? It is our final-year project and we really need help with this.
If a watermark will not be an issue, then you can use the Vuforia library.
https://developer.vuforia.com/
It has nice Unity integration and you can achieve what you are after in almost no time :). However, it does not support desktop builds out of the box; below are alternative libraries that do:
http://artoolkit.org/download-artoolkit-sdk#unity
http://www.easyar.com/view/download.html
I can't say how good they are because I have never used them.
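Just to illustrate what the Vuforia Unity integration mentioned above looks like in practice, here is a rough sketch based on the classic Vuforia SDK API (ITrackableEventHandler; newer "Vuforia Engine" releases renamed these types). It toggles a 3D model when an image target, such as a textbook page, is detected:

```
using UnityEngine;
using Vuforia;

// Attach to an ImageTarget: shows the assigned model only while the
// target (e.g. a textbook page) is being tracked.
public class PageToModel : MonoBehaviour, ITrackableEventHandler
{
    public GameObject model; // the 3D object to show for this page

    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);

        if (model != null)
            model.SetActive(false);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool visible = newStatus == TrackableBehaviour.Status.DETECTED ||
                       newStatus == TrackableBehaviour.Status.TRACKED ||
                       newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        if (model != null)
            model.SetActive(visible);
    }
}
```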
You can try ARToolKit; it is stable and simple. EasyAR is good but a little young.
I recommend using ARToolKit. You can target OSX, Windows and Linux if you need to. Also using the Unity plugin it should be easy for you to get started.
And if you decide to go on mobile later it also has support for iOS and Android.
http://artoolkit.org/download-artoolkit-sdk
(Scroll down for Unity version)
Edit:
They also have an active forum where you can ask questions and find help if you have the need:
http://artoolkit.org/community/forums/
Best
Is it possible to mix iOS code and interface elements with Unity generated iOS code?
For example, if I am working with a game developer who is developing a game in Unity, could I take his Xcode project (generated by Unity) and add interface elements which I code myself using Objective-C, Interface Builder, etc.?
From what I can see this isn't possible, as everything is created via Unity... but hopefully I am wrong.
Thanks!
You can even use 3rd-party libraries in Xcode and integrate them into the build process. After fiddling around with the right settings, I wrote a blog entry about this:
iPhone & Unity3D: Integrating 3rd Party Static Libraries in Unity3D Generated XCode Projects
You should be able to write native plug-ins to interact with the Unity code. You would need to write a wrapper, though. Read more at: http://unity3d.com/support/documentation/Manual/Plugins.html
Read the iOS section :)
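As a very rough sketch of that plug-in approach (the function names ShowNativeSettingsScreen and OnNativeScreenClosed are made up; the native side is an Objective-C .mm file you add to the generated Xcode project yourself):

```
using System.Runtime.InteropServices;
using UnityEngine;

// C# side of an iOS native plug-in: declares an external function that is
// implemented in Objective-C inside the Unity-generated Xcode project.
public class NativeUIBridge : MonoBehaviour
{
#if UNITY_IOS && !UNITY_EDITOR
    [DllImport("__Internal")]
    private static extern void ShowNativeSettingsScreen();
#else
    private static void ShowNativeSettingsScreen() { } // stub so the editor still compiles
#endif

    // Hook this to a Unity UI button, or call it from game code.
    public void OnSettingsButtonPressed()
    {
        ShowNativeSettingsScreen();
    }

    // The Objective-C side can call back into Unity with:
    //   UnitySendMessage("NativeUIBridge", "OnNativeScreenClosed", "");
    void OnNativeScreenClosed(string unused)
    {
        Debug.Log("Returned from the native Interface Builder screen.");
    }
}
```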
I came across this general guide to working with Unity native plugins on OS X, which was useful and is probably where you need to be looking for your answer, as Zophiel said:
https://blog.reigndesign.com/blog/unity-native-plugins-os-x
You can design a view and a button to trigger Unity to run or stop, but models in Unity can't be controlled using Obj-C. Although Unity exports to Xcode, it only exports the startup code; the game scripts end up in linked libraries which can't be modified.
I think you can do it; I have been working with Objective-C in the same way without any changes, since Unity's code generator preserves the native code under Classes.
Furthermore, there is another way to do this, which is called a "plug-in".
But I hope the Unity developers find a better way to combine seamlessly with Xcode, as Apple usually does.
Combining Interface Builder and Unity's integrated editor would be better and more welcome in the future.
Possible? Yes. Advisable for beginner- or intermediate-level devs? Probably not. The Unity project is generated and regenerated every time you push a build. Now, I believe that if you use Append when you do builds, it should keep existing changes to the Xcode project... but 'should' is the operative word there. You may need to implement some sort of build system like Jenkins to keep the headaches to a minimum if you are trying to do this on a large project in which you foresee a constant stream of updates from both the Unity side and the Xcode side.
Now, if you're integrating code that is in its own files and doesn't overlap or rewrite the code Unity has generated, then the Append feature is really going to work for you. But if you're deleting, altering, or adding code to any of the files that Unity generated, then definitely use SVN or some other form of source control and snapshot before and after every new Unity recompile / Xcode generation.
Also, take a look in the Unity Asset Store. Whatever functionality you are trying to home-brew in Xcode can definitely be written in C# in Unity. Someone else may have already conquered the problem you're trying to solve and placed it in the Asset Store for $5.
Hope that helps.
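One way to soften the "regenerated every build" problem is to script your Xcode changes so Unity re-applies them after each build. This is only a sketch, assuming the UnityEditor.iOS.Xcode PBXProject API available in Unity 5+ (the script goes in an Editor folder, and the framework added here is just an example):

```
#if UNITY_IOS
using UnityEditor;
using UnityEditor.Callbacks;
using UnityEditor.iOS.Xcode; // requires the iOS build support module

// Runs after Unity (re)generates the Xcode project, so manual tweaks
// don't have to be redone by hand on every build.
public static class XcodePostProcess
{
    [PostProcessBuild]
    public static void OnPostprocessBuild(BuildTarget target, string pathToBuiltProject)
    {
        if (target != BuildTarget.iOS) return;

        string projPath = PBXProject.GetPBXProjectPath(pathToBuiltProject);
        var proj = new PBXProject();
        proj.ReadFromFile(projPath);

        // "Unity-iPhone" is the default target name in Unity-generated projects;
        // newer Unity versions expose proj.GetUnityMainTargetGuid() instead.
        string targetGuid = proj.TargetGuidByName("Unity-iPhone");

        // Example tweak: link an extra framework that your native code needs.
        proj.AddFrameworkToProject(targetGuid, "MessageUI.framework", false);

        proj.WriteToFile(projPath);
    }
}
#endif
```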
I have an iPhone and am getting a Droid, and I was wondering if Blender games can run on either of them. I have already made a game and want to be able to use it on my phone.
****UPDATE****
I was wrong; Android support has just recently been added, but it's brand spanking new, so just know that Android support would be considered beta quality, if that, perhaps even alpha.
I can't believe no one has given this answer:
http://code.google.com/p/gamekit/
It supports the logic bricks, Python scripts, etc., so basically you can just load up your packaged .blend file and you're good to go.
There is actually a book on how to integrate Blender into an iPhone game development workflow. It uses SIO2 for the game engine on the iPhone. It might be worth giving it a shot.