Built game looks duller than what is seen in the editor - unity3d

I built my Unity game through Xcode onto an iOS device for testing, and the colors look duller and more pastel-like than in the game view in the editor. The exact same thing has also happened on a modern Android device. How can I make the colors in the built game better reflect the colors I see in the editor?
EDIT: I have sent a file to my phone and found out that the phone displays colors differently from my monitor. I'm sorry if I'm asking for quite a bit here, but... is there any way to make the game look the way I intend it to? The fact that I never see exactly how my game will turn out seems kinda... awkward. Thanks.

I'm not sure if you already tried this, but you might want to use Unity Remote for your iOS device, so you can check the color differences in real time instead of having to build it every time.
Other than what the comments said, it sounds like you should try adjusting your monitor's display settings so that the colors you see in the editor correlate better with what the built game shows on the device.

Related

Screen sizes in Xcode 7 when using SpriteKit

I am relatively new to SpriteKit and have been attempting to create my first basic game. All physics and other basics seem OK, but for some reason, whenever I build and run, the screen dimensions are off (it looks like the default is 1024×768).
Pretty sure I'm missing something fundamental here but it doesn't seem immediately obvious on how to adapt the screen to any size iPhone screen (this is my ultimate goal).
My question is whether this is actually just a settings issue, or whether it is necessary to implement code?
Thanks in advance and have a great day!:)
To answer the first part, you can easily change the size of your scene.
If you take the default GameScene, click out of the scene and look at the Attributes Inspector, you will see the default size of 1024×768. Personally, when working in landscape, I tend to use an iPhone 5 design resolution of 568×320.
Regarding multiple devices, SpriteKit works pretty well out of the box. You should look at the documentation on scaleMode, and take a look in GameViewController.swift. .AspectFit worked really well for me, nearly perfect across all devices apart from a little letterboxing on iPad; for the amount of effort involved, that is more than good enough.
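To illustrate, here is roughly what that setup looks like. The answer refers to the Swift template (GameViewController.swift and .AspectFit); this sketch shows the same two ideas, a fixed design resolution and a scaleMode, written in Objective-C to match the other snippets in this thread, so treat the exact names as illustrative rather than the template's own code:

```
#import <SpriteKit/SpriteKit.h>

// In your view controller (the Swift template does the equivalent in GameViewController's viewDidLoad).
- (void)viewDidLoad {
    [super viewDidLoad];
    SKView *skView = (SKView *)self.view;

    // Fixed design resolution (iPhone 5 landscape, as suggested above); SpriteKit scales it for you.
    SKScene *scene = [SKScene sceneWithSize:CGSizeMake(568, 320)];
    scene.scaleMode = SKSceneScaleModeAspectFit;   // the Objective-C spelling of .AspectFit
    [skView presentScene:scene];
}
```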
On a side note, I've found the following iPhone Resolution Guide resource useful in the past.

Why does the screen on my phone turn white when it detects the target?

I'm new to augmented reality, and I'm using Vuforia 4.2.3 and Unity 5. I followed all the steps and tried to do a test run, but whenever the camera detects the target, the whole screen turns white. I've tried many things but none of them worked. Can someone help me?
I had a problem that sounds similar.
There is a known bug that causes a white screen; this could be related to that. See here for more info.
What worked for me was to change the VideoBackground.shader code, as described on that thread.
Go to Project >> Qualcomm Augmented Reality >> Shaders
Double-click VideoBackground. This opens up Mono.
In the code, change where it says:
"queue"="geometry-11"
to this:
"queue"="Geometry"
Save, rebuild etc.
Worked for me.

OpenGL ES sprites no longer render after updating to iOS 7.1

I am using a simple 2D sprite class based on this tutorial to render PNG bitmaps to the screen:
http://www.raywenderlich.com/9743/how-to-create-a-simple-2d-iphone-game-with-opengl-es-2-0-and-glkit-part-1
Everything worked fine on both my iPhone 4S running iOS 6.1 and my iPhone 5S running iOS 7. Since updating to iOS 7.1, and updating my MacBook Air to Mavericks and Xcode 5.1, sprites no longer appear on the screen (I just get an empty white screen, which is the color I cleared the background to). When I build the app with Xcode 5.1 and run it on my iPhone 4S again, it still works.
Does anyone know what could be causing this? Has anyone run into this issue? I am having trouble getting to the source of the problem due to my lack of understanding of OpenGL ES among other things. :) My sprite class is exactly the same as the one in the tutorial.
Let me know if more details/code snippets are required.
I'm not even going to look at the sample code since it doesn't sound like it's the same as your current code. (You said 'based on'.) But I'll tell you how to find your problem.
First, clear to something other than white (like red) as a test.
If the screen turns red, you at least know that your view, your context, and OpenGL in general are working. That knocks out a lot of possible culprits.
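For reference, that test is just the standard clear call with a loud color; wherever your drawing code currently clears to white, change it to something like:

```
// Temporary test: clear to an unmistakable color so you can tell whether GL is rendering at all.
glClearColor(1.0f, 0.0f, 0.0f, 1.0f);   // red instead of white
glClear(GL_COLOR_BUFFER_BIT);
```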
Second, use the debug tools; chances are they will take you right to your problem. You have to run on a tethered device though, not the simulator. Run your app...
Click on FPS in the debug navigator, then click Analyze and be patient. It takes a one-frame snapshot of what's happening in OpenGL; it has to do a bunch of work to make that happen and it takes about 30 seconds. But then you'll have an interactive view that shows you the frame, lets you step through the draw calls and see what code issued them, and shows the frame as it draws each element. It's super cool actually. And it will probably show you an error message (in red).
My guess is that it's no longer loading your sprite images. Something probably changed between 6 and 7 or the versions of Xcode or of OSX. In that case the screen is blank because they're not loaded and therefore not being drawn.
EDIT:
I think the analysis will help find your problem. But to offer more possibilities, in my experience when nothing is drawing it's often one of these things:
An overall OpenGL issue - set your clear color to red or blue to test.
Shader didn't compile - always output shader compile errors so you know (see the sketch below).
Bad Shader math or logic - use gl_FragColor = vec4(0.0,0.0,1.0,1.0) or some contrasting color to test. Can you see your structures?
Program isn't getting an attribute. Did you remember glEnableVertexAttribArray?
Program isn't getting a uniform. Use the Analyze feature above to check the uniform values to make sure they made it to the shader.
(i'll add more if I think of them)
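On the shader-compile point above, the check is just a couple of GL calls after glCompileShader. A minimal sketch, assuming a helper along these lines (compileShader:type: is a hypothetical name; the GL calls themselves are standard OpenGL ES 2.0):

```
// Compile a shader and log the error instead of failing silently.
- (GLuint)compileShader:(NSString *)source type:(GLenum)type {
    GLuint shader = glCreateShader(type);
    const GLchar *src = (GLchar *)[source UTF8String];
    glShaderSource(shader, 1, &src, NULL);
    glCompileShader(shader);

    GLint status = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    if (status == GL_FALSE) {
        GLchar log[512];
        glGetShaderInfoLog(shader, sizeof(log), NULL, log);
        NSLog(@"Shader compile failed: %s", log);   // make the failure visible
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}
```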

iOS iPhone finger paint-like app, is there a built-in canvas or open source projects to help me get started?

I need to implement an on-screen drawing feature in one of my apps. I imagine it as an on-screen transparent overlay. I'm looking to be able to trace the finger path and leave a line, select colors, and have an erase/undo feature.
I did some research of what's currently on the app store, and a lot of those apps look similar and use similar brushes.
Something tells me that they look too much alike for this to be a coincidence. Does Apple provide any built-in finger painting canvas, or are there some widely-known open source projects for on-screen drawing?
Thank you!
I ran into the same thing while looking for examples when I got started with drawing, so I released my efforts as a sample project called SimpleDrawing. I tried to keep things basic while supporting most/all of the standard drawing tools and operations.
There is the GLPaint project, which you can get here.
Other than that, you could use CoreGraphics & -touchesBegan:withEvent: if you wanted.
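If you go the CoreGraphics route, a minimal sketch looks something like the view below. DrawingView is a hypothetical name; it just appends the finger's path to a UIBezierPath and strokes it in drawRect:, which covers the trace-a-line part (colors, erase, and undo would be layered on top of this):

```
// Hypothetical transparent overlay view that records and strokes the finger path.
@interface DrawingView : UIView
@property (nonatomic, strong) UIBezierPath *path;
@end

@implementation DrawingView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!self.path) {
        self.path = [UIBezierPath bezierPath];
        self.path.lineWidth = 5.0;
    }
    [self.path moveToPoint:[[touches anyObject] locationInView:self]];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [self.path addLineToPoint:[[touches anyObject] locationInView:self]];
    [self setNeedsDisplay];   // redraw with the extended path
}

- (void)drawRect:(CGRect)rect {
    [[UIColor blackColor] setStroke];
    [self.path stroke];
}

@end
```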

iOS ZXing with front or rear camera

I'm using ZXing and it's working fine in my new app, but I would like to add the option of using the front or rear camera. So far the only reference to this I've found is on the Google group, and it's not very clear what they mean by that. Any pointers on what I have to do to accomplish this? Thanks!
ZXWidgetController doesn't provide that functionality and it's not really set up to make it easy to change.
The code that needs to change is in - (void)initCapture. It calls [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]. This returns the default camera, and you don't want the default.
You need code similar to that in - (ZXCaptureDevice*)device in ZXCapture.mm. That code won't work out of the box (it's designed to work with both AVFoundation and QTKit), but it's the same idea: instead of using the default video input device, go through the devices and look at each device's position to find the one you want (see the sketch below).
It would be nice to port that code to the widget, but that hasn't happened at this point.
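For reference, the device-selection part of that idea in plain AVFoundation looks something like the snippet below; swapping it into ZXWidgetController's initCapture in place of the defaultDeviceWithMediaType: call is the kind of change described above (an untested sketch, so treat it as a starting point):

```
// Pick the front camera explicitly instead of taking the default device.
AVCaptureDevice *captureDevice = nil;
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
    if (device.position == AVCaptureDevicePositionFront) {   // use ...PositionBack for the rear camera
        captureDevice = device;
        break;
    }
}
if (!captureDevice) {
    // Fall back to the default if the requested camera isn't available.
    captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
```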