Unity 3D: Rear camera in device is behaving weird

I am working on taking a snapshot with my device camera. The front camera is working fine, but the rear camera is not up to the mark: the image comes out upside down. I don't know why it is behaving so strangely.
Can anyone help me out?
Cheers!!

It's difficult to come up with a simple answer for your problem; it depends on whether you run it on iOS or Android. In general you need to adjust (rotate and flip) the resulting image based on the mirror and rotation values reported by the WebCamTexture (videoVerticallyMirrored and videoRotationAngle).
There is a toolkit available on the Asset Store that helps with some of these issues: Camera Capture Kit ( https://www.assetstore.unity3d.com/en/#!/content/56673 ). Since the source is included for both iOS and Android, it can be a good foundation for tweaking and solving the issue in your game, or it may simply be a plug'n'play solution for your needs. It can save you weeks in the long run, because there is a lot of tweaking involved in getting camera capture right in Unity. Furthermore, there is functionality to enable and disable the flashlight, which might come in handy as well.
There is also a demo app included which is quite similar to what you have already been doing on your screens.
Cheers!

Related

Built game looks duller than what is seen in the editor

I built my Unity game through Xcode onto an iOS device for testing, and the colors look duller and more pastel-like than in the Game view in the editor. The exact same thing has also happened on a modern Android device. How can I make the colors in the built game better reflect the colors in the editor?
EDIT: I have sent a file to my phone and found out that the phone displays colors differently than my monitor. I'm sorry if I'm asking for quite a bit here, but... is there any way to make the game look as I intend it to? The fact that I never see exactly how my game will turn out seems kinda... awkward. Thanks.
I'm not sure if you already tried this, but you might want to use Unity Remote for your iOS device, so you can check the color differences in real time instead of having to build it every time.
But other than that, as the comments said, it sounds like you should try to adjust your monitor's display settings for a better color match between the built game and what you see in the editor.

Screen sizes in Xcode 7 when using SpriteKit

I am relatively new to SpriteKit and have been attempting to create my first basic game. All the physics and other basics seem OK, but for some reason whenever I build and run, the screen dimensions are off (it looks like the default is 1024×768).
I'm pretty sure I'm missing something fundamental here, but it isn't immediately obvious how to adapt the scene to any size of iPhone screen (which is my ultimate goal).
My question is whether this is actually just a settings issue, or is it necessary to implement code?
Thanks in advance and have a great day! :)
To answer the first part, you can easily change the size of your scene.
If you take the default GameScene, click outside the scene and look at the Attributes Inspector, you will see the default size of 1024×768. Personally, for landscape, I tend to work with an iPhone 5 design resolution of 568×320.
Regarding multiple devices, SpriteKit works pretty well out of the box. Look at the documentation for scaleMode and take a look in GameViewController.swift. .AspectFit worked really well for me, nearly perfect across all devices apart from a little letterboxing on iPad; for the amount of effort put in, that was more than good enough.
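To make that concrete, here is a minimal sketch of the view controller setup (Swift 2 syntax to match the Xcode 7 template; the GameScene class and the 568×320 design size are just the examples from above, so adjust them to your project):

import SpriteKit
import UIKit

class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let skView = self.view as! SKView

        // Build the scene at a fixed "design resolution" (landscape iPhone 5 here)
        // and let SpriteKit scale it to whatever screen the game runs on.
        let scene = GameScene(size: CGSize(width: 568, height: 320))

        // .AspectFit shows the whole scene and letterboxes where the aspect ratio
        // differs (the small bars on iPad); .AspectFill crops instead, .ResizeFill stretches.
        scene.scaleMode = .AspectFit

        skView.presentScene(scene)
    }
}

The same idea applies if you load the scene from the .sks file instead of constructing it in code; the important part is setting size and scaleMode before presenting the scene.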
On a side note, I've found the following iPhone Resolution Guide resource useful in the past.

iOS ZXing with front or rear camera

I'm using ZXing and it is working fine in my new app, but I would like to add the option of choosing the front or rear camera.
So far the only reference to this I've found is on the Google group,
but it is not very clear what they mean by that,
so any pointers on what I have to do to accomplish this?
Thanks!
ZXWidgetController doesn't provide that functionality and it's not really set up to make it easy to change.
The code that needs to change is in - (void)initCapture. It calls [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]. This returns the default camera, and you don't want the default.
You need code similar to that in - (ZXCaptureDevice*)device in ZXCapture.mm. That code won't work out of the box (it's designed to work with both AVFoundation and QTKit), but it's the same idea: instead of using the default video input device, go through the devices and look at each device's position to find the one you want.
It would be nice to port that code to the widget, but that hasn't happened at this point.
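If it helps, the device-selection part looks roughly like this (sketched in Swift 2-era AVFoundation for brevity; ZXWidgetController itself is Objective-C, so you would translate this into initCapture, and cameraAtPosition is just a hypothetical helper name):

import AVFoundation

// Return the camera at the requested position (.Front or .Back),
// falling back to the system default if no matching device is found.
func cameraAtPosition(position: AVCaptureDevicePosition) -> AVCaptureDevice? {
    let devices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo) as! [AVCaptureDevice]
    for device in devices where device.position == position {
        return device
    }
    return AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
}

In initCapture you would then build the AVCaptureDeviceInput from the device returned here instead of calling defaultDeviceWithMediaType: directly.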

iPhone camera application

While in graduate school for biomedical engineering I developed a device we called a "vein finder". It was a simple device that was good enough for our engineering school to patent.
I think it would be very easy to use the iPhone camera to develop an app whereby MDs/nurses/EMTs could easily identify peripheral veins that are not visible to the naked eye. This would be invaluable for starting IVs and giving bedside medications. The vein finder was especially helpful for patients in shock who had poor venous filling and therefore didn't have veins that "popped into view" with a tourniquet.
It would require using the iPhone light at specific wavelengths... anybody have any idea if that is possible?
I don't think you have any control over the flash light that illuminates images. You can only turn it on and off. I also doubt you would be able to get the specific wavelengths you need.
My suggestion would be to look into building a peripheral light device which plugs in to the headphone jack for power and has the necessary functions for emitting light at different wavelengths. That way, you would be able to get the exact result you require.
You may also need to look into the camera itself, as it might not be able to capture light at the wavelengths you require. Hope that helps!
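Just to illustrate the on/off limitation: toggling the LED (torch) via AVFoundation is about all the control you get. A rough sketch (Swift 2-era syntax; setTorch is just a hypothetical helper name):

import AVFoundation

// Toggle the camera LED (torch). The API only lets you switch it on or off
// (and, on later iOS versions, set a brightness level) - not choose a wavelength.
func setTorch(on: Bool) {
    let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    guard device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .On : .Off
        device.unlockForConfiguration()
    } catch {
        print("Could not lock the camera for configuration: \(error)")
    }
}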
Do you mean overlaying veins over the picture on the camera? It would be easy if the veins are always in the same place... What would your imagined procedure be for using this app?
EDIT: You cannot manipulate the iPhone light to emit different frequencies; you can only turn it on and off. You would need a separate external light for that.
To work around the fact that you can't set the wavelength of the built-in light, you should use an external light along with the iOS device.
Apologies for massive necro!
The LED light of the iPhone is not the problem - most white light sources contain IR or near-IR (in fact they are made up of the entire visible spectrum). Daylight works very well too.
The problem is the receiver. Most camera lenses, including those in smartphones, have an IR filter, which cuts out the useful part of the spectrum. All you need is a digital camera with the IR filter removed that has a viewfinder (where you'll see the veins).

water effect in cocos2d

I'd like to have a water effect on a background layer in my app. The effect doesn't need to react to touch or anything - it just needs to wave an image a little bit.
CCWaves3D seems OK, but it leaves nasty black artifacts around the edges when I run it. The same goes for CCShaky3D. CCLiquid brings my app down from 20 fps to 5 fps.
Is there any other effect I might want to try out? Or perhaps I'm using the current effects in a wrong way?
id shaky = [CCShaky3D actionWithRange:4 shakeZ:NO grid:ccg(15,10) duration:4];     // black artifacts around the edges
id liquid = [CCLiquid actionWithSize:ccg(15,10) duration:1];                       // drops the frame rate from 20 fps to ~5 fps
id wave = [CCWaves3D actionWithWaves:18 amplitude:80 grid:ccg(15,10) duration:10]; // black artifacts around the edges
Bonus question - where can I find any good documentation for cocos2d effects? I found the default cocos2d docs utterly useless and wasted a couple of hours Googling before asking this question :/
I have noticed performance issues when building/running in debug mode. Have you tried to build/run in release mode? Also, are you experiencing this on the device and not just on the simulator?
Unfortunately, I have not found alternative documentation specifically for cocos2d effects. Here are a few links to posts and sites I have gathered, covering many different resources: tutorials, tools for making tile-map games, using Zwoptex for making sprite sheets, using Vertex Helper for making a vertices plist file for Box2D/Chipmunk collision detection (instead of just rectangles), and sites for images and sounds:
Cocos2d Resources
Need 2D iPhone graphics designed
http://www.learn-cocos2d.com/knowledge-base/tutorial-professional-cocos2d-xcode-project-template/
I have found Ray's Tutorials especially helpful along with viewing the test applications included with cocos2d.
Happy coding!