DualShock 4 Controller Speaker Access - unity3d

Is there any way to route sound through an individual controller, i.e. play sound A in situation X from controller A and sound B in situation X from controller B?
A question asking the same thing already exists, but it's four years old, so I'm asking again. The person in the old question was using Rewired to access most of the DualShock 4 features, except for the sound, which is not yet possible in Rewired, as far as I can tell from the current documentation. Is there another way to achieve the above?

Unity only supports generic controller functionality, such as analog and button input, and has no native support for what you want. For specific functionality like playing audio from a DUALSHOCK®4 controller's speaker, you'll need the appropriate devkit access and licensing.
If you have a Unity PS4 license, there's built-in Unity support for what you want. As a PS4 developer, I've accessed features like the speaker and lightbar quite easily in the past. Note, however, that these features won't work on PC, as there are no drivers for the DUALSHOCK®4 controller that work outside of the PS4 environment.

Related

iPhone indoor location based app

I am researching how to create an app for my work that allows clients to download the app (preferably via the App Store) and, using some sort of WiFi triangulation/fingerprinting, determine their location for what is essentially an interactive tour.
Now, my question specifically is: what is the best route to take for the iPhone? None of the clients will be expected to have jailbroken iPhones.
To my understanding, this requires the use of WiFi data, which is a private API and therefore doesn't meet App Store requirements. The biggest question I have is: how does the American Museum of Natural History get away with using the same technology while remaining available on the App Store?
If you're unfamiliar with the American Museum of Natural History's interactive tour app, see here:
http://itunes.apple.com/us/app/amnh-explorer/id381227123?mt=8
Thank you for any clarification you can provide.
I'm one of the developers of the AMNH Explorer app you're referencing.
Explorer uses the Cisco "Mobility Services Engine" (MSE) behind the scenes to determine its location. This is part of their Cisco wifi installation. The network itself listens for devices in the museum and estimates their position via Wifi triangulation. We do a bit of work in the app to "ask" the MSE for our current location.
Doing this work on the network side was (and still is) the only available option for iOS since, as you've found, the wifi scanning functions are considered to be private APIs.
If you'd like to build your own system and mobile app for doing something similar, you might start with the MSE.
Alternatively, we've built the same tech from Explorer into a new platform called Meridian which provides location-based services on both iOS and Android. Definitely get in touch with us via the website if you're interested in building on that.
Update 6/1/2017
Thought I would update this old answer - AMNH is no longer using the Wifi-based system I describe above, as of a few years ago. They now use an installation of a few hundred battery-powered Bluetooth Beacons (also provided by Meridian). The device (iOS or Android) scans for nearby beacons and, based on their known locations and RSSI values, triangulates a position. You can read more about it in this article.
Navizon offers an indoor positioning solution that works for iOS as well as any other platform. You can check it out here:
http://www.navizon.com/product-navizon-indoor-triangulation-system
It works by triangulating the WiFi signals transmitted by the device. Since it doesn't require an app to run on the phone, it bypasses the iOS limitations and can locate any other WiFi device for that matter.
Google recently launched the Maps Geolocation API. You can use it for indoor tracking of devices, which essentially can be used to achieve something similar to what AMNH's app does.
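For illustration, the request shape is simple: you POST a list of observed WiFi access points and get back an estimated latitude/longitude plus an accuracy radius. A minimal sketch, with a placeholder API key and made-up MAC addresses; note that, as established above, iOS apps can't scan for access points themselves, so on iOS the scan data would have to come from the network side or another platform:

    #import <Foundation/Foundation.h>

    // Sketch: ask the Maps Geolocation API for a position estimate from
    // observed WiFi access points. The key and MAC addresses are placeholders.
    static void requestLocation(void) {
        NSURL *url = [NSURL URLWithString:
            @"https://www.googleapis.com/geolocation/v1/geolocate?key=YOUR_API_KEY"];
        NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
        request.HTTPMethod = @"POST";
        [request setValue:@"application/json" forHTTPHeaderField:@"Content-Type"];

        NSDictionary *body = @{ @"wifiAccessPoints": @[
            @{ @"macAddress": @"00:25:9c:cf:1c:ac", @"signalStrength": @(-43) },
            @{ @"macAddress": @"00:25:9c:cf:1c:ad", @"signalStrength": @(-55) }
        ] };
        request.HTTPBody = [NSJSONSerialization dataWithJSONObject:body
                                                           options:0 error:NULL];

        [[[NSURLSession sharedSession] dataTaskWithRequest:request
            completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
                if (data) {
                    // Response: { "location": { "lat": ..., "lng": ... }, "accuracy": ... }
                    NSDictionary *json = [NSJSONSerialization JSONObjectWithData:data
                                                                         options:0 error:NULL];
                    NSLog(@"%@ (accuracy: %@m)", json[@"location"], json[@"accuracy"]);
                }
            }] resume];
    }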
I would do this using Augmented Reality. There is a system sort of in place for this: the idea is that you place physical markers that have virtual information associated with them. I believe the system I saw used a type of barcode. When a user holds up the phone with the app, the app uses the camera to read the code and then displays the information. This could easily be used to make a virtual-tour-type app that is distributable through the App Store and doesn't even require a WiFi or 3G/4G connection, assuming you simply load your information and store it locally with your app. To update it, you just push an update through the App Store. Another solution is to use a SOAP/REST service and provide the information that way; this does not use private APIs, though it does require some form of internet connection. For this, you can see a question I asked about the topic a little while ago:
SOAP/XML Tutorials Question
In addition, you could load a map of your tour location, and based on what code is scanned you can locate the user on the map and give suggested routes based on interests etc.
I found this tutorial recently on augmented reality. I haven't gone through it, but if it's anything like the rest of Ray's tutorials, it will be extremely helpful.
http://www.raywenderlich.com/3997/introduction-to-augmented-reality-on-the-iphone
I'll stick around to clarify any questions or other concerns you may have with your app.
To augment the original answer for devs who were using Cisco MSE for indoor location: they now have an iOS and Android SDK that enables you to do indoor location using the MSE. A simulator can be used as well, so you can develop the app without building out the infrastructure to start with: https://developer.cisco.com/site/cmx-mobility-services/downloads/
For indoor location you can use Bluetooth LE beacons, since it's a very accessible technology nowadays. There are several methods (see the ranging sketch after this answer):
Trilateration: it uses 3 beacons, but with the noise and attenuation of Bluetooth signals it gets quite difficult to determine an exact position, and it's not easy to use more than 3 beacons to increase accuracy.
Levenberg-Marquardt method: used to solve non-linear least squares problems; it has shown good results for indoor positioning.
Dead reckoning: given an initial position, you can use the device's motion co-processor to calculate its path as it moves. Not that easy to implement, though.
I wrote a post on the topic, you can find more info here: http://bits.citrusbyte.com/indoor-positioning-with-beacons/
And you can use this iOS app for your own indoor positioning experiments: https://github.com/citrusbyte/beacons-positioning
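Whichever of those methods you build on top, the raw input on iOS comes from Core Location's beacon ranging, which reports each beacon's RSSI and a rough distance estimate about once per second. A minimal sketch, assuming a placeholder UUID and region identifier that would match whatever your beacons are configured to advertise:

    #import <CoreLocation/CoreLocation.h>

    @interface BeaconRanger : NSObject <CLLocationManagerDelegate>
    @property (nonatomic, strong) CLLocationManager *manager;
    @property (nonatomic, strong) CLBeaconRegion *region;
    @end

    @implementation BeaconRanger

    - (void)start {
        self.manager = [[CLLocationManager alloc] init];
        self.manager.delegate = self;
        [self.manager requestWhenInUseAuthorization]; // iOS 8+
        // Placeholder UUID: use the one your beacons actually advertise.
        NSUUID *uuid = [[NSUUID alloc] initWithUUIDString:
            @"E2C56DB5-DFFB-48D2-B060-D0F5A71096E0"];
        self.region = [[CLBeaconRegion alloc] initWithProximityUUID:uuid
                                                         identifier:@"tour.beacons"];
        [self.manager startRangingBeaconsInRegion:self.region];
    }

    // Delivered roughly once per second with all beacons currently in range.
    - (void)locationManager:(CLLocationManager *)manager
            didRangeBeacons:(NSArray *)beacons
                   inRegion:(CLBeaconRegion *)region {
        for (CLBeacon *beacon in beacons) {
            // rssi/accuracy are noisy; smooth them before feeding any
            // trilateration or least-squares solver.
            NSLog(@"major=%@ minor=%@ rssi=%ld accuracy=%.2fm",
                  beacon.major, beacon.minor, (long)beacon.rssi, beacon.accuracy);
        }
    }

    @end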
I doubt the American Museum is actually using private APIs; you'll probably find that the routers that have been set up each serve different responses, so the app can detect its position in the museum.
If you are looking for a cheaper way to do the same task, you could put up signs with QR codes and use an open source library to let users scan these barcodes as they move through the museum, updating the on-screen content accordingly. On an even more low-tech level, you could just tag each area with unique numbers and distinguish them that way.
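For the QR route, on iOS 7 and later you don't even need a third-party library: AVFoundation detects QR codes natively. A minimal sketch, assuming each sign's code is a key into tour content you've stored locally:

    #import <AVFoundation/AVFoundation.h>

    @interface QRScanner : NSObject <AVCaptureMetadataOutputObjectsDelegate>
    @property (nonatomic, strong) AVCaptureSession *session;
    @end

    @implementation QRScanner

    - (void)start {
        self.session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
        [self.session addInput:input];

        AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
        [self.session addOutput:output]; // add before setting metadata types
        [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
        output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];

        [self.session startRunning];
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputMetadataObjects:(NSArray *)metadataObjects
           fromConnection:(AVCaptureConnection *)connection {
        for (AVMetadataMachineReadableCodeObject *code in metadataObjects) {
            // code.stringValue identifies the exhibit; look up its local content.
            NSLog(@"Scanned: %@", code.stringValue);
        }
    }

    @end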

VGA Output From iPad

Does Apple support mirroring of the iPad on a TV? Can anyone give me some idea?
Read this article:
There’s been a lot of confusion about how the iPad VGA Adapter works. I received mine today, and I thought I’d try to clear things up a little (and give you some code to play with, if you’re an iPad developer with a VGA adapter of your own).
The first thing to understand is that the adapter does not mirror your iPad screen. You can’t plug your iPad into your TV or monitor and see all your apps on the big screen, like Steve Jobs does when he’s giving a demo.
Apps that want to support external display via the adapter must explicitly do so; the developer has to write code to support it. There are some standard iPhone OS-supplied APIs which will automatically do the right thing (such as video playback via standard controllers), but generally you won’t see anything on your external display unless the app you’re using has taken steps to put something there. That’s the “bad” (though surely not surprising) news.
The good news is that it’s trivially easy to support external display from your app if you’re a developer; the connected display just shows up as another UIScreen object. I made a sample project (which you can download as a zip archive here) that shows how to do it.
It’s basically just a nib with two windows (one for showing on the iPad, and one for showing on the connected external display), and a tiny bit of code to make it work.
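To make the article's point concrete: the connected display just shows up as a second UIScreen, and you give it content by attaching a UIWindow to it. A minimal sketch of the idea, written as methods you'd put in your app delegate (externalWindow is a hypothetical property added to keep the window alive):

    #import <UIKit/UIKit.h>

    // Call once at launch: handle a display that's already attached,
    // and watch for one being plugged in later.
    - (void)setUpExternalDisplay {
        if ([UIScreen screens].count > 1) {
            [self showContentOnScreen:[UIScreen screens][1]];
        }
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(screenDidConnect:)
                                                     name:UIScreenDidConnectNotification
                                                   object:nil];
    }

    - (void)screenDidConnect:(NSNotification *)note {
        [self showContentOnScreen:note.object];
    }

    - (void)showContentOnScreen:(UIScreen *)screen {
        UIWindow *window = [[UIWindow alloc] initWithFrame:screen.bounds];
        window.screen = screen; // assign the screen before showing the window
        window.rootViewController = [[UIViewController alloc] init];
        window.rootViewController.view.backgroundColor = [UIColor blackColor];
        window.hidden = NO;
        self.externalWindow = window; // hypothetical property; must be retained
    }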

Record from either or both microphones on iPhone 4

The iPhone 4 has two built-in microphones: one in the standard place, and one sitting next to the headphone jack. I know this new mic is used for noise reduction in calls, but it is also used for FaceTime, videoing, speakerphone calls, etc.
I have a valid reason to want to choose which mic to record from, be it the new one or the old one, AND even both at the same time.
Does anyone know how to route AVAudioRecorder so I can choose any of the above options for what is being recorded?
Thanks in advance
I know this is an ancient question, but for future reference in case others are looking for the same answer, I'll put some information here. There is a section in the AVAudioSession documentation that explains that, since iOS 7, there is a way to do this explicitly:
Using APIs introduced in iOS 7, developers can perform tasks such as locating a port description that represents the built-in microphone, locating specific microphones like the "front", "back" or "bottom", setting your choice of microphone as the preferred data source, setting the built-in microphone port as the preferred input and even selecting a preferred microphone polar pattern if the hardware supports it. See AVAudioSession.h.
In older OSes, you could only choose the AVAudioSessionMode to affect the choice of microphone.
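As a concrete sketch of those iOS 7 APIs: find the built-in microphone port, pick the data source you want (front, back, or bottom), and set it as preferred before starting your AVAudioRecorder. Error handling omitted; note this selects one microphone at a time, so it doesn't cover the "both at once" part of the question:

    #import <AVFoundation/AVFoundation.h>

    // Sketch: route recording input to the front-facing built-in microphone.
    static void preferFrontMicrophone(void) {
        AVAudioSession *session = [AVAudioSession sharedInstance];
        [session setCategory:AVAudioSessionCategoryPlayAndRecord error:NULL];

        // Locate the built-in microphone among the available input ports.
        AVAudioSessionPortDescription *builtInMic = nil;
        for (AVAudioSessionPortDescription *port in session.availableInputs) {
            if ([port.portType isEqualToString:AVAudioSessionPortBuiltInMic]) {
                builtInMic = port;
                break;
            }
        }

        // Prefer the front-facing data source (also available:
        // AVAudioSessionOrientationBack, AVAudioSessionOrientationBottom).
        for (AVAudioSessionDataSourceDescription *source in builtInMic.dataSources) {
            if ([source.orientation isEqualToString:AVAudioSessionOrientationFront]) {
                [builtInMic setPreferredDataSource:source error:NULL];
                break;
            }
        }

        // Make the built-in mic the session's preferred input; an
        // AVAudioRecorder started after this will record from it.
        [session setPreferredInput:builtInMic error:NULL];
    }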

What does iPhone OS 3.0 need from a programming perspective?

iPhone OS 3.0 is being announced and previewed next week (March 17).
We all know the feature set users want. Copy/paste, MMS, Flash on iPhone, etc.
We'll see about those.
What I'm interested in is what the development community feels the SDK is missing, or in need of, to make programming for the platform easier and more productive.
A more complete Interface Builder with support for custom palettes and all sorts of goodies like that.
Better control over the keyboard.
Better unit testing support. (Unit testing can be done, but only on the simulator, and it's very awkward to set up.)
Push notifications. Please.
A more accurate simulator, i.e. one with a more accurate set of frameworks.
The ability to easily build views like the Mail compose window.
For that matter, an in-application compose window.
A better way for apps to share data locally than by invoking URLs.
Access to the calendar, notes, mail (possibly read-only), and bookmarks (again, read-only) databases. Maybe even limited access to the iPod database—even just the ability to read song metadata and access and change the playing song would be helpful.
Some sort of middle ground between UILabel and UIWebView that allows for formatted text without a huge hassle.
More built-in toolbar icons.
The return of the "glass" button style that was in the beta SDK.
A few useful internal views, like UIProgressHUD, exposed.
And last but not least...
A pony.
An easy JavaScript bookmarklet installation method for Mobile Safari. (OpenRadar: 1, 2)
UIWebView needs more of UIScrollView's properties and methods, such as contentOffset.
More configurability on some of the built-in behaviors and views, e.g. the button text on UITableViewCell's "Delete" button, or the styles and text of UIAlertSheet/UIAlertView buttons. (Some of these can be done today with undocumented calls, but I'd rather not rely on those.)
More flexibility from UINavigationController, such as the ability to push/pop views that selectively don't display the navigation bar but using the same animations and stack, or more customizability over the navigation bar button labels and behaviors.
The ability to restrict interface orientation per UIViewController, not just accept/reject changes via shouldAutorotate. E.g. I want my main content view to be autorotatable, but I want my navigation hierarchy and settings screens to always display in portrait, even if the content view was rotated to landscape.
libxml and its handy DOM XML parser instead of the SAX-based NSXMLParser.
libcurl w/SSL, or more options and functionality for NSURLConnection.
Ability to check whether a URL scheme is registered. This could be used for apps to detect whether other specific apps are installed, and enable functionality selectively, e.g. when Instapaper detects Tweetie is installed, it can offer a "Post with Tweetie" button. (Disclaimer: That was a plug. I make Instapaper.)
I'm sure I'll think of more, but overall, I'm very happy developing for the iPhone. I'm amazed at the quality and sophistication of the iPhone OS, the SDK, and the development tools given how incredibly young they all are.
I'm surprised no one has mentioned garbage collection yet. Objective-C 2.0 on the Mac supports optional garbage collection. I don't really see any reason it wouldn't work just fine on the iPhone as well and it would eliminate much of the tedium of having to explicitly release objects all over the place.
What I'm hoping most for is to allow iPhones to talk to each other either via Bluetooth or some other means. Granted, they can talk via Bonjour if they are on the same Wi-Fi network, but that's just not convenient enough in 2009. If I'm out with a friend and want to play a multi-player game we first have to find a Starbucks or whatever the heck to get on the same Wi-Fi network. Also, think of the ridiculous amount of social apps you could have if iPhones could talk to each other without needing Wi-Fi. Exchange business cards, flirt with the cute girl over there, etc.
From a PURE programmer's perspective: make Xcode as helpful an IDE as Eclipse or IntelliJ are in the Java world. There's so much time I waste on stupid stuff that the IDE could have found for me as I typed it.
I also don't understand why I can't color buttons without having to use images.
Better multitasking is absolutely key at this point. Android's got it, Palm's WebOS has it - both, it seems, in largely unrestricted and well-implemented fashion. Possibilities:
Push notifications with a good UI (message stack in addition to badging/sound/whatever - if they have to have an extra approval step so apps can't be obnoxious, so be it)
Multiple full processes (not possible with current OS, I realize, but then I've never seen a good explanation why the iPhone doesn't support virtual memory)
Smaller "background" versions of apps that can run in the background - no GUI and a significantly tighter memory constraint
A good mapping API. Let us access the Google Maps abstraction that the Maps application uses!
More Interface Builder goodness
Better simulator
Smart inbox. Incoming messages are routed to installed handlers based on type.
Synchronisation framework that simplifies syncing with the desktop & MobileMe.
Decent landscape support, without the multitude of bugs, especially for the camera picker. Better support for rotation and more control of it.
Access to EXIF data on images from the picker, so we can tell their location
Deeper access to the camera API, so that we are not rail-roaded into the standard photo taker / picker
Push notifications that can launch an application. (In lieu of full multi-tasking, which I don't think we'll get and which could be problematic.)
Better, more intuitive keyboard controls.
API for inter-application messaging.
Access to data from Calendar, iTunes, Mail, Notes and more (with user's permission)
A more accurate simulator, with, for example, ways to limit bandwidth, and use the Mac's camera to actually take a photo.
Phone-to-phone Bluetooth for data exchange
Access to more of the views used by iPhone apps, e.g. the progress HUD, email "blobbing" mechanism for email addresses, thumbnail scrollers, HUD brought up in Photos app, and more.
Less sandboxing. It won't likely happen, but it would always be appreciated for an app to have slightly more power than they currently do (actual filesystem access, for example. even if it was read-only access, it would still allow for more interesting applications to exist).
EDIT: Also, access to the copy/paste API. But I hope that one is obvious to Apple.
My list:
More full-featured IB support as the Mac has
Inter-app Data transfer mechanism (could be C&P, but does not have to be)
Greatly improved camera API with deeper level of control and more flexibility
SDK access to bluetooth and more support for protocols
A real Objective-C framework around the address book, like the Mac has today.
Warnings similar to the location warning when an app tries to access address book data.
I'm sure whatever they actually have prepared, there will be a few interesting twists.
Ability to send SMS messages without having to launch the SMS client and have the user type the message.
Access to the raw camera data so that things can be done without having to take a picture and wait for it to save (like you can do with Android)
Push notifications that can launch tasks... would need to be user-controllable.
A camera that can focus (I know... have to wait for the next iPhone for that... if they decide to put it in...)
A UIKit-level drawing API.
We all know the feature set people want. Copy/Paste, MMS, Flash on iPhone, etc.
I would have thought those specific items were down the SO wish list (although it seems I'm wrong looking at the votes on this comment :-).
MMS is a pretty pointless app when you have email. Flash is not an OS issue - Flash could be delivered today.
I don't even want push notifications - they're just a patch, I want background apps. I also want fixes for all the broken APIs like Camera, video and landscape support. Support for CoreImage filters would be nice too but probably too much to wish for.
[[ABAddressBook sharedAddressBook] me] for being able to use the owner's Zip code, phone number, or whatever.
Ability to download files to local storage and sync them back to iTunes or your hard drive
Get EXIF data from photos
Pull all photos at once
Pull all contacts at once
Control screen brightness
Access to music in iPod section
Read access to email and text messages
Access to Safari cookies (so maybe I could make some kind of keep-me-logged-in app.)
Fix table view in landscape mode
New camera API with direct access to the camera
Distribution code signing automatically when uploading to the App Store (instead of code signing in Xcode)
Ability to request more memory so users don't have to reboot their phones to get rid of background apps
A non-Mac-based development environment.

which features do you look forward to the most in iPhone SDK 3?

Which of the new features are you looking forward to the most in iPhone SDK 3.0?
Is it one of the main advertised six new things, or something smaller? Something in the "1,000 new APIs", perhaps?
Phone-to-phone communication via Bluetooth seems like it will be terribly useful for some apps I am writing. No longer do you have to input all the data you want to store yourself; you can share some of it with other iPhone users.
Not really a feature, but the best thing about developing the iPhone SDK further is the great frameworks that arise. There are some really, really great frameworks out there already (like the Three20 project) which will become even better with the new 3.0 SDK.
My real excitement will take over once they let us run background processes. Maybe in 4.0?
Video! The ability to write decent tools for mobile video uploads is a big draw.
MapKit by far will bring the biggest change sweeping across the app space.
My personal favorite is that we can finally easily track upload progress of large files (like images).
I really, really want to see fixes in the camera API so that it isn't either broken (2.2.1) or forcing a switch to portrait (3.0).
Apart from that, the most useful features to me are:
Push notifications. Great for making an app more sticky - you can let the user know that something of interest to them has happened.
CoreData - I've been using a third-party SQL layer, but it's a little buggy and no longer supported.
Peer-to-peer Bluetooth, as the poster above said, is also useful for local data exchange.
And the least useful? Cut and paste. I actually want to disable it in my app (to discourage people from copying content) - and it doesn't look as though you can (yet).
Bluetooth phone-to-phone communication with GameKit will enable a host of currently impossible applications. Multiplayer games with no WiFi network needed and data exchange between two phones are obvious use-cases.
I'd also like to see - not currently included in the betas - a decent camera API that allowed us to customize the appearance of the capture screen, and as another poster said, have it work properly in landscape and portrait mode.
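For reference, the peer-to-peer feature discussed above surfaced in the 3.0 SDK as GKSession (plus GKPeerPickerController for the stock connection UI). A minimal sketch of two nearby phones finding each other and exchanging a blob of data; the session ID and payload are made up for illustration:

    #import <GameKit/GameKit.h>

    @interface PeerLink : NSObject <GKSessionDelegate>
    @property (nonatomic, strong) GKSession *session;
    @end

    @implementation PeerLink

    - (void)start {
        // Both devices must use the same session ID to find each other.
        self.session = [[GKSession alloc] initWithSessionID:@"com.example.cardswap"
                                                displayName:nil
                                                sessionMode:GKSessionModePeer];
        self.session.delegate = self;
        [self.session setDataReceiveHandler:self withContext:NULL];
        self.session.available = YES; // advertise and browse simultaneously
    }

    - (void)session:(GKSession *)session
            didReceiveConnectionRequestFromPeer:(NSString *)peerID {
        [session acceptConnectionFromPeer:peerID error:NULL];
    }

    - (void)session:(GKSession *)session peer:(NSString *)peerID
            didChangeState:(GKPeerConnectionState)state {
        if (state == GKPeerStateConnected) {
            // Made-up payload: e.g. a business card to exchange.
            NSData *card = [@"vCard payload here" dataUsingEncoding:NSUTF8StringEncoding];
            [session sendDataToAllPeers:card withDataMode:GKSendDataModeReliable error:NULL];
        }
    }

    // Handler registered via setDataReceiveHandler:withContext:.
    - (void)receiveData:(NSData *)data fromPeer:(NSString *)peer
              inSession:(GKSession *)session context:(void *)context {
        NSLog(@"Got %lu bytes from %@", (unsigned long)data.length, peer);
    }

    @end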