Is there, or will there be, a built-in feature to generate and access a depth map from the dual cameras on the iPhone 7 Plus?
We have all seen Apple's keynote where they demonstrated using the two cameras to create a shallow depth-of-field effect, but reading the API reference I can't find anything beyond how to access the raw input from the two cameras.
Focus Pixels are clearly something else, since they are supported by earlier devices.
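For reference, this is roughly how I'm getting at the raw dual-camera input today; a minimal sketch using the current AVFoundation names (on the original iOS 10.0 SDK the device type was called builtInDuoCamera rather than builtInDualCamera):

```swift
import AVFoundation

// Sketch: select the wide+tele dual camera on an iPhone 7 Plus.
// Uses the current device-type name; earlier iOS 10 SDKs spelled it .builtInDuoCamera.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInDualCamera],
    mediaType: .video,
    position: .back)

if let dualCamera = discovery.devices.first {
    let session = AVCaptureSession()
    if let input = try? AVCaptureDeviceInput(device: dualCamera),
       session.canAddInput(input) {
        session.addInput(input)   // raw frames from the dual camera; no depth map here
    }
}
```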
I got an answer for this on the Apple Developer Forums from a member of the Apple staff:
I'll send out a post soon about iOS 10 APIs that are specific to the iPhone 7 and 7 Plus cameras. The short answer is that no, depth maps will not be available in this release.
Related
There is definitely an infrared camera on the iPhone X. I wonder whether it is possible to capture photo/video from it using official APIs and Swift. All of Apple's official docs only cover using Face ID or measuring depth, and I can't find any info about whether this is possible.
P.S. There is only one Stack Overflow question on this: Is it possible to access the infrared camera on iPhone X?
But there is no clear answer. Thanks a lot.
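For what it's worth, the closest officially supported route I've found is capturing depth data from the TrueDepth camera; the infrared stream itself doesn't seem to be exposed. A minimal sketch, assuming iOS 11.1+ and the current AVFoundation API:

```swift
import AVFoundation

// Sketch: depth capture from the iPhone X TrueDepth (front) camera.
// Only the derived depth data is available, not the raw infrared frames.
let session = AVCaptureSession()

guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                           for: .video,
                                           position: .front),
      let input = try? AVCaptureDeviceInput(device: device),
      session.canAddInput(input) else {
    fatalError("TrueDepth camera not available")
}
session.addInput(input)

let depthOutput = AVCaptureDepthDataOutput()
if session.canAddOutput(depthOutput) {
    session.addOutput(depthOutput)
    depthOutput.isFilteringEnabled = true   // fill holes in the depth map
}
// Set a delegate conforming to AVCaptureDepthDataOutputDelegate to receive
// AVDepthData buffers, then call session.startRunning().
```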
We are a school district and are in the middle of deploying ~800 iPads, one for every teacher. Next year we'll probably be installing an Apple TV in every room to mirror the iPads wirelessly to the classroom projectors.
I would love to use Apple TV as our standard for mirroring all our Windows 7 laptops as well.
AirParrot (http://airparrot.com/) allows this from Mac OS X. Apple doesn't license the mirroring protocol, so the way AirParrot gets around it is by streaming the desktop as an H.264 "movie" that is sent to the Apple TV; the Apple TV thinks it's just playing a movie. From the reviews it seems they've gotten the lag down to a pretty acceptable level.
I can't see why this couldn't also be done for Windows 7; I just can't find an app out there that has done it.
Any ideas? I'm a .NET software developer. If anyone has at least some links on how to handle Apple TV video streams from .NET, that would be a good first step.
Thanks!
I'm working on a C# library for sending pictures/video to the Apple TV. I'm having trouble with video cutting out after 30 seconds (link to question), but hopefully I'll figure that out soon. Once that problem is overcome, you can probably work out how to generate a movie of the screen and stream it using the library code.
https://airlib.codeplex.com/
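As a first step on the protocol side: the unofficial AirPlay video protocol is essentially an HTTP POST to the Apple TV on port 7000. The sketch below (in Swift for brevity; it ports straight to an HttpClient call in .NET) is based on community reverse-engineering rather than any official Apple documentation, and the hostnames and stream URL are placeholders. Newer Apple TV software may also require device verification before accepting the request.

```swift
import Foundation

// Sketch of the reverse-engineered AirPlay "/play" request: it asks the
// Apple TV to fetch and play an H.264 stream from the given URL.
// "apple-tv.local" and "my-server.local" are hypothetical hostnames.
let appleTV = URL(string: "http://apple-tv.local:7000/play")!
let body = """
Content-Location: http://my-server.local/screen-capture.m3u8
Start-Position: 0.0

"""

var request = URLRequest(url: appleTV)
request.httpMethod = "POST"
request.setValue("text/parameters", forHTTPHeaderField: "Content-Type")
request.httpBody = body.data(using: .utf8)

URLSession.shared.dataTask(with: request) { _, response, error in
    // A 200 response means the Apple TV accepted the stream URL.
    print(response ?? error ?? "no response")
}.resume()
```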
I'm poking around the iOS documentation at the moment, in the Core Audio section.
I'm just trying to do pitch shifting without having to write my own processing callback, and I'm really confused because the documentation tells me one thing and the headers say another.
My first question is about kAudioUnitSubType_Pitch.
First, in the iOS section of the docs here the pitch unit is listed, but when I try to add it in code it doesn't show up in the code hints, and the Audio Unit header says it's for desktop only. Is it possible to use this in iOS 5 at all, or am I looking at the wrong docs?
Second, also in the iOS section of the docs here, I'm interested in kAudioUnitSubType_TimePitch. It is listed, but states iOS 2.0 through iOS 2.0. Does this mean you can't use it in iOS 5?
Could somebody give me some clarity on the issue?
Apple's time-pitch modification AU is currently not available in iOS 5, only in desktop Mac OS. The early AU docs have been corrected. Current iOS 5.x only supports a pitch resampler unit.
But there appear to be one or more commercial iOS library solutions for audio time pitch modification, if you don't want to roll your own.
ADDED later: Only the 1st-gen iPad is limited to iOS 5.x. iOS 7 includes the NewTimePitch Audio Unit, but this audio unit is (currently) lower in quality than the OS X TimePitch audio unit.
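For anyone on a newer SDK: since iOS 8, AVFoundation wraps the time/pitch processing in AVAudioUnitTimePitch, which gives you pitch shifting with no render callback at all. A minimal sketch (the file name is just a placeholder):

```swift
import AVFoundation

// Sketch: pitch-shift a file during playback with AVAudioEngine (iOS 8+).
// No render callback needed; AVAudioUnitTimePitch wraps Apple's time/pitch unit.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()
timePitch.pitch = 500    // shift up 500 cents (5 semitones)
timePitch.rate = 1.0     // keep the original tempo

engine.attach(player)
engine.attach(timePitch)

do {
    // "loop.caf" is a placeholder for whatever audio file you're playing.
    let file = try AVAudioFile(forReading: URL(fileURLWithPath: "loop.caf"))
    // player -> timePitch -> speaker
    engine.connect(player, to: timePitch, format: file.processingFormat)
    engine.connect(timePitch, to: engine.mainMixerNode, format: file.processingFormat)

    player.scheduleFile(file, at: nil, completionHandler: nil)
    try engine.start()
    player.play()
} catch {
    print("Audio setup failed: \(error)")
}
```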
Is it possible to apply the Photo Booth effects (twirl, squeeze, bulge) to an image?
The normal way in OS X to apply such filters is Core Image. Core Image isn't part of the current iPhone SDK, but Apple has announced that it'll be in iOS 5. That's currently under NDA, so it can't be discussed beyond the details Apple has made public on sites such as this, but given what is public I think it'd be a very good idea to ask again or to update your question once iOS 5 is released.
In the meantime, if you want to do the effect live, you're probably best off uploading the image to OpenGL and applying some sort of pixel shader, which is quite an undertaking, but previous Stack Overflow answers such as this one are likely to be helpful.
Core Image is now part of iOS 5. It has many filters you can use to create photo effects simply by combining them. You can also use the Photo Effects SDK, which is built on Core Image and comes with 35+ ready-to-use photo effects, and you can easily create your own photo effects with its built-in filter chaining support.
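As a concrete illustration, the twirl/bulge/squeeze effects map directly onto Core Image's distortion filters (CITwirlDistortion, CIBumpDistortion, CIPinchDistortion). A minimal sketch; I'm assuming a recent iOS SDK here, since the individual distortion filters arrived in different iOS versions:

```swift
import CoreImage
import UIKit

// Sketch: Photo Booth style "twirl" using CITwirlDistortion.
// CIBumpDistortion (bulge) and CIPinchDistortion (squeeze) take similar parameters.
func twirl(_ image: UIImage) -> UIImage? {
    guard let input = CIImage(image: image),
          let filter = CIFilter(name: "CITwirlDistortion") else { return nil }

    let center = CIVector(x: input.extent.midX, y: input.extent.midY)
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(center, forKey: kCIInputCenterKey)
    filter.setValue(input.extent.width / 2, forKey: kCIInputRadiusKey)
    filter.setValue(Double.pi, forKey: kCIInputAngleKey)

    let context = CIContext()
    guard let output = filter.outputImage,
          let cgImage = context.createCGImage(output, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```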
I have a 10MB QuickTime VR file and I was wondering if it would be possible to play it on an iPod/iPhone/iPad?
I've seen multiple messages about the subject, but nobody could give a straight answer on whether the iPhone fully supports this format, partly supports it, or doesn't support it at all. If the format is supported, which OS version supports it?
No, I don't have an iPhone at my disposal to check this. Unbelievable, right?
Gilad.
It's not possible to use QTVR at all; it has never been supported by Apple on the iPhone, but there are some other, similar approaches you can use.
Take a look at my old answer to a similar question:
How to rotate QTVR image 360 degree in iPhone?
There's an app called iPano which views QTVRs, both panoramas and objects. It's ideal for viewing them and it lets you keep a collection of them on your phone. And it's really neat on the iPad too!