Using ARKit to process online video, is it possible?

Apple ARKit is awesome; however, it looks like it can only take the device's front or rear camera as the video source. My question is whether it is possible to use an online video, or some other video stream, as the video source for ARKit.

Welcome! I don't think it is possible, because ARKit fuses the camera feed with the accelerometer and gyroscope to work out the device's position and scan the world; I don't see how that could work over a video. Direct camera access is also a basic requirement of ARKit.
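To illustrate: there is no public ARKit API for substituting a video file or network stream as the session's input; the session is always driven by the live camera. A minimal sketch of the only supported path (assuming a SceneKit-backed ARSCNView):

```swift
import ARKit

// ARKit's session always runs on the live camera plus motion sensors;
// there is no public hook for feeding it a prerecorded or streamed video.
// The supported path is to check for world-tracking support and run the
// session on the device camera:
func startWorldTracking(in sceneView: ARSCNView) {
    guard ARWorldTrackingConfiguration.isSupported else {
        // Requires an A9-or-later device with camera access granted.
        print("World tracking is not supported on this device")
        return
    }
    sceneView.session.run(ARWorldTrackingConfiguration())
}
```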

Related

Recording vr motion video

I want to record a VR motion video of my house using A-Frame so that I can show a demo view of the house. Which JS files need to be included? How can I do it?
I assume you just want to record 360º video; if so, a 360º camera is the place to start. See: http://thewirecutter.com/reviews/best-360-degree-camera/
You may want to test things out on your device (Gear VR, Cardboard, Vive, whatever) before investing in a camera. Video playback in A-Frame may have some issues, particularly on mobile devices.
Example: https://aframe.io/examples/showcase/videosphere/
Hi Kumar, yes, you can do this without a 360º camera, using a normal mobile phone, but to place that video you have to use the latest version of Unity, which supports not only the 360º video format but also a plane on which you can put the video as a medium.
If you are using Unity 4.6 to Unity 5.3, you have to use a package such as Easy Movie Texture, which gives you different options for placing video:
https://github.com/maazirfan/Easy-Movie-Texture-for-Unity
You can then use it wherever you want: VR, Gear VR, HTC Vive.

OpenCV detect iphone orientation

I have a site where users can upload video. When testing some uploads that are processed with OpenCV and Python, I found that a video recorded on an iPhone is always treated as if it had been taken in landscape mode with the phone rotated 90 degrees to the left, so videos shot in portrait mode come out sideways, and videos taken in the other landscape orientation (90 degrees to the right) come out upside down.
I know I can use OpenCV to rotate videos, but is there a way to detect:
a) whether the video was even taken with an iPhone, and
b) if so, what the orientation should be, i.e. how much to rotate the video by?
OpenCV is a computer vision library; for your problem you can't use OpenCV (AFAIK). What you need is the video's metadata, which contains all the information you need about the video, including its orientation. You should look into how to extract metadata from a video file.
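For illustration, on Apple platforms the rotation an iPhone records is exposed as the video track's preferredTransform; in a Python pipeline, a tool such as ffprobe reports the same rotation metadata. A rough sketch of reading it with AVFoundation (the degree mapping below is the commonly used one and is an assumption worth verifying against your own files):

```swift
import AVFoundation

// Sketch: recover the display rotation from the QuickTime metadata an
// iPhone writes. OpenCV ignores this transform and hands you raw frames,
// which is why portrait videos come out sideways.
func rotationDegrees(forVideoAt url: URL) -> Int {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else {
        return 0 // no video track (or not a video file at all)
    }
    let t = track.preferredTransform
    switch (t.a, t.b, t.c, t.d) {
    case (0, 1, -1, 0):  return 90   // portrait
    case (0, -1, 1, 0):  return 270  // portrait, upside down
    case (-1, 0, 0, -1): return 180  // landscape, rotated the "wrong" way
    default:             return 0    // landscape as recorded
    }
}
```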
Good luck!

Is it possible to programmatically capture iPhone 5S slow motion video?

I couldn't find an answer to this question, and looking at Apple's own apps like iMovie on iOS, the video picker does not offer a slow-motion option on the iPhone 5S.
The image picker offers very little control over the video. If you are willing to dive deeper into the APIs, you can use AVFoundation to capture your video and manipulate the camera properties as you see fit.
iOS 7 introduces a new AVCaptureDeviceFormat class that gives you the maximum and minimum supported frame rates for the capture device, and you can use these to set a custom frame rate on the camera itself. I don't have an iPhone 5S to hand, so I can't verify whether this API goes all the way up to 120 FPS.
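The pattern, sketched below, is to walk the device's formats, pick the frame-rate range with the highest maxFrameRate, and lock that format and frame duration in. On an iPhone 5S the back camera should report a 120 fps range here, but treat that as an assumption to verify:

```swift
import AVFoundation

// Sketch: pick the capture format with the highest supported frame rate
// and lock the device to it (iOS 7+ AVCaptureDeviceFormat APIs).
func configureHighestFrameRate(on device: AVCaptureDevice) throws {
    var bestFormat: AVCaptureDevice.Format?
    var bestRange: AVFrameRateRange?

    for format in device.formats {
        for range in format.videoSupportedFrameRateRanges {
            if bestRange == nil || range.maxFrameRate > bestRange!.maxFrameRate {
                bestFormat = format
                bestRange = range
            }
        }
    }

    guard let format = bestFormat, let range = bestRange else { return }

    try device.lockForConfiguration()
    device.activeFormat = format
    // Frame duration is the inverse of frame rate; minFrameDuration
    // corresponds to the range's maximum frame rate.
    device.activeVideoMinFrameDuration = range.minFrameDuration
    device.activeVideoMaxFrameDuration = range.minFrameDuration
    device.unlockForConfiguration()
}
```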

How to record the screen to video on iPhone with OpenGL (video preview layer) and UIKit elements?

I have searched everywhere and tried mixing and matching different bits of code, but I haven't found anything that works, or anyone asking the same question.
Basically, I want to be able to create video demos of iPhone apps that include standard UIKit elements as well as the image coming from the camera (the video preview layer). I don't want to use AirPlay or the iOS Simulator to project onto the desktop and then capture, because I want to be able to make videos outside, in public. I have successfully been able to capture the screen with this code, but the video preview layer comes out blank. I read that this is because the preview layer is rendered with OpenGL, and what I'm capturing comes from the CPU, not the GPU. I have successfully used GPUImage from Brad Larson to capture the video preview layer, but it doesn't capture the rest of the UIView. I have seen code that combines both and converts the result to an image, but I'm not sure whether that would be too slow for real-time video capture. Can someone point me in the right direction?
It might not be the cleanest solution, but it will work nonetheless: did you consider jailbreaking? I hope Apple doesn't sue me for this one, but if you really want to record your screen, then simply install a screen recorder. Enough options can be found: http://www.google.be/search?q=iphone+jailbreak+record+screen
And if you don't like it: restore your phone from a previous backup.
(For the record: I'm against jailbreaking and am posting this purely from a productivity point of view.)
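For a non-jailbreak, in-app route that postdates this exchange: ReplayKit (iOS 9 and later) records the whole screen, including GPU-rendered layers such as the camera preview. A minimal sketch, with function names of my own choosing:

```swift
import ReplayKit
import UIKit

// RPScreenRecorder captures everything on screen, including GPU-rendered
// layers such as a camera preview, without jailbreaking.
func startDemoRecording() {
    let recorder = RPScreenRecorder.shared()
    guard recorder.isAvailable else { return }
    recorder.startRecording { error in
        if let error = error {
            print("Could not start recording: \(error)")
        }
    }
}

func stopDemoRecording(from viewController: UIViewController) {
    RPScreenRecorder.shared().stopRecording { previewController, _ in
        // The system preview controller lets the user save or share the video.
        if let preview = previewController {
            viewController.present(preview, animated: true)
        }
    }
}
```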

Graphical Audio player in iOS

Is there a library or built-in graphical player to represent a playing audio file in iOS?
I don't want a full-screen player, but a small inline player that can be embedded into a UIView.
Does this exist in iOS?
Apple has a good example of this: avTouch. I have successfully adapted parts of its code to display audio levels in the past.
avTouch
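A minimal sketch of the metering approach avTouch demonstrates, using AVAudioPlayer's built-in level metering in a small view you can embed in any UIView hierarchy (the class name and timing values are illustrative):

```swift
import AVFoundation
import UIKit

// Poll AVAudioPlayer's power levels on a timer and render them as a
// simple inline level bar, in the spirit of Apple's avTouch sample.
final class InlineLevelMeterView: UIView {
    private var player: AVAudioPlayer?
    private var timer: Timer?
    private var level: Float = 0 // 0...1, derived from decibels

    func play(fileAt url: URL) throws {
        let player = try AVAudioPlayer(contentsOf: url)
        player.isMeteringEnabled = true // must be set before reading meters
        player.play()
        self.player = player

        timer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] _ in
            guard let self = self, let player = self.player else { return }
            player.updateMeters()
            // averagePower(forChannel:) returns decibels, roughly -160...0;
            // map the top 60 dB onto 0...1 for drawing.
            let db = player.averagePower(forChannel: 0)
            self.level = max(0, (db + 60) / 60)
            self.setNeedsDisplay()
        }
    }

    override func draw(_ rect: CGRect) {
        UIColor.green.setFill()
        UIRectFill(CGRect(x: 0, y: 0,
                          width: rect.width * CGFloat(level),
                          height: rect.height))
    }
}
```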