I have been looking around but haven't found anything pertaining to PhoneGap. Is there a way to access the iPhone camera and capture the stream while keeping the webview on top? Some sample code would be sweet.
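Roughly what I'm hoping is possible on the native side (my own sketch, not working plugin code; `viewController` and `webView` are hypothetical stand-ins for whatever the plugin would expose):

```swift
import UIKit
import AVFoundation

// Sketch: put the camera preview behind the webview, then make the
// webview transparent so the page renders on top of the live camera.
func showCamera(behind webView: UIView, in viewController: UIViewController) -> AVCaptureSession {
    let session = AVCaptureSession()
    if let camera = AVCaptureDevice.default(for: .video),
       let input = try? AVCaptureDeviceInput(device: camera),
       session.canAddInput(input) {
        session.addInput(input)
    }

    let preview = AVCaptureVideoPreviewLayer(session: session)
    preview.frame = viewController.view.bounds
    preview.videoGravity = .resizeAspectFill
    viewController.view.layer.insertSublayer(preview, at: 0)

    webView.isOpaque = false          // let the camera show through
    webView.backgroundColor = .clear

    session.startRunning()
    return session
}
```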
Related
I am looking for a solution to share the screen from a mobile AR app (ARKit or Unity AR Foundation).
The screen needs to be shared to a browser on a desktop, and from that browser it should be possible to draw lines with the mouse that appear in the AR environment on the mobile app sharing the screen.
After some investigation, there does not seem to be a viable way to truly share the same AR instance between a browser and a mobile device, as you can between two mobile devices.
There should, however, be some sort of workaround possible, since Vuforia Chalk AR demonstrates that it can be done.
Here is a GIF showing how it works:
AR Drawing demo
Sharing the video stream itself seems to be possible.
I am specifically trying to figure out how the line is drawn in the browser and then displayed in the mobile AR app.
How can you achieve the same functionality with open-source alternatives, or with Unity and custom code (using Vuforia is not an option)?
Looking for a tutorial or some directions on how this can be implemented.
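My current best guess at the mechanism (an assumption on my part, not how Chalk is documented to work): stream the phone screen to the browser with WebRTC, send the mouse coordinates back over a data channel normalized to [0, 1], and unproject each incoming point onto scene geometry on the phone. A minimal ARKit sketch of that last step, where `place(normalizedPoint:in:)` is a hypothetical handler for points arriving from the browser:

```swift
import ARKit
import SceneKit

// Hypothetical handler: the browser sends mouse points normalized to [0, 1].
// Each point is raycast onto detected geometry and marked with a small
// sphere, so the "drawn" stroke appears anchored in the AR scene.
func place(normalizedPoint p: CGPoint, in sceneView: ARSCNView) {
    let screenPoint = CGPoint(x: p.x * sceneView.bounds.width,
                              y: p.y * sceneView.bounds.height)
    guard let query = sceneView.raycastQuery(from: screenPoint,
                                             allowing: .estimatedPlane,
                                             alignment: .any),
          let result = sceneView.session.raycast(query).first else { return }

    let marker = SCNNode(geometry: SCNSphere(radius: 0.003))
    marker.geometry?.firstMaterial?.diffuse.contents = UIColor.red
    marker.simdTransform = result.worldTransform
    sceneView.scene.rootNode.addChildNode(marker)
}
```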
I came across this answer about avoiding the fullscreen video player on iPhone. Can we use this module to create custom video player controls?
I am using jwplayer and have my own custom video controls (fast forward/backward, transcript, slow/fast playback, volume control, fullscreen). These controls get hidden on iPhone and the native iPhone video player is shown instead.
Is there any way to avoid this iPhone fullscreen behavior and still show my own custom controls?
Can this module be used with jwplayer to provide custom video controls?
I have not seen that particular library used with our player, so you're more than welcome to try it - but I cannot guarantee success, nor can we provide support for its use.
With that being said, iOS 10 is slated to bring inline video playback to Safari for iPhone, and the good news is that our player is already set up the way it needs to be to take advantage of this. Obviously, iOS 10 is in beta and not final yet, so things are subject to change, but this is good news for customers looking for inline video playback in Safari on iPhone.
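As an aside: if the player page is hosted inside a native app's web view rather than mobile Safari, inline playback is already opt-in there today. A sketch, assuming a WKWebView host (this is a general iOS API, not something specific to our player):

```swift
import WebKit

// Inline playback is opt-in for WKWebView; without this flag, iPhone
// forces the fullscreen native player regardless of the page's controls.
let config = WKWebViewConfiguration()
config.allowsInlineMediaPlayback = true

let webView = WKWebView(frame: .zero, configuration: config)
// The <video> element itself still needs the webkit-playsinline attribute.
```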
I have searched everywhere and tried mixing and matching different bits of code but I haven't found anything that works or anyone with the same question.
Basically I want to be able to create video demos of iPhone apps that include standard UIKit elements and also the image coming from the camera (the video preview layer). I don't want to use AirPlay or the iOS Simulator to project onto the desktop and then capture, because I want to be able to make videos outside in public. I have successfully been able to capture the screen with this code, but the video preview layer comes out blank. I read that this is because the preview layer is rendered with OpenGL, and what I'm capturing comes from the CPU, not the GPU.

I have successfully used GPUImage from Brad Larson to capture the video preview layer, but it doesn't capture the rest of the UIView. I have seen code that combines both and converts the result to an image, but I'm not sure if that would be too slow for realtime video capture. Can someone point me in the right direction?
It might not be the cleanest solution, but it will work nonetheless: did you consider jailbreaking? I hope Apple doesn't sue me for this one, but if you really want to record your screen, then simply install a screen recorder. Enough options can be found: http://www.google.be/search?q=iphone+jailbreak+record+screen
And if you don't like it: restore your phone from a previous backup.
(For the record: I'm against jailbreaking and am posting this purely from a productivity point of view.)
I tried using the following code from http://codethink.no-ip.org/wordpress/archives/673, putting it into the OpenTokHello sample app from OpenTok, and it does not actually record the video as I thought it would.
I made the ScreenCaptureView the new "superview" of everything and then made sure that the video streaming views were added to that view. But when I played back the video, the place where the streaming video should have been was blank.
Any ideas on what I'm doing wrong?
Full disclosure: I wrote some of the OpenTok iOS SDK and work for TokBox.
The implementation of this ScreenCaptureView might not work with our SDK because all of our video rendering is done outside the context of UIView. You'd have to grab the rendering layer of the view in order to recover that part of the screen.
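For completeness, the usual CoreGraphics layer snapshot looks like this (a generic UIKit sketch, not OpenTok SDK code; for our GL-rendered video views it can still come back empty, which is why the last option below captures from the subscriber's rendering output instead):

```swift
import UIKit

// Snapshot a single view's layer into an image via CoreGraphics.
// GPU-rendered sublayers (OpenGL video) may render as empty here.
func snapshot(of view: UIView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    return renderer.image { ctx in
        view.layer.render(in: ctx.cgContext)
    }
}
```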
Depending on why you're trying to record the conversation, I recommend one of the following:
Using screen capture in QuickTime and running your app in the Simulator (easier)
Waiting for OpenTok archiving support on iOS which will be available in a few months (also easy, but not for the impatient)
Capturing the rendering output of the subscriber from CoreGraphics (less easy)
I am building a custom camera app and would like the camera view to behave like the native camera app on iPhone (i.e., it records videos in a non-modal view and stays in the camera view after each video is taken; I find the Retake and Use screens unnecessary). Is there any possible way to do this? Thanks.
You can't do it for videos at present, but you can do it for still pictures in OS 3.1. If you search for "takePicture" and "cameraOverlayView" you should find helpful information; you can resize the preview window to any size you like.
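A minimal sketch of that stills-only approach (MyCameraControls and its shutter callback are hypothetical stand-ins for your own UI):

```swift
import UIKit

// Hide the default controls (including the Retake/Use screen) and trigger
// capture yourself, so the picker stays in the camera view between shots.
func makeCameraPicker() -> UIImagePickerController {
    let picker = UIImagePickerController()
    picker.sourceType = .camera
    picker.showsCameraControls = false

    let overlay = MyCameraControls(frame: picker.view.bounds) // hypothetical custom view
    overlay.onShutterTap = { [weak picker] in
        picker?.takePicture() // returns to the live camera view after capture
    }
    picker.cameraOverlayView = overlay
    return picker
}
```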