I have tried to capture a screen video using this sample code:
How to capture screen activity to a movie file using AV Foundation
It's working fine, but I am wondering how I can capture a specific window of an individual app (rather than a screen area specified by a CGRect).
I'm asking because Google Hangouts can share a specific window even if it isn't visible.
So, my questions are:
How can I modify the code above to achieve this?
Is it possible to capture several windows at the same time?
I'm not sure it is possible to capture video of a background window using AVFoundation, unless you make your own concrete subclass of AVCaptureInput. But taking a screenshot of a background window can be done with the CGWindowListCreateImage() function from the Core Graphics framework. Apple's SonOfGrab sample code may be helpful.
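For reference, here is a minimal sketch of imaging a single window with Core Graphics, assuming you have already looked up the target window's CGWindowID (for example with CGWindowListCopyWindowInfo, as the SonOfGrab sample does); the function name is hypothetical:

#import <CoreGraphics/CoreGraphics.h>

// Returns an image of just the given window, even if it is behind other
// windows (the window must still be on screen, i.e. not minimized).
CGImageRef CaptureWindow(CGWindowID windowID) {
    return CGWindowListCreateImage(CGRectNull,                         // use the window's own bounds
                                   kCGWindowListOptionIncludingWindow, // image only this window
                                   windowID,
                                   kCGWindowImageBoundsIgnoreFraming);
}

Calling this repeatedly on a timer and feeding the images to an AVAssetWriter is one way to approximate per-window video capture.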
I'm trying to use the vision.VideoPlayer in a custom GUIDE GUI. The video source is a camera. Right now I can get it to work with the camera, but the vision.VideoPlayer object pops up outside of my GUI. I've read the example given, but it seems to use a VideoReader object rather than the video player to read a video file and show the frames in a GUI.
Is there any way to embed the vision.VideoPlayer in my GUI using input from a camera?
I just uploaded this to the FEX:
http://www.mathworks.com/matlabcentral/fileexchange/53600-fancyflowplayer
It uses no high-level dependencies, only the VideoReader interface (which, by the way, vision.VideoPlayer also uses).
It is by no means a competitor to VLC (this IS Matlab we are talking about), but it is all open source and shows how you can access and process video in real time. It also has a working draggable seek bar and extra keyboard and mouse controls that are really handy, which vision.VideoPlayer does not have.
cheers,
Stefan
Unfortunately there is no way to use vision.VideoPlayer in a custom GUI directly. However, here is an example of how to play a video inside a custom GUI without using vision.VideoPlayer.
How to do screen recording in Unity?
I want to record my screen (gameplay) while my game is running.
I need to be able to play/stop and replay the recording, save it locally on the device, and open/load previously recorded videos from the device.
In my game there is one camera, which shows the native device camera feed, and one 3D model.
I want to record both of them and use this functionality whenever I want.
Thank you in advance.
This is hard to implement, but not impossible. Every frame (or at some interval) you need to capture a screenshot of your camera view and store it in a list. You need a good interval value: a smaller interval gives a smoother replay but needs more memory, while a large interval makes the replay look laggy.
While you play the game your RAM fills up and the OS will terminate the app, so you need to pay close attention to memory optimization. Another option is an asset from the Unity Asset Store.
EZ Replay Manager can be used. (Keep in mind: I haven't tried it yet.)
Free
Pro
Check out this open-source project: https://github.com/getsocial-im/getsocial-capture.
By default our project records Main Camera's rendered content. C# examples are in the repo.
You can record in 2 modes:
Continuous mode - capture last X frames.
Manual mode - capture frames on your own when needed. For example, record a timelapse of the level.
Once the recording is done, you can generate GIF, get raw bytes and do whatever you want. E.g. let your users share that GIF with friends.
Here's a recording of a game session from the test app; the recorded GIF shows up at the end:
Disclaimer: I worked at GetSocial at the time of writing.
Well, I know someone who posted a similar project on GitHub: https://github.com/thanh-nguyen-kim/Unity_Android_Screen_Recorder
But there is a limitation: this code only works on Android devices (Android only, not iOS).
Still, it is a very powerful recorder that captures whatever appears on screen (so it is basically a screen recorder made with Unity), and it also captures your microphone output. Give it a try.
And if you find any other solution, please tell me as well; it would be very helpful for me, because I want to record video with in-game audio and also save it to the gallery.
Unity now has a screen recording tool built in. It's called Recorder and doesn't require any coding.
In Unity, go to the Window menu, then click on Package Manager
By default, Packages might be set to "In Project". Select "Unity Registry" instead
Type "Recorder" in the search box
Select the Recorder and click Install in the lower right corner of the window
That's about all you need to get everything set up, and hopefully the options make sense. The main thing to be aware of is that setting "Recording Mode" to "Single" will take a single screenshot (with F10).
NOTE: This is a copy of my answer from a Unity screenshots question
Actually, I need a region over the camera overlay, like most QR code scanner apps have.
And when a square box comes within it, the camera should focus on it and take a picture of it. Any idea how to implement this? I was using the UIImagePickerController class, but after some googling I found that I need to use the AVFoundation framework. Unfortunately, I'm nowhere near a solution.
Any code or tutorial would be helpful. Please let me know how I can implement this.
One more thing: if I need to take a picture, can I limit the picture to just the region size?
Yes, you are correct. You will need to use AV Foundation to implement this. Have a look at the 'Using the Camera with AV Foundation' video from the WWDC 2010 session videos, to get an overview of the framework.
AVFoundation has no dependencies on UIKit, so you will get some nice performance increases over UIImagePickerController. It will also give you full access to the camera.
When using AV Foundation you are in control of the device capture settings, i.e. flash, focus mode, and exposure, including their points of interest. Have a look at the programming guide to see how to use these, or the device behaviour may differ from what you expect.
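As a rough illustration of the point-of-interest settings (assuming `device` is your active AVCaptureDevice, e.g. from [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]):

NSError *error = nil;
CGPoint interestPoint = CGPointMake(0.5, 0.5); // normalized coordinates within the frame

if ([device lockForConfiguration:&error]) {
    if ([device isFocusPointOfInterestSupported]) {
        device.focusPointOfInterest = interestPoint;
        device.focusMode = AVCaptureFocusModeAutoFocus;
    }
    if ([device isExposurePointOfInterestSupported]) {
        device.exposurePointOfInterest = interestPoint;
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    }
    [device unlockForConfiguration];
}

In a scanner-style app you would pass in the center of your overlay region instead of (0.5, 0.5).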
You can also download an example of an application that uses AV Foundation to implement the camera here.
Once you're up and running with that, have a look at this tutorial to get started with the overlay on the camera.
One more thing: if I need to take a picture, can I limit the picture to just the region size?
Yes, you will be able to implement this. You can also configure the AVFoundation session itself to output the lowest practical resolution.
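A sketch of both ideas, assuming the overlay region is expressed in normalized (0..1) image coordinates; the helper name is hypothetical, and the crop assumes an image scale of 1:

#import <UIKit/UIKit.h>

// Somewhere in your capture setup, keep the session at a modest resolution:
//     session.sessionPreset = AVCaptureSessionPresetMedium;

// After capturing a still, crop it down to just the overlay region.
static UIImage *CropImageToRegion(UIImage *fullImage, CGRect normalizedRegion) {
    CGRect cropRect = CGRectMake(normalizedRegion.origin.x * fullImage.size.width,
                                 normalizedRegion.origin.y * fullImage.size.height,
                                 normalizedRegion.size.width  * fullImage.size.width,
                                 normalizedRegion.size.height * fullImage.size.height);
    CGImageRef cropped = CGImageCreateWithImageInRect(fullImage.CGImage, cropRect);
    UIImage *result = [UIImage imageWithCGImage:cropped];
    CGImageRelease(cropped);
    return result;
}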
I'm creating an iPhone app with video clips (among other things) where I need to play one or more ads before the actual video clip. Naturally, my client doesn't want users to be able to fast-forward during the ads, but at the same time they must be able to exit the view, so I can't just set controlStyle to MPMovieControlStyleNone.
I would prefer not to hack the default view and remove the scrub bar, so it seems my only option is to implement a custom bar with a single "Done"/"Back" button.
I've googled my eyes out trying to find example code for this, but no dice. I've seen similar questions posted here as well, but no answers are given that could help me out. I'm a novice iOS developer and could really use some help with this (a custom control bar with a single "Done" button for a fullscreen MPMoviePlayerController that shows/hides on tap, OR disabling the scrub bar in an acceptable fashion, in fullscreen mode).
Anyone done this before or know where to find some example code? Thanks!
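For reference, a minimal sketch of the custom-bar idea described above would be to hide the built-in controls entirely and provide the exit through an overlaid button that toggles on tap (dismissPlayer: and toggleControls: are hypothetical handlers you would implement yourself):

self.moviePlayer.controlStyle = MPMovieControlStyleNone;

// Single "Done" button overlaid on the player view.
UIButton *doneButton = [UIButton buttonWithType:UIButtonTypeRoundedRect];
[doneButton setTitle:@"Done" forState:UIControlStateNormal];
doneButton.frame = CGRectMake(10.0, 10.0, 70.0, 32.0);
[doneButton addTarget:self
               action:@selector(dismissPlayer:)
     forControlEvents:UIControlEventTouchUpInside];
[self.moviePlayer.view addSubview:doneButton];

// Show/hide the button when the video is tapped.
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(toggleControls:)];
[self.moviePlayer.view addGestureRecognizer:tap];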
I would like to create videos that need to run in a native iPad app. The app needs to show a demonstration of a product on the iPad, and it needs to be interactive as well. I know we could do this in Flash, but since Flash is not supported on the iPad, what are my options?
I appreciate any guidelines or hints. Thank you in advance.
The easiest way to create interactive videos for iOS is to use Apple's HTTP Live Streaming technology. You have to create a video, embed metadata, play it using MPMoviePlayerController or AVPlayerItem, and then display clickable areas in response to metadata notifications.
The metadata should contain coordinates for the element you are tracking, e.g. a dress, and an identifier for the product. You overlay this info with a clickable subview that reveals more information about the product. There are several applications of this kind on iTunes; here is one.
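A rough sketch of reacting to timed metadata with MPMoviePlayerController (the notification and MPTimedMetadata class are part of MediaPlayer; the moviePlayer property and the overlay logic are hypothetical):

// In your view controller:
- (void)viewDidLoad {
    [super viewDidLoad];
    [[NSNotificationCenter defaultCenter]
        addObserver:self
           selector:@selector(metadataUpdated:)
               name:MPMoviePlayerTimedMetadataUpdatedNotification
             object:self.moviePlayer];
}

// Called whenever new timed metadata arrives from the HLS stream.
- (void)metadataUpdated:(NSNotification *)notification {
    for (MPTimedMetadata *metadata in self.moviePlayer.timedMetadata) {
        // e.g. metadata.value could carry the product identifier and the
        // normalized coordinates at which to place the clickable subview.
        NSLog(@"key: %@  value: %@", metadata.key, metadata.value);
    }
}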
Once you have a working product and weeks' worth of video, the most difficult part is performing the motion tracking with as little human interaction as possible. One approach is to use Adobe After Effects; another is to code your own solution based on OpenCV.
Some ideas:
You could use an MPMoviePlayerController with no controls, on loop.
Here's a solution I thought of, using interactive HTML popovers over the video:
You could have a data store (say an NSDictionary), with playback times as keys.
The values could then be a custom class, which includes all the necessary data for an interactive popover on the video.
Your custom class could look something like this
@interface InteractivePopover : NSObject

@property (nonatomic, copy) NSString *snippetTitle;
@property (nonatomic, copy) NSString *htmlData; // could include links etc.
@property (nonatomic, assign) CGPoint popoverDisplayPoint;
// other styling attributes etc.

@end
Now, when a user taps the video, it pauses at the next 'interactive' point (found by looking for the next key after the current playback time) and displays (animates in) all the popups, which you set up beforehand to show off different parts of the product; see the sketch below.
That class would be your data store; you would then create another class to handle displaying, animating, controlling, and sizing these interactive popovers. It would create a UIWebView for the HTML, and it would also control the direction of the popover and indicate its source point.
This is clearly very expandable, because you could put images, embedded content, etc. into the HTML for these interactive popovers.
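A rough sketch of that tap handling (all names here are hypothetical; self.popoversByTime is the NSDictionary keyed by playback times stored as NSNumbers):

- (void)videoTapped
{
    NSTimeInterval now = self.moviePlayer.currentPlaybackTime;

    // Find the next interactive point at or after the current playback time.
    NSArray *times = [[self.popoversByTime allKeys] sortedArrayUsingSelector:@selector(compare:)];
    for (NSNumber *time in times) {
        if ([time doubleValue] >= now) {
            InteractivePopover *popover = [self.popoversByTime objectForKey:time];
            self.moviePlayer.currentPlaybackTime = [time doubleValue];
            [self.moviePlayer pause];
            [self presentPopover:popover]; // builds a UIWebView from popover.htmlData
            break;
        }
    }
}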
Anyway, that's how I would do it.
Though Flash will not run on the iPad, you can still create apps for it with Flash CS5.