I have a website where users can upload 360° images and videos. I play those images and videos using vrview, but vrview doesn't seem that good, as its video player lacks enough control buttons (as far as I know).
I want to use krpano instead, but I don't know how to use it for dynamically added content.
I searched but couldn't find proper information on that.
I am posting this so that anyone who comes here searching for a solution can get some help:
Actually, the solution comes from how krpano plays video.
To play a video in krpano, we need to supply an XML file that contains the necessary information about the video we are going to play.
So for every video we want to play, we should have an XML file for that video, i.e. an XML file that specifies the video source and the thumbnail source.
We can create the XML file for each video while it is being uploaded.
I built the XML content using string concatenation and wrote it out to an XML file.
A sample XML file is included in the sample code inside the example folder.
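For reference, a rough sketch of what one of those generated per-video XML files might look like. The attribute names follow the videoplayer example that ships with krpano and should be checked against that template; VIDEO_URL and THUMB_URL stand for the values concatenated in at upload time:

    <krpano>
        <!-- videoplayer plugin: videourl / posterurl are filled in per uploaded file -->
        <plugin name="video"
                url="plugins/videoplayer.swf"
                alturl="plugins/videoplayer.js"
                videourl="VIDEO_URL"
                posterurl="THUMB_URL"
                loop="true" />
        <!-- use the video plugin as the spherical pano source -->
        <image>
            <sphere url="plugin:video" />
        </image>
    </krpano>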
[UIVideoEditorController canEditVideoAtPath:videoPath] always returns NO on assets from the library, i.e.: assets-library://asset/asset.mov?id=0399CB6D-D3D9-4F4C-82B9-AC93CCE2FB16&ext=mov
I see this error in the console:
<Warning>: Video assets-library://asset/asset.mov?id=0399CB6D-D3D9-4F4C-82B9-AC93CCE2FB16&ext=mov cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain Code=2 "This movie could not be played." UserInfo=0x6f7b90 {NSLocalizedDescription=This movie could not be played.}
Help?
Apparently there is no way to edit library assets directly with UIVideoEditorController.
I ended up copying the file to the app sandbox using AVAssetWriter, and then I was able to use UIVideoEditorController.
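For anyone else hitting this, a minimal sketch of the second step once the copy is in place (the sandbox path and file name below are just assumed examples):

    // Once the movie has been copied into the app sandbox, UIVideoEditorController accepts it.
    NSString *sandboxPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"copy.mov"];
    if ([UIVideoEditorController canEditVideoAtPath:sandboxPath]) {
        UIVideoEditorController *editor = [[UIVideoEditorController alloc] init];
        editor.videoPath = sandboxPath;
        editor.delegate = self; // UIVideoEditorControllerDelegate + UINavigationControllerDelegate
        [self presentViewController:editor animated:YES completion:nil];
    }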
I am able to write video tracks successfully, and I am able to save the resulting file to the gallery as well. The video plays back with all the quality of the original file, but I am not able to copy the audio buffers, so the video plays without sound.
If I try to write the audio track first and then the video track, writing fails for the video buffers. That is, BOOL appended = [assetWriterInput appendSampleBuffer:buffer]; fails by returning NO.
Buffers get appended while writing the audio track. Then I cancel the reader for the audio track and start reading and writing the video track. Appending the video buffers fails. For both writes I am setting startSessionAtSourceTime:kCMTimeZero.
If I write only the video buffers, the video plays without audio.
I want to be able to copy that .MOV file from the gallery with both its audio and video tracks intact.
Purpose: In the end I want to edit the copied file using UIVideoEditorController. We cannot edit a file inside the gallery directly, hence I am first copying it into the sandbox and then editing it.
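For context, this is roughly the kind of reader/writer setup described above, with both tracks attached to a single writer session. It is a sketch of the general structure (sourceURL and destinationURL are assumed variable names), not the actual code in question:

    // Requires AVFoundation: #import <AVFoundation/AVFoundation.h>
    // Read both tracks from the source asset and pass them through to one writer.
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:nil];

    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

    // nil output/input settings = pass-through, i.e. the samples are not re-encoded
    AVAssetReaderTrackOutput *videoOutput =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];
    AVAssetReaderTrackOutput *audioOutput =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
    [reader addOutput:videoOutput];
    [reader addOutput:audioOutput];

    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:destinationURL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:nil];
    AVAssetWriterInput *videoInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:nil];
    AVAssetWriterInput *audioInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil];
    [writer addInput:videoInput];
    [writer addInput:audioInput];

    [reader startReading];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero]; // one session covering both tracks

    // ...then pull sample buffers from videoOutput/audioOutput and append them to the
    // matching writer input, e.g. via requestMediaDataWhenReadyOnQueue:usingBlock:.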
I am using embedded HTML to play video on iOS 5 and it works perfectly, but if the video doesn't exist it shows a blank screen. How can I find out whether the video exists or not? Even in this case webView:didFailLoadWithError: does not get called. Any help will be appreciated. Thanks
You could check for the existence of the video using JavaScript (see How can I check existence of a file with JavaScript?) and then (again using JS), if the video exists, update the reference in the <video> element.
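If you'd rather do the check natively before loading the embed HTML at all (an alternative to the JavaScript route, with an assumed video URL), something along these lines:

    // Issue a HEAD request for the video before building the embedded HTML.
    // Run this off the main thread in real code, since the synchronous call blocks.
    NSURL *videoURL = [NSURL URLWithString:@"http://example.com/video.mp4"]; // assumed URL
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:videoURL];
    [request setHTTPMethod:@"HEAD"];

    NSURLResponse *response = nil;
    [NSURLConnection sendSynchronousRequest:request returningResponse:&response error:NULL];
    NSInteger status = [(NSHTTPURLResponse *)response statusCode];

    if (status == 200) {
        // video exists: load the embed HTML into the web view
    } else {
        // video missing: show a custom message instead of a blank screen
    }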
When a video is made with the Sorenson codec, MPMoviePlayerController just plays the audio (and not the video). Instead, I want to show my own custom error message at that point. How can I programmatically detect which codec a particular file uses?
EDIT: I am not using QuickTime in my code, so that solution won't work.
Thanks
Check this documentation to understand the QuickTime file format:
http://developer.apple.com/library/mac/documentation/QuickTime/QTFF/qtff.pdf
The field you are looking for is the "vfmt" code, which contains the video fourcc code (there is one for each video track in your file, so take care if your file contains several video tracks). The fourcc codes for the Sorenson codec are "SVQ1" and "SVQ3".
Now you'll have to write some code to parse the QT file, find the correct atom, extract the "vfmt" value and compare it to SVQ1/SVQ3!
Apple provides some classes to easily parse QuickTime files, but they are only available on Mac OS, not on iOS!
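As a very rough starting point for the "parse the file" part, here is a sketch that only walks the atom tree and prints what it finds; where exactly the codec code lives is left to the spec linked above:

    // Each QuickTime atom is a 4-byte big-endian size followed by a 4-byte type code.
    static void WalkAtoms(NSData *data, NSUInteger start, NSUInteger end, int depth)
    {
        NSUInteger offset = start;
        while (offset + 8 <= end) {
            uint32_t rawSize = 0, rawType = 0;
            [data getBytes:&rawSize range:NSMakeRange(offset, 4)];
            [data getBytes:&rawType range:NSMakeRange(offset + 4, 4)];
            uint32_t size = CFSwapInt32BigToHost(rawSize);
            uint32_t type = CFSwapInt32BigToHost(rawType);
            if (size < 8 || offset + size > end) break; // 64-bit extended sizes not handled here

            char fourcc[5] = { (char)(type >> 24), (char)(type >> 16),
                               (char)(type >> 8),  (char)type, 0 };
            NSLog(@"%*s%s (%u bytes)", depth * 2, "", fourcc, size);

            // Containers like 'moov', 'trak', 'mdia', 'minf', 'stbl' hold child atoms.
            if (type == 'moov' || type == 'trak' || type == 'mdia' ||
                type == 'minf' || type == 'stbl') {
                WalkAtoms(data, offset + 8, offset + size, depth + 1);
            }
            offset += size;
        }
    }

    // Usage (loads the whole file; use mapped reading for large movies):
    // NSData *data = [NSData dataWithContentsOfFile:moviePath];
    // WalkAtoms(data, 0, [data length], 0);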
I am developing an app that uploads a video, picked from the local iPhone photo library, to a server.
Here I want to restrict the user from uploading more than 3 minutes of video. The question is how to get the duration of a video picked from the local photo library using UIImagePickerController.
Does it have a property for that?
Thanks
I am not sure, but did you try imagePickerController.videoMaximumDuration = urDuration;?
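For what it's worth, a small sketch of that approach, with the 3-minute limit assumed as 180 seconds:

    // #import <MobileCoreServices/MobileCoreServices.h> for kUTTypeMovie
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    picker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    picker.allowsEditing = YES;               // the picker offers trimming to the limit below
    picker.videoMaximumDuration = 180.0;      // 3 minutes
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];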
You can get the size of your video file, and if you want to add a check based on the size, that is possible. You can also get the duration by using the MPMoviePlayer class, but for that you have to have the full video file; I don't think you want to do that, so it's better to check based on the size.
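And a rough sketch of the duration route via MPMoviePlayerController mentioned above, done in the picker delegate (180 seconds assumed as the limit):

    // #import <MediaPlayer/MediaPlayer.h>
    - (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary *)info
    {
        NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
        MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
        player.shouldAutoplay = NO;
        // duration may still be 0 until MPMovieDurationAvailableNotification fires
        NSTimeInterval duration = player.duration;
        if (duration > 180.0) {
            // longer than 3 minutes: reject the upload
        }
        [self dismissViewControllerAnimated:YES completion:nil];
    }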
I was able to trim the video when I used the photo library source type. But when I use the camera source type, the trimmed video appears to be saved, yet when I play the saved video it is not trimmed. Can anyone please help me with this?
It looks like you can't use UIImagePickerController and also get the edited video. You have to use UIVideoEditorController.
http://developer.apple.com/library/ios/#documentation/uikit/reference/UIVideoEditorController_ClassReference/Reference/Reference.html#//apple_ref/occ/cl/UIVideoEditorController
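The trimmed result comes back through the editor's delegate at a new path; a minimal sketch of that callback (saving to the photos album here is just one option):

    - (void)videoEditorController:(UIVideoEditorController *)editor
         didSaveEditedVideoToPath:(NSString *)editedVideoPath
    {
        // editedVideoPath points at the trimmed copy, not the original recording
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(editedVideoPath)) {
            UISaveVideoAtPathToSavedPhotosAlbum(editedVideoPath, nil, NULL, NULL);
        }
        [self dismissViewControllerAnimated:YES completion:nil];
    }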