iOS get video frame rate - iPhone

I want to get the frame rate for a specific video. I tried looking at APIs in AVFoundation and the Assets Library framework, such as AVURLAsset and AVAssetReader, but none of them was really helpful. Does anybody know a method in Apple's frameworks/libraries to get the frame rate?

The easiest option is to obtain an AVAssetTrack and read its nominalFrameRate property; that should give you what you need.
From the AVAssetTrack class reference:
nominalFrameRate is the frame rate of the track, in frames per second. (read-only)
@property(nonatomic, readonly) float nominalFrameRate
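For example, a minimal sketch of reading it from a local file with AVURLAsset (the file path is a placeholder):
#import <AVFoundation/AVFoundation.h>

NSURL *videoURL = [NSURL fileURLWithPath:@"/path/to/video.mp4"]; // placeholder path
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
if ([videoTracks count] > 0) {
    AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0];
    NSLog(@"Frame rate: %.2f fps", videoTrack.nominalFrameRate);
}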

For iOS 7+ you can use the currentVideoFrameRate property of AVPlayerItemTrack. It's the only consistent property that I've seen measure FPS. The nominalFrameRate property seems to be broken in HLS streams.
AVPlayerItem *item = player.currentItem; // player: your AVPlayer instance
float fps = 0.0f;
for (AVPlayerItemTrack *track in item.tracks) {
    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeVideo]) {
        fps = track.currentVideoFrameRate;
    }
}

For the currentVideoFrameRate property, Apple's documentation says:
If the item is not playing, or if the media type of the track is not
video, the value of this property is 0.0.
It requires the item to be playing, so I used nominalFrameRate instead.
It still works in iOS 12, Swift 4:
let tracks = asset.tracks(withMediaType: .video)
let fps = tracks.first?.nominalFrameRate
Remember to handle the nil case.


Multiple time allocation of GPUImageFilter crashes the application

I am working on a picture/video filter effect project. For the effects I am using the GPUImage project. It works fine for pictures, and now I need to apply the same effect to videos too. I grab images from the video at 30 frames per second, so for a 1-minute video I filter about 1800 images, and for each image I allocate GPUImagePicture and GPUImageSepiaFilter instances and release them manually. But these allocations are not released, and after processing about 20 seconds of video the application crashes with a memory warning.
Is it possible to allocate the GPUImagePicture and filter classes once and use them for all images? If yes, how?
Please tell me; it would be very helpful.
Here is my code, showing what I am actually doing.
The getImage method is called by an NSTimer 30 times per second; it loads the image from the app's Documents directory and sends it to the - (void)imageWithEffect:(UIImage *)image method for filtering.
- (void)getImage
{
    float currentPlayBacktime = slider.value;
    int frames = currentPlayBacktime * 30;
    NSString *str = [NSString stringWithFormat:@"image%i.jpg", frames + 1];
    NSString *fileName = [self.imgFolderPath stringByAppendingString:str];
    if ([fileManager fileExistsAtPath:fileName])
        [self imageWithEffect:[UIImage imageWithContentsOfFile:fileName]];
}
- (void)imageWithEffect:(UIImage *)image
{
    // A new picture and filter are allocated, used, and released for every frame
    GPUImagePicture *gpuPicture = [[GPUImagePicture alloc] initWithImage:image];
    GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init];
    [gpuPicture addTarget:filter];
    [gpuPicture processImage];
    playerImgView.image = [filter imageFromCurrentlyProcessedOutputWithOrientation:0];
    [gpuPicture removeAllTargets];
    [filter release];
    [gpuPicture release];
}
Why are you processing a movie as a series of still UIImages? Why are you allocating a new filter for each image? That's going to be incredibly slow.
Instead, if you need to process a movie file, use a GPUImageMovie input, and if you need access to the camera, use a GPUImageVideoCamera input. Both of these are tuned to provide fast video feeds through the filter pipeline. There's a lot of wasted processing cycles in converting still frames to and from UIImages.
Also, only set up a filter once, and reuse it as necessary. There's significant overhead in the creation of a GPUImageFilter (setting up an FBO, etc.), so you only want to create it once and then attach inputs and outputs as they change.
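For reference, a minimal sketch of that pipeline, assuming a movie NSURL named movieURL and a GPUImageView named filterView already in your view hierarchy (both names are placeholders):
// Keep strong references (e.g. instance variables) so the pipeline isn't deallocated mid-processing.
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:movieURL]; // movieURL: placeholder NSURL
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];   // created once, reused for every frame
[movieFile addTarget:sepiaFilter];
[sepiaFilter addTarget:filterView]; // filterView: a GPUImageView for on-screen display
[movieFile startProcessing];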

Using AVQueuePlayer to get current track information

I am loading my iPod library into an AVQueuePlayer and playing it using this:
[[AVQueuePlayer alloc] initWithItems:[MPMediaCollectionInstance items]]; // just one line
But how do I read which MPMediaItem is currently playing? I want information like the artist, song name, etc.
Thanks.
Keep the instance of the AVQueuePlayer that you allocated:
AVQueuePlayer *_queuePlayer = [[AVQueuePlayer alloc] initWithItems:[MPMediaCollectionInstance items]];
With that instance, you can get the AVPlayerItem.
AVPlayerItem *currentItem = _queuePlayer.currentItem;
For the above line, please check the doc reference.
And now try the following code
NSArray *metadataList = [currentItem.asset commonMetadata];
for (AVMetadataItem *metaItem in metadataList) {
    NSLog(@"%@", [metaItem commonKey]);
}
Which will give a list as follows:
title
creationDate
artwork
albumName
artist
Now you can get the value for the corresponding keys. For this, refer to the AVMetadataItem documentation too.
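For example, a sketch of pulling out a single value with AVMetadataItem's filtering helper (the title key is just an illustration):
NSArray *titles = [AVMetadataItem metadataItemsFromArray:metadataList
                                                 withKey:AVMetadataCommonKeyTitle
                                                keySpace:AVMetadataKeySpaceCommon];
if ([titles count] > 0) {
    AVMetadataItem *titleItem = [titles objectAtIndex:0];
    NSLog(@"Title: %@", [titleItem stringValue]);
}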

Error getting artwork for current song

Grabbing album art for current song and using it to change a certain imageView.image generates an error, but no longer crashes. (It did before because I left out the if (!artwork) error handling. Eheh.)
This method:
- (void)handleNowPlayingItemChanged:(id)notification {
    MPMediaItem *item = self.musicPlayer.nowPlayingItem;
    CGSize albumCoverSize = self.albumCover.bounds.size;
    MPMediaItemArtwork *artwork =
        [item valueForProperty:MPMediaItemPropertyArtwork];
    if (artwork) {
        self.albumCover.image = [artwork imageWithSize:albumCoverSize];
    } else {
        self.albumCover.image = nil;
    }
}
Explodes like this:
CPSqliteStatementPerform: attempt to write a readonly database for
UPDATE ddd.ext_container SET orig_date_modified = (SELECT date_modified
FROM container WHERE pid=container_pid) WHERE orig_date_modified=0
CPSqliteStatementReset: attempt to write a readonly database for
UPDATE ddd.ext_container SET orig_date_modified = (SELECT date_modified
FROM container WHERE pid=container_pid) WHERE orig_date_modified=0
But only on launch. And it still shows the image (or lack thereof). Weird.
Edit: The iPod library is read-only (apps can't change anything, only iTunes can), so maybe it's yelling at me for writing to something read-only, but still allowing it because nothing read-only is actually being modified?
And after that's fixed, I need to get resizing working (for landscape support) instead of IB's stretching.
Not vital, but still a nice thing to have.
Here's what I do. It creates no errors, and produces an image every time. If the song doesn't have an image, it defaults to the one I provide. I think because you're not checking for an image with a specific size (320 by 320, matching the screen width for me), it fails to figure it out correctly. I don't know why you're getting the SQLite error, but hopefully this fixes it!
MPMediaItemArtwork *artworkItem = [self.musicPlayer.nowPlayingItem valueForProperty:MPMediaItemPropertyArtwork];
if ([artworkItem imageWithSize:CGSizeMake(320, 320)]) {
    [self.currentlyPlayingArtworkView setImage:[artworkItem imageWithSize:CGSizeMake(320, 320)]];
} else {
    [self.currentlyPlayingArtworkView setImage:[UIImage imageNamed:@"NoArtworkImage"]];
}
Link here - Why am I getting this CPSqliteStatementPerform error in Xcode console
Putting this here so the question can be marked as Answered.

Playing from iPod music library

I want to play a particular selected song from my iPod music library. How can I do that?
My idea is to save the title name from an MPMediaQuery and then play the song later when the app starts.
Does anyone have an idea how to do this?
Thank you.
The basic solution is to save the unique identifier each song in the library has, aka MPMediaItemPropertyPersistentID. You can use this ID to play the song, and you can save the ID to memory in order to remember the song the user selected between launches. If you don't know how the Media Player Framework works, look at the AddMusic sample code.
Your view controller must implement the MPMediaPickerControllerDelegate protocol. Assuming you're just allowing the user to select a single song, the basic outline of your callback will look something like this:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
    NSArray *items = [mediaItemCollection items];
    if ([items count] == 1)
    {
        MPMediaItem *song = (MPMediaItem *)[items objectAtIndex:0];
        NSNumber *persistentId = [song valueForProperty:MPMediaItemPropertyPersistentID];
        // ...Save/Play here...
    }
}
At this point you can use the persistent ID to play the song, and/or save it to user defaults.
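As an illustration, a sketch of playing that song back later via MPMusicPlayerController (savedId is assumed to be the NSNumber you stored):
// Build a query that matches only the saved persistent ID
MPMediaPropertyPredicate *predicate =
    [MPMediaPropertyPredicate predicateWithValue:savedId
                                     forProperty:MPMediaItemPropertyPersistentID];
MPMediaQuery *query = [[MPMediaQuery alloc] init];
[query addFilterPredicate:predicate];
// Hand the query to the music player and start playback
MPMusicPlayerController *player = [MPMusicPlayerController applicationMusicPlayer];
[player setQueueWithQuery:query];
[player play];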

Cocoa - Add Video Watermark General Info

Just looking for how to programmatically add a watermark or some sort of overlay to video using Cocoa. Not looking for a step-by-step (although that would be awesome), but more or less looking for where I should start to learn how. Are there frameworks developed for this? I would like something native to Cocoa, Objective-C, or C, because I would like to eventually give this a go on the iPhone. Any help would be great.
I'm not sure if you mean just for playback, or if you'd like to export a video with a watermark that'll show up on other players.
If you mean just for playback, you can probably just add a view on top of the player view on Mac and iPhone that contains the watermark.
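On the iPhone, that could be as simple as this sketch (assuming an MPMoviePlayerController named player and a bundled watermark image; both names are placeholders):
UIImageView *watermark = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"watermark.png"]]; // placeholder asset
watermark.frame = CGRectMake(10, 10, 80, 30); // position the overlay wherever you like
[player.view addSubview:watermark];           // player: your MPMoviePlayerController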
If you'd like a watermark on the video itself, this is hard on the Mac and probably impossible on the iPhone without essentially rewriting QuickTime.
On the Mac, the code might look like this (you need to import QTKit):
// Make a new movie so we don't destroy the existing one
QTMovie *movie = [[QTMovie alloc] initWithMovie:currentMovie
                                      timeRange:QTMakeTimeRange(QTMakeTime(0, 1000), [currentMovie duration])
                                          error:nil];
// Make it editable
[movie setAttribute:[NSNumber numberWithBool:YES]
             forKey:QTMovieEditableAttribute];
// Get the size
NSValue *value = [movie attributeForKey:QTMovieNaturalSizeAttribute];
NSSize size = [value sizeValue];
// Add a new track to the movie and make it the frontmost layer
QTTrack *track = [movie addVideoTrackWithSize:size];
[track setAttribute:[NSNumber numberWithShort:-1] forKey:QTTrackLayerAttribute];
// Create a codec dictionary for the image we're about to add
NSDictionary *imageDict = [NSDictionary dictionaryWithObjectsAndKeys:
                           @"tiff", QTAddImageCodecType,
                           [NSNumber numberWithLong:codecHighQuality], QTAddImageCodecQuality, nil];
// Get the video length in QT speak
QTTime qttime = [currentMovie duration];
NSTimeInterval reftime;
QTGetTimeInterval(qttime, &reftime);
// Add the image (your watermark NSImage) for the entire duration of the video
[track addImage:image forDuration:qttime withAttributes:imageDict];
// Finally, tell the track that it should use its alpha correctly
MediaHandler media = GetMediaHandler([[track media] quickTimeMedia]);
MediaSetGraphicsMode(media, graphicsModeStraightAlpha, NULL);
... And that's it! Your movie now has a watermark, and you can export it to file.
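For the export step, a minimal sketch using QTKit's flatten-on-write attribute (the output path is a placeholder):
// Write the movie out as a self-contained (flattened) file
NSDictionary *writeAttrs = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                                       forKey:QTMovieFlatten];
[movie writeToFile:@"/path/to/watermarked.mov" withAttributes:writeAttrs]; // placeholder path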