Is there a way to stream audio file from Google Drive with AVPlayer?
I have tried both file.downloadUrl and file.webContentLink, and neither works.
Code:
GTLDriveFile *file = [self.data objectAtIndex:indexPath.row];
if (player)
{
    [player removeObserver:self forKeyPath:@"status"];
    [player pause];
}
player = [AVPlayer playerWithURL:[NSURL URLWithString:file.downloadUrl]];
// or
// player = [AVPlayer playerWithURL:[NSURL URLWithString:file.webContentLink]];
[player addObserver:self forKeyPath:@"status" options:0 context:nil];
if (delegate && [delegate respondsToSelector:@selector(audioPlayerDidStartBuffering)])
    [delegate audioPlayerDidStartBuffering];
If streaming is not possible, is it possible to start the download into the /tmp folder and play the file while it downloads?
I was able to solve this just by appending the access_token to the download URL:
audiofile.strPath = [NSString stringWithFormat:@"%@&access_token=%@", downloadUrl, accessToken];
Then pass strPath to your AVPlayer object.
You can fetch the access token from the GTMOAuth2Authentication object. Note that you may need to refresh it if it expires.
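For example, a minimal sketch (hedged: `self.auth` stands in for wherever you keep your GTMOAuth2Authentication instance, and appending with `&` assumes downloadUrl already contains a query string):
// Sketch: authorize AVPlayer's plain GET request by appending the OAuth
// access token. `self.auth` is assumed to be your GTMOAuth2Authentication.
NSString *accessToken = self.auth.accessToken;
NSString *authorizedURL =
    [NSString stringWithFormat:@"%@&access_token=%@", file.downloadUrl, accessToken];
player = [AVPlayer playerWithURL:[NSURL URLWithString:authorizedURL]];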
Hope this helps you.
Regards
Nitesh
That is simply because you didn't provide your client's access token in the header of the download request. The downloadUrl you get is not a public link; you must provide the same authorization you use for all other Drive API requests.
For example, Objective-C code for downloading content from downloadUrl would look like this:
+ (void)downloadFileContentWithService:(GTLServiceDrive *)service
                                  file:(GTLDriveFile *)file
                       completionBlock:(void (^)(NSData *, NSError *))completionBlock {
    if (file.downloadUrl != nil) {
        // More information about GTMHTTPFetcher can be found at
        // http://code.google.com/p/gtm-http-fetcher
        GTMHTTPFetcher *fetcher =
            [service.fetcherService fetcherWithURLString:file.downloadUrl];
        [fetcher beginFetchWithCompletionHandler:^(NSData *data, NSError *error) {
            if (error == nil) {
                // Success.
                completionBlock(data, nil);
            } else {
                NSLog(@"An error occurred: %@", error);
                completionBlock(nil, error);
            }
        }];
    } else {
        completionBlock(nil,
                        [NSError errorWithDomain:NSURLErrorDomain
                                            code:NSURLErrorBadURL
                                        userInfo:nil]);
    }
}
Or, if you can make AVPlayer send an additional header to authorize the download, add the following header:
Authorization: Bearer {YOUR_ACCESS_TOKEN}
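AVPlayer has no documented API for custom request headers, but a widely used workaround passes them through AVURLAsset's options dictionary under the undocumented "AVURLAssetHTTPHeaderFieldsKey" key (treat this as an assumption to verify, since Apple could change it):
// Unofficial workaround: "AVURLAssetHTTPHeaderFieldsKey" is not a documented
// AVURLAsset option, so treat this as fragile.
NSDictionary *headers =
    @{ @"Authorization" : [NSString stringWithFormat:@"Bearer %@", accessToken] };
AVURLAsset *asset =
    [AVURLAsset URLAssetWithURL:[NSURL URLWithString:file.downloadUrl]
                        options:@{ @"AVURLAssetHTTPHeaderFieldsKey" : headers }];
player = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithAsset:asset]];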
I'm developing an app to help me understand Objective-C/OS X.
The app simply connects to Facebook and sends a notification using NSUserNotification.
It is working fine, but now I want to add some UI to the mix.
To make the example simpler, I want to update a label (NSTextField) to show the status of the Facebook connection.
Connecting…
Connected
Failed
I have the following code in one file, FacebookRequest.m:
- (void)connectFacebook {
    if (self.account == nil) {
        self.account = [[ACAccountStore alloc] init];
    }
    ACAccountType *facebookAccount = [self.account
        accountTypeWithAccountTypeIdentifier:ACAccountTypeIdentifierFacebook];
    NSDictionary *options = @{
        ACFacebookAppIdKey: @"MY_CODE",
        ACFacebookPermissionsKey: @[@"email",
                                    @"user_about_me",
                                    @"user_likes",
                                    @"manage_notifications",
                                    @"user_activities"],
        ACFacebookAudienceKey: ACFacebookAudienceFriends
    };
    [self.account requestAccessToAccountsWithType:facebookAccount
                                          options:options
                                       completion:^(BOOL success, NSError *error) {
        if (success) {
            NSArray *accounts = [self.account accountsWithAccountType:facebookAccount];
            self.account = [accounts lastObject];
        }
        else {
            NSLog(@"Error %@", [error description]);
        }
    }];
}
and the following in my AppDelegate.m:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    [self.statusFacebook setStringValue:@"Connecting…"];
    FacebookRequest *request = [[FacebookRequest alloc] init];
    [request connectFacebook];
}
What is the best way to update the UI after the request is complete and I have an account?
I'm having trouble because the request is asynchronous and I can't return a value from inside the requestAccessToAccountsWithType: completion block. Also, if I add some "ifs" after the call to check whether my account is nil, they execute before the block has finished, so the account is still nil.
Thanks!
PS.: Sorry for the English if it is not clear enough.
You may use NSNotificationCenter for this purpose:
[self.account requestAccessToAccountsWithType:facebookAccount
                                      options:options
                                   completion:^(BOOL success, NSError *error) {
    if (success) {
        NSArray *accounts = [self.account accountsWithAccountType:facebookAccount];
        self.account = [accounts lastObject];
        // Post a notification that the UI should update here.
        [[NSNotificationCenter defaultCenter] postNotificationName:@"UpdateUI" object:nil];
    }
    else {
        NSLog(@"Error %@", [error description]);
    }
}];
Then, add the view controller that should update its UI as an observer of this notification:
- (void)viewDidLoad
{
    [super viewDidLoad];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(updateUI)
                                                 name:@"UpdateUI"
                                               object:nil];
}

- (void)updateUI {
    // Here you actually update your UI.
}
P.S. If you are not using ARC, also remove the observer in dealloc:
- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}
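One caveat worth adding: the requestAccessToAccountsWithType: completion block is not guaranteed to run on the main thread, so hop to the main queue before posting the notification (or before touching the UI in any other way):
// The completion block may execute on a background queue; post the
// UI-triggering notification from the main thread instead.
dispatch_async(dispatch_get_main_queue(), ^{
    [[NSNotificationCenter defaultCenter] postNotificationName:@"UpdateUI" object:nil];
});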
I am using the Google Drive SDK for iPhone and trying to upload an audio file into a "TestAudio" folder. If the "TestAudio" folder does not exist on Google Drive, it should be created first, and the audio should be stored only in that folder. Everything works great except the folder creation. Can anybody please help?
I am using the code below to upload the audio file.
GTLUploadParameters *uploadParameters = nil;
NSString *soundFilePath = [[NSBundle mainBundle]
                           pathForResource:@"honey_bunny_new"
                                    ofType:@"mp3"];
if (soundFilePath) {
    NSData *fileContent = [[NSData alloc] initWithContentsOfFile:soundFilePath];
    uploadParameters = [GTLUploadParameters uploadParametersWithData:fileContent
                                                            MIMEType:@"audio/mpeg"];
}
self.driveFile.title = self.updatedTitle;
GTLQueryDrive *query = nil;
if (self.driveFile.identifier == nil || self.driveFile.identifier.length == 0) {
    // This is a new file, instantiate an insert query.
    query = [GTLQueryDrive queryForFilesInsertWithObject:self.driveFile
                                        uploadParameters:uploadParameters];
} else {
    // This file already exists, instantiate an update query.
    query = [GTLQueryDrive queryForFilesUpdateWithObject:self.driveFile
                                                  fileId:self.driveFile.identifier
                                        uploadParameters:uploadParameters];
}
UIAlertView *alert = [DrEditUtilities showLoadingMessageWithTitle:@"Saving file"
                                                         delegate:self];
[self.driveService executeQuery:query completionHandler:^(GTLServiceTicket *ticket,
                                                          GTLDriveFile *updatedFile,
                                                          NSError *error) {
    [alert dismissWithClickedButtonIndex:0 animated:YES];
    if (error == nil) {
        self.driveFile = updatedFile;
        self.originalContent = [self.textView.text copy];
        self.updatedTitle = [updatedFile.title copy];
        [self toggleSaveButton];
        [self.delegate didUpdateFileWithIndex:self.fileIndex
                                    driveFile:self.driveFile];
        [self doneEditing:nil];
    } else {
        NSLog(@"An error occurred: %@", error);
        [DrEditUtilities showErrorMessageWithTitle:@"Unable to save file"
                                           message:error.description
                                          delegate:self];
    }
}];
I don't see your code to create a folder, but I was having the same problem with folder creation myself. As you know, the mimeType must be "application/vnd.google-apps.folder". I ran into assert failures when the NSData parameter to uploadParametersWithData: was nil; a zero-length NSData object failed too, as did a 1-byte one. The trick is to call queryForFilesUpdateWithObject: with uploadParameters:nil - then the folder creation works fine. I also discovered that the Objective-C code shown at the end of:
https://developers.google.com/drive/v2/reference/files/insert
is incorrect. The file.parents setup should be as follows:
GTLDriveParentReference *parentRef = [GTLDriveParentReference object];
parentRef.identifier = parentID;
if (parentID.length > 0)
    file.parents = [NSArray arrayWithObjects:parentRef, nil];
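For completeness, here is a minimal folder-creation sketch along the lines described above - a query for the folder MIME type with uploadParameters:nil (the completion handling and logging are illustrative, not from the original answer):
// Sketch: create the "TestAudio" folder. The folder MIME type is mandatory,
// and passing uploadParameters:nil avoids the assert failures described above.
GTLDriveFile *folder = [GTLDriveFile object];
folder.title = @"TestAudio";
folder.mimeType = @"application/vnd.google-apps.folder";
GTLQueryDrive *query = [GTLQueryDrive queryForFilesInsertWithObject:folder
                                                   uploadParameters:nil];
[self.driveService executeQuery:query
              completionHandler:^(GTLServiceTicket *ticket,
                                  GTLDriveFile *createdFolder,
                                  NSError *error) {
    if (error == nil) {
        // Use createdFolder.identifier as the parent reference for the upload.
        NSLog(@"Created folder %@", createdFolder.identifier);
    } else {
        NSLog(@"Folder creation failed: %@", error);
    }
}];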
I'm trying to implement fetching a Picasa Web Album on iPhone. I downloaded the code and example from Google Code, but have run into a problem where fetching an album feed returns a feed with the wrong object types - GDataEntryBase instead of GDataEntryPhoto.
Here's the code I'm using:
First I'm calling this to get all my albums:
- (void)fetchAllAlbums
{
    NSLog(@"Fetching all albums");
    // Request albums.
    GDataServiceTicket *ticket;
    NSURL *feedURL = [GDataServiceGooglePhotos photoFeedURLForUserID:myemail
                                                             albumID:nil
                                                           albumName:nil
                                                             photoID:nil
                                                                kind:nil
                                                              access:nil];
    ticket = [_GooglePhotoService fetchFeedWithURL:feedURL
                                          delegate:self
                                 didFinishSelector:@selector(albumListFetchTicket:finishedWithFeed:error:)];
    [self set_AlbumFetchTicket:ticket];
}
Then, in the callback, I fetch all the photos of each returned album:
- (void)albumListFetchTicket:(GDataServiceTicket *)ticket
            finishedWithFeed:(GDataFeedPhotoUser *)feed
                       error:(NSError *)error
{
    [self set_UserAlbumFeed:feed];
    [self set_AlbumFetchError:error];
    [self set_AlbumFetchTicket:nil];
    if (error == nil) {
        NSLog(@"Got albums!");
        for (GDataEntryPhotoAlbum *albumEntry in _UserAlbumFeed)
        {
            NSLog(@"Album Title: %@", [[albumEntry title] stringValue]);
            NSLog(@"Fetching photos!");
            [self set_AlbumPhotosFeed:nil];
            [self set_PhotosFetchError:nil];
            [self set_PhotosFetchTicket:nil];
            GDataServiceTicket *ticket;
            ticket = [_GooglePhotoService fetchFeedWithURL:[[albumEntry feedLink] URL]
                                                  delegate:self
                                         didFinishSelector:@selector(photosTicket:finishedWithFeed:error:)];
            [self set_PhotosFetchTicket:ticket];
        }
    }
}
and this is the callback for each album photo feed fetch:
// Photo list fetch callback.
- (void)photosTicket:(GDataServiceTicket *)ticket
    finishedWithFeed:(GDataFeedPhotoAlbum *)feed
               error:(NSError *)error
{
    // Tell me what class you are.
    NSLog(@"Feed class: %@", NSStringFromClass([feed class]));
    [self set_AlbumPhotosFeed:feed];
    [self set_PhotosFetchError:error];
    [self set_PhotosFetchTicket:ticket];
    if (error == nil)
    {
        NSLog(@"Got Photos!");
        for (GDataEntryPhoto *photo in feed)
        {
            NSLog(@"Title: %@", [[photo title] stringValue]);
            // Tell me what class you are.
            NSLog(@"%@", NSStringFromClass([photo class]));
            //NSArray *thumbnails = [[photo mediaGroup] mediaThumbnails];
            //NSLog(@"thumbnails count: %d", [thumbnails count]);
            //NSLog(@"Photo thumbnail url: %@", [[thumbnails objectAtIndex:0] URLString]);
        }
    }
}
The trouble is that the entries in the feed in the last callback are not of type GDataEntryPhoto, only the base GDataEntryBase - so trying to access their thumbnail URLs crashes the app.
The code is copied from Google's Cocoa (non-touch) example, where it works - the returned feed is populated with GDataEntryPhoto objects.
Any help would be greatly appreciated.
Add -ObjC -all_load to Other Linker Flags in your Xcode project, then add SystemConfiguration.framework, CFNetwork.framework, and Security.framework under Build Phases -> Link Binary With Libraries. Without those linker flags, the linker strips Objective-C category code out of the static GData library, so the feed parser cannot map entries to their concrete classes and falls back to GDataEntryBase.
I have a video playing app which displays nothing in the AVPlayerLayer after repeatedly presenting and hiding the modal view which contains it. If I dismiss the modal view when this happens, the next load usually displays fine (?!!). The black screen issue happens roughly 20% of the time.
I build an AVMutableComposition to make the AVPlayerItem, but this bug happens even if there's only a single sample involved.
The issue can also be reproduced with a lot of app switching and turning music on and off. I do include music controls in my app (along with a simple view which displays the currently playing iTunes track).
This only happens on iOS 4. It used to happen on iOS 5 as well, but when I started recycling the view which contains the AVPlayerLayer, it worked fine. The only things I don't recycle are the AVPlayer and the relevant AVPlayerItem.
Here's how I load the assets and build a player:
- (void)loadAssetsFromFiles:(id)sender {
    NSLog(@"loadAssetsFromFiles called");
    assert([assetURL2clipID count] > 0);
    self.isWaitingToLoadAssets = NO;
    composition = [AVMutableComposition new];
    videoComposition = [AVMutableVideoComposition new];
    [self releaseAssets];
    // We're going to add this asset to a composition, so we'll need random access available.
    // WARNING: This can cause slow initial loading, so consider loading files later and as needed.
    NSDictionary *assetOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                                             forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    // Iterate through the asset URLs we know we need to load.
    for (NSURL *fileURL in [assetURL2clipID allKeys])
    {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:assetOptions];
        assert(asset);
        // Index assets by clipID.
        [assets setObject:asset forKey:[assetURL2clipID objectForKey:fileURL]];
        NSString *tracksKey = @"tracks";
        [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:
         ^{
             NSLog(@"an asset completed loading values for keys.");
             NSLog(@"Tracks loaded:");
             [asset.tracks enumerateObjectsUsingBlock:
              ^(AVAssetTrack *obj, NSUInteger index, BOOL *stop)
              {
                  NSLog(@"\n mediaType: %@\n trackID: %d\n", obj.mediaType, obj.trackID);
              }];
             NSArray *metadata = [asset commonMetadata];
             for (AVMetadataItem *item in metadata) {
                 NSString *key = [item commonKey];
                 NSString *value = [item stringValue];
                 NSLog(@" metadata key = %@, value = %@", key, value);
             }
             if (!viewIsActive)
             {
                 NSLog(@"An asset finished loading while the player view was inactive! Did you make sure cancelLoading was called on this asset?");
             }
             // Completion handler block.
             NSError *error = nil;
             AVKeyValueStatus status = [asset statusOfValueForKey:tracksKey error:&error];
             if (status == AVKeyValueStatusLoaded && error == nil) {
                 // If we've loaded all of our assets, it's time to build the composition and prepare the player.
                 loadedAssets++;
                 if (loadedAssets == [assets count])
                 {
                     CGSize videoSize = [asset naturalSize];
                     // Every video composition needs these set.
                     videoComposition.renderSize = videoSize;
                     videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps. TODO: Set this to the framerate of one of the assets.
                     // Build the compositions using the assets we've already got.
                     [self buildCompositions];
                     self.playerItem = [AVPlayerItem playerItemWithAsset:composition];
                     self.playerItem.videoComposition = videoComposition;
                     // TODO: Adding observers should happen on the main thread to prevent a partial notification.
                     [playerItem addObserver:self forKeyPath:@"status"
                                     options:0 context:&ItemStatusContext];
                     [[NSNotificationCenter defaultCenter] addObserver:self
                                                              selector:@selector(playerItemDidReachEnd:)
                                                                  name:AVPlayerItemDidPlayToEndTimeNotification
                                                                object:playerItem];
                     self.player = [AVPlayer playerWithPlayerItem:playerItem];
                     [playerView setPlayer:player];
                     [self.player addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
                     self.isObservingPlayerStatus = YES;
                 }
             } else if (error != nil) {
                 // Deal with the error appropriately.
                 NSLog(@"WARNING: An asset's tracks were not loaded, so the composition cannot be completed. Error:\n%@\nstatus of asset: %d", [error localizedDescription], status);
             }
             else
             {
                 // There was no error, but we don't know what the problem was.
                 NSLog(@"WARNING: An asset's tracks were not loaded, so the composition cannot be completed. No error was reported.\nstatus of asset: %d", status);
             }
         }];
    }
}
That [self buildCompositions] call builds an AVVideoComposition to do opacity ramps, but I tried bypassing it and got the same problem.
When profiling the program, CoreAnimation reports a framerate of ~45 FPS when everything is working correctly, and 0-4 FPS when the blank screen rears its presumably ugly head.
This person seems to have had a similar problem, but for me recycling the views only fixed things on iOS 5:
Playing many different videos on iphone using AVPlayer
I am writing an application that uses the AVAudioRecorder class. It works great except when a phone call comes in. I am handling this per Apple's guidelines, using the AVAudioRecorderDelegate methods:
- (void)audioRecorderBeginInterruption:
- (void)audioRecorderEndInterruption:
It works great until the interruption ends and I attempt to resume the recording by calling the record method again (per the documentation). However, it does not resume my recording; instead it throws out the old one and starts an entirely new one in its place. I have not been able to find a solution to this problem. If anyone has figured this out, or knows whether it is a bug with Apple's AVAudioRecorder, please let me know. I really hope I do not have to rewrite this using Audio Queues.
Thanks
Looks like it's a bug in Apple's API. Great fun...
This was the response we received from a support ticket:
"The behavior you described is a bug and unfortunately there's nothing in the API that you can change to work around to actually append to the original recording. The interruption is resulting in capturing only the audio recorded after the interruption. You could try and stop the recording after the interruption then creating a new file after which would at least not cause the user to loose any information, but the result would be two separate files.
Please file a bug report at for this issue since bugs filed by external developers are critical when iOS engineering is evaluating critical features of fixes to address. It's easily reproducible but if you have a test app you can include please do, iOS engineering like having apps that show the bug directly.
"
My solution was:
Start recording into a temp file.
Watch for AVAudioSessionInterruptionNotification.
On AVAudioSessionInterruptionTypeBegan, stop the recording.
On AVAudioSessionInterruptionTypeEnded, start a new recording.
When the user stops, merge the files (see the merge sketch after the code below).
Full code:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(audioSessionInterruptionNotification:)
                                             name:AVAudioSessionInterruptionNotification
                                           object:audioSession];

- (void)audioSessionInterruptionNotification:(NSNotification *)notification {
    NSString *seccReason = @"";
    // Check the type of notification, especially if you are sending multiple AVAudioSession events here.
    NSLog(@"Interruption notification name %@", notification.name);
    if ([notification.name isEqualToString:AVAudioSessionInterruptionNotification]) {
        seccReason = @"Interruption notification received";
        // Check to see if it was a begin interruption.
        if ([[notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey] isEqualToNumber:[NSNumber numberWithInt:AVAudioSessionInterruptionTypeBegan]]) {
            seccReason = @"Interruption began";
            NSLog(@"Interruption notification name %@ audio pause", notification.name);
            dispatch_time_t restartTime = dispatch_time(DISPATCH_TIME_NOW,
                                                        0.01 * NSEC_PER_SEC);
            dispatch_after(restartTime, dispatch_get_global_queue(0, 0), ^{
                AVAudioRecorder *recorder = [[self recorderPool] objectForKey:lastRecID];
                if (recorder) {
                    if (recorder.isRecording) {
                        [recorder stop];
                        NSLog(@"Interruption notification: pausing recording %@", lastRecID);
                    } else {
                        NSLog(@"Interruption notification: already paused %@", lastRecID);
                    }
                } else {
                    NSLog(@"Interruption notification: recording %@ not found", lastRecID);
                }
                NSLog(@"Interruption notification: pausing recording status %d", recorder.isRecording);
            });
        } else if ([[notification.userInfo valueForKey:AVAudioSessionInterruptionTypeKey] isEqualToNumber:[NSNumber numberWithInt:AVAudioSessionInterruptionTypeEnded]]) {
            seccReason = @"Interruption ended!";
            NSLog(@"Interruption notification name %@ audio resume", notification.name);
            // Start a new recording.
            dispatch_time_t restartTime = dispatch_time(DISPATCH_TIME_NOW,
                                                        0.1 * NSEC_PER_SEC);
            dispatch_after(restartTime, dispatch_get_global_queue(0, 0), ^{
                AVAudioRecorder *recorder = [[self recorderPool] objectForKey:lastRecID];
                NSLog(@"Interruption notification: resuming recording status %d", recorder.isRecording);
                if (recorder) {
                    if (!recorder.isRecording) {
                        NSString *filePath = [[self orgFileNames] objectForKey:lastRecID];
                        NSArray *fileNames = [[self fileNames] objectForKey:lastRecID];
                        NSString *tmpFileName = [self gnrTempFileName:filePath AndNumber:fileNames.count];
                        [[[self fileNames] objectForKey:lastRecID] addObject:tmpFileName];
                        NSURL *url = [NSURL fileURLWithPath:tmpFileName];
                        NSError *error = nil;
                        recorder = [[AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&error];
                        if (![recorder record]) {
                            NSLog(@"Interruption notification: error resuming recording %@", error);
                            return;
                        }
                        [[self recorderPool] setObject:recorder forKey:lastRecID];
                        NSLog(@"Interruption notification: resuming recording %@", lastRecID);
                    } else {
                        NSLog(@"Interruption notification: already recording %d", recorder.isRecording);
                    }
                } else {
                    NSLog(@"Interruption notification: recording %@ not found", lastRecID);
                }
            });
        }
    }
}
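The "merge the files" step from the list above isn't shown in the code. Here is a minimal sketch of one way to do it, assuming all the partial recordings share a format (the method name and completion handling are illustrative, not part of the original answer):
// Hypothetical merge step: append each partial recording onto one composition,
// then export the result as a single M4A file.
- (void)mergeRecordingsAtURLs:(NSArray *)partURLs toURL:(NSURL *)outputURL {
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *track =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime cursor = kCMTimeZero;
    for (NSURL *url in partURLs) {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        NSArray *srcTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
        if ([srcTracks count] == 0) continue;
        [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                       ofTrack:[srcTracks objectAtIndex:0]
                        atTime:cursor
                         error:NULL];
        cursor = CMTimeAdd(cursor, asset.duration);
    }
    AVAssetExportSession *export =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetAppleM4A];
    export.outputURL = outputURL;
    export.outputFileType = AVFileTypeAppleM4A;
    [export exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"Merge finished with status %d", (int)export.status);
    }];
}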
Try using this piece of code:
- (IBAction)pauseandplay:(id)sender
{
    BOOL status = [player isPlaying];
    if (status)
    {
        [pauseplay setImage:[UIImage imageNamed:@"play.png"]];
        [player pause];
    }
    else
    {
        [pauseplay setImage:[UIImage imageNamed:@"icon-pause.png"]];
        [player play];
        updateTimer = [NSTimer scheduledTimerWithTimeInterval:.01
                                                       target:self
                                                     selector:@selector(updateCurrentTime)
                                                     userInfo:player
                                                      repeats:YES];
    }
}
}