Ok, I have seen similar questions here but none are actually answering the problem for me.
I have a streaming audio app and the stream source returns to me the song title and artist name. I have an iTunes button in the app, and want to open the iTunes STORE (search) to that exact song, or at least close. I have tried the following:
NSString *baseString = @"itms://phobos.apple.com/WebObjects/MZSearch.woa/wa/advancedSearchResults?songTerm=";
NSString *str1 = [self.songTitle2 stringByReplacingOccurrencesOfString:@" " withString:@"+"];
NSString *str2 = [self.artist2 stringByReplacingOccurrencesOfString:@" " withString:@"+"];
NSString *str = [NSString stringWithFormat:@"%@%@&artistTerm=%@", baseString, str1, str2];
[[UIApplication sharedApplication] openURL:[NSURL URLWithString:str]];
This call does indeed switch me to the iTunes STORE as expected, but then it pops up an error "Cannot connect to iTunes Store". I am obviously online, as the song is actively streaming, and I am in the store. The search box in the iTunes app only shows the song name and nothing else.
Here is an example of a generated string:
itms://phobos.apple.com/WebObjects/MZSearch.woa/wa/advancedSearchResults?artistTerm=Veruca+Salt&artistTerm=Volcano+Girls
I have tried taking the string it generates and pasting it into Safari, and it works OK on my Mac, opening to albums from the artist in the store. Why not on the phone?
Also, it seems to ignore both items, as it does not take me to the song by that artist. Does this also require knowing the album name (which I do not have at this time)?
Help would be appreciated. Thanks.
Yes, I am answering my own question.
After much digging and a talk with one of the best programmers I know, we have a solution, so I thought I would share it here. This solution takes the song name and artist, makes a call to the iTunes Search (Link Maker) API, gets back a JSON document, and extracts the necessary info to create a link to the iTunes Store, opening the store to that song in an album by that artist.
In the interface of the view controller, add:
@property (strong, readonly, nonatomic) NSOperationQueue* operationQueue;
@property (nonatomic) BOOL searching;
In the implementation:
@synthesize operationQueue = _operationQueue;
@synthesize searching = _searching;
Here are the methods and code that will do this for you:
// lazily create the operation queue on first access
- (NSOperationQueue *)operationQueue
{
    if (_operationQueue == nil) {
        _operationQueue = [NSOperationQueue new];
    }
    return _operationQueue;
}
// change searching state, and update the button and wait indicator (if you wish)
- (void)setSearching:(BOOL)searching
{
    // this swaps the search button for a wait indicator while the search is performed
    _searching = searching;
    dispatch_async(dispatch_get_main_queue(), ^{
        if (searching) {
            self.searchButton.enabled = NO;
            [self.searchButton setTitle:@"" forState:UIControlStateNormal];
            [self.activityIndicator startAnimating];
        } else {
            self.searchButton.enabled = YES;
            [self.searchButton setTitle:@"Search" forState:UIControlStateNormal];
            [self.activityIndicator stopAnimating];
        }
    });
}
// based on info from the iTunes affiliates docs
// http://www.apple.com/itunes/affiliates/resources/documentation/itunes-store-web-service-search-api.html
// this assumes a search button to start the search
- (IBAction)searchButtonTapped:(id)sender {
    NSString *artistTerm = self.artistField.text; // the artist text
    NSString *songTerm = self.songField.text;     // the song text
    // they both need to be non-empty for this to work right
    if (artistTerm.length > 0 && songTerm.length > 0) {
        // this creates the base of the Link Maker url call
        NSString *baseURLString = @"https://itunes.apple.com/search";
        NSString *searchTerm = [NSString stringWithFormat:@"%@ %@", artistTerm, songTerm];
        NSString *searchUrlString = [NSString stringWithFormat:@"%@?media=music&entity=song&term=%@&artistTerm=%@&songTerm=%@", baseURLString, searchTerm, artistTerm, songTerm];
        // must change spaces to +
        searchUrlString = [searchUrlString stringByReplacingOccurrencesOfString:@" " withString:@"+"];
        // percent-escape anything else and make it a URL
        searchUrlString = [searchUrlString stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
        NSURL *searchUrl = [NSURL URLWithString:searchUrlString];
        NSLog(@"searchUrl: %@", searchUrl);
        // start the Link Maker search
        NSURLRequest *request = [NSURLRequest requestWithURL:searchUrl];
        self.searching = YES;
        [NSURLConnection sendAsynchronousRequest:request queue:self.operationQueue completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
            // we got an answer, now find the data
            self.searching = NO;
            if (error != nil) {
                NSLog(@"Error: %@", error);
            } else {
                NSError *jsonError = nil;
                NSDictionary *dict = [NSJSONSerialization JSONObjectWithData:data options:0 error:&jsonError];
                if (jsonError != nil) {
                    // do something with the error here
                    NSLog(@"JSON Error: %@", jsonError);
                } else {
                    NSArray *resultsArray = dict[@"results"];
                    // it is possible to get no results. Handle that here
                    if (resultsArray.count == 0) {
                        NSLog(@"No results returned.");
                    } else {
                        // extract the needed info to pass to the iTunes Store search
                        NSDictionary *trackDict = resultsArray[0];
                        NSString *trackViewUrlString = trackDict[@"trackViewUrl"];
                        if (trackViewUrlString.length == 0) {
                            NSLog(@"No trackViewUrl");
                        } else {
                            NSURL *trackViewUrl = [NSURL URLWithString:trackViewUrlString];
                            NSLog(@"trackViewURL: %@", trackViewUrl);
                            // dispatch the call to switch to the iTunes Store with the proper search url
                            dispatch_async(dispatch_get_main_queue(), ^{
                                [[UIApplication sharedApplication] openURL:trackViewUrl];
                            });
                        }
                    }
                }
            }
        }];
    }
}
The JSON that comes back has a LOT of other good info you could extract here as well, including three sizes of album art, album name, cost, etc.
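For example, a few of those other fields could be pulled from the same trackDict inside the completion handler (field names as documented for the iTunes Search API):

NSString *albumName  = trackDict[@"collectionName"];
NSString *artworkUrl = trackDict[@"artworkUrl100"];   // artworkUrl30 and artworkUrl60 are the smaller sizes
NSNumber *trackPrice = trackDict[@"trackPrice"];
NSString *currency   = trackDict[@"currency"];
NSLog(@"%@ | %@ | %@ %@", albumName, artworkUrl, trackPrice, currency);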
I hope this helps someone else out. This stumped me for quite some time, and I thank a good friend of mine for making this work.
You are in fact using a search URL; that's why iTunes opens to the search page. iTunes on Mac OS X also opens to search for me.
Use the Search API for iTunes to search for the content you want and get the artist, album or song ids, so you can generate a direct URL for that content.
Look at the iTunes Link Maker to see how to create a URL for an artist or for a specific album, and compose that URL in your app.
It appears that iOS now opens the iTunes app directly when you try to open an iTunes HTML URL.
For example, calling openURL on https://itunes.apple.com/br/album/falando-de-amor/id985523754 opens the iTunes app instead of the website.
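A minimal call with the album URL above would be:

NSURL *albumURL = [NSURL URLWithString:@"https://itunes.apple.com/br/album/falando-de-amor/id985523754"];
[[UIApplication sharedApplication] openURL:albumURL]; // hands off to the iTunes Store app on current iOS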
Sometimes when I launch the Facebook share dialog, the OS switches to the Facebook app, but the share dialog doesn't load. Then if you try again, it usually works. Why doesn't it work the first time?
NSURL *url = [NSURL URLWithString:@"some URL"];
FBShareDialogParams *params = [[FBShareDialogParams alloc] init];
params.link = url;
if ([FBDialogs canPresentShareDialogWithParams:params]) {
    [FBDialogs presentShareDialogWithLink:url
                                  handler:^(FBAppCall *call, NSDictionary *results, NSError *error) {
        if (error) {
            NSLog(@"Error posting to FB: %@", error.description);
            // do something
        } else {
            // do something else
        }
    }];
}
This class is no longer available in the most recent version of the SDK.
You should use FBSDKShareDialog instead. The latest version is v4.11.0.
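As a rough sketch with FBSDKShareKit 4.x (the link URL is a placeholder, and this assumes it is called from a view controller):

#import <FBSDKShareKit/FBSDKShareKit.h>

FBSDKShareLinkContent *content = [[FBSDKShareLinkContent alloc] init];
content.contentURL = [NSURL URLWithString:@"https://example.com/some-link"]; // placeholder URL

FBSDKShareDialog *dialog = [[FBSDKShareDialog alloc] init];
dialog.fromViewController = self;            // the presenting view controller
dialog.shareContent = content;
dialog.mode = FBSDKShareDialogModeAutomatic; // falls back to the web dialog if the app dialog is unavailable
[dialog show];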
Thanks.
I want to implement a location check-in feature in iOS. My application is using the Google+ iOS SDK. The problem I am facing is that after I have implemented the Google+ check-in feature, the post is not displayed on my Google+ account.
The technique I have tried and implemented is written below.
- (IBAction)checkIn
{
    GTMOAuth2Authentication *auth = [GPPSignIn sharedInstance].authentication;
    GTLServicePlus *plusService = [[GTLServicePlus alloc] init];
    plusService.retryEnabled = YES;
    [plusService setAuthorizer:auth];

    GTLPlusMoment *moment = [[GTLPlusMoment alloc] init];
    moment.type = @"http://schemas.google.com/CheckInActivity";

    GTLPlusItemScope *target = [[GTLPlusItemScope alloc] init];
    target.url = @"https://developers.google.com/+/plugins/snippet/examples/place";
    moment.target = target;

    GTLQueryPlus *query =
        [GTLQueryPlus queryForMomentsInsertWithObject:moment
                                               userId:@"me"
                                           collection:kGTLPlusCollectionVault];
    [plusService executeQuery:query
            completionHandler:^(GTLServiceTicket *ticket,
                                id object,
                                NSError *error) {
        if (error) {
            GTMLoggerError(@"Error: %@", error);
            NSString *failure =
                [NSString stringWithFormat:@"Status: Error: %@", error];
            NSLog(@"%@", failure);
        } else {
            NSString *result = @"CheckedIn Saved in Google+";
            NSLog(@"%@", result);
        }
    }];
}
Can anyone please help me out? Is this the right way of implementing the Google+ location check-in feature, or is there another method for it?
The method you're using is writing an "app activity" to Google+, which stores a "moment" in the user's app activity vault. As noted on https://developers.google.com/+/mobile/ios/app-activities these moments are not directly visible on the user's stream, although users may choose to share them to the stream if they wish.
To see the moments that have been shared, you will need to use the desktop app. Your profile has a list of apps that are using the Google+ Sign-In and you can view, share, and delete the activities for each of these apps. The mobile Google+ clients don't let you view the activities yet.
I'm trying to use Google's Objective-C YouTube APIs to fetch a YouTube channel's playlist, with no luck.
I downloaded Google's official API from:
http://code.google.com/p/gdata-objectivec-client/source/browse/#svn%2Ftrunk%2FExamples%2FYouTubeSample
But the sample app doesn't really do anything; it's not even an iOS sample app, it seems to be a Mac OS app. Its Read-Me file says: "This sample should automatically build and copy over the GTL.framework as part of the build-and-run process."
OK... and then what?
How do you get this to work in an iPhone app?
I haven't found any actual instructions to make this work.
Any idea what we're supposed to do here?
You can try the source code at this path:
https://bitbucket.org/eivvanov/youtubedemo/overview
I spent a day and a half trying to figure out how to use the Mac OS X app they give as an example. I ended up with an iPhone app that I managed to build which gets all the videos I have uploaded to YouTube.
Link: YouTubeProject
In order to make it work:
You have to add the GData project from Google.
In LTMasterViewController.m, in -(GDataServiceGoogleYouTube *)youTubeService, put your username and password (see the sketch below).
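A minimal sketch of that accessor (the credentials are placeholders; this assumes the GData Objective-C client is already linked into the project):

- (GDataServiceGoogleYouTube *)youTubeService {
    static GDataServiceGoogleYouTube *service = nil;
    if (service == nil) {
        service = [[GDataServiceGoogleYouTube alloc] init];
    }
    // placeholder credentials -- replace with your own account
    [service setUserCredentialsWithUsername:@"you@example.com"
                                   password:@"yourPassword"];
    return service;
}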
The "gdata-objectivec-client" for youtube been superseded by a JSON-API Link. Scroll down to youtube.
For supporting the JSON-API here is the details Link.
And for fetching the playlist have a look at the Link.
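As a hedged sketch (endpoint and parameter names follow the YouTube Data API v3 playlistItems reference; the API key and playlist ID in curly braces are placeholders you must replace), a playlist fetch looks much like the channel search in the next answer:

NSString *playlistUrlString = @"https://www.googleapis.com/youtube/v3/playlistItems?part=snippet&maxResults=20&playlistId={PLAYLIST_ID}&key={YOUR_API_KEY}";
NSURL *playlistUrl = [NSURL URLWithString:playlistUrlString];
[NSURLConnection sendAsynchronousRequest:[NSURLRequest requestWithURL:playlistUrl]
                                   queue:[[NSOperationQueue alloc] init]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    if (data && !error) {
        NSDictionary *json = [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL];
        // each entry in "items" has snippet.title and snippet.resourceId.videoId
        NSLog(@"%@", json[@"items"]);
    }
}];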
For total newbies who are lost: consider a sample function that will help you understand the entire cycle of fetch, parse, and display, and bring a YouTube channel's videos into your table view specifically. I'm not writing the full table view part here, but a minimal sketch follows the code below.
- (void)initiateRequestToYoutubeApiAndGetChannelInfo
{
    NSString *urlYouCanUseAsSample = @"https://www.googleapis.com/youtube/v3/search?key={YOUR_API_KEY_WITHOUT_CURLY_BRACES}&channelId={CHANNEL_ID_YOU_CAN_GET_FROM_ADDRESS_BAR_WITHOUT_CURLY_BRACES}&part=snippet,id&order=date&maxResults=20";
    NSURL *url = [[NSURL alloc] initWithString:urlYouCanUseAsSample];
    // Create your request
    NSURLRequest *request = [NSURLRequest requestWithURL:url];
    // Send the request asynchronously; remember to reload the table view on the main thread
    [NSURLConnection sendAsynchronousRequest:request queue:[[NSOperationQueue alloc] init] completionHandler:^(NSURLResponse *response, NSData *data, NSError *connectionError) {
        // Callback: parse the data and check for errors
        if (data && !connectionError) {
            NSError *jsonError;
            NSDictionary *jsonResult = [NSJSONSerialization JSONObjectWithData:data options:NSJSONReadingMutableContainers error:&jsonError];
            if (!jsonError) {
                // better put a breakpoint here to see what the result looks like; channel id, name, etc. should be there
                NSLog(@"%@", jsonResult);
                /// separating the "items" array from the result dictionary
                NSArray *itemList = jsonResult[@"items"];
                for (int i = 0; i < itemList.count; i++) {
                    NSDictionary *item = itemList[i];
                    /// separating the VIDEO ID dictionary from the item and extracting the video id string
                    NSDictionary *vid_id = item[@"id"];
                    NSString *video_ID = vid_id[@"videoId"];
                    // you can fill your local array for video ids at this point
                    // [video_IDS addObject:video_ID];
                    /// separating the snippet dictionary from the item
                    NSDictionary *snip = item[@"snippet"];
                    /// separating TITLE and DESCRIPTION from the snippet dictionary
                    NSString *title_For_Video = snip[@"title"];
                    NSString *desc_For_Video = snip[@"description"];
                    // you can fill your local arrays for titles & descriptions at this point
                    // [video_titles addObject:title_For_Video];
                    // [video_description addObject:desc_For_Video];
                    /// separating the thumbnail dictionary from the snippet dictionary
                    NSDictionary *thumbnail_ = snip[@"thumbnails"];
                    /// separating the high-resolution url dictionary from the thumbnail dictionary
                    NSDictionary *high_res = thumbnail_[@"high"];
                    /// separating the HIGH RES THUMBNAIL IMG URL from the high res dictionary
                    NSString *thumbnail_url = high_res[@"url"];
                    // you can fill your local array for thumbnail urls at this point
                    [video_thumbnail_url addObject:thumbnail_url];
                }
                // reload your table view on the main thread
                // [self.tableView performSelectorOnMainThread:@selector(reloadData) withObject:nil waitUntilDone:NO];
                [self performSelectorOnMainThread:@selector(reloadInputViews) withObject:nil waitUntilDone:NO];
                // you can log all local arrays for convenience
                // NSLog(@"%@", video_IDS);
                // NSLog(@"%@", video_titles);
                // NSLog(@"%@", video_description);
                // NSLog(@"%@", video_thumbnail_url);
            }
            else
            {
                NSLog(@"an error occurred");
            }
        }
    }];
}
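A minimal sketch of the table view part (assuming video_titles, video_description and video_thumbnail_url are the mutable arrays filled above, and that this controller is the table's data source):

- (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section {
    return video_titles.count;
}

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"VideoCell"];
    if (cell == nil) {
        cell = [[UITableViewCell alloc] initWithStyle:UITableViewCellStyleSubtitle
                                      reuseIdentifier:@"VideoCell"];
    }
    cell.textLabel.text = video_titles[indexPath.row];
    cell.detailTextLabel.text = video_description[indexPath.row];
    return cell;
}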
Does anyone know about Facebook checkins with the Facebook iOS SDK?
I have made an application using the Facebook Graph API and now I want to add check-in capability to it. How would I do this?
I have tried the following code but it returns nil.
FbGraphResponse *fb_graph_response = [fbGraph doGraphGet:@"me/checkins" withGetVars:nil];
// my doGraphGet method
- (FbGraphResponse *)doGraphGet:(NSString *)action withGetVars:(NSDictionary *)get_vars {
    NSString *url_string = [NSString stringWithFormat:@"https://graph.facebook.com/%@?", action];
    // tack on any get vars we have...
    if ((get_vars != nil) && ([get_vars count] > 0)) {
        NSEnumerator *enumerator = [get_vars keyEnumerator];
        NSString *key;
        NSString *value;
        while ((key = (NSString *)[enumerator nextObject])) {
            value = (NSString *)[get_vars objectForKey:key];
            url_string = [NSString stringWithFormat:@"%@%@=%@&", url_string, key, value];
        } // end while
    } // end if
    if (accessToken != nil) {
        // now that any variables have been appended, let's attach the access token...
        url_string = [NSString stringWithFormat:@"%@access_token=%@", url_string, self.accessToken];
    }
    // encode the string
    url_string = [url_string stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    return [self doGraphGetWithUrlString:url_string];
}
This isn't a great answer, but it needs to be noticed. The project linked in your comment to mine under your question is no longer supported or maintained:
NOTE!!: this project is no longer maintained. The official Facebook/iOS SDK can be found here: https://github.com/facebook/facebook-iphone-sdk This project is an open source Objective-C (iPhone/iPad) library for communicating with the Facebook Graph API
That is why I didn't recognize your code; it isn't the official Facebook iOS SDK.
I strongly encourage you to switch to the up-to-date, and more importantly, maintained project as soon as possible.
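For what it's worth, with the official 3.x-era SDK a check-in is just a Graph API POST to me/feed with a place page ID. A hedged sketch (the place ID and message are placeholders, and your session needs publish permissions):

NSDictionary *params = @{ @"place"   : @"PLACE_PAGE_ID",   // placeholder Facebook place/page id
                          @"message" : @"Checking in!" };
[FBRequestConnection startWithGraphPath:@"me/feed"
                             parameters:params
                             HTTPMethod:@"POST"
                      completionHandler:^(FBRequestConnection *connection, id result, NSError *error) {
    if (error) {
        NSLog(@"Check-in failed: %@", error);
    } else {
        NSLog(@"Check-in posted: %@", result);
    }
}];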
I was wondering how to access an MPMediaItem's raw data.
Any ideas?
You can obtain the media item's data in the following way:
- (void)mediaItemToData
{
    // Implement the media item picker in your project
    MPMediaItem *curItem = musicPlayer.nowPlayingItem;
    NSURL *url = [curItem valueForProperty:MPMediaItemPropertyAssetURL];
    AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:songAsset
                                                                      presetName:AVAssetExportPresetPassthrough];
    exporter.outputFileType = @"public.mpeg-4";
    NSString *exportFile = [[self myDocumentsDirectory] stringByAppendingPathComponent:@"exported.mp4"];
    NSURL *exportURL = [[NSURL fileURLWithPath:exportFile] retain];
    exporter.outputURL = exportURL;
    // do the export
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        NSData *data = [NSData dataWithContentsOfFile:[[self myDocumentsDirectory]
                                                       stringByAppendingPathComponent:@"exported.mp4"]];
        // Do something with the data
    }];
}
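The snippet above assumes a myDocumentsDirectory helper; a minimal version might be:

// Assumed helper: returns the app's Documents directory path.
- (NSString *)myDocumentsDirectory {
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    return [paths objectAtIndex:0];
}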
This code will work only on iOS 4.0 and later.
Good luck!
Of course you can access the data of a MPMediaItem. It's not crystal clear at once but it works. Here's how:
Get the media item's URL from its MPMediaItemPropertyAssetURL property
Initialize an AVURLAsset with this URL
Initialize an AVAssetReader with this asset
Fetch the AVAssetTrack you want to read from the AVURLAsset
Create an AVAssetReaderTrackOutput with this track
Add this output to the AVAssetReader created before and -startReading
Fetch all data with AVAssetReaderTrackOutput's -copyNextSampleBuffer
PROFIT!
Here is some sample code from a project of mine (it's not a code jewel; I wrote it some time back in my coding dark ages):
typedef enum {
    kEDSupportedMediaTypeAAC = 'aac ',
    kEDSupportedMediaTypeMP3 = '.mp3'
} EDSupportedMediaType;

- (EDLibraryAssetReaderStatus)prepareAsset {
    // Get the AVURLAsset
    AVURLAsset *uasset = [m_asset URLAsset];
    // Check for DRM protected content
    if (uasset.hasProtectedContent) {
        return kEDLibraryAssetReader_TrackIsDRMProtected;
    }
    if ([[uasset tracks] count] == 0) {
        DDLogError(@"no asset tracks found");
        return AVAssetReaderStatusFailed;
    }
    // Initialize a reader with a track output
    NSError *err = nil;
    m_reader = [[AVAssetReader alloc] initWithAsset:uasset error:&err];
    if (!m_reader || err) {
        DDLogError(@"could not create asset reader (%ld)\n", (long)[err code]);
        return AVAssetReaderStatusFailed;
    }
    // Check tracks for valid format. Currently we only support all MP3 and AAC types; WAV and AIFF are too large to handle
    for (AVAssetTrack *track in uasset.tracks) {
        NSArray *formats = track.formatDescriptions;
        for (int i = 0; i < [formats count]; i++) {
            CMFormatDescriptionRef format = (CMFormatDescriptionRef)[formats objectAtIndex:i];
            // Check the format types
            CMMediaType mediaType = CMFormatDescriptionGetMediaType(format);
            FourCharCode mediaSubType = CMFormatDescriptionGetMediaSubType(format);
            DDLogVerbose(@"mediaType: %s, mediaSubType: %s", COFcc(mediaType), COFcc(mediaSubType));
            if (mediaType == kCMMediaType_Audio) {
                if (mediaSubType == kEDSupportedMediaTypeAAC ||
                    mediaSubType == kEDSupportedMediaTypeMP3) {
                    m_track = [track retain];
                    m_format = CFRetain(format);
                    break;
                }
            }
        }
        if (m_track != nil && m_format != NULL) {
            break;
        }
    }
    if (m_track == nil || m_format == NULL) {
        return kEDLibraryAssetReader_UnsupportedFormat;
    }
    // Create an output for the found track
    m_output = [[AVAssetReaderTrackOutput alloc] initWithTrack:m_track outputSettings:nil];
    [m_reader addOutput:m_output];
    // Start reading
    if (![m_reader startReading]) {
        DDLogError(@"could not start reading asset");
        return kEDLibraryAssetReader_CouldNotStartReading;
    }
    return 0;
}
- (OSStatus)copyNextSampleBufferRepresentation:(CMSampleBufferRepresentationRef *)repOut {
    pthread_mutex_lock(&m_mtx);
    OSStatus err = noErr;
    AVAssetReaderStatus status = m_reader.status;
    if (m_invalid) {
        pthread_mutex_unlock(&m_mtx);
        return kEDLibraryAssetReader_Invalidated;
    }
    else if (status != AVAssetReaderStatusReading) {
        pthread_mutex_unlock(&m_mtx);
        return kEDLibraryAssetReader_NoMoreSampleBuffers;
    }
    // Read the next sample buffer
    CMSampleBufferRef sbuf = [m_output copyNextSampleBuffer];
    if (sbuf == NULL) {
        pthread_mutex_unlock(&m_mtx);
        return kEDLibraryAssetReader_NoMoreSampleBuffers;
    }
    CMSampleBufferRepresentationRef srep = CMSampleBufferRepresentationCreateWithSampleBuffer(sbuf);
    if (srep && repOut != NULL) {
        *repOut = srep;
    }
    else {
        DDLogError(@"CMSampleBufferRef corrupted");
        EDCFShow(sbuf);
        err = kEDLibraryAssetReader_BufferCorrupted;
    }
    CFRelease(sbuf);
    pthread_mutex_unlock(&m_mtx);
    return err;
}
You can't, and there is no workaround. An MPMediaItem is not the actual piece of media; it is just the metadata about the media item, communicated to the application via RPC from another process. The data for the item itself is not accessible in your address space.
I should note that even if you have the MPMediaItem, its data probably is not loaded into the device's memory. The flash on the iPhone is slow and memory is scarce. While Apple may not want you to have access to the raw data backing an MPMediaItem, it is just as likely that they didn't bother because they didn't want to invest the time necessary to deal with the APIs. If they did provide access to such a thing, it almost certainly would not be as an NSData, but more likely as an NSURL they would give your application that would allow it to open the file and stream through the data.
In any event, if you want the functionality, you should file a bug report asking for it.
Also, as a side note, don't mention your age in a bug report you send to Apple. I think it is very cool that you are writing apps for the phone; when I was your age I loved experimenting with computers (back then I was working on things written in Lisp). The thing is, you cannot legally agree to a contract in the United States, which is why the developer agreement specifically prohibits you from joining. From the first paragraph of the agreement:
You also certify that you are of the legal age of majority in the jurisdiction in which you reside (at least 18 years of age in many countries) and you represent that you are legally permitted to become a Registered iPhone Developer.
If you mention to a WWDR representative that you are not of age of majority they may realize you are in violation of the agreement and be obligated to terminate your developer account. Just a friendly warning.