I'm working on an app that takes pictures automatically during certain events and saves them directly to the photo library. Unfortunately, the user might not always be looking at their phone while this is running, and the first time a picture is taken, iOS asks for permission to access photos.
Is there a way I can force this request during launch and get it out of the way?
Thanks
Yes. You need to request access to the library.
Simply use this code in your AppDelegate's application:didFinishLaunchingWithOptions: method:
// Requires AssetsLibrary; keep a strong reference to the library for as long as you use its assets
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

ALAssetsLibraryGroupsEnumerationResultsBlock assetGroupEnumerator =
    ^(ALAssetsGroup *assetGroup, BOOL *stop) {
        if (assetGroup != nil) {
            // do something
        }
    };

ALAssetsLibraryAccessFailureBlock assetFailureBlock = ^(NSError *error) {
    NSLog(@"Error enumerating photos: %@", [error description]);
};

NSUInteger groupTypes = ALAssetsGroupAll;

[library enumerateGroupsWithTypes:groupTypes usingBlock:assetGroupEnumerator failureBlock:assetFailureBlock];
I'm making an app with Facebook login. I get user profile pics the normal way:
https://graph.facebook.com/[USER_ID]/picture?type=large
but one of my test users only has a Facebook Page, not a Facebook profile. His userID shows up as normal, but I only get the silhouette image for him.
Is there a way, ideally parallel to the above method for users, to get the picture associated with a page, given the userID of the page's owner/admin?
EDIT: To clarify, I want to get the publicly-available picture that's in the 'profile pic' UI section for a page, but which isn't associated with any particular user. See https://www.facebook.com/cajitamusic for example.
No, you'd have to ask for permission to access the user's Pages first (which could be an extra, unwanted hurdle for people using your app).
I think you can. You can use the same approach with
https://graph.facebook.com/[PAGE_ID]/picture?type=large
The problem for you seems to be how to get the object ID of the page. You can try the Search API, as it works for me when I try to get a city's profile picture.
fbPageInfo = [FBRequestConnection startWithGraphPath:[NSString stringWithFormat:@"search?q=%@&type=page", [cityName stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding]]
                                   completionHandler:^(FBRequestConnection *connection, id result, NSError *error) {
    if (!error) {
        // Success! Include your code to handle the results here
        NSLog(@"result: %@", result);
        NSArray *pages = (NSArray *)[result objectForKey:@"data"];
        if (pages.count > 0) {
            NSDictionary *firstCityPage = [pages objectAtIndex:0];
            for (NSDictionary *page in pages) {
                //NSLog(@"user events: %@", page);
                if ([[page objectForKey:@"category"] isEqualToString:@"City"]) {
                    firstCityPage = page;
                    break;
                }
            }
            NSString *profileImgUrl = [NSString stringWithFormat:@"http://graph.facebook.com/%@/picture?type=large", [firstCityPage objectForKey:@"id"]];
            ...
        }
    } else {
        // An error occurred, we need to handle the error
        // See: https://developers.facebook.com/docs/ios/errors
        // Do nothing - fall back to the default image
        fbPageInfo = nil;
        NSLog(@"Can't get image for %@. Error: %@", cityName, [error description]);
    }
}];
Also, how to get a bigger picture is covered in this answer:
How to get the bigger profile picture of a facebook page
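In short, the Graph API picture edge also accepts explicit dimensions; a rough sketch (pageId here stands in for whatever Page ID you obtained above):

NSString *bigPicUrl = [NSString stringWithFormat:@"https://graph.facebook.com/%@/picture?width=720&height=720", pageId];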
I want to store some data (an image, text, or both) in Evernote from our app.
I googled and found some guidance, such as the Evernote SDK, and I also got the EvernoteCounter sample (when I run it and tap the getCount button, it shows an alert saying "Could not authenticate").
I also generated the developer token.
But I am unable to create the consumerKey/consumerSecret, and I did not find out how to store our data in Evernote from our app.
I found some links like this one,
but when I go through that link it says "HTTP method GET is not supported by this URL".
I am now able to authenticate with Evernote and to get the number of notebooks in that account.
I am using SQLite in my app, with one folder for images; SQLite holds the info about the image links.
How do I store the data in Evernote?
I used the following code to authenticate and to get the count:
- (IBAction)retrieveUserNameAndNoteCount:(id)sender
{
    // Create local reference to shared session singleton
    EvernoteSession *session = [EvernoteSession sharedSession];
    [session authenticateWithViewController:self completionHandler:^(NSError *error) {
        // Authentication response is handled in this block
        if (error || !session.isAuthenticated) {
            // Either we couldn't authenticate or something else went wrong - inform the user
            if (error) {
                NSLog(@"Error authenticating with Evernote service: %@", error);
            }
            if (!session.isAuthenticated) {
                NSLog(@"User could not be authenticated.");
            }
            UIAlertView *alert = [[[UIAlertView alloc] initWithTitle:@"Error"
                                                             message:@"Could not authenticate"
                                                            delegate:nil
                                                   cancelButtonTitle:@"OK"
                                                   otherButtonTitles:nil] autorelease];
            [alert show];
        } else {
            // We're authenticated!
            EvernoteUserStore *userStore = [EvernoteUserStore userStore];
            // Retrieve the authenticated user as an EDAMUser instance
            [userStore getUserWithSuccess:^(EDAMUser *user) {
                // Set usernameField (UILabel) text value to username
                [usernameField setText:[user username]];
                // Retrieve total note count and display it
                [self countAllNotesAndSetTextField];
            } failure:^(NSError *error) {
                NSLog(@"Error retrieving authenticated user: %@", error);
            }];
        }
    }];
}

- (void)countAllNotesAndSetTextField
{
    // Allow access to this variable within the block context below (using __block keyword)
    __block int noteCount = 0;
    EvernoteNoteStore *noteStore = [EvernoteNoteStore noteStore];
    [noteStore listNotebooksWithSuccess:^(NSArray *notebooks) {
        for (EDAMNotebook *notebook in notebooks) {
            if ([notebook guid]) {
                EDAMNoteFilter *filter = [[EDAMNoteFilter alloc] init];
                [filter setNotebookGuid:[notebook guid]];
                [noteStore findNoteCountsWithFilter:filter withTrash:NO success:^(EDAMNoteCollectionCounts *counts) {
                    if (counts) {
                        // Get note count for the current notebook and add it to the displayed total
                        NSNumber *notebookCount = (NSNumber *)[[counts notebookCounts] objectForKey:[notebook guid]];
                        noteCount = noteCount + [notebookCount intValue];
                        NSString *noteCountString = [NSString stringWithFormat:@"%d", noteCount];
                        [noteCountField setText:noteCountString];
                    }
                } failure:^(NSError *error) {
                    NSLog(@"Error while retrieving note counts: %@", error);
                }];
            }
        }
    } failure:^(NSError *error) {
        NSLog(@"Error while retrieving notebooks: %@", error);
    }];
}
Please suggest some links or give me some guidance.
Thanks a lot in advance.
A developer token is to be used when you only need to access your own account. To get a consumer key/secret, go here: http://dev.evernote.com/documentation/cloud/.
If you are using iOS, https://github.com/evernote/evernote-sdk-ios has a sample app that you can use once you have a consumer key and secret.
In general, there is a lot of information on dev.evernote.com.
All SDKs are located at https://github.com/evernote
Getting started guide for iOS : http://blog.evernote.com/tech/2012/05/24/evernote-sdk-integration-ios/
Did you solve it? If not, I did the following to get it to work:
Download and include the SDK.
Get a consumerKey and secret (if you want to access notes too, request full access instead of basic; see http://dev.evernote.com/documentation/cloud/, top right corner).
Add the URL Type entry in Info.plist (see the "Modify your application's main plist file" section at https://github.com/evernote/evernote-sdk-ios).
Copy the session init code (filled in with your consumer key and secret; the hostname should be left unchanged) and implement the two pieces of application-delegate-specific code.
Pass an on-screen view controller to the authenticateWithViewController: method when authenticating the user, e.g. the app delegate's rootViewController.
Study these pages to understand the model hierarchy used by Evernote:
http://dev.evernote.com/documentation/cloud/chapters/data_structure.php
http://dev.evernote.com/documentation/reference/Types.html
An image can be stored as an EDAMResource (Resource) in its data field, and text in an EDAMNote (Note) in its content field. Both are handled by the Evernote SDK's EvernoteNoteStore object, roughly as sketched below.
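Something along these lines has worked for me; treat it as a rough sketch rather than a definitive recipe (it assumes the classic evernote-sdk-ios types named above, ARC, and that you are already authenticated; the method name and parameters are just placeholders):

#import <CommonCrypto/CommonDigest.h>

// Sketch: wrap an image plus some text into an EDAMNote and upload it with EvernoteNoteStore
- (void)saveImage:(UIImage *)image withText:(NSString *)text title:(NSString *)title
{
    NSData *imageData = UIImageJPEGRepresentation(image, 0.8);

    // Evernote references a resource inside the note body by the MD5 hash of its data
    unsigned char digest[CC_MD5_DIGEST_LENGTH];
    CC_MD5(imageData.bytes, (CC_LONG)imageData.length, digest);
    NSMutableString *hashHex = [NSMutableString string];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++) {
        [hashHex appendFormat:@"%02x", digest[i]];
    }

    // The binary payload goes into an EDAMResource via EDAMData
    EDAMData *data = [[EDAMData alloc] initWithBodyHash:[NSData dataWithBytes:digest length:CC_MD5_DIGEST_LENGTH]
                                                   size:(int32_t)imageData.length
                                                   body:imageData];
    EDAMResource *resource = [[EDAMResource alloc] init];
    resource.data = data;
    resource.mime = @"image/jpeg";

    // The note content is ENML; <en-media> points at the resource by its hash
    NSString *enml = [NSString stringWithFormat:
        @"<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
        @"<!DOCTYPE en-note SYSTEM \"http://xml.evernote.com/pub/enml2.dtd\">"
        @"<en-note>%@<en-media type=\"image/jpeg\" hash=\"%@\"/></en-note>", text, hashHex];

    EDAMNote *note = [[EDAMNote alloc] init];
    note.title = title;
    note.content = enml;
    note.resources = [NSMutableArray arrayWithObject:resource];

    [[EvernoteNoteStore noteStore] createNote:note success:^(EDAMNote *createdNote) {
        NSLog(@"Created note with guid %@", createdNote.guid);
    } failure:^(NSError *error) {
        NSLog(@"Error creating note: %@", error);
    }];
}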
Part of my app has a photo browser, somewhat similar to Apple's Photos app, with an initial view controller to browse photo thumbnails and a detail view that's shown when you tap on a photo.
I'm using ALAssetsLibrary to access photos, and I pass an array of ALAsset URLs to my detail view controller so you can swipe from one photo to the next.
Everything works great, until I receive an ALAssetsLibraryChangedNotification while swiping from one photo to another (in the detail view controller), which often results in a crash:
NOTIFICATION: the asset library changed   // my own NSLog for when the notification occurs
loading assets...                         // my own NSLog for when I start reloading assets in the thumbnail browser
Assertion failed: (size == bytesRead), function -[ALAssetRepresentation _imageData], file /SourceCache/AssetsLibrary/MobileSlideShow-1373.58.1/Sources/ALAssetRepresentation.m, line 224.
The specific line of code it crashes on is the call to [currentRep metadata], as shown here:
- (void)someMethod {
    // (index is provided elsewhere in the actual code)
    NSURL *assetURL = [self.assetURLsArray objectAtIndex:index];
    ALAsset *currentAsset;
    [self.assetsLibrary assetForURL:assetURL resultBlock:^(ALAsset *asset) {
        [self performSelectorInBackground:@selector(configureDetailViewForAsset:) withObject:asset];
    } failureBlock:^(NSError *error) {
        NSLog(@"failed to retrieve asset: %@", error);
    }];
}

- (void)configureDetailViewForAsset:(ALAsset *)currentAsset {
    ALAssetRepresentation *currentRep = [currentAsset defaultRepresentation];
    if (currentAsset != nil) {
        // do some stuff
    }
    else {
        NSLog(@"ERROR: currentAsset is nil");
    }
    NSDictionary *metaDictionary;
    if (currentRep != nil) {
        metaDictionary = [currentRep metadata];
        // do some other stuff
    }
    else {
        NSLog(@"ERROR: currentRep is nil");
    }
}
I understand that once a notification is received, it invalidates any references to ALAsset and ALAssetRepresentation objects... but how am I supposed to deal with the situation where it invalidates something right in the middle of trying to access it?
I've tried setting a BOOL right when the notification is received, to completely abort and prevent [currentRep metadata] from ever being called, but even that doesn't catch it every time:
if (self.receivedLibraryChangeNotification) {
    NSLog(@"received library change notification, need to abort");
}
else {
    metaDictionary = [currentRep metadata];
}
Is there anything I can do? At this point I'm almost ready to give up on using the ALAssetsLibrary framework.
(note this unresolved thread on the Apple dev forums describing the same issue: https://devforums.apple.com/message/604430 )
It seems the problem is around here:
[self.assetsLibrary assetForURL:nextURL
                    resultBlock:^(ALAsset *asset) {
                        // You should do some stuff with asset at this scope
                        ALAssetRepresentation *currentRep = [asset defaultRepresentation];
                        // Assume we have a property for that
                        self.assetRepresentationMetadata = [currentRep metadata];
                        ...
                        // assume we have a method for that
                        [self updateAssetDetailsView];
                    }
                   failureBlock:^(NSError *error) {
                        NSLog(@"failed to retrieve asset: %@", error);
                    }];
Once you have got the asset, it is better to copy the asset information, either by handing the necessary data to your detail controller's subviews or by caching it for later use. This helps you avoid ALAsset invalidation troubles. When an ALAssetsLibraryChangedNotification is posted, you may need to discard the detail controller and query the library content from the beginning, roughly as sketched below.
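A minimal sketch of that idea, assuming a mutable assetURLsArray and a hypothetical reloadLibrary method that re-runs your enumerateGroupsWithTypes: call and rebuilds the thumbnail data:

// e.g. in viewDidLoad: observe library changes for the library instance you created
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(assetsLibraryDidChange:)
                                             name:ALAssetsLibraryChangedNotification
                                           object:self.assetsLibrary];

- (void)assetsLibraryDidChange:(NSNotification *)notification
{
    // Every ALAsset / ALAssetRepresentation reference is now invalid:
    // drop the detail controller and any cached assets, then re-enumerate from scratch.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.navigationController popToRootViewControllerAnimated:NO];
        [self.assetURLsArray removeAllObjects]; // assumes a mutable array of asset URLs
        [self reloadLibrary];                   // hypothetical: re-runs the enumeration
    });
}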
Possible Duplicate:
How Do I Get The Correct Latitude and Longitude From An Uploaded iPhone Photo?
I'm making a photo app and would like to know what my options are for working with geotagged photos. I would like to display where a photo was taken (similar to the Photos app). Is this possible?
Additionally, I need to know when the picture has been taken. I can capture this info for the pictures that I take myself, but what about the camera roll images?
Yes, it is possible.
You have to use ALAssetsLibrary to access your camera roll. Then you just enumerate through your photos and ask each one for its location.
assetsLibrary = [[ALAssetsLibrary alloc] init];
groups = [NSMutableArray array];
[assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop)
{
    if (group == nil)
    {
        return;
    }
    [groups addObject:group];
} failureBlock:^(NSError *error)
{
    // Possibly, Location Services are disabled for your application or system-wide. You should notify the user to turn Location Services on. With Location Services disabled you can't access the media library, for security reasons.
}];
This will enumerate your assets groups. Next, you pick up a group and enumerate its assets.
ALAssetsGroup *group = [groups objectAtIndex:0];
[group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop)
{
    if (result == nil)
    {
        return;
    }
    // Trying to retrieve location data from the image
    CLLocation *loc = [result valueForProperty:ALAssetPropertyLocation];
}];
Now your loc variable contains the location of the place where the photo was taken. You should check it against ALErrorInvalidProperty before use, since some photos might lack this data.
You can pass ALAssetPropertyDate to valueForProperty: to obtain the date and time the photo was taken; a short example follows.
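For example, inside the same enumeration block as above (a small sketch using the constants mentioned here):

// Location: valueForProperty: returns ALErrorInvalidProperty when the photo has no geotag
id locationValue = [result valueForProperty:ALAssetPropertyLocation];
if (locationValue != ALErrorInvalidProperty) {
    CLLocation *loc = (CLLocation *)locationValue;
    NSLog(@"Photo taken at %f, %f", loc.coordinate.latitude, loc.coordinate.longitude);
}

// Date and time the photo was taken
NSDate *creationDate = [result valueForProperty:ALAssetPropertyDate];
NSLog(@"Photo taken on %@", creationDate);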
Is there a way I can delete an image that is loaded into my app from a UIImagePickerController?
I want to be able to delete the image from the user's photo library when the user performs a specific action.
I am prompting the user to choose an image from their library; it then gets loaded into my app, at which point the app does some snazzy animation and then actually deletes the image.
Please help!
Apple doesn't actually allow you to delete from the photo library through an API. The user has to actually go to the Photos app and delete it manually themselves. Apple does allow you to write to the photo library:
To save a still image to the user's Saved Photos album, use the UIImageWriteToSavedPhotosAlbum function. To save a movie to the user's Saved Photos album, use the UISaveVideoAtPathToSavedPhotosAlbum function.
But for deleting or editing/overwriting an existing photo, Apple doesn't offer anything like that right now.
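For completeness, the save side looks like this; someImage is just a placeholder UIImage, and the completion target/selector are optional, so you can pass nil/NULL if you don't need a callback:

// Write a UIImage into the user's Saved Photos album
UIImageWriteToSavedPhotosAlbum(someImage, nil, NULL, NULL);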
Actually, you can delete photos saved by your app (that is, photos saved to the photo library with the UIImageWriteToSavedPhotosAlbum API call).
The documented API [ALAsset setImageData:metadata:completionBlock:] works.
1). Add an image "photo.jpg" to your project
2). Save an image to the asset library:
ALAssetsLibrary *lib = [ALAssetsLibrary new];
UIImage *image = [UIImage imageNamed:@"photo.jpg"];
[lib writeImageToSavedPhotosAlbum:image.CGImage metadata:@{} completionBlock:^(NSURL *assetURL, NSError *error) {
    NSLog(@"Write image %@ to asset library. (Error %@)", assetURL, error);
}];
3). Go to the default gallery; you will find photo.jpg in your "Saved Photos" album.
4). Delete this image from the asset library:
ALAssetsLibrary *lib = [ALAssetsLibrary new];
[lib enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
    [group enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *stop) {
        if (asset.isEditable) {
            [asset setImageData:nil metadata:nil completionBlock:^(NSURL *assetURL, NSError *error) {
                NSLog(@"Asset url %@ should be deleted. (Error %@)", assetURL, error);
            }];
        }
    }];
} failureBlock:^(NSError *error) {
}];
5). Go to the default gallery; you will find that photo.jpg has been deleted.
Yes, we can delete a photo. We can use PHAssetChangeRequest for this operation.
From Apple:
A request to create, delete, change metadata for, or edit the content of a Photos asset, for use in a photo library change block.
class func deleteAssets(_ assets: NSFastEnumeration)
where assets is an array of PHAsset objects to be deleted.
PHAssetChangeRequest.deleteAssets([assetToDelete])
So, you could use the above code to delete assets.
Below is the Swift 3 code:
PHPhotoLibrary.shared().performChanges({
    let imageAssetToDelete = PHAsset.fetchAssets(withALAssetURLs: imageUrls as! [URL], options: nil)
    PHAssetChangeRequest.deleteAssets(imageAssetToDelete)
}, completionHandler: { success, error in
    print(success ? "Success" : "Error: \(String(describing: error))")
})