I'm trying to implement AudioServicesPlaySystemSound(SystemSoundID(****)), therefore I need a list of existing IDs for Apple's system sounds. Searching through various posts I found this list on GitHub, but I couldn't find a fitting sound in it for my purpose. Since that repository hasn't been updated since 2013, I'm not sure whether it's up to date. I would like to know if there is a list of SystemSounds which is more up to date.
First of all, the list you have found is not published by Apple.
I do not know whether the author researched them personally or just collected them, but that sort of activity is considered reverse engineering, which is prohibited by the developer agreement.
I couldn't find a fitting sound in this list for my purpose.
You may need to find a sound resource of your own instead of relying on a built-in SystemSoundID, then register it and create a SystemSoundID for it using AudioServicesCreateSystemSoundID.
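For example, here is a minimal sketch of that approach in Swift, assuming a sound file of your own bundled with the app (the file name "ding.caf" below is hypothetical):
import AudioToolbox

// Register a bundled sound file and play it via a SystemSoundID of your own.
func playBundledSound() {
    guard let url = Bundle.main.url(forResource: "ding", withExtension: "caf") else { return }
    var soundID: SystemSoundID = 0
    let status = AudioServicesCreateSystemSoundID(url as CFURL, &soundID)
    if status == kAudioServicesNoError {
        AudioServicesPlaySystemSound(soundID)
        // Call AudioServicesDisposeSystemSoundID(soundID) once you no longer need the sound.
    }
}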
I would like to know if there is a list of SystemSounds which is more up to date.
The latest list of public SystemSoundID is here:
Alert Sound Identifiers
Constants
kSystemSoundID_Vibrate
On the iPhone, use this constant with the AudioServicesPlayAlertSound function to invoke a brief vibration. On the iPod touch, does nothing.
kSystemSoundID_UserPreferredAlert
On the desktop, use this constant with the AudioServicesPlayAlertSound function to play the alert specified in the Sound preference pane.
kSystemSoundID_FlashScreen
On the desktop, use this constant with the AudioServicesPlayAlertSound function to display a flash of light on the screen.
kUserPreferredAlert
Deprecated. Use kSystemSoundID_UserPreferredAlert instead.
(Some of them are for macOS only.)
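For instance, the documented vibration constant above can be used directly from Swift:
import AudioToolbox

// kSystemSoundID_Vibrate is one of the documented constants, so this does not touch private API.
AudioServicesPlayAlertSound(kSystemSoundID_Vibrate)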
Using other SystemSoundIDs can be considered use of a private API.
Some comments from Apple developers in Apple's devforums:
Does this count as a private API?
It would not be appropriate to use undocumented arbitrary values in APIs, so I would recommend you not do that in your submission.
Haptic feedback for force touch?
For a fixed SystemSoundID value to be considered API, it must have a symbolic constant in the headers. Passing in other fixed values is not OK.
When I update tilesets on Mapbox, changes don't appear in the iOS app unless I re-install it. There seems to be documentation on this here: https://docs.mapbox.com/ios/api/maps/5.2.0/Classes/MGLOfflineStorage.html#/c:objc(cs)MGLOfflineStorage(im)setMaximumAmbientCacheSize:withCompletionHandler: but I can't figure out how exactly to implement it. I don't have an MGLOfflineStorage object because I am not worried about offline map storage right now; I just want to refresh the cache in the app. There are good examples of how to do this on Android, but not on iOS. Any help is appreciated (preferably in Swift).
It appears to be correct to call the methods on the shared MGLOfflineStorage object. The method parameter should be a closure containing any code you want to execute upon completion.
MGLOfflineStorage.shared.invalidateAmbientCache { error in
    print("Invalidated")
}
Naturally you should check the error in the usual 'safe' way.
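For example, one way of doing that check (a sketch; the printed messages are placeholders):
MGLOfflineStorage.shared.invalidateAmbientCache { error in
    if let error = error {
        print("Failed to invalidate ambient cache: \(error.localizedDescription)")
    } else {
        print("Ambient cache invalidated")
    }
}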
I am wanting to add offline map functionality to an iOS app built using Swift and Mapbox. There is great documentation and there are examples for downloading a map region pack, but I am having a difficult time figuring out how to retrieve a list of offline packs. Their documentation here gives these instructions on how to receive them:
"To detect when the shared offline storage object has finished loading its packs property, observe KVO change notifications on the packs key path. The initial load results in an NSKeyValueChangeSetting change."
But I am having a difficult time finding any examples or explanations of what that means. Any help would be greatly appreciated!
An array of all known offline packs can be retrieved using the .packs property of the MGLOfflineStorage class, like so:
MGLOfflineStorage.shared.packs
To access these packs, you just need to iterate over the array, or access a specific index, and retrieve whatever information you're interested in from each pack.
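For example, a rough sketch of iterating over them; note that packs stays nil until the storage object has finished loading it, and the state and progress properties read below are assumptions about the MGLOfflinePack API that you should verify against the SDK headers:
for pack in MGLOfflineStorage.shared.packs ?? [] {
    // state and progress are assumed MGLOfflinePack properties; check the SDK documentation.
    print("Pack state: \(pack.state.rawValue)")
    print("Resources completed: \(pack.progress.countOfResourcesCompleted)")
}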
There is a good example of using this array to create a tableview of the completed offline packs on a device in the SDK's open source test app (NB: this example is written in Obj-C).
⚠️ Disclaimer: I currently work at Mapbox ⚠️
I was finally able to come to a solution. To observe the packs retrieval using Swift, you can use this code:
// Keep a strong reference to the returned observation object, otherwise it is deallocated and the observer stops firing.
let packsObservation = MGLOfflineStorage.shared.observe(\.packs, options: [.new, .old]) { object, change in
    let offlinePacksArr = object.packs // access the packs array here
}
I am currently trying to make some modifications to the incoming WebRTC video stream in the AppRTC app for iOS in Swift (which in turn is based on this Objective-C version). To do so, I need access to the data stored in the frame objects of class RTCI420Frame (a basic class for the Objective-C implementation of libWebRTC). In particular, I need an array of bytes ([UInt8]) and the size of the frames. This data is to be used for further processing and the addition of some filters.
The problem is that all the operations on RTCVideoTrack / RTCEAGLVideoView are done under the hood of the pre-compiled libWebRTC.a. It is compiled from the official WebRTC repository linked above, and it's fairly complicated to get a custom build of it, so I'd prefer to go with the build available in the example iOS project; to my understanding it's supposed to have all the available functionality in it.
I was looking into the RTCVideoChatViewController class, and in particular into remoteView / remoteVideoTrack, but had no success in accessing the frames themselves. I spent a lot of time researching the libWebRTC sources in the official repo but still can't wrap my head around the problem of accessing the frame data for my own manipulation of it. I would be glad for any help!
Just after posting the question I had some luck in finding the sneaky data!
You have to add the following property to the RTCEAGLVideoView.h file:
@property(atomic, strong) RTCI420Frame* i420Frame;
The i420Frame property exists in the original implementation file, but it wasn't exposed in the iOS project's header file for the class. Adding the property to the header allows you to get the view's current frame.
I'm still in search of a more elegant way of getting the stream data directly, without the need to look into remoteView contents, will update the answer once I find it.
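For reference, a rough sketch of reading the newly exposed property from Swift; remoteView is the RTCEAGLVideoView from the question, and the width/height accessors are assumptions based on the libjingle-era RTCI420Frame header, so verify them against your build's RTCI420Frame.h:
if let frame = remoteView.i420Frame {
    // width/height (and the yPlane/uPlane/vPlane pointers) are assumed from the old header.
    print("Current remote frame size: \(frame.width) x \(frame.height)")
}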
I use iCloud to sync a user XML file between devices in my apps, with a UIDocument subclass, similar to the code from this question: http://stackoverflow.com/questions/7795629/icloud-basics-and-code-sample. But I am not sure when and how I should detect a conflict. I read the SDK docs and searched the internet but didn't find any detailed information. It seems we can use some code like
NSNumber *conflicted;
[url getResourceValue:&conflicted forKey:NSURLUbiquitousItemHasUnresolvedConflictsKey error:nil];
but in my app it always seems to give a true value for "conflicted"?
Also, I am not sure when I should detect the conflict; my guess is before the contentsForType method of the UIDocument subclass is called. If anyone can give any hint, that would be great.
You should observe the UIDocumentStateChangedNotification. If the documentState property of your document has the UIDocumentStateInConflict flag set, there is a conflict. Note that a document can be in multiple states simultaneously, so don't check with ==; instead use something like if (document.documentState & UIDocumentStateInConflict) {....
You can then get the conflicting versions with the otherVersionsOfItemAtURL: class method of NSFileVersion.
You can find more detailed information in the chapter on resolving document conflicts in the Document-Based App Programming Guide for iOS.
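Putting the above together, here is a minimal Swift sketch using the modern Swift names for the same APIs; the conflict-resolution policy shown (keeping the current version and discarding the others) is just one possible choice:
import UIKit

// "document" is assumed to be an instance of your UIDocument subclass.
func observeConflicts(for document: UIDocument) -> NSObjectProtocol {
    // Keep the returned token alive for as long as you want to observe.
    return NotificationCenter.default.addObserver(
        forName: UIDocument.stateChangedNotification,
        object: document,
        queue: .main
    ) { _ in
        guard document.documentState.contains(.inConflict) else { return }
        let url = document.fileURL
        // Mark the conflicting versions as resolved, then remove them.
        NSFileVersion.unresolvedConflictVersionsOfItem(at: url)?.forEach { $0.isResolved = true }
        try? NSFileVersion.removeOtherVersionsOfItem(at: url)
    }
}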
I am using the ALAssets framework to access the Photo Library. The first time it's accessed, it asks the User if the app can use their Current Location, and I understand that is necessary and why.
However, in Core Location Manager, there is a purpose property, where it looks like I can customize the iPad's alert message to say why it is necessary to tap Yes. (I don't actually use location, just want access to the photo library.)
I can't seem to work out where to use this property, as the alert message comes up as soon as I first try to enumerate the assets, and there doesn't seem to be anywhere to intercept it before the error occurs if the user says NO.
I know I can put up a notice of my own before first usage of ALAssets, in anticipation of the iPad built-in alert, but it seems slicker to change the actual iPad message.
Thanks.
I’d suggest, before you try to access the photo library, that you create your own dummy CLLocationManager, set its purpose, then call its -startUpdatingLocation. That’ll get the system to bring up the location permissions dialog with your custom text, and the resulting app-wide location permissions ought to carry over to your ALAsset access.
Unfortunately you can't customize this message. I suggest you file a radar with Apple if you want to see this feature in the future.
Cheers,
Hendrik