I have implemented playback of iTunes songs in my tvOS application. Now I want to add a feature to purchase songs from my application, for which I have the following code:
var iTunesLink = "https://itunes.apple.com/us/album/crazy-naked-girls/id310568758?i=310568759&uo=4&at=xxxxxxWa&app=itunes"
if let appStoreURL = NSURL(string: iTunesLink as String) {
    if UIApplication.sharedApplication().canOpenURL(appStoreURL) {
        UIApplication.sharedApplication().openURL(appStoreURL)
    }
}
but the line UIApplication.sharedApplication().canOpenURL(appStoreURL) always returns false.
I have the following in my AppDelegate
//added these 3 methods
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
    window?.makeKeyAndVisible()
    GIDSignIn.sharedInstance()?.clientID = Environment.googleClientId
    FirebaseApp.configure()
    ApplicationDelegate.shared.application(application, didFinishLaunchingWithOptions: launchOptions)

    let adjustAppToken = Environment.adjustToken
    var environment = ADJEnvironmentSandbox
    if Environment.environment == "prod" {
        environment = ADJEnvironmentProduction
    }
    let adjustConfig = ADJConfig(
        appToken: adjustAppToken,
        environment: environment)
    adjustConfig?.logLevel = ADJLogLevelVerbose
    adjustConfig?.delayStart = 2.5
    adjustConfig?.delegate = self
    Adjust.appDidLaunch(adjustConfig)

    var configFlurryAPIKey = "x"
    if !isDev {
        configFlurryAPIKey = "x"
    }
    FlurryMessaging.setAutoIntegrationForMessaging()
    // Step 2: (Optional) Get a callback
    FlurryMessaging.setMessagingDelegate(self)
    let version = Bundle.main.infoDictionary?["CFBundleShortVersionString"] as? String ?? "1.0"
    // Step 3: Start Flurry session
    let builder = FlurrySessionBuilder()
        .withAppVersion(version)
        .withIncludeBackgroundSessions(inMetrics: true)
        .withCrashReporting(true)
        .withSessionContinueSeconds(10)
        .withLogLevel(FlurryLogLevelAll)
    Flurry.startSession(configFlurryAPIKey, with: builder)

    // registerForPushNotifications()
    Purchases.debugLogsEnabled = true
    Purchases.configure(withAPIKey: Environment.revenueCatKey)

    Amplitude.instance().trackingSessionEvents = true
    // Initialize SDK
    let amplitudeKey = Environment.amplitudeKey
    Amplitude.instance().initializeApiKey(amplitudeKey)

    if #available(iOS 14, *) {
        print("status notDetermined == \(ATTrackingManager.trackingAuthorizationStatus == .notDetermined)")
        print("status authorized == \(ATTrackingManager.trackingAuthorizationStatus == .authorized)")
        print("IDFA == \(ASIdentifierManager.shared().advertisingIdentifier)")
        ATTrackingManager.requestTrackingAuthorization { status in
            var statusStr = ""
            if status == .authorized {
                statusStr = "authorized"
            } else if status == .denied {
                statusStr = "denied"
            } else {
                statusStr = "restricted"
            }
            let identify = AMPIdentify().set("att", value: statusStr as NSObject)
            Amplitude.instance().identify(identify!)
        }
    }
    return true
}
Even though I'm calling App Tracking Transparency, Apple Review sent me this message:
Starting with iOS 14.5, apps on the App Store need to receive the user’s permission through the AppTrackingTransparency framework before collecting data used to track them. This requirement protects the privacy of App Store users.
Next Steps
Here are two ways to resolve this issue:
If you do not currently track, or decide to stop tracking, update your app privacy information in App Store Connect. You must have the Account Holder or Admin role to update app privacy information.
If you track users, you must implement App Tracking Transparency and request permission before collecting data used to track. When you resubmit, indicate in the Review Notes where the permission request is located.
Does anyone know what I'm doing wrong?
You can follow this link to learn how to update your App Store privacy information if your app does not track users for third-party purposes; otherwise you must implement the App Tracking Transparency framework.
App Store Rejection - Guideline 5.1.2 - Legal - Privacy - Data Use and Sharing
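One common cause of this rejection is that the ATT prompt never actually appears at runtime: requesting authorization from didFinishLaunchingWithOptions can silently fail because the app is not yet active, and the NSUserTrackingUsageDescription key must be present in Info.plist for the system dialog to show at all. A minimal sketch (an assumption on my part, not confirmed by the original poster) that defers the request until the app is active:

```swift
import UIKit
import AppTrackingTransparency

// Sketch: request ATT once the app becomes active, so the system
// prompt can actually be presented to the user.
// Requires the NSUserTrackingUsageDescription key in Info.plist.
func applicationDidBecomeActive(_ application: UIApplication) {
    if #available(iOS 14, *) {
        ATTrackingManager.requestTrackingAuthorization { status in
            // Only enable tracking in your SDKs after the user has answered.
            print("ATT status: \(status.rawValue)")
        }
    }
}
```

When resubmitting, the Review Notes should point at where this prompt is triggered, as Apple's message asks.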
I'm working on a macOS Cocoa app in Swift where I import several different file types into the app for the user to interact with.
I'm currently trying to determine if it's possible to implement the "Open file with" feature, so that the user could open those files in a different program if they wanted to:
I've found a few different SO questions that seem tangentially related to what I'm trying to do:
Swift: How to open file with associated application?
Launch OSX Finder window with specific files selected
...but so far nothing indicates whether it's possible to implement the right-click, Finder-style "Open With" access I had in mind.
Apologies if this is too vague of a question; any help / guidance appreciated!
Without going into details, it's pretty straightforward:
1. Get the list of all known applications that can open a specific file type (see LSCopyApplicationURLsForURL, a Core Foundation C function).
2. Build the menu. You can use NSWorkspace (and probably URL) to get the application icons.
3. Use NSWorkspace.openFile(_:withApplication:) to tell the application to open the given document.
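The three steps above can be sketched roughly like this (the function name openWithMenu(for:) is my own, and the menu-item action wiring is left as a placeholder):

```swift
import AppKit

// Sketch: build an "Open With" NSMenu for a given file URL.
func openWithMenu(for fileURL: URL) -> NSMenu {
    let menu = NSMenu(title: "Open With")
    // Step 1: candidate applications for this file (Core Foundation call).
    let apps = (LSCopyApplicationURLsForURL(fileURL as CFURL, .all)?
        .takeRetainedValue() as? [URL]) ?? []
    // Step 2: one menu item per application, with its icon.
    for appURL in apps {
        let item = NSMenuItem(title: appURL.deletingPathExtension().lastPathComponent,
                              action: nil, // point this at your own handler
                              keyEquivalent: "")
        item.image = NSWorkspace.shared.icon(forFile: appURL.path)
        item.representedObject = appURL // the handler reads the app URL from here
        menu.addItem(item)
    }
    return menu
}

// Step 3: the handler would then open the file with the chosen app, e.g.
// NSWorkspace.shared.open([fileURL], withApplicationAt: appURL,
//                         configuration: NSWorkspace.OpenConfiguration())
```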
2022, Swift 5
Get app list associated with local file:
func getAppsAssociatedWith(_ url: URL?) -> [URL] {
    guard let url = url,
          let retainedArr = LSCopyApplicationURLsForURL(url as CFURL, .all)?.takeRetainedValue(),
          let listOfRelatedApps = retainedArr as? [URL]
    else {
        return []
    }
    return listOfRelatedApps
}
Getting thumbnail for app:
let singleAppIcon = NSWorkspace.shared
.icon(forFile: appUrl.path)
.scaledCopy(sizeOfLargerSide: 17)
Open url with app:
@available(macOS 10.15, iOS 9.0, *)
public class func openUrlWithApp(_ urls: [URL], appUrl: URL) {
    NSWorkspace.shared.open(urls, withApplicationAt: appUrl, configuration: NSWorkspace.OpenConfiguration())
}
In my app I'm caching all app icons in a dictionary:
[app URL : app icon]
If I have already fetched an icon earlier, there's no need to fetch it again:
var relatedAppsThumbnails: [URL: Image] = [:]

func updateRelatedApps() {
    guard let url = currImgUrl, // file url to get icons of related apps from
          let retainedArr = LSCopyApplicationURLsForURL(url as CFURL, .all)?.takeRetainedValue(),
          let listOfRelatedApps = retainedArr as? [URL]
    else {
        relatedApps = []
        return
    }
    self.relatedApps = listOfRelatedApps

    // add app icon in case it wasn't added yet
    for appUrl in listOfRelatedApps {
        if relatedAppsThumbnails[appUrl] == nil {
            let nsImg = NSWorkspace.shared.icon(forFile: appUrl.path)
                .scaledCopy(sizeOfLargerSide: 17)
            relatedAppsThumbnails[appUrl] = Image(nsImage: nsImg)
        }
    }
}
LSCopyApplicationURLsForURL is deprecated as of macOS 12. You can use this alternative:
func getListOfExternalApps(forURL url: URL) -> [(URL, Image)] {
    // urlsForApplications(toOpen:) is available from macOS 12
    let listOfExternalApps = NSWorkspace.shared.urlsForApplications(toOpen: url)
    let icons = listOfExternalApps.map {
        let nsimage = NSWorkspace.shared.icon(forFile: $0.path)
        nsimage.size = CGSize(width: 16, height: 16) // ".s16" in the original was a custom size constant
        return Image(nsImage: nsimage)
    }
    return Array(zip(listOfExternalApps, icons))
}
I have a string value in a watchOS 2 app, and want to send an iMessage text using that value. An example of this is in the Workflow watch app, where the send-message action results in the following screenshot:
I can't find any frameworks or URL schemes that work on watchOS, so how do I do something like this?
You can send your string by enabling the App Groups capability on both the iMessage and Watch targets and sharing it via UserDefaults between the targets (iMessage and watchOS).
// watchOS: sharing the String
func sharedUserInfo() {
    if let userDefaults = UserDefaults(suiteName: "group.watch.app.com") {
        userDefaults.set(stringObj as AnyObject, forKey: "string")
        userDefaults.synchronize()
    }
}

// iMessage: extracting the info
func sharedInfo() {
    if let userDefaults = UserDefaults(suiteName: "group.watch.app.com") {
        let stringObj = userDefaults.string(forKey: "string")
    }
}
So you can send the string from watchOS to the iMessage target.
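Note that on watchOS the usual way to move a string to the paired iPhone app is the WatchConnectivity framework; a minimal sketch (the class name and message key here are my own assumptions, and the delegate methods are trimmed to the watchOS-side minimum):

```swift
import WatchConnectivity

// Sketch: send a string from the watch to the paired iPhone app.
class ConnectivityManager: NSObject, WCSessionDelegate {
    func activate() {
        guard WCSession.isSupported() else { return }
        let session = WCSession.default
        session.delegate = self
        session.activate()
    }

    func send(text: String) {
        // sendMessage requires the counterpart app to be reachable;
        // transferUserInfo would queue the payload for later delivery instead.
        WCSession.default.sendMessage(["string": text], replyHandler: nil, errorHandler: nil)
    }

    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}
}
```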
I am trying to write an app for iOS 8. This app will be written in Swift. I have looked at some YouTube videos on capturing and playing video; it seems that I have to use AVKit to do this.
After capturing the video I want to be able to send the video to a server so that it can be accessed by other users of this app.
So my question is how do I get my app to record video, send that video to a server, and also be able to play videos from that server.
To record a video (note this needs import MobileCoreServices for kUTTypeMovie):
func startCaptureVideoBlogFromViewController(viewcontroller: UIViewController, withDelegate delegate: protocol<UIImagePickerControllerDelegate, UINavigationControllerDelegate>) -> Bool {
    if UIImagePickerController.isSourceTypeAvailable(.Camera) == false {
        return false
    }
    let cameraController = UIImagePickerController()
    cameraController.sourceType = .Camera
    cameraController.mediaTypes = [kUTTypeMovie as String]
    cameraController.allowsEditing = false
    cameraController.delegate = delegate
    viewcontroller.presentViewController(cameraController, animated: true, completion: nil)
    return true
}
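For the playback half of the question, a minimal sketch using AVKit in the same Swift-1-era syntax as the snippet above (the server URL is a placeholder assumption):

```swift
import AVKit
import AVFoundation

// Sketch: play a remote video from a server with AVPlayerViewController.
func playVideo(from viewController: UIViewController) {
    // Placeholder URL -- replace with the real address of the uploaded video.
    guard let url = NSURL(string: "https://example.com/videos/clip.mp4") else { return }
    let player = AVPlayer(URL: url)
    let playerController = AVPlayerViewController()
    playerController.player = player
    viewController.presentViewController(playerController, animated: true) {
        player.play()
    }
}
```

Uploading the captured file to the server is a separate concern (e.g. a multipart NSURLSession upload task) and depends on the backend API.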
I want to play a sound when the user presses a button on a WKInterfaceController. Here is how I did it in my project:
- Add the AVFoundation framework to my WatchKit app.
- Import AVFoundation in my WKInterfaceController.
- Create 2 variables for the audio session and player:
var audioSession:AVAudioSession!
var player:AVAudioPlayer!
- Make 2 functions to configure the audio session and the audio player:
func configureAudioSession() {
    self.audioSession = AVAudioSession.sharedInstance()
    var categoryError: NSError?
    var activeError: NSError?
    // set the category for the audio session
    self.audioSession.setCategory(AVAudioSessionCategoryPlayback, error: &categoryError)
    println("error: \(categoryError)")
    // activate the audio session
    var success = self.audioSession.setActive(true, error: &activeError)
    if !success {
        println("error making audio session active: \(activeError)")
    }
}

func configureAudioPlayer() {
    // get the song path
    var songPath = NSBundle.mainBundle().pathForResource("Open Source - Sending My Signal", ofType: "mp3")
    // convert it to a URL
    println("songpath: \(songPath)")
    var songURL = NSURL(fileURLWithPath: songPath!)
    println("songURL: \(songURL)")
    var songError: NSError?
    // create the audio player
    self.player = AVAudioPlayer(contentsOfURL: songURL!, error: &songError)
    println("songerror: \(songError)")
    self.player.numberOfLoops = 0
}
After that I finished my button-press function like this:
@IBAction func startGameButtonPressed() {
    self.configureAudioSession()
    self.configureAudioPlayer()
    self.player.play()
}
Everything seems to work: I can see the songPath in the log and my button fires, but I cannot hear the sound. The same steps work fine in an iOS app. Maybe we can't play a sound effect with WatchKit at this time? If we can, please help me do it.
No. It is not possible to play sounds with WatchKit on the Apple Watch.
- Apple's WatchKit Evangelist
It is not currently possible to play sounds using the latest build of WatchKit.
I would suggest submitting a feature request.
Yes, it is possible to play a sound file in an Apple Watch application. You need to add the sound file separately to the Apple Watch app extension and then call AVAudioPlayer to play it.
e.g:
let path = NSBundle.mainBundle().pathForResource("coinFlip", ofType:"caf")
let fileURL = NSURL(fileURLWithPath: path!)
player = AVAudioPlayer(contentsOfURL: fileURL, error: nil)
player.prepareToPlay()
player.delegate = self
player.play()
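On watchOS 2 and later there is also a built-in way to play short audio from an interface controller, without AVAudioPlayer; a minimal sketch in the same Swift-1-era syntax as above (the file name is a placeholder, and the file must ship in the watch extension bundle):

```swift
import WatchKit

// Sketch: present the system media player for a bundled audio file (watchOS 2+).
class SoundInterfaceController: WKInterfaceController {
    func playCoinFlip() {
        // Placeholder resource name -- replace with your own bundled audio file.
        guard let url = NSBundle.mainBundle().URLForResource("coinFlip", withExtension: "caf") else { return }
        presentMediaPlayerControllerWithURL(url, options: nil) { didPlayToEnd, endTime, error in
            // the controller is dismissed when playback ends or the user taps Done
        }
    }
}
```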