How to run an Automator Workflow/Service from Cocoa app? - swift

So I tried learning Swift well enough to recreate my programs in it; that didn't go well and I didn't get very far. I then tried running my C++ functions from Obj-C++ source by calling them directly; that didn't work either, and the project refused to open again after the first time I closed it. I don't find object-oriented programming very intuitive in the first place, so I'd like to avoid Obj-C.
I already have both an Automator standalone workflow and a Service (they do the same thing): get the programs I need, display a confirmation, run the program in a terminal window with stdout, and display a notification before exiting. This is everything I need to happen when a specific button is pressed.
So how would I go about linking this button to an Automator func run() block in Swift? I know the command that needs to be used, but as I said, I don't find object-oriented programming very intuitive, so I need to know the context in which that function is used. Would the following block be enough in practice, or would it need more specification?
@IBOutlet weak var testButton(NSButton!)
@IBAction testButton(_ sender: AnyObject)
{
    let guard Bundle.main.path(forName: "test", forType: "workflow")
    else
    {
        print("Could not find 'test.workflow'")
        return
    }
    let URL = "//location of file"
    class func run(at: URL, withInput: nil)
}
Am I missing something about how to do this, or is the above enough? Secondarily, can someone please give an example of the format of a file URL where the file is located in the bundle's "Resources" folder?
Also, would the class remain the word class, or should I be specifying a custom class? Can someone please give me a real-world example of this block/concept in practice?

Here's a testButton function that should work:
@IBAction func testButton(_ sender: AnyObject) {
    guard let workflowPath = Bundle.main.path(forResource: "test", ofType: "workflow") else {
        print("Workflow resource not found")
        return
    }
    let workflowURL = URL(fileURLWithPath: workflowPath)
    do {
        try AMWorkflow.run(at: workflowURL, withInput: nil)
    } catch {
        print("Error running workflow: \(error)")
    }
}
Notes:
You need to import Automator at the top of your source file in order for the Swift compiler to recognize the AMWorkflow class
You need to add test.workflow to your project
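To answer the file URL question: a resource copied into the app bundle's Resources folder can also be fetched directly as a URL. A minimal sketch (using the same "test.workflow" resource as above; the printout is just to show what the format looks like):
if let workflowURL = Bundle.main.url(forResource: "test", withExtension: "workflow") {
    // A bundle resource URL looks like file:///path/to/YourApp.app/Contents/Resources/test.workflow
    print(workflowURL.absoluteString)
}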

Related

Ffmpeg for use in iOS application coded in Swift

I have been browsing the web, even this website, but cannot find a good option to implement FFmpeg functionality in an iOS application written in Swift.
Options looked at and reasons why they are not solutions:
SwiftFFmpeg - I am not sure it can run on iOS, plus I don't see an option to run my desired FFmpeg command.
MobileFFmpeg - Not maintained anymore; the same founders created a "better alternative" called ffmpeg-kit.
ffmpeg-kit - looks amazing, but their API only allows for interaction in the Objective-C language.
Any solutions anyone can give me?
First...
Make sure you understand what "packages" are available. You can build it yourself, but to be honest, unless you have a very specific reason, I'd just use the pre-built packages. See "8. Packages" in the ffmpeg-kit README.md to see what's on offer.
Second
Go to the Releases page and find the version you're interested in (I used FFmpegKit Native v4.5.1), then scroll down and expand the "Assets" list for that release.
For this experiment I used ffmpeg-kit-full-4.5.1-macos-xcframework.zip. If you're doing this on iOS, the workflow is basically the same; you just need to take into account that your file access is sandboxed (and yes, I've done this, and yes, it takes a long time to transcode video compared to a desktop).
Third
Create a new Xcode project. Again, for this experiment, I created a "macOS App" using "Storyboards" as the UI (you could try SwiftUI, but that's another layer of complexity this example didn't need right now).
Unzip the *xcframework.zip file you downloaded in the previous step.
In Xcode, select the "project" node, then select the macOS target (🤞 there's only one target).
Select "General" and drag the *.xcframework folders from Finder into the "Frameworks, Libraries and Embedded Content" section of your project.
Fourth
For this experiment, I opened the ViewController class (which was automatically created by Xcode) and simply added...
func syncCommand() {
    guard let session = FFmpegKit.execute("-i file1.mp4 -c:v file2.mp4") else {
        print("!! Failed to create session")
        return
    }
    let returnCode = session.getReturnCode()
    if ReturnCode.isSuccess(returnCode) {
        // SUCCESS
    } else if ReturnCode.isCancel(returnCode) {
        // CANCEL
    } else {
        print("Command failed with state \(FFmpegKitConfig.sessionState(toString: session.getState()) ?? "Unknown") and rc \(returnCode?.description ?? "Unknown").\(session.getFailStackTrace() ?? "Unknown")")
    }
}
func asyncCommand() {
    FFmpegKit.executeAsync("-i file1.mp4 -c:v file2.mp4") { session in
        guard let session = session else {
            print("!! Invalid session")
            return
        }
        guard let returnCode = session.getReturnCode() else {
            print("!! Invalid return code")
            return
        }
        print("FFmpeg process exited with state \(FFmpegKitConfig.sessionState(toString: session.getState()) ?? "Unknown") and rc \(returnCode).\(session.getFailStackTrace() ?? "Unknown")")
    } withLogCallback: { logs in
        guard let logs = logs else { return }
        // CALLED WHEN SESSION PRINTS LOGS
    } withStatisticsCallback: { stats in
        guard let stats = stats else { return }
        // CALLED WHEN SESSION GENERATES STATISTICS
    }
}
The code above is basically the "2. Execute synchronous FFmpeg commands." and "4. Execute asynchronous FFmpeg commands by providing session specific execute/log/session callbacks." examples from the ffmpeg-kit/apple documentation
!! Important !! - don't forget to add import ffmpegkit to the start of the file!
At this point, this should now compile (you'll get a couple of warnings about logs and stats not being used; you can ignore those).
Afterthoughts...
You should realise by now that the code I've provided won't actually run, for two reasons.
I've not actually called either func from anywhere (I tested by placing a call in the viewDidLoad func of the ViewController class; a sketch of that is below)
The input file used in the execute command doesn't exist. You will need to provide an actual reference to an actual file, preferably with an absolute path. This, however, may require you to change the "App Sandbox" settings under the target's "Signing and Capabilities"
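For reference, this is roughly what the test call looked like - a sketch only, with hypothetical paths you'd replace with files your sandbox can actually reach:
override func viewDidLoad() {
    super.viewDidLoad()
    // Hypothetical absolute paths - replace with real, reachable files.
    let input = "/Users/me/Movies/input.mp4"
    let output = "/Users/me/Movies/output.mov"
    guard let session = FFmpegKit.execute("-i \(input) \(output)") else {
        print("!! Failed to create session")
        return
    }
    print("Finished with rc \(session.getReturnCode()?.description ?? "Unknown")")
}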
Xcode's auto code suggestions aren't bad; I mostly filled out the above using them, with the Obj-C code as a starting point.
Also, beware: SO is not a "tutorial" site. Xcode is a complex beast, and you may need to spend some time exploring other resources to overcome issues you encounter.

Is it necessary to use this function to play sound?

I am new to programming and new to this website, so a complete noob =)
I am doing an online course for Swift and had two questions:
The first is about the first "playSound" - is it a function? If yes, why does it not have the func keyword?
Why are we creating a function for "soundName" - can we not just retrieve the data using "sender.currentTitle!"?
I took a screenshot with comments to make it more clear.
The code:
@IBAction func keyPressed(_ sender: UIButton) {
    playSound(soundName: sender.currentTitle!)
}

func playSound(soundName: String) {
    let url = Bundle.main.url(forResource: soundName, withExtension: "wav")
    player = try! AVAudioPlayer(contentsOf: url!)
    player.play()
}
I really apologize if this is super stupid and not the right forum T_T
func playSound is a function. playSound(...) in the first part calls that function.
Very good question! It's a matter of responsibilities. We want func playSound to be able to play any .wav file whose name it is given. It is a general sound file player. It happens that when this key is pressed, we want to play this file. But we could have other keys that play other files.
(But I would go even further: I'd argue that using the title of the button to tell us which file to play is bad code. You should never use the interface itself as your data source.)
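For example, one way to decouple the two (a sketch, not from the course material; the tag-to-name mapping is made up) is to give each button a tag and look the sound name up explicitly:
@IBAction func keyPressed(_ sender: UIButton) {
    // Hypothetical mapping from button tags to sound file names, so the button's
    // visible title no longer doubles as data.
    let soundNames = [1: "C", 2: "D", 3: "E", 4: "F", 5: "G", 6: "A", 7: "B"]
    guard let soundName = soundNames[sender.tag] else { return }
    playSound(soundName: soundName)
}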

Drag a file promise from a NSView onto the desktop or another application (macOS)

I need to be able to drag a file representation (a pdf in my case) from an NSView contained within my application onto the Desktop or another application that supports opening PDF files.
I spent a few hours trying to get this working in my own app, and I thought I'd add my solution here as there are a lot of half-solutions online, some of which rely on Obj-C extensions and others which are outdated and no longer supported. I'm hoping this post is the sort of post I wish I'd found during my own searches. I'm also aware of the minutiae of the system (for example, using file coordinators instead of a direct write), but this seems to be the minimum code required to implement it.
I've also provided a simple Swift NSView implementation.
The operation occurs in three main stages.
Basic overview
You'll need to make your view (or other control) a 'Data Provider' for the drag by implementing the NSPasteboardItemDataProvider protocol. The majority of the work required (other than starting the drag) occurs in the following protocol function.
func pasteboard(_ pasteboard: NSPasteboard?, item _: NSPasteboardItem, provideDataForType type: NSPasteboard.PasteboardType)
Starting the drag
This section occurs when the drag starts. In my case, I was doing this in mouseDown(), but you could also do this in the mouseDragged for example.
Tell the pasteboard that we will provide the file type UTI for the drop (kPasteboardTypeFilePromiseContent)
Tell the pasteboard that we will provide a file promise (kPasteboardTypeFileURLPromise) for the data type specified in (1)
Responding to the receiver asking for the content that we'll provide
kPasteboardTypeFilePromiseContent
This is the first callback from the receiver of the drop (via pasteboard(pasteboard:item:provideDataForType:))
The receiver is asking us what type (UTI) of file we will provide.
Respond by setting the UTI (using setString(_:forType:) on the pasteboard object) for the type kPasteboardTypeFilePromiseContent
Responding to the receiver asking for the file
kPasteboardTypeFileURLPromise
This is the second callback from the receiver (via pasteboard(pasteboard:item:provideDataForType:))
The receiver is asking us to write the data to a file on disk.
The receiver tells us the folder to write our content to (com.apple.pastelocation)
Write the data to disk inside the folder that the receiver has told us.
Respond by setting the resulting URL of the written file (using setString(_:forType:) on the pasteboard object) for the type kPasteboardTypeFileURLPromise. Note that the format of this string needs to be file:///... so .absoluteString needs to be used.
And we're done!
Sample
// Some definitions to help reduce the verbosity of our code
let PasteboardFileURLPromise = NSPasteboard.PasteboardType(rawValue: kPasteboardTypeFileURLPromise)
let PasteboardFilePromiseContent = NSPasteboard.PasteboardType(rawValue: kPasteboardTypeFilePromiseContent)
let PasteboardFilePasteLocation = NSPasteboard.PasteboardType(rawValue: "com.apple.pastelocation")

class MyView: NSView {
    override func mouseDown(with event: NSEvent) {
        let pasteboardItem = NSPasteboardItem()
        // (1, 2) Tell the pasteboard item that we will provide both file and content promises
        pasteboardItem.setDataProvider(self, forTypes: [PasteboardFileURLPromise, PasteboardFilePromiseContent])

        // Create the dragging item for the drag operation
        let draggingItem = NSDraggingItem(pasteboardWriter: pasteboardItem)
        draggingItem.setDraggingFrame(self.bounds, contents: image())

        // Start the dragging session
        beginDraggingSession(with: [draggingItem], event: event, source: self)
    }
}
Then, in your Pasteboard Item Data provider extension...
extension MyView: NSPasteboardItemDataProvider {
    func pasteboard(_ pasteboard: NSPasteboard?, item _: NSPasteboardItem, provideDataForType type: NSPasteboard.PasteboardType) {
        if type == PasteboardFilePromiseContent {
            // The receiver will send this asking for the content type for the drop, to figure out
            // whether it wants to/is able to accept the file type (3).
            // In my case, I want to be able to drop a file containing PDF from my app onto
            // the desktop or another app, so add the UTI for PDF (4).
            pasteboard?.setString("com.adobe.pdf", forType: PasteboardFilePromiseContent)
        } else if type == PasteboardFileURLPromise {
            // The receiver is interested in our data, and is happy with the format that we told it
            // about during the kPasteboardTypeFilePromiseContent request.
            // The receiver has passed us a URL where we are to write our data to (5).
            // It is now waiting for us to respond with a kPasteboardTypeFileURLPromise.
            guard let str = pasteboard?.string(forType: PasteboardFilePasteLocation),
                  let destinationFolderURL = URL(string: str) else {
                // ERROR: the receiver didn't tell us where to put the file?
                return
            }

            // Here, we build the file destination using the receiver's destination URL
            // NOTE: you need to manage duplicate filenames yourself!
            let destinationFileURL = destinationFolderURL.appendingPathComponent("dropped_file.pdf")

            // Write your data to the destination file (6). Do better error handling here!
            let pdfData = self.dataWithPDF(inside: self.bounds)
            try? pdfData.write(to: destinationFileURL, options: .atomicWrite)

            // And finally, tell the receiver where we wrote our file (7)
            pasteboard?.setString(destinationFileURL.absoluteString, forType: PasteboardFileURLPromise)
        }
    }
}
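One thing the snippet above glosses over: beginDraggingSession(with:event:source:) expects its source to conform to NSDraggingSource, so MyView needs that conformance as well. A minimal version might look like this:
extension MyView: NSDraggingSource {
    func draggingSession(_ session: NSDraggingSession, sourceOperationMaskFor context: NSDraggingContext) -> NSDragOperation {
        // Allow the promised file to be copied to the destination.
        return .copy
    }
}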
If anyone finds issues with this or it's completely incorrect please let me know! It seems to work for my app at least.
As Willeke has pointed out, Apple has some sample code for using the (newer) NSFilePromiseProvider mechanism for drag and drop.
https://developer.apple.com/documentation/appkit/documents_files_and_icloud/supporting_drag_and_drop_through_file_promises
I wish my search had started at Apple's Developer pages instead of Google 🙃. Oh well! The sample provided is valid and still works, so if this post helps someone locate more cohesive info regarding drag and drop, then fantastic.
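For anyone who prefers the newer route, here is a rough sketch of what the NSFilePromiseProvider approach looks like (this is not Apple's sample code; the file name and PDF contents are placeholders):
class MyPromiseView: NSView, NSFilePromiseProviderDelegate {
    override func mouseDown(with event: NSEvent) {
        // Promise a PDF; AppKit calls the delegate back when the receiver accepts the drop.
        let provider = NSFilePromiseProvider(fileType: "com.adobe.pdf", delegate: self)
        let draggingItem = NSDraggingItem(pasteboardWriter: provider)
        let dragImage = NSImage(data: dataWithPDF(inside: bounds)) ?? NSImage(size: bounds.size)
        draggingItem.setDraggingFrame(bounds, contents: dragImage)
        beginDraggingSession(with: [draggingItem], event: event, source: self)
    }

    // The receiver asks what the dropped file should be called.
    func filePromiseProvider(_ filePromiseProvider: NSFilePromiseProvider, fileNameForType fileType: String) -> String {
        return "dropped_file.pdf"
    }

    // The receiver supplies the destination URL; write the promised data there.
    func filePromiseProvider(_ filePromiseProvider: NSFilePromiseProvider, writePromiseTo url: URL, completionHandler: @escaping (Error?) -> Void) {
        do {
            try dataWithPDF(inside: bounds).write(to: url)
            completionHandler(nil)
        } catch {
            completionHandler(error)
        }
    }
}

// As with the older mechanism, the view still needs to act as the dragging source.
extension MyPromiseView: NSDraggingSource {
    func draggingSession(_ session: NSDraggingSession, sourceOperationMaskFor context: NSDraggingContext) -> NSDragOperation {
        return .copy
    }
}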

Accessing the media library on OS X with Swift code

I am an experienced coder, but a novice at writing code for Apple devices. I am attempting to access the media library on my OS X device using Swift. I can find dozens of examples accomplishing this task for iOS devices, and I successfully implemented some code to do this for iOS. However, I am having a difficult time trying to do the same for OS X.
Can anyone please point me to, or offer any suggestions that would help me access the media library (itunes) on an OS X device using Swift?
edit: To clarify, if I am writing for iOS I can make a call such as MPMediaQuery to query the media library. I am looking for something similar that can be used in Swift code written for OS X.
Thanks in advance.
It appears you're referring specifically to the iTunes library. It is created, maintained, and accessed through the Apple supplied iTunes application. Fortunately, iTunes is highly scriptable, either directly through AppleScript, or by incorporating AppleScript calls within your own application.
To get an idea of what's possible, start by opening the Script Editor app, located in /Applications/Utilities, and select File -> Open Dictionary...
The list includes all applications that support scripting. Choose iTunes to display a browser detailing its interface. For example, selecting iTunes Suite -> track displays the properties you can access.
How to write AppleScript code and/or how to incorporate it into your own application is far beyond the scope of a single question here. However, there are many resources on the Apple developer site that can help you get going. A logical place to begin is: AppleScript Overview.
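If you do end up driving iTunes from your own app, NSAppleScript is one way to run a script from Swift and read back the result. A rough sketch (the script text is illustrative only, and on recent macOS versions the user must also grant your app Automation permission for iTunes):
import Foundation

// Ask iTunes for the names of every track in the main library playlist.
let source = """
tell application "iTunes"
    get name of every track of library playlist 1
end tell
"""

var errorInfo: NSDictionary?
if let script = NSAppleScript(source: source) {
    let descriptor = script.executeAndReturnError(&errorInfo)
    if let errorInfo = errorInfo {
        print("AppleScript error: \(errorInfo)")
    } else {
        // The result is a list descriptor; its items are 1-indexed.
        for index in 0..<descriptor.numberOfItems {
            if let name = descriptor.atIndex(index + 1)?.stringValue {
                print(name)
            }
        }
    }
}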
I've been working on this myself for the last month off and on. This is what I was able to come up with.
You have to do the same thing with the media objects once you've loaded them. The calls are non-blocking.
I used the MediaLibrary reference in the OS X library to figure this out. It's a good place to get the rest of the details.
import Cocoa
import MediaLibrary

class ViewController: NSViewController {

    var library: MLMediaLibrary!
    var iTunes: MLMediaSource!
    var rootGroup: MLMediaGroup!

    override func viewDidLoad() {
        super.viewDidLoad()
        let options: [String: AnyObject] = [MLMediaLoadIncludeSourcesKey: [MLMediaSourceiTunesIdentifier]]
        library = MLMediaLibrary(options: options)
        // The library loads asynchronously (non-blocking): observe "mediaSources"
        // and touch the property to trigger the load.
        library.addObserver(self, forKeyPath: "mediaSources", options: NSKeyValueObservingOptions.New, context: nil)
        library.mediaSources
    }

    override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
        guard let path = keyPath else { return }
        switch path {
        case "mediaSources":
            loadSources()
        case "rootMediaGroup":
            loadRootGroup()
        default:
            print("Nothing to do for \(path)")
        }
    }

    func loadSources() {
        if let mediaSources = library.mediaSources {
            for (ident, source) in mediaSources {
                print("Ident: \(ident)")
                print("Source Ident: \(source.mediaSourceIdentifier)")
                iTunes = source
                // Same pattern again: observe "rootMediaGroup" and touch the property
                // to kick off the (non-blocking) load.
                iTunes.addObserver(self, forKeyPath: "rootMediaGroup", options: NSKeyValueObservingOptions.New, context: nil)
                iTunes.rootMediaGroup
            }
        }
    }

    func loadRootGroup() {
        if let rootGroup = iTunes.rootMediaGroup {
            print("Root Group Identifier: \(rootGroup.identifier)")
            print("Root Group Type Ident: \(rootGroup.typeIdentifier)")
        }
    }
}
I cut out my specific code, as I'm writing an uploader that takes a subset of my library for streaming to others at my office. But this should get you started.

How to link actions to File->Save/File->Open in Swift app

I have a Swift application that has data I want to save using the methods described in this question. Now I need to know the proper way to link these actions to the File -> Save/Save As menu items and the File -> Open menu item. This isn't a document-based application.
I'm running Xcode 6.4 on OS X 10.10.4.
Create an IBAction function and link it to the XIB via Interface Builder.
Create an open/save panel in that function and let the user select the file name and location; use the returned NSURL array for the saving/loading path (after converting to the required object type, of course).
There are lots of code examples around, in either Objective-C or Swift.
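For instance, a pair of actions along these lines (a sketch only; the default file name and the data being written are placeholders):
import Cocoa

class ViewController: NSViewController {
    @IBAction func saveClicked(_ sender: Any?) {
        let panel = NSSavePanel()
        panel.nameFieldStringValue = "Untitled.txt"   // placeholder default name
        panel.begin { response in
            guard response == .OK, let url = panel.url else { return }
            // Replace with whatever data your app actually needs to persist.
            let data = Data("example contents".utf8)
            try? data.write(to: url)
        }
    }

    @IBAction func openClicked(_ sender: Any?) {
        let panel = NSOpenPanel()
        panel.allowsMultipleSelection = false
        panel.begin { response in
            guard response == .OK, let url = panel.urls.first else { return }
            // Load the data and hand it to whatever needs it.
            let data = try? Data(contentsOf: url)
            print("Read \(data?.count ?? 0) bytes from \(url.path)")
        }
    }
}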
In Swift 3, it may seem odd because it's using the 'First Responder', but all you have to do is add the following code to your NSViewController class that is set as the Custom Class on a storyboard. It does not have to be connected like other @IBAction functions.
class Test: NSViewController {

    @IBAction func saveDocument(_ sender: Any?) {
        // code to execute for save functionality
        // following line prints in debug to show the function is executing.
        // delete the print line below when testing is completed.
        print("save")
    }

    @IBAction func openDocument(_ sender: Any?) {
        // code to execute for open functionality here
        // following line prints in debug to show the function is executing.
        // delete the print line below when testing is completed.
        print("open")
    }
}