Is it necessary to use this function to play sound? - swift

I am new to programming and new to this website, so a complete noob =)
I am doing an online course for Swift and had two questions:
The first is about the first "playSound" - is it a function? If yes, why does it not have the func keyword?
Why are we creating a function for "soundName" - can we not just retrieve the data using "sender.currentTitle!"?
I took a screenshot with comments to make it clearer.
The code:
@IBAction func keyPressed(_ sender: UIButton) {
    playSound(soundName: sender.currentTitle!)
}

func playSound(soundName: String) {
    let url = Bundle.main.url(forResource: soundName, withExtension: "wav")
    player = try! AVAudioPlayer(contentsOf: url!)
    player.play()
}
I really apologize if this is super stupid and not the right forum T_T

func playSound is a function; the playSound(...) line in the first part calls that function. It doesn't need the func keyword there because it is a call, not a declaration.
Very good question! It's a matter of responsibilities. We want func playSound to be able to play any .wav file whose name it is given: it is a general sound-file player. It happens that when this key is pressed, we want to play this file, but we could have other keys that play other files.
(But I would go even further: I'd argue that using the title of the button to tell us which file to play is bad code. You should never use the interface as the source of your data.)
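To sketch that idea (a minimal example; the tag values and sound names are made up for illustration): keep the key-to-file mapping in code, for instance keyed by each button's tag set in Interface Builder, instead of reading the button's visible title.

import UIKit
import AVFoundation

class SoundViewController: UIViewController {
    var player: AVAudioPlayer!
    // The data lives in the code, not in the interface.
    // Tags 1...3 would be assigned to the buttons in Interface Builder.
    let soundNames = [1: "C", 2: "D", 3: "E"]

    @IBAction func keyPressed(_ sender: UIButton) {
        guard let soundName = soundNames[sender.tag] else { return }
        playSound(soundName: soundName)
    }

    func playSound(soundName: String) {
        guard let url = Bundle.main.url(forResource: soundName, withExtension: "wav") else { return }
        player = try! AVAudioPlayer(contentsOf: url)
        player.play()
    }
}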


How to drag and drop an external item for Xcode-ui-testing

In my application I allow the user to drag and drop items from Finder (or any other source of a file-based URL) into my application. What I want to do is add a mechanism that will allow me to test this in Xcode UI testing.
I can see how to use XCUIElement.press(forDuration:thenDragTo:) to test the drag and drop of a source and destination within the application, but I have been unable to find a way to test when the source of the drag is outside of the application.
In a somewhat related test, I test the copy and paste portion of the application by setting the string I want to paste into NSPasteboard.general, then using XCUIElement.typeKey("v", modifierFlags: .command) to paste it into the desired element. That is a little less than ideal, as it depends on Command-V actually being implemented as the paste command, but that is unlikely to change, so it is acceptable for my needs. (In fact, I've written an XCUIElement.paste(_ s: String) extension that makes it easy for me to add this in a test.)
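For reference, that extension is roughly the following sketch (the exact body is reconstructed from the description above, so treat it as an assumption):

import XCTest
import AppKit

extension XCUIElement {
    // Put the string on the general pasteboard, focus the element,
    // and send Command-V to paste it.
    func paste(_ s: String) {
        let pasteboard = NSPasteboard.general
        pasteboard.clearContents()
        pasteboard.setString(s, forType: .string)
        click()
        typeKey("v", modifierFlags: .command)
    }
}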
I believe that drag and drop also uses an NSPasteboard for its communications, so with a little investigation into the underlying mechanism, I should be able to set my object into the correct pasteboard just like I do for the copy and paste. I'm reasonably certain I can figure that part out. But I haven't figured out how to perform the actual drop.
My goal would be to create an XCUIElement.drop(_ url) that would set up the proper "public.file-url" object in the correct pasteboard, and then simulate/perform the drop into the element.
Any ideas?
I should note that I have already tried the following two things:
First, I used the Xcode record feature to attempt to record the drag and drop operation and see what events it would give me. Unfortunately, it records absolutely nothing.
Second, I do have a menu-based alternative where the user selects a file via the file selector. So if I could simulate the file selection, that would be a suitable testing alternative for my purposes. Unfortunately, I didn't make any progress along that path either. When I used Xcode to record the events, it recorded the menu selection but nothing that was actually done in the dialog.
Based on your comments I would recommend you read this piece of documentation:
https://developer.apple.com/documentation/xctest/xcuiapplication
Notice the init(bundleIdentifier: String) and init(url: URL) initializers. These allow you to interact with apps other than the target application.
Then you can use XCUIElement.press(forDuration:thenDragTo:)
import XCTest
import XCTApps
import ScreenObject

let notes = XCTApps.notes.app
let photos = XCTApps.photos.app

class Tests: XCTestCase {
    func testDragAndDrop() {
        photos.launch()
        notes.launch()
        photos.images.lastMatch.press(forDuration: 1, thenDragTo: notes.textViews["Note Body Text View"])
    }
}
P.S. In this example I use XCTApps because I don't want to remember or google bundle identifiers :D
https://github.com/rzakhar/XCTApps
Ok, so I haven't yet figured out the answer to my question (how to test a drag and drop), but I have come up with an acceptable workaround for my test.
Specifically, as I thought more about the pasteboard, I realized that if I'm allowing the user to drag and drop a file into my application, then I should also be allowing them to cut and paste a file into the application.
Once I had that realization, it was a reasonably simple process to test the necessary feature of my application by pasting a URL instead of dragging and dropping it. This has the added advantage that I can add the necessary test file to my testing package, keeping everything nicely self-contained.
To this end I've added the following function to my XCUIElement extension:
extension XCUIElement {
    func paste(url: URL) {
        precondition(url.isFileURL, "This must be a file URL to match the pasteboard type.")
        let pasteboard = NSPasteboard.general
        pasteboard.clearContents()
        pasteboard.setString(url.absoluteString, forType: .fileURL)
        click()
        typeKey("v", modifierFlags: .command)
    }
}
Then in my test code I add the following to trigger the event:
let mainWindow = app.windows[/*...my main window name goes here...*/]
let testBundle = Bundle(for: type(of: self))
let fileURL = testBundle.url(forResource: "Resources/simple", withExtension: "json")
mainWindow.paste(url: fileURL!)
Granted, this doesn't actually test the drag and drop, but it does test the same portion of my code, since in my AppDelegate my onPaste action method calls the same underlying method as my performDrop method.
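For concreteness, that wiring is roughly the following sketch (the names onPaste, performDrop, and the shared open(fileURL:) helper are illustrative, not the actual code):

// Both entry points funnel into one shared method, so the paste-based
// test exercises the same code path as an actual drop would.
@IBAction func onPaste(_ sender: Any) {
    if let urlString = NSPasteboard.general.string(forType: .fileURL),
       let url = URL(string: urlString) {
        open(fileURL: url)   // shared implementation
    }
}

func performDrop(with url: URL) {
    open(fileURL: url)       // same shared implementation
}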
I will wait a couple of days to see if anyone comes up with an answer to the actual question (since I would still find that useful), but if no one does, I'll accept my own answer.

How to run an Automator Workflow/Service from Cocoa app?

So I tried learning Swift well enough to recreate my programs in it; that didn't work so well, and I didn't get very far. I tried running my C++ functions from Obj-C++ source by calling the functions; that didn't work, and the project refused to open again after the first time I closed it. I don't find object-oriented programming very intuitive in the first place, so I'd like to avoid Obj-C.
I already have both a standalone Automator workflow and a Service (that do the same thing): get the programs I need, display confirmation, run the program in a terminal window with stdout, and display a notification before exiting. This is everything I need it to do when a specific button is pressed.
So how would I go about linking this button to an Automator func run() block in Swift? I know the command that needs to be used, but as I said, I don't find object-oriented programming very intuitive, so I need to know the context in which that function is used. Would the following block be enough in practice, or would it need more specification?
#IBOutlet weak var testButton(NSButton!)
#IBAction testButton(_ sender: AnyObject)
{
    let guard Bundle.main.path(forName: "test", forType: "workflow")
    else
    {
        print("Could not find 'test.workflow'")
        return
    }
    let URL = "//location of file"
    class func run(at: URL, withInput: nil)
}
Am I missing something about how to do this, or is the above enough? Secondarily, can someone please give an example of the format of a file URL where the file is located in the bundle's "Resources" folder?
Also, would the class remain the word class, or should I be specifying a custom class? Can someone please give me a real-world example of this block/concept in practice?
Here's a testButton function that should work:
@IBAction func testButton(_ sender: AnyObject) {
    guard let workflowPath = Bundle.main.path(forResource: "test", ofType: "workflow") else {
        print("Workflow resource not found")
        return
    }
    let workflowURL = URL(fileURLWithPath: workflowPath)
    do {
        try AMWorkflow.run(at: workflowURL, withInput: nil)
    } catch {
        print("Error running workflow: \(error)")
    }
}
Notes:
You need to import Automator at the top of your source file for the Swift compiler to recognize the AMWorkflow class.
You need to add test.workflow to your project.
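As for the secondary question about the format of a file URL in the bundle's Resources folder: you normally don't build that string by hand; Bundle.main.url(forResource:withExtension:) returns it directly. A sketch (the app name in the printed output is hypothetical):

if let workflowURL = Bundle.main.url(forResource: "test", withExtension: "workflow") {
    print(workflowURL)
    // Prints something like:
    // file:///Applications/MyApp.app/Contents/Resources/test.workflow
}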

Play video on Swift Error

I'm trying to play a video using Swift; however, the line of code below gives me an error:
'init(URL:)' has been renamed to 'init(url:)'
let player = AVPlayer(URL: NSURL(fileURLWithPath: path) as URL)
How can I modify this line to get rid of the error?
Thanks
Try this:
if let player = Bundle.main.url(forResource: "The name of your file", withExtension: "mp4") {
    if NSWorkspace.shared().open(player) {
    }
}
Be sure to substitute the name of your file and file type. You may want to put it in a function and call that function.
I know it's a totally different way of doing it, but it's always worked for me. I recommend trying it.
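For completeness, the error message itself points at the direct fix: the initializer parameter was renamed from URL: to url:, and you can pass a URL value instead of a bridged NSURL:

// Same line, updated for the renamed initializer.
let player = AVPlayer(url: URL(fileURLWithPath: path))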

increase volume of audio file recorded with swift

I am developing an application with swift. I would like to be able to increase the volume of a recorded file. Is there a way to do it directly inside the application?
I found AudioKit here and this question, but they didn't help me much.
Thanks!
With AudioKit
Option A:
Do you just want to import a file, then play it louder than you imported it? You can use an AKBooster for that.
import AudioKit

do {
    let file = try AKAudioFile(readFileName: "yourfile.wav")
    let player = try AKAudioPlayer(file: file)
    // Define your gain below. >1 means amplifying it to be louder.
    let booster = AKBooster(player, gain: 1.3)
    AudioKit.output = booster
    try AudioKit.start()
    // And then to play your file:
    player.play()
} catch {
    // Log your error
}
Just set the gain value of booster to make it louder.
Option B: You could also try normalizing the audio file, which essentially applies a constant multiplier across the recording (relative to the highest signal level in the recording) so it reaches a new target maximum that you define. Here, I set it to -4 dB.
let url = Bundle.main.url(forResource: "sound", withExtension: "wav")!
if let file = try? AKAudioFile(forReading: url) {
    // Set the new max level (in dB) for the gain here.
    if let normalizedFile = try? file.normalized(newMaxLevel: -4) {
        print(normalizedFile.maxLevel)
        // Play your normalizedFile...
    }
}
This method increases the amplitude of everything by a constant amount so the peak reaches the new maximum (in dB) - so it won't affect the dynamics (SNR) of your file, and it only increases by the amount needed to reach that new maximum (so you can safely apply it to ALL of your files to make them uniform).
With AVAudioPlayer
Option A: If you want to adjust/control volume, AVAudioPlayer has a volume property, but the docs say:
The playback volume for the audio player, ranging from 0.0 through 1.0 on a linear scale.
Where 1.0 is the volume of the original file and the default. So you can only make it quieter with that. Here's the code for it, in case you're interested:
let soundFileURL = Bundle.main.url(forResource: "sound", withExtension: "mp3")!
let audioPlayer = try? AVAudioPlayer(contentsOf: soundFileURL, fileTypeHint: AVFileType.mp3.rawValue)
// Only play once.
audioPlayer?.numberOfLoops = 0
// Set the volume of playback here, before starting playback.
audioPlayer?.volume = 1.0
audioPlayer?.play()
Option B: If your sound file is too quiet, it might be coming out of the receiver of the phone. In that case, you could try overriding the output port to use the speaker instead:
do {
    try AVAudioSession.sharedInstance().overrideOutputAudioPort(AVAudioSession.PortOverride.speaker)
} catch let error {
    print("Override failed: \(error)")
}
You can also set that permanently with this code (but I can't guarantee your app will get into the App Store):
try? audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord, with: AVAudioSessionCategoryOptions.defaultToSpeaker)
Option C: If Option B doesn't do it for you, you might be out of luck on making AVAudioPlayer play louder. You're best off editing the source file yourself with some external software - I can recommend Audacity as a good option for this.
Option D: One last option I've only heard of: you could also look into MPVolumeView, which has UI to control the system output and volume. I'm not too familiar with it, though - it may be approaching legacy at this point.
I want to mention a few things here because I was working on a similar problem.
Contrary to what's written in the Apple docs on the AVAudioPlayer.volume property (https://developer.apple.com/documentation/avfoundation/avaudioplayer/1389330-volume), the volume can go higher than 1.0 - and this actually works. I bumped the volume up to 100.0 in my application, and recorded audio is way louder and easier to hear.
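A minimal sketch of that, assuming an audioPlayer configured as in the earlier answer (note that large gains can clip and distort, so increase gradually):

// Values above 1.0 amplify playback, despite the documented 0.0-1.0 range.
audioPlayer?.volume = 100.0
audioPlayer?.play()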
Another thing that helped me was setting the mode of AVAudioSession like so:
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker, .allowBluetooth])
    try session.setMode(.videoRecording)
    try session.setActive(true)
} catch {
    debugPrint("Problem with AVAudioSession")
}
session.setMode(.videoRecording) is the key line here. It helps you send the audio through the louder speakers of the phone and not just the phone-call speaker next to the front-facing camera. I was having a problem with this and posted a question that helped me here:
AVAudioPlayer NOT playing through speaker after recording with AVAudioRecorder
There are several standard AudioKit DSP components that can increase the volume.
For example, you can use a simple node like AKBooster: http://audiokit.io/docs/Classes/AKBooster.html
Or use the following code:
AKSettings.defaultToSpeaker = true
See more details in these posts:
https://github.com/audiokit/AudioKit/issues/599
https://github.com/AudioKit/AudioKit/issues/586

Accessing the media library on OS X with Swift code

I am an experienced coder, but a novice at writing code for Apple devices. I am attempting to access the media library on my OS X device using Swift. I can find dozens of examples accomplishing this task for iOS devices, and I successfully implemented some code to do this for iOS. However, I am having a difficult time trying to do the same for OS X.
Can anyone please point me to, or offer any suggestions that would help me access the media library (itunes) on an OS X device using Swift?
edit: To clarify, if I am writing for iOS I can make a call such as MPMediaQuery to query the media library. I am looking for something similar that can be used in Swift code written for OS X.
Thanks in advance.
It appears you're referring specifically to the iTunes library. It is created, maintained, and accessed through the Apple-supplied iTunes application. Fortunately, iTunes is highly scriptable, either directly through AppleScript or by incorporating AppleScript calls within your own application.
To get an idea of what's possible, start by opening the Script Editor app, located in /Applications/Utilities, and select File -> Open Dictionary...
The list includes all applications that support scripting. Choose iTunes to display a browser detailing its interface. For example, selecting iTunes Suite -> track displays the properties you can access.
How to write AppleScript code and/or how to incorporate it into your own application is far beyond the scope of a single question here. However, there are many resources on the Apple developer site that can help you get going. A logical place to begin is: AppleScript Overview.
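To give a small taste of the second approach (calling AppleScript from your own app), here is a minimal sketch using NSAppleScript; the one-line script is a trivial example, not tied to any particular library layout:

import Foundation

// Ask iTunes for the name of the current track via AppleScript.
let source = "tell application \"iTunes\" to get name of current track"
if let script = NSAppleScript(source: source) {
    var error: NSDictionary?
    let result = script.executeAndReturnError(&error)
    if let error = error {
        print("AppleScript error: \(error)")
    } else {
        print(result.stringValue ?? "(no result)")
    }
}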
I've been working on this myself for the last month off and on. This is what I was able to come up with.
You have to do the same thing with the media objects once you've loaded them. The calls are non-blocking.
I used the MediaLibrary framework reference in the OS X documentation to figure this out. It's a good place to get the rest of the details.
import Cocoa
import MediaLibrary

class ViewController: NSViewController {

    var library: MLMediaLibrary!
    var iTunes: MLMediaSource!
    var rootGroup: MLMediaGroup!

    override func viewDidLoad() {
        super.viewDidLoad()
        let options: [String : AnyObject] = [MLMediaLoadIncludeSourcesKey: [MLMediaSourceiTunesIdentifier]]
        library = MLMediaLibrary(options: options)
        library.addObserver(self, forKeyPath: "mediaSources", options: NSKeyValueObservingOptions.New, context: nil)
        // Reading the property triggers the asynchronous load;
        // the observer above fires when the sources arrive.
        _ = library.mediaSources
    }

    override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
        guard let path = keyPath else { return }
        switch path {
        case "mediaSources":
            loadSources()
        case "rootMediaGroup":
            loadRootGroup()
        default:
            print("Nothing to do for \(path)")
        }
    }

    func loadSources() {
        if let mediaSources = library.mediaSources {
            for (ident, source) in mediaSources {
                print("Ident: \(ident)")
                print("Source Ident: \(source.mediaSourceIdentifier)")
                iTunes = source
                iTunes.addObserver(self, forKeyPath: "rootMediaGroup", options: NSKeyValueObservingOptions.New, context: nil)
                // Again, reading the property kicks off the async load.
                _ = iTunes.rootMediaGroup
            }
        }
    }

    func loadRootGroup() {
        if let rootGroup = iTunes.rootMediaGroup {
            print("Root Group Identifier: \(rootGroup.identifier)")
            print("Root Group Type Ident: \(rootGroup.typeIdentifier)")
        }
    }
}
I cut out my specific code, as I'm writing an uploader that takes a subset of my library for streaming to others at my office. But this should get you started.