App crashes when trying to access API with Alamofire - Swift

I have code that reads the barcode from a vinyl record and then, based on the barcode result, looks up information about the record on a website called Discogs. I've registered an app on that site (which is required), but my app crashes every time it finishes reading the barcode.
I can read the barcode (get the number out of it) and it dismisses my view controller back to the main VC, which should show the result, but my app crashes before that happens.
What should I do? I am using Alamofire.
func setupCamera() {
    session = AVCaptureSession()
    let videoCaptureDevice = AVCaptureDevice.default(for: AVMediaType.video)
    let videoInput: AVCaptureDeviceInput!
    do {
        videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice!)
    } catch {
        return
    }
    if (session.canAddInput(videoInput)) {
        session.addInput(videoInput)
    } else {
        scanningNotPossible()
    }
    // Create output object.
    let metadataOutput = AVCaptureMetadataOutput()
    // Add output to the session.
    if (session.canAddOutput(metadataOutput)) {
        session.addOutput(metadataOutput)
        // Send captured data to the delegate object via a serial queue.
        metadataOutput.setMetadataObjectsDelegate(self, queue: .main)
        // Set barcode type for which to scan: EAN-13.
        metadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.ean13]
    } else {
        scanningNotPossible()
    }
    previewLayer = AVCaptureVideoPreviewLayer(session: session)
    previewLayer.frame = view.layer.bounds
    previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
    view.layer.addSublayer(previewLayer)
    // Begin the capture session.
    session.startRunning()
}
func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
    session.stopRunning()
    if let metadataObject = metadataObjects.first {
        guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
        guard let stringValue = readableObject.stringValue else { return }
        AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
        found(code: stringValue)
    }
    // dismiss(animated: true)
}
func barcodeDetected(code: String) {
    // Let the user know we've found something.
    let alert = UIAlertController(title: "Found a Barcode!", message: code, preferredStyle: UIAlertController.Style.alert)
    let theAction = UIAlertAction(title: "Search", style: .default) { (action: UIAlertAction!) in
        // Remove the spaces.
        let trimmedCode = code.trimmingCharacters(in: .whitespaces)
        // EAN or UPC?
        // Check for added "0" at beginning of code.
        let trimmedCodeString = "\(trimmedCode)"
        var trimmedCodeNoZero: String
        if trimmedCodeString.hasPrefix("0") && trimmedCodeString.count > 1 {
            trimmedCodeNoZero = String(trimmedCodeString.dropFirst())
            // Send the doctored UPC to DataService.searchAPI()
            DataService.searchAPI(codeNumber: trimmedCodeNoZero)
        } else {
            // Send the doctored EAN to DataService.searchAPI()
            DataService.searchAPI(codeNumber: trimmedCodeString)
        }
        print("popopop")
        self.navigationController?.popViewController(animated: true)
    }
    alert.addAction(theAction)
    self.present(alert, animated: true, completion: nil)
}
import Foundation
import Alamofire
import SwiftyJSON

class DataService {
    static let dataService = DataService()
    private(set) var ALBUM_FROM_DISCOGS = ""
    private(set) var YEAR_FROM_DISCOGS = ""

    static func searchAPI(codeNumber: String) {
        // The URL we will use to get our album data from Discogs
        let discogsURL = "\(DISCOGS_AUTH_URL)\(codeNumber)&?barcode&key=\(DISCOGS_KEY)&secret=\(DISCOGS_SECRET)"
        Alamofire.request(discogsURL)
            .responseJSON { response in
                var json = JSON(response.result.value!)
                let albumArtistTitle = "\(json["results"][0]["title"])"
                let albumYear = "\(json["results"][0]["year"])"
                self.dataService.ALBUM_FROM_DISCOGS = albumArtistTitle
                self.dataService.YEAR_FROM_DISCOGS = albumYear
                // Post a notification to let AlbumDetailsViewController know we have some data.
                NotificationCenter.default.post(name: NSNotification.Name(rawValue: "AlbumNotification"), object: nil)
            }
    }
}
log error:
> 2019-04-26 17:35:21.791189-0300 Discogs Barcode Example[1852:598955] -[Discogs_Barcode_Example.AlbumDetaisViewController setLabels]: unrecognized selector sent to instance 0x10230a460
2019-04-26 17:35:21.793519-0300 Discogs Barcode Example[1852:598955] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[Discogs_Barcode_Example.AlbumDetaisViewController setLabels]: unrecognized selector sent to instance 0x10230a460'
*** First throw call stack:
(0x2016c3518 0x20089e9f8 0x2015e0278 0x22d83bef8 0x2016c8d60 0x2016ca9fc 0x2016345bc 0x201634588 0x201633a7c 0x201633728 0x2015ad524 0x2016331d8 0x20201b814 0x100f1cb38 0x101358d78 0x101325708 0x10209b6f0 0x10209cc74 0x1020aa6fc 0x201654ec0 0x20164fdf8 0x20164f354 0x20384f79c 0x22d810b68 0x100f1b03c 0x2011158e0)
libc++abi.dylib: terminating with uncaught exception of type NSException

[Discogs_Barcode_Example.AlbumDetaisViewController setLabels]:
It seems that AlbumDetaisViewController was hooked up in Interface Builder to an action (or outlet) named setLabels, and you have since renamed it. Either change the name back to the original, or clear the stale connection in IB and reconnect it to the new name.
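For example, if the storyboard connection still points at a zero-argument action called setLabels, the view controller needs a method with that exact selector. This is only a sketch, using the class name from the crash log; the method body is illustrative:
import UIKit

class AlbumDetaisViewController: UIViewController {
    // The selector in the crash ("setLabels", no parameters) must still exist,
    // or the stale connection must be removed in Interface Builder.
    @IBAction func setLabels() {
        // update the labels here
    }
}
Otherwise, open the Connections inspector for the view controller in Interface Builder, delete the connection that still references setLabels, and reconnect the control to the renamed method.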

Related

AddInstanceForFactory: No factory registered for id within my func PlaySound(theSoundName: String)?

Problem: PlaySound below crashes with this error:
AddInstanceForFactory: No factory registered for id
This is the first of three problems to fix before submission to the App Store.
Setting a breakpoint before the guard let url block: NO error.
Setting a breakpoint after the guard let url block: error, coupled with a horrible static sound. Note that if I set NO breakpoints, the correct sound plays, but the above error still appears.
What am I missing?
Within my SKScene Class:
func controllerInputDetected(gamepad: GCExtendedGamepad,
                             element: GCControllerElement,
                             index: Int) {
    // rightShoulder
    if (gamepad.rightShoulder == element)
    {
        if (gamepad.rightShoulder.value != 0)
        {
            startGame()
        }
    }
}
As shown above, I call the following function by pressing a shoulder button on my Gamepad:
func startGame() {
    PlaySound(theSoundName: "roar")
}

func PlaySound(theSoundName: String) {
    guard let url = Bundle.main.url(forResource: "audio/" + theSoundName,
                                    withExtension: "mp3") else {
        print("sound not found")
        return
    }
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)
        itsSoundPlayer = try AVAudioPlayer(contentsOf: url)
        if theSoundName == "roar" {
            itsSoundPlayer?.numberOfLoops = -1 // forever
        }
        itsSoundPlayer!.play()
    }
    catch let error as NSError {
        print("error: \(error.localizedDescription)")
    }
}
FWIW, on tvOS the above error changes to:
<NSError: 0x600002cdf540; domain: FBSSceneSnapshotErrorDomain; code: 4; reason: "an unrelated condition or state was not satisfied"
Again, what am I missing?

How do I handle web socket re-connect for .cancelled case in Starscream pod?

I have a small iOS app in Swift, running on an iPhone 12 Pro, which fetches live stock prices from a public API using the Starscream pod. The app downloads, processes and displays the stock data correctly while Wi-Fi is connected. However, if I swipe up from the bottom of the device, the app screen disappears into the app icon and my debug window reports that sceneDidResignActive and then sceneDidEnterBackground are called. The debug console also reports that the .cancelled case is delivered to Starscream's didReceive(event:client:) delegate method. When I swipe up again and select the app, the scene delegate methods sceneWillEnterForeground and sceneDidBecomeActive are called, but the app is no longer updating and displaying live stock price information. The socket property is still there but is not connected. I have tried placing self.socket.connect() in the .cancelled switch case, but that stops the app from handling Wi-Fi disconnection correctly. What code should I put in my scene delegate, or in the .cancelled case, so the app continues downloading and displaying stock prices from the API?
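For reference, this is the change I tried in the .cancelled case (just a sketch of the attempt, against the delegate method shown in the class below):
case .cancelled:
    print("DEBUG: cancelled.")
    // Attempted fix: reconnect as soon as the socket is cancelled.
    // This restores the price stream after foregrounding, but it also
    // fires when Wi-Fi drops, so the disconnect handling in
    // startMonitoring() no longer behaves correctly.
    self.socket.connect()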
Here is my NetworkServices class:
//
// Services.swift
// BetVictorTask
//
// Created by Stephen Learmonth on 28/02/2022.
//
import UIKit
import Starscream
import Network
protocol NetworkServicesDelegate: AnyObject {
func sendStockInfo(stocksInfo: [String: StockInfo])
}
final class NetworkServices {
static let sharedInstance = NetworkServices()
var request = URLRequest(url: FINNHUB_SOCKET_STOCK_INFO_URL!)
var socket: WebSocket!
public private(set) var isConnected = false
var stocksInfo: [String: StockInfo] = [:]
var socketResults: [String: [StockInfo]] = [:]
weak var delegate: NetworkServicesDelegate?
var stockSymbols: [String] = []
private init() {
request.timeoutInterval = 5
socket = WebSocket(request: request)
socket.delegate = self
}
private let queue = DispatchQueue.global()
private let monitor = NWPathMonitor()
public func startMonitoring() {
monitor.start(queue: queue)
self.monitor.pathUpdateHandler = { [weak self] path in
if path.status == .satisfied {
// connect the socket
self?.socket.connect()
print("DEBUG: socket is connected")
} else {
self?.socket.disconnect()
print("DEBUG: socket is disconnected")
self?.isConnected = false
// post notification that socket is now disconnected
DispatchQueue.main.async {
print("DEBUG: Notification \"isDisconnected\" posted")
let name = Notification.Name(rawValue: isDisconnectedNotificationKey)
NotificationCenter.default.post(name: name, object: nil)
}
}
}
}
public func stopMonitoring() {
monitor.cancel()
socket.disconnect()
isConnected = false
}
func fetchStockInfo(symbols: [String], delegate: CompanyPriceListVC) {
stockSymbols = symbols
self.delegate = delegate
for symbol in symbols {
let string = FINNHUB_SOCKET_MESSAGE_STRING + symbol + "\"}"
socket.write(string: string)
}
}
private func parseJSONSocketData(_ socketString: String) {
self.socketResults = [:]
self.stocksInfo = [:]
let decoder = JSONDecoder()
do {
let socketData = try decoder.decode(SocketData.self, from: socketString.data(using: .utf8)!)
guard let stockInfoData = socketData.data else { return }
for stockInfo in stockInfoData {
let symbol = stockInfo.symbol
if self.socketResults[symbol] == nil {
self.socketResults[symbol] = [StockInfo]()
}
self.socketResults[symbol]?.append(stockInfo)
}
for (symbol, stocks) in self.socketResults {
for item in stocks {
if self.stocksInfo[symbol] == nil {
self.stocksInfo[symbol] = item
} else if item.timestamp > self.stocksInfo[symbol]!.timestamp {
self.stocksInfo[symbol] = item
}
}
}
self.delegate?.sendStockInfo(stocksInfo: self.stocksInfo)
} catch {
print("DEBUG: error: \(error.localizedDescription)")
}
}
func fetchCompanyDetails(symbol: String, completion: @escaping (CompanyInfo?, UIImage?)->()) {
let urlString = FINNHUB_HTTP_COMPANY_INFO_URL_STRING + symbol + "&token=" + FINNHUB_API_TOKEN
guard let url = URL(string: urlString) else { return }
let task = URLSession.shared.dataTask(with: url) { data, response, error in
if let error = error {
print("Error fetching company info: \(error)")
}
guard let data = data else { return }
let decoder = JSONDecoder()
do {
let companyInfo = try decoder.decode(CompanyInfo.self, from: data)
guard let logoURL = URL(string: companyInfo.logo) else { return }
let task = URLSession.shared.dataTask(with: logoURL) { data, response, error in
if let error = error {
print("Error fetching logo image: \(error)")
}
guard let data = data else { return }
guard let logoImage = UIImage(data: data) else { return }
completion(companyInfo, logoImage)
}
task.resume()
} catch {
print("Error decoding JSON: \(error)")
completion(nil, nil)
}
}
task.resume()
}
}
extension NetworkServices: WebSocketDelegate {
func didReceive(event: WebSocketEvent, client: WebSocket) {
switch event {
case .connected(_):
self.isConnected = true
DispatchQueue.main.async {
// post notification that socket is now connected
let name = Notification.Name(rawValue: isConnectedNotificationKey)
NotificationCenter.default.post(name: name, object: nil)
print("DEBUG: Notification \"isConnected\" posted")
}
case .disconnected(let reason, let code):
print("DEBUG: Got disconnected reason = \(reason) code = \(code)")
self.isConnected = false
case .cancelled:
print("DEBUG: cancelled.")
// socket = WebSocket(request: request)
// socket.delegate = self
// startMonitoring()
case .reconnectSuggested(let suggestReconnect):
// print("DEBUG: suggestReconnect = \(suggestReconnect)")
break
case .viabilityChanged(let viabilityChanged):
// print("DEBUG: viabilityChanged = \(viabilityChanged)")
break
case .error(let error):
print("DEBUG: error: \(String(describing: error?.localizedDescription))")
case .text(let socketString):
// print("DEBUG: .text available")
parseJSONSocketData(socketString)
default:
break
}
}
}

'Unsupported type found - use -availableMetadataObjectTypes' issue

I am trying to create a QR reader. However, when I open the window with the scanner, it crashes with the error "Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCaptureMetadataOutput setMetadataObjectTypes:] Unsupported type found - use -availableMetadataObjectTypes'"
This is my code:
import UIKit
import AVFoundation
import Alamofire
import SwiftyJSON

class CameraTwoViewController: UIViewController,
    AVCaptureMetadataOutputObjectsDelegate {

    @IBOutlet weak var square: UIImageView!
    var video = AVCaptureVideoPreviewLayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        // Creating session
        let session = AVCaptureSession()
        // Define capture device
        let captureDevice = AVCaptureDevice.default(for: .video)
        do
        {
            let input = try AVCaptureDeviceInput(device: captureDevice!)
        }
        catch
        {
            print("ERROR")
        }
        let output = AVCaptureMetadataOutput()
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
        output.metadataObjectTypes = [AVMetadataObject.ObjectType.qr]
        //output.metadataObjectTypes = [AVMetadataObject.availableMetadataObjectTypes.qr]
        video = AVCaptureVideoPreviewLayer(session: session)
        video.frame = view.layer.bounds
        view.layer.addSublayer(video)
        self.view.bringSubview(toFront: square)
        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
        if metadataObjects != nil && metadataObjects.count != 0 {
            if let object = metadataObjects[0] as? AVMetadataMachineReadableCodeObject {
                if object.type == AVMetadataObject.ObjectType.qr {
                    let alert = UIAlertController(title: "Your code is:", message: object.stringValue, preferredStyle: .alert)
                    alert.addAction(UIAlertAction(title: "Retake", style: .default, handler: nil))
                    alert.addAction(UIAlertAction(title: "Copy", style: .default, handler: { (nil) in
                        UIPasteboard.general.string = object.stringValue
                    }))
                    present(alert, animated: true, completion: nil)
                }
            }
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
Thank you in advance!
Try adding the input to the session before you add the output. Something like this:
func configureScanner() {
    guard let captureDevice = AVCaptureDevice.default(for: .video) else {
        return
    }
    var input: AVCaptureDeviceInput?
    do {
        input = try AVCaptureDeviceInput(device: captureDevice)
    } catch let error {
        print(error.localizedDescription)
    }
    guard let indeedInput = input else {
        return
    }
    captureSession = AVCaptureSession()
    captureSession!.addInput(indeedInput)
    let captureMetadataOutput = AVCaptureMetadataOutput()
    captureSession!.addOutput(captureMetadataOutput)
    captureMetadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
    captureMetadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.qr]
    ...
}
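If you want to be defensive about it, you can also consult availableMetadataObjectTypes before assigning. This small sketch would replace the metadataObjectTypes line inside configureScanner above:
// The list of available types is empty until an input has been added to the
// session, which is exactly why the original code crashed.
if captureMetadataOutput.availableMetadataObjectTypes.contains(.qr) {
    captureMetadataOutput.metadataObjectTypes = [.qr]
} else {
    print("QR metadata is not available for this capture setup")
}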

Load page into webview swift 2 from java class

I am developing an app for the iPhone using Xwebview, which enables me to download a page and then interact with the JavaScript on the downloaded page.
All of this works, but if the internet connection drops, a default local page is loaded informing the user there is no internet connection. That page displays a retry button which, when pressed, checks the internet connection: if the connection is back, the app tries to connect to the external page again and load it into the webview.
I cannot get this to work: the code downloads the page (I can see this in my session data) but I can't get that page to load back into the webview.
override func viewDidLoad() {
super.viewDidLoad()
login()
}
func login()
{
// *********** Get stored hashkey **************
let hashcode = getHashcode()
// ********** Check network connection *********
let netConnection = Connection.isConnectedToNetwork()
print("net connection: ", netConnection)
if netConnection == true
{
if hashcode != "00000"
{
print("local key found", hashcode)
// We dont have local key
let webview = WKWebView(frame: view.frame, configuration: WKWebViewConfiguration())
//webview.loadRequest(NSURLRequest(URL: NSURL(string: "about:blank")!))
view.addSubview(webview)
webview.loadPlugin(jsapi(), namespace: "jsapi")
let url:NSURL = NSURL(string: serverLocation + onlineLoginApi)!
let session = NSURLSession.sharedSession()
let request = NSMutableURLRequest(URL: url)
request.HTTPMethod = "POST"
request.cachePolicy = NSURLRequestCachePolicy.ReloadIgnoringCacheData
let paramString = "/?username=username&password=password"
request.HTTPBody = paramString.dataUsingEncoding(NSUTF8StringEncoding)
let task = session.downloadTaskWithRequest(request) {
(
let location, let response, let error) in
guard let _:NSURL = location, let _:NSURLResponse = response where error == nil else {
print("error")
return
}
let urlContents = try! NSString(contentsOfURL: location!, encoding: NSUTF8StringEncoding)
guard let _:NSString = urlContents else {
print("error")
return
}
print(urlContents)
}
task.resume()
// you must tell webview to load response
webview.loadRequest(request)
}
else{
print("local key found", hashcode)
// ********* Found local key go to site pass key over ************
let webview = WKWebView(frame: view.frame, configuration: WKWebViewConfiguration())
view.addSubview(webview)
webview.loadPlugin(jsapi(), namespace: "jsapi")
let req = NSMutableURLRequest(URL: NSURL(string:serverLocation + onlineLoginApi + "?hashcode=\(hashcode)")!)
req.HTTPMethod = "POST"
req.HTTPBody = "/?hashcode=\(hashcode)".dataUsingEncoding(NSUTF8StringEncoding)
NSURLSession.sharedSession().dataTaskWithRequest(req)
{ data, response, error in
if error != nil
{
//Your HTTP request failed.
print(error!.localizedDescription)
} else {
//Your HTTP request succeeded
print(String(data: data!, encoding: NSUTF8StringEncoding))
}
}.resume()
webview.loadRequest(req)
}
}
else{
// No connection to internet
let webview = WKWebView(frame: view.frame, configuration: WKWebViewConfiguration())
view.addSubview(webview)
webview.loadPlugin(jsapi(), namespace: "jsapi")
let root = NSBundle.mainBundle().resourceURL!
let url = root.URLByAppendingPathComponent("/www/error-no-connection.html")
webview.loadFileURL(url, allowingReadAccessToURL: root)
print("No internet connection")
}
}
class jsapi: NSObject {
// Reconnect button on interface
func retryConnection()
{
print("Reconnect clicked")
dispatch_async(dispatch_get_main_queue())
{
let netConnections = Connection.isConnectedToNetwork()
if netConnections == true {
let netalert = UIAlertView(title: "Internet on line", message: nil, delegate: nil, cancelButtonTitle: "OK")
netalert.show()
let url = self.serverLocation + self.onlineLoginApi
let hashcode = ViewController().getHashcode()
if(hashcode != "00000") {
let url = url + "?hashcode=\(hashcode)"
print("url: ", url)
}
ViewController().loadPagelive(url)
}
else{
let netalert = UIAlertView(title: "Internet off line", message: nil, delegate: nil, cancelButtonTitle: "OK")
netalert.show()
}
}
print("retryConnect end")
}
}
You are calling loadPagelive(url) on a new instance of your ViewController, not on the one currently shown on screen; that is why you don't see any update.
You should create a delegate or a completion block in order to execute code on the ViewController instance that is actually loaded on screen: every time you write ViewController(), a new object is created.
You can try the delegate pattern, which is simple to achieve. I will focus on the important part and create something that can be used with your existing code:
class ViewController: UIViewController {
    let jsApi = jsapi() // You can use only 1 instance

    override func viewDidLoad() {
        super.viewDidLoad()
        // Set your ViewController as a delegate, so the jsapi can update it
        jsApi.viewController = self
        login()
    }

    func loadPagelive(_ url: URL) {
        // Load page, probably you already have it
    }
}

class jsapi: NSObject {
    weak var viewController: ViewController?

    func retryConnection() {
        // We check if the delegate is set, otherwise it won't work
        guard let viewController = viewController else {
            print("Error: delegate not available")
            return
        }
        [... your code ...]
        // We call the original (and hopefully unique) instance of ViewController
        viewController.loadPagelive(url)
    }
}
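If you prefer a closure over a delegate, the same idea works with a callback property. This is only a rough sketch of that alternative; the URL here is a placeholder, and the URL building stays whatever you already have in retryConnection():
import UIKit

class jsapi: NSObject {
    // Set by the view controller that owns the web view.
    var onRetry: ((URL) -> Void)?

    func retryConnection() {
        // ... your existing connectivity check and URL building here ...
        guard let url = URL(string: "https://example.com/login") else { return } // placeholder
        onRetry?(url)
    }
}

class ViewController: UIViewController {
    let jsApi = jsapi()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Hand jsapi the work to do on a successful retry, bound to this instance.
        jsApi.onRetry = { [weak self] url in
            self?.loadPagelive(url)
        }
        login()
    }

    func loadPagelive(_ url: URL) { /* already in your code */ }
    func login() { /* already in your code */ }
}
Either way, the important point is the same: the callback (or delegate) targets the ViewController instance that is on screen, instead of a fresh ViewController() created inside jsapi.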

Thread 1: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)

My code is posted below. I'm not sure what the issue is, but the errors returned are the one above along with fatal error:
unexpectedly found nil while unwrapping an Optional value
I have seen other answers to similar errors, but those all involve things such as if let, which is not where my error seems to be occurring. The error message points to a line near the top that says "audioFile = try AVAudioFile(forReading: recordedAudioURL as URL)".
//
// PlaySoundsViewController+Audio.swift
// PitchPerfect
//
// Copyright © 2016 Udacity. All rights reserved.
//
import UIKit
import AVFoundation
extension PlaySoundsViewController: AVAudioPlayerDelegate {
struct Alerts {
static let DismissAlert = "Dismiss"
static let RecordingDisabledTitle = "Recording Disabled"
static let RecordingDisabledMessage = "You've disabled this app from recording your microphone. Check Settings."
static let RecordingFailedTitle = "Recording Failed"
static let RecordingFailedMessage = "Something went wrong with your recording."
static let AudioRecorderError = "Audio Recorder Error"
static let AudioSessionError = "Audio Session Error"
static let AudioRecordingError = "Audio Recording Error"
static let AudioFileError = "Audio File Error"
static let AudioEngineError = "Audio Engine Error"
}
// raw values correspond to sender tags
enum PlayingState { case Playing, NotPlaying }
// MARK: Audio Functions
func setupAudio() {
// initialize (recording) audio file
do {
audioFile = try AVAudioFile(forReading: recordedAudioURL as URL)
} catch {
showAlert(title: Alerts.AudioFileError, message: String(describing: error))
}
print("Audio has been setup")
}
func playSound(rate: Float? = nil, pitch: Float? = nil, echo: Bool = false, reverb: Bool = false) {
// initialize audio engine components
audioEngine = AVAudioEngine()
// node for playing audio
audioPlayerNode = AVAudioPlayerNode()
audioEngine.attach(audioPlayerNode)
// node for adjusting rate/pitch
let changeRatePitchNode = AVAudioUnitTimePitch()
if let pitch = pitch {
changeRatePitchNode.pitch = pitch
}
if let rate = rate {
changeRatePitchNode.rate = rate
}
audioEngine.attach(changeRatePitchNode)
// node for echo
let echoNode = AVAudioUnitDistortion()
echoNode.loadFactoryPreset(.multiEcho1)
audioEngine.attach(echoNode)
// node for reverb
let reverbNode = AVAudioUnitReverb()
reverbNode.loadFactoryPreset(.cathedral)
reverbNode.wetDryMix = 50
audioEngine.attach(reverbNode)
// connect nodes
if echo == true && reverb == true {
connectAudioNodes(nodes: audioPlayerNode, changeRatePitchNode, echoNode, reverbNode, audioEngine.outputNode)
} else if echo == true {
connectAudioNodes(nodes: audioPlayerNode, changeRatePitchNode, echoNode, audioEngine.outputNode)
} else if reverb == true {
connectAudioNodes(nodes: audioPlayerNode, changeRatePitchNode, reverbNode, audioEngine.outputNode)
} else {
connectAudioNodes(nodes: audioPlayerNode, changeRatePitchNode, audioEngine.outputNode)
}
// schedule to play and start the engine!
audioPlayerNode.stop()
audioPlayerNode.scheduleFile(audioFile, at: nil) {
var delayInSeconds: Double = 0
if let lastRenderTime = self.audioPlayerNode.lastRenderTime, let playerTime = self.audioPlayerNode.playerTime(forNodeTime: lastRenderTime) {
if let rate = rate {
delayInSeconds = Double(self.audioFile.length - playerTime.sampleTime) / Double(self.audioFile.processingFormat.sampleRate) / Double(rate)
} else {
delayInSeconds = Double(self.audioFile.length - playerTime.sampleTime) / Double(self.audioFile.processingFormat.sampleRate)
}
}
// schedule a stop timer for when audio finishes playing
self.stopTimer = Timer(timeInterval: delayInSeconds, target: self, selector: #selector(PlaySoundsViewController.stopAudio), userInfo: nil, repeats: false)
RunLoop.main.add(self.stopTimer!, forMode: RunLoopMode.defaultRunLoopMode)
}
do {
try audioEngine.start()
} catch {
showAlert(title: Alerts.AudioEngineError, message: String(describing: error))
return
}
// play the recording!
audioPlayerNode.play()
}
// MARK: Connect List of Audio Nodes
func connectAudioNodes(nodes: AVAudioNode...) {
for x in 0..<nodes.count-1 {
audioEngine.connect(nodes[x], to: nodes[x+1], format: audioFile.processingFormat)
}
}
func stopAudio() {
if let stopTimer = stopTimer {
stopTimer.invalidate()
}
configureUI(playState: .NotPlaying)
if let audioPlayerNode = audioPlayerNode {
audioPlayerNode.stop()
}
if let audioEngine = audioEngine {
audioEngine.stop()
audioEngine.reset()
}
}
// MARK: UI Functions
func configureUI(playState: PlayingState) {
switch(playState) {
case .Playing:
setPlayButtonsEnabled(enabled: false)
stopplaybackButton.isEnabled = true
case .NotPlaying:
setPlayButtonsEnabled(enabled: true)
stopplaybackButton.isEnabled = false
}
}
func setPlayButtonsEnabled(enabled: Bool) {
snailButton.isEnabled = enabled
chipmunkButton.isEnabled = enabled
rabbitButton.isEnabled = enabled
vaderButton.isEnabled = enabled
echoButton.isEnabled = enabled
reverbButton.isEnabled = enabled
}
func showAlert(title: String, message: String) {
let alert = UIAlertController(title: title, message: message, preferredStyle: .alert)
alert.addAction(UIAlertAction(title: Alerts.DismissAlert, style: .default, handler: nil))
self.present(alert, animated: true, completion: nil)
}
}
I was working on the same Udacity course, and I believe this is because the mic is not supported in the iOS 8.0+ simulator. One potential work-around is to drop this line into the viewDidLoad() function in the PlaySoundsViewController:
recordedAudioURL = Bundle.main.url(forResource: "yourSound", withExtension: "mp3")
...and then drag an mp3 named "yourSound.mp3" into your PitchPerfect project. It's a hack but it will stop the crash and allow you to test the audio modification buttons on that mp3, though you need to use an actual iOS device if you want to record your own samples in the app.
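Put together, the work-around looks roughly like this in PlaySoundsViewController (a sketch only: "yourSound" is a placeholder name, and it assumes viewDidLoad also calls setupAudio(), as in the course project):
override func viewDidLoad() {
    super.viewDidLoad()
    // Simulator work-around: point at a bundled mp3 instead of a mic recording.
    // The file must be added to the project and target, and the assignment must
    // happen before setupAudio() reads recordedAudioURL.
    recordedAudioURL = Bundle.main.url(forResource: "yourSound", withExtension: "mp3")
    setupAudio()
}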
EDIT: I went through my entire codebase again and checked my audio settings. In my code, I had a method (prepare(for:sender:)) incorrectly placed in PlaySoundsViewController, and as for the audio settings, once I set the mic input to my Cinema Display audio, recording worked in the simulator. Good luck.
I had this same problem, and it was because prepare(for:sender:) didn't have the correct syntax, so the audio URL was not being propagated to the PlaySoundsViewController.
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
    if segue.identifier == "stopRecording" {
        let playSoundsVC = segue.destination as! PlaySoundsViewController
        let recordedAudioURL = sender as! URL
        playSoundsVC.recordedAudioURL = recordedAudioURL
        print("setting URL in PlaySoundsViewController")
    }
}
fixed it for me.