I need to cache images one by one as I load them from an array.
When I configure the controller, I get an array of image URLs from an API. I display the images as an animation: every 0.1 seconds a new image is shown. Once all of them have been shown once, I want to use the cached images instead of downloading them again.
Some helper variables:
private var imagesCount = 0
private var nextImage = 0
private var imagesArray: [String] = []
private var isAnimating = true {
    didSet {
        if isAnimating {
            animateImage()
        }
    }
}
Here I configure the VC and assign imagesArray = images; after that I only use my array of URLs:
func configureView(images: ApiImages) {
    guard let images = images.images else { return }
    imagesArray = images
    imagesCount = images.count
    imageView.setImage(imageUrl: images[nextImage])
    nextImage += 1
    animateImage()
}
Then I start my animation. Every 0.1 seconds I show a new image, but after the end of the cycle I want to reuse the cached images:
func animateImage() {
    UIView.transition(with: self.imageView, duration: 0.1, options: .transitionCrossDissolve) { [weak self] in
        guard let self = self else { return }
        self.imageView.setImage(imageUrl: self.imagesArray[self.nextImage])
    } completion: { [weak self] _ in
        guard let self = self else { return }
        if self.nextImage == self.imagesCount - 1 {
            // here is the end of the cycle
            self.nextImage = 0
        } else {
            self.nextImage += 1
        }
        if self.isAnimating {
            self.animateImage()
        }
    }
}
I use Kingfisher, so what options do I need to pass here?
extension UIImageView {
    func setImage(imageUrl: String, completion: (() -> ())? = nil) {
        self.kf.setImage(with: URL(string: imageUrl), options: [.transition(.fade(0.5)), .alsoPrefetchToMemory]) { result in
            completion?()
        }
    }
}
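One idea I had is to warm Kingfisher's cache up front with its ImagePrefetcher, so every later setImage(imageUrl:) call is served from memory or disk instead of the network (a minimal sketch; prefetchImages() is just a name I made up):

import Kingfisher

// Sketch: prefetch every URL once so the animation loop hits the cache.
// `imagesArray` is the array of URL strings received from the API.
func prefetchImages() {
    let urls = imagesArray.compactMap { URL(string: $0) }
    ImagePrefetcher(urls: urls, completionHandler: { skipped, failed, completed in
        print("Prefetched \(completed.count) images, \(failed.count) failed")
    }).start()
}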
Thank you!
At first, the YouTube videos play fine. However, when the app goes to the background and comes back, an empty screen appears. Even if the table view is refreshed, the video screen does not return to normal, and the app must be restarted.
class HomeTableViewCell: UITableViewCell {
    var disposeBag = DisposeBag()
    var data = PublishRelay<ViewPost>()
    var isLoaded = false

    func bind() {
        data.asDriver { _ in .never() }
            .drive(onNext: { [weak self] currentPost in
                guard let self = self else { return }
                if self.isLoaded == true {
                    self.videoContainerView.cueVideo(byId: currentPost.url, startSeconds: 0, suggestedQuality: .default)
                } else {
                    self.videoContainerView.load(withVideoId: currentPost.url)
                    self.isLoaded = true
                }
                // ......
            })
            .disposed(by: disposeBag)
    }

    let videoContainerView: WKYTPlayerView = {
        let videoContainerView = WKYTPlayerView()
        videoContainerView.translatesAutoresizingMaskIntoConstraints = false
        videoContainerView.clipsToBounds = true
        videoContainerView.layer.cornerRadius = 15
        return videoContainerView
    }()
}
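For what it's worth, one recovery idea (a sketch only, not verified against WKYTPlayerView's internals; currentVideoId and foregroundObserver are hypothetical properties I would have to add) is to re-cue the video when the app returns to the foreground, since the WKWebView behind the player can lose its content while backgrounded:

// Sketch: re-cue the last video when the app becomes active again.
// `currentVideoId` and `foregroundObserver` are hypothetical stored properties.
foregroundObserver = NotificationCenter.default.addObserver(
    forName: UIApplication.willEnterForegroundNotification,
    object: nil,
    queue: .main
) { [weak self] _ in
    guard let self = self, let id = self.currentVideoId else { return }
    self.videoContainerView.cueVideo(byId: id, startSeconds: 0, suggestedQuality: .default)
}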
I'm currently making a multiplayer game with GameKit. I want to show a waiting view controller while each player receives the array of players and what they selected for their character.
Here is my extension:
extension LoadingViewController: GKMatchDelegate {
    func sendData() {
        guard let match = match else { return }
        do {
            guard let data = gameModel.encode() else { return }
            try match.sendData(toAllPlayers: data, with: .reliable)
        } catch {
            print("Send data failed")
        }
    }

    func match(_ match: GKMatch, didReceive data: Data, fromRemotePlayer player: GKPlayer) {
        guard let model = GameModel.decode(data: data) else { return }
        gameModel = model
    }
}
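For completeness, here is roughly how the match is stored when matchmaking finishes (a sketch; prepareMatch is a hypothetical helper, but the delegate assignment matters, since match(_:didReceive:fromRemotePlayer:) is never called without it):

// Sketch: remember the match and register for its callbacks.
func prepareMatch(_ newMatch: GKMatch) {
    match = newMatch
    newMatch.delegate = self // without this, didReceive is never delivered
}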
My override, which waits until two players have filled the gameModel:
override func viewDidLoad() {
    super.viewDidLoad()
    Timer.scheduledTimer(withTimeInterval: 2, repeats: true) { timer in
        self.setupPlayers()
        if self.gameModel.players.count == 2 {
            if let view = self.view as! SKView? {
                // Load the SKScene from 'GameScene.sks'
                if let scene = SKScene(fileNamed: "GameScene") as? GameScene {
                    scene.match = self.match
                    scene.gameModel = self.gameModel
                    scene.localPlayer = self.localPlayer
                    scene.size = view.bounds.size
                    scene.scaleMode = .resizeFill
                    // Present the scene
                    view.presentScene(scene)
                    timer.invalidate()
                    view.ignoresSiblingOrder = true
                    view.showsFPS = true
                    view.showsNodeCount = true
                }
            }
        }
    }
}
And setupPlayers, which is called on each timer tick to try adding a player and set their preferences:
private func setupPlayers() {
    guard let player2Name = match?.players.first?.displayName else { return }
    let player1 = Player(displayName: GKLocalPlayer.local.displayName)
    let player2 = Player(displayName: player2Name)
    var players = [player1, player2]
    players.sort { (player1, player2) -> Bool in
        player1.displayName < player2.displayName
    }
    if players.first?.displayName == GKLocalPlayer.local.displayName {
        if gameModel.players.count == 0 {
            players[0].index = .one
            players[0].race = .orc
            gameModel.players.append(players[0])
            localPlayer = players[0]
            sendData()
        }
    } else {
        if gameModel.players.count == 1 {
            players[1].index = .two
            players[1].race = .human
            gameModel.players.append(players[1])
            localPlayer = players[1]
            sendData()
        }
    }
}
However, the scene does not appear when I run simulations. I tried to track down the bug: when the first player goes through setupPlayers it works, and gameModel.players.count becomes 1, but the second player never receives the data and their own gameModel stays at one player.
Does anyone know why?
I want to list all available audio devices in Swift to provide a selection for input and output. My application should listen on one audio channel and "write" to another. I do not want the system default!
let devices = AVCaptureDevice.devices(for: .audio)
print(devices.count)
for device in devices {
    print(device.localizedName)
}
The code lists 0 devices, but I expect at least the internal output.
Some links to Core Audio, AudioToolbox, and AVFoundation that explain audio source selection would also be nice.
Here's some Swift 5 code that will enumerate all the audio devices.
You can use the uid with AVAudioPlayer's currentDevice property to output to a specific device.
import Cocoa
import AVFoundation

class AudioDevice {
    var audioDeviceID: AudioDeviceID

    init(deviceID: AudioDeviceID) {
        self.audioDeviceID = deviceID
    }

    var hasOutput: Bool {
        var address = AudioObjectPropertyAddress(
            mSelector: AudioObjectPropertySelector(kAudioDevicePropertyStreamConfiguration),
            mScope: AudioObjectPropertyScope(kAudioDevicePropertyScopeOutput),
            mElement: 0)

        var propsize: UInt32 = 0
        var result: OSStatus = AudioObjectGetPropertyDataSize(audioDeviceID, &address, 0, nil, &propsize)
        if result != 0 {
            return false
        }

        let bufferList = UnsafeMutablePointer<AudioBufferList>.allocate(capacity: Int(propsize))
        defer { bufferList.deallocate() } // avoid leaking the buffer on every call
        result = AudioObjectGetPropertyData(audioDeviceID, &address, 0, nil, &propsize, bufferList)
        if result != 0 {
            return false
        }

        let buffers = UnsafeMutableAudioBufferListPointer(bufferList)
        for bufferNum in 0..<buffers.count {
            if buffers[bufferNum].mNumberChannels > 0 {
                return true
            }
        }
        return false
    }

    var uid: String? {
        var address = AudioObjectPropertyAddress(
            mSelector: AudioObjectPropertySelector(kAudioDevicePropertyDeviceUID),
            mScope: AudioObjectPropertyScope(kAudioObjectPropertyScopeGlobal),
            mElement: AudioObjectPropertyElement(kAudioObjectPropertyElementMaster))

        var name: CFString? = nil
        var propsize = UInt32(MemoryLayout<CFString?>.size)
        let result = AudioObjectGetPropertyData(audioDeviceID, &address, 0, nil, &propsize, &name)
        if result != 0 {
            return nil
        }
        return name as String?
    }

    var name: String? {
        var address = AudioObjectPropertyAddress(
            mSelector: AudioObjectPropertySelector(kAudioDevicePropertyDeviceNameCFString),
            mScope: AudioObjectPropertyScope(kAudioObjectPropertyScopeGlobal),
            mElement: AudioObjectPropertyElement(kAudioObjectPropertyElementMaster))

        var name: CFString? = nil
        var propsize = UInt32(MemoryLayout<CFString?>.size)
        let result = AudioObjectGetPropertyData(audioDeviceID, &address, 0, nil, &propsize, &name)
        if result != 0 {
            return nil
        }
        return name as String?
    }
}

class AudioDeviceFinder {
    static func findDevices() {
        var propsize: UInt32 = 0
        var address = AudioObjectPropertyAddress(
            mSelector: AudioObjectPropertySelector(kAudioHardwarePropertyDevices),
            mScope: AudioObjectPropertyScope(kAudioObjectPropertyScopeGlobal),
            mElement: AudioObjectPropertyElement(kAudioObjectPropertyElementMaster))

        var result = AudioObjectGetPropertyDataSize(AudioObjectID(kAudioObjectSystemObject), &address, UInt32(MemoryLayout<AudioObjectPropertyAddress>.size), nil, &propsize)
        if result != 0 {
            print("Error \(result) from AudioObjectGetPropertyDataSize")
            return
        }

        let numDevices = Int(propsize / UInt32(MemoryLayout<AudioDeviceID>.size))
        var devids = [AudioDeviceID](repeating: AudioDeviceID(), count: numDevices)
        result = AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject), &address, 0, nil, &propsize, &devids)
        if result != 0 {
            print("Error \(result) from AudioObjectGetPropertyData")
            return
        }

        for i in 0..<numDevices {
            let audioDevice = AudioDevice(deviceID: devids[i])
            if audioDevice.hasOutput, let name = audioDevice.name, let uid = audioDevice.uid {
                print("Found device \"\(name)\", uid=\(uid)")
            }
        }
    }
}
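For example, to route playback to one of the discovered devices (a minimal sketch; fileURL and the chosen uid are assumed to come from your own code):

// Sketch: play a file on a specific output device by UID (macOS).
do {
    let player = try AVAudioPlayer(contentsOf: fileURL)
    player.currentDevice = uid // uid obtained from AudioDevice above
    player.play()
} catch {
    print("Player error: \(error)")
}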
The code you posted works perfectly fine for audio input devices when I paste it into an Xcode Playground.
Note, however, that the AVCaptureDevice API does not list audio output devices, as they are not capture devices but playback devices. If a device supports both input and output, you can still use the device's uniqueID in an output context, for example with AVPlayer's audioOutputDeviceUniqueID.
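For instance (a minimal sketch, macOS only; mediaURL and uid are assumed):

// Sketch: route AVPlayer output to a specific device on macOS.
let player = AVPlayer(url: mediaURL)
player.audioOutputDeviceUniqueID = uid
player.play()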
(Also note that if you want your code to work on iOS as well, devices(for:) has been marked as deprecated since iOS 11 and you should move to AVCaptureDevice.DiscoverySession instead.)
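A minimal sketch of that replacement (the device types listed are an assumption; pick whichever ones your app cares about):

// Sketch: enumerate audio capture devices via DiscoverySession.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInMicrophone],
    mediaType: .audio,
    position: .unspecified
)
for device in discovery.devices {
    print(device.localizedName, device.uniqueID)
}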
Regarding your request for additional info on Core Audio and AudioToolbox, this SO question has some pretty comprehensive answers on the matter. The question asks about input devices, but the answers provide enough context to understand the output side as well. There is even an answer with some (dated) Swift code. On a personal note, I have to say that calling the Core Audio API from Swift is often more pain than gain. Because of that, it might be faster, although a bit less safe, to wrap those portions of code in Objective-C or plain C and expose them via the Swift bridging header, if your project allows it.
If you want something like an action sheet and need to switch between audio devices seamlessly, use this code.
Code
import Foundation
import AVFoundation
import UIKit
@objc class AudioDeviceHandler: NSObject {

    @objc static let shared = AudioDeviceHandler()

    /// Present audio device selection alert
    /// - Parameters:
    ///   - presenterViewController: viewController where the alert needs to be presented
    ///   - sourceView: alertController source view in case of iPad
    @objc func presentAudioOutput(_ presenterViewController: UIViewController, _ sourceView: UIView) {
        let speakerTitle = "Speaker"
        let headphoneTitle = "Headphones"
        let deviceTitle = (UIDevice.current.userInterfaceIdiom == .pad) ? "iPad" : "iPhone"
        let cancelTitle = "Cancel"

        var deviceAction = UIAlertAction()
        var headphonesExist = false

        let optionMenu = UIAlertController(title: nil, message: nil, preferredStyle: .actionSheet)

        guard let availableInputs = AVAudioSession.sharedInstance().availableInputs else {
            print("No inputs available")
            return
        }

        for audioPort in availableInputs {
            switch audioPort.portType {
            case .bluetoothA2DP, .bluetoothHFP, .bluetoothLE:
                let bluetoothAction = UIAlertAction(title: audioPort.portName, style: .default) { _ in
                    self.setPreferredInput(port: audioPort)
                }
                if isCurrentOutput(portType: audioPort.portType) {
                    bluetoothAction.setValue(true, forKey: "checked")
                }
                optionMenu.addAction(bluetoothAction)
            case .builtInMic, .builtInReceiver:
                deviceAction = UIAlertAction(title: deviceTitle, style: .default, handler: { _ in
                    self.setToDevice(port: audioPort)
                })
            case .headphones, .headsetMic:
                headphonesExist = true
                let headphoneAction = UIAlertAction(title: headphoneTitle, style: .default) { _ in
                    self.setPreferredInput(port: audioPort)
                }
                if isCurrentOutput(portType: .headphones) || isCurrentOutput(portType: .headsetMic) {
                    headphoneAction.setValue(true, forKey: "checked")
                }
                optionMenu.addAction(headphoneAction)
            case .carAudio:
                let carAction = UIAlertAction(title: audioPort.portName, style: .default) { _ in
                    self.setPreferredInput(port: audioPort)
                }
                if isCurrentOutput(portType: audioPort.portType) {
                    carAction.setValue(true, forKey: "checked")
                }
                optionMenu.addAction(carAction)
            default:
                break
            }
        }

        // device action is only required if no headphones are available
        if !headphonesExist {
            if isCurrentOutput(portType: .builtInReceiver) || isCurrentOutput(portType: .builtInMic) {
                deviceAction.setValue(true, forKey: "checked")
            }
            optionMenu.addAction(deviceAction)
        }

        // configure speaker action
        let speakerAction = UIAlertAction(title: speakerTitle, style: .default) { _ in
            self.setOutputToSpeaker()
        }
        if isCurrentOutput(portType: .builtInSpeaker) {
            speakerAction.setValue(true, forKey: "checked")
        }
        optionMenu.addAction(speakerAction)

        // configure cancel action
        let cancelAction = UIAlertAction(title: cancelTitle, style: .cancel)
        optionMenu.addAction(cancelAction)

        optionMenu.modalPresentationStyle = .popover
        if let presenter = optionMenu.popoverPresentationController {
            presenter.sourceView = sourceView
            presenter.sourceRect = sourceView.bounds
        }

        presenterViewController.present(optionMenu, animated: true, completion: nil)

        // auto dismiss after 5 seconds
        DispatchQueue.main.asyncAfter(deadline: .now() + 5.0) {
            optionMenu.dismiss(animated: true, completion: nil)
        }
    }

    @objc func setOutputToSpeaker() {
        do {
            try AVAudioSession.sharedInstance().overrideOutputAudioPort(AVAudioSession.PortOverride.speaker)
        } catch let error as NSError {
            print("audioSession error turning on speaker: \(error.localizedDescription)")
        }
    }

    fileprivate func setPreferredInput(port: AVAudioSessionPortDescription) {
        do {
            try AVAudioSession.sharedInstance().setPreferredInput(port)
        } catch let error as NSError {
            print("audioSession error changing to input: \(port.portName) with error: \(error.localizedDescription)")
        }
    }

    fileprivate func setToDevice(port: AVAudioSessionPortDescription) {
        do {
            // remove speaker override if needed
            try AVAudioSession.sharedInstance().overrideOutputAudioPort(AVAudioSession.PortOverride.none)
            // set new input
            try AVAudioSession.sharedInstance().setPreferredInput(port)
        } catch let error as NSError {
            print("audioSession error changing to input: \(AVAudioSession.PortOverride.none.rawValue) with error: \(error.localizedDescription)")
        }
    }

    @objc func isCurrentOutput(portType: AVAudioSession.Port) -> Bool {
        AVAudioSession.sharedInstance().currentRoute.outputs.contains(where: { $0.portType == portType })
    }
}
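One caveat: Bluetooth and similar inputs typically only appear in availableInputs when the session category allows them. A possible setup (the category and options here are assumptions; adjust them to your app's audio behavior):

// Sketch: configure the session so Bluetooth/headset ports are listed.
do {
    try AVAudioSession.sharedInstance().setCategory(
        .playAndRecord,
        mode: .default,
        options: [.allowBluetooth, .allowBluetoothA2DP]
    )
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("audioSession setup error: \(error.localizedDescription)")
}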
How to use
class ViewController: UIViewController {

    @IBOutlet weak var audioButton: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }

    @IBAction func selectAudio(_ sender: Any) {
        // present audio device selection action sheet
        AudioDeviceHandler.shared.presentAudioOutput(self, audioButton)
    }
}
It is possible to list input and output devices. This is a simplification of stevex's answer.
For output devices:
if audioDevice.hasOutput {
    if let name = audioDevice.name,
       let uid = audioDevice.uid {
        print("Found device \"\(name)\", uid=\(uid)")
    }
}
For input devices:
if !audioDevice.hasOutput {
    if let name = audioDevice.name,
       let uid = audioDevice.uid {
        print("Found device \"\(name)\", uid=\(uid)")
    }
}
(Notice the ! before audioDevice.hasOutput.)
I have 2 (or more) reusable collection view cells, and each one has to play a different audio file. My problem is that when audio1 finishes, audio2 starts playing in the same cell that played audio1. If I play each cell manually there is no problem, but if I want all the audio files to play automatically one after the other, they all play in the same cell. How can I start the next audio file in the next cell if that cell has not yet been created?
Here is how I append to the array:
func appendToArray() {
    for (_, page) in self.resources.enumerate() {
        for (index, resource) in page.enumerate() {
            print("Reaches this point") // it does get here
            if resource.fileType() == .Audio {
                S3Client.sharedInstance.downloadResource(resourceKey: resource.value, completion: { (success, file) in
                    // let files = String(file)
                    self.audioURLs.append(file)
                    /**
                    if self.audioURLs.count == self.resources.count {
                        // print("audioURLs \(self.audioURLs[index])")
                        MediaAudioPlayer.sharedInstance.queueTrack(self.audioURLs)
                    }
                    */
                })
            }
        }
    }
}
This is the cellForItemAtIndexPath:
func collectionView(collectionView: UICollectionView, cellForItemAtIndexPath indexPath: NSIndexPath) -> UICollectionViewCell {
    // (excerpt: inside a switch on the resource type)
    case .Audio:
        let cell = collectionView.dequeueReusableCellWithReuseIdentifier(MediaAudioCell.kCellIdentifier, forIndexPath: indexPath) as! MediaAudioCell
        cell.activityIndicator.startAnimating()
        cell.activityIndicator.hidden = false
        S3Client.sharedInstance.downloadResource(resourceKey: resource.value, completion: { (success, file) in
            if success == true && file != nil {
                cell.activityIndicator.stopAnimating()
                cell.activityIndicator.hidden = true
                cell.audioURL = file!
                // Make the slider independent from one cell to another
                cell.sliderAudio.tag = indexPath.row
                cell.sliderAudio.addTarget(self, action: "sliderChange:", forControlEvents: .ValueChanged)
                // print("ArrayURL: \(file)")
                // print("CiaoCell: \(self.audioURLs.count)")
                // print("Ciao self.resources.count Cell: \(self.resources.count)")
                /**
                if self.audioURLs.count == self.resources.count {
                    // print("audioURLs \(self.audioURLs[index])")
                    let item = self.audioURLs[indexPath.row]
                    print("item: \(item)")
                }
                if self.audioURLs.count == self.resources.count {
                    // print("audioURLs \(self.audioURLs[index])")
                    // MediaAudioPlayer.sharedInstance.queueTrack(self.audioURLs)
                }
                */
                // Display the total audio length
                let asset = AVURLAsset(URL: file!, options: nil)
                let audioTracks = asset.tracksWithMediaType(AVMediaTypeAudio)
                if let audioTrack = audioTracks.first {
                    let audioDuration: CMTime = audioTrack.timeRange.duration
                    let seconds: Float64 = CMTimeGetSeconds(audioDuration)
                    cell.labelAudio.text = cell.stringFromTimeInterval(NSTimeInterval(seconds)) as String
                }
            }
        })
        return cell
}
This is part of the cell's class:
override func awakeFromNib() {
    super.awakeFromNib()
    // Automatic start after 2 seconds, if Accessibility is ON
    let delayTime = dispatch_time(DISPATCH_TIME_NOW, Int64(2 * Double(NSEC_PER_SEC)))
    dispatch_after(delayTime, dispatch_get_main_queue()) {
        if self.defaults.boolForKey("AutomaticStart") == true && self.defaults.boolForKey("goBackPressed") == false {
            if let audioURL = self.audioURL {
                // Set up AVAudioSession for recording and playing at the same time
                let session = AVAudioSession.sharedInstance()
                do {
                    try session.setCategory(AVAudioSessionCategoryPlayback)
                    try session.setActive(true)
                } catch _ {}
                // If audio is already playing, do not move on when the cell is created; keep playing.
                if MediaAudioPlayer.sharedInstance.player?.playing == true {
                    // If I set this to 'false' and remove the 'else', it does not start automatically.
                } else {
                    MediaVideoPlayer.sharedInstance.stop()
                    MediaAudioPlayer.sharedInstance.playPauseAudio(audioURL: audioURL, delegate: self)
                }
            }
        }
    }
}
And this is the player class:
class MediaAudioPlayer: NSObject, AVAudioPlayerDelegate {

    static let sharedInstance = MediaAudioPlayer()
    private var delegate: MediaAudioPlayerDelegate?
    var player: AVAudioPlayer?
    private var lastURL: NSURL?
    private var timer: NSTimer?
    internal var sliderTouched: Bool = false
    var tracks = Array<NSURL?>()
    var currentTrackIndex = 0

    override init() {
        super.init()
    }

    // MARK: Setup
    func playPauseAudio(audioURL url: NSURL, delegate: MediaAudioPlayerDelegate) {
        self.delegate?.playing = true // Reset the play button on the previous delegate
        self.delegate = delegate // Save the new delegate
        self.sliderTouched = false
        // Set up as new only when this audio has not already been set up
        if (self.lastURL == nil) || (url != self.lastURL) {
            self.lastURL = url
            self.setupAudioSession(category: AVAudioSessionCategoryPlayback)
            do { // Set up the player
                self.player = try AVAudioPlayer(contentsOfURL: url)
            } catch _ {}
            self.player?.delegate = self
            self.player?.prepareToPlay()
            timer = NSTimer.scheduledTimerWithTimeInterval(0.4, target: self, selector: #selector(MediaAudioPlayer.update), userInfo: nil, repeats: true)
        }
        // Play - Pause
        if self.player?.playing == true {
            self.player?.pause()
            self.delegate?.playing = true
        } else {
            self.player?.play()
            self.delegate?.playing = false
        }
    }

    // Convert a time interval in seconds to a minutes:seconds string
    func stringFromTimeInterval(interval: NSTimeInterval) -> NSString {
        let ti = NSInteger(interval)
        let seconds = ti % 60
        let minutes = (ti / 60) % 60
        return NSString(format: "%0.2d:%0.2d", minutes, seconds)
    }

    // MARK: Audio Session
    private func setupAudioSession(category category: String) {
        let session = AVAudioSession.sharedInstance()
        do {
            try session.setCategory(category)
            try session.setActive(true)
        } catch _ {}
    }

    // MARK: Stop
    func stop() {
        self.player?.stop()
        self.player = nil // Deinit player
        self.delegate?.playing = true
        self.delegate = nil // Deinit delegate
        self.timer?.invalidate(); self.timer = nil
        self.lastURL = nil
    }

    // MARK: Playing
    internal func playing() -> Bool {
        if player != nil {
            return player?.rate == 1.0 ? true : false
        }
        return false
    }

    // MARK: Seek
    func seekToPosition(position position: Float) {
        if let duration = self.player?.duration {
            player?.currentTime = Double(position) * duration
            self.delegate?.currentTimeAudio = stringFromTimeInterval((player?.currentTime)!) as String
        }
    }

    func update() {
        if sliderTouched == false {
            if let currentTime = self.player?.currentTime, duration = player?.duration {
                let time = Float(currentTime) / Float(duration)
                self.delegate?.sliderPosition = time
                self.delegate?.currentTimeAudio = stringFromTimeInterval((player?.currentTime)!) as String
            }
        }
    }

    // MARK: Delegate
    var counter = 0

    func audioPlayerDidFinishPlaying(player: AVAudioPlayer, successfully flag: Bool) {
        print("Called")
        self.lastURL = nil
        self.delegate?.playing = true
        /**
        if flag == true {
            nextSong(true)
        }
        */
        /**
        if ((counter + 1) == tracks.count) {
            counter = 0
            self.delegate?.playing = false
            nextSong(false)
        } else {
            self.delegate?.playing = true
            nextSong(true)
        }
        */
    }
}
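For reference, the direction I was exploring in the commented-out code above is a nextSong helper roughly like this (a sketch only; tracks and currentTrackIndex already exist on the class, and this still plays in whichever cell registered as the delegate, which is exactly my problem):

// Sketch: advance to the next queued track when one finishes.
func nextSong(play: Bool) {
    currentTrackIndex += 1
    if currentTrackIndex >= tracks.count {
        currentTrackIndex = 0 // end of the queue: wrap around and stop
        return
    }
    guard let url = tracks[currentTrackIndex] else { return }
    if play {
        if let delegate = self.delegate {
            playPauseAudio(audioURL: url, delegate: delegate)
        }
    }
}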
Thank you!!