Particles Speed Up When Recording A Video Using ARVideoKit - swift

We have an ARKit & SceneKit app that records videos using the ARVideoKit pod (link). Inside our scene we have a fire particle system that plays at a slow speed. However, when you start recording a video, the fire particles speed up, and I can't figure out why. Please see the video (here) to see the issue.
Here's a sample project you can use to reproduce the issue: Project
I would appreciate it if anyone could explain why this is happening.
Edit 1: Below is my ViewController code.
import UIKit
import ARKit
import SceneKit
import ARVideoKit
class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    @IBOutlet weak var photoButton: UIButton!
    @IBOutlet weak var videoButton: UIButton!
    @IBOutlet weak var takeVideoButtonPressedLabel: UILabel!

    // ARVideoKit Variables
    var recorder: RecordAR?
    var takenImage: UIImage?
    var takenVideoAtURL: URL?
    let recordingQueue = DispatchQueue(label: "recordingThread", attributes: .concurrent)
    var arTrackingConfig = ARWorldTrackingConfiguration()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.scene = SCNScene(named: "art.scnassets/fire.scn")!
        self.resetTracking()
    }

    func resetTracking() {
        arTrackingConfig = ARWorldTrackingConfiguration()
        arTrackingConfig.isLightEstimationEnabled = true
        sceneView.session.run(arTrackingConfig, options: [.resetTracking, .removeExistingAnchors])
    }
}

// MARK: - ARVideoKit Implementation
extension ViewController {
    @IBAction func takePhoto(_ sender: UIButton) {
        // Do Nothing
    }

    @IBAction func takeVideo(_ sender: UIButton) {
        setupCamera()
        startRecording()
    }

    func setupCamera() {
        recorder = RecordAR(ARSceneKit: sceneView)
        recorder?.prepare(arTrackingConfig)
    }

    func startRecording() {
        takeVideoButtonPressedLabel.isHidden = false
        recordingQueue.async {
            self.recorder?.record(forDuration: 5) { path in
                self.takenVideoAtURL = path
                DispatchQueue.main.async {
                    self.takeVideoButtonPressedLabel.isHidden = true
                }
            }
        }
    }
}
Edit 2: Added sample project for testing.

When getting the image from the SCNRenderer with the snapshot method, pass 0 as the time. That way the renderer will not force an extra render pass, animations stay smooth, and in your case the particles will not speed up.
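For illustration, here is a minimal sketch of that change, assuming a frame-capture routine that renders with SCNRenderer (the function and variable names are placeholders, not ARVideoKit's actual internals):

import UIKit
import SceneKit

// Hypothetical per-frame capture; `renderer` is the SCNRenderer used for
// offscreen recording and `bufferSize` is the video buffer size.
func captureFrame(renderer: SCNRenderer, bufferSize: CGSize) -> UIImage {
    // Passing the current media time here would force the renderer to
    // advance the scene's animations again, on top of the on-screen render
    // pass, which is what makes the particles appear to speed up.
    // Passing 0 renders the scene as-is, without the extra animation step.
    return renderer.snapshot(atTime: 0,
                             with: bufferSize,
                             antialiasingMode: .multisampling4X)
}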


How do I assign Y to X?

I couldn't figure out how to copy the value of one variable into another variable in Swift. Example code for this in Python would be:
def assignVariable():
    x = 1
    y = x
    return y
Result: 1
When I tried this, it didn't seem to work in Swift. Is there any solution to this, or am I doing something wrong?
Edit: the problem is at
var originalCount = countDown
It gave me "Use of unresolved identifier 'countDown'", but when I assign a literal value it works. Here's my Swift code:
import Cocoa

class MainWindow: NSWindowController {
    var hitCount = 0
    var started: Bool = false
    var timer = 10
    var colorList: [NSColor] = [NSColor.black, NSColor.blue, NSColor.brown, NSColor.cyan, NSColor.darkGray, NSColor.gray, NSColor.green, NSColor.lightGray, NSColor.magenta, NSColor.orange, NSColor.purple, NSColor.red, NSColor.white, NSColor.yellow]

    @IBOutlet weak var button1: NSButton!
    @IBOutlet weak var scrubber1: NSScrubber!
    @IBOutlet weak var display: NSTextField!

    override func windowDidLoad() {
        super.windowDidLoad()
        // Implement this method to handle any initialization after your window controller's window has been loaded from its nib file.
    }

    var countdown = 10
    var originalCount = countDown
    //(countdown, originalCount) = (10, 10) // it works if I use this instead

    func startGame() {
        if countDown > 0 || started == true {
            display.stringValue = String(countDown)
            countDown -= 1
            let seconds = 1.0
            DispatchQueue.main.asyncAfter(deadline: .now() + seconds) {
                self.startGame()
            }
        } else {
            display.stringValue = "Done " + String(hitCount) + " Taps in " + String(originalCount) + "Tap to RESET"
            started = false
            countDown = 10
        }
    }

    @IBAction func labelPress(_ sender: Any) {
        display.stringValue = "__RESET__"
        hitCount = 0
        countDown = 10
        started = false
    }

    @IBAction func buttonPressed(_ sender: Any) {
        if started == false {
            startGame()
        }
        button1.bezelColor = colorList[Int.random(in: 0..<colorList.count)]
        started = true
        button1.title = String(hitCount)
        hitCount += 1
    }
}
You can't initialise one property with another at the top level of your class, because `self` is not available yet when stored-property initializers run. Looking at your code, I don't think that originalCount needs to be a property; move it inside startGame() instead, make it a local variable, and use let since it isn't changing:
var countDown = 10

func startGame() {
    let originalCount = countDown
    if countDown > 0 || started == true {
        ...
    }
}
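For completeness, a minimal sketch of why the original line fails and the lazy alternative (the class name here is a placeholder):

import Foundation

class Game {
    var countDown = 10

    // A stored-property initializer runs before `self` is available, so
    // this does not compile:
    //   var originalCount = countDown

    // `lazy` defers the initializer to first access, when `self` exists,
    // so this is another option if it really must be a property:
    lazy var originalCount: Int = self.countDown
}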

why currentTime and duration of AVAudioPlayer are nil?

I recently faced this while creating a player. The link to the mp3 works (I checked), but at the same time the track duration and currentTime are nil.
I tried changing AVAudioPlayer to AVPlayer, but I have the same problem.
What do I need to do to correct this? Maybe I must download the mp3 before reading the duration? But it would be better to solve this without downloading the mp3 first.
import UIKit
import AVFoundation

class PlayerViewController: UIViewController {
    // Instantiate the AVFoundation audio player class
    var player: AVAudioPlayer?
    // Timer for tracking the progress
    var timer: Timer? = nil

    @IBOutlet weak var bookCoverImageView: UIImageView!
    @IBOutlet weak var timeSlider: CustomSlider!
    @IBOutlet weak var timeFromStartLabel: UILabel!
    @IBOutlet weak var remainingTimeLabel: UILabel!
    @IBOutlet weak var previousButton: UIButton!
    @IBOutlet weak var nextButton: UIButton!
    @IBOutlet weak var playButton: UIButton!
    @IBOutlet weak var bookNameLabel: UILabel!
    @IBOutlet weak var authorNameLabel: UILabel!

    override func viewDidLoad() {
        super.viewDidLoad()
        _ = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: #selector(updateSlider), userInfo: nil, repeats: true)
        do {
            // Set the path to the audio file (comes from the bundle)
            let path = URL(string: "https://firebasestorage.googleapis.com/v0/b/audio-summary-v3.appspot.com/o/%D0%92%D1%8B%D1%81%D1%82%D1%83%D0%BF%D0%BB%D0%B5%D0%BD%D0%B8%D0%B5%20%D0%B2%20%D1%81%D1%82%D0%B8%D0%BB%D0%B5%20TED%20-%20%D0%94%D0%B6%D0%B5%D1%80%D0%B5%D0%BC%D0%B8%20%D0%94%D0%BE%D0%BD%D0%BE%D0%B2%D0%B0%D0%BD.mp3?alt=media&token=58cae883-36b8-445f-8338-cc04cd518eee")
            // Unpacking the path string optional
            if let unpackedPath = path {
                try player = AVAudioPlayer(contentsOf: unpackedPath)
                timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { timer in
                    self.timeFromStartLabel.text = String(format: "%d:%02d", Int(self.player!.currentTime) / 60, Int(self.player!.currentTime) % 60)
                    self.remainingTimeLabel.text = String(format: "%d:%02d", Int(self.player!.duration - self.player!.currentTime) / 60, Int(self.player!.duration - self.player!.currentTime) % 60)
                }
                player!.play()
                timer!.fire()
            }
        } catch {
            print(error)
        }
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [.mixWithOthers, .allowAirPlay])
            print("Playback OK")
            try AVAudioSession.sharedInstance().setActive(true)
            print("Session is Active")
        } catch {
            print(error)
        }
        // I have an error here:
        timeSlider.maximumValue = Float(self.player!.duration)
    }

    @IBAction func playButtonTapped(_ sender: Any) {
        if player!.isPlaying {
            player?.stop()
            playButton.setImage(UIImage(named: "play_button"), for: .normal)
        } else {
            player?.play()
            playButton.setImage(UIImage(named: "stop_button"), for: .normal)
        }
    }

    @IBAction func timeSliderScrolling(_ sender: Any) {
        player?.stop()
        player?.currentTime = TimeInterval(timeSlider.value)
        player?.prepareToPlay()
        player?.play()
    }

    @objc func updateSlider() {
        // I have an error here:
        timeSlider.value = Float(player!.currentTime)
    }
}
AVAudioPlayer only plays local files; it cannot stream a remote URL, which is why duration and currentTime never become available here. Use AVPlayer instead, so you can stream the file and observe the current play time with addPeriodicTimeObserver.
import UIKit
import AVFoundation

class ViewController: UIViewController {
    // MARK: - Variables
    var player: AVPlayer?

    @IBAction func playAudio(_ sender: UIBarButtonItem) {
        player?.play()
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let path = Bundle.main.path(forResource: "myAudio", ofType: "m4a") else {
            debugPrint("File not found")
            return
        }
        let audioURL = URL(fileURLWithPath: path)
        let playerItem = AVPlayerItem(url: audioURL)
        player = AVPlayer(playerItem: playerItem)
        player!.addPeriodicTimeObserver(forInterval: CMTimeMakeWithSeconds(1, preferredTimescale: 1), queue: DispatchQueue.main) { _ in
            if self.player!.currentItem?.status == .readyToPlay {
                let time: Float64 = CMTimeGetSeconds(self.player!.currentTime())
                print("Current play time: \(time)")
            }
        }
    }
}
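Applied to the question's remote mp3, a minimal sketch; the URL here is a placeholder, and the duration is read from the player item only once it is ready to play:

import AVFoundation

let url = URL(string: "https://example.com/track.mp3")!   // placeholder URL
let player = AVPlayer(playerItem: AVPlayerItem(url: url))

player.addPeriodicTimeObserver(forInterval: CMTimeMakeWithSeconds(1, preferredTimescale: 1),
                               queue: .main) { time in
    // currentTime and duration only become meaningful once the item is
    // ready to play; before that, duration is indefinite.
    guard let item = player.currentItem, item.status == .readyToPlay else { return }
    let current = CMTimeGetSeconds(time)
    let duration = CMTimeGetSeconds(item.duration)
    print("Current: \(current) / Duration: \(duration)")
}
player.play()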

How to implement custom camera preview in iOS?

I am trying to implement a custom camera effect like this: Image
I thought it could be achieved in the following way.
This type of functionality is already implemented in an app available in the App Store; here is the link (link). I want to replicate that app's camera functionality, and I have already implemented something like this.
I am using the code below, in the ViewController.swift class, to achieve the functionality above:
import UIKit
import AVFoundation

@available(iOS 10.0, *)
class ViewController: UIViewController {
    @IBOutlet weak var vc: UIView!
    @IBOutlet weak var img: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        setupCamera()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        session.startRunning()
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        session.stopRunning()
    }

    @IBOutlet fileprivate var previewView: PreviewView! {
        didSet {
            previewView.videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
            previewView.layer.cornerRadius = previewView.layer.frame.size.width / 2
            previewView.clipsToBounds = true
        }
    }

    @IBOutlet fileprivate var imageView: UIImageView! {
        didSet {
            imageView.layer.cornerRadius = imageView.layer.frame.size.width / 2
            imageView.clipsToBounds = true
        }
    }

    fileprivate let session: AVCaptureSession = {
        let session = AVCaptureSession()
        session.sessionPreset = AVCaptureSessionPresetPhoto
        return session
    }()

    fileprivate let output = AVCaptureStillImageOutput()
}

@available(iOS 10.0, *)
extension ViewController {
    func setupCamera() {
        let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
        guard let input = try? AVCaptureDeviceInput(device: backCamera) else {
            fatalError("back camera not functional.")
        }
        session.addInput(input)
        session.addOutput(output)
        previewView.session = session
    }
}

// MARK: - @IBActions
@available(iOS 10.0, *)
private extension ViewController {
    @IBAction func capturePhoto() {
        if let videoConnection = output.connection(withMediaType: AVMediaTypeVideo) {
            output.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) in
                if let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer) {
                    if let cameraImage = UIImage(data: imageData) {
                        self.imageView.image = cameraImage
                        UIImageWriteToSavedPhotosAlbum(cameraImage, nil, nil, nil)
                    }
                }
            })
        }
    }
}
I also created a Preview class and assigned it to a UIView in the storyboard file.
With the code above I have achieved this image.
I now need to add an image layer of any shape as a frame on the UIView, but I have no idea how to achieve this type of functionality.
So, basically my task is: how do I add an image layer of any shape onto the UIView, and, after capturing the photo, how do I save the image together with that layer, like the final image (clue image)?
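One common approach is to show the shape image in a UIImageView above the preview and, after capture, draw the same image over the photo; a minimal sketch, where `cameraImage` and `overlayImage` are placeholders for the captured photo and the frame graphic:

import UIKit

// Composite the frame/shape graphic over the captured photo.
func composite(cameraImage: UIImage, overlayImage: UIImage) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: cameraImage.size)
    return renderer.image { _ in
        let bounds = CGRect(origin: .zero, size: cameraImage.size)
        cameraImage.draw(in: bounds)
        // Drawn full-size here for simplicity; scale and position it
        // differently if the frame only covers part of the preview.
        overlayImage.draw(in: bounds)
    }
}

The result can then be saved with UIImageWriteToSavedPhotosAlbum, as in the capture code above.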

Trigger sound play() with NSSpeechRecognizer Swift

I'm trying to trigger a sound after I say a word. The speech recognizer recognizes the word when I say it, and I've set it up so it outputs a string each time I say the command. What I'd like to do is trigger a sound after I say that specific word. This is what I have so far:
import Cocoa
import AVFoundation

class ViewController: NSViewController, NSSpeechRecognizerDelegate {
    var ping = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("ping", ofType: "mp3")!)
    var pingAudioPlayer = AVAudioPlayer()
    var sr = NSSpeechRecognizer()

    @IBOutlet var output: NSTextView?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        pingAudioPlayer = AVAudioPlayer(contentsOfURL: ping, error: nil)
        sr.delegate = self
        sr.commands = ["Ping", "Ping Mac"]
        sr.startListening()
    }

    func speechRecognizer(sender: NSSpeechRecognizer, didRecognizeCommand command: AnyObject?) {
        output!.string! += "\(command)\n"
        pingAudioPlayer.play()
    }

    override var representedObject: AnyObject? {
        didSet {
            // Update the view, if already loaded.
        }
    }
}
UPDATE:
import Cocoa
import AVFoundation

class ViewController: NSViewController, NSSpeechRecognizerDelegate {
    var ping = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("ping", ofType: "mp3")!)
    let pingAudioPlayer = AVAudioPlayer()
    var sr = NSSpeechRecognizer()

    @IBOutlet var output: NSTextView?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        sr.delegate = self
        sr.commands = ["Ping", "Ping Mac"]
        sr.startListening()
    }

    func speechRecognizer(sender: NSSpeechRecognizer, didRecognizeCommand command: AnyObject?) {
        output!.string! += "\(command)\n"
        var pingAudioPlayer = AVAudioPlayer(contentsOfURL: ping, error: nil)
        pingAudioPlayer.prepareToPlay()
        pingAudioPlayer.play()
    }

    override var representedObject: AnyObject? {
        didSet {
            // Update the view, if already loaded.
        }
    }
}
Not sure why the audio player is not playing the sound once the word is recognized. Any ideas?
You aren't telling the audio player which sound to play. Try this:
func speechRecognizer(sender: NSSpeechRecognizer, didRecognizeCommand command: AnyObject?) {
    do {
        let pingAudioPlayer = try AVAudioPlayer(contentsOfURL: ping)
        pingAudioPlayer.prepareToPlay()
        pingAudioPlayer.play()
    } catch {
        print("Error getting the audio file")
    }
}
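One caveat with this fix: pingAudioPlayer is local to the method, so it can be deallocated as soon as the method returns, which may cut the sound off. Keeping a strong reference in a property avoids that; a sketch in the same Swift 2-era style, assuming ping is the NSURL from the question:

var pingAudioPlayer: AVAudioPlayer?   // strong reference keeps the player alive

func speechRecognizer(sender: NSSpeechRecognizer, didRecognizeCommand command: AnyObject?) {
    do {
        let player = try AVAudioPlayer(contentsOfURL: ping)
        pingAudioPlayer = player
        player.prepareToPlay()
        player.play()
    } catch {
        print("Error getting the audio file")
    }
}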

Crash when repeating a sound with AudioEngine in Swift

I'm trying to play sounds with different effects. In a previous view controller I record a sound; then, on the next screen, it can be played with the effects. The first time it works OK, but the second time it crashes with the following error:
2015-08-07 13:00:45.900 Pitch Perfect[9643:1121173] 13:00:45.900 ERROR: AVAudioEngine.mm:253: AttachNode: required condition is false: !nodeimpl->HasEngineImpl()
2015-08-07 13:00:45.953 Pitch Perfect[9643:1121173] Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: !nodeimpl->HasEngineImpl()'
import UIKit
import AVFoundation

class PlaySoundsViewController: UIViewController, AVAudioPlayerDelegate {
    var receivedAudio: RecordedAudio!
    var audioPlayer: AVAudioPlayer!
    var disabledButton: UIButton!
    var firstTime: Bool = true
    var audioEngine: AVAudioEngine!
    var audioFile: AVAudioFile!
    var audioPlayerNode: AVAudioPlayerNode!
    var audioStopped: Bool!
    var typeOfSound: IntegerLiteralType!

    @IBOutlet weak var stopButton: UIButton!
    @IBOutlet weak var reverbButton: UIButton!
    @IBOutlet weak var echoButton: UIButton!
    @IBOutlet weak var darthButton: UIButton!
    @IBOutlet weak var chipmonkButton: UIButton!
    @IBOutlet weak var snailButton: UIButton!
    @IBOutlet weak var rabbitButton: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        audioPlayer = AVAudioPlayer(contentsOfURL: receivedAudio.filePathUrl, error: nil)
        audioPlayer.enableRate = true
        audioPlayer.delegate = self
        var session = AVAudioSession.sharedInstance()
        session.setCategory(AVAudioSessionCategoryPlayback, error: nil)
        audioPlayerNode = AVAudioPlayerNode()
        audioEngine = AVAudioEngine()
        audioFile = AVAudioFile(forReading: receivedAudio.filePathUrl, error: nil)
        audioStopped = true
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    func playAnimalSound(animal: String) {
        if audioStopped == false {
            resetAudio(typeOfSound)
        }
        typeOfSound = 1
        audioPlayer.currentTime = 0
        switch animal {
        case "snail":
            audioPlayer.rate = 0.5
        case "rabbit":
            audioPlayer.rate = 2.0
        default:
            showMessage("Sound not found. How can it be?")
        }
        audioPlayer.play()
        stopButton.hidden = false
        stopButton.enabled = true
    }

    @IBAction func playSnailSound(sender: UIButton) {
        highlightButton(sender)
        playAnimalSound("snail")
    }

    @IBAction func playRabbitSound(sender: UIButton) {
        highlightButton(sender)
        playAnimalSound("rabbit")
    }

    func soundEnded() {
        stopButton.hidden = true
        disabledButton.enabled = true
        if audioEngine.running {
            audioEngine.stop()
            audioEngine.reset()
        }
    }

    func playAudioWithVariablePitch(pitch: Float, type: String) {
        if audioStopped == false {
            resetAudio(typeOfSound)
        }
        audioEngine.attachNode(audioPlayerNode)
        switch type {
        case "normal":
            typeOfSound = 2
            var changePitchEffect = AVAudioUnitTimePitch()
            changePitchEffect.pitch = pitch
            audioEngine.attachNode(changePitchEffect)
            audioEngine.connect(audioPlayerNode, to: changePitchEffect, format: nil)
            audioEngine.connect(changePitchEffect, to: audioEngine.outputNode, format: nil)
        case "reverb":
            typeOfSound = 3
            var changeReverbEffect = AVAudioUnitReverb()
            changeReverbEffect.loadFactoryPreset(AVAudioUnitReverbPreset(rawValue: 4)!)
            changeReverbEffect.wetDryMix = 50
            audioEngine.attachNode(changeReverbEffect)
            audioEngine.connect(audioPlayerNode, to: changeReverbEffect, format: nil)
            audioEngine.connect(changeReverbEffect, to: audioEngine.outputNode, format: nil)
        case "delay":
            typeOfSound = 3
            var changeDelayEffect = AVAudioUnitDelay()
            audioEngine.attachNode(changeDelayEffect)
            audioEngine.connect(audioPlayerNode, to: changeDelayEffect, format: nil)
            audioEngine.connect(changeDelayEffect, to: audioEngine.outputNode, format: nil)
        default:
            showMessage("oops, there was an internal problem. Never mind")
        }
        audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: soundEnded)
        audioEngine.startAndReturnError(nil)
        stopButton.hidden = false
        stopButton.enabled = true
        audioPlayerNode.play()
    }

    func audioPlayerDidFinishPlaying(player: AVAudioPlayer!, successfully flag: Bool) {
        if flag {
            stopButton.hidden = true
            disabledButton.enabled = true
            audioStopped = true
            println("I hid stopButton and enabled the disabled one")
        }
    }

    @IBAction func playDelaySound(sender: UIButton) {
        highlightButton(sender)
        playAudioWithVariablePitch(0, type: "delay")
    }

    @IBAction func playReverbSound(sender: UIButton) {
        highlightButton(sender)
        playAudioWithVariablePitch(0, type: "reverb")
    }

    @IBAction func playChipmunkSound(sender: UIButton) {
        highlightButton(sender)
        playAudioWithVariablePitch(1000.0, type: "normal")
    }

    @IBAction func playDarthVaderSound(sender: UIButton) {
        highlightButton(sender)
        playAudioWithVariablePitch(-900.0, type: "normal")
    }

    @IBAction func stopPlaying(sender: UIButton) {
        resetAudio(typeOfSound)
        stopButton.hidden = true
        stopButton.enabled = false
        disabledButton.enabled = true
    }

    func highlightButton(button: UIButton) {
        if firstTime {
            firstTime = false
        } else {
            disabledButton.enabled = true
        }
        button.enabled = false
        disabledButton = button
    }

    func resetAudio(type: IntegerLiteralType) {
        switch type {
        case 1:
            audioPlayer.stop()
            println("case 1")
        case 2:
            println("case 2")
            if audioEngine.running {
                audioEngine.stop()
            }
            audioEngine.reset()
        case 3:
            audioEngine.stop()
        default:
            break
        }
        audioStopped = true
    }

    func showMessage(msg: String) {
        let message = UIAlertView(title: "Alert", message: msg, delegate: nil, cancelButtonTitle: "ok I won't panic")
        message.show() // the alert was created but never shown in the original
    }
}
Does anybody know why it crashes? I have researched the AVAudioEngine, AVAudioPlayer and AVAudioPlayerNode classes with no results.
Thanks
I know this is an old issue, but I didn't see the correct answer above.
The reason why it crashes is actually outlined in the error message:
AttachNode: required condition is false: !nodeimpl->HasEngineImpl()
In other words, when attaching a node, it is mandatory that the node is not already attached to an engine (!nodeimpl->HasEngineImpl()).
The solution is to remove the node with audioEngine.detachNode before attempting to attach it again.
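A minimal sketch of that fix in the question's Swift 1.2-era style; detaching in resetAudio assumes the node was attached by a previous playAudioWithVariablePitch call:

func resetAudio(type: IntegerLiteralType) {
    if audioEngine.running {
        audioEngine.stop()
    }
    audioEngine.reset()
    // Detach the player node so the next attachNode call does not trip
    // the "!nodeimpl->HasEngineImpl()" assertion.
    audioEngine.detachNode(audioPlayerNode)
    audioStopped = true
}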
In the end, the crash was caused by initializing the audioPlayerNode and audioEngine objects in viewDidLoad. Apparently they need to be instantiated every time you use them, or at least re-created after being stopped and reset.
Placing those lines at the beginning of playAudioWithVariablePitch instead of in viewDidLoad solved the crash. I still have a problem with the playback of the pitched, reverb and echo sounds: they get cut off before they should, and I don't know why yet. It has to do with the completionHandler of the audioPlayerNode.scheduleFile method.
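Concretely, that means something like this at the top of playAudioWithVariablePitch (same era-specific API as the question):

func playAudioWithVariablePitch(pitch: Float, type: String) {
    if audioStopped == false {
        resetAudio(typeOfSound)
    }
    // Fresh engine and player node per playback, so attachNode never
    // receives a node that still belongs to the previous (reset) engine.
    audioEngine = AVAudioEngine()
    audioPlayerNode = AVAudioPlayerNode()
    audioEngine.attachNode(audioPlayerNode)
    // ... effect chain, scheduleFile and play, as in the question ...
}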
In my case, my timer was firing again and again, and this code was inside the timer:
self.recognitionTask?.finish()
node.removeTap(onBus: 0)
self.request.endAudio()
self.recognitionTask = nil
// Maybe this line was causing the issue
self.audioEngine.stop()
So if you have already stopped the request, removed the tap, and stopped the engine, these lines should not be called again.
Hope this helps someone.
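A sketch of the guard that advice implies; recognitionTask, node, request and audioEngine are assumed to be properties, as in the snippet above:

func stopRecognitionIfNeeded() {
    // Tear down only once; running this twice re-stops an engine whose
    // tap is already removed and can trigger the same assertion.
    guard recognitionTask != nil else { return }
    recognitionTask?.finish()
    node.removeTap(onBus: 0)
    request.endAudio()
    recognitionTask = nil
    audioEngine.stop()
}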
It looks like you are resetting the engine after playing the variable pitch effect.
func soundEnded() {
    stopButton.hidden = true
    disabledButton.enabled = true
    if audioEngine.running {
        audioEngine.stop()
        audioEngine.reset()
    }
}

audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: soundEnded)
audioEngine.startAndReturnError(nil)
stopButton.hidden = false
stopButton.enabled = true
audioPlayerNode.play()
So the engine hasn't been set up again, with the nodes attached and the chain connected. When you then try to play the playerNode, it crashes.