iOS AVPlayer won't play short video on iPhone 7 - swift

I am trying to play an mp4 video file on a continuous loop. Short videos of about 2 seconds do not play, while longer videos (6 seconds) work fine. This happens only on iPhone 7, both on the device and in the simulator; on other devices it works well.
My code is here:
// Stop the previous player before starting a new one
player?.pause()
player?.replaceCurrentItem(with: nil)
player = nil
playerLayer?.removeFromSuperlayer()
playerLayer = nil

guard let url = Bundle.main.url(forResource: video, withExtension: "mp4") else {
    return
}

try? AVAudioSession.sharedInstance().setCategory(.playAndRecord)
try? AVAudioSession.sharedInstance().setActive(true)

player = AVPlayer(url: url)
player?.volume = 1.0
playerLayer = AVPlayerLayer(player: player)
videoContainerView.layer.addSublayer(playerLayer!)
playerLayer?.frame = videoContainerView.bounds

// Track player progress
let interval = CMTime(value: 1, timescale: 2)
player?.addPeriodicTimeObserver(forInterval: interval, queue: .main, using: { [weak player] progressTime in
    let seconds = CMTimeGetSeconds(progressTime)
    guard let duration = player?.currentItem?.duration else {
        return
    }
    let durationSeconds = CMTimeGetSeconds(duration)
    if seconds >= durationSeconds {
        // Seek back to the start and play again
        player?.seek(to: CMTime.zero, completionHandler: { _ in
            player?.play()
        })
    }
})
player?.play()
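
Not part of the original question, but for continuous playback it may be simpler to let AVFoundation do the looping rather than polling a periodic time observer. Below is a minimal sketch using AVPlayerLooper with AVQueuePlayer; the LoopingPlayerView class and playLooping(resource:) method are illustrative names, and `video` is the same resource name used above:

import AVFoundation
import UIKit

final class LoopingPlayerView: UIView {
    private var queuePlayer: AVQueuePlayer?
    private var looper: AVPlayerLooper?   // must be retained, or looping stops
    private var playerLayer: AVPlayerLayer?

    func playLooping(resource video: String) {
        guard let url = Bundle.main.url(forResource: video, withExtension: "mp4") else { return }

        let item = AVPlayerItem(url: url)
        let player = AVQueuePlayer()
        // AVPlayerLooper keeps re-enqueueing the template item for gapless
        // looping, which avoids edge cases with very short clips.
        looper = AVPlayerLooper(player: player, templateItem: item)

        let layer = AVPlayerLayer(player: player)
        layer.frame = bounds
        self.layer.addSublayer(layer)

        queuePlayer = player
        playerLayer = layer
        player.play()
    }
}

Another option is to observe the .AVPlayerItemDidPlayToEndTime notification and seek back to .zero from there; a periodic observer at a half-second interval may never report a time that reaches the full duration of a roughly 2-second clip, which could explain why only the short video fails.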

Related

AVPlayer can't play HLS streaming with rate > x2.0 on iOS 15.x

I have a problem when playing the HLS stream with a rate > 2.0 on iOS 15.x
I tried this code:
let strURL = "https://multiplatform-f.akamaihd.net/i/multi/will/bunny/big_buck_bunny_,640x360_400,640x360_700,640x360_1000,950x540_1500,.f4v.csmil/master.m3u8"
if let url = URL(string: strURL) {
    let asset = AVAsset(url: url)
    let playerItem = AVPlayerItem(asset: asset)
    player = AVPlayer(playerItem: playerItem)
    let layer = AVPlayerLayer(player: player)
    layer.frame = viewPlaying.bounds
    viewPlaying.layer.addSublayer(layer)
    player?.playImmediately(atRate: 4.0)
}
It doesn't play on iOS 15.x; if I change the rate to <= 2.0, it plays normally. How can I fix this issue?
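
This is not from the original post, but two things may be worth checking on iOS 15: the item's audioTimePitchAlgorithm (the historical default algorithm is limited to roughly 0.5x-2x, while .timeDomain and .spectral are documented for a much wider range) and AVPlayerItem.canPlayFastForward, which reports whether the item supports rates above 2.0 at all. A sketch reusing the question's strURL, player, and viewPlaying:

if let url = URL(string: strURL) {
    let playerItem = AVPlayerItem(asset: AVAsset(url: url))
    // The historical default time-pitch algorithm only covers ~0.5x-2x;
    // .spectral (and .timeDomain) are documented for a much wider rate range.
    playerItem.audioTimePitchAlgorithm = .spectral

    player = AVPlayer(playerItem: playerItem)
    let layer = AVPlayerLayer(player: player)
    layer.frame = viewPlaying.bounds
    viewPlaying.layer.addSublayer(layer)

    player?.playImmediately(atRate: 4.0)

    // Once the item reaches .readyToPlay, playerItem.canPlayFastForward
    // indicates whether this particular HLS stream supports rates above 2.0
    // (this depends on what the server exposes, e.g. I-frame playlists).
}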

AVAudioPlayer does not think it is playing when it is

I call the superPlay function to start audio playback, and a little while later I call superPlay again to stop the playback. This does not work, though, because player.isPlaying is false even though the sound is clearly playing in an infinite loop. I have no idea why; please help!
func superPlay(timeInterval: TimeInterval, soundName: String) {
    do {
        alarmSound = soundName

        // Set up the audio session
        try AVAudioSession.sharedInstance().setCategory(.playAndRecord, options: [.defaultToSpeaker, .duckOthers])
        try AVAudioSession.sharedInstance().setActive(true)

        let url = Bundle.main.url(forResource: alarmSound, withExtension: "mp3")
        player = try! AVAudioPlayer(contentsOf: url!)
        player.numberOfLoops = -1

        // Start AVAudioPlayer
        print(timeInterval)
        print(time)
        let playbackDelay = timeInterval // must be ≥ 0
        if player.isPlaying {
            player.stop()
        } else {
            player.play(atTime: player.deviceCurrentTime + playbackDelay) // time is a TimeInterval after which the audio will start
        }
    }
    catch {
        print(error)
    }
}
I have spent a couple more days debugging this now. What is happening is that the original playback is assigned to a specific AVAudioPlayer instance; for example, the print shows "Optional(<AVAudioPlayer: 0x600002802280>)".
When I call the function again to stop playback, a different AVAudioPlayer instance is created, so it does not find that the old player is still playing and goes ahead and plays the new sound on top of the old one.
I am not sure how to keep a reference to the original AVAudioPlayer so that the next call checks that stored player's .isPlaying?
This happens because you initialize a new player before stopping the current audio. Try it this way:
func superPlay(timeInterval: TimeInterval, soundName: String) {
    // If audio is already playing, stop it.
    if let audioPlayer = player, audioPlayer.isPlaying {
        audioPlayer.stop()
    }

    // Initialize the audio player with the new sound
    do {
        alarmSound = soundName

        // Set up the audio session
        try AVAudioSession.sharedInstance().setCategory(.playAndRecord, options: [.defaultToSpeaker, .duckOthers])
        try AVAudioSession.sharedInstance().setActive(true)

        let url = Bundle.main.url(forResource: alarmSound, withExtension: "mp3")
        player = try AVAudioPlayer(contentsOf: url!)
        player.numberOfLoops = -1

        // Start AVAudioPlayer
        let playbackDelay = timeInterval // must be ≥ 0
        player.play(atTime: player.deviceCurrentTime + playbackDelay) // playbackDelay is the interval after which the audio will start
    }
    catch {
        print(error)
    }
}
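
As a usage illustration (not from the original answer), with the isPlaying check up front a second call now stops the running loop before the new sound is scheduled; the sound names below are hypothetical:

// First call: "alarm1.mp3" (hypothetical file) starts looping after a 2-second delay.
superPlay(timeInterval: 2, soundName: "alarm1")

// Later call: the stored player is still the one that is looping, so
// audioPlayer.isPlaying is true and the old loop is stopped before
// "alarm2.mp3" (also hypothetical) is prepared and scheduled.
superPlay(timeInterval: 0, soundName: "alarm2")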

Swift: Playing audio using AVPlayer - Audio is not playing, cannot hear audio

I am using AVPlayer to play an mp3 audio file only. I am using a URL that I tested and that works fine. I needed to use AVPlayer because I needed to set up a UISlider programmatically, and AVPlayer is convenient. The UISlider works and updates as the audio plays. The audio might be playing, but I cannot hear the sound. I say this because the UISlider is working.
Update: You can hear the audio when building the app on a simulator. The issue occurs when building it on a device (mine is an XS Max).
Link to screen recording: https://streamable.com/nkbn8
I have tried using the same URL with AVAudioPlayer; the audio plays and you can hear it.
private func setupAudioContent() {
    let urlString = "https://s3.amazonaws.com/kargopolov/kukushka.mp3"
    if let url = URL(string: urlString) {
        audioPlayer = AVPlayer(url: url)
        let playerLayer = AVPlayerLayer(player: audioPlayer)
        self.layer.addSublayer(playerLayer)
        playerLayer.frame = self.frame

        audioPlayer?.play()
        audioPlayer?.volume = 1.0
        audioPlayer?.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)

        let interval = CMTime(value: 1, timescale: 2)
        audioPlayer?.addPeriodicTimeObserver(forInterval: interval, queue: DispatchQueue.main, using: { (progressTime) in
            let currentTime = CMTimeGetSeconds(progressTime)
            let currentTimeSecondsString = String(format: "%02d", Int(currentTime.truncatingRemainder(dividingBy: 60)))
            let currentTimeMinutesString = String(format: "%02d", Int(currentTime / 60))
            self.currentTimeLabel.text = "\(currentTimeMinutesString):\(currentTimeSecondsString)"

            if let duration = self.audioPlayer?.currentItem?.duration {
                let durationSeconds = CMTimeGetSeconds(duration)
                self.audioSlider.value = Float(currentTime / durationSeconds)
            }
        })
    }
}
override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
    if keyPath == "currentItem.loadedTimeRanges" {
        isAudioPlaying = true
        if let duration = audioPlayer?.currentItem?.duration {
            let seconds = CMTimeGetSeconds(duration)
            let secondsText = String(format: "%02d", Int(seconds) % 60)
            let minutesText = String(format: "%02d", Int(seconds) / 60)
            audioLengthLabel.text = "\(minutesText):\(secondsText)"
        }
    }
}
@objc func handleSliderChange() {
    if let duration = audioPlayer?.currentItem?.duration {
        let totalSeconds = CMTimeGetSeconds(duration)
        let value = Float64(audioSlider.value) * totalSeconds
        let seekTime = CMTime(value: Int64(value), timescale: 1)
        audioPlayer?.seek(to: seekTime, completionHandler: { (completedSeek) in
        })
    }
}
Expected result: audio plays and is audible.
Actual result: cannot hear the audio. It seems the audio is playing, just with no sound.
When using AVPlayer you should make sure your device is not in silent mode, as that will cause it to output no audio even though the volume is at max.
If you would like to keep your device in silent mode and still play the audio, you can use the following code before your .play():
do {
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
}
catch {
    // report the error
    print(error)
}

Can't get Video Tracks from AVURLAsset for HLS videos (.m3u8 format) for AVPlayer?

I am developing a custom video player to stream HLS videos from a server. I can successfully play HLS videos using AVPlayerItem and AVPlayer.
After that I want to add a subtitle track and audio tracks to my video player, so I used AVMutableComposition. The issue is that when I create an AVURLAsset for an HLS video, I cannot get the video tracks from the AVURLAsset; it always gives me 0 tracks. I tried loadValuesAsynchronously on the AVURLAsset and I tried adding a KVO observer for the "tracks" key of AVPlayerItem, but neither produced any positive result.
I am using the following code.
func playVideo() {
    let videoAsset = AVURLAsset(url: videoURL!)
    let composition = AVMutableComposition()

    // Video
    let videoTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        let tracks = videoAsset.tracks(withMediaType: .video)
        guard let track = tracks.first else {
            print("Can't get first video track")
            return
        }
        try videoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: track, at: kCMTimeZero)
    } catch {
        print(error)
        return
    }

    guard let subtitlesUrl = Bundle.main.url(forResource: "en", withExtension: "vtt") else {
        print("Can't load en.vtt from bundle")
        return
    }

    // Subtitles
    let subtitleAsset = AVURLAsset(url: subtitlesUrl)
    let subtitleTrack = composition.addMutableTrack(withMediaType: .text, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        let subTracks = subtitleAsset.tracks(withMediaType: AVMediaType.text)
        guard let subTrack = subTracks.first else {
            print("Can't get first subtitles track")
            return
        }
        try subtitleTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: subTrack, at: kCMTimeZero)
    } catch {
        print(error)
        return
    }

    // Prepare the item and play it
    let item = AVPlayerItem(asset: composition)
    self.player = AVPlayer(playerItem: item)
    self.playerLayer = AVPlayerLayer()
    self.playerLayer.frame = self.bounds
    self.playerLayer.contentsGravity = kCAGravityResizeAspect
    self.playerLayer.player = player
    self.layer.addSublayer(self.playerLayer)
    self.player.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
    self.player.play()
}
This procedure works well for .mp4 videos but not for HLS videos (.m3u8). Does anyone have a working solution for this?
Or: how can we get tracks from HLS videos using AVURLAsset? If this is not possible, how can I achieve a similar result?
Please let me know your feedback. Many thanks in advance.
For HLS, tracks(withMediaType: .video) will return an empty array.
Use player.currentItem.presentationSize.width and player.currentItem.presentationSize.height instead.
Please let me know if it works.
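
As an illustrative sketch (not part of the original answer), here is one way to read the presentation size once the item is ready; statusObservation is an assumed NSKeyValueObservation property used to keep the observer alive:

// presentationSize is only meaningful once the current item is ready to play.
statusObservation = player?.observe(\AVPlayer.currentItem?.status, options: [.new, .initial]) { player, _ in
    guard let item = player.currentItem, item.status == .readyToPlay else { return }
    // For HLS this gives the video dimensions even though tracks(withMediaType: .video) is empty.
    print("Video size: \(item.presentationSize.width) x \(item.presentationSize.height)")
}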
I didn't have the exact same problem as you, but I got around a similar problem (querying for HDR) by querying the tracks on the AVPlayerItem instead of on the AVURLAsset.
Set up an observer on the item status:
// Keep a strong reference to the returned NSKeyValueObservation (the
// `itemStatusObservation` property here is assumed), otherwise the observer
// is deallocated immediately and the handler never fires.
itemStatusObservation = player?.observe(\AVPlayer.currentItem?.status,
                                        options: [.new, .initial], changeHandler: { [weak self] player, _ in
    DispatchQueue.main.async {
        self?.observedItemStatus(from: player)
    }
})
Then query the AVMediaType of your choice (in your case text).
func observedItemStatus(from avPlayer: AVPlayer) {
    guard let currentItem = avPlayer.currentItem else { return }
    // Ideally execute code based on currentItem.status... for the brevity of this example I won't.
    let hasLegibleMedia = currentItem.tracks.first(where: {
        $0.assetTrack?.mediaType == AVMediaType.text
    })?.assetTrack?.hasMediaCharacteristic(.legible)
}
Alternatively, if you need more than just a Bool, you could loop over the tracks to access the assetTrack you really want.
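
For example (a sketch, not from the original answer), looping over the player item's tracks to collect the text/legible asset tracks:

func legibleAssetTracks(in item: AVPlayerItem) -> [AVAssetTrack] {
    var result: [AVAssetTrack] = []
    for itemTrack in item.tracks {
        // assetTrack is optional and may be nil before the item is ready.
        guard let assetTrack = itemTrack.assetTrack else { continue }
        if assetTrack.mediaType == .text || assetTrack.hasMediaCharacteristic(.legible) {
            result.append(assetTrack)
        }
    }
    return result
}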

Playing a video saved in document directory [duplicate]

This question already has answers here:
FileManager cannot find audio file
(1 answer)
AVPlayer playerWithURL not working after app restart
(2 answers)
Closed 4 years ago.
I am trying to play a video (.mov) saved in the document directory but have failed. The URL of the video was saved into Core Data. The video file was saved in the document directory (I checked by downloading the directory onto my Mac); I recorded the video on my iPhone and saved it as ".MOV". The video is there. The "videourl" in the code returned "file:///var/mobile/Containers/Data/Application/E45E8260-EA7C-4D30-91AD-D65A208D2057/Documents/myvideo.MOV".
I want to play the video when a user hits the button. When the button is hit, the AVPlayer screen shows up but the video is not played.
Please help me.
@IBAction func buttonTwoInVmDetailForWatchingAndEditing(_ sender: UIButton) {
    print("***********************************")
    print(vmIDInWatchingAndEditingViewController)

    let fetchRequest: NSFetchRequest<NSFetchRequestResult> = NSFetchRequest(entityName: "TaskAnalysisURL")
    let predicateWithVmIDInVideoPhotoView = NSPredicate(format: "vmID == %@", vmIDInWatchingAndEditingViewController as CVarArg)
    fetchRequest.predicate = predicateWithVmIDInVideoPhotoView

    do {
        let taskAnalysisURLsSelected = try context.fetch(fetchRequest)
        let taskAnalysisURLsSelectedTofetch = taskAnalysisURLsSelected as! [TaskAnalysisURL]
        if taskAnalysisURLsSelectedTofetch.count == 0 {
            print("================== if ================")
            print(taskAnalysisURLsSelectedTofetch.count)
        } else {
            print("================== else ================")
            print(taskAnalysisURLsSelectedTofetch.count)
            for taskAnalysisURLSelectedTofetch in taskAnalysisURLsSelectedTofetch as [NSManagedObject] {
                let stepTwoURLToPlayVideo = taskAnalysisURLSelectedTofetch.value(forKey: "stepTwoURL")
                print("===========URL=============")
                print(stepTwoURLToPlayVideo!)

                let videoUrl = "file://\(String(describing: stepTwoURLToPlayVideo!))"
                print(videoUrl)
                let videourl = URL(fileURLWithPath: stepTwoURLToPlayVideo! as! String)
                print("#########################")
                print(videourl)

                let player = AVPlayer(url: videourl) // video path coming from the fetch above
                let playerViewController = AVPlayerViewController()
                playerViewController.player = player
                self.present(playerViewController, animated: true) {
                    let playerItem = AVPlayerItem(url: videourl)
                    let playerTwo = AVPlayer(playerItem: playerItem)
                    let playerLayer = AVPlayerLayer(player: playerTwo)
                    playerLayer.frame = self.view.frame
                    self.view.layer.addSublayer(playerLayer)
                    playerViewController.player!.play()
                }
            }
        }
    }
    catch {
        print(error.localizedDescription)
    }
}
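
This question is closed as a duplicate, and the linked answers point at the usual cause: the app's sandbox container path (the UUID segment in /var/mobile/Containers/...) can change between app updates or reinstalls, so an absolute URL stored in Core Data goes stale. Below is a sketch of the common fix, storing only the file name and rebuilding the URL at playback time; presentVideo(named:from:) and the "myvideo.MOV" name are illustrative:

import AVKit
import AVFoundation

// Rebuild the file URL from the stored file name instead of a stored absolute
// path, so it stays valid even if the app's container path changes.
func presentVideo(named fileName: String, from viewController: UIViewController) {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let videoURL = documents.appendingPathComponent(fileName) // e.g. "myvideo.MOV"

    guard FileManager.default.fileExists(atPath: videoURL.path) else {
        print("Video not found at \(videoURL.path)")
        return
    }

    let playerViewController = AVPlayerViewController()
    playerViewController.player = AVPlayer(url: videoURL)
    viewController.present(playerViewController, animated: true) {
        playerViewController.player?.play()
    }
}

The extra AVPlayerLayer built inside the present completion block in the question is also unnecessary; AVPlayerViewController already renders the video for the player assigned to it.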