Disable camera view growing animation - swift

Currently, I have a view controller which modally presents a view controller containing a camera. However, whenever I transition, the preview layer animates: it circularly grows from the top-left corner to fill the rest of the screen. I've tried disabling CALayer implicit animations, but with no success. Here's the code that runs when the view appears.
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    previewLayer?.frame = self.view.frame
}
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    capturedImageView.center = self.view.center
    captureSession = AVCaptureSession()
    if usingFrontCamera {
        captureSession?.sessionPreset = AVCaptureSession.Preset.hd1920x1080
    } else {
        captureSession?.sessionPreset = AVCaptureSession.Preset.hd1280x720
    }
    captureDevice = AVCaptureDevice.default(for: AVMediaType.video)
    do {
        let input = try AVCaptureDeviceInput(device: captureDevice!)
        if captureSession?.canAddInput(input) == true {
            captureSession?.addInput(input)
            stillImageOutput = AVCapturePhotoOutput()
            captureSession?.addOutput(stillImageOutput!)
            previewLayer = AVCaptureVideoPreviewLayer(session: captureSession!)
            previewLayer?.videoGravity = AVLayerVideoGravity.resizeAspect
            self.view.layer.addSublayer(previewLayer!)
            captureSession?.startRunning()
        }
    } catch {
        print(error)
    }
}
Is there any way to remove this growing animation?

When you change the layer frame, there is an implicit animation. You can use CATransaction to disable the animation.
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    CATransaction.begin()
    CATransaction.setDisableActions(true)
    previewLayer?.frame = self.view.frame
    CATransaction.commit()
}

You are doing things in two stages. In viewWillAppear, you add the preview layer without giving it any size at all, so it is a zero size layer at the zero origin:
previewLayer = AVCaptureVideoPreviewLayer(session: captureSession!)
previewLayer?.videoGravity = AVLayerVideoGravity.resizeAspect
self.view.layer.addSublayer(previewLayer!)
Then later, in viewDidAppear, you grow the preview layer by giving it an actual frame:
previewLayer?.frame = self.view.frame
The two stages happen in that order and we are able to see the jump caused by the change in the frame of the preview layer.
If you don't want to see a jump, don't do that. Don't add the preview layer until you can give it its actual frame first.
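A sketch of that approach, using the same properties as the question's code: give the layer its final frame before adding it to the layer tree, so there is no later frame change to animate.

```swift
previewLayer = AVCaptureVideoPreviewLayer(session: captureSession!)
previewLayer?.videoGravity = AVLayerVideoGravity.resizeAspect
// Set the final frame *before* the layer joins the layer tree,
// so no implicit animation is triggered by a later frame change.
previewLayer?.frame = self.view.bounds
self.view.layer.addSublayer(previewLayer!)
```

With this, the viewDidAppear frame assignment (and the CATransaction workaround) becomes unnecessary.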


Playback buttons do not appear in an AVPlayer after adding a layer - iOS 16

I am using an AVPlayer to present a video. The app has only one .mp4 but for a different use case, the same video needs to get flipped.
The buttons are there and completely functional: you can press the play and 15-second forward/backward buttons, but they don't appear on screen (fourth video in the attached image).
The issue seems to be that the flip layer I am adding overlays the new layout buttons.
The potential fix I was thinking of is to flip the video before adding it to the player.
Do you know if there is a straightforward solution for this?
Maybe there is an easy way to keep the iOS 15 playback button layout?
The code the app is using to flip the video is as follows:
@IBAction func pressButton(_ sender: Any) {
    guard let path = Bundle.main.path(forResource: "sample-5s", ofType: "mp4") else {
        return
    }
    let avPlayer = AVPlayer(url: URL(fileURLWithPath: path))
    let avPlayerController = AVPlayerViewController()
    present(avPlayerController, animated: true, completion: {
        let flippedLayer = AVPlayerLayer(player: avPlayer)
        let transform = CGAffineTransform(scaleX: -1.0, y: 1.0)
        flippedLayer.frame = avPlayerController.view.frame
        flippedLayer.setAffineTransform(transform)
        avPlayerController.view.layer.addSublayer(flippedLayer)
        avPlayerController.player = avPlayer
        avPlayer.play()
    })
}
Found a solution. It's all about adding the AVPlayerViewController as a child of the parent view controller. Assume you have an AVPlayerViewController inside a view controller; first, in viewDidLoad, make sure to add it as a child.
var playerController = AVPlayerViewController()

lazy var playerBaseView: UIView = {
    let view = UIView()
    view.backgroundColor = .clear
    return view
}()

override func viewDidLoad() {
    super.viewDidLoad()
    view.addSubview(playerBaseView)
    // Make your player parent view constraints or do it on Storyboard (SnapKit shown here)
    playerBaseView.snp.makeConstraints { (make) in
        make.edges.equalTo(view.safeAreaLayoutGuide.snp.edges)
    }
    addChild(playerController) // Add as a child
    playerBaseView.addSubview(playerController.view)
    // Make your playerController view constraints or do it on Storyboard
    playerController.view.snp.makeConstraints { (make) in
        make.edges.equalToSuperview()
    }
    playerController.didMove(toParent: self) // Notify the child it was added
}
For SwiftUI, wrapping AVPlayerViewController in a UIViewControllerRepresentable and applying .compositingGroup() fixed the problem for me on iOS 16:

struct CustomVideoPlayer: UIViewControllerRepresentable {
    var player: AVPlayer
    func makeUIViewController(context: UIViewControllerRepresentableContext<CustomVideoPlayer>) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        return controller
    }
    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: UIViewControllerRepresentableContext<CustomVideoPlayer>) {
    }
}

CustomVideoPlayer(player: player)
    .compositingGroup()

Add alpha to parentVC.view

I am trying to add alpha to the background view when a button is tapped. So far I have managed to add blur, but not alpha.
How can I add alpha to the background so that when the bottom sheet appears, the background becomes darker and disabled?
let maxDimmedAlpha: CGFloat = 0.2

lazy var dimmedView: UIView = {
    let view = UIView()
    view.backgroundColor = .black
    view.alpha = maxDimmedAlpha
    return view
}()

@objc func shareBtnClick() {
    dimmedView.frame = self.parentVC.view.bounds
    dimmedView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    self.parentVC.view.addSubview(dimmedView)
    if self.parentVC.navigationController != nil {
        if self.parentVC.navigationController?.viewControllers.count == 1 {
            showBottomSheet()
        } else {
            NotificationCenter.default.post(name: NSNotification.Name("ShowBottomSheet"), object: nil, userInfo: ["itemId": modalSheet(), "delegate": self])
        }
    } else {
        showBottomSheet()
    }
}

func showBottomSheet() {
    let modalSheet = MainBottomSheet()
    modalSheet.data = self.modalSheet()
    modalSheet.delegate = self
    modalSheet.modalPresentationStyle = .overCurrentContext
    self.parentVC.present(modalSheet, animated: true)
}
I was able to reproduce the dimmed effect using this code in Xcode. I'm not sure why it won't work in your project, but there is an easy way to debug this.
I suggest using Debug View Hierarchy, one of Xcode's best tools in my opinion. It lets you separate every layer of the user interface, so you can check whether your dimmedView is actually being added to the parent view and whether its frame matches the parent view's bounds.
Keep in mind that if your background is dark, you won't see the dimmedView, because its backgroundColor is set to UIColor.black.
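One small variation worth trying while debugging (a suggestion, not something verified in the asker's project): put the transparency in the background color itself rather than in view.alpha, so the dimming view stays fully opaque and any subviews added to it later are not faded along with it.

```swift
lazy var dimmedView: UIView = {
    let view = UIView()
    // Semi-transparent black background; the view itself keeps alpha 1.0,
    // so subviews do not inherit the transparency.
    view.backgroundColor = UIColor.black.withAlphaComponent(maxDimmedAlpha)
    return view
}()
```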
Debug View Hierarchy button

ARKit Image Detection and Add Image From Assets.xcassets

I am playing around with the code I downloaded from the Apple Developer site on AR image detection. I am trying to modify it so that a specific image from the AR Resources folder in Resources/Assets.xcassets is shown once an image is detected. I see a similar question was posted two years ago, and I tried its one and only answer, but with no success. Can anyone help? Thank you!
import ARKit
import SceneKit
import UIKit
class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!
    @IBOutlet weak var blurView: UIVisualEffectView!

    /// The view controller that displays the status and "restart experience" UI.
    lazy var statusViewController: StatusViewController = {
        return children.lazy.compactMap({ $0 as? StatusViewController }).first!
    }()

    /// A serial queue for thread safety when modifying the SceneKit node graph.
    let updateQueue = DispatchQueue(label: Bundle.main.bundleIdentifier! + ".serialSceneKitQueue")

    /// Convenience accessor for the session owned by ARSCNView.
    var session: ARSession {
        return sceneView.session
    }

    // MARK: - View Controller Life Cycle

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.session.delegate = self
        // Hook up status view controller callback(s).
        statusViewController.restartExperienceHandler = { [unowned self] in
            self.restartExperience()
        }
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Prevent the screen from being dimmed to avoid interrupting the AR experience.
        UIApplication.shared.isIdleTimerDisabled = true
        // Start the AR experience.
        resetTracking()
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        session.pause()
    }

    // MARK: - Session management (Image detection setup)

    /// Prevents restarting the session while a restart is in progress.
    var isRestartAvailable = true

    /// Creates a new AR configuration to run on the `session`.
    /// - Tag: ARReferenceImage-Loading
    func resetTracking() {
        guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
            fatalError("Missing expected asset catalog resources.")
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = referenceImages
        session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
        statusViewController.scheduleMessage("Look around to detect images", inSeconds: 7.5, messageType: .contentPlacement)
    }

    // MARK: - ARSCNViewDelegate (Image detection results)

    /// - Tag: ARImageAnchor-Visualizing
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let referenceImage = imageAnchor.referenceImage
        updateQueue.async {
            // Create a plane to visualize the initial position of the detected image.
            let plane = SCNPlane(width: referenceImage.physicalSize.width,
                                 height: referenceImage.physicalSize.height)
            let planeNode = SCNNode(geometry: plane)
            planeNode.opacity = 0.25
            /*
             `SCNPlane` is vertically oriented in its local coordinate space, but
             `ARImageAnchor` assumes the image is horizontal in its local space, so
             rotate the plane to match.
             */
            planeNode.eulerAngles.x = -.pi / 2
            /*
             Image anchors are not tracked after initial detection, so create an
             animation that limits the duration for which the plane visualization appears.
             */
            planeNode.runAction(self.imageHighlightAction)
            plane.materials = [SCNMaterial()]
            plane.materials[0].diffuse.contents = UIImage(named: "Macbook 12-inch")
            // Add the plane visualization to the scene.
            node.addChildNode(planeNode)
            DispatchQueue.main.async {
                let imageName = referenceImage.name ?? ""
                self.statusViewController.cancelAllScheduledMessages()
                self.statusViewController.showMessage("Detected image “\(imageName)”")
            }
        }
    }

    var imageHighlightAction: SCNAction {
        return .sequence([
            .wait(duration: 0.25),
            .fadeOpacity(to: 0.85, duration: 0.25),
            .fadeOpacity(to: 0.15, duration: 0.25),
            .fadeOpacity(to: 0.85, duration: 0.25),
            .fadeOut(duration: 0.5),
            .removeFromParentNode()
        ])
    }
}
In Xcode's asset catalog, click the + button and create a folder for reference images (use .png or .jpg formats). In Xcode's directories this folder gets the .arresourcegroup extension.
// AR and Textures –> AR Resource Group
You can rename this folder. There's no need to put hi-res images inside it; a resolution around 400x400 is appropriate for each image, and you should put no more than 100 images in a group. That's all.
Your code may look like this:
guard let images = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                    bundle: nil)
else { return }

let config = ARWorldTrackingConfiguration()
config.detectionImages = images
config.maximumNumberOfTrackedImages = 3

arView.session.run(config, options: [])
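Since the question specifically asks how to show a particular image from Assets.xcassets once a reference image is detected, a minimal sketch of the delegate callback might look like this. The reference image name "photo1" and the asset names "overlay1" / "defaultOverlay" are placeholders; substitute your own.

```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    let referenceImage = imageAnchor.referenceImage

    // Plane matching the physical size of the detected image.
    let plane = SCNPlane(width: referenceImage.physicalSize.width,
                         height: referenceImage.physicalSize.height)

    // Choose an overlay image from the asset catalog based on which
    // reference image was detected (names here are placeholders).
    let material = SCNMaterial()
    switch referenceImage.name {
    case "photo1":
        material.diffuse.contents = UIImage(named: "overlay1")
    default:
        material.diffuse.contents = UIImage(named: "defaultOverlay")
    }
    plane.materials = [material]

    let planeNode = SCNNode(geometry: plane)
    // SCNPlane is vertical in its local space; the image anchor is horizontal.
    planeNode.eulerAngles.x = -.pi / 2
    node.addChildNode(planeNode)
}
```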

Swift 4 AppStore like transition

Trying to get this transition https://www.youtube.com/watch?v=4TVnRx7PTTs
The first VC has a card-like image, and the second one is a table view with the same image as a square in the first cell.
let storyboard = self.storyboard?.instantiateViewController(withIdentifier: "vc2")
let firstClassView = self.view
let secondClassView = storyboard?.view

secondClassView?.frame = self.card.frame

if let window = UIApplication.shared.keyWindow {
    window.insertSubview(secondClassView!, aboveSubview: firstClassView!)
    UIView.animate(withDuration: 0.5, animations: { () -> Void in
        secondClassView?.frame = self.view.frame
    }, completion: { (finished) -> Void in
        self.present(storyboard!, animated: false, completion: nil)
    })
}
So I am adding the second VC's view to the first VC with the frame of the card image and animating it to the full frame of the view. But the card image is not square, so it doesn't look as good as the App Store. Any suggestions?
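One alternative to animating the second view controller's view by hand is a custom presentation via UIViewControllerAnimatedTransitioning, which is roughly how App Store-style card transitions are usually built. A minimal sketch follows; the cardFrame value (the card's frame in window coordinates) and the corner radius are assumptions, and you would still set modalPresentationStyle = .custom and the transitioningDelegate on the presented controller.

```swift
// Minimal present animator: grows the destination view from the card's
// frame to its final full-screen frame, rounding corners like a card.
final class CardPresentAnimator: NSObject, UIViewControllerAnimatedTransitioning {
    let cardFrame: CGRect
    init(cardFrame: CGRect) { self.cardFrame = cardFrame }

    func transitionDuration(using context: UIViewControllerContextTransitioning?) -> TimeInterval {
        return 0.5
    }

    func animateTransition(using context: UIViewControllerContextTransitioning) {
        guard let toVC = context.viewController(forKey: .to),
              let toView = context.view(forKey: .to) else { return }
        context.containerView.addSubview(toView)

        // Start at the card's frame, clipped with rounded corners.
        toView.frame = cardFrame
        toView.layer.cornerRadius = 12
        toView.clipsToBounds = true

        UIView.animate(withDuration: transitionDuration(using: context), animations: {
            toView.frame = context.finalFrame(for: toVC)
            toView.layer.cornerRadius = 0
        }, completion: { _ in
            context.completeTransition(!context.transitionWasCancelled)
        })
    }
}
```

This keeps the presentation interactive and cancellable in ways the manual keyWindow approach cannot, and the destination view is laid out at its real size before being animated.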

Why videoview width size is half?

When I run my camera app in Xcode, the width is not full: the camera view does not fill the UIView.
Please help me! T_T
Also, I can't speak English well, so asking questions is very difficult.
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    captureSession = AVCaptureSession()
    stillImageOutput = AVCapturePhotoOutput()
    captureSession.sessionPreset = AVCaptureSessionPreset1920x1080
    let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
    do {
        let input = try AVCaptureDeviceInput(device: device)
        if captureSession.canAddInput(input) {
            captureSession.addInput(input)
            if captureSession.canAddOutput(stillImageOutput) {
                captureSession.addOutput(stillImageOutput)
                captureSession.startRunning()
                previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
                previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait
                cameraView.layer.addSublayer(previewLayer!)
                previewLayer?.position = CGPoint(x: self.cameraView.frame.width / 2, y: self.cameraView.frame.height / 2)
                previewLayer?.bounds = cameraView.bounds
            }
        }
    } catch {
        print(error)
    }
}
If I use your code and add the previewLayer to my main view, I get a full-size camera view in portrait but a half-size camera view in landscape.
So the issue might be in how you have set up the cameraView, or it could be something else; I can't be certain. Check the size and bounds of your cameraView to see how it is set up.
Also, the videoGravity property of previewLayer controls how the preview is displayed. You have it set to AVLayerVideoGravityResizeAspect, which fits the video within the layer's bounds. If the cameraView is set up correctly, you can try a different setting like AVLayerVideoGravityResizeAspectFill to see if that gives you the result you want.
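As a sketch of that last suggestion, using the same Swift 3-era API names as the question (cameraView and previewLayer are the asker's properties):

```swift
// Fill the layer's bounds, cropping the video if the aspect ratios differ.
previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill

// Re-apply the layer's frame after Auto Layout has sized cameraView,
// so the preview always matches the view's final bounds in any orientation.
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    previewLayer?.frame = cameraView.bounds
}
```

Sizing the layer in viewDidLayoutSubviews rather than viewWillAppear also avoids reading the view's frame before layout has run, which is one common cause of a half-width preview.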