Is it possible to simulate device orientation in a playground? - swift

Given this scenario:
XCPlaygroundPage.currentPage.liveView = TableViewController()
Would it be possible to simulate the device's orientation?

Based on some logic found in the KIF testing framework, I suppose it should be possible.
The Objective-C code behind the link looks like this:
- (void)simulateDeviceRotationToOrientation:(UIDeviceOrientation)orientation
{
    [[UIDevice currentDevice] setValue:[NSNumber numberWithInt:orientation] forKey:@"orientation"];
}
The Swift equivalent of this code would look like this:
func simulateDeviceRotation(toOrientation orientation: UIDeviceOrientation) {
    let orientationValue = NSNumber(integer: orientation.rawValue)
    UIDevice.currentDevice().setValue(orientationValue, forKey: "orientation")
}
And then we'd just call it something like this:
simulateDeviceRotation(toOrientation: .LandscapeLeft)
Or perhaps we want a function that runs some code for each orientation?
func forEachOrientation(block: () -> Void) {
    let orientations: [UIDeviceOrientation] = [.Portrait, .LandscapeLeft, .PortraitUpsideDown, .LandscapeRight]
    for orientation in orientations {
        simulateDeviceRotation(toOrientation: orientation)
        block()
    }
}
And we can just call it like this:
forEachOrientation {
    // do a thing
}
From my experience, this actually doesn't seem to work on iOS 9 simulators, but it does work on iOS 8 simulators. I don't know whether or not this would work on real devices.
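Tying it back to the live view scenario from the question, usage in a playground might look like this (a sketch, reusing the TableViewController from the question and the Swift 2-era XCPlayground API):
import UIKit
import XCPlayground

let controller = TableViewController()
XCPlaygroundPage.currentPage.liveView = controller

forEachOrientation {
    // Inspect or assert on the layout after each simulated rotation.
    print(controller.view.bounds)
}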

How, exactly, do I render Metal on a background thread?

This problem is caused by user interface interactions such as showing the titlebar while in fullscreen. That question's answer provides a solution, but not how to implement that solution.
The solution is to render on a background thread. The issue is that Apple's sample code is written to cover a lot of ground, so most of it is extraneous, and I can't understand it well enough to pare it down, so using Apple's code isn't an option. How would I make a simple Swift Metal game render on a background thread, as concisely as possible?
Take this, for example:
import Cocoa
import MetalKit

class ViewController: NSViewController {
    var metalView: MTKView {
        return view as! MTKView
    }
    let device: MTLDevice = MTLCreateSystemDefaultDevice()!
    override func viewDidLoad() {
        super.viewDidLoad()
        metalView.delegate = self
        metalView.device = device
        metalView.colorPixelFormat = .bgra8Unorm_srgb
        // setup code
    }
}
extension ViewController: MTKViewDelegate {
    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
    }
    func draw(in view: MTKView) {
        // drawing code
    }
}
That is the start of a basic Metal game. What would that code look like if it were rendering on a background thread?
To fix the titlebar bug, I need to render on a background thread. How do I actually do that?
I've noticed this answer suggests manually redrawing 60 times a second, presumably using a loop on a background thread. But that seems... not a clean way to fix it. Is there a cleaner way?
The main trick in getting this to work seems to be setting up the CVDisplayLink. This is awkward in Swift, but doable. After some work I was able to modify the "Game" template in Xcode to use a custom view backed by CAMetalLayer instead of MTKView, and a CVDisplayLink to render in the background, as suggested in the sample code you linked; see below.
Edit Oct 22:
The approach mentioned in this thread seems to work just fine: still using an MTKView, but drawing it manually from the display link callback. Specifically I was able to follow these steps:
Create a new macOS Game project in Xcode.
Modify GameViewController to add a CVDisplayLink, similar to below (see this question for more on using CVDisplayLink from Swift). Start the display link in viewWillAppear and stop it in viewWillDisappear.
Set mtkView.isPaused = true in viewDidLoad to disable automatic rendering, and instead explicitly call mtkView.draw() from the display link callback.
The full content of my modified GameViewController.swift is available here.
I didn't review the Renderer class for thread safety, so I can't be sure no more changes are required, but this should get you up and running.
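As a rough sketch of those steps, assuming the Renderer class and view setup from the stock Game template (this uses CVDisplayLinkSetOutputHandler instead of the C-style callback, purely for brevity):
import Cocoa
import MetalKit

class GameViewController: NSViewController {
    var renderer: Renderer!
    var mtkView: MTKView!
    var displayLink: CVDisplayLink?

    override func viewDidLoad() {
        super.viewDidLoad()
        mtkView = self.view as? MTKView
        mtkView.device = MTLCreateSystemDefaultDevice()
        renderer = Renderer(metalKitView: mtkView)
        mtkView.delegate = renderer
        mtkView.isPaused = true              // disable automatic rendering...
        mtkView.enableSetNeedsDisplay = false
    }

    override func viewWillAppear() {
        super.viewWillAppear()
        CVDisplayLinkCreateWithActiveCGDisplays(&displayLink)
        guard let displayLink = displayLink else { return }
        CVDisplayLinkSetOutputHandler(displayLink) { [weak self] _, _, _, _, _ in
            self?.mtkView.draw()             // ...and draw explicitly from the display link's thread
            return kCVReturnSuccess
        }
        CVDisplayLinkStart(displayLink)
    }

    override func viewWillDisappear() {
        super.viewWillDisappear()
        if let displayLink = displayLink {
            CVDisplayLinkStop(displayLink)
        }
    }
}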
Older implementation with CAMetalLayer instead of MTKView:
This is just a proof of concept and I can't guarantee it's the best way to do everything. You might find these references helpful too:
I didn't try this idea, but given how much convenience MTKView generally provides over CAMetalLayer, it might be worth giving it a shot: https://developer.apple.com/forums/thread/89241?answerId=268384022#268384022
Is drawing to an MTKView or CAMetalLayer required to take place on the main thread?
https://developer.apple.com/documentation/quartzcore/cametallayer/1478157-presentswithtransaction
import Cocoa
import Metal
import QuartzCore

class MyMetalView: NSView {
    var displayLink: CVDisplayLink?
    var metalLayer: CAMetalLayer!

    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        setupMetalLayer()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setupMetalLayer()
    }

    override func makeBackingLayer() -> CALayer {
        return CAMetalLayer()
    }

    func setupMetalLayer() {
        wantsLayer = true
        metalLayer = layer as? CAMetalLayer
        metalLayer.device = MTLCreateSystemDefaultDevice()!
        // ...other configuration of the metalLayer...
    }

    // handle display link callback at 60fps
    static let _outputCallback: CVDisplayLinkOutputCallback = { (displayLink, inNow, inOutputTime, flagsIn, flagsOut, context) -> CVReturn in
        // convert opaque context pointer back into a reference to our view
        let view = Unmanaged<MyMetalView>.fromOpaque(context!).takeUnretainedValue()
        /*** render something into view.metalLayer here! ***/
        return kCVReturnSuccess
    }

    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        guard CVDisplayLinkCreateWithActiveCGDisplays(&displayLink) == kCVReturnSuccess,
              let displayLink = displayLink
        else {
            fatalError("unable to create display link")
        }
        // pass a reference to this view as an opaque pointer
        guard CVDisplayLinkSetOutputCallback(displayLink, MyMetalView._outputCallback, Unmanaged<MyMetalView>.passUnretained(self).toOpaque()) == kCVReturnSuccess else {
            fatalError("unable to configure output callback")
        }
        guard CVDisplayLinkStart(displayLink) == kCVReturnSuccess else {
            fatalError("unable to start display link")
        }
    }

    deinit {
        if let displayLink = displayLink {
            CVDisplayLinkStop(displayLink)
        }
    }
}
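For what goes in the callback, here is a minimal sketch of a render pass that just clears the layer's next drawable to a solid color (the commandQueue parameter is a hypothetical property you would create once from the device):
func renderFrame(into layer: CAMetalLayer, using commandQueue: MTLCommandQueue) {
    guard let drawable = layer.nextDrawable() else { return }
    let passDescriptor = MTLRenderPassDescriptor()
    passDescriptor.colorAttachments[0].texture = drawable.texture
    passDescriptor.colorAttachments[0].loadAction = .clear
    passDescriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
    passDescriptor.colorAttachments[0].storeAction = .store
    guard let commandBuffer = commandQueue.makeCommandBuffer(),
          let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: passDescriptor) else { return }
    // Real draw calls would be encoded here; this sketch only clears.
    encoder.endEncoding()
    commandBuffer.present(drawable)
    commandBuffer.commit()
}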

Unsupported IOSurface format: 0x26424741 using twilio video in scenekit

I am using Twilio to send video and use that video in SceneKit as a texture. It works fine on iPhone X, but on iPhone XR and XS it gives this error: Unsupported IOSurface format: 0x26424741.
This is what I am doing:
Get Video:
func subscribed(to videoTrack: TVIRemoteVideoTrack, publication: TVIRemoteVideoTrackPublication, for participant: TVIRemoteParticipant) {
    print("Participant \(participant.identity) added a video track.")
    let remoteView = TVIVideoView.init(frame: UIWindow().frame, delegate: self)
    videoTrack.addRenderer(remoteView!)
    delegate.participantAdded(with: remoteView!)
}
delegate:
func participantAdded(with videoView: UIView) {
    sceneView.addVideo(with: videoView)
}
and add video to plane:
func addVideo(with view: UIView) {
    videoPlane.geometry?.firstMaterial?.diffuse.contents = view
}
The problem was actually with the renderingType of the remoteView. For older devices Metal was fine, but the newer devices needed OpenGL ES. I don't know why, but that was the fix.
I used this solution to find out the device type.
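For reference, the core of that solution is reading the hardware model identifier and mapping it to a device name. A minimal sketch of the idea (the modelIdentifier name is mine; the identifier strings follow Apple's published hardware identifiers):
import UIKit

extension UIDevice {
    // Hardware model string, e.g. "iPhone11,2" (XS), "iPhone11,4"/"iPhone11,6" (XS Max), "iPhone11,8" (XR).
    var modelIdentifier: String {
        var systemInfo = utsname()
        uname(&systemInfo)
        return Mirror(reflecting: systemInfo.machine).children.reduce("") { identifier, element in
            guard let value = element.value as? Int8, value != 0 else { return identifier }
            return identifier + String(UnicodeScalar(UInt8(value)))
        }
    }
}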
Next I determined which renderingType to use
var renderingType: VideoView.RenderingType {
    let device = UIDevice()
    switch device.type {
    case .iPhoneXS, .iPhoneXR, .iPhoneXSMax:
        return .openGLES
    default:
        return .metal
    }
}
And used it to initialize remoteView
func didSubscribeToVideoTrack(videoTrack: RemoteVideoTrack, publication: RemoteVideoTrackPublication, participant: RemoteParticipant) {
    print("Participant \(participant.identity) added a video track.")
    let remoteView = VideoView.init(frame: UIWindow().frame, delegate: self, renderingType: renderingType)
    videoTrack.addRenderer(remoteView!)
    delegate.participantAddedVideo(for: participant.identity, with: remoteView!)
}

How to animate NSSlider value change in Cocoa application using Swift

How do I animate NSSlider value change so it looks continuous?
I tried using NSAnimation context
private func moveSlider(videoTime: VLCTime) {
    DispatchQueue.main.async { [weak self] in
        NSAnimationContext.beginGrouping()
        NSAnimationContext.current.duration = 1
        self?.playerControls.seekSlider.animator().intValue = videoTime.intValue
        NSAnimationContext.endGrouping()
    }
}
My NSSlider still does not move smoothly.
To put you into the picture, I am trying to make a video player which uses NSSlider for scrubbing through the movie. That slider should also move as the video goes on. As I said, it does move but I can not get it to move smoothly.
There is Apple sample code in Objective-C for exactly what you are looking for. Below is how it would look in Swift.
Basically you need an extension on NSSlider
extension NSSlider {
    override open class func defaultAnimation(forKey key: NSAnimatablePropertyKey) -> Any? {
        if key == "floatValue" {
            return CABasicAnimation()
        } else {
            return super.defaultAnimation(forKey: key)
        }
    }
}
Then you can simply use something like this in your move slider function.
private func moveSlider(videoTime: Float) {
    NSAnimationContext.beginGrouping()
    NSAnimationContext.current.duration = 0.5
    slider.animator().floatValue = videoTime
    NSAnimationContext.endGrouping()
}

ARKit and AVCamera simultaneously

As there is no autofocus in ARKit, I wanted to load ARKit in a view that takes half the screen, with the second half running an AVFoundation camera.
Is it possible to load the AVFoundation camera and ARKit simultaneously in the same app?
Thanks.
Nope.
ARKit uses AVCapture internally (as explained in the WWDC talk introducing ARKit). Only one AVCaptureSession can be running at a time, so if you run your own capture session it’ll suspend ARKit’s session (and break tracking).
Update: However, in iOS 11.3 (aka "ARKit 1.5"), ARKit enables autofocus by default, and you can choose to disable it with the isAutoFocusEnabled option.
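For example, to restore the old fixed-focus behavior (a minimal sketch; isAutoFocusEnabled is available on ARWorldTrackingConfiguration from iOS 11.3):
let configuration = ARWorldTrackingConfiguration()
configuration.isAutoFocusEnabled = false // autofocus is on by default in iOS 11.3+
sceneView.session.run(configuration)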
Changing the camera focus would disrupt the tracking - so this is definitely not possible (right now at least).
Update: See @rickster's answer above.
I managed to use AVFoundation with ARKit by calling self.sceneView.session.run(self.sceneView.session.configuration!) right after taking the photo.
Use self.captureSession?.stopRunning() right after taking the photo to make the ARKit session resume faster.
self.takePhoto()
self.sceneView.session.run(self.sceneView.session.configuration!)
// These are assumed to be instance properties on the containing class:
// var captureSession: AVCaptureSession!
// var cameraOutput: AVCapturePhotoOutput!
func startCamera() {
    captureSession = AVCaptureSession()
    captureSession.sessionPreset = AVCaptureSession.Preset.photo
    cameraOutput = AVCapturePhotoOutput()
    if let device = AVCaptureDevice.default(for: .video),
       let input = try? AVCaptureDeviceInput(device: device) {
        if captureSession.canAddInput(input) {
            captureSession.addInput(input)
            if captureSession.canAddOutput(cameraOutput) {
                captureSession.addOutput(cameraOutput)
                captureSession.startRunning()
            }
        } else {
            print("issue here: captureSession.canAddInput")
        }
    } else {
        print("some problem here")
    }
}
func takePhoto() {
    startCamera()
    let settings = AVCapturePhotoSettings()
    cameraOutput.capturePhoto(with: settings, delegate: self)
    self.captureSession?.stopRunning()
}

Adjust NSVisualEffectView blur radius and transparency

Is it possible to adjust the blur radius and transparency of an NSVisualEffectView when it's applied to an NSWindow (Swift or Objective-C)? I tried all variations of NSVisualEffectMaterial (dark, medium, light) - but that's not cutting it. In the image below I've used Apple's non-public API with CGSSetWindowBackgroundBlurRadius on the left, and NSVisualEffectView on the right.
I'm trying to achieve the look of what's on the left, but it seems I'm relegated to use the methods of the right.
Here's my code:
blurView.blendingMode = NSVisualEffectBlendingMode.BehindWindow
blurView.material = NSVisualEffectMaterial.Medium
blurView.state = NSVisualEffectState.Active
self.window!.contentView!.addSubview(blurView)
Possibly, related - but doesn't answer my question:
OS X NSVisualEffect decrease blur radius? - no answer
Although I wouldn't recommend this unless you are prepared for it to stop working in a future release, you can subclass NSVisualEffectView with the following to do what you want:
- (void)updateLayer
{
    [super updateLayer];
    [CATransaction begin];
    [CATransaction setDisableActions:YES];
    CALayer *backdropLayer = self.layer.sublayers.firstObject;
    if ([backdropLayer.name hasPrefix:@"kCUIVariantMac"]) {
        for (CALayer *activeLayer in backdropLayer.sublayers) {
            if ([activeLayer.name isEqualToString:@"Active"]) {
                for (CALayer *sublayer in activeLayer.sublayers) {
                    if ([sublayer.name isEqualToString:@"Backdrop"]) {
                        for (id filter in sublayer.filters) {
                            if ([filter respondsToSelector:@selector(name)] && [[filter name] isEqualToString:@"blur"]) {
                                if ([filter respondsToSelector:@selector(setValue:forKey:)]) {
                                    [filter setValue:@5 forKey:@"inputRadius"];
                                }
                            }
                        }
                    }
                }
            }
        }
    }
    [CATransaction commit];
}
Although this doesn't use private APIs per se, it does start to dig into layer hierarchies which you do not own, so be sure to double check that what you are getting back is what you expect, and fail gracefully if not. For instance, on 10.10 Yosemite, the Backdrop layer was a direct descendant of the visual effect view, so things are likely to change in the future.
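If you'd rather stay in Swift, a rough translation of the same subclass might look like this (same caveats: the layer names and filter keys are private implementation details, so verify each step and fail gracefully):
import Cocoa

class AdjustableBlurEffectView: NSVisualEffectView {
    override func updateLayer() {
        super.updateLayer()
        CATransaction.begin()
        CATransaction.setDisableActions(true)
        if let backdropLayer = layer?.sublayers?.first,
           backdropLayer.name?.hasPrefix("kCUIVariantMac") == true {
            for activeLayer in backdropLayer.sublayers ?? [] where activeLayer.name == "Active" {
                for sublayer in activeLayer.sublayers ?? [] where sublayer.name == "Backdrop" {
                    for case let filter as NSObject in sublayer.filters ?? []
                    where filter.value(forKey: "name") as? String == "blur" {
                        // Private filter key; this may stop working in a future release.
                        filter.setValue(5, forKey: "inputRadius")
                    }
                }
            }
        }
        CATransaction.commit()
    }
}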
I had the same issue as you, and I solved it with a little trick that seems to do the job I wanted. I hope it helps you too.
So in my case, I added the NSVisualEffectView in the storyboard and configured its properties and position in the view hierarchy there.
All it takes to reduce the blur is this, in the NSViewController:
override func viewWillAppear() {
    super.viewWillAppear()
    // Adds transparency to the app
    view.window?.isOpaque = false
    view.window?.alphaValue = 0.98 // tweak the alphaValue for your desired effect
}
Going with your example, this code should work in combination with the tweak above:
let blurView = NSVisualEffectView(frame: view.bounds)
blurView.blendingMode = .behindWindow
blurView.material = .fullScreenUI
blurView.state = .active
view.window?.contentView?.addSubview(blurView)
Swift 5 code
For anyone interested, here is a link to my repo, where I created a small prank app that uses the code above:
GitHub link