I'm trying to execute a method when the system volume changes.
I've tried using

DistributedNotificationCenter.default().addObserver(
    self,
    selector: #selector(volumeChanged(_:)),
    name: NSNotification.Name(rawValue: "com.apple.sound.settingsChangedNotification"),
    object: nil
)

but it didn't work.
Well, it does work. But only if the System Preferences app is open.
What's the right way to accomplish this task?
PS: note that this is on macOS, not iOS.
After trying countless approaches, I found a nice workaround: instead of searching for a notification that probably doesn't exist, I capture the physical key-press event.
Since the media keys don't send a normal CGEvent, I came up with this solution: Capture OSX media control buttons in Swift
Note that the Touch Bar simulates the same key event, so any app written this way will also work on MacBook models that have a Touch Bar.
It's probably not the ideal solution, but it works. If anyone knows a better way, please let me know.
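For reference, here is a minimal, untested sketch of decoding these system-defined key events. The key-type constants are copied from IOKit's ev_keymap.h, and MediaKeyApplication is just an illustrative name you would set as the app's principal class. This only sees the keys while your own app is active; for system-wide capture you would need a CGEvent tap for the same event type.

import Cocoa

// Key-type constants from IOKit/hidsystem/ev_keymap.h.
let NX_KEYTYPE_SOUND_UP = 0
let NX_KEYTYPE_SOUND_DOWN = 1
let NX_KEYTYPE_MUTE = 7

class MediaKeyApplication: NSApplication {
    override func sendEvent(_ event: NSEvent) {
        // Media/aux control buttons arrive as system-defined events with subtype 8.
        if event.type == .systemDefined && event.subtype.rawValue == 8 {
            let keyCode = (event.data1 & 0xFFFF_0000) >> 16
            let keyFlags = event.data1 & 0x0000_FFFF
            let keyDown = ((keyFlags & 0xFF00) >> 8) == 0x0A

            if keyDown, [NX_KEYTYPE_SOUND_UP, NX_KEYTYPE_SOUND_DOWN, NX_KEYTYPE_MUTE].contains(keyCode) {
                print("Volume key pressed: \(keyCode)")
            }
        }
        super.sendEvent(event)
    }
}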
I am completely new to iOS and Mac development, and I am trying to implement opening and reading files in a macOS app. By default my app uses SwiftUI. Looking up how to implement this functionality with SwiftUI, I saw suggestions to use UIDocumentPickerViewController. However, I cannot find proper documentation on how to use it in practice. Apple's documentation page is not informative at all -- it doesn't provide any information on how to actually use this class.
Trying to follow some examples I found elsewhere on the Internet, I am now stuck with a "Cannot find UIDocumentPickerViewController in scope" compilation error. I have tried importing UIKit, AppKit, CoreServices, MobileCoreServices and Cocoa, but nothing seems to help -- extending the class as described in another StackOverflow answer just fails with the same compilation error.
How do I properly use UIDocumentPickerViewController, or how do I implement the same functionality using some other method if this one is wrong?
Apparently UIDocumentPickerViewController is not available when building for macOS; NSOpenPanel seems to be the way to get the equivalent functionality.
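For illustration, a minimal NSOpenPanel sketch, assuming a native macOS (AppKit) target and a single UTF-8 text file; the function name is just an example:

import AppKit

// Let the user pick one file and read its contents as text.
func openAndReadFile() {
    let panel = NSOpenPanel()
    panel.canChooseFiles = true
    panel.canChooseDirectories = false
    panel.allowsMultipleSelection = false

    if panel.runModal() == .OK, let url = panel.url {
        do {
            let contents = try String(contentsOf: url, encoding: .utf8)
            print(contents)
        } catch {
            print("Could not read file: \(error)")
        }
    }
}

In a SwiftUI app you can call something like this from a button action, or on newer macOS versions use SwiftUI's own fileImporter modifier instead.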
When I update tilesets on Mapbox, the changes don't appear in the iOS app unless I re-install it. There seems to be documentation on this here: https://docs.mapbox.com/ios/api/maps/5.2.0/Classes/MGLOfflineStorage.html#/c:objc(cs)MGLOfflineStorage(im)setMaximumAmbientCacheSize:withCompletionHandler: but I can't figure out how exactly to implement it.

I don't have an MGLOfflineStorage object because I'm not worried about offline map storage right now; I just want to refresh the cache in the app. There are good examples of how to do this on Android, but not on iOS. Any help is appreciated (preferably in Swift).
It appears to be correct to call these methods on the shared MGLOfflineStorage object. The completion-handler parameter is a closure containing any code you want to execute once the operation finishes.
MGLOfflineStorage.shared.invalidateAmbientCache { error in
    print("Invalidated")
}
Naturally you should check the error in the usual 'safe' way.
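For example, a minimal sketch of that check, assuming a Mapbox iOS SDK version that exposes MGLOfflineStorage.shared and invalidateAmbientCache as above:

import Mapbox

// Invalidate the ambient cache and inspect the optional error on completion.
MGLOfflineStorage.shared.invalidateAmbientCache { error in
    if let error = error {
        print("Failed to invalidate ambient cache: \(error.localizedDescription)")
    } else {
        print("Ambient cache invalidated")
    }
}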
I am using an Oculus DK2 (v0.8) and the OSVR SDK. I'm having a problem getting the HMD to run/display anything.
However, the Oculus samples and the OSVR samples do work, so osvr_server seems to be running fine.
My application itself renders a test scene just fine when not using an HMD.
I tried two approaches:
First, just creating an OSVR context and a DisplayConfig object. This seems to work, but DisplayConfig::checkStartup() fails (I call it in a loop, updating the context while checkStartup keeps failing). I used OpenGLSample.cpp as a guide for this.
Second, I tried using a RenderManager, but the call to createRenderManager results in a crash inside RenderManager.dll. I get the same crash whether I create the graphics library object myself or let the library create it.
I am quite stuck now. Since the demos and examples do work, I have no idea where to look for the error on my side. Creating the context works, and so does querying interfaces, but the crash in createRenderManager is beyond me.
Does anyone have any hints or ideas what the problem could possibly be?
Regards and thanks in advance
pettersson
RenderManager should not crash during open. There have been a couple of bug fixes recently to avoid that happening, and the latest RenderManager binaries, libraries and header files are available with the SDK download from http://osvr.github.io/using/ along with updated copies of the example programs.
When something goes wrong in RenderManager, it usually reports that to standard error. We're moving that to a logging interface, but for now it should show up on the console. Posting an output of that as an issue at https://github.com/sensics/OSVR-RenderManager/issues is a good way to let the developers know that there is a problem. Of course, providing the same sort of information you provided here will be helpful as well.
In my View.designer.cs, my outlets are generated by Xcode.
When I start debugging, I get a null reference exception on my properties when adding bindings; in the code below, this.SampleText is null.
public override void ViewDidLoad()
{
    base.ViewDidLoad();
    this.AddBindings(
        new Dictionary<object, string>()
        {
            { this.SampleText, "{'Text':{'Path':'VMText'}}" }
        });
}
I noticed the following error in the application's output:
"Application windows are expected to have a root view controller at the end of application launch"
What did I miss?
It sounds like you might have a more general problem with your iOS setup - possibly some issues with the XIB file synchronisation between Xcode and MonoTouch.
Before you try to add the MvvmCross binding take a look at the MonoTouch layer - if this.SampleText is null in the MonoTouch layer then you need to solve that before you can add the MvvmCross binding code.
The MonoTouch soft debugger is an excellent tool to help debug this - and the debug cycle with the simulator is pretty quick - so this often helps solve these problems.
From your partial answer to this question, it does sound like you were trying to use a XIB in a Dialog-based UI - that's probably not going to work - I think MonoTouch.Dialog is always based on a single Table, so it's not expecting a XIB (at least, that's my experience!)
As for "Application windows are expected to have a root view controller at the end of application launch" that sounds more like a general problem in your AppDelegate.cs perhaps. Alternatively it might be a problem in the presenter. Which presenter are you using?
That error message itself has a lot of hits on StackOverflow - e.g. Applications are expected to have a root view controller at the end of application launch - but I'm not sure which of these is appropriate to your case right now.
I must admit that not everything is clear to me yet with iOS and MvvmCross.
For coders coming from a C# background I don't think this is unusual - the jump from VS on Windows to MonoDevelop on Mac (with a little Xcode) is a non-trivial leap. It takes more time to switch from WP to iOS development than it does from WP to Droid, regardless of whether you use MvvmCross - if you are doing MT development you are writing native code, so you do have to take some time to understand iOS a bit (in my experience/opinion!).
For my part, I've personally written tens of thousands of lines of code, authored hundreds of blog posts and StackOverflow answers, and presented maybe ten sessions on MvvmCross. I've done this across five distinct operating systems, four of which I've learnt as I've coded, and all of which I've battled against platform and tooling bugs and idiosyncrasies.
I'll continue to post as much as I can - and continue to operate for 'free'
I do also encourage every user to post and blog about their experience too. In this way I hope knowledge will be generated and shared. To anyone who is reading this, using mvvmcross and learning something about cross platform coding - good or bad - then please do consider sharing that knowledge. For inspiration, check out some of the presentations and blog posts users have written - I try to list them on http://slodge.blogspot.co.uk/p/mvvmcross-quicklist.html. Thanks :)
Also, when asking questions, please can you indicate which version of mvvmcross you are using and which sample(s) you're basing your code on - there are differences between master and vNext and there have been fixes over time - so posting this info will help me try to understand and/or replicate the errors you are seeing.
Thanks :)
I know this is a little old, but I was just having the same issue.
My outlet properties in my View.designer.cs file were null when I attempted to access them when ViewDidLoad was called.
It turned out that my .xib file in Visual Studio was no longer set to a Build Action of InterfaceDefinition.
Hopefully, this helps someone else who stumbles upon this issue.
Two questions actually.
First: I know the iPhone lacks auto-redial functionality, but is there any other way to achieve it from an iPhone application? I can place a call from my application, but I can't get it to auto-redial.
Second: before calling, I want to implement a loudspeaker toggle on a button action.
Is there any way to achieve the above two functionalities?
I spent 4-5 hours googling about it and the result is only this.
I went through Apple's docs and found some code here, and also tried this, but I can't figure out the right way to implement the above functions...
Any help would be greatly appreciated!
Thanks!!!
Neither of those actions is possible with the SDK.
For the second case, it's definitely no.
But for the first case, I'm not sure about this, but it can be tried:
Subscribe to CTCallCenter for call-state notifications, and use one of the background application modes (voip, location or audio) or some waiting block that keeps the app out of the suspended state for as long as possible
Open a URL using tel://
If a notification about a call failure comes in, try opening the URL again (a rough sketch follows below)
Once again, this is just an idea and may not work at all :/
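For illustration, here is a minimal Swift sketch of that idea, assuming a hypothetical number and an iOS version where CTCallCenter is still available (it is deprecated in favour of CallKit's CXCallObserver, and iOS will still ask the user to confirm each call, so true auto-redial may simply not be allowed):

import CoreTelephony
import UIKit

final class AutoRedialer {
    private let callCenter = CTCallCenter()
    // Hypothetical number; tel:// hands off to the Phone app with a confirmation prompt.
    private let numberURL = URL(string: "tel://1234567890")!

    func start() {
        // CTCallStateDisconnected fires when a call ends or fails to connect.
        callCenter.callEventHandler = { [weak self] call in
            if call.callState == CTCallStateDisconnected {
                self?.dial()
            }
        }
        dial()
    }

    private func dial() {
        // The event handler runs on a background queue, so hop to the main queue for UIKit.
        DispatchQueue.main.async {
            UIApplication.shared.open(self.numberURL)
        }
    }
}

This will only run while the app is allowed to execute; once it is suspended, neither the handler nor the redial will fire.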