I have a button hooked up to a Raspberry Pi that is running a Node server. I am using Socket.IO to interface with that server. The Node server works well and emits an event to the phone when the button is pressed.
All I want to do is have a label in the view that says "Button is Currently Pressed" or "Button is not Currently Pressed". I want to do this asynchronously in Swift, so I can add other features later on that won't interfere with checking the button state. I haven't found any good leads on Google.
Thanks for any help or suggestions!
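For what it's worth, here is a minimal sketch of what the listening side could look like with the Socket.IO-Client-Swift library. The server URL and the event name ("buttonState", carrying a Bool) are assumptions for illustration; adjust them to whatever your Node server actually emits.

```swift
import UIKit
import SocketIO

class ButtonViewController: UIViewController {
    @IBOutlet weak var stateLabel: UILabel!

    // "raspberrypi.local:3000" and the "buttonState" event are assumptions;
    // match them to your own server.
    let manager = SocketManager(socketURL: URL(string: "http://raspberrypi.local:3000")!,
                                config: [.log(false), .compress])

    override func viewDidLoad() {
        super.viewDidLoad()
        let socket = manager.defaultSocket

        // The handler fires asynchronously whenever the server emits,
        // so other features in the app won't interfere with it.
        socket.on("buttonState") { [weak self] data, _ in
            guard let pressed = data.first as? Bool else { return }
            DispatchQueue.main.async {
                self?.stateLabel.text = pressed
                    ? "Button is Currently Pressed"
                    : "Button is not Currently Pressed"
            }
        }
        socket.connect()
    }
}
```

The key point is that `socket.on` registers a callback rather than polling, and all UI updates are dispatched back to the main queue.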
I'm using the AudioKit framework to implement MIDI in one of my hobby projects. In this project, I'm trying to make an app that has navigation buttons (left, right, up, down) and a play button (just like the Ableton Push MIDI controller has).
To make them work, I first recorded the MIDI data that comes out of the Push to map all the keys. I then used AudioKit's MIDI Utility as a starter and sent note values from the app to Ableton Live, where they successfully triggered sounds. (I kept the channel at 0.)
Now, I'm trying to replicate the CC functionality of the arrow keys, which are CC 54, CC 55, CC 62, and CC 63, plus CC 85 for Play. When I send this CC data using MIDI Utility, it reaches Ableton successfully (I can see the light feedback), but it simply does not do what the Ableton Push hardware controller would have done.
Am I missing something significant?
I also verified that when a button is pressed the value goes to 127, and on release it goes back to 0. Despite replicating this, it still doesn't work.
This problem is not really about AudioKit. But someone who understands how MIDI channels and message routing work with Ableton Push might be able to help me.
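For reference, this is roughly what sending a single CC message looks like at the CoreMIDI level (which is what AudioKit wraps underneath). The client/port names and the choice of destination index 0 are assumptions for illustration, not part of the original setup.

```swift
import CoreMIDI

// Sketch: send one Control Change message with raw CoreMIDI.
var client = MIDIClientRef()
MIDIClientCreate("CCDemo" as CFString, nil, nil, &client)

var outPort = MIDIPortRef()
MIDIOutputPortCreate(client, "CCDemoOut" as CFString, &outPort)

// Status 0xB0 = Control Change on channel 1.
// CC 85 is the Play button in the question; 127 = pressed, 0 = released.
var packet = MIDIPacket()
packet.timeStamp = 0
packet.length = 3
packet.data.0 = 0xB0   // control change, channel 1
packet.data.1 = 85     // controller number (Play)
packet.data.2 = 127    // value: 127 on press

var packetList = MIDIPacketList(numPackets: 1, packet: packet)

// Send to the first available destination, if one exists.
if MIDIGetNumberOfDestinations() > 0 {
    let dest = MIDIGetDestination(0)
    MIDISend(outPort, dest, &packetList)
}
```

Note that sending the bytes correctly (which the light feedback suggests is already happening) is separate from Ableton *interpreting* them as Push commands, which is the crux of the answer below.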
Is Ableton identifying your controller as a Push? Ableton uses special control-surface scripts to deal with various controllers (they're written in Python, and if you hunt around you can find examples). That's likely both the problem and the solution: the script is not recognizing your software as a Push. However, it may be possible to create a new device profile in Python, which would give you the flexibility to really get in there and tweak.
I have an iOS shopping list app where items are added and displayed in a table view. I want to create a Watch App Extension, and I'm wondering which is the best method to use in this case: updateApplicationContext(_:) or sendMessage(_:replyHandler:errorHandler:). I was reading the documentation, but I'm a little confused since both seem to work.
Here is the functionality I’m expecting to have…
What I want is to be able to add items in the iOS app even if the Watch app is off, which is normal behavior, BUT I want the Watch app to update with whatever is in the table view (in iOS) as soon as it is turned on, even if the iPhone is not reachable at that moment.
In other words, I want the data in the iOS app to always be in sync with the Watch app.
Which is the best function call to use in this case, the updateApplicationContext(_:) method or the sendMessage(_:replyHandler:errorHandler:) method?
Thanks
If it were me, I would use updateApplicationContext(_:), since you want the watch to receive the latest state in the background as soon as it connects.
As for sendMessage(_:replyHandler:errorHandler:), the downside is that "The isReachable property must currently be true for these methods to succeed," so you might see a slight delay before you can update your UI while you wait for the counterpart to become reachable and ask for updates.
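To make the suggestion concrete, here is a minimal sketch of the application-context approach with WCSession. The "items" key and the places marked for UI updates are assumptions for illustration.

```swift
import WatchConnectivity

// Sketch: keep a shopping list in sync via updateApplicationContext(_:).
// Only the most recent context is retained by the system, which is exactly
// the "latest state wins" behavior a shopping list needs.
final class ListSyncManager: NSObject, WCSessionDelegate {
    static let shared = ListSyncManager()

    func activate() {
        guard WCSession.isSupported() else { return }
        WCSession.default.delegate = self
        WCSession.default.activate()
    }

    // iOS side: call this whenever the table view's data changes.
    // The update is queued even if the watch app is not running.
    func push(items: [String]) {
        do {
            try WCSession.default.updateApplicationContext(["items": items])
        } catch {
            print("Context update failed: \(error)")
        }
    }

    // Watch side: delivered in the background, even when the update
    // was sent while the watch app was off.
    func session(_ session: WCSession,
                 didReceiveApplicationContext applicationContext: [String: Any]) {
        if let items = applicationContext["items"] as? [String] {
            // Refresh the watch UI with `items` here.
            print("Received \(items.count) items")
        }
    }

    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}

    #if os(iOS)
    func sessionDidBecomeInactive(_ session: WCSession) {}
    func sessionDidDeactivate(_ session: WCSession) { session.activate() }
    #endif
}
```

Unlike sendMessage, none of this requires isReachable to be true at the moment you send.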
I submitted a background audio app for certification and it failed for two reasons that I could not figure out.
Reason 1:
This app failed to correctly respond to at least one of the play,
pause, or play/pause events.
I understand that the MediaControl events for Play, Pause, Stop, and PlayPause need to be handled, and I have done so in code (and tested on both tablets and local devices that they work). However, because stopping a media stream and restarting it takes longer than expected, I used MediaElement.Pause() for both "Pause" and "Stop".
I read another post from someone who had a similar problem at the certification phase. Somebody recommended using MediaElement.PlaybackRate = 0; instead. However, this is not ideal for long pauses, as the stream will not move on.
What I want to know is: am I doing this the right way? For all my MediaControl events I have made sure that the MediaControl.IsPlaying property is set correctly as well.
Also, another reason it failed was this:
App failed the Perf test in the Windows ACK. See the following links
for more information: Test cases ran:
http://msdn.microsoft.com/en-us/library/windows/apps/hh920274.aspx
I have run my app against the ACK and everything passed. The only thing I can think of is that the app does not enter suspended mode when the hardware (or on-screen) media control pause button is pressed. I set a breakpoint in the App_Suspending event handler, but it never hits.
As the description is too vague, I am not sure this is the problem. But if it is, how do I force the app to enter suspended mode? I tried looking in the Window.Current and Application.Current classes, but to no avail.
Thanks!
For your first issue, make sure your media element is ready to play before calling Play():
while (CurrentTrack.CurrentState == MediaElementState.Opening ||
       CurrentTrack.CurrentState == MediaElementState.Buffering)
{
    // Poll until the stream has finished opening/buffering.
    await Task.Delay(100);
}
CurrentTrack.Play();
You also have to stop your media element when the view is unloaded.
Regards.
After nearly 10 attempts at releasing the app, I finally got to the root of the problem, thanks to some guesswork by the folks at Microsoft too.
My app automatically started the MediaElement stream when the app launched. Background-capable audio prevents the app from passing the WACK, because the app never enters suspended mode!
So, in order to get past the store's WACK, I had to remove the auto-start feature, and now the app is in the store! (Phew.)
I have an iPhone app that uses ASIHTTPRequest to transmit data entered by the user to a remote MySQL database via a PHP web service layer. This works perfectly.
If the user presses the submit button, the data should be sent regardless. The problem arises when there is insufficient bandwidth: rather than displaying some UIAlert to inform the user, I would like to implement some kind of function that constantly 'sniffs' for an internet connection, even when the app isn't in the main view, so that the user only has to press 'submit' once.
How is this possible? Has anyone come across any tutorials/examples of anything similar?
Check out this example application from Apple: Reachability
It'll help you with some code to detect when the connection has changed.
Here's a link about background tasks. As you'll read, you can request additional time to complete a task, but the system won't wait an unlimited amount of time for it to finish: Background Tasks
Use the Reachability API in conjunction with a flag of some sort that will perform the desired action once it detects that a connection is available.
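The Apple Reachability sample is Objective-C; as a sketch of the same "flag plus retry on reconnect" pattern, here is a modern Swift equivalent using NWPathMonitor from the Network framework. The class name, the single-payload queue, and the stubbed send method are assumptions for illustration.

```swift
import Foundation
import Network

// Sketch: hold the submission in a "pending" flag while offline,
// and flush it automatically when connectivity returns.
final class PendingSubmitter {
    private let monitor = NWPathMonitor()
    private var pendingPayload: Data?   // set when submit happens offline

    init() {
        monitor.pathUpdateHandler = { [weak self] path in
            // Connection came back: retry anything that was queued.
            if path.status == .satisfied, let payload = self?.pendingPayload {
                self?.pendingPayload = nil
                self?.send(payload)
            }
        }
        monitor.start(queue: DispatchQueue(label: "net.monitor"))
    }

    func submit(_ payload: Data) {
        if monitor.currentPath.status == .satisfied {
            send(payload)
        } else {
            pendingPayload = payload    // the "flag": deliver later
        }
    }

    private func send(_ payload: Data) {
        // Replace with the actual upload (e.g. a POST to the PHP layer).
        print("Sending \(payload.count) bytes")
    }
}
```

This way the user presses submit once, and the app takes care of retrying when a connection is detected.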
Sorry if it is such a dumb question or if I'm not ninja google-ing enough. I just need the answer quick.
I have an external accessory which gets data from ANT+ sensors. My question is, would it be possible that my application using the external accessory continue to run in the background, or at least send push notifications?
Thanks and feel free to downvote because I think it's very lazy of me to ask here.
According to Apple's documentation, your application won't receive accessory events immediately while in the background, but the events are held in a queue and delivered when the app returns to the foreground. Depending on how frequently your device sends data and how long the app stays in the background, you should be prepared to receive a bunch of events at once.
See External Accessory Programming Topics: Monitoring Accessory-Related Events
and Multitasking Support for more information.