Can I render to an offline context in a web worker? - web-audio-api

I want to render audio into an offline audio context using the Web Audio API, but I don't want the rendering to block user interaction on the main UI thread.
Can I somehow render my audio graph in a web worker? I don't think I can pass an offline context to the worker, because the offline context can't be serialized like a primitive JavaScript object.

The OfflineAudioContext actually does its rendering in a separate thread (in Chrome, at least), so you don't need to do this manually.
If you DID want to do it manually, you'd need OfflineAudioContext to be supported in Worker threads, and it isn't yet.
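To illustrate, here is a minimal sketch of offline rendering on the main thread, assuming a browser that implements the promise form of startRendering(). The render itself runs asynchronously, so the UI stays responsive; the oscillator graph is just a placeholder example:

// Render 2 seconds of a 440 Hz sine wave offline.
// The rendering happens asynchronously; we only await the result here.
var sampleRate = 44100;
var offlineCtx = new OfflineAudioContext(2, sampleRate * 2, sampleRate);

var osc = offlineCtx.createOscillator();
osc.frequency.value = 440;
osc.connect(offlineCtx.destination);
osc.start();

offlineCtx.startRendering().then(function (renderedBuffer) {
  // renderedBuffer is an AudioBuffer containing the rendered audio.
  console.log('Rendered ' + renderedBuffer.duration + ' seconds');
});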

Related

Is there a way to use AudioContext in a web worker

I'm relatively new to web workers (I simply had no need until now), and I've done a lot of research and think I get the basics...
But :-)...
I'm stuck and hope for definitive input.
I'm rendering a graphical representation of an audio file with the Web Audio API into an SVG. No rocket science, and it works to my satisfaction. With larger audio files, however, it would be great to do it in a web worker. The problem is that inside a web worker I don't have access to the window object, and therefore I can't access the AudioContext, which I would need to decode the raw data into an AudioBuffer. Is there another way to do it, or a way around this?
No, it is not possible to use WebAudio in a Worker. You will have to use the main thread with WebAudio and then transfer the data you need to the worker.
But see also the spec issue on supporting AudioContext in a Worker.
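As a sketch of that main-thread-plus-worker approach (the file names here are hypothetical, and the promise form of decodeAudioData is assumed): decode with the AudioContext on the main thread, then hand the raw channel data to the worker as a transferable, so even large files are moved rather than copied:

// Main thread: decode with Web Audio, then transfer the samples to a worker.
var audioCtx = new AudioContext();
var worker = new Worker('waveform-worker.js'); // hypothetical worker script

fetch('large-audio-file.ogg') // hypothetical audio URL
  .then(function (response) { return response.arrayBuffer(); })
  .then(function (encoded) { return audioCtx.decodeAudioData(encoded); })
  .then(function (audioBuffer) {
    var samples = audioBuffer.getChannelData(0); // Float32Array
    // Transfer ownership of the underlying ArrayBuffer instead of copying it;
    // after this call, the buffer is no longer usable on the main thread.
    worker.postMessage({ samples: samples, sampleRate: audioBuffer.sampleRate },
                       [samples.buffer]);
  });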

Flutter + Provider - Is it possible to restart/re-initialize a provided object?

I am having an issue where my audio engine stops working (or experiences dropouts) after being interrupted by a phone call, a change of audio output device, or another system audio event. I'd like to simply re-initialize my audio classes after such an event.
I am using a MultiProvider to provide the audio service throughout the app. I have a couple of models that keep track of settings/state and could restore the state and expected behavior after an interruption.
I considered doing something similar to the solution described here.
However, my models and audio service are all provided at the same level with MultiProvider, so restarting a widget tree would restart both.
I basically want to re-create my audio service as a fresh object. Any suggestions on how to approach "restarting" a Provider-provided service?

UIImageView+AFNetworking image request queue blocks other network requests from RestKit

I am downloading Data with RestKit by creating RKObjectRequestOperations and adding them to RestKit's queue:
RKObjectRequestOperation *operation =
    [RK objectRequestOperationWithRequest:request
                                  success:...
                                  failure:...];
[RK enqueueObjectRequestOperation:operation];
That one works well. Additionally, this data is displayed in a list view that contains UIImageViews showing related user icons. These user icons, however, are not downloaded via RestKit but via the underlying AFNetworking library. UIImageView+AFNetworking also does the job pretty well (including the caching):
[self setImageWithURLRequest:userIconRequest
            placeholderImage:anonymousUser
                     success:nil
                     failure:...];
The problem is that those 15 user-icon requests from the image views block the processing of the RestKit request that should be loading the next page immediately. I can see the list displaying the "loading" row as well as the first user icons; only when the last image has finished loading does the next page append itself.
Looking into the implementation of UIImageView+AFNetworking shows that it uses its own NSOperationQueue instance to serialize requests. That should not interfere with RestKit, though, I suppose.
Also, assigning an NSOperationQueuePriority to each request does not change a thing. Maybe network requests are serialized internally in a different way? How can I prioritize these requests?
Thanks in advance.
NSURLConnection has an undocumented maximum number of connections.
Additionally, UIImageView+AFNetworking's operation queue allows NSOperationQueueDefaultMaxConcurrentOperationCount concurrent requests, which is likely a bad choice for your use case, according to this convincing-sounding AFNetworking discussion.
You need to throttle. I see two simple solutions:
Modify UIImageView+AFNetworking to allow, say, a maximum of 4 concurrent operations.
Modify UIImageView+AFNetworking to use the same operation queue as RestKit, in which case the priority you set will matter.

Play 2.0 - Push current state of execution to page

So I currently have an application, independent of Play, which may take a long time to execute.
I want to put a UI on top of it using Play, so the application can be invoked and some details of its execution displayed to the user. I would like the page to update automatically as the execution proceeds, e.g. if a variable in the application increments, this would be reflected on the page.
I'm not sure where to begin with this - do I need to split the application up into models and controllers? Or do I just need code in a controller to instantiate the classes I have already written and call the methods I need?
What about constantly showing the execution state on the page?
Any resources I should know about/read ? Code examples?
Thanks
You may have already done so, but a good starting point is to create a skeleton Play application using the play new command, while referring to the creating a new application section. You will have "views" (HTML template pages) and one controller (in Application.scala). You could add more controllers, but as you will have just a single page, that should suffice.
You can add jars from your app (if it's a JVM app) to the lib directory of your Play application. From this: "Or do I just need code in a controller to instantiate the classes I have already written and call the methods I need?" it sounds like you would be happy to have your app run in the same process as the Netty + Play server. Check out the Global object for starting your app at process startup.
Check out the section on comet sockets for sending updates from the Play app to the browser. You'll need a bit of JavaScript in the web page.
Do you want to have this application running outside of Play, perhaps on another server? Can you modify the application, or is it 3rd-party software?
If so, you need some way to send data back and forth between your Play front end and your application. You can have your application expose a WebSocket, so that the Play front end and the application can push data to each other; your client page then holds a WebSocket open to the Play front end, and Play pushes the updates on to the client. If your application can't support a WebSocket, you could instead expose some URLs on your front end for the application to POST to. You can then use some sort of message bus or database mechanism (RabbitMQ, Redis, a MongoDB capped collection, or even just a shared Queue object) so that the front-end WebSocket can pick up those updates and send them to the client. A client-side sketch follows below.
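On the client side, the JavaScript is small either way. A minimal sketch, assuming the Play front end exposes a WebSocket at a hypothetical /progress route and pushes plain-text updates:

// Open a WebSocket to the Play front end and render each update as it arrives.
var socket = new WebSocket('ws://localhost:9000/progress'); // hypothetical route

socket.onmessage = function (event) {
  // Assume the server pushes plain-text progress messages.
  document.getElementById('status').textContent = event.data;
};

socket.onclose = function () {
  document.getElementById('status').textContent = 'Execution finished.';
};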

Which to use when? NSURLConnection vs. lower level socket API

I am developing an iPhone application which streams data (e.g. ECG data points) from a server and displays (i.e. plots) it on the screen - live streaming. For that purpose, I am using NSURLConnection.
The problem I am facing is that the data comes in from the server faster than the iPhone can display it, so the cache buffer grows rapidly and the displayed data lags behind the actual data coming from the server. After some time, the application becomes very slow and gets a memory warning.
So my question is: how should I handle the data coming from the server? Should I continue with NSURLConnection or go for lower-level socket programming?
I propose you implement some sort of flow control:
The simplest approach is to drop data when your buffers are full. For video streams, frames can be dropped; I don't know whether the same is possible with your data.
Another approach is to switch from the event-based API of NSURLConnection (where the framework controls when you have to react) to CFSocket, where you can read data when you are ready for it. It's more low-level and requires a separate thread plus some advanced logic, such as going to sleep when the buffer is full and being woken up when the main thread has displayed more data and made more space in the buffer. With this approach you are basically building on top of TCP's flow-control mechanism.
Yet another approach would be to use another network protocol where you have more control about the amount of data being sent.
I would use ASIHTTPRequest streaming. You can implement the delegate method request:didReceiveData: to get your data in chunks as it comes in, decompress it if needed, parse it, and display it. If you need a cache, you can always save it to a file.