I'm programming an app on iOS with Xcode, and after a lot of work I found out that the functionality is really dependent on the timing of the method calls.
Calling them one line after the other in the code doesn't help; they're still called up to 150 ms apart.
So I need to make two methods run with a minimal time difference between them, hence "at the same time".
The two tasks I'm performing are actually an audio task and a video task, so I understand in-process latency and delays may also be involved. I was wondering whether anyone has insight on how to sync an audio task and a video task so that they start running together, with a very tiny time gap.
I tried using dispatch queues and stuff like that, but they don't work.
I'll be happy to elaborate if I wasn't clear enough.
Thanks!
You can't make an interrupt-driven, multitasking OS behave like a real-time OS.
It just doesn't work that way.
You'll need to use the various multimedia APIs to set up a playback context within which the audio and video are synchronized (I don't know the details myself).
Apple has APIs for syncing audio and video, and documentation to go with them: http://developer.apple.com
Obviously, calling methods sequentially (serially) won't do what you are asking, since each method call takes some finite amount of time, during which the subsequent methods are blocked.
To run multiple general-purpose tasks concurrently (potentially truly simultaneously on a modern multi-core device) you do need threading via "dispatch queues and stuff like that." GCD most certainly does work, so you are going to need to elaborate on what you mean by "they don't work."
But this is all probably for naught, because you aren't talking about general-purpose tasks. If you are handling audio and video tasks, you really shouldn't do all this with your own code on the CPU; you need hardware acceleration, and Apple provides frameworks to help. I'd probably start by taking a look at the AV Foundation framework, then, depending on what you are trying to do (you didn't really say in your question...) take a look at, variously, OpenAL, Core Video, and Core Audio. (There's a book out on the latter that's getting rave reviews, but I haven't seen it myself and, truth be told, I'm not an A/V developer.)
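For what it's worth, here is a minimal sketch of what concurrent dispatch looks like with GCD; startAudioTask() and startVideoTask() are hypothetical placeholders for whatever work you're actually doing:

```objc
// Sketch: kick off two tasks concurrently with GCD.
// startAudioTask() and startVideoTask() are hypothetical placeholders.
dispatch_queue_t queue =
    dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

dispatch_async(queue, ^{
    startAudioTask();   // runs on a background thread
});
dispatch_async(queue, ^{
    startVideoTask();   // may run in parallel on a multi-core device
});
// Note: this makes the work concurrent, but it still gives no hard
// guarantee about how close together the two tasks actually start.
```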
As you may know, in any multitasking environment (especially on single-core devices), you won't be able to guarantee the timing between the statements you're executing.
However, since you're mentioning playback of audio and video, Apple does provide some fundamentals that can accomplish this through its frameworks.
AVSynchronizedLayer: If some of your video or audio can play back from an AVPlayerItem (which supports a variety of formats), you can build a tree of other events that are synchronized with it, from keyframe animations to other AVPlayerItems.
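A rough, untested sketch of the idea: hanging a layer animation off an AVPlayerItem's timeline via AVSynchronizedLayer (videoURL is a placeholder):

```objc
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

AVPlayerItem *item = [AVPlayerItem playerItemWithURL:videoURL]; // videoURL is a placeholder
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];

// The synchronized layer interprets its sublayers' animation timing
// on the player item's timeline instead of the wall clock.
AVSynchronizedLayer *syncLayer =
    [AVSynchronizedLayer synchronizedLayerWithPlayerItem:item];

CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
fade.fromValue = [NSNumber numberWithFloat:0.0f];
fade.toValue   = [NSNumber numberWithFloat:1.0f];
fade.duration  = 2.0;
// Pin the animation to time 0 of the item's timeline.
fade.beginTime = AVCoreAnimationBeginTimeAtZero;
fade.removedOnCompletion = NO;

CALayer *overlay = [CALayer layer];
[overlay addAnimation:fade forKey:@"fadeIn"];
[syncLayer addSublayer:overlay];

[self.view.layer addSublayer:syncLayer];
[player play];
```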
If multiple audio tracks are what you're looking to sync in particular, there are some simpler constructs in Core Audio to do that too.
If you can give a more specific example of your needs someone may be able to provide additional ideas.
Related
I have some code that runs slowly when I test it on an iPhone 4. I am thinking about researching Grand Central Dispatch and using a background thread for some tasks. However, I understand that the iPhone 4 is a single-core device. Does this mean there will be no benefit on this device to using a background thread?
I couldn't find much in Apple's documentation about different device capabilities in this regard and am new to background processing.
Yes, as long as it's running iOS 4 or later. GCD is a good design in that it can be used equally well on single-core machines all the way up to a 16-core Mac Pro. In fact, Apple emphasized this when they introduced GCD. If your code is well written, it should work equally well on a single-core iPhone as on the multi-core iOS devices out there. Theoretically, you should see performance improvements on multi-core devices over single-core devices.
It all depends on what your code is actually doing. If your code calculates only one thing without stopping, there is no performance benefit to multithreading on a one-core CPU. If, however, some of your tasks are waiting on something like network data or a disk operation, or are sleeping, then your other threads can use that time to do something useful, even on a one-core CPU. Generally, if you are interacting with the UI, it is recommended to do time-consuming tasks in the background so you won't block the user interface, providing a better experience for the end user.
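As an illustration, the usual GCD pattern for that looks something like this; processData() and updateUIWithResult: are hypothetical placeholders:

```objc
// Sketch: move slow work off the main thread so the UI stays
// responsive, even on a single-core device.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSData *result = processData();          // slow work off the main thread
    dispatch_async(dispatch_get_main_queue(), ^{
        [self updateUIWithResult:result];    // UIKit calls belong on the main thread
    });
});
```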
Are there any tricks/methods to optimize your app's battery usage?
I've an app that can play streaming audio in the background quite well. It does the basics, like moving to the background after the user hasn't interacted with the screen for a while.
Are there any other tricks to stop it eating the battery like a fat man at an all-you-can-eat buffet?
Thanks,
-Code
You can decrease your use of the network, and if you are using the location libraries, don't use them unless you actually need them; when you do, don't request the highest accuracy. The most effective trick here: no GPS, no geolocation.
I hope this helps!
Reading in data over a network connection is one of the most power-hungry operations on the device. Depending on your protocol, you may be able to optimize the streaming by buffering larger chunks at a time (if possible). Obviously, if this is realtime streaming of a live feed, that is not an option.
Review Apple's guidelines here: Performance; make sure you scroll down to the Reduce Power Consumption section. Basically, to reduce power consumption, you should do as little as possible. If you are turning on frameworks like Core Location or using the accelerometer, you should disable them as often as possible. Try releasing as many resources as you possibly can when in the background. Less memory means less overhead for the system to keep track of as well.
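For example, with Core Location the cheap options look something like this (a sketch, assuming you only need a coarse, one-shot fix):

```objc
#import <CoreLocation/CoreLocation.h>

CLLocationManager *manager = [[CLLocationManager alloc] init];
manager.delegate = self;
// Coarse accuracy lets the system avoid powering up the GPS radio.
manager.desiredAccuracy = kCLLocationAccuracyHundredMeters;
[manager startUpdatingLocation];

// ...later, in the delegate, once a good-enough fix arrives:
[manager stopUpdatingLocation];
```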
I've worked on several iPhone applications which all require some subset of the same services: RemoteIO audio, GPS, push notifications, face sensor activation, idle timeout disabling, etc, etc. The application delegate callback methods become bloated with all of this initialization code which is slightly different in each app.
So my question is: is there a library for handling all this? Some system that lets me say, "this app uses services A, B, and C, and they should launch in this order"? The services would be defined so that they'll automatically get the application lifecycle callbacks they need, like the application going into the background, audio interruptions, etc.
This is pretty ill-defined, which is why I'm hesitant to write this code yet. If someone else has solved the problem then I can avoid duplicating all of the mistakes they made on their approach to a solution.
The only one I could think of would be Three20, but I'm not sure whether that's a bit too much (check http://api.three20.info/annotated.php).
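If you do end up rolling your own, the shape of it might be something like the following; every name here is hypothetical, not from any existing library:

```objc
// Hypothetical sketch of the kind of service registry the question
// describes. Services launch in array order; each one gets the app
// lifecycle callbacks it cares about.
@protocol AppService <NSObject>
- (void)serviceDidLaunchWithApplication:(UIApplication *)application;
@optional
- (void)serviceDidEnterBackground;
- (void)serviceWillEnterForeground;
@end

@interface ServiceRegistry : NSObject
- (void)launchServices:(NSArray *)services
       withApplication:(UIApplication *)application;
@end

@implementation ServiceRegistry {
    NSArray *_services;
}
- (void)launchServices:(NSArray *)services
       withApplication:(UIApplication *)application {
    _services = [services copy];
    for (id<AppService> service in _services) {
        [service serviceDidLaunchWithApplication:application];
    }
}
@end
```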
I am building an application in which I want to implement pedometer functionality. Is it possible to implement a pedometer? Please give me an idea of how to approach it.
This is certainly possible, and I believe has been done. The main drawback is that since the iPhone does not permit 3rd party background tasks, your app will have to be open to collect pedometer data.
The simplest way to do this would be to set up an accelerometer listener and watch for "step events". You'd have to do some experimentation to determine the type and size range of these, but that wouldn't be too difficult.
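A rough sketch of that idea with the UIAccelerometer API of the day; kStepThreshold and the inStep/stepCount properties are made up and would need tuning against real walking data:

```objc
#import <UIKit/UIKit.h>
#import <math.h>

#define kStepThreshold 1.3   // hypothetical value, tune experimentally

- (void)startListening {
    UIAccelerometer *accel = [UIAccelerometer sharedAccelerometer];
    accel.updateInterval = 1.0 / 50.0;   // sample at 50 Hz
    accel.delegate = self;
}

- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration {
    // Overall magnitude of acceleration, independent of orientation.
    double magnitude = sqrt(acceleration.x * acceleration.x +
                            acceleration.y * acceleration.y +
                            acceleration.z * acceleration.z);
    // Count a step on each upward crossing of the threshold.
    if (magnitude > kStepThreshold && !self.inStep) {
        self.stepCount += 1;   // hypothetical properties
        self.inStep = YES;
    } else if (magnitude < kStepThreshold) {
        self.inStep = NO;
    }
}
```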
Expanding on Ben's answer: you might want to write an app (or use an existing one) that continuously records the data from the accelerometer and shows a graph thereof, then have that app running while you and a few different people walk around with the device in their pocket. That should give you an idea of the patterns of acceleration you need to be looking for, as well as the variation possible in peoples' stride lengths, gaits, etc.
I was told that the iPhone does not support multitasking and multithreading. This did not make sense to me, so I tested on the simulator: pthreads works, fork() doesn't.
This result does make sense to me, but now I'm not sure: will the pthread library also work on the real device?
Thanks.
Multithreading will work. It's multitasking that won't. The iPhone won't let more than one third-party application run at once, and that reasoning puts fork() outside of the application's sandbox.
You can create threads to poll sockets, read files, or handle an AI player all you want, or at least until the performance gains start to go away.
Yes, the pthread library will work on the iPhone. Alternately you can use Cocoa-native threads with NSThread. Multitasking will not work, as Apple is explicitly restricting that.
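For instance, this kind of thing compiles and runs fine on the device (a minimal sketch; worker and spawnWorker are placeholders):

```objc
#include <pthread.h>
#include <stdio.h>

// Plain pthreads work on iOS, since its threads are built on the
// same Mach/BSD foundation as the Mac's.
static void *worker(void *arg) {
    printf("hello from a pthread\n");
    return NULL;
}

- (void)spawnWorker {   // e.g. somewhere in a view controller
    pthread_t thread;
    if (pthread_create(&thread, NULL, worker, NULL) == 0) {
        pthread_detach(thread);   // or pthread_join elsewhere
    }
}
```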
Most likely.
Multitasking is disabled by default to prevent apps from spawning a bunch of processes and either slowing down the iPhone or doing malicious things.
The iPhone's CPU really isn't that fast, but by only running one program at a time, it seems speedy. Multitasking would introduce a lot of overhead and other problems that would slow down the iPhone.
I'm not actually sure about multithreading, but since threads are contained within your own process, it seems likely that they would work.
And as you said, pthreads work and fork() doesn't in the simulator, so it's logical that the same would hold on the real device.
Multithreading is very much possible: the iPhone actually uses the same Cocoa threading APIs that are available on the Mac. I wrote a collaborative drawing app that uses six threads to handle drawing, network communication, etc. I think creating too many threads would be a bad idea, since the iPhone only has one processor, but they work very well in my experience!
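For reference, spinning up a Cocoa thread is a one-liner (drawInBackground: and someData are hypothetical placeholders):

```objc
// Detach a worker thread in one call...
[NSThread detachNewThreadSelector:@selector(drawInBackground:)
                         toTarget:self
                       withObject:someData];

// ...or keep an explicit thread object around if you need to inspect it.
NSThread *thread = [[NSThread alloc] initWithTarget:self
                                           selector:@selector(drawInBackground:)
                                             object:someData];
[thread start];
```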