Low framerate when running iFrameExtractor on iPhone device - iphone

I'd like to ask if anyone has encountered a low framerate when running iFrameExtractor on an iPhone device. When I run it on the simulator the video is fine, but when the code runs on an actual iPhone the framerate becomes slow. Does anyone know the cause? Thanks in advance.

The iPhone processor is much slower than your desktop Mac and has fewer resources. RGB image conversion is slow.
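If you want to confirm where the time goes on the device, a quick per-frame timing check can help. This is only a sketch: `convertAndDisplayNextFrame()` is a hypothetical stand-in for whatever decode-plus-RGB-conversion call your iFrameExtractor loop actually makes.

```swift
import Foundation

/// Hypothetical stand-in for the per-frame decode + YUV->RGB conversion step.
func convertAndDisplayNextFrame() {
    // ... decode one frame and convert it to RGB here ...
}

/// Times a single frame so you can compare the simulator vs. device cost.
func timeOneFrame() {
    let start = CFAbsoluteTimeGetCurrent()
    convertAndDisplayNextFrame()
    let elapsedMs = (CFAbsoluteTimeGetCurrent() - start) * 1000
    print(String(format: "frame took %.1f ms (above ~33 ms means under 30 fps)", elapsedMs))
}
```

If the conversion alone already eats most of a frame's budget on the device, that confirms the bottleneck is the RGB conversion rather than the decoder.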

Related

AVAssetExportSession works on iPad, no audio on iPhone

I have the exact same code running on both the iPad and iPhone versions of my app and the code works fine on the iPad (the video is being exported properly with audio), but the exported video on the iPhone doesn't have any sound. I even ran the iPhone version on the iPad and it worked fine, which means that nothing should be wrong with the code itself.
Any insight on why the iPhone isn't exporting the video with audio would be much appreciated.
I have done some research and somebody mentioned that memory issues could be causing some export problems. The memory and CPU usage are fairly high during the video processing/exporting, but never high enough to receive a memory warning.
Thanks in advance.
You didn't mention if you stepped through the code (line by line) on the iPhone, setting breakpoints, watching each variable to make sure the value is correct, etc. This would be the first step.
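If stepping through doesn't turn anything up, one thing worth trying is building an AVMutableComposition that explicitly inserts the source asset's audio track before exporting, so the audio track can't be silently dropped. This is only a hedged sketch of that approach (the file URLs are placeholders), not a diagnosis of the original problem.

```swift
import AVFoundation

/// Sketch: export a movie while explicitly copying both the video and audio
/// tracks into a composition. `sourceURL` and `outputURL` are placeholders.
func exportWithAudio(sourceURL: URL, outputURL: URL, completion: @escaping (Bool) -> Void) {
    let asset = AVURLAsset(url: sourceURL)
    let composition = AVMutableComposition()
    let fullRange = CMTimeRange(start: .zero, duration: asset.duration)

    do {
        // Copy the first video track, if present.
        if let videoTrack = asset.tracks(withMediaType: .video).first,
           let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid) {
            try compVideo.insertTimeRange(fullRange, of: videoTrack, at: .zero)
        }
        // Copy the first audio track; if this array is empty on the device,
        // the missing sound is an asset problem, not an export problem.
        if let audioTrack = asset.tracks(withMediaType: .audio).first,
           let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                       preferredTrackID: kCMPersistentTrackID_Invalid) {
            try compAudio.insertTimeRange(fullRange, of: audioTrack, at: .zero)
        }
    } catch {
        completion(false)
        return
    }

    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetHighestQuality) else {
        completion(false)
        return
    }
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}
```

Logging how many audio tracks the asset reports on each device is a cheap way to tell whether the iPhone is even seeing the audio before the export starts.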

Differences in audio frameworks on iPhone 3G, 3GS and 4

I'm building an app that measures sound volume. I understand that audio hardware in the iPhone is not as accurate as professional hardware, which is OK, but I need to know if there are any differences between the different iPhone models. For example, is it possible that the volume measured on an iPhone 3G will be different on an iPhone 4? Unfortunately I do not possess any models earlier than the 4 so I'm unable to test this myself.
The audio frameworks seem to be identical for identical iOS versions (except for the 2G). However the physical microphones (and acoustical environments) are different. People have published various test results, such as:
http://blog.faberacoustical.com/2009/iphone/iphone-microphone-frequency-response-comparison/
and
http://blog.faberacoustical.com/2010/iphone/iphone-4-audio-and-frequency-response-limitations/
But it's possible that the mic response may vary with manufacturing batches as well. YMMV.
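If you want to compare readings across models yourself, a simple way is AVAudioRecorder's metering, which reports level in dBFS relative to each device's own input chain. The sketch below (recording to a throwaway file, with minimal session setup) is one way to gather those numbers; it is a minimal example, not a calibrated measurement.

```swift
import AVFoundation

/// Sketch: sample the input level via AVAudioRecorder metering so readings
/// can be compared across iPhone models. Values are in dBFS and depend on
/// each device's microphone and input chain.
final class LevelMeter {
    private var recorder: AVAudioRecorder?

    func start() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.record, mode: .default, options: [])
        try session.setActive(true)

        // Record to a throwaway file; we only care about the meter values.
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("meter.caf")
        let settings: [String: Any] = [
            AVFormatIDKey: Int(kAudioFormatAppleIMA4),
            AVSampleRateKey: 44100.0,
            AVNumberOfChannelsKey: 1
        ]
        let recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder.isMeteringEnabled = true
        recorder.record()
        self.recorder = recorder
    }

    /// Returns the current average power in dBFS (0 dB = full scale).
    func currentLevel() -> Float {
        guard let recorder = recorder else { return -160 }
        recorder.updateMeters()
        return recorder.averagePower(forChannel: 0)
    }
}
```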
I'd just like to add that no matter what I try, there seems to be no way of exporting audio through AVAssetExportSession using an iPhone 3G. It's working with 3GS, iPod Touches, 4, iPad and so on.

Use images from camera without taking a photo

Is it possible to analyze images without taking a photo on the iPhone?
I want to analyze some matrix codes without taking any photo. I saw this on some Nokia models and it's quite impressive: really fast!
On the iPhone, I've only seen codes analyzed after taking a snapshot (or photo), and it's slower.
Also, the iPhone camera is not as good as some other mobile cameras.
I'm referring to the iPhone 3G/3GS.
thanks,
r.
Yes.
Under iOS 4.x, you can use the new AVFoundation framework to access camera pixel buffers on the 3G and 3GS, as well as the newer iPhone 4, without taking a photo/snapshot. The newer devices have higher-resolution cameras.
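To make that concrete: the AVFoundation route is an AVCaptureSession feeding an AVCaptureVideoDataOutput, whose delegate hands you each frame's pixel buffer for analysis without ever taking a photo. The sketch below is a minimal modern Swift version of that setup (the original era would have been Objective-C); the per-frame analysis itself is left as a stub.

```swift
import AVFoundation

/// Sketch: receive live camera frames as pixel buffers for analysis,
/// without taking a photo. The actual code-scanning step is a stub.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "frame.processing")

    func start() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true   // drop frames rather than queue them
        output.setSampleBufferDelegate(self, queue: queue)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    // Called once per captured frame on the processing queue.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Analyze `pixelBuffer` here (e.g. look for a matrix code).
        _ = pixelBuffer
    }
}
```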
I don't know if it's possible, but I think you should take a look at Apple's AVCamDemo code from WWDC 2010. I think it could help you, even though I didn't read much of the code (I just compiled the Xcode project and tried it).
(Sorry, but I can't find the link.)
You should be able to find the code in your emails if you are registered as an iPhone Developer, or maybe on the developer.apple.com/iphone/ website.
By the way, don't count on doing it on the iPhone 3G (impossible, I think). The iPhone 3GS should be fast enough for this kind of app.
Good luck!

DIRAC2 for real-time pitch shifting and autotune?

Has anyone implemented the DIRAC2 library from http://www.dspdimension.com/technology-licensing/dirac2-iphone/ for real time pitch correction on the iPhone? The library doesn't appear to support real time processing but perhaps someone has done it?
Thx
I integrated DIRAC2 into an iPhone app so I could modify the playback speed, and it does indeed work in real-time on an iPhone 4. I had to use the lowest possible settings to keep the CPU usage down, but it does play without any skips and I am able to change the playback speed seamlessly.
Running the same project on a 3GS device yielded lesser results - namely that the audio had enough skipping that it wasn't really usable. One caveat to this, though, is that I was running my test on the free version of DIRAC2 which only supports a 44100 sample rate, which is much higher than I need. If the pro version is used and you slash the sample rate down to 22050 or lower, it might work on a 3GS, but don't quote me on that.
Anything older than a 3GS has absolutely no chance of real-time playback.
Hope this helps.
Confirmed with DSP Dimension that the current DIRAC2 library will not work in real time.
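For what it's worth, on newer iOS versions (8 and up) Apple's own AVAudioUnitTimePitch can do real-time pitch and rate changes without a third-party library. This is not DIRAC2, just a hedged sketch of the built-in alternative.

```swift
import AVFoundation

/// Sketch: real-time pitch shifting with Apple's built-in AVAudioUnitTimePitch
/// (iOS 8+). This is an alternative to DIRAC2, not a wrapper around it.
final class PitchShifter {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let timePitch = AVAudioUnitTimePitch()

    func play(fileURL: URL, pitchCents: Float, rate: Float) throws {
        let file = try AVAudioFile(forReading: fileURL)

        timePitch.pitch = pitchCents   // +/- cents, e.g. 1200 = one octave up
        timePitch.rate = rate          // 1.0 = normal speed

        engine.attach(player)
        engine.attach(timePitch)
        engine.connect(player, to: timePitch, format: file.processingFormat)
        engine.connect(timePitch, to: engine.mainMixerNode, format: file.processingFormat)

        player.scheduleFile(file, at: nil, completionHandler: nil)
        try engine.start()
        player.play()
    }
}
```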

iPhone simulator and webcam

Is it possible to make the iPhone Simulator capture an image via the webcam? I've written a program to take an image from the iPhone camera. Can I test this with the iPhone Simulator? Please help.
You cannot take an image via the Mac's webcam from the iPhone Simulator. For the camera, you need to test on a device.
Even if this were possible on the Simulator, your Mac's camera resolution would be very different from the device's, and so would your Mac's performance. This could lead to bad surprises when moving and testing the application on the device.
(Sorry for my English.)
In the Android world, the phone emulator doesn't have a live camera preview, so the workaround is to create a "webcam server" running on the emulator host (PC/Mac). This program is a socket server that captures frames from the built-in webcam and transmits them over a socket.
Then, in the phone code (Android emulator), you can read the frames through the socket and display them to simulate a real phone camera.
Is that possible in the iPhone Simulator?
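Something along those lines is at least feasible in principle: run a small server on the Mac that serves webcam snapshots, and have the simulator build fetch frames from localhost instead of touching AVFoundation. Below is a rough sketch of the simulator-side half only; the server itself, the port, and the /frame.jpg path are all hypothetical.

```swift
import UIKit

/// Sketch (simulator side only): poll a hypothetical local "webcam server"
/// for JPEG frames instead of using the real camera, which the Simulator lacks.
/// The URL, port, and path are placeholders for whatever your server exposes.
func fetchSimulatedFrame(completion: @escaping (UIImage?) -> Void) {
    #if targetEnvironment(simulator)
    let url = URL(string: "http://127.0.0.1:8080/frame.jpg")!   // hypothetical server
    URLSession.shared.dataTask(with: url) { data, _, _ in
        let image = data.flatMap { UIImage(data: $0) }
        DispatchQueue.main.async { completion(image) }
    }.resume()
    #else
    completion(nil)   // on a real device, use the actual camera instead
    #endif
}
```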
You cannot use the Mac's camera. For the camera, you need to test on a device.
From Apple documentation:
The hardware features that are not simulated as of iOS 8.2 are:
Motion support (accelerometer and gyroscope)
Audio and video input (camera and microphone)
Proximity sensor
Barometer
Ambient light sensor
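Given that list, the practical pattern is to check camera availability at runtime so the same build degrades gracefully in the Simulator. A minimal sketch:

```swift
import UIKit

/// Sketch: only present the camera picker when a camera actually exists
/// (it never does in the Simulator), so the app degrades gracefully there.
func presentCameraIfAvailable(from viewController: UIViewController) {
    guard UIImagePickerController.isSourceTypeAvailable(.camera) else {
        print("No camera available (e.g. running in the Simulator); test on a device.")
        return
    }
    let picker = UIImagePickerController()
    picker.sourceType = .camera
    viewController.present(picker, animated: true)
}
```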