Does anybody know what the maximum sampling rate of the iPhone accelerometer is?
I want a high update rate, so I set updateInterval to 1.0/300.0, but it seems I am not getting updates at that rate.
Can anybody tell me what the maximum update rate is, or how I can get a higher update rate?
The max accelerometer and gyroscope sampling rate on the iPhone 6 is 100Hz. You can empirically test this yourself. Here is the code.
/******************************************************************************/
// First create and initialize two NSMutableArrays. One for accel data and one
// for gyro data. Then create and initialize CMMotionManager. Finally,
// call this function
- (void) TestRawSensors
{
    speedTest = 0.0001; // Let's try 10,000Hz
    motionManager.accelerometerUpdateInterval = speedTest;
    motionManager.gyroUpdateInterval = speedTest;

    [motionManager startAccelerometerUpdatesToQueue: [NSOperationQueue currentQueue]
                                        withHandler: ^(CMAccelerometerData *accelerometerData, NSError *error)
    {
        [rawAccelSpeedTest addObject: [NSNumber numberWithDouble: accelerometerData.timestamp]];
        [rawAccelSpeedTest addObject: [NSNumber numberWithDouble: accelerometerData.acceleration.x]];
        if (error)
        {
            NSLog(@"%@", error);
        }
        if (rawAccelSpeedTest.count > 100)
        {
            [motionManager stopAccelerometerUpdates];
            for (uint16_t i = 0; i < rawAccelSpeedTest.count; i += 2)
            {
                NSLog(@"Time: %f Accel: %f", [rawAccelSpeedTest[i] doubleValue],
                                             [rawAccelSpeedTest[i+1] doubleValue]);
            }
        }
    }];

    [motionManager startGyroUpdatesToQueue: [NSOperationQueue currentQueue]
                               withHandler: ^(CMGyroData *gyroData, NSError *error)
    {
        [rawGryoSpeedTest addObject: [NSNumber numberWithDouble: gyroData.timestamp]];
        [rawGryoSpeedTest addObject: [NSNumber numberWithDouble: gyroData.rotationRate.x]];
        if (error)
        {
            NSLog(@"%@", error);
        }
        if (rawGryoSpeedTest.count > 100)
        {
            [motionManager stopGyroUpdates];
            for (uint16_t i = 0; i < rawGryoSpeedTest.count; i += 2)
            {
                NSLog(@"Time: %f Rate: %f", [rawGryoSpeedTest[i] doubleValue],
                                            [rawGryoSpeedTest[i+1] doubleValue]);
            }
        }
    }];
}
Despite the documentation saying "The maximum frequency at which you can request updates is hardware-dependent but is usually at least 100 Hz.", it looks to me like the maximum sample rate is still 100Hz.
My approach to figuring this out was to take the existing CoreMotion sample code called MotionGraphs and adapt the startUpdates function to look like this:
func startUpdates() {
    guard let motionManager = motionManager, motionManager.isGyroAvailable else { return }
    sampleCount = 0
    let methodStart = Date()
    motionManager.gyroUpdateInterval = TimeInterval(1.0/100000.0) // Hardcoded to something very fast
    motionManager.startGyroUpdates(to: .main) { gyroData, error in
        self.sampleCount += 1
        //...view update code removed
        if (self.sampleCount >= 100) {
            let methodFinish = Date()
            let executionTime = methodFinish.timeIntervalSince(methodStart)
            print("Duration of 100 Gyro samples: \(executionTime)")
            self.stopUpdates()
        }
    }
}
I also set motionManager.deviceMotionUpdateInterval = TimeInterval(1.0/100000.0) for good measure (in case it is a global rate).
With that code in place for both the accelerometer and gyroscope, I can confirm that an iPhone 8 on iOS 11.4 still maxes out right around 100Hz for both.
Duration of 100 Accelerometer samples: 0.993090987205505
Duration of 100 Accelerometer samples: 0.995925068855286
Duration of 100 Accelerometer samples: 0.993505954742432
Duration of 100 Accelerometer samples: 0.996459007263184
Duration of 100 Accelerometer samples: 0.996203064918518
Duration of 100 Gyro samples: 0.989820957183838
Duration of 100 Gyro samples: 0.985687971115112
Duration of 100 Gyro samples: 0.989449977874756
Duration of 100 Gyro samples: 0.988754034042358
Possibly a duplicate. Look at:
update frequency set for deviceMotionUpdateInterval it's the actual frequency?
Actual frequency of device motion updates lower than expected, but scales up with setting
The same should hold if you are using the old UIAccelerometerDelegate interface.
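For reference, a rough sketch of that old delegate-based approach (deprecated API, shown only for comparison; it assumes the class adopts UIAccelerometerDelegate and the interval and logging are placeholders):

// Old-style accelerometer updates via UIAccelerometerDelegate (deprecated since iOS 5).
- (void)startLegacyAccelerometerUpdates
{
    UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
    accelerometer.updateInterval = 1.0 / 100.0; // Requested interval; the hardware still caps the real rate.
    accelerometer.delegate = self;
}

- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration
{
    // acceleration.timestamp lets you measure the delivered rate, just like CMAccelerometerData.timestamp.
    NSLog(@"t: %f x: %f", acceleration.timestamp, acceleration.x);
}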
I've taken a look around, and there aren't that many talks or examples on inertial navigation for iOS 5. I know that iOS 5 introduced some very cool sensor fusion algorithms:
motionQueue = [[NSOperationQueue alloc] init];
[motionQueue setMaxConcurrentOperationCount:1];
motionManager = [[CMMotionManager alloc] init];
motionManager.deviceMotionUpdateInterval = 1/20.0;
[motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical toQueue:motionQueue withHandler:^(CMDeviceMotion *motion, NSError *error) {
}];
I've taken a look at both videos from WWDC that deal with the block above.
The resulting CMDeviceMotion contains a device attitude vector, along with the gravity vector separated from the user-induced acceleration.
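As a starting point, here is a minimal sketch of reading those fields inside the handler (the property names are CMDeviceMotion's; the logging is just a placeholder):

[motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical toQueue:motionQueue withHandler:^(CMDeviceMotion *motion, NSError *error) {
    // Attitude (orientation) relative to the chosen reference frame, in radians.
    double roll  = motion.attitude.roll;
    double pitch = motion.attitude.pitch;
    double yaw   = motion.attitude.yaw;
    // Gravity and user-induced acceleration arrive already separated, in units of g.
    CMAcceleration gravity   = motion.gravity;
    CMAcceleration userAccel = motion.userAcceleration;
    NSLog(@"roll %f pitch %f yaw %f userAccel.x %f gravity.z %f", roll, pitch, yaw, userAccel.x, gravity.z);
}];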
Are there any open source inertial navigation projects specifically for iOS 5 that take advantage of this new sensor fusion? I'm talking about further integrating this data with the GPS and magnetometer output to get a more accurate position estimate.
A bonus question: is this kind of fusion even possible from a hardware standpoint? Will I melt my iPhone 4 if I start doing 20Hz processing of all available sensor data over extended periods of time?
I'm ready to start tinkering with these, but would love to get something more solid to start with than the empty block above :)
Thank you for any pointers!
I am writing an app for scuba divers and hoped to add inertial navigation since GPS and other radio based navigation is unavailable underwater. I did quite a bit of research and found that there is just too much jitter in the sensor data on the iPhone for accurate inertial navigation. I did a quick experiment and found that even when the device is perfectly still, the "drift" due to noise in the signal showed that the device "moved" many meters after only a few minutes. Here is the code I used in my experiment. If you can see something I am doing wrong, let me know. Otherwise, I want my afternoon back!
- (void)startCoreMotion {
    CMMotionManager *manager = [[CMMotionManager alloc] init];
    if ([manager isAccelerometerAvailable]) {
        manager.deviceMotionUpdateInterval = 1.0/updateHz;
        [manager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMDeviceMotion *motion, NSError *error) {
            xVelocity += (( 9.8 * motion.userAcceleration.x ) / updateHz);
            yVelocity += (( 9.8 * motion.userAcceleration.y ) / updateHz);
            zVelocity += (( 9.8 * motion.userAcceleration.z ) / updateHz);
            xPosition += ( xVelocity * ( 1 / updateHz ));
            yPosition += ( yVelocity * ( 1 / updateHz ));
            zPosition += ( zVelocity * ( 1 / updateHz ));
            self.xPositionLabel.text = [NSString stringWithFormat:@"x = %f m", xPosition];
            self.yPositionLabel.text = [NSString stringWithFormat:@"y = %f m", yPosition];
            self.zPositionLabel.text = [NSString stringWithFormat:@"z = %f m", zPosition];
            self.xVelocityLabel.text = [NSString stringWithFormat:@"vx = %f m/s", xVelocity];
            self.yVelocityLabel.text = [NSString stringWithFormat:@"vy = %f m/s", yVelocity];
            self.zVelocityLabel.text = [NSString stringWithFormat:@"vz = %f m/s", zVelocity];
            self.distanceLabel.text = [NSString stringWithFormat:@"dist = %f m", sqrt( pow(xPosition, 2) + pow(yPosition, 2) + pow(zPosition, 2))];
        }];
    }
}
I am testing Core Motion and using the gyroscope. Right now I am getting values that I don't understand. My assumption was that for each of x, y and z I would get a value between 0 and 360, which would be a full rotation, but this isn't the case.
[self.motionManager startGyroUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMGyroData *gyroData, NSError *error) {
    NSString *x = [NSString stringWithFormat:@"%.02f", gyroData.rotationRate.x];
    NSLog(@"X: %@", x);
    NSString *y = [NSString stringWithFormat:@"%.02f", gyroData.rotationRate.y];
    NSLog(@"Y: %@", y);
    NSString *z = [NSString stringWithFormat:@"%.02f", gyroData.rotationRate.z];
    NSLog(@"Z: %@", z);
    frequency = gyroData.rotationRate.y * 500;
    float rate = gyroData.rotationRate.z;
    if (fabs(rate) > .2) {
        float direction = rate > 0 ? 1 : -1;
        rotation += (direction * M_PI/90.0) * 1000;
        NSLog(@"Rotation: %f", rotation);
    }
}];
Is it possible to get more human-readable rotation values? Is my assumption that I should be getting values between 0 and 360 wrong?
The values are in radians, not degrees, so a full rotation is 2π rather than 360. More importantly, they are a rate, not an angle: the units are radians per second.
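If you want a human-readable angle you have to integrate the rate over time yourself (or just use CMDeviceMotion's attitude, which already gives you roll/pitch/yaw). A rough sketch, where lastTimestamp and zAngleDegrees are hypothetical instance variables initialized to 0:

[self.motionManager startGyroUpdatesToQueue:[NSOperationQueue mainQueue] withHandler:^(CMGyroData *gyroData, NSError *error) {
    if (lastTimestamp > 0) {
        NSTimeInterval dt = gyroData.timestamp - lastTimestamp;              // seconds since the last sample
        double degreesPerSecond = gyroData.rotationRate.z * 180.0 / M_PI;    // radians/s -> degrees/s
        zAngleDegrees = fmod(zAngleDegrees + degreesPerSecond * dt, 360.0);  // integrate the rate into an angle
        NSLog(@"z angle: %f degrees", zAngleDegrees);
    }
    lastTimestamp = gyroData.timestamp;
}];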
I'm developing an iPhone 4 (iOS 4) application that shows a level meter.
The app measures the human voice, but it has a problem: when there is a lot of background noise, it doesn't work well, because it measures that noise too.
To measure sound, I use this:
- (void) initWithPattern:(Pattern *)pattern
{
    mode = figureMode;
    [self showFigureMeter];
    patternView.pattern = pattern;

    NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
                              [NSNumber numberWithInt: 2], AVNumberOfChannelsKey,
                              [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
                              nil];
    NSError *error;

    if (recorder == nil)
        recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];

    if (recorder) {
        [recorder prepareToRecord];
        recorder.meteringEnabled = YES;
        [recorder record];
        levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.03
                                                      target: self
                                                    selector: @selector(levelTimerCallback:)
                                                    userInfo: nil
                                                     repeats: YES];
    }
}

- (void)levelTimerCallback:(NSTimer *)timer
{
    [recorder updateMeters];
    float peakPower = [recorder peakPowerForChannel:0];
    if (mode == figureMode)
    {
        if (peakPower < -40) {
            ;
        } else if ((peakPower > -40) && (peakPower < -30)) {
            ;
        } else if ((peakPower > -30) && (peakPower < -20)) {
            ;
        } else if ((peakPower > -20) && (peakPower < -10)) {
            ;
        } else if (peakPower > -10) {
            ;
        }
    }
}
Is there any way to remove background noise?
Noise reduction usually involves sampling the sound (as raw PCM samples), and doing some non-trivial digital signal processing (DSP). One needs a well defined characterization of the noise and how it is different from the desired signal (frequency bands, time, external gating function, etc.) for this to be tractable at all.
You can't just use AVAudioRecorder metering.
You can measure the noise level when no one is speaking (either ask for silence or simply select the lowest measured level) and then subtract it from the instantaneous level.
Or you can use an FFT to attempt to filter out the background noise by selecting only "voice" frequencies (no success guaranteed).
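A minimal sketch of the first approach, as a variant of the levelTimerCallback: above; noiseFloor is a hypothetical instance variable (initialized to 0.0) that simply tracks the quietest level seen so far, and the 20 dB threshold is an arbitrary starting point:

- (void)levelTimerCallback:(NSTimer *)timer
{
    [recorder updateMeters];
    float peakPower = [recorder peakPowerForChannel:0];

    // Calibration: treat the lowest level seen so far as the background noise floor.
    if (peakPower < noiseFloor) {
        noiseFloor = peakPower;
    }

    // Compare against the level relative to the noise floor, not the absolute level.
    float levelAboveNoise = peakPower - noiseFloor;
    if (levelAboveNoise > 20.0) {
        // Likely voice: the signal is well above the measured background noise.
    }
}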
Is there a way to determine the bit rate of the stream an MPMoviePlayerController is playing?
I am programming in Objective-C on iOS.
You can get the indicated bit rate from the event, which is the bit rate of the stream according to the m3u8. To calculate the actual bit rate, I divide event.numberOfBytesTransferred by event.durationWatched and multiply by 8.
NSArray *events = self.player.accessLog.events;
MPMovieAccessLogEvent *event = (MPMovieAccessLogEvent *)[events lastObject];
double calculatedBitRate = 8 * event.numberOfBytesTransferred / event.durationWatched;
// nf is an NSNumberFormatter created elsewhere.
value = [nf stringFromNumber:[NSNumber numberWithDouble:calculatedBitRate]];
self.calculatedBitRateLabel.text = [NSString stringWithFormat:@"My calculated bit rate = %@", value];
Found it: the "accessLog" gives you periodic stats, which include the observed bit rate:
MPMovieAccessLogEvent *evt = nil;
MPMovieAccessLog *accessL = [moviePlayer accessLog];
NSArray *events = accessL.events;
for (int i = 0; i < [events count]; i++) {
    evt = [events objectAtIndex:i];
}
return evt.observedBitrate;
First time asking a question here. I'm hoping the post is clear and sample code is formatted correctly.
I'm experimenting with AVFoundation and time lapse photography.
My intent is to grab every Nth frame from the video camera of an iOS device (my iPod touch, version 4) and write each of those frames out to a file to create a timelapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter and AVAssetWriterInput.
The problem is, if I use the CMSampleBufferRef passed to captureOutput:didOutputSampleBuffer:fromConnection:, the playback of each frame is the length of time between original input frames, i.e. a frame rate of, say, 1fps. I'm looking to get 30fps.
I've tried using CMSampleBufferCreateCopyWithNewTiming(), but then after 13 frames are written to the file, captureOutput:didOutputSampleBuffer:fromConnection: stops being called. The interface is active and I can tap a button to stop the capture and save it to the photo library for playback. It appears to play back as I want, at 30fps, but it only has those 13 frames.
How can I accomplish my goal of 30fps playback?
How can I tell where the app is getting lost and why?
I've placed a flag called useNativeTime so I can test both cases. When set to YES, I get all frames I'm interested in as the callback doesn't 'get lost'. When I set that flag to NO, I only ever get 13 frames processed and am never returned to that method again. As mentioned above, in both cases I can playback the video.
Thanks for any help.
Here is where I'm trying to do the retiming.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL useNativeTime = NO;
    BOOL appendSuccessFlag = NO;

    //NSLog(@"in captureOutput sample buffer method");
    if( !CMSampleBufferDataIsReady(sampleBuffer) )
    {
        NSLog( @"sample buffer is not ready. Skipping sample" );
        //CMSampleBufferInvalidate(sampleBuffer);
        return;
    }

    if (! [inputWriterBuffer isReadyForMoreMediaData])
    {
        NSLog(@"Not ready for data.");
    }
    else {
        // Write every first frame of n frames (30 native from camera).
        intervalFrames++;
        if (intervalFrames > 30) {
            intervalFrames = 1;
        }
        else if (intervalFrames != 1) {
            //CMSampleBufferInvalidate(sampleBuffer);
            return;
        }

        // Need to initialize start session time.
        if (writtenFrames < 1) {
            if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            else imageSourceTime = CMTimeMake( 0 * 20, 600); //CMTimeMake(1,30);
            [outputWriter startSessionAtSourceTime: imageSourceTime];
            NSLog(@"Starting CMtime");
            CMTimeShow(imageSourceTime);
        }

        if (useNativeTime) {
            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTimeShow(imageSourceTime);
            // CMTime myTiming = CMTimeMake(writtenFrames * 20, 600);
            // CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); // Tried but has no effect.
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer];
        }
        else {
            CMSampleBufferRef newSampleBuffer;
            CMSampleTimingInfo sampleTimingInfo;
            sampleTimingInfo.duration = CMTimeMake(20, 600);
            sampleTimingInfo.presentationTimeStamp = CMTimeMake( (writtenFrames + 0) * 20, 600);
            sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;
            OSStatus myStatus;

            //NSLog(@"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer) );
            myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                             sampleBuffer,
                                                             1,
                                                             &sampleTimingInfo, // maybe a little confused on this param.
                                                             &newSampleBuffer);
            // These confirm the good health of our newSampleBuffer.
            if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i", myStatus);
            if (! CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!");

            // No effect.
            //myStatus = CMSampleBufferMakeDataReady(newSampleBuffer); // How is this different; CMSampleBufferSetDataReady ?
            //if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i", myStatus);

            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer);
            CMTimeShow(imageSourceTime);
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
            //CMSampleBufferInvalidate(sampleBuffer); // Docs don't describe what this actually does. Doesn't seem to affect my problem. Used with CMSampleBufferSetInvalidateCallback maybe?
            //CFRelease(sampleBuffer); // - Not surprisingly - “EXC_BAD_ACCESS”
        }

        if (!appendSuccessFlag)
        {
            NSLog(@"Failed to append pixel buffer");
        }
        else {
            writtenFrames++;
            NSLog(@"writtenFrames: %i", writtenFrames);
        }
    }
    //[self displayOuptutWritterStatus]; // Expect and see AVAssetWriterStatusWriting.
}
My setup routine.
- (IBAction) recordingStartStop: (id) sender
{
    NSError *error;

    if (self.isRecording) {
        NSLog(@"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~");
        self.isRecording = NO;
        [recordingStarStop setTitle: @"Record" forState: UIControlStateNormal];

        //[self.captureSession stopRunning];
        [inputWriterBuffer markAsFinished];
        [outputWriter endSessionAtSourceTime:imageSourceTime];
        [outputWriter finishWriting]; // Blocks until file is completely written, or an error occurs.
        NSLog(@"finished CMtime");
        CMTimeShow(imageSourceTime);

        // Really, I should loop through the outputs and close all of them or target specific ones.
        // Since I'm only recording video right now, I feel safe doing this.
        [self.captureSession removeOutput: [[self.captureSession outputs] objectAtIndex: 0]];
        [videoOutput release];
        [inputWriterBuffer release];
        [outputWriter release];
        videoOutput = nil;
        inputWriterBuffer = nil;
        outputWriter = nil;

        NSLog(@"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~");
        NSLog(@"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum.");
        NSLog(@"filePath: %@", [projectPaths movieFilePath]);
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) {
            NSLog(@"Calling UISaveVideoAtPathToSavedPhotosAlbum.");
            UISaveVideoAtPathToSavedPhotosAlbum([projectPaths movieFilePath], self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }
        NSLog(@"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~");
    }
    else {
        NSLog(@"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~");
        projectPaths = [[ProjectPaths alloc] initWithProjectFolder: @"TestProject"];
        intervalFrames = 30;

        videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        NSMutableDictionary *cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease];
        NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
        NSNumber *value = [NSNumber numberWithUnsignedInt: kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
        [cameraVideoSettings setValue: value forKey: key];
        [videoOutput setVideoSettings: cameraVideoSettings];
        [videoOutput setMinFrameDuration: CMTimeMake(20, 600)]; //CMTimeMake(1, 30)]; // 30fps
        [videoOutput setAlwaysDiscardsLateVideoFrames: YES];

        queue = dispatch_queue_create("cameraQueue", NULL);
        [videoOutput setSampleBufferDelegate: self queue: queue];
        dispatch_release(queue);

        NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [outputSettings setValue: AVVideoCodecH264 forKey: AVVideoCodecKey];
        [outputSettings setValue: [NSNumber numberWithInt: 1280] forKey: AVVideoWidthKey]; // currently assuming
        [outputSettings setValue: [NSNumber numberWithInt: 720] forKey: AVVideoHeightKey];
        NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [compressionSettings setValue: AVVideoProfileLevelH264Main30 forKey: AVVideoProfileLevelKey];
        //[compressionSettings setValue: [NSNumber numberWithDouble:1024.0*1024.0] forKey: AVVideoAverageBitRateKey];
        [outputSettings setValue: compressionSettings forKey: AVVideoCompressionPropertiesKey];

        inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings];
        [inputWriterBuffer retain];
        inputWriterBuffer.expectsMediaDataInRealTime = YES;

        outputWriter = [AVAssetWriter assetWriterWithURL: [projectPaths movieURLPath] fileType: AVFileTypeQuickTimeMovie error: &error];
        [outputWriter retain];
        if (error) NSLog(@"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:]");
        if ([outputWriter canAddInput: inputWriterBuffer]) [outputWriter addInput: inputWriterBuffer];
        else NSLog(@"cannot add input");
        if (![outputWriter canApplyOutputSettings: outputSettings forMediaType: AVMediaTypeVideo]) NSLog(@"outputSettings are NOT supported");

        if ([captureSession canAddOutput: videoOutput]) [self.captureSession addOutput: videoOutput];
        else NSLog(@"could not addOutput: videoOutput to captureSession");
        //[self.captureSession startRunning];

        self.isRecording = YES;
        [recordingStarStop setTitle: @"Stop" forState: UIControlStateNormal];
        writtenFrames = 0;
        imageSourceTime = kCMTimeZero;
        [outputWriter startWriting];
        //[outputWriter startSessionAtSourceTime: imageSourceTime];
        NSLog(@"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~");
        NSLog(@"recording to fileURL: %@", [projectPaths movieURLPath]);
    }

    NSLog(@"isRecording: %@", self.isRecording ? @"YES" : @"NO");
    [self displayOuptutWritterStatus];
}
OK, I found the bug in my first post.
When using
myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                 sampleBuffer,
                                                 1,
                                                 &sampleTimingInfo,
                                                 &newSampleBuffer);
you need to balance that with a CFRelease(newSampleBuffer);
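In context, the end of the retiming branch then looks roughly like this (append first, then release; the NULL check guards against a failed copy):

if (myStatus == 0 && newSampleBuffer != NULL) {
    appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
    CFRelease(newSampleBuffer); // Balances the +1 ownership from the Create call.
}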
The same idea holds true when using a CVPixelBufferRef with the pixelBufferPool of an AVAssetWriterInputPixelBufferAdaptor instance. You would use CVPixelBufferRelease(yourCVPixelBufferRef); after calling the appendPixelBuffer:withPresentationTime: method.
Hope this is helpful to someone else.
With a little more searching and reading I have a working solution. I don't know whether it is the best method, but so far, so good.
In my setup area I've set up an AVAssetWriterInputPixelBufferAdaptor. The code addition looks like this:
inputWriterBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor
                            assetWriterInputPixelBufferAdaptorWithAssetWriterInput: inputWriterBuffer
                            sourcePixelBufferAttributes: nil];
[inputWriterBufferAdaptor retain];
For completeness, to understand the code below, I also have these three lines in the setup method.
fpsOutput = 30; //Some possible values: 30, 10, 15, 24, 25, 30/1.001 or 29.97;
cmTimeSecondsDenominatorTimescale = 600 * 100000; //To more precisely handle 29.97.
cmTimeNumeratorValue = cmTimeSecondsDenominatorTimescale / fpsOutput;
Instead of applying a retiming to a copy of the sample buffer, I now have the following three lines of code that effectively do the same thing. Notice the withPresentationTime parameter for the adaptor. By passing my custom value to that, I gain the correct timing I'm seeking.
CVPixelBufferRef myImage = CMSampleBufferGetImageBuffer( sampleBuffer );
imageSourceTime = CMTimeMake( writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: myImage withPresentationTime: imageSourceTime];
Use of the AVAssetWriterInputPixelBufferAdaptor.pixelBufferPool property may have some gains, but I haven't figured that out.
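If you do want to experiment with the pool, my understanding (untested here, so treat it as a sketch) is that you allocate the destination buffer from the adaptor's pool after startWriting has been called, fill it with your frame, append it, and then release it:

// Hypothetical use of the adaptor's pixel buffer pool; the pool is only non-nil after [outputWriter startWriting].
CVPixelBufferRef poolBuffer = NULL;
CVReturn result = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     inputWriterBufferAdaptor.pixelBufferPool,
                                                     &poolBuffer);
if (result == kCVReturnSuccess && poolBuffer != NULL) {
    // ...copy or render the frame into poolBuffer here...
    appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: poolBuffer withPresentationTime: imageSourceTime];
    CVPixelBufferRelease(poolBuffer); // Balances the Create call.
}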