I have an audio file in my iPhone app that I convert to an NSData object. Ideally, I would like to get an array of doubles from the audio file. Is there a way to convert NSData to an array of doubles?
Here is the current output from the line NSLog(@"%@\n", data), where data is an NSData object containing the audio file:
<0000001c 66747970 6d703432 00000001 6d703431 6d703432 69736f6d 00000008 77696465 004d956e 6d646174 21000340 681c210c 53ed990c 1f33e94d ab588b95 55a61078 08799c67 f1f706cc 595b4eb6 08cfb807 ea0e3c40 03c13303 e674e05a...
The -bytes method returns the raw bytes of the NSData. You can then cast the result to a pointer to double (assuming the buffer really contains native-endian doubles):
const double* array_of_doubles = (const double*)[data bytes];
NSUInteger size_of_array = [data length] / sizeof(double);
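In plain C, the same reinterpretation looks like this; it is only meaningful if the buffer actually holds native-endian IEEE-754 doubles and is suitably aligned (the helper name is just for illustration):

```c
#include <stddef.h>

/* View a raw byte buffer as an array of doubles.
   Only valid if the bytes really are native-endian doubles;
   a compressed container such as MP4 is not. */
size_t doubles_in_buffer(const void *bytes, size_t length,
                         const double **out) {
    *out = (const double *)bytes;
    return length / sizeof(double);
}
```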
Edit: The data is an MP4 file. MP4 is a compressed container format, so you cannot convert it to a meaningful array of doubles directly. I don't know exactly what you want to do, but AVAssetReader may help you decode the audio samples.
I have a stream of H.264/AVC NALs consisting of types 1 (P frame), 5 (I frame), 7 (SPS), and 8 (PPS). I want to write them into an .mov file without re-encoding. I'm attempting to use AVAssetWriter to do this. The documentation for AVAssetWriterInput states:
Passing nil for outputSettings instructs the input to pass through appended samples, doing no processing before they are written to the output file. This is useful if, for example, you are appending buffers that are already in a desirable compressed format. However, passthrough is currently supported only when writing to QuickTime Movie files (i.e. the AVAssetWriter was initialized with AVFileTypeQuickTimeMovie). For other file types, you must specify non-nil output settings.
I'm trying to create CMSampleBuffers out of these NALs and append them to the asset writer input, but I am unable to input the data in a way that yields a well-formed .mov file, and I can't find any clue anywhere on how to do this.
The best result I've gotten so far was passing in the NALs in Annex B byte stream format (in the order 7 8 5 1 1 1....repeating) and playing the result in VLC. Because of this, I know the NALs contain valid data, but because the .mov file did not have an avcC atom and the mdat atom was filled with an Annex B byte stream, QuickTime will not play the video.
Now I'm trying to pass in the NALs with a 4-byte (as specified by the lengthSizeMinusOne field) length field instead of the Annex B delimiter, which is how they're supposed to be packed into the mdat atom as far as I know.
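For reference, that repacking step can be sketched in plain C as follows (assuming a 4-byte Annex B start code; real streams may also use 3-byte start codes, which this sketch ignores):

```c
#include <stdint.h>
#include <string.h>

/* Rewrite one NAL from Annex B framing (00 00 00 01 start code) to
   MP4/MOV framing (4-byte big-endian length prefix).
   Returns the number of bytes written to out, or 0 if no start code. */
size_t annexb_to_avcc(const uint8_t *nal, size_t len, uint8_t *out) {
    static const uint8_t startcode[4] = {0, 0, 0, 1};
    if (len < 5 || memcmp(nal, startcode, 4) != 0)
        return 0;
    uint32_t payload = (uint32_t)(len - 4);
    out[0] = (uint8_t)(payload >> 24);   /* big-endian length field */
    out[1] = (uint8_t)(payload >> 16);
    out[2] = (uint8_t)(payload >> 8);
    out[3] = (uint8_t)(payload);
    memcpy(out + 4, nal + 4, payload);
    return payload + 4;
}
```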
I am at a loss for how to get the asset writer to write an avcC atom. Every sample I append just gets shoved into the mdat atom.
Does anyone know how I can pass raw H.264 data into an AVAssetWriterInput configured for pass through (nil outputSettings) and have it generate a properly formed QuickTime file?
I submitted a TSI (Technical Support Incident) with Apple and found the answer. I hope this saves someone time in the future.
Each CMSampleBuffer has an associated CMFormatDescription, which describes the data in the sample buffer.
The function prototype for creating the format description is as follows:
OSStatus CMVideoFormatDescriptionCreate (
CFAllocatorRef allocator,
CMVideoCodecType codecType,
int32_t width,
int32_t height,
CFDictionaryRef extensions,
CMVideoFormatDescriptionRef *outDesc
);
I learned, from the Apple technician, that I can use the extensions argument to pass in a dictionary containing the avcC atom data.
The extensions dictionary should be of the following form:
[kCMFormatDescriptionExtension_SampleDescriptionExtensionAtoms ---> ["avcC" ---> <avcC Data>]]
The []'s represent dictionaries. This dictionary can potentially be used to pass in data for arbitrary atoms aside from avcC.
Here is the code I used to create the extensions dictionary that I pass into CMVideoFormatDescriptionCreate:
const char *avcC = "avcC";
const CFStringRef avcCKey = CFStringCreateWithCString(kCFAllocatorDefault, avcC, kCFStringEncodingUTF8);
const CFDataRef avcCValue = CFDataCreate(kCFAllocatorDefault, [_avccData bytes], [_avccData length]);
const void *atomDictKeys[] = { avcCKey };
const void *atomDictValues[] = { avcCValue };
CFDictionaryRef atomsDict = CFDictionaryCreate(kCFAllocatorDefault, atomDictKeys, atomDictValues, 1, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
const void *extensionDictKeys[] = { kCMFormatDescriptionExtension_SampleDescriptionExtensionAtoms };
const void *extensionDictValues[] = { atomsDict };
CFDictionaryRef extensionDict = CFDictionaryCreate(kCFAllocatorDefault, extensionDictKeys, extensionDictValues, 1, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
My problem is the following: I would like to encapsulate an NSString in an NSMutableData object, together with other items, and without first wrapping it in an NSData. It's only bytes after all, isn't it?
My final NSMutableData object would look something like
[header | stringLength | NSString]
where header is a char and stringLength is an unsigned short.
I build my packet like this
unsigned short stringLength = myString.length;
NSMutableData* nData = [NSMutableData dataWithBytes:(const void*)&header length:sizeof(char)];
[nData appendBytes:(const void*)&stringLength length:sizeof(unsigned short)];
[nData appendBytes:(const void*)myString length:stringLength];
I would then send this over a GKSession, and at the other end I would extract the string length and then the string itself:
NSString* temp = [NSString alloc];
[data getBytes:(void*)&temp range:NSMakeRange(sizeof(char)+sizeof(unsigned short), stringLength)];
For some reason this gives me a bad memory access. I suspect that myString.length is not doing exactly what I expect. Do you have any hints? Thanks in advance.
This line is incorrect:
[nData appendBytes:(const void*)myString length:stringLength];
This encodes the first part of the underlying NSString structure (which is larger than stringLength), not the string's characters.
What you mean is:
[nData appendBytes:[myString UTF8String]
length:[myString lengthOfBytesUsingEncoding:NSUTF8StringEncoding]];
length is the number of characters. This can be substantially smaller than the number of bytes for multibyte characters.
Just as a side note: if you can shorten your length to 1 byte (0-255), then that encoding is called a Pascal String, and CFString can handle that encoding natively (see CFStringGetPascalString()). Not something you generally want to do, but interesting. It's particularly nice with CFStringCreateWithPascalStringNoCopy(), since you can avoid the memory copy operation entirely. This is mostly for legacy support, and I wouldn't jump through any hoops to use it, but sometimes it's handy.
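The [header | length | bytes] layout described above can be sketched in plain C; note that the length field counts UTF-8 bytes, not characters, and is written in native endianness just like the appendBytes: call in the question (the helper names are illustrative):

```c
#include <stdint.h>
#include <string.h>

/* Pack [1-byte header | 2-byte byte count | UTF-8 bytes] into buf.
   Returns the total number of bytes written. */
size_t pack_string(uint8_t header, const char *utf8, uint8_t *buf) {
    uint16_t len = (uint16_t)strlen(utf8);   /* byte count, not characters */
    buf[0] = header;
    memcpy(buf + 1, &len, sizeof len);       /* native endianness */
    memcpy(buf + 3, utf8, len);
    return 3 + (size_t)len;
}

/* Unpack the string portion back out; out must hold len + 1 bytes.
   Returns the byte count read from the length field. */
uint16_t unpack_string(const uint8_t *buf, char *out) {
    uint16_t len;
    memcpy(&len, buf + 1, sizeof len);
    memcpy(out, buf + 3, len);
    out[len] = '\0';
    return len;
}
```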
I'm trying to create a simple string-decompression routine for my app.
/*
Decompresses the source buffer into the destination buffer. sourceLen is
the byte length of the source buffer. Upon entry, destLen is the total size
of the destination buffer, which must be large enough to hold the entire
uncompressed data. (The size of the uncompressed data must have been saved
previously by the compressor and transmitted to the decompressor by some
mechanism outside the scope of this compression library.) Upon exit, destLen
is the actual size of the uncompressed buffer.
uncompress returns Z_OK if success, Z_MEM_ERROR if there was not
enough memory, Z_BUF_ERROR if there was not enough room in the output
buffer, or Z_DATA_ERROR if the input data was corrupted or incomplete.
*/
[Base64 initialize];
NSData * data = [Base64 decode:@"MDAwMDAwNTB42vPMVkhKzVNIBeLsnNTMPB0IpVCWWZyVqpAJkalKTVUoS8xTSMpJLC0HALWrEYi="];
NSString * deBase64 = [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding];
int lengteOP = [[deBase64 substringWithRange:NSMakeRange(0,8)] intValue];
NSUInteger lengteIP = [deBase64 length];
const unsigned char *input = (const unsigned char *) [[deBase64 substringFromIndex:8] cStringUsingEncoding:NSASCIIStringEncoding];
unsigned char * dest;
uncompress(dest, lengteOP, input, lengteIP);
I get an EXC_BAD_ACCESS error when I try this.
The string is built with Delphi code using ZLib, the same library as in the iPhone SDK.
It's a Base64-encoded string whose first 8 characters give the length of the original string, followed by the zlib-compressed data.
I did not run the code, but your issue is likely dest. Here is a snippet from the documentation:
Upon entry, destLen is the total size
of the destination buffer, which must
be large enough to hold the entire
uncompressed data.
The destination buffer must be allocated before calling the function; otherwise uncompress writes through an uninitialized pointer, causing the EXC_BAD_ACCESS.
Try the following:
uLongf destLen = lengteOP;   // expected size of the uncompressed data
unsigned char * dest = malloc(sizeof(unsigned char) * destLen);
uncompress(dest, &destLen, input, lengteIP);
//Use dest (create an NSString with the proper encoding, for example)
free(dest);
Edit: finished code
/*
Decompresses the source buffer into the destination buffer. sourceLen is
the byte length of the source buffer. Upon entry, destLen is the total size
of the destination buffer, which must be large enough to hold the entire
uncompressed data. (The size of the uncompressed data must have been saved
previously by the compressor and transmitted to the decompressor by some
mechanism outside the scope of this compression library.) Upon exit, destLen
is the actual size of the uncompressed buffer.
uncompress returns Z_OK if success, Z_MEM_ERROR if there was not
enough memory, Z_BUF_ERROR if there was not enough room in the output
buffer, or Z_DATA_ERROR if the input data was corrupted or incomplete.
ZEXTERN int ZEXPORT uncompress OF((Bytef *dest, uLongf *destLen,
const Bytef *source, uLong sourceLen));
*/
[Base64 initialize];
NSData * data = [Base64 decode:@"MDAwMDAwNTB42vPMVkhKzVNIBeLsnNTMPB0IpVCWWZyVqpAJkalKTVUoS8xTSMpJLC0HALWrEYi="];
NSString * deBase64 = [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding];
uLongf lengthOriginal = (uLongf)[[deBase64 substringWithRange:NSMakeRange(0, 8)] intValue];
uLongf destLen = lengthOriginal;
NSString * codedString = [deBase64 substringFromIndex:8];
const unsigned char *input = (const unsigned char *)[codedString cStringUsingEncoding:NSISOLatin1StringEncoding];
NSUInteger lengteIP = [codedString lengthOfBytesUsingEncoding:NSISOLatin1StringEncoding];
unsigned char * dest = malloc(sizeof(unsigned char) * (lengthOriginal + 1));
uncompress(dest, &destLen, input, lengteIP);
dest[destLen] = '\0';
NSLog(@"Decoded string: %@", [NSString stringWithCString:(const char *)dest encoding:NSASCIIStringEncoding]);
free(dest);
When I use this code to generate a SHA-256 hash in my iPhone app:
unsigned char hashedChars[32];
NSString *inputString;
inputString = [NSString stringWithFormat:@"hello"];
CC_SHA256([inputString UTF8String],
[inputString lengthOfBytesUsingEncoding:NSASCIIStringEncoding ],
hashedChars);
NSData * hashedData = [NSData dataWithBytes:hashedChars length:32];
The SHA-256 of inputString is created correctly, but if I use a string like @"\x00\x25\x53\b4", the hash is different from the one for the real string with the "\x" characters.
I think the problem is the UTF-8 encoding instead of ASCII.
Thanks!
I would be suspicious of the first character, "\x00": that's going to terminate anything that thinks it's dealing with regular C strings.
I'm not sure whether lengthOfBytesUsingEncoding: takes that into account, but it's something I'd experiment with.
You're getting the bytes with [inputString UTF8String] but the length with [inputString lengthOfBytesUsingEncoding:NSASCIIStringEncoding]. This is obviously wrong. Moreover (assuming you mean "\xB4" and that it turns into something not in ASCII), "\xB4" is not likely to be in ASCII. The docs for NSString say
Returns 0 if the specified encoding cannot be used to convert the receiver
So you're calculating the hash of the empty string. Of course it's wrong.
You're less likely to have problems if you only generate the data once:
NSData * inputData = [inputString dataUsingEncoding:NSUTF8StringEncoding];
CC_SHA256(inputData.bytes, (CC_LONG)inputData.length, hashedChars);
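The "\x00" issue above is easy to demonstrate in plain C: strlen stops at the first NUL byte, so any API that infers length C-string-style sees an empty string (the helper name is just for illustration):

```c
#include <string.h>

/* Returns the length a C-string API would "see" for a buffer that may
   contain embedded NUL bytes; the real byte count must be tracked
   separately. */
size_t c_visible_length(const char *bytes) {
    return strlen(bytes);   /* stops at the first '\0' */
}
```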
I am using following code to get bytes array. thx to this post.
NSData *data = [NSData dataWithContentsOfFile:filePath];
NSUInteger len = [data length];
Byte *byteData = (Byte*)malloc(len);
memcpy(byteData, [data bytes], len);
Is this the right method to do so?
Now, how can I pass the byte array in a URL?
Thank you for the help.
You're using the right method to extract the raw bytes from the data. To get them into a URL, you'll need to convert them to a string. Exactly what string depends on the format the server expects (i.e., it could be a list of 1s and 0s, YES and NO, or any other characters required by the server you're talking to).
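One common choice is hex encoding, which is URL-safe as-is; a minimal sketch in plain C (many services instead expect Base64 or percent-encoding, so check what your server wants):

```c
#include <stdio.h>
#include <string.h>

/* Hex-encode raw bytes into a NUL-terminated string (2 chars per byte).
   out must have room for 2 * len + 1 bytes. */
void bytes_to_hex(const unsigned char *bytes, size_t len, char *out) {
    for (size_t i = 0; i < len; i++)
        sprintf(out + 2 * i, "%02x", bytes[i]);
    out[2 * len] = '\0';
}
```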