Update 4
Per Greg's suggestion, I've created one image/text pair that shows the output from base64-encoding a 37 KB image using 100 KB chunks. Since the file is only 37 KB, it's safe to say the loop only iterated once, so nothing was appended. The other pair shows the output from base64-encoding the same 37 KB image using 10 KB chunks. Since the file is 37 KB, the loop iterated four times, and data was definitely appended.
Doing a diff on the two files shows that in the 10 KB chunk file there's a large difference that begins on line 214 and ends on line 640.
Small Image (37k) - 100k Chunks - Image output
Small Image (37k) - 100k Chunks - Base64 Text output
Small Image (37k) - 10k Chunks - Image output
Small Image (37k) - 10k Chunks - Base64 Text output
Update 3
Here's where my code is now. Cleaned up a bit but still producing the same effect:
// Read data in chunks from the original file
[originalFile seekToEndOfFile];
NSUInteger fileLength = [originalFile offsetInFile];
[originalFile seekToFileOffset:0];
NSUInteger chunkSize = 100 * 1024;
NSUInteger offset = 0;
while(offset < fileLength) {
NSData *chunk = [originalFile readDataOfLength:chunkSize];
offset += chunkSize;
// Convert the chunk to a base64 encoded string and back into NSData
NSString *base64EncodedChunkString = [chunk base64EncodedString];
NSData *base64EncodedChunk = [base64EncodedChunkString dataUsingEncoding:NSASCIIStringEncoding];
// Write the encoded chunk to our output file
[encodedFile writeData:base64EncodedChunk];
// Cleanup
base64EncodedChunkString = nil;
base64EncodedChunk = nil;
// Update progress bar
[self updateProgress:[NSNumber numberWithInt:offset] total:[NSNumber numberWithInt:fileLength]];
}
Update 2
So it looks like files larger than 100 KB get scrambled, but files under 100 KB are fine. It's obvious that something is off in my buffer handling/math, but I'm lost on this one. Might be time to call it a day, but I'd love to go to sleep with this one resolved.
Here's an example:
Update 1
After doing some testing I have found that the same code will work fine for a small image, but will not work for a large image or video of any size. Definitely looks like a buffer issue, right?
Hey there, I'm trying to base64-encode a large file by looping through it and encoding one small chunk at a time. Everything seems to work, but the files always end up corrupted. I was curious if anyone could point out where I might be going wrong here:
NSFileHandle *originalFile, *encodedFile;
self.localEncodedURL = [NSString stringWithFormat:@"%@-base64.xml", self.localURL];
// Open the original file for reading
originalFile = [NSFileHandle fileHandleForReadingAtPath:self.localURL];
if (originalFile == nil) {
[self performSelectorOnMainThread:@selector(updateStatus:) withObject:@"Encoding failed." waitUntilDone:NO];
return;
}
encodedFile = [NSFileHandle fileHandleForWritingAtPath:self.localEncodedURL];
if (encodedFile == nil) {
[self performSelectorOnMainThread:@selector(updateStatus:) withObject:@"Encoding failed." waitUntilDone:NO];
return;
}
// Read data in chunks from the original file
[originalFile seekToEndOfFile];
NSUInteger length = [originalFile offsetInFile];
[originalFile seekToFileOffset:0];
NSUInteger chunkSize = 100 * 1024;
NSUInteger offset = 0;
do {
NSUInteger thisChunkSize = length - offset > chunkSize ? chunkSize : length - offset;
NSData *chunk = [originalFile readDataOfLength:thisChunkSize];
offset += [chunk length];
NSString *base64EncodedChunkString = [chunk base64EncodedString];
NSData *base64EncodedChunk = [base64EncodedChunkString dataUsingEncoding:NSASCIIStringEncoding];
[encodedFile writeData:base64EncodedChunk];
base64EncodedChunkString = nil;
base64EncodedChunk = nil;
} while (offset < length);
I wish I could give credit to GregInYEG, because his original point about padding was the underlying issue. With base64, the size of each chunk has to be a multiple of 3 bytes; otherwise every chunk except the last gets its own '=' padding, and that padding ends up scattered through the middle of the output, corrupting it. So this resolved the issue:
chunkSize = 3600
Once I had that, the corruption went away. But then I ran into memory leak issues, so I added the autorelease pool approach taken from this post: http://www.cocoadev.com/index.pl?ReadAFilePieceByPiece
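To see Greg's padding point in isolation, here's a minimal sketch (it uses the system base64EncodedStringWithOptions: API from OS X 10.9 / iOS 7, not the base64EncodedString category in the code above) of what goes wrong when a chunk isn't a multiple of 3 bytes:
NSData *whole = [@"abcd" dataUsingEncoding:NSASCIIStringEncoding];
NSData *part1 = [@"ab" dataUsingEncoding:NSASCIIStringEncoding];
NSData *part2 = [@"cd" dataUsingEncoding:NSASCIIStringEncoding];
// Encoding the whole buffer at once gives "YWJjZA==", but encoding two 2-byte
// chunks and concatenating gives "YWI=Y2Q=", because each short chunk carries
// its own '=' padding in the middle of the stream.
NSString *oneShot = [whole base64EncodedStringWithOptions:0];
NSString *chunked = [[part1 base64EncodedStringWithOptions:0] stringByAppendingString:[part2 base64EncodedStringWithOptions:0]];
NSLog(@"%@ vs. %@", oneShot, chunked); // the chunked version will not decode back to "abcd"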
Final code:
// Read data in chunks from the original file
[originalFile seekToEndOfFile];
NSUInteger fileLength = [originalFile offsetInFile];
[originalFile seekToFileOffset:0];
// For base64, each chunk *MUST* be a multiple of 3
NSUInteger chunkSize = 24000;
NSUInteger offset = 0;
NSAutoreleasePool *chunkPool = [[NSAutoreleasePool alloc] init];
while(offset < fileLength) {
// Read the next chunk from the input file
[originalFile seekToFileOffset:offset];
NSData *chunk = [originalFile readDataOfLength:chunkSize];
// Update our offset
offset += chunkSize;
// Base64 encode the input chunk
NSData *serializedChunk = [NSPropertyListSerialization dataFromPropertyList:chunk format:NSPropertyListXMLFormat_v1_0 errorDescription:NULL];
NSString *serializedString = [[NSString alloc] initWithData:serializedChunk encoding:NSASCIIStringEncoding];
NSRange r = [serializedString rangeOfString:@"<data>"];
serializedString = [serializedString substringFromIndex:r.location+7];
r = [serializedString rangeOfString:@"</data>"];
serializedString = [serializedString substringToIndex:r.location-1];
// Write the base64 encoded chunk to our output file
NSData *base64EncodedChunk = [serializedString dataUsingEncoding:NSASCIIStringEncoding];
[encodedFile truncateFileAtOffset:[encodedFile seekToEndOfFile]];
[encodedFile writeData:base64EncodedChunk];
// Cleanup
base64EncodedChunk = nil;
serializedChunk = nil;
serializedString = nil;
chunk = nil;
// Update the progress bar
[self updateProgress:[NSNumber numberWithInt:offset] total:[NSNumber numberWithInt:fileLength]];
// Drain and recreate the pool
[chunkPool release];
chunkPool = [[NSAutoreleasePool alloc] init];
}
[chunkPool release];
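As an aside, on SDKs where NSData has built-in base64 support (OS X 10.9 / iOS 7 and later), the property-list-serialization detour isn't necessary; a hedged sketch of the equivalent loop body would be:
// Read the next chunk and encode it with NSData's built-in encoder.
NSData *chunk = [originalFile readDataOfLength:chunkSize];
// Because chunkSize is a multiple of 3, only the final (short) chunk can carry '=' padding.
NSData *base64EncodedChunk = [chunk base64EncodedDataWithOptions:0];
[encodedFile writeData:base64EncodedChunk];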
How are you converting back the base64 data to an image? Some implementations limit the maximum line length they will accept. Try inserting a line break every so many characters.
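If the decoder does impose a line-length limit, one way to wrap the output is a small helper like the sketch below (wrapBase64 is just an illustrative name, and the 76-character limit is an assumption; use whatever your decoder accepts):
// Wrap a base64 string into fixed-length lines (76 characters is the classic MIME limit).
NSString *wrapBase64(NSString *base64, NSUInteger lineLength) {
    NSMutableString *wrapped = [NSMutableString stringWithCapacity:[base64 length]];
    for (NSUInteger i = 0; i < [base64 length]; i += lineLength) {
        NSRange r = NSMakeRange(i, MIN(lineLength, [base64 length] - i));
        [wrapped appendString:[base64 substringWithRange:r]];
        [wrapped appendString:@"\n"];
    }
    return wrapped;
}
// Usage: NSString *wrapped = wrapBase64(base64EncodedChunkString, 76);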
Related
So what I'm doing is compressing an image repeatedly until it fits within a size limit. If it takes too many attempts, we give up and later tell the user the image is too large.
My issue is that after the image size has been reduced, its size goes WAY back up again when it's converted to a UIImage.
Here is the code for reducing the image size:
double compressionRatio = 1;
int resizeAttempts = 5;
NSData *imgData = UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], compressionRatio);
NSLog(@"Starting Size: %i", [imgData length]);
// Trying to push it below about 0.4 MB
while ([imgData length] > 400000 && resizeAttempts > 0) {
    resizeAttempts -= 1;
    NSLog(@"Image was bigger than 400000 Bytes. Resizing.");
    NSLog(@"%i Attempts Remaining", resizeAttempts);
    // Increase the compression amount
    compressionRatio = compressionRatio * 0.5;
    // Test size before compression
    NSLog(@"Current Size: %i", [imgData length]);
    imgData = UIImageJPEGRepresentation([info objectForKey:@"UIImagePickerControllerOriginalImage"], compressionRatio);
    // Test size after compression
    NSLog(@"New Size: %i", [imgData length]);
}
// Set image to the compressed version
savedImage = [UIImage imageWithData:imgData];
// Check how big the image is now it's been compressed and put into the UIImageView
NSData *endData = UIImageJPEGRepresentation(savedImage, 1.0);
NSLog(@"Ending Size: %i", [endData length]);
OK, now here is the console report:
Starting Image Size: 4076994
Image was bigger than 400000 Bytes. Resizing.
4 Attempts Remaining
Current Size: 4076994
New Compressed Size: 844482
Image was bigger than 400000 Bytes. Resizing.
3 Attempts Remaining
Current Size: 844482
New Compressed Size: 357459
Ending Image Size: 2090332
As you can see, the image started out at about 4 MB. It was compressed down to about 350 KB, but in the end it came back out of the UIImage at about 2 MB.
Thanks in advance to anyone who can make any sense of this.
I have modified your function a bit and got a result; try it. It reduces my 10 MB image to 3 MB.
-(void)compressImage
{
    UIImage *largeImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    double compressionRatio = 1;
    int resizeAttempts = 5;
    NSData *imgData = UIImageJPEGRepresentation(largeImage, compressionRatio);
    NSLog(@"Starting Size: %i", [imgData length]);
    // Trying to push it below about 0.4 MB
    while ([imgData length] > 400000 && resizeAttempts > 0) {
        resizeAttempts -= 1;
        NSLog(@"Image was bigger than 400000 Bytes. Resizing.");
        NSLog(@"%i Attempts Remaining", resizeAttempts);
        // Increase the compression amount
        compressionRatio = compressionRatio * 0.5;
        NSLog(@"compressionRatio %f", compressionRatio);
        // Test size before compression
        NSLog(@"Current Size: %i", [imgData length]);
        imgData = UIImageJPEGRepresentation(largeImage, compressionRatio);
        // Test size after compression
        NSLog(@"New Size: %i", [imgData length]);
    }
    // Set image to the compressed version
    savedImage = [UIImage imageWithData:imgData];
    // Check how big the image is now it's been compressed and put into the UIImageView
    // *** I made a change here: you were storing it again at the highest quality ***
    NSData *endData = UIImageJPEGRepresentation(largeImage, compressionRatio);
    NSLog(@"Ending Size: %i", [endData length]);
    NSString *path = [self createPath:@"myImage.jpg"];
    NSLog(@"%@", path);
    [endData writeToFile:path atomically:YES];
}
-(NSString *)createPath:(NSString *)withFileName
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *path = [documentsDirectory stringByAppendingPathComponent:withFileName];
    return path;
}
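One small side note on the original symptom: imgData already is the compressed JPEG, so if you just want to know how big the saved image will be, you can check its length directly instead of round-tripping through UIImage at quality 1.0 (UIImage decodes to a full bitmap, and re-encoding it at the highest quality is what inflated the size). A minimal sketch:
savedImage = [UIImage imageWithData:imgData]; // keep the UIImage for display only
NSLog(@"Final compressed size: %lu bytes", (unsigned long)[imgData length]); // these are the bytes you would actually save or upload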
NSData * buffer = [fileHandle readDataOfLength:chunkSize];
while ([buffer length] > 0)
{
[streamBIG writeData:buffer];
offset += [buffer length];
[fileHandle seekToFileOffset:offset];
buffer = [fileHandle readDataOfLength:chunkSize];
}
I use this particular process to zip more than one file of 25–30 MB each, but it keeps growing memory: the live bytes continuously increase until all of the files have been written, which crashes my app.
Try it like this, with an autorelease pool:
NSData * buffer = [fileHandle readDataOfLength:chunkSize];
while ([buffer length] > 0)
{
@autoreleasepool
{
[streamBIG writeData:buffer];
offset += [buffer length];
[fileHandle seekToFileOffset:offset];
buffer = [fileHandle readDataOfLength:chunkSize];
}
}
it will work...
Can anyone suggest a method to read bytes from NSData (like the read function in @interface NSInputStream : NSStream)?
The question "How to read binary bytes in NSData?" may help you:
NSString *path = @"…put the path to your file here…";
NSData * fileData = [NSData dataWithContentsOfFile: path];
const char* fileBytes = (const char*)[fileData bytes];
NSUInteger length = [fileData length];
NSUInteger index;
for (index = 0; index<length; index++) {
char aByte = fileBytes[index];
//Do something with each byte
}
You can also create an NSInputStream from an NSData object, if you need the read interface:
NSData *data = ...;
NSInputStream *readData = [[NSInputStream alloc] initWithData:data];
[readData open];
However, you should be aware that initWithData copies the contents of data.
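For completeness, a minimal usage sketch of the read interface once the stream is open (the 1024-byte buffer size is arbitrary):
uint8_t buffer[1024];
NSInteger bytesRead;
while ((bytesRead = [readData read:buffer maxLength:sizeof(buffer)]) > 0) {
    // process bytesRead bytes from buffer here
}
[readData close];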
One of the simplest ways is to use NSData getBytes:range:.
NSData *data = ...;
char buffer[numberOfBytes];
[data getBytes:buffer range:NSMakeRange(position, numberOfBytes)];
where position is the offset in the NSData you want to read from and numberOfBytes is how many bytes you want to read. No copy of the whole data is needed.
Alex already mentioned NSData getBytes:range: but there is also NSData getBytes:length: which starts from the first byte.
NSData *data = ...;
char buffer[numberOfBytes];
[data getBytes:buffer length:numberOfBytes];
My way of doing that:
Do not forget to free the byte array after use.
NSData *dat = ...; // your code
NSLog(@"Receive from Peripheral: %@", dat);
NSUInteger len = [dat length];
Byte *bytedata = (Byte*)malloc(len);
[dat getBytes:bytedata length:len];
int p = 0;
while(p < len)
{
printf("%02x",bytedata[p]);
if(p!=len-1)
{
printf("-");
}//printf("%c",bytedata[p]);
p++;
}
printf("\n");
// byte array manipulation
free(bytedata);
I am converting my recorded audio, which is in .m4a format, to .caf format. The settings for the recorded audio are given below:
/* Record settings for recording the audio*/
recordSetting = [[NSDictionary alloc] initWithObjectsAndKeys:[NSNumber numberWithInt:kAudioFormatMPEG4AAC],AVFormatIDKey,
[NSNumber numberWithInt:44100.0],AVSampleRateKey,
[NSNumber numberWithInt: 2],AVNumberOfChannelsKey,
[NSNumber numberWithInt:16],AVLinearPCMBitDepthKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey,
[NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
nil];
I convert the audio to .caf using this function:
-(NSString *)handleConvertToPCM:(NSURL *)convertUrl
{
[self performSelectorOnMainThread:@selector(showActivity) withObject:nil waitUntilDone:NO];
DEBUG_LOG(@"DEBUGGING");
DEBUG_LOG(@"handleConvertToPCM");
// open an ExtAudioFile
NSLog (@"opening %@", convertUrl);
ExtAudioFileRef inputFile;
CheckResult (ExtAudioFileOpenURL((CFURLRef)convertUrl, &inputFile),
"ExtAudioFileOpenURL failed");
// prepare to convert to a plain ol' PCM format
AudioStreamBasicDescription requiredPCMFormat;
requiredPCMFormat.mSampleRate = 44100; // todo: or use source rate?
requiredPCMFormat.mFormatID = kAudioFormatLinearPCM ;
requiredPCMFormat.mFormatFlags = kAudioFormatFlagsCanonical;
requiredPCMFormat.mChannelsPerFrame = 2;
requiredPCMFormat.mFramesPerPacket = 1;
requiredPCMFormat.mBitsPerChannel = 16;
requiredPCMFormat.mBytesPerPacket = 4;
requiredPCMFormat.mBytesPerFrame = 4;
CheckResult (ExtAudioFileSetProperty(inputFile, kExtAudioFileProperty_ClientDataFormat,
sizeof (requiredPCMFormat), &requiredPCMFormat),
"ExtAudioFileSetProperty failed");
// allocate a big buffer. size can be arbitrary for ExtAudioFile.
UInt32 outputBufferSize = 0x10000;
void* ioBuf = malloc (outputBufferSize);
UInt32 sizePerPacket = requiredPCMFormat.mBytesPerPacket;
UInt32 packetsPerBuffer = outputBufferSize / sizePerPacket;
// set up output file
self.outputPath = [NSString stringWithFormat:@"%@/export-pcm.caf", DOCUMENTS_FOLDER];
self.outputURL = [NSURL fileURLWithPath:self.outputPath];
DEBUG_LOG(#"creating output file %#", self.outputURL);
AudioFileID outputFile;
CheckResult(AudioFileCreateWithURL((CFURLRef)outputURL,
kAudioFileCAFType,
&requiredPCMFormat,
kAudioFileFlags_EraseFile,
&outputFile),
"AudioFileCreateWithURL failed");
// start convertin'
UInt32 outputFilePacketPosition = 0; //in bytes
while (true)
{
// wrap the destination buffer in an AudioBufferList
AudioBufferList convertedData;
convertedData.mNumberBuffers = 1;
convertedData.mBuffers[0].mNumberChannels = requiredPCMFormat.mChannelsPerFrame;
convertedData.mBuffers[0].mDataByteSize = outputBufferSize;
convertedData.mBuffers[0].mData = ioBuf;
UInt32 frameCount = packetsPerBuffer;
// read from the extaudiofile
CheckResult (ExtAudioFileRead(inputFile,
&frameCount,
&convertedData),
"Couldn't read from input file");
if (frameCount == 0)
{
printf ("done reading from file");
break;
}
// write the converted data to the output file
CheckResult (AudioFileWritePackets(outputFile,
false,
frameCount,
NULL,
outputFilePacketPosition / requiredPCMFormat.mBytesPerPacket,
&frameCount,
convertedData.mBuffers[0].mData),
"Couldn't write packets to file");
DEBUG_LOG(#"Converted %ld bytes", outputFilePacketPosition);
// advance the output file write location
outputFilePacketPosition += (frameCount * requiredPCMFormat.mBytesPerPacket);
}
// clean up
ExtAudioFileDispose(inputFile);
AudioFileClose(outputFile);
return(self.outputPath);
}
My problem is that the size of the converted file is very high compared to the file given for conversion. Is there any way to decrease the size by changing the conversion settings?
I tried compressing the resulting file, but that takes a long time. So I would like a way to decrease the size as part of the conversion itself.
Decompressing a highly compressed audio file almost always results in a much larger file, unless you re-compress it using an even lossier compression format. Your source .m4a is lossy AAC, and the converter is expanding it to uncompressed linear PCM in the CAF, so the growth is expected.
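As a rough sanity check (a back-of-the-envelope sketch assuming the 44.1 kHz / stereo / 16-bit client format set in the code above), the size of the LPCM output depends only on the format and duration, not on how small the compressed source was:
// bytesPerSecond = sampleRate * channels * (bitsPerChannel / 8)
double sampleRate = 44100.0;
UInt32 channels = 2;
UInt32 bytesPerSample = 16 / 8;
double bytesPerSecond = sampleRate * channels * bytesPerSample; // 176,400 bytes per second
NSLog(@"A 3-minute recording is roughly %.1f MB of PCM", (bytesPerSecond * 180.0) / (1024.0 * 1024.0)); // ~30 MB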
Hi, I am writing an iPhone application where I need to store binary data (i.e., an image) in the UltraLite database.
I am using the following code for this purpose.
NSString *filePath = [[NSBundle mainBundle] pathForResource:@"file_name" ofType:@"png"];
NSData *data = [NSData dataWithContentsOfFile:filePath];
NSUInteger len = [data length];
ul_binary *byteData = (ul_binary*)malloc(len);
memcpy(byteData, [data bytes], len);
ULTable *table = connection->OpenTable("NAMES");
if(table->InsertBegin()){
table->SetInt(1, (maxId+1));
table->SetString(2, [name UTF8String]);
table->SetBinary(3, byteData);
table->Insert();
table->Close();
connection->Commit();
}
This code is giving the error 'EXC_BAD_ERROR' on the line:
table->SetBinary(3, byteData);
The code works fine if I comment out this line.
Any help would be appreciated!
Thanks
The definition of ul_binary is this:
typedef struct ul_binary {
/// The number of bytes in the value.
ul_length len;
/// The actual data to be set (for insert) or that was fetched (for select).
ul_byte data[ MAX_UL_BINARY ];
} ul_binary, * p_ul_binary;
So it's a struct. By simply doing the memcpy the way you do, you also overwrite the len field and everything gets messed up. Here's how you should do it (as far as I can tell):
ul_binary *byteData = (ul_binary *)malloc(sizeof(ul_binary));
memcpy(&byteData->data, [data bytes], len);
byteData->len = len;
You also need to check that len <= MAX_UL_BINARY before you try to allocate the memory. And don't forget to free(byteData);.
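Putting that together, a sketch of the corrected insert path might look like the following (the ULTable/connection calls are copied from your snippet as-is; I don't have the UltraLite headers handy, so treat the ul_length cast as an assumption):
NSData *data = [NSData dataWithContentsOfFile:filePath];
NSUInteger len = [data length];
if (len > MAX_UL_BINARY) {
    NSLog(@"Image is too large for a ul_binary (%lu bytes)", (unsigned long)len);
} else {
    // Allocate the struct itself, fill in its length, and copy into its data field.
    ul_binary *byteData = (ul_binary *)malloc(sizeof(ul_binary));
    byteData->len = (ul_length)len;
    memcpy(byteData->data, [data bytes], len);
    ULTable *table = connection->OpenTable("NAMES");
    if (table->InsertBegin()) {
        table->SetInt(1, (maxId + 1));
        table->SetString(2, [name UTF8String]);
        table->SetBinary(3, byteData);
        table->Insert();
        table->Close();
        connection->Commit();
    }
    free(byteData);
}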