How to change the compression style of a TIFF image using sips? - command-line

I am a Mac guy.
How do I get PackBits compression using sips?
The command below, for LZW, works fine:
sips -s formatOptions lzw /DefaultGroup.tif
But this fails:
sips -s formatOptions packbits /DefaultGroup.tif
Any idea why?

Instead of
sips -s formatOptions packbits /DefaultGroup.tif
try
sips -s formatOptions pacbits /DefaultGroup.tif
In other words, use 'pacbits' instead of 'packbits'.
Kind of surprising, but there it is.

The code below works, but I still couldn't find the actual answer to my query.
// Target compression: PackBits
int compression = NSTIFFCompressionPackBits;

// Re-read the image from the data we're about to write
CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)dataToWrite, NULL);
CGImageRef imageRef = CGImageSourceCreateImageAtIndex(source, 0, NULL);

// Destination options: a TIFF dictionary carrying the compression setting
CFMutableDictionaryRef saveMetaAndOpts = CFDictionaryCreateMutable(NULL, 0, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
CFMutableDictionaryRef tiffProfsMut = CFDictionaryCreateMutable(NULL, 0, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
CFNumberRef compressionNum = CFNumberCreate(NULL, kCFNumberIntType, &compression);
CFDictionarySetValue(tiffProfsMut, kCGImagePropertyTIFFCompression, compressionNum);
CFDictionarySetValue(saveMetaAndOpts, kCGImagePropertyTIFFDictionary, tiffProfsMut);

// Write a single-image TIFF with those options
NSURL *outURL = [[NSURL alloc] initFileURLWithPath:filename];
CGImageDestinationRef dr = CGImageDestinationCreateWithURL((CFURLRef)outURL, (CFStringRef)@"public.tiff", 1, NULL);
CGImageDestinationAddImage(dr, imageRef, saveMetaAndOpts);
CGImageDestinationFinalize(dr);

// Clean up
CFRelease(dr);
[outURL release];
CFRelease(compressionNum);
CFRelease(tiffProfsMut);
CFRelease(saveMetaAndOpts);
CFRelease(imageRef);
CFRelease(source);

loopqoob is right.
The man page for sips is wrong. You must use "pacbits" NOT "packbits" when using sips.
In other words, this will work. It does on my Mac running High Sierra, e.g.:
sips -s format tiff -s formatOptions pacbits '/Users/rob/Downloads/image20.jpg' --out '/Users/rob/Downloads/image20.tiff'
After I do that, I can open the TIFF image in Preview, click "Tools" and then "Show Inspector" in the drop-down menu, and see that the TIFF image is compressed with PackBits. (Preview shows the correct name of the compression used.)

Related

Using AVAssetWriter with raw NAL Units

I noticed in the iOS documentation for AVAssetWriterInput you can pass nil for the outputSettings dictionary to specify that the input data should not be re-encoded.
The settings used for encoding the media appended to the output. Pass nil to specify that appended samples should not be re-encoded.
I want to take advantage of this feature to pass in a stream of raw H.264 NALs, but I am having trouble adapting my raw byte streams into a CMSampleBuffer that I can pass into AVAssetWriterInput's appendSampleBuffer method. My stream of NALs contains only SPS/PPS/IDR/P NALs (1, 5, 7, 8). I haven't been able to find documentation or a conclusive answer on how to use pre-encoded H264 data with AVAssetWriter. The resulting video file is not able to be played.
How can I properly package the NAL units into CMSampleBuffers? Do I need to use a start code prefix? A length prefix? Do I need to ensure I only put one NAL per CMSampleBuffer? My end goal is to create an MP4 or MOV container with H264/AAC.
Here's the code I've been playing with:
-(void)addH264NAL:(NSData *)nal
{
    dispatch_async(recordingQueue, ^{
        // Adapting the raw NAL into a CMSampleBuffer
        CMSampleBufferRef sampleBuffer = NULL;
        CMBlockBufferRef blockBuffer = NULL;
        CMFormatDescriptionRef formatDescription = NULL;
        CMItemCount numberOfSampleTimeEntries = 1;
        CMItemCount numberOfSamples = 1;
        CMVideoFormatDescriptionCreate(kCFAllocatorDefault, kCMVideoCodecType_H264, 480, 360, NULL, &formatDescription);
        OSStatus result = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, NULL, [nal length], kCFAllocatorDefault, NULL, 0, [nal length], kCMBlockBufferAssureMemoryNowFlag, &blockBuffer);
        if (result != noErr)
        {
            NSLog(@"Error creating CMBlockBuffer");
            return;
        }
        result = CMBlockBufferReplaceDataBytes([nal bytes], blockBuffer, 0, [nal length]);
        if (result != noErr)
        {
            NSLog(@"Error filling CMBlockBuffer");
            return;
        }
        const size_t sampleSizes = [nal length];
        CMSampleTimingInfo timing = { 0 };
        result = CMSampleBufferCreate(kCFAllocatorDefault, blockBuffer, YES, NULL, NULL, formatDescription, numberOfSamples, numberOfSampleTimeEntries, &timing, 1, &sampleSizes, &sampleBuffer);
        if (result != noErr)
        {
            NSLog(@"Error creating CMSampleBuffer");
        }
        [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeVideo];
    });
}
Note that I'm calling CMSampleBufferSetOutputPresentationTimeStamp on the sample buffer inside of the writeSampleBuffer method with what I think is a valid time before I'm actually trying to append it.
Any help is appreciated.
I managed to get video playback working in VLC but not QuickTime. I used code similar to what I posted above to get H.264 NALs into CMSampleBuffers.
I had two main issues:
I was not setting CMSampleTimingInfo correctly (as my comment above states).
I was not packing the raw NAL data correctly (not sure where this is documented, if anywhere).
To solve #1, I set timing.duration = CMTimeMake(1, fps); where fps is the expected frame rate. I then set timing.decodeTimeStamp = kCMTimeInvalid; to mean that the samples will be given in decoding order. Lastly, I set timing.presentationTimeStamp by calculating the absolute time, which I also used with startSessionAtSourceTime.
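In code, that timing setup looks roughly like this (fps and frameIndex are illustrative placeholders for values from your capture pipeline, not names from my actual project):
// Sketch of the timing fix described above.
CMSampleTimingInfo timing = { 0 };
timing.duration = CMTimeMake(1, fps);                       // one frame at the expected frame rate
timing.decodeTimeStamp = kCMTimeInvalid;                    // samples are supplied in decode order
timing.presentationTimeStamp = CMTimeMake(frameIndex, fps); // absolute presentation time, also used with startSessionAtSourceTime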
To solve #2, through trial and error I found that giving my NAL units in the following form worked:
[7 8 5] [1] [1] [1]..... [7 8 5] [1] [1] [1]..... (repeating)
Where each NAL unit is prefixed by a 32-bit start code equaling 0x00000001.
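Concretely, a minimal sketch of that prefixing (nalPayload stands in for one raw NAL unit without any prefix; the name is just for illustration):
// Prefix one raw NAL unit with the 32-bit Annex-B start code 0x00000001.
static const uint8_t startCode[4] = { 0x00, 0x00, 0x00, 0x01 };
NSMutableData *prefixedNAL = [NSMutableData dataWithBytes:startCode length:4];
[prefixedNAL appendData:nalPayload];
// prefixedNAL is what gets copied into the CMBlockBuffer.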
Presumably for the same reason it's not playing in QuickTime, I'm still having trouble moving the resulting .mov file to the photo album (the ALAssetLibrary method videoAtPathIsCompatibleWithSavedPhotosAlbum fails, stating that the "Movie could not be played"). Hopefully someone with an idea about what's going on can comment. Thanks!

How to edit the default instrument of an AUGraph?

I'm working with the MusicPlayer API. I understand that when you load in a .mid as a sequence, the API creates a default AUGraph for you that includes an AUSampler. This AUSampler uses a simple sine-wave based instrument to synthesize the notes in the .mid
My question is, how does one change the default instrument in the AUSampler? I understand that you can use SoundFont2 files (.sf2) and add them using the AudioUnitSetProperty method. But, how does one access this default AUGraph? Do you have to open the graph before you can edit the AudioUnit or is opening a graph only for editing connections between nodes?
Thanks :)
I've written a tutorial on this, but here's an outline of the process:
Function to load a Sound Font file (taken from the Apple documentation):
-(OSStatus) loadFromDLSOrSoundFont: (NSURL *)bankURL withPatch: (int)presetNumber {
    OSStatus result = noErr;
    // fill out a bank preset data structure
    AUSamplerBankPresetData bpdata;
    bpdata.bankURL = (__bridge CFURLRef) bankURL;
    bpdata.bankMSB = kAUSampler_DefaultMelodicBankMSB;
    bpdata.bankLSB = kAUSampler_DefaultBankLSB;
    bpdata.presetID = (UInt8) presetNumber;
    // set the kAUSamplerProperty_LoadPresetFromBank property
    result = AudioUnitSetProperty([pointer to your AUSampler node here],
                                  kAUSamplerProperty_LoadPresetFromBank,
                                  kAudioUnitScope_Global,
                                  0,
                                  &bpdata,
                                  sizeof(bpdata));
    // check for errors
    NSCAssert (result == noErr,
               @"Unable to set the preset property on the Sampler. Error code:%d '%.4s'",
               (int) result,
               (const char *)&result);
    return result;
}
Then you need to load the Sound Font from your Resources folder:
NSURL *presetURL = [[NSURL alloc] initFileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Name of sound font" ofType:@"sf2"]];
// Initialise the sound font
[self loadFromDLSOrSoundFont:presetURL withPatch:10];
Hope this helps!
You might take a look at the Audiograph example. It doesn't use SoundFonts but should give you an idea of how to set up a graph.
When I use the MusicPlayer I always generate the MIDI note data from code/GUI and create the AUGraph (with a mixer) from scratch. There are ways to derive/extract the default generated AUGraph and AUSampler resulting from loading a MIDI file (example code below), but I never had success setting a new SoundFont this way. On the other hand, creating the AUGraph from scratch and then loading an .sf2 file works great (see the sketch after the extraction code).
AUGraph graph;
OSStatus result = MusicSequenceGetAUGraph(sequence, &graph);

MusicTrack firstTrack;
result = MusicSequenceGetIndTrack(sequence, 0, &firstTrack);

// The MIDI track's destination node is the default AUSampler
AUNode myNode;
result = MusicTrackGetDestNode(firstTrack, &myNode);

AudioUnit mySamplerUnit;
result = AUGraphNodeInfo(graph, myNode, 0, &mySamplerUnit);
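And for comparison, a minimal sketch of the from-scratch approach (error checking omitted; this is standard AUGraph boilerplate rather than code lifted from my project):
// Build a sampler -> output graph from scratch, then load the .sf2 into it.
AUGraph graph;
NewAUGraph(&graph);

AudioComponentDescription samplerDesc = {0};
samplerDesc.componentType = kAudioUnitType_MusicDevice;
samplerDesc.componentSubType = kAudioUnitSubType_Sampler;
samplerDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
AUNode samplerNode;
AUGraphAddNode(graph, &samplerDesc, &samplerNode);

AudioComponentDescription outputDesc = {0};
outputDesc.componentType = kAudioUnitType_Output;
outputDesc.componentSubType = kAudioUnitSubType_RemoteIO; // kAudioUnitSubType_DefaultOutput on the Mac
outputDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
AUNode outputNode;
AUGraphAddNode(graph, &outputDesc, &outputNode);

AUGraphOpen(graph);
AUGraphConnectNodeInput(graph, samplerNode, 0, outputNode, 0);

AudioUnit samplerUnit;
AUGraphNodeInfo(graph, samplerNode, NULL, &samplerUnit);
AUGraphInitialize(graph);
AUGraphStart(graph);
// samplerUnit can now be passed to AudioUnitSetProperty in loadFromDLSOrSoundFont: above.
If you're driving it from a MusicSequence, MusicSequenceSetAUGraph(sequence, graph) ties the sequence to your graph instead of the default one.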

Converting Image to Byte Array

I am looking to convert a signature that is captured from the user. I have the following:
NSData *imageData = UIImagePNGRepresentation(drawImage.image);
NSUInteger len = [imageData length];
byteData = (Byte*)malloc(len);
memcpy(byteData, [imageData bytes], len);
I saw this in a similar question. My problem is that I can't use byteData anywhere; it throws a bad-access error. Am I converting it properly to a byte array? If I output imageData to the console I get:
<89504e47 0d0a1a0a 0000000d 49484452 00000140 0000016f 08060000 003b6a12 49000020 00494441 547801ed 9d07b464 4599c71d 494a5219 5832c210 84251d19 51118519 755d0c18 402489a0 b206c015 44443d0a a8e82a8a 804a7005 040cc0c2 022eac89 a30c02a3 4bf02c41 24391266 605601c9 9267ffff 377d39df f474bfd7 affb7657 75d7afce f9dead9b eaabfa7d f7febb6e ddf0a62c 58b0e079 24084000 02251278 7e898da6 cd108000 044c0001 e4388000 048a2580 00161b7a 1a0e0108 20801c03 108040b1 0410c062 434fc321 00010490 63000210 28960002 586ce869 38042080 00720c40 0002c512 40008b0d 3d0d8700 0410408e 010840a0 58020860 b1a1a7e1 10800002 c8310001 08144b00 012c36f4 341c0210 40003906 20008162 092080c5 869e8643 00020820 c7000420 502c0104 b0d8d0d3 70084000 01e41880 00048a25 8000161b 7a1a0e01 0820801c 03108040 b10410c0 62434fc3 21000104 90630002 10289600 02586ce8 69380420 8000720c 400002c5 1240008b 0d3d0d87 00041040 8e010840 a0580208 60b1a1a7 e1108000 02c83100 0108144b 00012c36 f4341c02 10400039 06200081 62092080 c5869e86 43000208 20c70004 20502c01 04b0d8d etc..
To convert the data to a string, use:
[NSString stringWithCString: encoding:];
[NSString stringWithUTF8String:];
If you want to send it via HTTP, use the second one. And make sure it is zero-terminated:
byteData = (Byte*)calloc(len+1, sizeof(Byte));
Solved it. The problem was the encoding format; I needed to do base64 encoding. The following site is where I found the base64 encoder: http://www.cocoadev.com/index.pl?BaseSixtyFour
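For what it's worth, on iOS 7 and later NSData can do the base64 step directly, so no third-party encoder is needed. A minimal sketch:
// Base64 round trip with the built-in NSData API (iOS 7+).
NSData *imageData = UIImagePNGRepresentation(drawImage.image);
NSString *base64String = [imageData base64EncodedStringWithOptions:0];
// ...and back:
NSData *roundTripped = [[NSData alloc] initWithBase64EncodedString:base64String options:0];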

Compress/Decompress NSString in objective-c (iphone) using GZIP or deflate

I have a web-service running on Windows Azure which returns JSON that I consume in my iPhone app.
Unfortunately, Windows Azure doesn't seem to support the compression of dynamic responses yet (long story) so I decided to get around it by returning an uncompressed JSON package, which contains a compressed (using GZIP) string.
e.g.
{"Error":null,"IsCompressed":true,"Success":true,"Value":"vWsAAB+LCAAAAAAAB..etc.."}
... where value is the compressed string of a complex object represented in JSON.
This was really easy to implement on the server, but for the life of me I can't figure out how to decompress a gzipped NSString into an uncompressed NSString; all the examples I can find for zlib etc. deal with files.
Can anyone give me any clues on how to do this? (I'd also be happy for a solution that used deflate as I could change the server-side implementation to use deflate too).
Thanks!!
Steven
Edit 1: Aaah, I see that ASIHTTPRequest uses the following function in its source code:
//uncompress gzipped data with zlib
+ (NSData *)uncompressZippedData:(NSData*)compressedData;
... and I'm aware that I can convert NSString to NSData, so I'll see if this leads me anywhere!
Edit 2: Unfortunately, the method described in Edit 1 didn't lead me anywhere.
Edit 3: Following the advice below regarding base64 encoding/decoding, I came up with the following code. The encodedGzippedString is, as you can guess, the string "Hello, my name is Steven Elliott", gzipped and then converted to a base64 string. Unfortunately, the result that prints via NSLog is just blank.
NSString *encodedGzippedString = @"GgAAAB+LCAAAAAAABADtvQdgHEmWJSYvbcp7f0r1StfgdKEIgGATJNiQQBDswYjN5pLsHWlHIymrKoHKZVZlXWYWQMztnbz33nvvvffee++997o7nU4n99//P1xmZAFs9s5K2smeIYCqyB8/fnwfPyK+uE6X2SJPiyZ93eaX+TI9Lcuiatvx/wOwYc0HGgAAAA==";
NSData *decodedGzippedData = [NSData dataFromBase64String:encodedGzippedString];
NSData *unGzippedJsonData = [ASIHTTPRequest uncompressZippedData:decodedGzippedData];
NSString *unGzippedJsonString = [[NSString alloc] initWithData:unGzippedJsonData encoding:NSASCIIStringEncoding];
NSLog(@"Result: %@", unGzippedJsonString);
After all this time, I finally found a solution to this problem!
None of the answers above helped me, as promising as they all looked. In the end, I was able to compress the string on the server with gzip using the Chilkat framework for .NET, and then decompress it on the iPhone using the Chilkat framework for iOS (not yet released, but available if you email the developer directly).
The chilkat framework made this super easy to do so big thumbs up to the developer!
Your "compressed" string is not raw GZIP'd data, it's in some encoding that allows those bytes to be stored in a string-- looks like base-64 or something like it. To get an NSData out of this, you'll need to decode it into the NSData.
If it's really base-64, check out this blog post an accompanying code:
http://cocoawithlove.com/2009/06/base64-encoding-options-on-mac-and.html
which will do what you want.
Once you have an NSData object, the ASIHTTPRequest method will probably do as you like.
This worked for me:
It goes from a gzipped, then base64-encoded string to an un-gzipped string (all UTF-8).
#import "base64.h"
#import "NSData+Compression.h"
...
+(NSString *)gunzipBase64StrToStr:(NSString *)stringValue {
//now we decode from Base64
Byte inputData[[stringValue lengthOfBytesUsingEncoding:NSUTF8StringEncoding]];//prepare a Byte[]
[[stringValue dataUsingEncoding:NSUTF8StringEncoding] getBytes:inputData];//get the pointer of the data
size_t inputDataSize = (size_t)[stringValue length];
size_t outputDataSize = EstimateBas64DecodedDataSize(inputDataSize);//calculate the decoded data size
Byte outputData[outputDataSize];//prepare a Byte[] for the decoded data
Base64DecodeData(inputData, inputDataSize, outputData, &outputDataSize);//decode the data
NSData *theData = [[NSData alloc] initWithBytes:outputData length:outputDataSize];//create a NSData object from the decoded data
//NSLog(#"DATA: %# \n",[theData description]);
//And now we gunzip:
theData=[theData gzipInflate];//make bigger==gunzip
return [[NSString alloc] initWithData:theData encoding:NSUTF8StringEncoding];
}
#end
I needed to compress data on the iPhone using Objective-C and decompress it in PHP. Here is what I used in Xcode 11.5 and iOS 12.4:
iOS Objective-C Compression/Decompression Test
Include libcompression.tbd under Build Phases -> Link Binary With Libraries, then include the header:
#include "compression.h"
NSLog(#"START META DATA COMPRESSION");
NSString *testString = #"THIS IS A COMPRESSION TESTTHIS IS A COMPRESSION TESTTHIS IS A COMPRESSION TESTTHIS IS A COMPRESSION TESTTHIS IS A COMPRESSION TESTTHIS IS A COMPRESSION TEST";
NSData *theData = [testString dataUsingEncoding:NSUTF8StringEncoding];
size_t src_size = theData.length;
uint8_t *src_buffer = (uint8_t*)[theData bytes];
size_t dst_size = src_size+4096;
uint8_t *dst_buffer = (uint8_t*)malloc(dst_size);
dst_size = compression_encode_buffer(dst_buffer, dst_size, src_buffer, src_size, NULL, COMPRESSION_ZLIB);
NSLog(#"originalsize:%zu compressed:%zu", src_size, dst_size);
NSData *dataData = [NSData dataWithBytes:dst_buffer length:sizeof(dst_buffer)];
NSString *compressedDataBase64String = [dataData base64EncodedStringWithOptions:0];
NSLog(#"Compressed Data %#", compressedDataBase64String);
NSLog(#"START META DATA DECOMPRESSION");
src_size = compression_decode_buffer(src_buffer, src_size, dst_buffer, dst_size, NULL, COMPRESSION_ZLIB);
NSData *decompressed = [[NSData alloc] initWithBytes:src_buffer length:src_size];
NSString *decTestString;
decTestString = [[NSString alloc] initWithData:decompressed encoding:NSASCIIStringEncoding];
NSLog(#"DECOMPRESSED DATA %#", decTestString);
free(dst_buffer);
On the PHP side I used the following function to decompress the data (after base64-decoding it first):
function decompressString($compressed_string) {
    // need raw gzinflate for compatibility with iOS COMPRESSION_ZLIB (IETF RFC 1951)
    $full_string = gzinflate($compressed_string);
    return $full_string;
}

Can anyone provide a working example of AudioFileStreamSeek for the iPhone?

I find Apple's documentation quite limited on AudioFileStreamSeek and I cannot find any examples of actual usage anywhere. I have a working streaming audio player, but I just can't seem to get AudioFileStreamSeek to work as advertised...
Any help, tips, or a little example would be greatly appreciated!
I am told this works:
AudioQueueStop(audioQueue, true);
UInt32 flags = 0;
err = AudioFileStreamParseBytes(audioFileStream, length, bytes, kAudioFileStreamParseFlag_Discontinuity);
OSStatus status = AudioFileStreamSeek(audioFileStream, framePacket.mPacket, &currentOffset, &flags);
NSLog(@"Setting next byte offset to: %qi, flags: %d", (long long)currentOffset, flags);

// then read data from the new offset set by AudioFileStreamSeek
[fileHandle seekToFileOffset:currentOffset];
NSData *data = [fileHandle readDataOfLength:4096];

flags = kAudioFileStreamParseFlag_Discontinuity;
status = AudioFileStreamParseBytes(audioFileStream, [data length], [data bytes], flags);
if (status != noErr)
{
    NSLog(@"Error parsing bytes: %d", (int)status);
}
Unless I'm mistaken, this is only available in the 3.0 SDK, and therefore under NDA. Maybe you should take this to the Apple Beta forums?
I stand corrected. AudioFileStreamSeek doesn't show up if you do a search in the online 2.2.1 documentation. You have to manually dig into the docs to find it.
Don't forget to add the data offset (kAudioFileStreamProperty_DataOffset) to the byte offset returned by AudioFileStreamSeek. The return value is an offset into the audio data and ignores the data offset.
It's also a good idea to stop and then re-start the AudioQueue before/after seeking.
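A minimal sketch of that offset correction (variable names are illustrative):
// Fetch the audio data offset once, then add it to the byte offset
// returned by AudioFileStreamSeek before seeking the underlying stream.
SInt64 dataOffset = 0;
UInt32 propSize = sizeof(dataOffset);
AudioFileStreamGetProperty(audioFileStream, kAudioFileStreamProperty_DataOffset, &propSize, &dataOffset);

SInt64 packetAlignedByteOffset = 0;
AudioFileStreamSeekFlags ioFlags = 0;
OSStatus err = AudioFileStreamSeek(audioFileStream, seekPacket, &packetAlignedByteOffset, &ioFlags);
SInt64 absoluteByteOffset = dataOffset + packetAlignedByteOffset;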
Matt Gallagher uses AudioFileStreamSeek in his example "Streaming and playing an MP3 stream".
Look at Matt's code AudioStreamer.m:
SInt64 seekPacket = floor(newSeekTime / packetDuration);
err = AudioFileStreamSeek(audioFileStream, seekPacket, &packetAlignedByteOffset, &ioFlags);
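For context, packetDuration there is derived from the stream's data format, roughly like this (asbd is assumed to be the AudioStreamBasicDescription read via kAudioFileStreamProperty_DataFormat):
// Seconds per packet, from the stream's AudioStreamBasicDescription.
Float64 packetDuration = asbd.mFramesPerPacket / asbd.mSampleRate;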