How to init NSInputStream with a part of file? - iphone

I was working on YouTube Resumable Uploads: https://developers.google.com/youtube/2.0/developers_guide_protocol_resumable_uploads#Sending_a_Resumable_Upload_API_Request
I use ASIHttpRequest to upload the videos.
For direct uploading, I can use this method:
- (void)appendPostDataFromFile:(NSString *)file
But for resuming an upload, I can't append the post data from the whole video file.
Maybe I could write a method like this:
- (void)appendPostDataFromFile:(NSString *)file offset:(long long)offset
But I don't know how to make it work. Any help will be appreciated!
And here is the code from ASIHttpRequest:
- (void)appendPostDataFromFile:(NSString *)file
{
    [self setupPostBody];
    NSInputStream *stream = [[[NSInputStream alloc] initWithFileAtPath:file] autorelease];
    [stream open];
    NSUInteger bytesRead;
    while (!_isTryCanceled && [stream hasBytesAvailable]) {
        unsigned char buffer[1024*256];
        bytesRead = [stream read:buffer maxLength:sizeof(buffer)];
        if (bytesRead == 0) {
            break;
        }
        if ([self shouldStreamPostDataFromDisk]) {
            [[self postBodyWriteStream] write:buffer maxLength:bytesRead];
        } else {
            [[self postBody] appendData:[NSData dataWithBytes:buffer length:bytesRead]];
        }
    }
    [stream close];
}
As the code above shows, if I could get a seekable NSInputStream, the problem would be solved. Is that possible?
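One way around this, since NSInputStream offers no public seeking API, is to use NSFileHandle instead, which does support seeking. Below is a minimal sketch of the proposed offset variant (appendPostDataFromFile:offset: is a hypothetical method name; it assumes the same _isTryCanceled flag and post-body helpers as the original method):

- (void)appendPostDataFromFile:(NSString *)file offset:(long long)offset
{
    [self setupPostBody];
    NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:file];
    if (handle == nil) {
        return;
    }
    // NSFileHandle, unlike NSInputStream, can jump straight to an offset.
    [handle seekToFileOffset:(unsigned long long)offset];
    while (!_isTryCanceled) {
        NSData *chunk = [handle readDataOfLength:1024*256];
        if ([chunk length] == 0) {
            break; // end of file
        }
        if ([self shouldStreamPostDataFromDisk]) {
            [[self postBodyWriteStream] write:(const uint8_t *)[chunk bytes] maxLength:[chunk length]];
        } else {
            [[self postBody] appendData:chunk];
        }
    }
    [handle closeFile];
}

Alternatively, you could keep the NSInputStream and simply read and discard the first offset bytes before the main loop; that avoids any API change but still touches all of the skipped data.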

Related

Can't receive jpeg image from server

I am trying to receive a JPEG image from my C# server. The weird thing is that when I run it with the debugger and set a breakpoint anywhere in the method, it works perfectly fine. Without a breakpoint, I get this error:
Corrupt JPEG data: premature end of data segment
Here is my code:
- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode {
    NSMutableData *data;
    data = [NSMutableData new];
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable:
        {
            uint8_t buffer[1024];
            int len;
            while ([inputStream hasBytesAvailable]) {
                len = [inputStream read:buffer maxLength:sizeof(buffer)];
                if (len > 0)
                {
                    [data appendBytes:(const void *)buffer length:sizeof(buffer)];
                }
            }
            UIImage *images = [[UIImage alloc] initWithData:data];
            [dvdCover setImage:images];
        } break;
        case NSStreamEventEndEncountered:
        {
            //UIImage *images = [[UIImage alloc] initWithData:data];
            //[dvdCover setImage:images];
        } break;
    }
}
Hi, you can check this code, hope it will help you...
case NSStreamEventHasBytesAvailable:
{
    uint32_t max_size = 1000000; // Max size of the received image; modify it as your requirement.
    NSMutableData *buffer = [[NSMutableData alloc] initWithLength:max_size];
    NSInteger totalBytesRead = 0;
    NSInteger bytesRead = [(NSInputStream *)stream read:[buffer mutableBytes] maxLength:max_size];
    if (bytesRead != 0) {
        while (bytesRead > 0 && totalBytesRead + bytesRead < max_size) {
            totalBytesRead += bytesRead;
            bytesRead = [(NSInputStream *)stream read:[buffer mutableBytes] + totalBytesRead maxLength:max_size - totalBytesRead];
        }
        if (bytesRead >= 0) {
            totalBytesRead += bytesRead;
        }
        else {
            // read failure, report error and bail (not forgetting to release buffer)
        }
        [buffer setLength:totalBytesRead];
        yourImageName.image = [UIImage imageWithData:buffer];
        [buffer release];
    }
} break;
It seems you are assuming the whole JPEG image will be transferred in one chunk, so that you can read it all within a single occurrence of the NSStreamEventHasBytesAvailable event. However, you should also consider the case where the image is transferred to you in multiple chunks.
It might work when you set a breakpoint because execution is halted, giving the network buffer plenty of time to receive all the bytes of the image. Without a breakpoint, it might not have time to do so.
Try refactoring your code to accumulate the incoming chunks, and only treat the image as complete once all bytes have been transferred. (Normally you have to know beforehand how many bytes the image is going to be, or you can simply act on the end-of-stream event.)
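For example, here is a minimal sketch of that refactoring, assuming imageData is an NSMutableData instance variable created when the stream is opened (dvdCover is the image view from the question):

case NSStreamEventHasBytesAvailable:
{
    uint8_t buffer[1024];
    NSInteger len;
    while ([inputStream hasBytesAvailable]) {
        len = [inputStream read:buffer maxLength:sizeof(buffer)];
        if (len > 0) {
            // Append only the bytes actually read (len), not sizeof(buffer),
            // so short reads do not pad the image with garbage.
            [imageData appendBytes:buffer length:len];
        }
    }
} break;
case NSStreamEventEndEncountered:
{
    // Only now is the image complete.
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    [dvdCover setImage:image];
    [image release];
} break;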

CFReadStreamCopyError report error on iPad device

I am running into an issue: the same code runs OK on an iPhone (iOS 5) and in the iPhone/iPad simulator, but it does not work on an iPad (iOS 5). I'd appreciate any help.
Here is the read code:
//Code for the read path.
CFIndex bytesRead = CFReadStreamRead(inputStream, bufferPoint, 1024);
if (bytesRead < 0) {
    NSLog(@"bytesRead < 0");
    CFErrorRef error = CFReadStreamCopyError(inputStream);
    //reportError(error);
    DEBUGLOG(@"readResponse error \n");
}
Before the above, there is the connection code:
//Prevent release before reallocation.
if ((inputStream != nil) && (outputStream != nil)) {
    [inputStream release];
    inputStream = nil;
    [outputStream release];
    outputStream = nil;
}
[NSStream getStreamsToHostNamed:relayHost port:relayPort inputStream:&inputStream outputStream:&outputStream];
//[self lgetStreamsToHostNamed:relayHost port:relayPort inputStream:&inputStream outputStream:&outputStream];
if ((inputStream != nil) && (outputStream != nil))
{
    sendState = kIMAPConnecting;
    isSecure = NO;
    [inputStream retain];
    [outputStream retain];
    [inputStream setDelegate:self];
    [outputStream setDelegate:self];
    result = [inputStream setProperty:NSStreamSocketSecurityLevelNegotiatedSSL forKey:NSStreamSocketSecurityLevelKey];
    DEBUGLOG(@"inputStream setProperty result: %d", result);
    result = [outputStream setProperty:NSStreamSocketSecurityLevelNegotiatedSSL forKey:NSStreamSocketSecurityLevelKey];
    DEBUGLOG(@"outputStream setProperty result: %d", result);
    if (!CFReadStreamOpen(inputStream)) {
        DEBUGLOG(@"inputStream open failed");
        return NO;
    }
    if (!CFWriteStreamOpen(outputStream)) {
        DEBUGLOG(@"outputStream open failed");
        return NO;
    }
    self.inputString = [NSMutableString string];
    DEBUGLOG(@"SCRIMAPMessage startToConnect end with YES\n");
    return YES;
}
The following is not available on iOS:
[NSStream getStreamsToHostNamed:relayHost port:relayPort inputStream:&inputStream outputStream:&outputStream];
I really do not know how it can work on the iPhone at all.
Your options are fairly simple...
A) Create a category on NSStream as described in this technical note from Apple: here
B) Use CFStreamCreatePairWithSocketToHost() and simply bridge CFReadStreamRef/CFWriteStreamRef
I recommend (B), as it will give you the most flexibility. More specifically, you can create your own StreamObject class to handle this and the stream delegate all in one.
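Here is a minimal sketch of option (B), reusing relayHost and relayPort from the question; under manual reference counting the CF stream types can be cast directly to their NS counterparts because they are toll-free bridged:

CFReadStreamRef readStream = NULL;
CFWriteStreamRef writeStream = NULL;
CFStreamCreatePairWithSocketToHost(kCFAllocatorDefault,
                                   (CFStringRef)relayHost,
                                   (UInt32)relayPort,
                                   &readStream,
                                   &writeStream);
// Toll-free bridging: a CFReadStreamRef is an NSInputStream and vice versa.
NSInputStream *inputStream = (NSInputStream *)readStream;
NSOutputStream *outputStream = (NSOutputStream *)writeStream;
[inputStream setDelegate:self];
[outputStream setDelegate:self];
[inputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
[outputStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
[inputStream open];
[outputStream open];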
Happy coding!

iPhone OS 4.0: NSFileHandleDataAvailableNotification not providing callback at file-end

I'm a bit new to iPhone development, so be gentle! I'm supporting an app which loads a WAV file from a URL file stream and plays it back through an AudioQueue.
We run a continual loop in another thread and stop the queue if we detect that it has no buffers in use and the input file stream has reached its end. In turn, we detect that the file stream has ended within the waitForDataInBackgroundAndNotify callback for the stream's NSFileHandleDataAvailableNotification, by checking whether availableData has length 0.
This works under iOS 3.0: we get a notification of 0 available data at the end of the file. But on iOS 4.0 we don't seem to receive the callback at file end. This happens on an iOS 4.0 device, regardless of the target OS version.
Has the API changed between the two versions? How can I detect the end of the file now?
Hopefully-relevant code:
data-available callback:
- (void)readFileData:(NSNotification *)notification
{
    @try
    {
        NSData *data = [[notification object] availableData];
        if ([data length] == 0 && self.audioQueueState != AQS_END)
        {
            /***********************************************************************/
            /* We've hit the end of the data but it's possible that more may be    */
            /* appended to the file (if we're still downloading it) so we need to  */
            /* wait for the availability of more data.                             */
            /***********************************************************************/
            [self setFileStreamerState:FSS_END];
            [[notification object] waitForDataInBackgroundAndNotify];
        }
        else if (self.audioQueueState == AQS_END)
        {
            TRC_DBG(@"ignore read data as ending");
        }
        else
        {
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            TRC_DBG(@"Read %d bytes", [data length]);
            [self setFileStreamerState:FSS_DATA];
            if (discontinuous)
            {
                TRC_DBG(@"AudioFileStreamParseBytes %d bytes, discontinuous", [data length]);
                err = AudioFileStreamParseBytes(audioFileStream, [data length], [data bytes], kAudioFileStreamParseFlag_Discontinuity);
                discontinuous = NO;
            }
            else
            {
                TRC_DBG(@"AudioFileStreamParseBytes %d bytes, continuous", [data length]);
                err = AudioFileStreamParseBytes(audioFileStream, [data length], [data bytes], 0);
            }
            /***********************************************************************/
            /* If error then get out, otherwise wait again for more data.          */
            /***********************************************************************/
            if (err != 0)
            {
                [self failWithErrorCode:AS_FILE_STREAM_PARSE_BYTES_FAILED];
            }
            else
            {
                [[notification object] waitForDataInBackgroundAndNotify];
            }
            [pool release];
        }
    }
    @catch (NSException *exception)
    {
        TRC_ERR(@"Exception: %@", exception);
        TRC_ERR(@"Exception reason: %@", [exception reason]);
        //[self failWithErrorCode:AS_FILE_AVAILABLE_DATA_FAILED];
    }
}
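One possible workaround, sketched here under the assumption that the total file size is known in advance (for example, captured from the download's Content-Length header; totalBytesRead and expectedLength are hypothetical instance variables, not part of the code above), is to count the bytes delivered and treat the stream as ended once the count reaches the expected size, rather than relying only on a zero-length availableData callback:

// Inside readFileData:, before the state checks:
NSData *data = [[notification object] availableData];
totalBytesRead += [data length];
if ([data length] == 0 || totalBytesRead >= expectedLength)
{
    // All expected bytes have arrived; treat this as end-of-file even if
    // the zero-length callback is never delivered on iOS 4.0.
    [self setFileStreamerState:FSS_END];
}
else
{
    // Parse the chunk as before, then wait for more data.
    [[notification object] waitForDataInBackgroundAndNotify];
}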

How to Create UIImage from NSData and Avatar Data of XMPP?

This question relates to the iPhone SDK, NSData and UIImage.
I am trying to create an image from the avatar data returned by XMPP, like the following:
<presence from='yyy@184.73.164.51/spark' to='ken@184.73.164.51/424978324712783686768453' id='Oj02v-45'><status>Away due to idle.</status><priority>0</priority><show>away</show><x xmlns='vcard-temp:x:update'><photo>a3f549fa9705e7ead2905de0b6a804227ecdd404</photo></x><x xmlns='jabber:x:avatar'><hash>a3f549fa9705e7ead2905de0b6a804227ecdd404</hash></x></presence>
So in this case, I assume that a3f549fa9705e7ead2905de0b6a804227ecdd404 is the photo data.
How can I turn this into NSData? I think if I can get the NSData object, I can easily create the UIImage, right?
I think "a3f549fa9705e7ead2905de0b6a804227ecdd404" is the photo data.
Here is my code:
NSString *command = @"a3f549fa9705e7ead2905de0b6a804227ecdd404";
command = [command stringByReplacingOccurrencesOfString:@" " withString:@""];
NSMutableData *commandToSend = [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[3] = {'\0', '\0', '\0'};
int i;
for (i = 0; i < [command length] / 2; i++) {
    byte_chars[0] = [command characterAtIndex:i*2];
    byte_chars[1] = [command characterAtIndex:i*2+1];
    whole_byte = strtol(byte_chars, NULL, 16);
    [commandToSend appendBytes:&whole_byte length:1];
}
UIImage *image = [UIImage imageWithData:commandToSend];
However, it doesn't work. Does anyone know what's wrong with it?
In XMPPPresence.m, add this method:
- (NSString *)photo {
    NSXMLElement *xElement = [self elementForName:@"x" xmlns:@"vcard-temp:x:update"];
    NSString *photoHash = [[xElement elementForName:@"photo"] stringValue];
    return photoHash;
}

// In XMPPStream's delegate:
- (void)xmppStream:(XMPPStream *)stream didReceivePresence:(XMPPPresence *)presence {
    NSString *photoHash = [presence photo];
    if ([photoHash length] > 0) { // in case there's no photo hash
        XMPPJID *rosterJID = [presence from];
        BOOL requestPhoto = ... // determine if you need to request a new photo or not
        if (requestPhoto) {
            NSXMLElement *iqAvatar = [NSXMLElement elementWithName:@"iq"];
            NSXMLElement *queryAvatar = [NSXMLElement elementWithName:@"vCard" xmlns:@"vcard-temp"];
            [iqAvatar addAttributeWithName:@"type" stringValue:@"get"];
            [iqAvatar addAttributeWithName:@"to" stringValue:[rosterJID full]];
            [iqAvatar addChild:queryAvatar];
            XMPPIQ *avatarRequestIQ = [XMPPIQ iqFromElement:iqAvatar];
            [stream sendElement:avatarRequestIQ];
        }
    }
}

// And when the buddy sends the photo, it will be in the vCard, BASE64-encoded.
// You will receive it as an IQ:
- (BOOL)xmppStream:(XMPPStream *)stream didReceiveIQ:(XMPPIQ *)iq {
    XMPPElement *vCardPhotoElement = (XMPPElement *)[[iq elementForName:@"vCard"] elementForName:@"PHOTO"];
    if (vCardPhotoElement != nil) {
        // Avatar data.
        NSString *base64DataString = [[vCardPhotoElement elementForName:@"BINVAL"] stringValue];
        NSData *imageData = [NSData dataFromBase64String:base64DataString]; // you need the NSData BASE64 category
        UIImage *avatarImage = [UIImage imageWithData:imageData];
        XMPPJID *senderJID = [iq from];
        // This is my custom delegate method where I save the new avatar to the cache.
        [self xmppStream:stream didReceiveImage:avatarImage forBuddy:senderJID];
    }
    return NO;
}
Hope this will help you.
That is the picture hash. You now have to send a vCard request, which will contain the same hash for verification, and a BINVAL element containing the picture data in Base64.

problem with NSInputStream on real iPhone

I have a problem with NSInputStream. Here is my code:
case NSStreamEventHasBytesAvailable:
    printf("BYTE AVAILABLE\n");
    int len = 0;
    NSMutableData *data = [[NSMutableData alloc] init];
    uint8_t buffer[32768];
    if (stream == iStream)
    {
        printf("Receiving...\n");
        len = [iStream read:buffer maxLength:32768];
        [data appendBytes:buffer length:len];
    }
    [iStream close];
Reading small data works perfectly on the simulator and on a real iPhone. But if I try to read large data (more than 4 kB or maybe 5 kB), the real iPhone reads only 2736 bytes and stops.
Why is that? Please help!
Thanks in advance!
Your data object needs to live outside your stream handler. It is often the case that when large amounts of data are coming in, you get them in chunks and not all at once. Just keep appending data until a read returns bytesRead == 0; then you can close your stream and use the data.
case NSStreamEventHasBytesAvailable: {
    NSInteger bytesRead;
    uint8_t buffer[32768];
    // Pull some data off the network.
    bytesRead = [self._networkStream read:buffer maxLength:sizeof(buffer)];
    if (bytesRead == -1) {
        [self _stopReceiveWithFailure];
    } else if (bytesRead == 0) {
        [self _stopReceiveWithSuccess];
    } else {
        [data appendBytes:buffer length:bytesRead];
    }
} break;
Looks like you're creating a new data object every time; perhaps you should instead create and retain it as a property, and append to it as shown above.
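For instance, a small sketch of that change (receivedData is a hypothetical property name):

// In the class interface:
@property (nonatomic, retain) NSMutableData *receivedData;

// When the stream is opened, create the long-lived buffer once:
self.receivedData = [NSMutableData data];

// In the NSStreamEventHasBytesAvailable case, append each chunk:
[self.receivedData appendBytes:buffer length:bytesRead];

// Only close the stream and consume self.receivedData once a read
// returns 0, or when NSStreamEventEndEncountered arrives.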