NSData over GameKit and a strange EXC_BAD_ACCESS problem - iPhone

I'm trying to stream video data over a peer-to-peer connection created with GameKit. I have a method that receives an NSData object and uses it to draw a video stream onto a CALayer:
- (void)recieveVideoFromData:(NSData *)data;
Here are the first few lines of that method, which convert the NSData to a CMSampleBufferRef and begin processing:
CMSampleBufferRef imgData = (CMSampleBufferRef)data.bytes;
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imgData);
CVPixelBufferLockBaseAddress(imageBuffer,0);
Now, when I feed the video stream from the local camera into this method as follows, everything works just fine and the video stream displays on screen:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
NSData *data = [[NSData alloc] initWithBytes:sampleBuffer length:malloc_size(sampleBuffer)];
[self recieveVideoFromData:data];
}
But, when I send a stream of those NSData packets over a peer-to-peer connection and receive them in the following fashion, I get an EXC_BAD_ACCESS error:
- (void)match:(GKMatch *)match didReceiveData:(NSData *)data fromPlayer:(NSString *)playerID {
[self recieveVideoFromData:data];
}
Using the debugger, I learned that the bad access occurs on this line:
CVPixelBufferLockBaseAddress(imageBuffer,0);
I have no idea why the NSData sent over the network should be any different from the NSData passed in from another method on the same device. I have checked that the data received over the network arrives at the same interval and has the same length (336 bytes) as the data produced on the local device. I have also checked that the retain count of the data object is 1 before it is used. It seems that the imageBuffer variable is somehow getting lost.
A couple of questions:
Is casting data.bytes to a CMSampleBufferRef the right way to go about unpacking NSData?
How do I assert that the data being received is actually a CMSampleBuffer object? I want to protect my code, but I'm not sure how to do a runtime type check for Core Foundation objects.
Thanks in advance!

Why are you 'unpacking' (that's not unpacking) your CMSampleBuffer by casting the bytes of an NSData? That's never going to work, because a CMSampleBuffer is not a contiguous block of memory.
You have to retrieve all the relevant data from the CMSampleBuffer yourself before sending, stuff it into an NSData object, and reassemble it on the other side via
OSStatus CMSampleBufferCreate (
CFAllocatorRef allocator,
CMBlockBufferRef dataBuffer,
Boolean dataReady,
CMSampleBufferMakeDataReadyCallback makeDataReadyCallback,
void *makeDataReadyRefcon,
CMFormatDescriptionRef formatDescription,
CMItemCount numSamples,
CMItemCount numSampleTimingEntries,
const CMSampleTimingInfo *sampleTimingArray,
CMItemCount numSampleSizeEntries,
const size_t *sampleSizeArray,
CMSampleBufferRef *sBufOut
);
The data types in this function might give you a hint what you want to extract from the CMSampleBuffer when packing your data.
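As an illustration (not the asker's code): a minimal sketch of the packing side for a video frame, assuming the capture output is configured for a known pixel format such as BGRA. The helper name and the idea of shipping the dimensions along with the pixels are assumptions; on the receiving side you would rebuild a CVPixelBuffer from those bytes (e.g. with CVPixelBufferCreateWithBytes) instead of casting them.
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Hypothetical helper: copy a frame's pixel bytes out of the sample buffer.
- (NSData *)packedDataFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // This copies only the pixels. The receiver also needs width, height,
    // bytesPerRow, and the pixel format to rebuild the buffer, so send those
    // alongside (or agree on them out of band).
    NSData *pixelData = [NSData dataWithBytes:baseAddress
                                       length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return pixelData;
}
This also answers the second question: CFGetTypeID(ref) == CMSampleBufferGetTypeID() will verify a genuine in-process reference, but it can never validate raw bytes received off the wire, because those bytes were never a live CMSampleBuffer in this process to begin with.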

This is probably not the entire answer, but your use of malloc_size seems like a huge red flag to me. It is a non-portable extension, not governed by anything like ANSI, ISO or POSIX, and I have doubts about how it behaves if passed a buffer that didn't come from malloc. It seems like a sketchy thing to rely on. (I would say that if it's come to calling malloc_size, you're already doing something wrong as a C coder, since C is all about knowing how big your buffers are up front, not relying on non-portable libc functions to do your buffer-size tracking for you.)
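To make the concern concrete, a tiny standalone sketch (Darwin-only, via malloc/malloc.h):
#import <Foundation/Foundation.h>
#include <malloc/malloc.h>
#include <stdlib.h>

int main(void) {
    char *p = malloc(100);
    // malloc_size reports the size of the underlying allocation, which the
    // allocator may round up; it is only meaningful for pointers that really
    // came from this malloc zone.
    NSLog(@"requested 100, malloc_size says %zu", malloc_size(p));
    free(p);
    return 0;
}
In the question's code, sampleBuffer is a CMSampleBufferRef rather than a pointer known to have come straight from malloc, so malloc_size(sampleBuffer) has no reliable meaning there; and even when it returns something, copying those bytes copies the opaque struct, not the pixel data it references.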

Related

Obj-C/iOS: How to retrieve contents of NSData and interact?

I'm retrieving a Unix timestamp from a Bluetooth LE peripheral, which is stored in an NSData object. If I print the contents of the NSData object to the debug console, they appear correct; however, if I try to convert the NSData object to an integer value, the integer value appears to keep changing.
NSData *refinedData = [mfrData subdataWithRange:range];
This yields a value of 386d5e9a on the debug console.
I then convert to an integer:
uint32_t unixTimeStamp = refinedData;
Initially, this yields a value of 342162144 on the debug console. However, this value keeps growing, despite the NSData not changing. Can anybody help me understand what's going on?
If it's not already very apparent, I'm a newbie.
Thanks.
refinedData is a pointer to an instance of NSData. You want to access its contents:
uint32_t unixTimeStamp = *(uint32_t *)[refinedData bytes];
Note that this is simplified, and assumes that the bytes returned by the Bluetooth peripheral are the same endianness as the processor in your device, that range is correct, etc.
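A slightly safer variant (copying out of the buffer instead of dereferencing it, then converting byte order) might look like this; it assumes the peripheral sends the timestamp little-endian, so check the device's spec and swap in CFSwapInt32BigToHost if it is big-endian:
uint32_t raw = 0;
[refinedData getBytes:&raw length:sizeof(raw)];
// CFSwapInt32LittleToHost is a no-op on little-endian hosts such as iOS
// devices, but it documents the assumption and survives a port.
uint32_t unixTimeStamp = CFSwapInt32LittleToHost(raw);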

Data loss when sending NSData across the network

I am currently in the middle of implementing the network layer of my game. I am making progress; however, I have come across something very odd which I was hoping someone could shed some light on:
Before sending my data across, I am encoding it into type NSData (message.cards = [MovePlayer beginEncodeMyCards:myCards];) and then assigning it to a pointer (message.cards):
MessageMove message;
message.message.messageType = kMessageTypeMove;
/**/message.cards = [MovePlayer beginEncodeMyCards:myCards];/**/
NSData *data = [NSData dataWithBytes:&message length:sizeof(message)];
If I do all the decoding on the client side as follows:
MessageMove *myMessage = (MessageMove *) [data bytes];
/**/myCards = [MovePlayer beginDecodeMyCards:myMessage->cards cardArray:myCards];/**/
everything works fine; I am able to decode myMessage->cards. However, when I try doing the same after sending the object remotely, I can see the correct message type (kMessageTypeMove) but not the cards data (myMessage->cards).
If anyone can shed some light on this, I would greatly appreciate it.
Many thanks
tl;dr, use NSCoding to serialise your objects.
You can't send pointers to a different process, because the pointers will all be wrong on the other side.
Let's say that on the sending machine, you have this:
NSLog(@"cards: %p", message.cards);
//prints "cards: 0xDEADBEEF"
This says that the chunk of memory for the NSArray* object starts at the memory address 0xDEADBEEF. This is all correct at this point.
Now, you send that 0xDEADBEEF pointer to another computer. The computer gets the pointer, but the memory at 0xDEADBEEF does not have an NSArray in it on the other computer. There could be anything in that region of memory. You've sent a pointer, but you haven't actually sent any of the data with it.
Long story short, just use NSCoder for serialisation because it's very good.
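For example, a minimal sketch of the NSCoder route, with illustrative dictionary keys; the card objects themselves (and anything nested in them) must adopt NSCoding for this to work:
// Sending side: archive the whole object graph into a single NSData.
NSDictionary *message = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithInt:kMessageTypeMove], @"type",
                             myCards, @"cards",
                             nil];
NSData *data = [NSKeyedArchiver archivedDataWithRootObject:message];

// Receiving side: unarchive real objects; no raw pointers cross the wire.
NSDictionary *received = [NSKeyedUnarchiver unarchiveObjectWithData:data];
NSArray *cards = [received objectForKey:@"cards"];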

NSData & malloc in NSTimer, using a lot of memory in timer

I'm quite new to iPhone development. My goal is a remote control app whose server is TightVNC. But I've hit a problem that's driving me crazy... I've successfully connected to the server (using a socket), and next I would like to request a desktop update at least once a second. Hence the timer, which is created through the selector in performSelectorInBackground. The timer's main task is as follows:
int picLength;
[self readExact:(char*)(&picLength) bySize:sizeof(int)];
char *picBuffer;
picBuffer = (char *)malloc(picLength);
[self readExact:picBuffer bySize:picLength];
NSData *picData = [[NSData alloc] initWithBytes:picBuffer length:picLength];
[self performSelectorOnMainThread:@selector(setPicInMainThread:) withObject:picData waitUntilDone:YES];
[picData release];
free(picBuffer);
And "setPicInMainThread" is as follow (each picture is around 200KB, iTouch is connected to PC through computer-to-computer wifi, so the speed would be fast enough):
- (void) setPicInMainThread:(NSData *)data {
[chatController.imageView.image release];
chatController.imageView.image = [UIImage imageWithData:data];
}
The app crashes after presenting the first desktop update. I am wondering if I've run into the "memory leak" issue concerning NSTimer and NSData that lots of people talk about... If so, is there any way to solve the problem? Thank you very much for helping!
Your malloc code looks bad. Why would you take the address of an int and then cast it to a character pointer? And why would you pass an int for the size to the same function you pass a char * to be filled? I have a feeling you are getting an EXC_BAD_ACCESS, and that it's caused by your first few lines and the readExact method. Make sure you read the length correctly and pass the correct parameters:
int picLength;
[self readExact:(char*)(&picLength) bySize:sizeof(int)];
char *picBuffer;
picBuffer = (char *)malloc(picLength);
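A hedged sketch of a more defensive version, assuming readExact:bySize: either fills the buffer completely or fails, and assuming the server sends the length big-endian as the RFB protocol does (verify against TightVNC); the 10 MB cap is an arbitrary sanity limit:
#include <arpa/inet.h>  // ntohl

int32_t picLength = 0;
[self readExact:(char *)&picLength bySize:sizeof(picLength)];
picLength = ntohl(picLength);  // convert from network byte order
if (picLength <= 0 || picLength > 10 * 1024 * 1024) {
    NSLog(@"Suspicious picture length: %d", (int)picLength);
    return;  // bail out instead of handing malloc a garbage size
}
char *picBuffer = (char *)malloc(picLength);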

Playing PCM data on iPhone

I need to play linear PCM data live on an iPhone.
I get a live data stream via RTSP, and I can currently read it on the iPhone, save it to a file, and play it in a desktop audio player that supports PCM, so I think the transport is okay.
Now I'm stuck: I have no idea what to do with my NSData object containing the data.
I did a bit of research and ended up with Audio Units, but I just cannot figure out how to get my NSData into the audio buffer.
For my audio unit instance, I assigned the callback:
AURenderCallbackStruct input;
input.inputProc = makeSound;
input.inputProcRefCon = self;
and wrote the function makeSound:
OSStatus makeSound(
void *inRefCon,
AudioUnitRenderActionFlags *ioActionFlags,
const AudioTimeStamp *inTimeStamp,
UInt32 inBusNumber,
UInt32 inNumberFrames,
AudioBufferList *ioData)
{
//so what to do here?
//ioData->mBuffers[0].mData = [mySound bytes]; does not work, nor does
//ioData->mBuffers = [mySound bytes];
return noErr;
}
Is my approach wrong in general?
What do I need to know/learn/implement? I am a complete audio newbie, so my assumption was that I don't need several buffers: since it's a live stream, when a new sound packet arrives from RTSP the old one is finished. (I base this on my recordings, which just appended the bytes without looking at presentation timestamps, since I don't receive any anyway.)
Cheers
I don't know if this is exactly what you are looking for but some of Matt Gallagher's AudioStreamer code might be helpful to you. In particular, check out how he handles the audio buffering.
http://cocoawithlove.com/2010/03/streaming-mp3aac-audio-again.html
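For illustration, here is a minimal sketch of what the render callback could do, assuming the incoming PCM already matches the unit's stream format. MyPlayer and its pendingAudio property are hypothetical stand-ins for whatever object owns the buffered bytes, and a real version needs locking (or a lock-free ring buffer) between the RTSP thread and the render thread:
#import <AudioToolbox/AudioToolbox.h>
#import <Foundation/Foundation.h>

@interface MyPlayer : NSObject  // hypothetical owner of the buffered PCM
@property (nonatomic, retain) NSMutableData *pendingAudio; // filled by the RTSP reader
@end

@implementation MyPlayer
@synthesize pendingAudio;
@end

static OSStatus makeSound(void *inRefCon,
                          AudioUnitRenderActionFlags *ioActionFlags,
                          const AudioTimeStamp *inTimeStamp,
                          UInt32 inBusNumber,
                          UInt32 inNumberFrames,
                          AudioBufferList *ioData)
{
    MyPlayer *player = (MyPlayer *)inRefCon;       // the self passed in inputProcRefCon
    NSMutableData *pending = player.pendingAudio;  // PCM appended by the RTSP reader

    UInt32 wanted = ioData->mBuffers[0].mDataByteSize;    // bytes Core Audio asks for
    UInt32 have = (UInt32)MIN(wanted, [pending length]);

    // Copy into the buffer Core Audio provides; do not replace the mData pointer.
    memcpy(ioData->mBuffers[0].mData, [pending bytes], have);
    if (have < wanted) {
        // Underrun: pad with silence rather than leaving stale data.
        memset((char *)ioData->mBuffers[0].mData + have, 0, wanted - have);
    }
    // Discard the bytes just consumed.
    [pending replaceBytesInRange:NSMakeRange(0, have) withBytes:NULL length:0];
    return noErr;
}
The key point, and the reason the commented-out lines in the question fail, is that you fill the buffer Core Audio hands you rather than reassigning ioData's pointers.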

Streaming JPEGs, detect end of JPEG

I have created a Java server, which takes screenshots, resizes them, and sends them over TCP/IP to my iPhone application. The application then uses NSInputStream to collect the incoming image data, creates an NSMutableData instance with the byte buffer, and then creates a UIImage object to display on the iPhone. Screen sharing, essentially. My iPhone code to collect the image data is currently as follows:
- (void)stream:(NSStream *)theStream handleEvent:(NSStreamEvent)streamEvent{
if(streamEvent == NSStreamEventHasBytesAvailable && [iStream hasBytesAvailable]){
uint8_t buffer[1024];
while([iStream hasBytesAvailable]){
NSLog(#"New Data");
int len = [iStream read:buffer maxLength:sizeof(buffer)];
[imgdata appendBytes:buffer length:len];
fullen=fullen+len;
/*Here is where the problem lies. What should be in this
if statement in order to make it test the last byte of
the incoming buffer, to tell if it is the End of Image marker
for the end of incoming JPEG file?
*/
if(buffer[len]=='FFD9'){
UIImage *img = [[UIImage alloc] initWithData:imgdata];
NSLog(#"NUMBER OF BYTES: %u", len);
image.image = img;
}
}
}
}
My problem, as indicated by the in-code comment, is figuring out when to stop collecting data in the NSMutableData object and use the data to create a UIImage. It seems to make sense to look for the JPEG End of Image (EOI) marker (FFD9) in the incoming bytes, as the image will be ready for display once it has been sent. How can I test for this? I'm either missing something about how the data is stored, or about the marker within the JPEG file, but any help in testing for this would be greatly appreciated!
James
You obviously don't want to close the stream because that would kill performance.
Since you control the client server connection, send down the # of bytes in the image before sending the image data. Better yet, send down # of bytes in the image, the image data, and an easily identified serial # at the end so you can quickly verify that the data has actually arrived.
Much easier and more efficient than actually checking for the end of file marker. Though, of course, you could also just check for that after the # of bytes have been received, too. Easy enough.
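For example, a minimal sketch of the receiving side under that scheme, assuming the server writes a 4-byte big-endian length before each JPEG; expectedLength is an assumed uint32_t ivar next to imgdata (which must be an NSMutableData), and imgdata keeps accumulating across callbacks:
#include <arpa/inet.h>  // ntohl

// After appending the newly read bytes to imgdata in the stream callback:
while (YES) {
    if (expectedLength == 0) {
        if ([imgdata length] < sizeof(uint32_t)) break;   // header not complete yet
        uint32_t netLen;
        [imgdata getBytes:&netLen length:sizeof(netLen)];
        expectedLength = ntohl(netLen);
        [imgdata replaceBytesInRange:NSMakeRange(0, sizeof(netLen))
                           withBytes:NULL length:0];
    }
    if ([imgdata length] < expectedLength) break;         // frame not complete yet

    NSData *jpeg = [imgdata subdataWithRange:NSMakeRange(0, expectedLength)];
    [imgdata replaceBytesInRange:NSMakeRange(0, expectedLength)
                       withBytes:NULL length:0];
    expectedLength = 0;
    image.image = [UIImage imageWithData:jpeg];           // one complete JPEG frame
}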
Of course, all of this is going to be grossly inefficient for screensharing style purposes in all but the unusual cases. In most cases, only a small part of the screen to be mirrored actually changes with each frame. If you try to send the whole screen with every frame, you'll quickly saturate your connection and the client side will be horribly laggy and unresponsive.
Given that this is an extremely mature market, there are tons of solutions and quite a few open source bits from which you can derive a solution to fit your needs (see VNC, for example).