When should CVMetalTextureRef be released? - objective-c++

I am creating an id<MTLTexture> object from a CVPixelBufferRef in the following way:
id<MTLTexture> CreateMTLTexture(CVMetalTextureCacheRef texture_cache,
CVPixelBufferRef pixel_buffer,
MTLPixelFormat metal_pixel_format, size_t plane,
int height, int width) {
CVMetalTextureRef texture_ref;
CVReturn err = CVMetalTextureCacheCreateTextureFromImage(
kCFAllocatorDefault, texture_cache, pixel_buffer, NULL,
metal_pixel_format, width, height, plane, &texture_ref);
if (err != kCVReturnSuccess) {
// throw error
return nil;
}
id<MTLTexture> texture = CVMetalTextureGetTexture(texture_ref);
//
// Q: is it safe to do CVBufferRelease(texture_ref) here?
//
return texture;
}
When should the CVMetalTextureRef object be released?
Is it safe to release it after the MTLTexture is obtained?

Yes, it is safe.
According to this sample code from the Apple Documentation Archive, it is released right after the Metal texture is obtained from it, like so:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
CVReturn error;
CVImageBufferRef sourceImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
size_t width = CVPixelBufferGetWidth(sourceImageBuffer);
size_t height = CVPixelBufferGetHeight(sourceImageBuffer);
CVMetalTextureRef textureRef;
error = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache, sourceImageBuffer, NULL, MTLPixelFormatBGRA8Unorm, width, height, 0, &textureRef);
if (error)
{
NSLog(#">> ERROR: Couldnt create texture from image");
assert(0);
}
_videoTexture[_constantDataBufferIndex] = CVMetalTextureGetTexture(textureRef);
if (!_videoTexture[_constantDataBufferIndex]) {
NSLog(#">> ERROR: Couldn't get texture from texture ref");
assert(0);
}
CVBufferRelease(textureRef);
}
You can also look at the WebRTC source code in the Google Chromium repository; it releases the CVMetalTextureRef the same way:
- (BOOL)setupTexturesForFrame:(nonnull RTCVideoFrame *)frame {
RTC_DCHECK([frame.buffer isKindOfClass:[RTCCVPixelBuffer class]]);
if (![super setupTexturesForFrame:frame]) {
return NO;
}
CVPixelBufferRef pixelBuffer = ((RTCCVPixelBuffer *)frame.buffer).pixelBuffer;
id<MTLTexture> lumaTexture = nil;
id<MTLTexture> chromaTexture = nil;
CVMetalTextureRef outTexture = nullptr;
// Luma (y) texture.
int lumaWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
int lumaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
int indexPlane = 0;
CVReturn result = CVMetalTextureCacheCreateTextureFromImage(
kCFAllocatorDefault, _textureCache, pixelBuffer, nil, MTLPixelFormatR8Unorm, lumaWidth,
lumaHeight, indexPlane, &outTexture);
if (result == kCVReturnSuccess) {
lumaTexture = CVMetalTextureGetTexture(outTexture);
}
// Same as CFRelease except it can be passed NULL without crashing.
CVBufferRelease(outTexture);
outTexture = nullptr;
// Chroma (CrCb) texture.
indexPlane = 1;
result = CVMetalTextureCacheCreateTextureFromImage(
kCFAllocatorDefault, _textureCache, pixelBuffer, nil, MTLPixelFormatRG8Unorm, lumaWidth / 2,
lumaHeight / 2, indexPlane, &outTexture);
if (result == kCVReturnSuccess) {
chromaTexture = CVMetalTextureGetTexture(outTexture);
}
CVBufferRelease(outTexture);
if (lumaTexture != nil && chromaTexture != nil) {
_yTexture = lumaTexture;
_CrCbTexture = chromaTexture;
return YES;
}
return NO;
}
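Applied to the helper function from the question, a minimal sketch following the same pattern as the two samples above would be:
id<MTLTexture> CreateMTLTexture(CVMetalTextureCacheRef texture_cache,
                                CVPixelBufferRef pixel_buffer,
                                MTLPixelFormat metal_pixel_format, size_t plane,
                                int height, int width) {
  CVMetalTextureRef texture_ref = nullptr;
  CVReturn err = CVMetalTextureCacheCreateTextureFromImage(
      kCFAllocatorDefault, texture_cache, pixel_buffer, NULL,
      metal_pixel_format, width, height, plane, &texture_ref);
  if (err != kCVReturnSuccess) {
    return nil;
  }
  id<MTLTexture> texture = CVMetalTextureGetTexture(texture_ref);
  // Following the samples above, the CVMetalTextureRef wrapper is released
  // as soon as the MTLTexture has been obtained from it.
  CVBufferRelease(texture_ref);
  return texture;
}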

Related

OpenAL alSourceQueueBuffers & alSourceUnqueueBuffers

Everyone, I have a problem with the alSourceUnqueueBuffers API when using the OpenAL library.
My problem is as follows:
1. I play PCM music through a streaming mechanism.
2. The application can queue up one or more buffer names using alSourceQueueBuffers.
When a buffer has been processed, I want to fill it with new audio data in my getSourceState method, but when I use the OpenAL API alSourceUnqueueBuffers, it returns the error AL_INVALID_OPERATION, even though I am doing what the OpenAL documentation says.
I tested one way to work around this problem: I call alSourceStop(source) before alSourceUnqueueBuffers, and alSourcePlay(source) after I have filled the new data via alBufferData and alSourceQueueBuffers. But that is no good, because it breaks up the music.
Who can help me find this problem? And where can I find more information and methods for OpenAL?
I am waiting for your help. Thanks, everyone.
My code is as follows:
.h:
@interface myPlayback : NSObject
{
ALuint source;
ALuint * buffers;
ALCcontext* context;
ALCdevice* device;
unsigned long long offset;
ALenum m_format;
ALsizei m_freq;
void* data;
}
@end
.m
- (void)initOpenAL
{
ALenum error;
// Create a new OpenAL Device
// Pass NULL to specify the system’s default output device
device = alcOpenDevice(NULL);
if (device != NULL)
{
// Create a new OpenAL Context
// The new context will render to the OpenAL Device just created
context = alcCreateContext(device, 0);
if (context != NULL)
{
// Make the new context the Current OpenAL Context
alcMakeContextCurrent(context);
// Create some OpenAL Buffer Objects
buffers = (ALuint*)malloc(sizeof(ALuint) * 5);
alGenBuffers(5, buffers);
if((error = alGetError()) != AL_NO_ERROR) {
NSLog(#"Error Generating Buffers: %x", error);
exit(1);
}
// Create some OpenAL Source Objects
alGenSources(1, &source);
if((error = alGetError()) != AL_NO_ERROR) // assign the error before logging it
{
NSLog(@"Error generating sources! %x\n", error);
exit(1);
}
}
}
// clear any errors
alGetError();
[self initBuffer];
[self initSource];
}
- (void) initBuffer
{
ALenum error = AL_NO_ERROR;
ALenum format;
ALsizei size;
ALsizei freq;
NSBundle* bundle = [NSBundle mainBundle];
// get some audio data from a wave file
CFURLRef fileURL = (CFURLRef)[[NSURL fileURLWithPath:[bundle pathForResource:@"4" ofType:@"caf"]] retain];
if (fileURL)
{
data = MyGetOpenALAudioData(fileURL, &size, &format, &freq);
CFRelease(fileURL);
m_freq = freq;
m_format = format;
if((error = alGetError()) != AL_NO_ERROR) {
NSLog(#"error loading sound: %x\n", error);
exit(1);
}
alBufferData(buffers[0], format, data, READ_SIZE , freq);
offset += READ_SIZE;
alBufferData(buffers[1], format, data + offset, READ_SIZE, freq);
offset += READ_SIZE;
alBufferData(buffers[2], format, data + offset, READ_SIZE, freq);
offset += READ_SIZE;
alBufferData(buffers[3], format, data + offset, READ_SIZE, freq);
offset += READ_SIZE;
alBufferData(buffers[4], format, data + offset, READ_SIZE, freq);
offset += READ_SIZE;
if((error = alGetError()) != AL_NO_ERROR) {
NSLog(#"error attaching audio to buffer: %x\n", error);
}
}
else
NSLog(#"Could not find file!\n");
}
- (void) initSource
{
ALenum error = AL_NO_ERROR;
alGetError(); // Clear the error
// Turn Looping ON
alSourcei(source, AL_LOOPING, AL_TRUE);
// Set Source Position
float sourcePosAL[] = {sourcePos.x, kDefaultDistance, sourcePos.y};
alSourcefv(source, AL_POSITION, sourcePosAL);
// Set Source Reference Distance
alSourcef(source, AL_REFERENCE_DISTANCE, 50.0f);
alSourceQueueBuffers(source, 5, buffers);
if((error = alGetError()) != AL_NO_ERROR) {
NSLog(#"Error attaching buffer to source: %x\n", error);
exit(1);
}
}
- (void)startSound
{
ALenum error;
NSLog(#"Start!\n");
// Begin playing our source file
alSourcePlay(source);
if((error = alGetError()) != AL_NO_ERROR) {
NSLog(#"error starting source: %x\n", error);
} else {
// Mark our state as playing (the view looks at this)
self.isPlaying = YES;
}
while (1) {
[self getSourceState];
}
}
-(void)getSourceState
{
int queued;
int processed;
int state;
alGetSourcei(source, AL_BUFFERS_QUEUED, &queued);
alGetSourcei(source, AL_BUFFERS_PROCESSED, &processed);
alGetSourcei(source, AL_SOURCE_STATE, &state);
NSLog(#"%d", queued);
NSLog(#"%d", processed);
NSLog(#"===================================");
while (processed > 0) {
for (int i = 0; i < processed; ++i) {
ALuint buf;
alGetError();
// alSourceStop(source);
ALenum y = alGetError();
NSLog(#"%d", y);
alSourceUnqueueBuffers(source, 1, &buf);
ALenum i = alGetError();
NSLog(#"%d", i);
processed --;
alBufferData(buf, m_format, data + offset, READ_SIZE, m_freq);
ALenum j = alGetError();
NSLog(#"%d", j);
alSourceQueueBuffers(source, 1, &buf);
ALenum k = alGetError();
NSLog(#"%d", k);
offset += READ_SIZE;
// alSourcePlay(source);
}
}
// [self getSourceState];
}
I found the reason for the problem.
The reason is that I turned looping on: alSourcei(source, AL_LOOPING, AL_TRUE);
If you set this, then when the source has processed a buffer and you want to fill it with new data or delete it from the source, you will get the error. For a streaming source, leave looping off and requeue the buffers yourself.
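A minimal sketch of that refill step, with AL_LOOPING left at its default of AL_FALSE and reusing the names from the question (totalSize, the length of the decoded data, is an assumed variable):
int processed = 0;
alGetSourcei(source, AL_BUFFERS_PROCESSED, &processed);
while (processed-- > 0) {
    ALuint buf;
    // Valid now that the source is not looping, so no AL_INVALID_OPERATION.
    alSourceUnqueueBuffers(source, 1, &buf);
    alBufferData(buf, m_format, (uint8_t *)data + offset, READ_SIZE, m_freq);
    alSourceQueueBuffers(source, 1, &buf);
    offset += READ_SIZE;
    if (offset >= totalSize) offset = 0; // loop manually by wrapping the read offset
}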

Catch ImageIO: <ERROR>

Is it possible to catch the error "ImageIO: JPEGMaximum supported image dimension is 65500 pixels"?
I have written a try/catch section around the corresponding method; however, the catch block never catches the ImageIO exception.
The catch block in the code below will never catch it. Any help on this is appreciated.
-(void)captureToPhotoAlbum {
@try {
UIImage *image = [self glToUIImage]; //to get an image
if (image != nil) {
UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
[self eraseView];
[self showAnAlert:#"Saved Successfully" Message:#""];
}
else {
[self showAnAlert:#"Can't Save!!" Message:#"An error occurred "];
}
}
@catch (NSException *ex) {
[self showAnAlert:@"Can't Save!!" Message:@"An error occurred"];
}
}
-(UIImage *) glToUIImage {
UIImage *myImage = nil;
@try {
NSInteger myDataLength = 800 * 960 * 4;
// allocate array and read pixels into it.
GLubyte *buffer = (GLubyte *) malloc(myDataLength);
glReadPixels(0, 0, 800, 960, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
// gl renders "upside down" so swap top to bottom into new array.
// there's gotta be a better way, but this works.
GLubyte *buffer2 = (GLubyte *) malloc(myDataLength);
for(int y = 0; y < 960; y++)
{
for(int x = 0; x < 800 * 4; x++)
{
buffer2[(959 - y) * 800 * 4 + x] = buffer[y * 4 * 800 + x];
}
}
// make data provider with data.
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, NULL);
// prep the ingredients
int bitsPerComponent = 8;
int bitsPerPixel = 32;
int bytesPerRow = 4 * 800;
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
// make the cgimage
CGImageRef imageRef = CGImageCreate(800, 444444444, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
// then make the uiimage from that
myImage = [UIImage imageWithCGImage:imageRef];
}
@catch (NSException *ex) {
@throw;
}
@finally {
return myImage;
}
}
UIImageWriteToSavedPhotosAlbum does not throw an exception, so there is nothing to catch. If you want to receive an error message, you should pass a callback as the third argument:
UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
Then you can implement the following method, which will be called once the image is saved:
- (void) image: (UIImage *) image
didFinishSavingWithError: (NSError *) error
contextInfo: (void *) contextInfo;
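A minimal sketch of that callback, reusing the showAnAlert: helper from the question:
- (void) image: (UIImage *) image
didFinishSavingWithError: (NSError *) error
   contextInfo: (void *) contextInfo
{
    if (error != nil) {
        // Save failures, including ImageIO errors, arrive here as an NSError.
        [self showAnAlert:@"Can't Save!!" Message:[error localizedDescription]];
    } else {
        [self showAnAlert:@"Saved Successfully" Message:@""];
    }
}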

Video from a set of images has an RGB problem

Hi guys,
I used FFmpeg to create a video from a sequence of images. The following is my code.
-(void)imageToMov:(NSString*)videoName imageNumber:(int)imageNumber{
[[NSFileManager defaultManager]createDirectoryAtPath:[Utilities documentsPath:[NSString stringWithFormat:@"CoachedFiles/"]] attributes:nil];
[[NSFileManager defaultManager]createDirectoryAtPath:[Utilities documentsPath:[NSString stringWithFormat:@"CoachedFiles/%@/",videoName]] attributes:nil];
// create the output file
[[NSFileManager defaultManager]createFileAtPath:[Utilities documentsPath:[NSString stringWithFormat:@"CoachedFiles/%@/%@.mov",videoName,videoName]] contents:nil attributes:nil];
//[[NSFileManager defaultManager]createFileAtPath:[Utilities documentsPath:[NSString stringWithFormat:@"temp/temp.mov"]] contents:nil attributes:nil];
const char *outfilename = [[Utilities documentsPath:[NSString stringWithFormat:@"CoachedFiles/%@/%@.mov",videoName,videoName]]UTF8String];
UIImage * tempImage = [UIImage imageWithContentsOfFile:[Utilities documentsPath:[NSString stringWithFormat:@"temp/temp0000.jpeg"]]];
AVFormatContext *pFormatCtxEnc;
AVCodecContext *pCodecCtxEnc;
AVCodec *pCodecEnc;
AVFrame *pFrameEnc;
AVOutputFormat *pOutputFormat;
AVStream *video_st;
int i;
int outbuf_size;
uint8_t *outbuf;
int out_size;
// Register all formats and codecs
av_register_all();
// auto detect the output format from the name. default is mpeg.
pOutputFormat = av_guess_format(NULL, outfilename, NULL);
if (pOutputFormat == NULL)
return;
// allocate the output media context
pFormatCtxEnc = avformat_alloc_context();
if (pFormatCtxEnc == NULL)
return;
pFormatCtxEnc->oformat = pOutputFormat;
sprintf(pFormatCtxEnc->filename, "%s", outfilename);
video_st = av_new_stream(pFormatCtxEnc, 0); // 0 for video
pCodecCtxEnc = video_st->codec;
pCodecCtxEnc->codec_id = pOutputFormat->video_codec;
pCodecCtxEnc->codec_type = CODEC_TYPE_VIDEO;
// put sample parameters
pCodecCtxEnc->bit_rate = 500000;
// resolution must be a multiple of two
pCodecCtxEnc->width = tempImage.size.width;
pCodecCtxEnc->height = tempImage.size.height;
// frames per second
pCodecCtxEnc->time_base.den = 1;
pCodecCtxEnc->time_base.num = 1;
pCodecCtxEnc->pix_fmt = PIX_FMT_YUV420P;
pCodecCtxEnc->gop_size = 12; /* emit one intra frame every ten frames */
if (pCodecCtxEnc->codec_id == CODEC_ID_MPEG1VIDEO){
/* needed to avoid using macroblocks in which some coeffs overflow;
this doesn't happen with normal video, it just happens here as the
motion of the chroma plane doesn't match the luma plane */
pCodecCtxEnc->mb_decision=2;
}
// some formats want stream headers to be separate
if(!strcmp(pFormatCtxEnc->oformat->name, "mp4") || !strcmp(pFormatCtxEnc->oformat->name, "mov") || !strcmp(pFormatCtxEnc->oformat->name, "3gp"))
pCodecCtxEnc->flags |= CODEC_FLAG_GLOBAL_HEADER;
// set the output parameters (must be done even if no parameters).
if (av_set_parameters(pFormatCtxEnc, NULL) < 0) {
return;
}
// find the video encoder
pCodecEnc = avcodec_find_encoder(pCodecCtxEnc->codec_id);
if (pCodecEnc == NULL)
return;
// open it
if (avcodec_open(pCodecCtxEnc, pCodecEnc) < 0) {
return;
}
if (!(pFormatCtxEnc->oformat->flags & AVFMT_RAWPICTURE)) {
/* allocate output buffer */
/* XXX: API change will be done */
outbuf_size = 500000;
outbuf = av_malloc(outbuf_size);
}
pFrameEnc= avcodec_alloc_frame();
// open the output file, if needed
if (!(pOutputFormat->flags & AVFMT_NOFILE)) {
if (url_fopen(&pFormatCtxEnc->pb, outfilename, URL_WRONLY) < 0) {
//fprintf(stderr, "Could not open '%s'\n", filename);
return;
}
}
// write the stream header, if any
av_write_header(pFormatCtxEnc);
// Read frames and save frames to disk
int size = pCodecCtxEnc->width * pCodecCtxEnc->height;
uint8_t * picture_buf;
picture_buf = malloc((size * 3) / 2);
pFrameEnc->data[0] = picture_buf;
pFrameEnc->data[1] = pFrameEnc->data[0] + size;
pFrameEnc->data[2] = pFrameEnc->data[1] + size / 4;
pFrameEnc->linesize[0] = pCodecCtxEnc->width;
pFrameEnc->linesize[1] = pCodecCtxEnc->width / 2;
pFrameEnc->linesize[2] = pCodecCtxEnc->width / 2;
for (i=0;i<imageNumber;i++){
NSString *imgName = [NSString stringWithFormat:@"temp/temp%04d.jpeg",i];
NSLog(@"%@",imgName);
UIImage * image = [UIImage imageWithContentsOfFile:[Utilities documentsPath:imgName]];
// note: imgName is autoreleased and must not be released here
// create an AVPicture
AVPicture pict;
// the format stays BGRA; the last two arguments are the input image's width and height
avpicture_alloc(&pict, PIX_FMT_BGRA, image.size.width, image.size.height);
// read the image data
CGImageRef cgimage = [image CGImage];
CGDataProviderRef dataProvider = CGImageGetDataProvider(cgimage);
CFDataRef data = CGDataProviderCopyData(dataProvider);
const uint8_t * imagedata = CFDataGetBytePtr(data);
// fill the AVPicture with the image data
avpicture_fill(&pict, imagedata, PIX_FMT_BGRA, image.size.width, image.size.height);
// set up the conversion context, from BGRA to YUV420
static int sws_flags = SWS_FAST_BILINEAR;
struct SwsContext * img_convert_ctx = sws_getContext(image.size.width,
image.size.height,
PIX_FMT_BGRA,
image.size.width,
image.size.height,
PIX_FMT_YUV420P,
sws_flags, NULL, NULL, NULL);
// convert the image data format
sws_scale (img_convert_ctx, pict.data, pict.linesize,
0, image.size.height,
pFrameEnc->data, pFrameEnc->linesize);
if (pFormatCtxEnc->oformat->flags & AVFMT_RAWPICTURE) {
/* raw video case. The API will change slightly in the near
future for that */
AVPacket pkt;
av_init_packet(&pkt);
pkt.flags |= PKT_FLAG_KEY;
pkt.stream_index= video_st->index;
pkt.data= (uint8_t *)pFrameEnc;
pkt.size= sizeof(AVPicture);
av_write_frame(pFormatCtxEnc, &pkt);
} else {
// encode the image
out_size = avcodec_encode_video(pCodecCtxEnc, outbuf, outbuf_size, pFrameEnc);
// if zero size, it means the image was buffered
if (out_size != 0) {
AVPacket pkt;
av_init_packet(&pkt);
pkt.pts= pCodecCtxEnc->coded_frame->pts;
if(pCodecCtxEnc->coded_frame->key_frame)
pkt.flags |= PKT_FLAG_KEY;
pkt.stream_index= video_st->index;
pkt.data= outbuf;
pkt.size= out_size;
// write the compressed frame in the media file
av_write_frame(pFormatCtxEnc, &pkt);
}
}
}
// get the delayed frames
for(; out_size; i++) {
out_size = avcodec_encode_video(pCodecCtxEnc, outbuf, outbuf_size, NULL);
if (out_size != 0) {
AVPacket pkt;
av_init_packet(&pkt);
pkt.pts= pCodecCtxEnc->coded_frame->pts;
if(pCodecCtxEnc->coded_frame->key_frame)
pkt.flags |= PKT_FLAG_KEY;
pkt.stream_index= video_st->index;
pkt.data= outbuf;
pkt.size= out_size;
// write the compressed frame in the media file
av_write_frame(pFormatCtxEnc, &pkt);
}
}
// Close the codec
//avcodec_close(pCodecCtxDec);
avcodec_close(pCodecCtxEnc);
// Free the YUV frame
//av_free(pFrameDec);
av_free(pFrameEnc);
av_free(outbuf);
// write the trailer, if any
av_write_trailer(pFormatCtxEnc);
// free the streams
for(i = 0; i < pFormatCtxEnc->nb_streams; i++) {
av_freep(&pFormatCtxEnc->streams[i]->codec);
av_freep(&pFormatCtxEnc->streams[i]);
}
if (!(pOutputFormat->flags & AVFMT_NOFILE)) {
/* close the output file */
//comment out this code to fix the record video issue. Kevin 2010-07-11
//url_fclose(&pFormatCtxEnc->pb);
}
/* free the stream */
av_free(pFormatCtxEnc);
if([[NSFileManager defaultManager]fileExistsAtPath:[Utilities documentsPath:[NSString stringWithFormat:@"temp/"]] isDirectory:NULL]){
[[NSFileManager defaultManager] removeItemAtPath:[Utilities documentsPath:[NSString stringWithFormat:@"temp/"]] error:nil];
}
//[self MergeVideoFileWithVideoName:videoName];
[self SaveFileDetails:videoName];
[alertView dismissWithClickedButtonIndex:0 animated:YES];
}
Now the problem: the video is created successfully, but the colors come out greenish. Please point out my mistake in this code.
I am not an expert on colors, but I think your problem might be in the pixel format you use for your img_convert_ctx. You are using PIX_FMT_BGRA; that is a different channel order than RGB, with the color channels reversed. Try using PIX_FMT_RGBA instead.
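Concretely, that means declaring the source data as RGBA in every place the code above declares it as BGRA, along these lines:
avpicture_alloc(&pict, PIX_FMT_RGBA, image.size.width, image.size.height);
avpicture_fill(&pict, imagedata, PIX_FMT_RGBA, image.size.width, image.size.height);
struct SwsContext * img_convert_ctx = sws_getContext(image.size.width,
                                                     image.size.height,
                                                     PIX_FMT_RGBA,    // source format changed from PIX_FMT_BGRA
                                                     image.size.width,
                                                     image.size.height,
                                                     PIX_FMT_YUV420P, // destination format unchanged
                                                     sws_flags, NULL, NULL, NULL);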

Is there any way to improve time between shots with AVCaptureStillImageOutput?

I currently use the following code to shoot a series of images:
- (void)shootSeries:(int)photos {
if(photos == 0) {
[self mergeImages];
} else {
[output captureStillImageAsynchronouslyFromConnection:connection completionHandler:
^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
NSLog(#"Shot picture %d.", 7 - photos);
[self shootSeries:(photos - 1)];
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
int dataSize = CVPixelBufferGetDataSize(pixelBuffer);
CFDataRef data = CFDataCreate(NULL, (const UInt8 *)CVPixelBufferGetBaseAddress(pixelBuffer), dataSize);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
CGDataProviderRef dataProvider = CGDataProviderCreateWithCFData(data);
CFRelease(data);
CGImageRef image = CGImageCreate(CVPixelBufferGetWidth(pixelBuffer),
CVPixelBufferGetHeight(pixelBuffer),
8, 32,
CVPixelBufferGetBytesPerRow(pixelBuffer),
colorspace,
kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
dataProvider, NULL, true, kCGRenderingIntentDefault);
CFRelease(dataProvider);
CFArrayAppendValue(shotPictures, image);
CFRelease(image);
}];
}
}
While this works rather well, it is very slow. How come apps like ClearCam can shoot series of pictures much faster than this, and how can I do it too?
After capturing the image, store the sample buffer in a CFArray, and once you're done taking all your photos, THEN convert them to images (or in your case CGImageRefs).
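A minimal sketch of that change (shotBuffers, a CFMutableArrayRef ivar, and the convertBuffersAndMerge method are assumed here; the buffer must be retained so it outlives the callback):
- (void)shootSeries:(int)photos {
    if(photos == 0) {
        // Only now convert the stored sample buffers to CGImageRefs, using the
        // same CVPixelBuffer-to-CGImageCreate path as above, then merge.
        [self convertBuffersAndMerge];
    } else {
        [output captureStillImageAsynchronouslyFromConnection:connection completionHandler:
            ^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                CFRetain(imageDataSampleBuffer); // keep the raw buffer alive past the callback
                CFArrayAppendValue(shotBuffers, imageDataSampleBuffer);
                [self shootSeries:(photos - 1)];
            }];
    }
}
Each stored buffer should be released with CFRelease once it has been converted.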

Trying to find a USB device on iPhone with IOKit.framework

I'm working on a project where I need the USB port to communicate with an external device. I have been looking for examples on the net (Apple's, and the /developer/IOKit/usb examples) and trying some others, but I can't even find the device.
In my code, I get stuck at the place where the function fetches the next iterator with IOIteratorNext; it never returns a good value, so the code blocks. By the way, I am using the toolchain and have added IOKit.framework to my project. All I want right now is to communicate with, or just ping, something on the USB bus! I am stuck in FindDevice: I never manage to enter the while loop, because the variable usbDevice is always 0. I have tested my code in a small Mac program and it works.
Here is my code:
IOReturn ConfigureDevice(IOUSBDeviceInterface **dev) {
UInt8 numConfig;
IOReturn result;
IOUSBConfigurationDescriptorPtr configDesc;
//Get the number of configurations
result = (*dev)->GetNumberOfConfigurations(dev, &numConfig);
if (!numConfig) {
return -1;
}
// Get the configuration descriptor
result = (*dev)->GetConfigurationDescriptorPtr(dev, 0, &configDesc);
if (result) {
NSLog(#"Couldn't get configuration descriptior for index %d (err=%08x)\n", 0, result);
return -1;
}
#ifdef OSX_DEBUG
NSLog(#"Number of Configurations: %d\n", numConfig);
#endif
// Configure the device
result = (*dev)->SetConfiguration(dev, configDesc->bConfigurationValue);
if (result)
{
NSLog(#"Unable to set configuration to value %d (err=%08x)\n", 0, result);
return -1;
}
return kIOReturnSuccess;
}
IOReturn FindInterfaces(IOUSBDeviceInterface **dev, IOUSBInterfaceInterface ***itf) {
IOReturn kr;
IOUSBFindInterfaceRequest request;
io_iterator_t iterator;
io_service_t usbInterface;
IOUSBInterfaceInterface **intf = NULL;
IOCFPlugInInterface **plugInInterface = NULL;
HRESULT res;
SInt32 score;
UInt8 intfClass;
UInt8 intfSubClass;
UInt8 intfNumEndpoints;
int pipeRef;
CFRunLoopSourceRef runLoopSource;
NSLog(#"Debut FindInterfaces \n");
request.bInterfaceClass = kIOUSBFindInterfaceDontCare;
request.bInterfaceSubClass = kIOUSBFindInterfaceDontCare;
request.bInterfaceProtocol = kIOUSBFindInterfaceDontCare;
request.bAlternateSetting = kIOUSBFindInterfaceDontCare;
kr = (*dev)->CreateInterfaceIterator(dev, &request, &iterator);
usbInterface = IOIteratorNext(iterator);
IOObjectRelease(iterator);
NSLog(#"Interface found.\n");
kr = IOCreatePlugInInterfaceForService(usbInterface, kIOUSBInterfaceUserClientTypeID, kIOCFPlugInInterfaceID, &plugInInterface, &score);
kr = IOObjectRelease(usbInterface); // done with the usbInterface object now that I have the plugin
if ((kIOReturnSuccess != kr) || !plugInInterface)
{
NSLog(#"unable to create a plugin (%08x)\n", kr);
return -1;
}
// I have the interface plugin. I need the interface interface
res = (*plugInInterface)->QueryInterface(plugInInterface, CFUUIDGetUUIDBytes(kIOUSBInterfaceInterfaceID), (LPVOID*) &intf);
(*plugInInterface)->Release(plugInInterface); // done with this
if (res || !intf)
{
NSLog(#"couldn't create an IOUSBInterfaceInterface (%08x)\n", (int) res);
return -1;
}
// Now open the interface. This will cause the pipes to be instantiated that are
// associated with the endpoints defined in the interface descriptor.
kr = (*intf)->USBInterfaceOpen(intf);
if (kIOReturnSuccess != kr)
{
NSLog(#"unable to open interface (%08x)\n", kr);
(void) (*intf)->Release(intf);
return -1;
}
kr = (*intf)->CreateInterfaceAsyncEventSource(intf, &runLoopSource);
if (kIOReturnSuccess != kr)
{
NSLog(#"unable to create async event source (%08x)\n", kr);
(void) (*intf)->USBInterfaceClose(intf);
(void) (*intf)->Release(intf);
return -1;
}
CFRunLoopAddSource(CFRunLoopGetCurrent(), runLoopSource, kCFRunLoopDefaultMode);
if (!intf)
{
NSLog(#"Interface is NULL!\n");
} else
{
*itf = intf;
}
NSLog(#"End of FindInterface \n \n");
return kr;
}
unsigned int FindDevice(void *refCon, io_iterator_t iterator) {
kern_return_t kr;
io_service_t usbDevice;
IOCFPlugInInterface **plugInInterface = NULL;
HRESULT result;
SInt32 score;
UInt16 vendor;
UInt16 product;
UInt16 release;
unsigned int count = 0;
NSLog(#"Searching Device....\n");
while (usbDevice = IOIteratorNext(iterator))
{
// create intermediate plug-in
NSLog(#"Found a device!\n");
kr = IOCreatePlugInInterfaceForService(usbDevice,
kIOUSBDeviceUserClientTypeID,
kIOCFPlugInInterfaceID,
&plugInInterface, &score);
kr = IOObjectRelease(usbDevice);
if ((kIOReturnSuccess != kr) || !plugInInterface) {
NSLog(#"Unable to create a plug-in (%08x)\n", kr);
continue;
}
// Now create the device interface
result = (*plugInInterface)->QueryInterface(plugInInterface,
CFUUIDGetUUIDBytes(kIOUSBDeviceInterfaceID),
(LPVOID)&dev);
// Don't need intermediate Plug-In Interface
(*plugInInterface)->Release(plugInInterface);
if (result || !dev) {
NSLog(#"Couldn't create a device interface (%08x)\n",
(int)result);
continue;
}
// check these values for confirmation
kr = (*dev)->GetDeviceVendor(dev, &vendor);
kr = (*dev)->GetDeviceProduct(dev, &product);
//kr = (*dev)->GetDeviceReleaseNumber(dev, &release);
//if ((vendor != LegoUSBVendorID) || (product != LegoUSBProductID) || (release != LegoUSBRelease)) {
if ((vendor != LegoUSBVendorID) || (product != LegoUSBProductID))
{
NSLog(#"Found unwanted device (vendor = %d != %d, product = %d != %d, release = %d)\n",
vendor, kUSBVendorID, product, LegoUSBProductID, release);
(void) (*dev)->Release(dev);
continue;
}
// Open the device to change its state
kr = (*dev)->USBDeviceOpen(dev);
if (kr == kIOReturnSuccess) {
count++;
} else {
NSLog(#"Unable to open device: %08x\n", kr);
(void) (*dev)->Release(dev);
continue;
}
// Configure device
kr = ConfigureDevice(dev);
if (kr != kIOReturnSuccess) {
NSLog(#"Unable to configure device: %08x\n", kr);
(void) (*dev)->USBDeviceClose(dev);
(void) (*dev)->Release(dev);
continue;
}
break;
}
return count;
}
// USB rcx Init
IOUSBInterfaceInterface** osx_usb_rcx_init (void)
{
CFMutableDictionaryRef matchingDict;
kern_return_t result;
IOUSBInterfaceInterface **intf = NULL;
unsigned int device_count = 0;
// Create master handler
result = IOMasterPort(MACH_PORT_NULL, &gMasterPort);
if (result || !gMasterPort)
{
NSLog(#"ERR: Couldn't create master I/O Kit port(%08x)\n", result);
return NULL;
}
else {
NSLog(#"Created Master Port.\n");
NSLog(#"Master port 0x:08X \n \n", gMasterPort);
}
// Set up the matching dictionary for class IOUSBDevice and its subclasses
matchingDict = IOServiceMatching(kIOUSBDeviceClassName);
if (!matchingDict) {
NSLog(#"Couldn't create a USB matching dictionary \n");
mach_port_deallocate(mach_task_self(), gMasterPort);
return NULL;
}
else {
NSLog(#"USB matching dictionary : %08X \n", matchingDict);
}
CFDictionarySetValue(matchingDict, CFSTR(kUSBVendorID),
CFNumberCreate(kCFAllocatorDefault, kCFNumberShortType, &LegoUSBVendorID));
CFDictionarySetValue(matchingDict, CFSTR(kUSBProductID),
CFNumberCreate(kCFAllocatorDefault, kCFNumberShortType, &LegoUSBProductID));
result = IOServiceGetMatchingServices(gMasterPort, matchingDict, &gRawAddedIter);
matchingDict = 0; // this was consumed by the above call
// Iterate over matching devices to access already present devices
NSLog(#"RawAddedIter : 0x:%08X \n", &gRawAddedIter);
device_count = FindDevice(NULL, gRawAddedIter);
if (device_count == 1)
{
result = FindInterfaces(dev, &intf);
if (kIOReturnSuccess != result)
{
NSLog(#"unable to find interfaces on device: %08x\n", result);
(*dev)->USBDeviceClose(dev);
(*dev)->Release(dev);
return NULL;
}
// osx_usb_rcx_wakeup(intf);
return intf;
}
else if (device_count > 1)
{
NSLog(#"too many matching devices (%d) !\n", device_count);
}
else
{
NSLog(#"no matching devices found\n");
}
return NULL;
}
int main(int argc, char *argv[])
{
int returnCode;
NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
NSLog(#"Debut du programme \n \n");
osx_usb_rcx_init();
NSLog(#"Fin du programme \n \n");
return 0;
// returnCode = UIApplicationMain(argc, argv, @"Untitled1App", @"Untitled1App");
// [pool release];
// return returnCode;
}
IOKit is not available to iPhone applications. If you need to connect to external devices from the iPhone, you need to sign up for the MFi Program, which will provide you with the needed APIs and documentation.
Besides the App Store rules, I don't think you can even touch IOKit on iOS without violating the SDK agreement.