Following the question from this post:
Unable to display printer options with AirPrint
I have a few questions:
1) One of the answers was to set controller.printItem to a URL. Similarly, can I set multiple URLs? Basically, I have a set of URLs I need to print in one shot. Is it possible to set controller.printItems to an array of URLs? Also, I know controller.printItem takes type 'data', so how do I convert a web-based image URL to type 'data'?
2) For some weird reason, double-sided is set to on by default every time I reach the print dialog. What variable do I need to set to turn that off? It would be great if I could just not show the option to the user at all.
Try this code, it may help you:
- (IBAction)btnPrintTapped:(id)sender {
    NSData *imageData = UIImagePNGRepresentation(self.imgV.image);
    [self printItem:imageData];
}

#pragma mark - Printing

- (void)printItem:(NSData *)data {
    printController = [UIPrintInteractionController sharedPrintController];
    if (printController && [UIPrintInteractionController canPrintData:data]) {
        printController.delegate = self;

        UIPrintInfo *printInfo = [UIPrintInfo printInfo];
        printInfo.outputType = UIPrintInfoOutputGeneral;
        printInfo.jobName = [NSString stringWithFormat:@""];
        printInfo.duplex = UIPrintInfoDuplexLongEdge;
        printController.printInfo = printInfo;
        printController.showsPageRange = YES;
        printController.printingItem = data;

        void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) = ^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
            if (!completed && error) {
                //NSLog(@"FAILED! due to error in domain %@ with error code %u", error.domain, error.code);
            }
        };

        [printController presentFromBarButtonItem:self.item animated:YES completionHandler:completionHandler];
    }
}

- (BOOL)presentFromRect:(CGRect)rect inView:(UIView *)view animated:(BOOL)animated completionHandler:(UIPrintInteractionCompletionHandler)completion {
    return YES;
}

- (BOOL)presentFromBarButtonItem:(UIBarButtonItem *)item animated:(BOOL)animated completionHandler:(UIPrintInteractionCompletionHandler)completion {
    return YES;
}
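As for the double-sided default in your second question: duplex is driven by UIPrintInfo. A minimal tweak to the code above, assuming you simply want single-sided output (whether the duplex row is shown in the dialog at all is decided by the system based on the printer):

// Request single-sided printing instead of long-edge duplex.
printInfo.duplex = UIPrintInfoDuplexNone;
printController.printInfo = printInfo;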
I know it's late, but this might help if someone needs it:
Create an array of URLs and assign it to the printingItems property of UIPrintInteractionController.
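A rough sketch of what that could look like (the URLs here are placeholders; note that dataWithContentsOfURL: is synchronous, so a real app should fetch the image data on a background queue):

// Hypothetical list of web image URLs to print in one job.
NSArray *urls = [NSArray arrayWithObjects:
                 [NSURL URLWithString:@"https://example.com/a.png"],
                 [NSURL URLWithString:@"https://example.com/b.png"],
                 nil];
// printingItems takes an array of printable objects (NSURL, NSData, UIImage, ...).
printController.printingItems = urls;
// If you need NSData instead (e.g. for canPrintData:), a web image URL can be
// converted like this; prefer an asynchronous download in production code.
NSData *imageData = [NSData dataWithContentsOfURL:[urls objectAtIndex:0]];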
I'm trying to make a custom matchmaking view using a matchmaker. The code below is used to find a match.
When I run this on two different devices with different Game Center accounts, both will get a match but neither will connect to it. They just get stuck in the while loop forever and never get out. Have I missed something? Do you need to call something to actually connect to the match?
- (void)findMatch {
    GKMatchRequest *request = [[GKMatchRequest alloc] init];
    request.minPlayers = 2;
    request.maxPlayers = 2;
    request.playersToInvite = nil;

    NSLog(@"Start searching!");
    [matchmaker findMatchForRequest:request
              withCompletionHandler:^(GKMatch *match, NSError *error)
    {
        if (error) {
            // Print the error
            NSLog(@"%@", error.localizedDescription);
        }
        else if (match != nil)
        {
            curMatch = match;
            curMatch.delegate = self;
            NSLog(@"Expected: %i", match.expectedPlayerCount);
            while (match.expectedPlayerCount != 0) {
                NSLog(@"Players: %i", curMatch.playerIDs.count);
            }
            NSLog(@"Start match!");
        }
    }];
}
You should not be using a while loop to wait for expectedPlayerCount to reach 0. Instead, implement the GKMatchDelegate method:
- (void)match:(GKMatch *)match player:(NSString *)playerID didChangeState:(GKPlayerConnectionState)state {
    if (!self.matchStarted && match.expectedPlayerCount == 0) {
        self.matchStarted = YES;
        // Now you should start your match.
    }
}
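With that delegate method in place, your completion handler only needs to keep a reference to the match and set its delegate; the busy-wait loop goes away entirely. A minimal sketch, assuming curMatch and matchStarted belong to the same controller:

[matchmaker findMatchForRequest:request
          withCompletionHandler:^(GKMatch *match, NSError *error) {
    if (error) {
        NSLog(@"%@", error.localizedDescription);
        return;
    }
    // Just remember the match; match:player:didChangeState: fires
    // once the other player actually connects.
    curMatch = match;
    curMatch.delegate = self;
}];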
Hi there, can anyone help me with this? I am new to iOS development. I am trying to implement a print feature, but I am getting errors. The view is nothing but a .xib file with some labels and a text view; I just want to print that view. When I press the print button, the program crashes.
Here is my code:
- (NSData *)generatePDFDataForPrinting {
    NSMutableData *myPdfData = [NSMutableData data];
    UIGraphicsBeginPDFContextToData(myPdfData, kPDFPageBounds, nil);
    UIGraphicsBeginPDFPage();
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    [self drawStuffInContext:ctx]; // Method also usable from drawRect:.
    UIGraphicsEndPDFContext();
    return myPdfData;
}

- (void)drawStuffInContext:(CGContextRef)ctx {
    UIFont *font = [UIFont fontWithName:@"Zapfino" size:48];
    CGRect textRect = CGRectInset(kPDFPageBounds, 36, 36);
    [@"hello world!" drawInRect:textRect withFont:font];
}

- (IBAction)printFromIphone:(id)sender {
    float systemVersion = [[[UIDevice currentDevice] systemVersion] floatValue];
    if (systemVersion > 4.1) {
        NSData *myPdfData = [NSData dataWithContentsOfFile:myPdfData]; //check the value inside |myPdfData| and |pdfPath| is the path of your pdf.
        UIPrintInteractionController *controller = [UIPrintInteractionController sharedPrintController];
        if (controller && [UIPrintInteractionController canPrintData:myPdfData]) {
            //controller.delegate = delegate; //if necessary else nil
            UIPrintInfo *printInfo = [UIPrintInfo printInfo];
            printInfo.outputType = UIPrintInfoOutputGeneral;
            printInfo.jobName = [myPdfData lastPathComponent];
            //printInfo.duplex = UIPrintInfoDuplexLongEdge;
            controller.printInfo = printInfo;
            controller.showsPageRange = YES;
            controller.printingItem = myPdfData;
            // We need a completion handler block for printing.
            UIPrintInteractionCompletionHandler completionHandler = ^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
                if (completed && error) {
                    NSLog(@"FAILED! due to error in domain %@ with error code %u", error.domain, error.code);
                }
            };
            // [controller presentFromRect:CGRectMake(200, 300, 100, 100) inView:senderView animated:YES completionHandler:completionHandler];
        } else {
            NSLog(@"Couldn't get shared UIPrintInteractionController!");
        }
    }
}
Not sure if it's a typo or not, but you've commented out your actual pdfData, which is why it's undeclared.
This line needs to be uncommented because you need myPdfData:
//NSData *myPdfData = [NSData dataWithContentsOfFile:pdfData]; //check the value inside |myPdfData| and |pdfPath| is the path of your pdf.
You can replace it with this line to use YOUR PDF instead of a file:
NSData *myPdfData = [self generatePDFDataForPrinting];
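Put together, the relevant part of printFromIphone would look roughly like this (a sketch only; note that once myPdfData holds raw PDF data, jobName should be a plain string rather than [myPdfData lastPathComponent]):

NSData *myPdfData = [self generatePDFDataForPrinting];
UIPrintInteractionController *controller = [UIPrintInteractionController sharedPrintController];
if (controller && [UIPrintInteractionController canPrintData:myPdfData]) {
    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputGeneral;
    printInfo.jobName = @"My PDF"; // a plain string, not a path component of NSData
    controller.printInfo = printInfo;
    controller.printingItem = myPdfData;
    [controller presentAnimated:YES completionHandler:nil];
}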
I don't know anything about Game Center. I am working on a Cocos application.
I went through the Game Center guide and was able to integrate Game Center into my application.
1)
Now I am able to enter the score into Game Center through my application,
by doing this:
- (void)reportScore:(int64_t)score forCategory:(NSString *)category
{
    GKScore *scoreReporter = [[[GKScore alloc] initWithCategory:category] autorelease];
    scoreReporter.value = score;
    [scoreReporter reportScoreWithCompletionHandler:^(NSError *error)
    {
        [self callDelegateOnMainThread:@selector(scoreReported:) withArg:NULL error:error];
    }];
}
It was working properly until yesterday, but now when I try to update the score it does not update. Does anyone know why the score is not updating in Game Center?
I have tried to search it out but have not found anything.
2) I want to send the current latitude and longitude of one player to another.
I came to know that this is possible with GKMatch:
GKMatchRequest *request = [[[GKMatchRequest alloc] init] autorelease];
request.minPlayers = 1;
request.maxPlayers = 1;
[[GKMatchmaker sharedMatchmaker] findMatchForRequest:request
                               withCompletionHandler:^(GKMatch *match, NSError *error) {
    if (error || !match) {
        // handle the error
    }
    else if (match != nil) {
        // match found
    }
}];

/*GKMatchRequest *request = [[[GKMatchRequest alloc] init] autorelease];
request.minPlayers = 1;
request.maxPlayers = 1;
GKMatchmakerViewController *mmvc = [[GKMatchmakerViewController alloc] initWithMatchRequest:request];
mmvc.matchmakerDelegate = self;
currentMatch.delegate = self;
[self presentModalViewController:mmvc animated:YES];*/
//[mmvc release];
}
- (void)match:(GKMatch *)match player:(NSString *)playerID didChangeState:(GKPlayerConnectionState)state {
    switch (state) {
        case GKPlayerStateConnected:
            if (match.expectedPlayerCount == 0) {
                // start the match
            }
            break;
        case GKPlayerStateDisconnected:
            // remove the player from the match, notify other players, etc
            break;
    }
}

- (void)matchmakerViewControllerWasCancelled:(GKMatchmakerViewController *)viewController {}

// Matchmaking has failed with an error
- (void)matchmakerViewController:(GKMatchmakerViewController *)viewController didFailWithError:(NSError *)error {
}

// A peer-to-peer match has been found, the game should start
- (void)matchmakerViewController:(GKMatchmakerViewController *)viewController didFindMatch:(GKMatch *)match {
}

// Players have been found for a server-hosted game, the game should start
- (void)matchmakerViewController:(GKMatchmakerViewController *)viewController didFindPlayers:(NSArray *)playerIDs {
}

// An invited player has accepted a hosted invite. Apps should connect through the hosting server and then update the player's connected state (using setConnected:forHostedPlayer:)
- (void)matchmakerViewController:(GKMatchmakerViewController *)viewController didReceiveAcceptFromHostedPlayer:(NSString *)playerID {
}

- (void)sendPosition
{
    NSError *error;
    NSString *str = @"teststring";
    NSData *data = [str dataUsingEncoding:NSUTF8StringEncoding];
    // NSData *packet = [NSData dataWithBytes:&msg length:sizeof(PositionPacket)];
    [currentMatch sendDataToAllPlayers:data withDataMode:GKMatchSendDataUnreliable error:&error];
    if (error != nil)
    {
        // Handle the error.
    }
}
But I don't know why the delegate methods of GKMatchmakerViewController are not called. I did set the delegate, and I am still not able to get a GKMatch (here, currentMatch).
Because of this,
[currentMatch sendDataToAllPlayers:data withDataMode:GKMatchSendDataUnreliable error:&error];
does not work, as the currentMatch reference is 0x0.
Please help me out. How do I send and receive data? Am I doing anything wrong, or is there anything else to be done?
Thanks
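For reference, the receiving side of GKMatchDelegate is match:didReceiveData:fromPlayer:. A minimal sketch that mirrors the sendPosition example above (it will only fire once currentMatch is non-nil and its delegate has been set):

- (void)match:(GKMatch *)match didReceiveData:(NSData *)data fromPlayer:(NSString *)playerID
{
    // Decode the UTF-8 payload sent by sendPosition.
    NSString *received = [[[NSString alloc] initWithData:data
                                                encoding:NSUTF8StringEncoding] autorelease];
    NSLog(@"Received %@ from %@", received, playerID);
}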
- (IBAction)saveButton:(id)sender
{
    NSURL *yourURL = [NSURL URLWithString:webpageURLLabel.text];
    NSURLRequest *request = [[NSURLRequest alloc] initWithURL:yourURL];
    if ([self checkURL] == YES) {
        [webpagePreview loadRequest:request];
        webpagePreview.scalesPageToFit = YES;
    }
}

- (BOOL)checkURL
{
    NSString *arrayOfStrings = [NSArray arrayWithObjects:@"http://", @"https://", nil];
    NSString *stringToSearchWithin = webpageURLLabel.text;
    BOOL found = NO;
    for (NSString *s in arrayOfStrings)
    {
        if ([stringToSearchWithin rangeOfString:s].location != NSNotFound)
        {
            found = YES;
            break;
        } else
        {
            webpageURLLabel.text = [NSString stringWithFormat:@"http://%@", webpageURLLabel.text];
            found = YES;
            break;
        }
    }
    return found;
}

- (void)webViewDidFinishLoad:(UIWebView *)webView
{
    webpageTitleLabel.text = [webpagePreview stringByEvaluatingJavaScriptFromString:@"document.title"];
}
I debugged it and it looks like it is supposed to load. But for some reason, when you tap the button the first time, nothing happens.
If the user taps the button a second time, it works fine. Any suggestions?
Looks to me like the first time the button is clicked, webpageURLLabel.text may not have a valid URL, and so the request is not valid. Then when [self checkURL] is called, webpageURLLabel.text gets set to a valid URL, and so the next click works.
Maybe you should be calling -checkURL before you create the NSURLRequest?
There are also problems in checkURL. Note that it always returns YES. Inside the for loop is logic that reduces to:
if (some condition) {
found = YES;
break;
} else {
// Fix webpageURLLabel.text
found = YES;
break;
}
However, consider what happens if you have a valid entry that starts with @"https://". The first time through the loop the if condition fails, so @"http://" gets added to the front of the URL and you return. So @"https://valid.com" turns into @"http://https://valid.com". You need to move everything in the else outside the for loop and do it only if found is not true.
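A corrected version could look roughly like this (a sketch that keeps your rangeOfString: check and only prepends the scheme after all prefixes have been tried):

- (BOOL)checkURL
{
    NSArray *arrayOfStrings = [NSArray arrayWithObjects:@"http://", @"https://", nil];
    NSString *stringToSearchWithin = webpageURLLabel.text;
    BOOL found = NO;
    for (NSString *s in arrayOfStrings)
    {
        if ([stringToSearchWithin rangeOfString:s].location != NSNotFound)
        {
            found = YES;
            break;
        }
    }
    if (!found)
    {
        // No scheme at all: assume plain http and fix the label text.
        webpageURLLabel.text = [NSString stringWithFormat:@"http://%@", webpageURLLabel.text];
        found = YES;
    }
    return found;
}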
First time asking a question here. I'm hoping the post is clear and sample code is formatted correctly.
I'm experimenting with AVFoundation and time lapse photography.
My intent is to grab every Nth frame from the video camera of an iOS device (my iPod touch, version 4) and write each of those frames out to a file to create a timelapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter and AVAssetWriterInput.
The problem is, if I use the CMSampleBufferRef passed to captureOutput:didOutputSampleBuffer:fromConnection:, the playback of each frame lasts the length of time between the original input frames, i.e. a frame rate of roughly 1 fps. I'm looking to get 30 fps.
I've tried using CMSampleBufferCreateCopyWithNewTiming(), but then after 13 frames are written to the file, captureOutput:didOutputSampleBuffer:fromConnection: stops being called. The interface is active and I can tap a button to stop the capture and save it to the photo library for playback. It appears to play back as I want, at 30 fps, but it only has those 13 frames.
How can I accomplish my goal of 30fps playback?
How can I tell where the app is getting lost and why?
I've placed a flag called useNativeTime so I can test both cases. When set to YES, I get all frames I'm interested in as the callback doesn't 'get lost'. When I set that flag to NO, I only ever get 13 frames processed and am never returned to that method again. As mentioned above, in both cases I can playback the video.
Thanks for any help.
Here is where I'm trying to do the retiming.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL useNativeTime = NO;
    BOOL appendSuccessFlag = NO;

    //NSLog(@"in captureOutput sample buffer method");
    if (!CMSampleBufferDataIsReady(sampleBuffer))
    {
        NSLog(@"sample buffer is not ready. Skipping sample");
        //CMSampleBufferInvalidate(sampleBuffer);
        return;
    }

    if (![inputWriterBuffer isReadyForMoreMediaData])
    {
        NSLog(@"Not ready for data.");
    }
    else {
        // Write every first frame of n frames (30 native from camera).
        intervalFrames++;
        if (intervalFrames > 30) {
            intervalFrames = 1;
        }
        else if (intervalFrames != 1) {
            //CMSampleBufferInvalidate(sampleBuffer);
            return;
        }

        // Need to initialize start session time.
        if (writtenFrames < 1) {
            if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            else imageSourceTime = CMTimeMake(0 * 20, 600); //CMTimeMake(1,30);
            [outputWriter startSessionAtSourceTime:imageSourceTime];
            NSLog(@"Starting CMtime");
            CMTimeShow(imageSourceTime);
        }

        if (useNativeTime) {
            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTimeShow(imageSourceTime);
            // CMTime myTiming = CMTimeMake(writtenFrames * 20, 600);
            // CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); // Tried but has no effect.
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer];
        }
        else {
            CMSampleBufferRef newSampleBuffer;
            CMSampleTimingInfo sampleTimingInfo;
            sampleTimingInfo.duration = CMTimeMake(20, 600);
            sampleTimingInfo.presentationTimeStamp = CMTimeMake((writtenFrames + 0) * 20, 600);
            sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;
            OSStatus myStatus;

            //NSLog(@"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer));
            myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                             sampleBuffer,
                                                             1,
                                                             &sampleTimingInfo, // maybe a little confused on this param.
                                                             &newSampleBuffer);

            // These confirm the good health of our newSampleBuffer.
            if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i", myStatus);
            if (!CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!");

            // No effect.
            //myStatus = CMSampleBufferMakeDataReady(newSampleBuffer); // How is this different from CMSampleBufferSetDataReady?
            //if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i", myStatus);

            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer);
            CMTimeShow(imageSourceTime);
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
            //CMSampleBufferInvalidate(sampleBuffer); // Docs don't describe the action. What does it do? Doesn't seem to affect my problem. Used with CMSampleBufferSetInvalidateCallback maybe?
            //CFRelease(sampleBuffer); // - Not surprisingly - “EXC_BAD_ACCESS”
        }

        if (!appendSuccessFlag)
        {
            NSLog(@"Failed to append pixel buffer");
        }
        else {
            writtenFrames++;
            NSLog(@"writtenFrames: %i", writtenFrames);
        }
    }
    //[self displayOuptutWritterStatus]; // Expect and see AVAssetWriterStatusWriting.
}
My setup routine.
- (IBAction)recordingStartStop:(id)sender
{
    NSError *error;

    if (self.isRecording) {
        NSLog(@"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~");
        self.isRecording = NO;
        [recordingStarStop setTitle:@"Record" forState:UIControlStateNormal];

        //[self.captureSession stopRunning];
        [inputWriterBuffer markAsFinished];
        [outputWriter endSessionAtSourceTime:imageSourceTime];
        [outputWriter finishWriting]; // Blocks until file is completely written, or an error occurs.
        NSLog(@"finished CMtime");
        CMTimeShow(imageSourceTime);

        // Really, I should loop through the outputs and close all of them or target specific ones.
        // Since I'm only recording video right now, I feel safe doing this.
        [self.captureSession removeOutput:[[self.captureSession outputs] objectAtIndex:0]];
        [videoOutput release];
        [inputWriterBuffer release];
        [outputWriter release];
        videoOutput = nil;
        inputWriterBuffer = nil;
        outputWriter = nil;

        NSLog(@"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~");
        NSLog(@"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum.");
        NSLog(@"filePath: %@", [projectPaths movieFilePath]);
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) {
            NSLog(@"Calling UISaveVideoAtPathToSavedPhotosAlbum.");
            UISaveVideoAtPathToSavedPhotosAlbum([projectPaths movieFilePath], self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }
        NSLog(@"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~");
    }
    else {
        NSLog(@"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~");
        projectPaths = [[ProjectPaths alloc] initWithProjectFolder:@"TestProject"];
        intervalFrames = 30;

        videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        NSMutableDictionary *cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease];
        NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
        NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
        [cameraVideoSettings setValue:value forKey:key];
        [videoOutput setVideoSettings:cameraVideoSettings];
        [videoOutput setMinFrameDuration:CMTimeMake(20, 600)]; //CMTimeMake(1, 30)]; // 30fps
        [videoOutput setAlwaysDiscardsLateVideoFrames:YES];

        queue = dispatch_queue_create("cameraQueue", NULL);
        [videoOutput setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);

        NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [outputSettings setValue:AVVideoCodecH264 forKey:AVVideoCodecKey];
        [outputSettings setValue:[NSNumber numberWithInt:1280] forKey:AVVideoWidthKey]; // currently assuming
        [outputSettings setValue:[NSNumber numberWithInt:720] forKey:AVVideoHeightKey];

        NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [compressionSettings setValue:AVVideoProfileLevelH264Main30 forKey:AVVideoProfileLevelKey];
        //[compressionSettings setValue:[NSNumber numberWithDouble:1024.0*1024.0] forKey:AVVideoAverageBitRateKey];
        [outputSettings setValue:compressionSettings forKey:AVVideoCompressionPropertiesKey];

        inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
        [inputWriterBuffer retain];
        inputWriterBuffer.expectsMediaDataInRealTime = YES;

        outputWriter = [AVAssetWriter assetWriterWithURL:[projectPaths movieURLPath] fileType:AVFileTypeQuickTimeMovie error:&error];
        [outputWriter retain];
        if (error) NSLog(@"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:]");
        if ([outputWriter canAddInput:inputWriterBuffer]) [outputWriter addInput:inputWriterBuffer];
        else NSLog(@"can not add input");
        if (![outputWriter canApplyOutputSettings:outputSettings forMediaType:AVMediaTypeVideo]) NSLog(@"outputSettings are NOT supported");

        if ([captureSession canAddOutput:videoOutput]) [self.captureSession addOutput:videoOutput];
        else NSLog(@"could not addOutput: videoOutput to captureSession");
        //[self.captureSession startRunning];

        self.isRecording = YES;
        [recordingStarStop setTitle:@"Stop" forState:UIControlStateNormal];
        writtenFrames = 0;
        imageSourceTime = kCMTimeZero;
        [outputWriter startWriting];
        //[outputWriter startSessionAtSourceTime:imageSourceTime];
        NSLog(@"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~");
        NSLog(@"recording to fileURL: %@", [projectPaths movieURLPath]);
    }

    NSLog(@"isRecording: %@", self.isRecording ? @"YES" : @"NO");
    [self displayOuptutWritterStatus];
}
OK, I found the bug in my first post.
When using
myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                 sampleBuffer,
                                                 1,
                                                 &sampleTimingInfo,
                                                 &newSampleBuffer);
you need to balance that with a CFRelease(newSampleBuffer);
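In context, the tail of the retiming branch in captureOutput:didOutputSampleBuffer:fromConnection: ends up looking like this (a sketch based on the code above):

appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
// Balance the Create in CMSampleBufferCreateCopyWithNewTiming(); without this
// release, the capture callback eventually stops being called.
CFRelease(newSampleBuffer);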
The same idea holds true when using a CVPixelBufferRef with the pixelBufferPool of an AVAssetWriterInputPixelBufferAdaptor instance. You would use CVPixelBufferRelease(yourCVPixelBufferRef); after calling the appendPixelBuffer:withPresentationTime: method.
Hope this is helpful to someone else.
With a little more searching and reading I have a working solution. I don't know that it is the best method, but so far, so good.
In my setup area I've set up an AVAssetWriterInputPixelBufferAdaptor. The code addition looks like this:
inputWriterBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor
    assetWriterInputPixelBufferAdaptorWithAssetWriterInput:inputWriterBuffer
                                sourcePixelBufferAttributes:nil];
[inputWriterBufferAdaptor retain];
For completeness to understand the code below, I also have these three lines in the setup method.
fpsOutput = 30; // Some possible values: 30, 10, 15, 24, 25, 30/1.001 or 29.97.
cmTimeSecondsDenominatorTimescale = 600 * 100000; // To more precisely handle 29.97.
cmTimeNumeratorValue = cmTimeSecondsDenominatorTimescale / fpsOutput;
Instead of applying retiming to a copy of the sample buffer, I now have the following three lines of code that effectively do the same thing. Notice the withPresentationTime: parameter of the adaptor. By passing my custom value to that, I gain the correct timing I'm seeking.
CVPixelBufferRef myImage = CMSampleBufferGetImageBuffer( sampleBuffer );
imageSourceTime = CMTimeMake( writtenFrames * cmTimeNumeratorValue, cmTimeSecondsDenominatorTimescale);
appendSuccessFlag = [inputWriterBufferAdaptor appendPixelBuffer: myImage withPresentationTime: imageSourceTime];
Use of the AVAssetWriterInputPixelBufferAdaptor.pixelBufferPool property may have some gains, but I haven't figured that out.
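If you do want to experiment with that pool, the usual pattern is to allocate buffers from it with CVPixelBufferPoolCreatePixelBuffer() instead of reusing the capture buffer directly. A rough, untested sketch (it assumes you render or copy your frame into poolBuffer before appending, which is the part I haven't worked out):

CVPixelBufferRef poolBuffer = NULL;
CVReturn result = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     inputWriterBufferAdaptor.pixelBufferPool,
                                                     &poolBuffer);
if (result == kCVReturnSuccess) {
    // ... fill poolBuffer with the frame data here ...
    [inputWriterBufferAdaptor appendPixelBuffer:poolBuffer withPresentationTime:imageSourceTime];
    CVPixelBufferRelease(poolBuffer); // balance the pool allocation
}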