nil from Camera _createPreviewImage in iPhone SDK

I am trying to apply Mike Chen's answer here, using SDK 3.0. In the delegate.m file I implement:
[viewController.view addSubview:[[objc_getClass("PLCameraController") sharedInstance] previewView]];
and in viewcontroller.m I implement:
CapturedImage = [[objc_getClass("PLCameraController") sharedInstance] _createPreviewImage];
but CapturedImage is always nil. Any suggestions?

Things to check:
(1) Has PLCameraController been updated for iPhone 3.x? This blog post suggests it has not.
(2) Instead of nesting all your calls Smalltalk-style, break them apart so you can see if you're getting the objects you think you are:
WhateverClass *cameraController = [objc_getClass("PLCameraController") sharedInstance];
if (cameraController != nil) {
    ImageClass *myImage = [cameraController _createPreviewImage];
} else {
    NSLog(@"cameraController is nil");
}
Just remember that PLCameraController is part of a private framework that relies on hidden Apple API. Even if you get it to work, you most likely won't get it past App Store review.

Related

How can I improve my TWTweetComposeViewController code in iOS?

I have implemented the following code to do a Twitter Share. In my code I try to test for iOS 5 and if that does not work, I go back to the old way of sharing using ShareKit's Twitter code.
I showed the code to a co-worker and he suggested that my code may have flaws and that I need to do two things:
Do a proper runtime check (since it may crash on iOS 4 and earlier), even though it did not crash for me.
Weak-link the Twitter framework.
Can someone kindly explain what a proper runtime check would be, and why weak-link?
NSString *text = [NSString stringWithFormat:@"#Awesome chart: %@", self.titleLabel.text];
if ([TWTweetComposeViewController canSendTweet])
{
    TWTweetComposeViewController *tweetComposeViewController = [[TWTweetComposeViewController alloc] init];
    [tweetComposeViewController setInitialText:text];
    [tweetComposeViewController addImage:image];
    [tweetComposeViewController setCompletionHandler:^(TWTweetComposeViewControllerResult result) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self dismissModalViewControllerAnimated:YES];
            if (result == TWTweetComposeViewControllerResultDone)
            {
                NSLog(@"iOS 5 onwards Twitter share complete");
            }
        });
    }];
    [self presentViewController:tweetComposeViewController
                       animated:YES
                     completion:^{ }];
}
else
{
    SHKItem *item = [SHKItem image:image title:text];
    // Share the message.
    [SHKTwitter shareItem:item];
    NSLog(@"Device does not support Twitter library");
}
A weak link simply means that a framework is not required to be on the device. Put another way, when you add frameworks to your project normally, the app will require that framework to be on the device; if you require a framework to be there and it isn't, the app will crash. In your case, you would want to weak-link the Twitter framework if you want the app to run on iOS versions prior to iOS 5 (i.e., iOS 4.x).
A proper runtime check means you should load the app onto your device (running iOS 5 or later) and test the Twitter feature of your app. If it crashes, then you know you have a problem.
I skimmed your code and everything looks fine. I didn't run it through my compiler, though, so I can't speak for syntax errors and such. The one change I would make is:
if ([TWTweetComposeViewController canSendTweet])
to
Class twClass = NSClassFromString(@"TWTweetComposeViewController");
if (!twClass) // Framework not available, older iOS
    return;
The reason I use that is because it literally checks whether the framework is on the device, while canSendTweet checks whether the user is logged in. I don't want to confuse a user who isn't logged in with a user whose device doesn't support Twitter (i.e., is not on iOS 5).
Let me know if you need any more help.
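Putting the runtime check and the fallback together, a rough sketch of the whole check might look like the following. This assumes Twitter.framework is weak-linked, and shareText/shareImage are placeholder variables standing in for the text and image from the original code:
Class twClass = NSClassFromString(@"TWTweetComposeViewController");
if (twClass == nil) {
    // Framework not present (iOS 4.x or earlier): fall back to ShareKit.
    SHKItem *item = [SHKItem image:shareImage title:shareText];
    [SHKTwitter shareItem:item];
} else if ([twClass canSendTweet]) {
    // iOS 5+ and the user has a Twitter account configured.
    TWTweetComposeViewController *tweetVC = [[twClass alloc] init];
    [tweetVC setInitialText:shareText];
    [tweetVC addImage:shareImage];
    [self presentViewController:tweetVC animated:YES completion:nil];
    [tweetVC release]; // assuming manual reference counting; presentation retains the controller
} else {
    // iOS 5+, but no Twitter account configured: handle however makes sense for your app.
    NSLog(@"Twitter is available but no account is set up");
}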
DETweetComposeViewController is another option. It's compatible with iOS 4 too.
I think you also leak the controller (as do most of the code samples I've seen).
On the other hand, you paid attention to the documentation about the completion handler, and correctly make sure you do UI work in the main thread. I need to go fix my implementation to do the same.

Lauching App with URL (via UIApplicationDelegate's handleOpenURL) working under iOS 4, but not under iOS 3.2

I have implemented UIApplicationDelegate's application:didFinishLaunchingWithOptions: and application:handleOpenURL: according to specification, i.e., application:didFinishLaunchingWithOptions: returns YES and application:handleOpenURL: opens the URL.
The code works under iOS 4 (in both cases, i.e., when the app is launched and when it becomes active from suspended state). However, the code does not work under iOS 3.2.
I give an answer to my own question. Finding out the solution took me a while and was quite frustrating. If you do an internet search you find some partial answers, but it still took me a while to work out the following solution and I do hope it adds some clarity.
So first, the recommended behavior of your app appears to be the following (see Opening Supported File Types in iOS Ref Lib):
Do not implement applicationDidFinishLaunching: (see the note at UIApplicationDelegate).
Implement application:didFinishLaunchingWithOptions: and check the URL, return YES if you can open it, otherwise NO, but do not open it.
Implement application:handleOpenURL: and open the URL, return YES if successful, otherwise NO.
In iOS 4, passing an URL to an app results in one of the following two behaviors:
If the app is launched, then application:didFinishLaunchingWithOptions: is called, and application:handleOpenURL: is called if application:didFinishLaunchingWithOptions: returned YES.
If the app is becoming active from suspended state then application:didFinishLaunchingWithOptions: is not called but application:handleOpenURL: is called.
However, in iOS 3.2 it appears as if application:handleOpenURL: is never called! A hint that the behavior is different under iOS 3.2 can be found in Handling URL Requests. There you find that application:handleOpenURL: is called if application:didFinishLaunchingWithOptions: is not implemented, but applicationDidFinishLaunching: is implemented. But application:handleOpenURL: is not called if application:didFinishLaunchingWithOptions: is implemented.
Hence, one solution to make the code work under 3.2 and 4.0 is:
Open the URL in application:didFinishLaunchingWithOptions:, but then return NO to prevent application:handleOpenURL: from being called.
Open the URL in application:handleOpenURL:, in case you are on 4.0 and the app was in the suspended state.
I found this solution in another post, but I was confused, because it contradicted the recommendation in the iOS Ref Lib documentation (namely that we should return YES in application:didFinishLaunchingWithOptions:). (At that point I did not realize that the documentation contradicts itself.)
Because I believe that the current iOS 4.0 behavior will be the future behavior, I prefer the following solution:
Do not implement applicationDidFinishLaunching:.
Implement application:didFinishLaunchingWithOptions: and check the URL, return YES if you can open it, otherwise NO, but do not open it. If we are on 3.2, open the URL.
Implement application:handleOpenURL: and open the URL, return YES if successful, otherwise NO.
So in summary, I implement the iOS 4 behavior and add the following lines to application:didFinishLaunchingWithOptions:
if ([[[UIDevice currentDevice] systemVersion] hasPrefix:@"3.2"]) {
    [self application:application handleOpenURL:url];
}
which makes the code work under 3.2.
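For reference, here is a minimal sketch of how the two delegate methods could fit together under this scheme; canOpenMyURL: and openMyURL: are hypothetical helpers of your own that validate and actually process the URL:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    NSURL *url = [launchOptions objectForKey:UIApplicationLaunchOptionsURLKey];
    if (url == nil) {
        return YES; // normal launch, no URL involved
    }
    if (![self canOpenMyURL:url]) {
        return NO; // we cannot handle this URL
    }
    // iOS 3.2 never calls application:handleOpenURL:, so open the URL here.
    if ([[[UIDevice currentDevice] systemVersion] hasPrefix:@"3.2"]) {
        [self application:application handleOpenURL:url];
    }
    return YES;
}

- (BOOL)application:(UIApplication *)application handleOpenURL:(NSURL *)url {
    return [self openMyURL:url];
}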
application:handleOpenURL: is now DEPRECATED.
As of iOS 4.2, you can use this for opening URLs:
- (BOOL)application:(UIApplication *)application openURL:(NSURL *)url
sourceApplication:(NSString *)sourceApplication annotation:(id)annotation
Documentation:
https://developer.apple.com/library/ios/#DOCUMENTATION/UIKit/Reference/UIApplicationDelegate_Protocol/Reference/Reference.html
I started writing an application which uses the Dropbox API. To understand the concept, I ran a sample application using my key/secret as mentioned in the Dropbox developer documentation.
Once the sample app started working, I used the same key/secret values for my application.
For the sample app, the implementation of handleOpenURL (or openURL on iOS 4.2) gets executed as expected. For some odd reason, that wasn't the case for my app. My app entered the background in order to show Dropbox's login and authentication page, and after successful login and authentication, my app never entered the foreground again. This was true on both the Simulator and a device (iPad).
I tried almost everything listed on the internet, including this post (thanks), but with no success.
At last, it started working for my application when I did the following:
On the Simulator, select "iOS Simulator --> Reset Content and Settings" and reset.
On the device, I deleted the sample application's executable, which in turn deleted the cache associated with it.
Add the following to the end of application:didFinishLaunchingWithOptions:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    ...
    NSURL *url = (NSURL *)[launchOptions valueForKey:UIApplicationLaunchOptionsURLKey];
    if (url != nil && [url isFileURL]) {
        return YES;
    } else {
        return NO;
    }
} // End of application:didFinishLaunchingWithOptions:

// New method starts
- (BOOL)application:(UIApplication *)application openURL:(NSURL *)url sourceApplication:(NSString *)sourceApplication annotation:(id)annotation
{
    mvc = [nc.viewControllers objectAtIndex:0];
    if (url != nil && [url isFileURL]) {
        [mvc handleOpenURL:url];
    }
    return YES;
}
where mvc is my main ViewController, and nc my navigation controller.
Then in the MainViewController, do something like this:
- (void)handleOpenURL:(NSURL *)url {
    [self.navigationController popToRootViewControllerAnimated:YES];
    // Next bit not relevant, just left in as part of the example
    NSData *jsonData = [NSData dataWithContentsOfURL:url];
    NSError *error;
    NSDictionary *dictionary = [[NSJSONSerialization JSONObjectWithData:jsonData options:kNilOptions error:&error] objectAtIndex:0];
    [self managedObjectFromStructure:dictionary withManagedObjectContext:self.context];
    ...
}
after declaring handleOpenURL in the .h of course.
Thanks goes to Christian for putting in the effort for this.

iOS 4: Remote controls for background audio

I'm currently attempting to set up background audio for an app I'm developing for iOS 4. Unlike other background audio apps such as Pandora, however, the app doesn't have a dedicated music-player view controller, which makes the task a bit more confusing.
I've set the appropriate Info.plist settings correctly and have an AVAudioPlayer object in my app delegate which is accessible from everywhere. When the user plays a song, I replace the AVAudioPlayer with a new one initialized with the song and play it. This all works great, except now I have no idea how to go about supporting remote control events.
Based on Apple's documentation, I have this:
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    [self resignFirstResponder];
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlTogglePlayPause:
            if ([iPhoneAppDelegate backgroundAudioPlayer].playing)
                [iPhoneAppDelegate pauseBackgroundAudioPlayer];
            else
                [iPhoneAppDelegate playBackgroundAudioPlayer];
            break;
    }
}
The thing is, where do I put this? Apple's documentation seems to suggest this should go in some view controller somewhere, but my app has lots of view controllers and navigation controllers. Wherever I try to put this, for some reason tapping the Toggle Play/Pause button in the multitasking tray remote controls either causes the song to just pause for a moment and then unpause, or somehow causes the song to play twice.
The documentation examples are a bit misleading, but there is no need to subclass anything anywhere. The correct place to put remoteControlReceivedWithEvent: is in the application delegate, as it remains in the responder chain regardless of whether the app is in the foreground or not. Also the begin/end receiving remote control events should be based on whether you actually need the events, not on the visibility of some random view.
I found a couple of solutions to receiving global remote control events on the Apple Developer Forums after a bit of searching.
One way is to subclass UIWindow and override its remoteControlReceivedWithEvent:.
The second, perhaps nicer way is to subclass UIApplication and override sendEvent:. That way, you can intercept all the remote control events and handle them there globally, and not have any other responders handle them later in the responder chain.
- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // Handle event
    } else {
        [super sendEvent:event];
    }
}
The second method didn't work for me; sendEvent: was never called. However, the first method (subclassing UIWindow) worked nicely.
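For completeness, a minimal sketch of the UIWindow approach could look like the following; the class name and the notification name are placeholders of mine, since the answer above only describes the idea:
@interface RemoteControlWindow : UIWindow
@end

@implementation RemoteControlWindow

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // Forward the event to whatever object actually controls playback,
        // e.g. via a notification that the app delegate observes.
        [[NSNotificationCenter defaultCenter] postNotificationName:@"RemoteControlEventReceived"
                                                            object:event];
    }
}

@end
Remember to make this subclass the class of your main window (in the nib or wherever the window is created), and to call beginReceivingRemoteControlEvents as in the question.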
I struggled with this one for a while, and none of the answers above worked. The bug in my code, and I hope this will help someone reading this, was that I had the audio session set to mix with others. You want to be the foreground audio player in order to get remote control events. Check to see whether you have INCORRECT code like this:
[[AVAudioSession sharedInstance] setDelegate:self];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
UInt32 doSetProperty = 1; // TRUE: lets your audio mix with other apps' audio
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers,
                        sizeof(doSetProperty),
                        &doSetProperty);
NSError *activationError = nil;
[[AVAudioSession sharedInstance] setActive:YES error:&activationError];
And remove the AudioSessionSetProperty call, or change doSetProperty to 0, so that your audio no longer mixes with others.
No need to subclass Window or forward events. Simply handle it from your main view controller. See the Audio Mixer (MixerHost) example for details.
http://developer.apple.com/LIBRARY/IOS/#samplecode/MixerHost/Listings/Classes_MixerHostViewController_m.html
Documentation explains it very well.
https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/Remote-ControlEvents/Remote-ControlEvents.html
One thing that seems to influence this behavior is any category options you set for your AVAudioSession using setCategory:withOptions:error: instead of just setCategory:error:. In particular, from trial and error, it appears that if you set AVAudioSessionCategoryOptionMixWithOthers you will not get remote control events; the now playing controls will still control the iPod app. If you set AVAudioSessionCategoryOptionDuckOthers you will get remote control events, but it seems like there may be some ambiguity regarding which app is controlled. Setting the categoryOptions to 0 or just calling setCategory:error: works best.
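So, as a rough sketch, a session configured like this (plain playback category, no mix-with-others option) should leave your app as the "now playing" app and eligible for remote control events; error handling is omitted for brevity:
NSError *error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];

[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder]; // assuming the receiver returns YES from canBecomeFirstResponder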

always returns nil in iPhone

I am trying to apply Mike Chen's answer here, using SDK 3.0. In the delegate.m file I implement:
[viewController.view addSubview:[[objc_getClass("PLCameraController") sharedInstance] previewView]];
and in viewcontroller.m I implement:
PLCameraController *cam = [objc_getClass("PLCameraController") sharedInstance];
CapturedImage = [cam _createPreviewImage];
but 'cam' is always nil. Any suggestions?
This won't work when running in the simulator or on an iPod Touch. Are you running this on a physical iPhone with a camera?
Also, if you've dumped the headers correctly, you shouldn't be using the objc_ runtime functions, but rather using the class names. Is the result of objc_getClass("PLCameraController") set to nil?
On a side note: I hope you're not looking to publish your application through the App Store because this sort of private method calling is a great excuse for Apple to reject your application.
To use PLCameraController, you'll need to include the PhotoLibrary private framework. The simplest way to do that is to drag and drop an Image Picker Controller (UIImagePickerController) into your main nib.
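If you can live with the stock camera UI, a minimal sketch of the public alternative (UIImagePickerController) looks roughly like this, assuming it runs in a view controller that adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate on a camera-equipped device:
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
    [picker release]; // assuming manual reference counting
}
and then, in the delegate callback, grab the captured image:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    // ... use the captured image ...
    [self dismissModalViewControllerAnimated:YES];
}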

Problem with applicationShouldTerminate on iPhone

I'm having a problem with applicationShouldTerminate. Whatever I do, it seems to have no effect. Any help would be appreciated.
I'm well versed in programming, but this just gives me a headache. I'm going over some basic tutorials for Xcode, as I'm new to the Mac in general, and am currently looking at a simple flashlight app.
It exists, but I would like to add an alert box here with an option not to quit.
- (void)applicationWillTerminate:(UIApplication *)application
{
    [application setIdleTimerDisabled:NO];
}
This has no effect; the alert is closed even before it's created.
- (void)applicationWillTerminate:(UIApplication *)application
{
    [application setIdleTimerDisabled:NO];
    UIAlertView *alertTest = [[UIAlertView alloc]
            initWithTitle:@"This is a Test"
                  message:@"This is the message contained with a UIAlertView"
                 delegate:self
        cancelButtonTitle:@"Button #1"
        otherButtonTitles:nil];
    [alertTest addButtonWithTitle:@"Button #2"];
    [alertTest show];
    [alertTest autorelease];
    NSLog(@"Termination");
}
I did some reading online and found that it should be possible to do this with
- (NSApplicationTerminateReply)applicationShouldTerminate:(NSApplication *)sender
but no matter where I put that declaration I get "error: syntax error before NSApplicationTerminateReply".
There is no syntax error, except that Xcode seems not to recognize NSApplicationTerminateReply as valid input.
Any sample code would be greatly appreciated.
I know this is a non-answer, but hopefully I can be helpful:
Displaying a "Really quit?"-type alert like this, even if you can pull it off technically (and I'm not sure you can), is a bad idea and is likely to either cause rejection from the App Store or, at best, an inconsistent user experience because no other apps do this.
The convention with iPhone apps is to save state if necessary, then yield control (for termination) as quickly as possible when the user hits the home button or switches apps.
To ensure a consistent experience, Apple probably has an aggressive timer in place to restrict what you can do in applicationWillTerminate. And even if they don't have a technical measure in place, they probably have an App Store approval policy to ensure that applications quit immediately when they're asked to.
applicationShouldTerminate and NSApplication do not exist on the iPhone. You have to use UIApplication.
The alert view is never shown because the 'show' method does not block, and therefore, the end of 'applicationWillTerminate' is reached immediately after you create the alert view and try to show it. I believe this is by design. You can't really begin asynchronous operations in 'applicationWillTerminate'.
With regards to the applicationShouldTerminate error, in case anyone's curious, NSApplicationTerminateReply and NSApplication seem to be deprecated...even though the OP's method is exactly how it appears in the docs!
Defining your method as below should build with no errors:
- (BOOL)applicationShouldTerminate:(UIApplication *)application
I think I found the answer to what I wanted to do but will need to check it when I get back home.
Some directions were found here
http://blog.minus-zero.org/
The iPhone 2.0 software was recently released, and with it came the ability for users to download native apps (i.e., not web sites) directly to their phones from within the iPhone UI or via iTunes. Developers (anyone who pays Apple 59GBP for the privilege) can then write their own apps and have them available for purchase in the App Store.
One limitation of the Apple-sanctioned SDK is that only one application is allowed to be running at a time. This presents a problem for apps such as IM clients, music players and other programs whose functionality relies on being able to run in the background. Another example (courtesy of James) would be an app that takes advantage of the iPhone 3G's GPS chip to create a log of all the places you visit.
However, there is a neat trick that I discovered: your app will only get terminated if you switch away from it, and hitting the iPhone's power button while your app is in the foreground doesn't count as switching away. The upshot of this is you can create apps which continue to run while the iPhone is in your pocket - perfect for the GPS example.
Achieving this is as simple as implementing two methods in your UIApplication delegate - applicationWillResignActive: and applicationDidBecomeActive:. Here's a simple example to demonstrate the effect.
In your UIApplication delegate header file, add a new ivar: BOOL activeApp. Then, in your implementation, add the following three methods:
- (void)applicationWillResignActive:(UIApplication *)application {
    NSLog(@"resigning active status...");
    activeApp = NO;
    [self performSelector:@selector(sayHello) withObject:nil afterDelay:1.0];
}

- (void)applicationDidBecomeActive:(UIApplication *)application {
    NSLog(@"becoming the active app...");
    activeApp = YES;
}

- (void)sayHello {
    NSLog(@"Hello!");
    if (!activeApp)
        [self performSelector:@selector(sayHello) withObject:nil afterDelay:1.0];
}