What are accepted methods to reduce iPhone application piracy, which do not violate Apple's evaluation process?
If my application "phones home" to provide the unique device ID on which it runs, what other information would I need to collect (e.g., the Apple ID used to purchase the application) to create a valid registration token that authorizes use of the application? Likewise, what code would I use to access that extra data?
What seem to be the best available technical approaches to this problem, at the present time?
(Please refrain from non-programming answers about how piracy is inevitable, etc. I know piracy is inevitable. I am interested in programming-based answers that discuss how to reduce it. Thanks in advance for your understanding.)
UPDATE
Please visit and read
Thanks to chpwn in the comments.
Code that's way too old! - 11th May 2009
For now there's an easier way to detect whether your iPhone application has been cracked. It does not require you to check the device's unique ID against a list of accepted IDs.
Currently there are three things crackers do:
Edit the Info.plist file
Decode the Info.plist from binary to UTF-8 or ASCII
Add a key pair to Info.plist: {SignerIdentity, Apple iPhone OS Application Signing}
The last one is easiest to check with this code:
NSBundle *bundle = [NSBundle mainBundle];
NSDictionary *info = [bundle infoDictionary];
if ([info objectForKey:@"SignerIdentity"] != nil)
{ /* do something */ }
Generally we don't have SignerIdentity in any of the App Store applications we build, so if that key is present you can run whatever countermeasure you like, which should make life more difficult for crackers and pirates.
I can't take credit for this so please visit How to Thwart iPhone IPA Crackers. There's loads of information there about piracy on iPhone and how to curb it.
As pointed out by Andrey Tarantsov in the comments, looking for the "SignerIdentity" string in the binary (using an app like HexEdit) and replacing it is pretty easy.
You could encode that string, but then all a cracker has to do is change one character of it: the app will no longer look for the "SignerIdentity" key but for some other key that probably doesn't exist. Since that key is nil, the app concludes it isn't cracked (SignerIdentity should be nil when the app is legitimate).
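For reference, the kind of string encoding being discussed would look something like the sketch below (assembling the key at runtime so the literal never appears as one string in the binary); as noted, a determined cracker can still defeat it by patching the binary.

// Sketch: build "SignerIdentity" at runtime instead of embedding the literal.
unichar keyChars[] = {'S','i','g','n','e','r','I','d','e','n','t','i','t','y'};
NSString *key = [NSString stringWithCharacters:keyChars length:14];
if ([[[NSBundle mainBundle] infoDictionary] objectForKey:key] != nil)
{ /* possibly cracked */ }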
Instead, I'd rather check the size of Info.plist and compare it to a reference value. I noticed that Simulator and Device builds don't produce the same Info.plist file size, and the same goes for Debug, Release and Distribution builds. Therefore, make sure you set the reference value using the Info.plist file size of the Device Distribution build.
How to look for the filesize at launch:
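A minimal sketch of that check (kExpectedInfoPlistSize is a placeholder you would record from your own Device Distribution build):

// Placeholder: replace 0 with the Info.plist size from your Device Distribution build.
static const unsigned long long kExpectedInfoPlistSize = 0;

NSString *plistPath = [[NSBundle mainBundle] pathForResource:@"Info" ofType:@"plist"];
NSDictionary *attrs = [[NSFileManager defaultManager] attributesOfItemAtPath:plistPath error:nil];
if (attrs != nil && [attrs fileSize] != kExpectedInfoPlistSize)
{ /* the Info.plist has likely been modified */ }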
It looks like saving an MD5 checksum of the Info.plist and checking the cryptid flag (the encryption flag in the binary's Mach-O load commands, which crackers set to 0) should hold up for a while.
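A sketch of the checksum half of that idea, using CommonCrypto (kKnownPlistMD5 is a placeholder recorded from your shipping build):

#import <CommonCrypto/CommonDigest.h>

// Placeholder: the MD5 of the Info.plist in your shipping build.
static NSString * const kKnownPlistMD5 = @"<md5-of-your-shipping-Info.plist>";

// Hash the bundled Info.plist and compare against the recorded value.
NSData *plistData = [NSData dataWithContentsOfFile:
    [[NSBundle mainBundle] pathForResource:@"Info" ofType:@"plist"]];
unsigned char digest[CC_MD5_DIGEST_LENGTH];
CC_MD5([plistData bytes], (CC_LONG)[plistData length], digest);
NSMutableString *md5 = [NSMutableString string];
for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++) {
    [md5 appendFormat:@"%02x", digest[i]];
}
if (![md5 isEqualToString:kKnownPlistMD5])
{ /* the Info.plist does not match the shipped version */ }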
Check iTunesMetadata.plist for the date of purchase; sometimes, when an app is cracked, that date is changed to something outrageous.
Also check whether the purchaser name field exists; in my experience with cracking apps for personal use, it is usually removed. If anyone knows how the anti-dump protection of Temple Run works, you could use that in conjunction with some protection that poedCrackMod can't get around (Google poedCrackMod, create a hackulo.us account, go to the dev center, look for poedCrackMod, and install it on an iDevice).
Clutch, which will not crack apps with Temple Run-style protection, has a feature called OverDrive intended to silence an app's crack detection. poedCrackMod has LamestPatch, which isn't as good; poedCrackMod is also an open-source bash script that can be reverse engineered. To recap: you end up with an app whose copy protection can't be circumvented with Clutch/OverDrive but can be cracked with poedCrackMod, while poedCrackMod can't circumvent the app's own in-app piracy checks, and manually patching integrity checks in the app's executable is hard. So your app is hard to crack.
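The iTunesMetadata.plist checks mentioned above could look roughly like the sketch below; the key names (purchaseDate, purchaserName) are assumptions and vary between iTunes versions, so verify them against a real store copy.

// Look for fields that crackers commonly strip from iTunesMetadata.plist.
NSString *metaPath = [[[[NSBundle mainBundle] bundlePath] stringByDeletingLastPathComponent]
    stringByAppendingPathComponent:@"iTunesMetadata.plist"];
NSDictionary *meta = [NSDictionary dictionaryWithContentsOfFile:metaPath];
if (meta != nil) {
    id purchaseDate = [meta objectForKey:@"purchaseDate"];
    id purchaserName = [meta objectForKey:@"purchaserName"];
    if (purchaseDate == nil || purchaserName == nil)
    { /* fields normally present in a store copy are missing */ }
}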
Related
My iPhone project uses the sysctl(mib, 2, &argmax, &size, NULL, 0) function.
It deals with background processes on the iPhone and reads the modification times of files outside my application's directory.
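For context, a sketch of what that call typically does (querying KERN_ARGMAX, the usual first step when enumerating running processes via sysctl):

#include <sys/types.h>
#include <sys/sysctl.h>

// Ask the kernel for the maximum size of a process argument block.
static int queryArgMax(void) {
    int mib[2] = { CTL_KERN, KERN_ARGMAX };
    int argmax = 0;
    size_t size = sizeof(argmax);
    if (sysctl(mib, 2, &argmax, &size, NULL, 0) == -1) {
        return -1; // sysctl failed
    }
    return argmax;
}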
Apple's guidelines say: "Apps that read or write data outside its designated container area will be rejected."
Will my app be accepted into the App Store? Or is there a special certificate required to upload such apps?
Thanks.
Besides the succinct answer given here, you can always submit an appeal to the App Review Board (or maybe even send one in before your submission) and see if they will make an exception.
Here's where you can find information about that:
http://developer.apple.com/appstore/guidelines.html
But yes, don't expect to get lucky with this kind of app. If you word your explanation or appeal very well, there's a chance you might get approved.
Apps that break the rules will be rejected.
Your app breaks the rules.
Yes it will be rejected.
Can an iPhone app encrypt its stored data so that even a user with a jailbroken iOS device cannot access the file? For example, Game Center may sync with local data, and you do not want the user manipulating scores. You do not want your IAP to be circumvented either.
Is there a simple way to encrypt your data before writing to the device?
In case my questions are not very clear, here they are in detail:
When I'm using things like: [array writeToFile:path atomically:YES]; is there any auto-encryption that ensures only my app can access the file correctly?
If not, what is the simplest way to achieve it?
PS: I have now found that NSData can do the job, but the NSDataWritingFileProtectionComplete flag requires #if __IPHONE_4_0 <= __IPHONE_OS_VERSION_MAX_ALLOWED. I wonder what happens on unsupported devices?
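For what it's worth, a sketch of guarding that flag at compile time (reusing array and path from the question; on hardware or OS versions without data protection the file is simply written without it):

NSData *payload = [NSKeyedArchiver archivedDataWithRootObject:array];
#if __IPHONE_4_0 <= __IPHONE_OS_VERSION_MAX_ALLOWED
// iOS 4+ SDK: ask for hardware-backed file protection.
[payload writeToFile:path options:NSDataWritingFileProtectionComplete error:NULL];
#else
// Older SDKs: no protection option exists, so write normally.
[payload writeToFile:path atomically:YES];
#endif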
More information on iOS encryption is in @GrahamLee's answer to this question and in other iOS-tagged questions on security.stackexchange.com.
The basic summary, based on iPhone controls alone, is:
If someone has the device and it isn't locked, they can access all the data
If someone has the device and it is locked, they can get most of the data, and possibly all (some exceptions may apply)
You could obfuscate, and use encryption within your app when storing data, but an attacker could reverse engineer that encryption code and decrypt it anyway (a minimal sketch of this approach follows below).
You need to work out the value of such DRM techniques and decide whether they are worthwhile in this scenario.
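If you do go down that in-app encryption route, a minimal CommonCrypto sketch might look like the following; key and IV management are deliberately left as placeholders, and keeping the key out of the binary is the real problem.

#import <CommonCrypto/CommonCryptor.h>

// AES-CBC encrypt a blob before writing it to disk.
static NSData *encryptedData(NSData *plaintext, NSData *key, NSData *iv) {
    size_t bufferLength = [plaintext length] + kCCBlockSizeAES128;
    NSMutableData *ciphertext = [NSMutableData dataWithLength:bufferLength];
    size_t written = 0;
    CCCryptorStatus status = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding,
                                     [key bytes], kCCKeySizeAES128, [iv bytes],
                                     [plaintext bytes], [plaintext length],
                                     [ciphertext mutableBytes], bufferLength, &written);
    if (status != kCCSuccess) return nil;
    [ciphertext setLength:written];
    return ciphertext;
}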
These documents will help you find a more accurate answer:
http://sit.sit.fraunhofer.de/studies/en/sc-iphone-passwords.pdf
http://sit.sit.fraunhofer.de/studies/en/sc-iphone-passwords-faq.pdf
How exactly does Apple approve apps? Is the actual source code viewed?
While none of us have access to the internal review process (which appears to be continuously changing), there are a few things that can be said based on the responses that people have received.
First, Apple has no access to your source code, so they do not review that. You submit a binary as part of an application bundle, along with your other application resources.
They do, however, appear to scan your application's binary for certain symbols that indicate the use of private APIs. A number of applications started crashing after iPhone OS updates because they used these private APIs, so Apple has been cracking down on this.
There are plenty of applications on the store that have memory leaks or other performance issues. I know that I've submitted versions of my applications that had subtle leaks (since fixed) and had no problems with review. Therefore, it does not appear that they do any sort of performance testing or profiling.
The only place where a memory leak causes a problem during review is when the leak gets so bad that the application crashes while the reviewer is testing it. If your application crashes at any point during the review process, it will be rejected.
Beyond that, they have a checklist of user interface elements that they check for proper usage of (no persistent selections on table view rows, etc.). If your application deviates significantly from the Human Interface Guidelines when using these standard UI elements, you may get rejected.
Apple is very careful about copyright, particularly with their own images and artwork, so you may run into trouble if you use copyrighted material improperly.
Most of the rejection reasons you will face are preventable by making your application stable and by following platform guidelines, but some are not. Certain classes of applications have been rejected due to their intended use, and again the classes of applications that are allowed on the store change on a regular basis. This can add frustration and uncertainty when dealing with the App Store, but the vast majority of application types will never run into problems (as can be seen in the diversity of applications currently available).
Try running otool -L on your binary yourself; you can see immediately whether a private framework is linked.
No, they only have access to the binary code that you send them.
They can run this through profilers checking for memory leaks and the like.
They do not have access to your source; it is not part of what you send to them. They test the binary you send them for leaks and such. I think they also check what data your app sends out to make sure it isn't doing anything egregiously bad (sending passwords or the like).
Apple does not care about leaks or profiling information for your app; the operating system will kill your app if that gets out of hand. What they actually do is manually run your app and check whether it follows some of Apple's guidelines. In an automated process they extract symbols, selectors and strings from your binary and check those for usage of private APIs.
You might want to try nm -u on your (simulator) binary.
They only receive the binary in your IPA. They can get your resource files by extracting the IPA, and they can see which URLs you use in web service or URL requests. Nothing else beyond that.
I have an iPhone app which needs to have a self-destruct option. This app is going to be used in sensitive locations and holds some algorithms which are not to be known by anybody except the device holder.
What would be the most "complete" way of deleting the app?
I was thinking of somehow writing zeros over the nib file, or over the actual application.app, but I believe those folders are write-protected and sandboxed.
Anybody have any ideas of better ways to achieve this?
Elaboration (Taken from original poster's comments):
This is for a jailbroken iPhone.
These devices are going to be provided to military personnel; this device falling into enemy hands would be the least of my concerns. It's going to have a button to wipe the app. Once the app is overwritten with zeros, or better yet corrupted with garbage all over the "exe", it has no way of working, and recovery would require inspecting the iPod's flash chip with equipment that I am 100% sure the wrong people won't have.
If you are openly storing the code that contains this algorithm within your application, there's nothing stopping the "wrong people" from jailbreaking the device and copying the complete file structure of the device before you run your "wipe" process.
Additionally, if you are dealing with a U.S. Government customer, I doubt that they will approve of the purchase of a jailbroken device, given that the vendor of such a device has claimed that jailbreaking is illegal. Whether or not this will hold up in court, the government tends to be conservative in these matters and err on the side of caution. Because Apple is a large U.S. company and a vendor to the government, I wouldn't expect the government procurers to take the jailbreakers' side in this.
My recommendation would be to encrypt the particular algorithms within a file in your application's bundle, and require the user of this application to decrypt this file into memory with the correct (difficult) password. That way, even if the "bad guys" were to gain access to the application, they wouldn't have everything they need to access these algorithms and would have to brute-force the password on the encrypted portion. This could be done on a standard, non-jailbroken device.
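A sketch of that recommendation, assuming an AES-encrypted resource named algorithm.dat, a salt shipped alongside it, and a key derived from the user's passphrase with PBKDF2 (all of these names and parameters are illustrative):

#import <CommonCrypto/CommonCryptor.h>
#import <CommonCrypto/CommonKeyDerivation.h>

// Derive a key from the passphrase, then decrypt the bundled blob into memory only.
static NSData *decryptBundledAlgorithm(NSString *passphrase, NSData *salt) {
    NSData *blob = [NSData dataWithContentsOfFile:
        [[NSBundle mainBundle] pathForResource:@"algorithm" ofType:@"dat"]];
    NSMutableData *key = [NSMutableData dataWithLength:kCCKeySizeAES256];
    CCKeyDerivationPBKDF(kCCPBKDF2,
                         [passphrase UTF8String],
                         [passphrase lengthOfBytesUsingEncoding:NSUTF8StringEncoding],
                         [salt bytes], [salt length],
                         kCCPRFHmacAlgSHA1, 10000,
                         [key mutableBytes], [key length]);
    NSMutableData *plain = [NSMutableData dataWithLength:[blob length] + kCCBlockSizeAES128];
    size_t written = 0;
    if (CCCrypt(kCCDecrypt, kCCAlgorithmAES128, kCCOptionPKCS7Padding,
                [key bytes], kCCKeySizeAES256, NULL /* in practice, prepend a real IV */,
                [blob bytes], [blob length],
                [plain mutableBytes], [plain length], &written) != kCCSuccess) {
        return nil; // wrong passphrase or tampered file
    }
    [plain setLength:written];
    return plain; // keep only in memory; never write the plaintext to disk
}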
The U.S. Army is rolling out iPods in the field, with custom applications on them, so I'm sure you're not the first person facing this challenge. If this work is being funded through a Department of Defense SBIR grant (or similar), you may even be able to contact your contracting officer and see if they can put you in touch with people at the appropriate agency who may be able to help you out with this (or even determine whether it is an issue to begin with).
I'm going to go out on a limb here and say you may not want to use the iPhone for this type of app. There are intentional limitations on this exact type of action in the iPhone OS and in SpringBoard. If you are doing something so sensitive that it can't fall into unauthorized hands, my recommendation would be to use a different, more customizable and controllable platform.
Unless you're working from a jailbroken device, you're probably going to run into problems here.
Even if you can find a way to automatically delete the app, you're still running the risk of those algorithms getting into the wrong hands - you would essentially be running into the same problems that Apple has with jailbreaking - once the device is in someone else's hands, it only takes the proper amount of motivation for the data to be accessed.
The only way to secure your algorithms is to pass the data to a remote server and get the results. There's still a possibility of a security breach, but it's much, much lower.
I don't know how well this would work, but you could store the algorithm as a file inside the application bundle, run the algorithm from that file possibly using a scripting language or something, and delete that file if you need to.
The folders are sandboxed, but your application is in there. On my jailbroken iPhone I see that the files are all owned by the mobile user, so I don't see any reason why you can't just overwrite all the files with zeros and then delete them.
The application bundle is effectively read-only, perhaps you should store some of the information in an encrypted form somewhere on a network.
Even if you find a way to write over the app in the flash memory, you really aren't erasing the app. Flash memory chips use wear leveling algorithms to reduce writes to the same blocks and so when you write out zeroes they are typically written to a new block of memory and not to the same block used before, so you really aren't erasing anything. The data can still be recovered from the flash chip (by a pro).
Another option is to separate out the parameters of the algorithm so that the algorithm is no longer sensitive (or at least not usable) and provide the parameters encrypted in a file. Then provide the key to authorized users via the network and don't store that key into flash, only RAM. They would need to get the key every time they start the app. Only give the key to authorized users. Of course, you'll also need to encrypt that key for transmission over the network with another key... There are systems for doing this, don't invent your own, in any case you'll need a crypto expert to do this right.
I would use the built in encryption to store the data, with a key the user has to enter to decrypt it. Without the key it doesn't matter if the data blob is recovered from the device.
Is it possible to assign different identifiers, hard-coded into the application, to each copy of an app downloaded from the App Store? Or is there any way of permanently storing an identifier in the application bundle such that when it is copied, the key remains within the bundle?
EDIT: OK, how about iTunes receipts, can they be used to verify when the app was downloaded, since the user has to register the app with the server within, say, 5 hours of downloading it?
thanks in advance
I'm assuming your goal here is to disable part of the functionality of your app by having a master list of bogus serial numbers somewhere. Unfortunately there is no per-copy serial number available, and if there were it would be the first thing the bad guys would change before posting your app for download.
Instead you'll need to detect whether your app bundle has been tampered with from within the app. See this question:
Reducing piracy of iPhone applications
You'll then need to decide how subtly or obviously you want to limit functionality. Probably the best solution would be to do something innocuous but slightly annoying that generates a specific kind of support request, at which point you can gently prod the deadbeat into considering buying a legit copy.
An approach with more false positives but potentially fewer false negatives would be to check if the app is running on a jailbroken device. The downside there is that jailbreakers may well have legitimately purchased your app, so you're alienating honest customers for little to no extra benefit.
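One common heuristic for that jailbreak check, as a sketch (it is easy to spoof and produces false positives and negatives, which is exactly the trade-off described above):

// Look for files that typically exist only on jailbroken devices.
static BOOL looksJailbroken(void) {
    NSArray *markers = [NSArray arrayWithObjects:@"/Applications/Cydia.app",
                                                 @"/bin/bash",
                                                 @"/private/var/lib/apt", nil];
    for (NSString *path in markers) {
        if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
            return YES;
        }
    }
    return NO;
}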
For the app I'm working on, which has a big social/viral aspect (I hope), I've decided that potential deadbeats probably have enough honest friends to pay for the server cycles that they're stealing, and it's just not worth worrying about.
No, there's no way to do either of these. The closest you could come would be to store device IDs on a central server.
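A sketch of that central-server approach, assuming a hypothetical https://example.com/register endpoint (identifierForVendor is used here; the old uniqueIdentifier that was common when this was asked has since been removed):

// Send a per-device identifier to your own server so it can track registrations.
NSString *deviceID = [[[UIDevice currentDevice] identifierForVendor] UUIDString];
NSURL *url = [NSURL URLWithString:
    [NSString stringWithFormat:@"https://example.com/register?device=%@", deviceID]];
[[[NSURLSession sharedSession] dataTaskWithURL:url
    completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        /* record whether the server accepted this device */
    }] resume];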