Does UserDefaults.integer work on 32-bit platforms? - swift

Will the following code work on 32-bit platforms?
class Account {
    private static let UserIDKey = "AccountUserIDKey"

    class var userID: Int64? {
        get { Int64(UserDefaults.standard.integer(forKey: UserIDKey)) }
        set { UserDefaults.standard.set(newValue, forKey: UserIDKey) }
    }
}
That is, will it still work on 32-bit platforms if I store a value greater than Int32.max into UserDefaults and then retrieve it?
I'm wondering because I only see the UserDefaults instance method integer(forKey:). Why doesn't UserDefaults have an instance method like int64(forKey:)?

You can save a number of simple variable types in the user defaults:
Booleans with Bool
Integers with Int
Floats with Float
Doubles with Double
Strings with String
Binary data with Data
Dates with Date
URLs with the URL type
Collection types with Array and Dictionary
Internally the UserDefaults class can only store NSData, NSString, NSNumber, NSDate, NSArray and NSDictionary classes.
These are object types that can be saved in a property list.
An NSNumber is an NSObject that can contain the original C style numeric types, which even include the C/Objective-C style BOOL type (that stores YES or NO). Thankfully for us, these are also bridged to Swift, so for a Swift program, an NSNumber can automatically accept the following Swift types:
UInt
Int
Float
Double
Bool
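Since an NSNumber keeps all 64 bits regardless of the platform's word size, one answer to the "why is there no int64(forKey:)?" part of the question is that you can add such an accessor yourself. A minimal sketch (this extension is not part of Foundation; the names are mine):
import Foundation

extension UserDefaults {
    // Hypothetical convenience accessor, not a Foundation API.
    func int64(forKey key: String) -> Int64? {
        // UserDefaults stores numbers as NSNumber, which preserves the full
        // 64-bit value even on 32-bit platforms.
        return (object(forKey: key) as? NSNumber)?.int64Value
    }
}

let defaults = UserDefaults.standard
defaults.set(NSNumber(value: Int64(5_000_000_000)), forKey: "AccountUserIDKey")

if let userID = defaults.int64(forKey: "AccountUserIDKey") {
    print(userID) // 5000000000
}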
From Dmitry Popov's answer on Quora:
Remember how you were taught in school to add numbers like 38 and 54. You work with separate digits. You take the rightmost digits, 8 and 4, add them, and get the digit 2 in the answer with 1 carried because of an overflow. You take the second digits, 3 and 5, add them to get 8, and add the carried 1 to get 9, so the whole answer becomes 92. A 32-bit processor does exactly the same thing, but instead of decimal digits it works with 32-bit integers; the algorithm is the same: work on the two 32-bit halves of a 64-bit number separately, and deal with overflow by carrying from the lower word to the higher word. This takes several 32-bit instructions, and the compiler is responsible for encoding them properly.
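A toy Swift sketch of that carrying scheme (Swift 4+ syntax assumed; in practice the compiler emits the equivalent instructions for you on 32-bit targets):
// Add two 64-bit values using only 32-bit pieces, carrying from the low word
// to the high word, exactly as the paragraph above describes.
func add64(_ a: UInt64, _ b: UInt64) -> UInt64 {
    let aLow = UInt32(truncatingIfNeeded: a), aHigh = UInt32(truncatingIfNeeded: a >> 32)
    let bLow = UInt32(truncatingIfNeeded: b), bHigh = UInt32(truncatingIfNeeded: b >> 32)
    let (low, carry) = aLow.addingReportingOverflow(bLow)
    let high = aHigh &+ bHigh &+ (carry ? 1 : 0)
    return UInt64(high) << 32 | UInt64(low)
}

print(add64(5_000_000_000, 5_000_000_000)) // 10000000000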
Save Int64 to UserDefaults
import Foundation

let defaults = UserDefaults.standard

// Define a 64-bit integer with `Int64`
let myInteger: Int64 = 1_000_000_000_000_000

// Save the integer to `UserDefaults`
defaults.set(myInteger, forKey: "myIntegerKey")

// Get the saved value back from `UserDefaults`
if let savedInteger = defaults.object(forKey: "myIntegerKey") as? Int64 {
    print("my 64-bit integer is:", savedInteger)
}
Save a 64-bit Int to UserDefaults by converting it to a String
let myInteger: Int64 = 1_000_000_000_000_000
let myString = String(myInteger)

// Save `myString` to `UserDefaults`
UserDefaults.standard.set(myString, forKey: "myIntegerKey")

// Get the value back from `UserDefaults` and convert it to `Int64` again
if let savedString = UserDefaults.standard.string(forKey: "myIntegerKey"),
   let savedInteger = Int64(savedString) {
    print(savedInteger) // 1000000000000000
}
Resources
How is a UInt64 represented in a 32 bit device
Martin R's SO answer to store 64bit Int into UserDefaults
learnappmaking.com
codingexplorer.com NSUserDefaults a Swift introduction

Related


How do you get the length of a String? For example, I have a variable defined like:
var test1: String = "Scott"
However, I can't seem to find a length method on the string.
As of Swift 4+
It's just:
test1.count
for reasons.
(Thanks to Martin R)
As of Swift 2:
With Swift 2, Apple changed global functions to protocol extensions: extensions that match any type conforming to a protocol. Thus the new syntax is:
test1.characters.count
(Thanks to JohnDifool for the heads up)
As of Swift 1
Use the count characters method:
let unusualMenagerie = "Koala 🐨, Snail 🐌, Penguin 🐧, Dromedary 🐪"
println("unusualMenagerie has \(count(unusualMenagerie)) characters")
// prints "unusualMenagerie has 40 characters"
right from the Apple Swift Guide
(note, for versions of Swift earlier than 1.2, this would be countElements(unusualMenagerie) instead)
for your variable, it would be
let length = count(test1) // was countElements in earlier versions of Swift
Or you can use test1.utf16count
TLDR:
For Swift 2.0 and 3.0, use test1.characters.count. But, there are a few things you should know. So, read on.
Counting characters in Swift
Before Swift 2.0, count was a global function. As of Swift 2.0, it can be called as a member function.
test1.characters.count
It will return the actual number of Unicode characters in a String, so it's the most correct alternative in the sense that, if you'd print the string and count characters by hand, you'd get the same result.
However, because of the way Strings are implemented in Swift, characters don't always take up the same amount of memory, so be aware that this behaves quite differently than the usual character count methods in other languages.
For example, you can also use test1.utf16.count
But, as noted below, the returned value is not guaranteed to be the same as that of calling count on characters.
From the language reference:
Extended grapheme clusters can be composed of one or more Unicode scalars. This means that different characters—and different representations of the same character—can require different amounts of memory to store. Because of this, characters in Swift do not each take up the same amount of memory within a string's representation. As a result, the number of characters in a string cannot be calculated without iterating through the string to determine its extended grapheme cluster boundaries. If you are working with particularly long string values, be aware that the characters property must iterate over the Unicode scalars in the entire string in order to determine the characters for that string.
The count of the characters returned by the characters property is not always the same as the length property of an NSString that contains the same characters. The length of an NSString is based on the number of 16-bit code units within the string's UTF-16 representation and not the number of Unicode extended grapheme clusters within the string.
An example that perfectly illustrates the situation described above is that of checking the length of a string containing a single emoji character, as pointed out by n00neimp0rtant in the comments.
var emoji = "👍"
emoji.characters.count //returns 1
emoji.utf16.count //returns 2
Swift 1.2 Update: There's no longer a countElements for counting the size of collections. Just use the count function as a replacement: count("Swift")
Swift 2.0, 3.0 and 3.1:
let strLength = string.characters.count
Swift 4.2 (4.0 onwards): [Apple Documentation - Strings]
let strLength = string.count
Swift 1.1
extension String {
    var length: Int { return countElements(self) }
}
Swift 1.2
extension String {
    var length: Int { return count(self) }
}
Swift 2.0
extension String {
    var length: Int { return characters.count }
}
Swift 4.2
extension String {
    var length: Int { return self.count }
}

let str = "Hello"
let count = str.length // returns 5 (Int)
Swift 4
"string".count
;)
Swift 3
extension String {
    var length: Int {
        return self.characters.count
    }
}
Usage:
"string".length
If you are just trying to see whether a string is empty or not (checking for a length of 0), Swift offers a simple Boolean isEmpty property on String:
myString.isEmpty
The other side of this coin was people asking in Objective-C how to check whether a string was empty, where the answer was to check for a length of 0:
NSString is empty
Swift 5.1, 5
let flag = "🇵🇷"
print(flag.count)
// Prints "1" -- Counts the characters and emoji as length 1
print(flag.unicodeScalars.count)
// Prints "2" -- Counts the unicode lenght ex. "A" is 65
print(flag.utf16.count)
// Prints "4"
print(flag.utf8.count)
// Prints "8"
tl;dr If you want the length of a String type in terms of the number of human-readable characters, use countElements(). If you want to know the length in terms of the number of extended grapheme clusters, use endIndex. Read on for details.
The String type is implemented as an ordered collection (i.e., sequence) of Unicode characters, and it conforms to the CollectionType protocol, which conforms to the _CollectionType protocol, which is the input type expected by countElements(). Therefore, countElements() can be called, passing a String type, and it will return the count of characters.
However, in conforming to CollectionType, which in turn conforms to _CollectionType, String also implements the startIndex and endIndex computed properties, which actually represent the position of the index before the first character cluster, and position of the index after the last character cluster, respectively. So, in the string "ABC", the position of the index before A is 0 and after C is 3. Therefore, endIndex = 3, which is also the length of the string.
So, endIndex can be used to get the length of any String type, then, right?
Well, not always...Unicode characters are actually extended grapheme clusters, which are sequences of one or more Unicode scalars combined to create a single human-readable character.
let circledStar: Character = "\u{2606}\u{20DD}" // ☆⃝
circledStar is a single character made up of U+2606 (a white star), and U+20DD (a combining enclosing circle). Let's create a String from circledStar and compare the results of countElements() and endIndex.
let circledStarString = "\(circledStar)"
countElements(circledStarString) // 1
circledStarString.endIndex // 2
In Swift 2.0 count doesn't work anymore. You can use this instead:
var testString = "Scott"
var length = testString.characters.count
Here's something shorter, and more natural than using a global function:
aString.utf16count
I don't know if it's available in beta 1, though. But it's definitely there in beta 2.
Updated for Xcode 6 beta 4, change method utf16count --> utf16Count
var test1: String = "Scott"
var length = test1.utf16Count
Or
var test1: String = "Scott"
var length = test1.lengthOfBytesUsingEncoding(NSUTF16StringEncoding)
As of Swift 1.2 utf16Count has been removed. You should now use the global count() function and pass the UTF16 view of the string. Example below...
let string = "Some string"
count(string.utf16)
For Xcode 7.3 and Swift 2.2.
let str = "🐶"
If you want the number of visual characters:
str.characters.count
If you want the "16-bit code units within the string’s UTF-16 representation":
str.utf16.count
Most of the time, the first is what you need.
When would you need the second? I've found a use case for it:
let regex = try! NSRegularExpression(pattern:"🐶",
options: NSRegularExpressionOptions.UseUnixLineSeparators)
let str = "🐶🐶🐶🐶🐶🐶"
let result = regex.stringByReplacingMatchesInString(str,
options: NSMatchingOptions.WithTransparentBounds,
range: NSMakeRange(0, str.utf16.count), withTemplate: "dog")
print(result) // dogdogdogdogdogdog
If you use 1, the result is incorrect:
let result = regex.stringByReplacingMatchesInString(str,
options: NSMatchingOptions.WithTransparentBounds,
range: NSMakeRange(0, str.characters.count), withTemplate: "dog")
print(result) // dogdogdog🐶🐶🐶
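As an aside, in Swift 4 and later you can build the NSRange from the string's own indices instead of counting UTF-16 code units by hand. A rough modern sketch of the same replacement (names reused from above):
import Foundation

let str = "🐶🐶🐶🐶🐶🐶"
let regex = try! NSRegularExpression(pattern: "🐶", options: [])

// NSRange(_:in:) converts the String range to the correct UTF-16-based range.
let range = NSRange(str.startIndex..., in: str)
let result = regex.stringByReplacingMatches(in: str, options: [], range: range, withTemplate: "dog")
print(result) // dogdogdogdogdogdog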
You could try like this
var test1: String = "Scott"
var length = test1.bridgeToObjectiveC().length
In Swift 2.x, the following is how to find the length of a string:
let findLength = "This is a string of text"
findLength.characters.count
returns 24
Swift 2.0:
Get a count: yourString.text.characters.count
Fun example of how this is useful would be to show a character countdown from some number (150 for example) in a UITextView:
func textViewDidChange(textView: UITextView) {
    yourStringLabel.text = String(150 - yourStringTextView.text.characters.count)
}
In Swift 4 I had always used string.count; today I found that
string.endIndex.encodedOffset
is a better substitution because it is faster: for a 50,000-character string it is about 6 times faster than .count. .count depends on the string length, but .endIndex.encodedOffset doesn't.
There is one caveat, though: it is not good for strings containing emoji; it will give a wrong result, so in that case only .count is correct.
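A quick illustration of that caveat (Swift 4 syntax; encodedOffset was later deprecated in favor of utf16Offset(in:)):
let plain = "hello"
print(plain.count)                  // 5
print(plain.endIndex.encodedOffset) // 5 -- matches for plain ASCII

let emoji = "👍"
print(emoji.count)                  // 1
print(emoji.endIndex.encodedOffset) // 2 -- a UTF-16 offset, not a character count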
In Swift 4:
If the string contains only ASCII characters, use the following:
let str: String = "abcd"
let count = str.count // output 4
If the string contains non-ASCII characters, the character count and the UTF-8 byte count differ:
let spain = "España"
let count1 = spain.count // output 6
let count2 = spain.utf8.count // output 7
In Xcode 6.1.1
extension String {
    var length: Int { return self.utf16Count }
}
I think that brainiacs will change this on every minor version.
Get the string value from your text view or text field:
let textLengthString = (yourTextView?.text)! as String
Find the count of the characters in the string:
let numberOfChars = textLengthString.characters.count
Here is what I ended up doing:
let replacementTextAsDecimal = Double(string)
if string.characters.count > 0 &&
    replacementTextAsDecimal == nil &&
    replacementTextHasDecimalSeparator == nil {
    return false
}
Swift 4 update, compared with Swift 3
Swift 4 removes the need for a characters array on String. This means that you can directly call count on a string without getting the characters array first.
"hello".count // 5
Whereas in Swift 3, you have to get the characters array and then count the elements in that array. Note that the following method is still available in Swift 4.0, as you can still call characters to access the characters array of a given string:
"hello".characters.count // 5
Swift 4.0 also adopts Unicode 9 and can now interpret grapheme clusters. For example, counting an emoji will give you 1, while in Swift 3.0 you may get a count greater than 1.
"👍🏽".count // Swift 4.0 prints 1, Swift 3.0 prints 2
"👨‍❤️‍💋‍👨".count // Swift 4.0 prints 1, Swift 3.0 prints 4
Swift 4
let str = "Your name"
str.count
Remember: spaces are also counted in the number.
You can get the length simply by writing an extension (keep only the variant that matches your Swift version):
extension String {
    // MARK: Use if it's Swift 2
    func stringLength(str: String) -> Int {
        return str.characters.count
    }

    // MARK: Use if it's Swift 3
    func stringLength(_ str: String) -> Int {
        return str.characters.count
    }

    // MARK: Use if it's Swift 4
    func stringLength(_ str: String) -> Int {
        return str.count
    }
}
The best way to count a String in Swift is this:
var str = "Hello World"
var length = count(str.utf16)
String and NSString are toll-free bridged, so you can use all the methods available on NSString with a Swift String:
let x = "test" as NSString
let y: NSString = "string 2"
let lenx = x.length
let leny = y.length
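Keep in mind that NSString's length counts UTF-16 code units, so it can differ from Swift's character count. A small illustration (Swift 3 syntax assumed):
let dog = "🐶"
print((dog as NSString).length) // 2 -- UTF-16 code units
print(dog.characters.count)     // 1 -- Characters (use dog.count in Swift 4+)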
test1.characters.count
will get you the number of letters/numbers etc in your string.
ex:
test1 = "StackOverflow"
print(test1.characters.count)
(prints "13")
Apple made it different from other major languages. The current way is to call:
test1.characters.count
However, be careful: when you say length you usually mean the count of characters, not the count of bytes, and those two can differ when you use non-ASCII characters.
For example:
"你好啊hi".characters.count will give you 5, but this is not the count of the bytes.
To get the real count of bytes, you need to do "你好啊hi".lengthOfBytes(using: String.Encoding.utf8). This will give you 11.
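As a runnable version of that example (Swift 3 syntax assumed):
let s = "你好啊hi"
print(s.characters.count)                           // 5 characters
print(s.lengthOfBytes(using: String.Encoding.utf8)) // 11 bytes in UTF-8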
Right now (in Swift 2.3), if you use:
myString.characters.count
the property returns a "Distance" type; if you need an Int, you should type cast like so:
var count = myString.characters.count as Int
My two cents for Swift 3/4:
If you need to conditionally compile:
#if swift(>=4.0)
let len = text.count
#else
let len = text.characters.count
#endif

Why converting Double to Int doesn't return optional Int in Swift?

Converting a String to Int returns an optional value, but converting a Double to Int does not return an optional value. Why is that? I wanted to check whether a double value is bigger than the maximum Int value, but because the converting function does not return an optional value, I am not able to check it with optional binding.
var stringNumber: String = "555"
var intValue = Int(stringNumber) // returns optional(555)
var doubleNumber: Double = 555
var fromDoubleToInt = Int(doubleNumber) // returns 555
So if I try to convert a double number bigger than maximum Integer, it crashes instead of returning nil.
var doubleNumber: Double = 55555555555555555555
var fromDoubleToInt = Int(doubleNumber) // Crashes here
I know that there's another way to check whether a double number is bigger than the maximum Integer value, but I'm curious as to why it happens this way.
If we consider that for most doubles, a conversion to Int simply means dropping the decimal part:
let pieInt = Int(3.14159) // 3
Then the only case in which an optional-returning Int(Double) initializer would actually need to return nil is the case of an overflow.
With strings, converting to Int returns an optional, because generally, strings, such as "Hello world!" cannot be represented as an Int in a way that universally makes sense. So we return nil in the case that the string cannot be represented as an integer. This includes, by the way, values that can be perfectly represented as doubles or floats:
Consider:
let iPi = Int("3.14159")
let dPi = Double("3.14159")
In this case, iPi is nil while dPi is 3.14159. Why? Because "3.14159" doesn't have a valid Int representation.
But meanwhile, when we use the Int constructor which takes a Double and returns non-optional, we get a value.
So, if that constructor is changed to return an optional, why would it return 3 for 3.14159 instead of nil? 3.14159 can't be represented as an integer.
But if you want a method that returns an optional Int, returning nil when the Double would overflow, you can just write that method.
extension Double {
    func toInt() -> Int? {
        let minInt = Double(Int.min)
        let maxInt = Double(Int.max)

        guard case minInt ... maxInt = self else {
            return nil
        }

        return Int(self)
    }
}
let a = 3.14159.toInt() // returns 3
let b = 555555555555555555555.5.toInt() // returns nil
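For what it's worth, Swift 3.1 and later also provide a failable exact-conversion initializer. Note that its semantics differ from the toInt() sketch above: it returns nil for any value that is not exactly representable, including values with a fractional part.
let exact = Int(exactly: 3.0)          // Optional(3)
let fractional = Int(exactly: 3.14159) // nil -- has a fractional part
let overflow = Int(exactly: 1e30)      // nil -- outside Int's range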
Failable initializers and methods with Optional return types are designed for scenarios where you, the programmer, can't know whether a parameter value will cause failure, or where verifying that an operation will succeed is equivalent to performing the operation:
let intFromString = Int(someString)
let valueFromDict = dict[someKey]
Parsing an integer from a string requires checking the string for numeric/non-numeric characters, so the check is the same as the work. Likewise, checking a dictionary for the existence of a key is the same as looking up the value for the key.
By contrast, certain operations are things where you, the programmer, need to verify upfront that your parameters or preconditions meet expectations:
let foo = someArray[index]
let bar = UInt32(someUInt64)
let baz: UInt = someUInt - anotherUInt
You can — and in most cases should — test at runtime whether index < someArray.count, someUInt64 <= UInt32.max, and someUInt >= anotherUInt. These assumptions are fundamental to working with those kinds of types. On the one hand, you really want to design around them from the start. On the other, you don't want every bit of math you do to be peppered with Optional unwrapping — that's why we have types whose axioms are stated upfront.
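If you do want Optional-style handling for those precondition checks, newer Swift versions (4 and later assumed here) offer checked alternatives; a small sketch:
let someArray = [10, 20, 30]
let index = 5
let someUInt64: UInt64 = 5_000_000_000
let someUInt: UInt = 1
let anotherUInt: UInt = 2

// nil instead of a crash when the index is out of bounds
let foo: Int? = someArray.indices.contains(index) ? someArray[index] : nil

// nil instead of a trap when the value doesn't fit into UInt32
let bar = UInt32(exactly: someUInt64)

// reports underflow instead of trapping
let baz = someUInt.subtractingReportingOverflow(anotherUInt)
print(foo as Any, bar as Any, baz.overflow) // nil nil true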

Swift: Convert Int16 to Int32 (or NSInteger)

I'm really stuck! I'm not an expert in ObjC, and now I am trying to use Swift. I thought it would be much simpler, but it isn't. I remember Craig saying that they call Swift "Objective-C without the C", but there are too many C types in OS X's Foundation. The documentation says that many ObjC types automatically convert, possibly bidirectionally, to Swift types. I'm curious: what about C types?
Here's where I'm stuck:
// array1: [String?], event.keyCode.value: Int16
let s = array1[event.keyCode.value] // error: 'Int16' is not convertible to 'Int'
I tried some ObjC/C-style casts:
let index = (Int) event.keyCode.value // Error
or
let index = (Int32) event.keyCode.value // Error again; Swift doesn't seem to support this syntax
What is the proper way to convert Int16 to Int?
To convert a number from one type to another, you have to create a new instance, passing the source value as parameter - for example:
let int16: Int16 = 20
let int: Int = Int(int16)
let int32: Int32 = Int32(int16)
I used explicit types for variable declarations, to make the concept clear - but in all the above cases the type can be inferred:
let int16: Int16 = 20
let int = Int(int16)
let int32 = Int32(int16)
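Applied to the code from the question, that would look something like this (array1 and event are the asker's own values):
// Build a platform Int from the Int16 value and use it as the array index.
let index = Int(event.keyCode.value)
let s = array1[index]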
This is not how this type of casting works in Swift. Instead, use:
let a : Int16 = 1
let b : Int = Int(a)
So, you basically instantiate one variable on the basis of the other.
In my case "I was trying to convert core data Int16 to Int" explicit casting didn't worked. I did this way
let quantity_string:String = String(format:"%d",yourInt16value)
let quantity:Int = Int(quantity_string)

Saving score with NSUserDefaults: UInt8 not convertible to Int8

I'm creating a simple game. If the player loses, I want his current score to be compared with the best score saved in persistent storage via NSUserDefaults. I keep getting "UInt8 is not convertible to Int8". This is the code:
scoreForComparingWithBestScore = UInt8(score)
if (scoreForComparingWithBestScore > NSUserDefaults.standardUserDefaults().objectForKey("BestScore")) {
    NSUserDefaults.standardUserDefaults().objectForKey("BestScore") = scoreForComparingWithBestScore
}
Even if I type
scoreForComparingWithBestScore = Int8(score)
the error is just reversed.
Thanks in advance!
NSUserDefaults.standardUserDefaults().objectForKey("BestScore") returns an AnyObject?. To compare this with a UInt8 (or any other Swift numeric value type – you might want to reconsider UInt8 and use Int instead unless there's a very good reason to stick with that type), you need to both unwrap the optional and also convert it to a number:
if let best = NSUserDefaults.standardUserDefaults().objectForKey("BestScore")?.integerValue {
    // best will be an Int here; you can compare it to scoreForComparingWithBestScore
}
else {
    // either there was no BestScore value, or there was but it wasn't an integer
}
Alternatively, you could use the ?? operator to default the best score to zero if not present (or not an integer):
let best = NSUserDefaults
.standardUserDefaults()
.objectForKey("BestScore")?
.integerValue ?? 0
NSUserDefaults has convenience methods to store different types of objects.
In your case, rather than storing objects directly and then trying to convert them to the correct type, you could just use the convenience methods instead:
let best = NSUserDefaults.standardUserDefaults().integerForKey("BestScore")
// best is a non-optional Int here (it defaults to 0 if nothing was ever stored),
// and you can compare it directly to scoreForComparingWithBestScore
As long as you set the value with:
NSUserDefaults.standardUserDefaults().setInteger(best, forKey: "BestScore")
then, because of the strong type system, you know that you have put an integer into the defaults, and integerForKey simply returns 0 when no value has been set for that key, so you don't have to worry about whether the stored object is of the correct type.
This will also drive your design, as you know that the value should be an Int rather than over-optimising it as an Int8 or a UInt8.
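Putting those convenience methods together, the asker's compare-and-save flow might look roughly like this (a sketch; score is assumed to be an Int):
let defaults = NSUserDefaults.standardUserDefaults()
let best = defaults.integerForKey("BestScore") // 0 if nothing has been stored yet

if score > best {
    defaults.setInteger(score, forKey: "BestScore")
}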
Here are some examples of how to save values to NSUserDefaults and how to retrieve them. What you see here is how I save text from a UITextField and the state of a UISwitch, and how I retrieve that information from NSUserDefaults.
func saveNSUserDefaults() {
    let defaults: NSUserDefaults = NSUserDefaults.standardUserDefaults()
    defaults.setObject(self.textFieldUrl.text + "/portal/silvertabletmenu.aspx", forKey: "url")
    defaults.setBool(switchToolbar!.on, forKey: "toolbar")
    defaults.synchronize()
}

func loadNSUserDefaults() {
    let defaults: NSUserDefaults = NSUserDefaults.standardUserDefaults()
    if let savedUrl = defaults.objectForKey("url") as? String {
        self.textFieldUrl.text = savedUrl
    }
    if textFieldUrl.text == "" {
        self.textFieldUrl.text = "https://thisisjustanexample.net/"
    }
    switchToolbar!.on = NSUserDefaults.standardUserDefaults().boolForKey("toolbar")
}

Convert JSON AnyObject to Int64

This is somewhat related to this question: How to properly store timestamp (ms since 1970)
Is there a way to typecast an AnyObject to Int64? I am receiving a huge number via JSON; this number arrives at my class as AnyObject. How can I cast it to Int64? Xcode just says it's not possible.
JSON numbers are NSNumber, so you'll want to go through there.
import Foundation
var json:AnyObject = NSNumber(longLong: 1234567890123456789)
var num = json as? NSNumber
var result = num?.longLongValue
Note that result is Int64?, since you don't know that this conversion will be successful.
You can cast from an AnyObject to an Int with the as type-cast operator, but to downcast into different numeric types you need to use the target type's initializer.
var o: AnyObject = 1
var n: Int = o as Int
var u: Int64 = Int64(n)
Try SwiftJSON, which is a better way to deal with JSON data in Swift:
let json = SwiftJSON.JSON(data: dataFromServer)
if let number = json["number"].longLong {
    // do what you want
} else {
    // print the error message if you like
    println(json["number"])
}
As @Rob Napier's answer says, you are dealing with NSNumber. If you're sure you have a valid one, you can do this to get an Int64:
(json["key"] as! NSNumber).longLongValue