NSLocalizedString with format specifiers in Swift yields garbage

To facilitate easier localizing in a very small app of mine, I have this String extension method:
extension String {
    func localized(with values: Any...) -> String {
        // debug values
        for v in values {
            print("\(type(of: v)): \(v)")
        }
        return String.localizedStringWithFormat(NSLocalizedString(self, comment: ""), values)
    }
}
My German localization of Localizable.strings contains this key/value pair:
"WeeksFuture" = "In %d Wochen";
Doing this:
for _ in 0..<5 {
    let localized = "WeeksFuture".localized(with: 3)
    print(localized)
}
while having Xcode set to debug the app in German (although this happens in every other language too) prints this to the output window:
Int: 3
In 151.456 Wochen
Int: 3
In 186.912 Wochen
Int: 3
In 186.880 Wochen
Int: 3
In 187.264 Wochen
Int: 3
In 187.488 Wochen
Obviously, this is all wrong. Why do I first get the correct output of "Int: 3", and then a string with a seemingly random garbage number?

String.localizedStringWithFormat takes a String and CVarArg... as arguments. You passed the whole values array (an [Any]) as a single variadic argument, so the %d specifier ends up formatting the array itself rather than the Int inside it, which is why you see a seemingly random number.
To solve this problem, you just need to find an overload that takes a [CVarArg] instead. Luckily, there is an init overload like that:
return String.init(format: NSLocalizedString(self, comment: ""), arguments: values)
However, values is an [Any], which is not compatible with the expected [CVarArg]. You should probably change the parameter type.
So your whole extension looks like this:
extension String {
    func localized(with values: CVarArg...) -> String {
        return String.init(format: NSLocalizedString(self, comment: ""), arguments: values)
    }
}
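With that change, the loop from the question prints the expected text; a quick check (assuming the same "WeeksFuture" key shown above):
for _ in 0..<5 {
    print("WeeksFuture".localized(with: 3)) // "In 3 Wochen" every time
}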

Related

Adding numbers inside a string in Swift

Reading through this problem in a book
Given a string that contains both letters and numbers, write a
function that pulls out all the numbers then returns their sum. Sample
input and output
The string “a1b2c3” should return 6 (1 + 2 + 3). The string
“a10b20c30” should return 60 (10 + 20 + 30). The string “h8ers” should
return “8”.
My solution so far is
import Foundation
func sumOfNumbers(in string: String) -> Int {
    var numbers = string.filter { $0.isNumber }
    var numbersArray = [Int]()
    for number in numbers {
        numbersArray.append(Int(number)!)
    }
    return numbersArray.reduce(0, { $0 * $1 })
}
However, I get the error
Solution.swift:8:33: error: cannot convert value of type 'String.Element' (aka 'Character') to expected argument type 'String'
numbersArray.append(Int(number)!)
^
And I'm struggling to get this number of type String.Element into a Character. Any guidance would be appreciated.
The error occurs because Int.init is expecting a String, but the argument number you gave is of type Character.
It is easy to fix the compiler error by converting the Character to a String:
numbersArray.append(Int("\(number)")!)
or just:
numbersArray.append(number.wholeNumberValue!)
However, this does not produce the expected output. First, you are multiplying the numbers together, not adding. Second, you are considering each character separately, and not considering groups of digits as one number.
You can instead implement the function like this:
func sumOfNumbers(in string: String) -> Int {
    string.components(separatedBy: CharacterSet(charactersIn: "0"..."9").inverted)
        .compactMap(Int.init)
        .reduce(0, +)
}
The key thing is to split the string on non-digit characters, so that "10", "20", etc. get treated as whole numbers rather than individual digits.
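As a quick check against the book's sample inputs (using the function above):
sumOfNumbers(in: "a1b2c3")    // 6
sumOfNumbers(in: "a10b20c30") // 60
sumOfNumbers(in: "h8ers")     // 8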

Cannot invoke 'reduce' with an argument list of type '(String, (String) -> String)'

I am trying to convert Swift 3 to Swift 4 for a repo on github. Here is a function that blocks me.
func times(_ n: Int) -> String {
    return (0..<n).reduce("") { $0 + self }
}
The error Xcode gives is:
"Cannot invoke 'reduce' with an argument list of type '(String, (String) -> String)'"
I looked at Apple's official page and found reduce(_:_:) and reduce(into:_:), and someone's question. I have tried the code below but I still can't get it to work. Please point out what I am missing.
return (0..<n).character.reduce("") { string, character in
(0..<n) + self }
return (0..<n).character.reduce("") { $0 + self }
Here $0 refers to the closure's first argument (I think). Then we can use the self property to refer to the current instance within its own instance methods.
Your closure receives two parameters and you're only using one ($0). You could use $0.0 in the closure, or simply use the String initializer that does the same thing without reduce:
func times(_ n: Int) -> String {
    return String(repeating: self, count: n)
}
OR, if you want to use Python-like multiplication to repeat a string, you could add an operator:
extension String {
    static func * (lhs: String, rhs: Int) -> String {
        return String(repeating: lhs, count: rhs)
    }
}
// then, the following will work nicely:
"blah " * 10 // ==> "blah blah blah blah blah blah blah blah blah blah "
The answer depends on what you are trying to do. It can be hard to grasp exactly what reduce does, but the main point of the function is to reduce a sequence into a single value.
Take a look at an example:
let items = ["one", "two", "three"]
let final = items.reduce("initial") { text, item in "\(text), \(item)" }
print(final) // initial, one, two, three
In the closure, text is the accumulating string. The initial value is set as a parameter, "initial" in our example. After the first iteration, text would be "initial, one"; after the second, "initial, one, two"; and so on. That's because we set the rule for how to reduce the array: "\(text), \(item)".
In your example:
func times(_ n: Int) -> String {
    return (0..<n).reduce("") { $0 + self }
}
First, (0..<n) creates a range of n values: 0, 1, 2, 3, and so on up to n-1.
Then we set the initial value to an empty string.
Beyond that, I can't know what you need.
Maybe you want a result string like "0123456789...", in which case the code would be:
let reduced = (0..<n).reduce("") { text, value in text + "\(value)" }
Hope that helps =)

Swift 3 String has no member components [duplicate]

So I'm trying to prepare myself for coding interviews by doing HackerRank's test case samples. If you're familiar with the process, you usually take a standard input that has various lines of strings and you extract the information based on what the question is asking. I have come across numerous questions where they will give you a line (as a String) with n number of integers separated by a space (i.e. 1 2 3 4 5). In order to solve the problem I need to extrapolate an array of Int ([Int]) from a String. I came up with this nifty method:
func extractIntegers(_ s: String) -> [Int] {
    let splits = s.characters.split { [" "].contains(String($0)) }
    return splits.map { Int(String($0).trimmingCharacters(in: .whitespaces))! }
}
So I code it in my Playground and it works fantastic, I even run multiple test cases I make up, and they all pass with flying colors...then I copy the code to HackerRank and try running it for submission. And I get this:
solution.swift:16:29: error: value of type 'String' has no member 'trimmingCharacters'
return splits.map { Int(String($0).trimmingCharacters(in: .whitespaces))! }
So... okay maybe HR hasn't updated everything for Swift 3 yet. No big deal! I have an idea for an even cleaner solution! Here it is:
func extractIntegers(_ s: String) -> [Int] {
    return s.components(separatedBy: " ").map { Int($0)! }
}
....AAAAANDDD of course:
solution.swift:15:12: error: value of type 'String' has no member 'components'
return s.components(separatedBy: " ").map { Int($0)! }
So now I'm forced to use a really sloppy method where I loop through all the characters, check for spaces, append substrings from ranges between spaces into an array, and then map that array and return it.
Does anyone have any other clean ideas to work around HR's inadequacies with Swift? I would like any recommendations I can get!
Thanks in advance!
The String methods
func trimmingCharacters(in set: CharacterSet) -> String
func components(separatedBy separator: String) -> [String]
are actually methods of the NSString class, defined in the Foundation
framework, and "bridged" to Swift. Therefore, to make your code compile,
you have to add
import Foundation
But a slightly simplified version of your first method compiles in pure Swift, without importing Foundation. It handles leading, trailing, and intermediate whitespace:
func extractIntegers(_ s: String) -> [Int] {
    let splits = s.characters.split(separator: " ").map(String.init)
    return splits.map { Int($0)! }
}

let a = extractIntegers(" 12 234 -567 4 ")
print(a) // [12, 234, -567, 4]
Update for Swift 4 (and simplified):
func extractIntegers(_ s: String) -> [Int] {
    return s.split(separator: " ").compactMap { Int($0) }
}
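A quick check shows the Swift 4 version behaves the same on the earlier test input:
extractIntegers(" 12 234 -567 4 ") // [12, 234, -567, 4]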

Guard Statement Parameter Error [duplicate]

How do you get the length of a String? For example, I have a variable defined like:
var test1: String = "Scott"
However, I can't seem to find a length method on the string.
As of Swift 4+
It's just:
test1.count
for reasons.
(Thanks to Martin R)
As of Swift 2:
With Swift 2, Apple has changed global functions to protocol extensions, extensions that match any type conforming to a protocol. Thus the new syntax is:
test1.characters.count
(Thanks to JohnDifool for the heads up)
As of Swift 1
Use the count characters method:
let unusualMenagerie = "Koala 🐨, Snail 🐌, Penguin 🐧, Dromedary 🐪"
println("unusualMenagerie has \(count(unusualMenagerie)) characters")
// prints "unusualMenagerie has 40 characters"
right from the Apple Swift Guide
(note, for versions of Swift earlier than 1.2, this would be countElements(unusualMenagerie) instead)
for your variable, it would be
length = count(test1) // was countElements in earlier versions of Swift
Or you can use test1.utf16count
TLDR:
For Swift 2.0 and 3.0, use test1.characters.count. But, there are a few things you should know. So, read on.
Counting characters in Swift
Before Swift 2.0, count was a global function. As of Swift 2.0, it can be called as a member function.
test1.characters.count
It will return the actual number of Unicode characters in a String, so it's the most correct alternative in the sense that, if you'd print the string and count characters by hand, you'd get the same result.
However, because of the way Strings are implemented in Swift, characters don't always take up the same amount of memory, so be aware that this behaves quite differently than the usual character count methods in other languages.
For example, you can also use test1.utf16.count
But, as noted below, the returned value is not guaranteed to be the same as that of calling count on characters.
From the language reference:
Extended grapheme clusters can be composed of one or more Unicode
scalars. This means that different characters—and different
representations of the same character—can require different amounts of
memory to store. Because of this, characters in Swift do not each take
up the same amount of memory within a string’s representation. As a
result, the number of characters in a string cannot be calculated
without iterating through the string to determine its extended
grapheme cluster boundaries. If you are working with particularly long
string values, be aware that the characters property must iterate over
the Unicode scalars in the entire string in order to determine the
characters for that string.
The count of the characters returned by the characters property is not
always the same as the length property of an NSString that contains
the same characters. The length of an NSString is based on the number
of 16-bit code units within the string’s UTF-16 representation and not
the number of Unicode extended grapheme clusters within the string.
An example that perfectly illustrates the situation described above is that of checking the length of a string containing a single emoji character, as pointed out by n00neimp0rtant in the comments.
var emoji = "👍"
emoji.characters.count //returns 1
emoji.utf16.count //returns 2
Swift 1.2 Update: There's no longer a countElements for counting the size of collections. Just use the count function as a replacement: count("Swift")
Swift 2.0, 3.0 and 3.1:
let strLength = string.characters.count
Swift 4.2 (4.0 onwards): [Apple Documentation - Strings]
let strLength = string.count
Swift 1.1
extension String {
    var length: Int { return countElements(self) }
}
Swift 1.2
extension String {
    var length: Int { return count(self) }
}
Swift 2.0
extension String {
    var length: Int { return characters.count }
}
Swift 4.2
extension String {
    var length: Int { return self.count }
}
let str = "Hello"
let count = str.length // returns 5 (Int)
Swift 4
"string".count
;)
Swift 3
extension String {
    var length: Int {
        return self.characters.count
    }
}
usage
"string".length
If you are just trying to see if a string is empty or not (checking for length of 0), Swift offers a simple boolean test method on String
myString.isEmpty
The other side of this coin was people asking in ObjectiveC how to ask if a string was empty where the answer was to check for a length of 0:
NSString is empty
Swift 5.1, 5
let flag = "🇵🇷"
print(flag.count)
// Prints "1" -- Counts the characters and emoji as length 1
print(flag.unicodeScalars.count)
// Prints "2" -- Counts the unicode lenght ex. "A" is 65
print(flag.utf16.count)
// Prints "4"
print(flag.utf8.count)
// Prints "8"
tl;dr If you want the length of a String type in terms of the number of human-readable characters, use countElements(). If you want to know the length in terms of the number of extended grapheme clusters, use endIndex. Read on for details.
The String type is implemented as an ordered collection (i.e., sequence) of Unicode characters, and it conforms to the CollectionType protocol, which conforms to the _CollectionType protocol, which is the input type expected by countElements(). Therefore, countElements() can be called, passing a String type, and it will return the count of characters.
However, in conforming to CollectionType, which in turn conforms to _CollectionType, String also implements the startIndex and endIndex computed properties, which actually represent the position of the index before the first character cluster, and position of the index after the last character cluster, respectively. So, in the string "ABC", the position of the index before A is 0 and after C is 3. Therefore, endIndex = 3, which is also the length of the string.
So, endIndex can be used to get the length of any String type, then, right?
Well, not always...Unicode characters are actually extended grapheme clusters, which are sequences of one or more Unicode scalars combined to create a single human-readable character.
let circledStar: Character = "\u{2606}\u{20DD}" // ☆⃝
circledStar is a single character made up of U+2606 (a white star), and U+20DD (a combining enclosing circle). Let's create a String from circledStar and compare the results of countElements() and endIndex.
let circledStarString = "\(circledStar)"
countElements(circledStarString) // 1
circledStarString.endIndex // 2
In Swift 2.0 count doesn't work anymore. You can use this instead:
var testString = "Scott"
var length = testString.characters.count
Here's something shorter, and more natural than using a global function:
aString.utf16count
I don't know if it's available in beta 1, though. But it's definitely there in beta 2.
Updated for Xcode 6 beta 4, change method utf16count --> utf16Count
var test1: String = "Scott"
var length = test1.utf16Count
Or
var test1: String = "Scott"
var length = test1.lengthOfBytesUsingEncoding(NSUTF16StringEncoding)
As of Swift 1.2 utf16Count has been removed. You should now use the global count() function and pass the UTF16 view of the string. Example below...
let string = "Some string"
count(string.utf16)
For Xcode 7.3 and Swift 2.2.
let str = "🐶"
If you want the number of visual characters:
str.characters.count
If you want the "16-bit code units within the string’s UTF-16 representation":
str.utf16.count
Most of the time, 1 is what you need.
When would you need 2? I've found a use case for 2:
let regex = try! NSRegularExpression(pattern: "🐶",
    options: NSRegularExpressionOptions.UseUnixLineSeparators)
let str = "🐶🐶🐶🐶🐶🐶"
let result = regex.stringByReplacingMatchesInString(str,
    options: NSMatchingOptions.WithTransparentBounds,
    range: NSMakeRange(0, str.utf16.count), withTemplate: "dog")
print(result) // dogdogdogdogdogdog
If you use 1, the result is incorrect:
let result = regex.stringByReplacingMatchesInString(str,
    options: NSMatchingOptions.WithTransparentBounds,
    range: NSMakeRange(0, str.characters.count), withTemplate: "dog")
print(result) // dogdogdog🐶🐶🐶
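As a side note (not part of the original answer): on Swift 4 and later, Foundation exposes an NSRange initializer that takes a Range of String.Index plus the source string, so you don't have to count UTF-16 units by hand. A rough sketch of the same replacement in that style (the pattern options are omitted here for brevity):
import Foundation

let regex = try! NSRegularExpression(pattern: "🐶", options: [])
let str = "🐶🐶🐶🐶🐶🐶"
let result = regex.stringByReplacingMatches(in: str,
    options: [],
    range: NSRange(str.startIndex..., in: str),   // whole string, expressed in UTF-16 units
    withTemplate: "dog")
print(result) // dogdogdogdogdogdog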
You could try it like this:
var test1: String = "Scott"
var length = test1.bridgeToObjectiveC().length
in Swift 2.x the following is how to find the length of a string
let findLength = "This is a string of text"
findLength.characters.count
returns 24
Swift 2.0:
Get a count: yourString.characters.count (or yourTextView.text.characters.count when reading from a text view)
Fun example of how this is useful would be to show a character countdown from some number (150 for example) in a UITextView:
func textViewDidChange(textView: UITextView) {
    yourStringLabel.text = String(150 - yourStringTextView.text.characters.count)
}
In Swift 4 I had always used string.count, until I found that
string.endIndex.encodedOffset
can be a faster substitution: for a 50,000-character string it is about 6 times faster than .count, because .count scales with the string's length while .endIndex.encodedOffset does not.
But there is one caveat: it gives the wrong result for strings containing emoji, so only .count is always correct.
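To see the emoji caveat concretely (encodedOffset is an offset in UTF-16 code units, so characters that need several code units inflate it):
let thumbsUp = "👍🏽"
thumbsUp.count                  // 1
thumbsUp.endIndex.encodedOffset // 4 -- two Unicode scalars, four UTF-16 code units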
In Swift 4 :
If the string contains only ASCII characters, then count is all you need:
let str : String = "abcd"
let count = str.count // output 4
If the string contains non-ASCII characters, note that count and utf8.count differ:
let spain = "España"
let count1 = spain.count // output 6
let count2 = spain.utf8.count // output 7
In Xcode 6.1.1
extension String {
    var length: Int { return self.utf16Count }
}
I think that brainiacs will change this on every minor version.
Get the string value from your text view or text field:
let textlengthstring = (yourtextview?.text)! as String
Find the count of the characters in the string:
let numberOfChars = textlengthstring.characters.count
Here is what I ended up doing
let replacementTextAsDecimal = Double(string)
if string.characters.count > 0 &&
    replacementTextAsDecimal == nil &&
    replacementTextHasDecimalSeparator == nil {
    return false
}
Swift 4 update, compared with Swift 3
Swift 4 removes the need for a characters array on String. This means that you can directly call count on a string without getting characters array first.
"hello".count // 5
Whereas in Swift 3, you have to get the characters array and then count the elements in that array. Note that the following method is still available in Swift 4.0, since you can still call characters to access the characters array of a given string:
"hello".characters.count // 5
Swift 4.0 also adopts Unicode 9, so it can now interpret grapheme clusters correctly. For example, counting an emoji will give you 1, while in Swift 3.0 you may get counts greater than 1.
"👍🏽".count // Swift 4.0 prints 1, Swift 3.0 prints 2
"👨‍❤️‍💋‍👨".count // Swift 4.0 prints 1, Swift 3.0 prints 4
Swift 4
let str = "Your name"
str.count
Remember: spaces in the string are also counted.
You can get the length simply by writing an extension:
extension String {
    // MARK: Use if it's Swift 2
    func stringLength(str: String) -> Int {
        return str.characters.count
    }

    // MARK: Use if it's Swift 3
    func stringLength(_ str: String) -> Int {
        return str.characters.count
    }

    // MARK: Use if it's Swift 4
    func stringLength(_ str: String) -> Int {
        return str.count
    }
}
Best way to count String in Swift is this:
var str = "Hello World"
var length = count(str.utf16)
String and NSString are toll-free bridged, so you can use the methods available on NSString with a Swift String:
let x = "test" as NSString
let y: NSString = "string 2"
let lenx = x.length
let leny = y.length
test1.characters.count
will get you the number of letters/numbers etc in your string.
ex:
test1 = "StackOverflow"
print(test1.characters.count)
(prints "13")
Apple made this different from other major languages. The current way is to call:
test1.characters.count
However, be careful: "length" here means the count of characters, not the count of bytes, and the two can differ when you use non-ASCII characters.
For example;
"你好啊hi".characters.count will give you 5 but this is not the count of the bytes.
To get the real count of bytes, you need to do "你好啊hi".lengthOfBytes(using: String.Encoding.utf8). This will give you 11.
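Putting the two side by side (a small sketch of the numbers quoted above; lengthOfBytes(using:) needs Foundation):
import Foundation

let s = "你好啊hi"
s.characters.count            // 5 characters
s.lengthOfBytes(using: .utf8) // 11 bytes: three 3-byte CJK characters plus "h" and "i"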
Right now (in Swift 2.3) if you use:
myString.characters.count
the method will return a "Distance" type, if you need the method to return an Integer you should type cast like so:
var count = myString.characters.count as Int
My two cents for Swift 3/4, if you need to conditionally compile:
#if swift(>=4.0)
let len = text.count
#else
let len = text.characters.count
#endif

swift, optional unwrapping, reversing if condition

Let's say I have a function which returns an optional: nil on error, a value on success:
func foo() -> Bar? { ... }
I can use following code to work with this function:
let fooResultOpt = foo()
if let fooResult = fooResultOpt {
    // continue correct operations here
} else {
    // handle error
}
However there are few problems with this approach for any non-trivial code:
Error handling happens at the end, and it's easy to miss something. It is much better when the error-handling code directly follows the function call.
The correct-operations code is indented by one level. If we have another function to call, we have to indent one more time.
With C one usually could write something like this:
Bar *fooResult = foo();
if (fooResult == NULL) {
    // handle error and return
}
// continue correct operations here
I found two ways to achieve similar code style with Swift, but I don't like either.
let fooResultOpt = foo()
if fooResultOpt == nil {
    // handle error and return
}
// use fooResultOpt! from here
let fooResult = fooResultOpt! // or define another variable
If I write "!" everywhere, it just looks bad to my taste. I could introduce another variable, but that doesn't look good either. Ideally, I would like to see the following:
if !let fooResult = foo() {
    // handle error and return
}
// fooResult has Bar type and can be used in the top level
Did I miss something in the specification or is there some another way to write good looking Swift code?
Your assumptions are correct—there isn't a "negated if-let" syntax in Swift.
I suspect one reason for that might be grammar integrity. Throughout Swift (and commonly in other C-inspired languages), if you have a statement that can bind local symbols (i.e. name new variables and give them values) and that can have a block body (e.g. if, while, for), those bindings are scoped to said block. Letting a block statement bind symbols to its enclosing scope instead would be inconsistent.
It's still a reasonable thing to think about, though — I'd recommend filing a bug and seeing what Apple does about it.
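For what it's worth, Swift 2.0 later added the guard statement, which does bind the unwrapped value into the enclosing scope and forces an early exit from its else branch; that is essentially the C-style shape the question asks for. A minimal sketch using the question's foo() (the surrounding function name doWork is just a placeholder):
func doWork() {
    guard let fooResult = foo() else {
        // handle error and return
        return
    }
    // continue correct operations here; fooResult is a non-optional Bar
}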
This is what pattern matching is all about, and is the tool meant for this job:
let x: String? = "Yes"
switch x {
case .Some(let value):
println("I have a value: \(value)")
case .None:
println("I'm empty")
}
The if-let form is just a convenience for when you don't need both legs.
If what you are writing is a set of functions performing the same sequence of transformations, such as when processing a result returned by a REST call (check that the response is not nil, check the status, check for an app/server error, parse the response, etc.), what I would do is create a pipeline that transforms the input data at each step and at the end returns either nil or a transformed result of a certain type.
I chose the >>> custom operator, that visually indicates the data flow, but of course feel free to choose your own:
infix operator >>> { associativity left }

func >>> <T, V> (params: T?, next: T -> V?) -> V? {
    if let params = params {
        return next(params)
    }
    return nil
}
The operator is a function that receives as input a value of a certain type, and a closure that transforms the value into a value of another type. If the value is not nil, the function invokes the closure, passing the value, and returns its return value. If the value is nil, then the operator returns nil.
An example is probably needed, so let's suppose I have an array of integers, and I want to perform the following operations in sequence:
sum all elements of the array
square it (raise it to the power of 2)
divide by 5 and return the integer part and the remainder
sum the above 2 numbers together
These are the 4 functions:
func sumArray(array: [Int]?) -> Int? {
    if let array = array {
        return array.reduce(0, combine: +)
    }
    return nil
}

func powerOf2(num: Int?) -> Int? {
    if let num = num {
        return num * num
    }
    return nil
}

func module5(num: Int?) -> (Int, Int)? {
    if let num = num {
        return (num / 5, num % 5)
    }
    return nil
}

func sum(params: (num1: Int, num2: Int)?) -> Int? {
    if let params = params {
        return params.num1 + params.num2
    }
    return nil
}
and this is how I would use them:
let res: Int? = [1, 2, 3] >>> sumArray >>> powerOf2 >>> module5 >>> sum
The result of this expression is either nil or a value of the type as defined in the last function of the pipeline, which in the above example is an Int.
If you need to do better error handling, you can define an enum like this:
enum Result<T> {
    case Value(T)
    case Error(MyErrorType)
}
and replace all optionals in the above functions with Result<T>, returning Result.Error() instead of nil.
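To make that concrete, here is a rough sketch of the first pipeline step rewritten to return Result<Int> instead of Int? (the MyErrorType definition and its EmptyInput case are invented for this example; the original answer leaves the error type unspecified):
enum MyErrorType {
    case EmptyInput   // invented error case, just for the sketch
}

func sumArray(array: [Int]?) -> Result<Int> {
    if let array = array {
        return .Value(array.reduce(0, combine: +))
    }
    return .Error(.EmptyInput)
}

The >>> operator would then need a matching overload that unwraps .Value and short-circuits on .Error, but the shape of each step stays the same.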
I've found a way that looks better than the alternatives, but it uses a language feature in an unrecommended way.
Example using code from the question:
let fooResult: Bar! = foo()
if fooResult == nil {
    // handle error and return
}
// continue correct operations here
fooResult can be used as a normal variable, with no need for "?" or "!" suffixes.
Apple documentation says:
Implicitly unwrapped optionals are useful when an optional’s value is confirmed to exist immediately after the optional is first defined and can definitely be assumed to exist at every point thereafter. The primary use of implicitly unwrapped optionals in Swift is during class initialization, as described in Unowned References and Implicitly Unwrapped Optional Properties.
How about the following:
func foo(i: Int) -> Int? {
    switch i {
    case 0: return 0
    case 1: return 1
    default: return nil
    }
}

var error: Int {
    println("Error")
    return 99
}

for i in 0...2 {
    var bob: Int = foo(i) ?? error
    println("\(i) produces \(bob)")
}
Results in the following output:
0 produces 0
1 produces 1
Error
2 produces 99