Swift --- Convert Char to Int in a for-in loop

I'd like to convert each char to an Int in a string for-in loop, but it fails.
Here's my code:
let stringImported: String = "12345678"
for char in stringImported {
    print(Int(char))
}
I get this error:
Initializer 'init(_:)' requires that 'String.Element' (aka 'Character') conform to 'BinaryInteger'

There is no Int initializer that takes a Character, but Character has a property called wholeNumberValue which returns an optional Int:
let stringImported = "12345678"
for char in stringImported {
    print(char.wholeNumberValue.map(String.init) ?? "nil")
}
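If you want the digits as values rather than printed output, the same wholeNumberValue property combines naturally with compactMap, which drops any non-digit characters (a minimal sketch):

```swift
let stringImported = "12345678"

// wholeNumberValue returns nil for non-digit characters,
// so compactMap keeps only the valid digits.
let digits = stringImported.compactMap { $0.wholeNumberValue }
print(digits) // [1, 2, 3, 4, 5, 6, 7, 8]
```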

UPDATE: Int cannot be initialized from a Character directly; the character has to be converted to a String first. The code snippet was changed to reflect that.
The conversion to Int is not guaranteed to succeed, so you need to provide a default value:
let stringImported: String = "12345678"
for char in stringImported {
    print("\(Int(String(char)) ?? 0)")
}

Convert the Character to a String first and then convert that to Int, and it will succeed:
let stringImported: String = "12345678"
for char in stringImported {
    print(Int(String(char)))
}
Also check this Stack Overflow question and answer; this question may be a duplicate of:
Convert Character to Integer in Swift
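As a follow-up to the loop above: Int(String(char)) is still failable, so printing it directly yields Optional(1), Optional(2), and so on. A minimal sketch of unwrapping each value with if let:

```swift
let stringImported = "12345678"
var values: [Int] = []

for char in stringImported {
    // Int(String(_:)) returns an optional, so unwrap before using the value
    if let value = Int(String(char)) {
        values.append(value)
    }
}
print(values) // [1, 2, 3, 4, 5, 6, 7, 8]
```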

Related

Converting an Int value to a String for user defaults is raising an error

UserInfoDefault.saveUserType(user: String(self.type))
I have a type value that is an Int, and I have to convert it into a String value to store it in user defaults, but I'm facing an issue:
Cannot invoke initializer for type 'String' with an argument list of type '(Int?)'
Note the question mark: there is no init method for String that takes an optional Int. You need to unwrap it first:
let value = myOptionalInt ?? 0
let stringValue = String(value)
or use if let or guard, depending on what suits your case best.
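A minimal sketch of the guard variant, wrapped in a hypothetical helper so the fallback behavior is explicit (the function name and the empty-string default are assumptions for illustration):

```swift
// Hypothetical helper: returns "" when the optional is nil,
// otherwise the decimal representation of the wrapped Int.
func stringValue(from value: Int?) -> String {
    // guard gives you a non-optional value for the rest of the scope
    guard let value = value else { return "" }
    return String(value)
}

let myOptionalInt: Int? = 42
print(stringValue(from: myOptionalInt)) // "42"
```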

How to convert Int8 to Character?

I need to convert an Int8 to a Character. How can I do it?
I tried using UnicodeScalar:
result += Character(UnicodeScalar(data[i]))
... but got this error:
Cannot invoke initializer for type 'UnicodeScalar' with an argument list of type '(Int8)'
Unicode.Scalar can be initialized only with parameters of certain types, which you can find in the docs.
I would suggest using init(_:), which takes a UInt8; that guarantees the given number is non-negative (required for creating a UnicodeScalar). So convert your Int8 to UInt8, and the initializer receives a parameter of the correct type:
let int: Int8 = data[i]
if let uint = UInt8(exactly: int) {
    let char = Character(UnicodeScalar(uint))
}
Maybe you just need to convert the whole data to a string:
var result = String(data: data, encoding: .ascii)
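Putting the pieces together, a minimal sketch that converts an entire Int8 buffer to a String character by character (the sample bytes are an assumption for illustration; they spell "Hi!" in ASCII):

```swift
let data: [Int8] = [72, 105, 33] // ASCII for "Hi!"

var result = ""
for byte in data {
    // UInt8(exactly:) returns nil for negative values instead of trapping
    if let u = UInt8(exactly: byte) {
        result.append(Character(UnicodeScalar(u)))
    }
}
print(result) // "Hi!"
```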

Why does Swift think it's optional? let input = Int(a)

I have the following simple code. After the last line, Swift gives an error and says var sum = input * 2 should be changed to var sum = input! * 2.
I am not sure why, since I didn't declare the variable a as optional.
Does Swift make input optional? Thanks
var a = "2"
let input = Int(a)
var sum = input * 2
Converting a string to an Int returns an Optional because the conversion could fail.
For example, let result = Int("foo") will return nil, because "foo" is not a valid Int.
What if you did
var a = "This is most definitely not a number and even if it were its too long to fit within an int 123457993849038409238490ff9f-09-0f9-09f dd0d0066646464646464349023849038490328 I'm a teapot"
let input = Int(a)
Do you think that could be converted to an Int? Now do you understand why it's an Optional?
What happens if the String can't be converted to an Int, such as Int("A")? It becomes an optional because the compiler can't know for sure that the String you are passing in can become an Int.
guard let intVal = Int(a) else { return }
or
if let intVal = Int(a) {
    // you have a valid Int here
}
is the way to handle this situation.
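The guard pattern from the answer can be sketched end to end on the asker's example; returning an optional makes the failure case explicit instead of trapping with input!:

```swift
// Returns nil when the string is not a valid integer,
// otherwise twice the parsed value.
func doubled(_ a: String) -> Int? {
    guard let input = Int(a) else { return nil }
    return input * 2
}

print(doubled("2") as Any)   // Optional(4)
print(doubled("foo") as Any) // nil
```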

How can I convert Int32 to Int in Swift?

It should be easy but I can only find the reverse conversion.
How can I convert Int32 to Int in Swift?
Unless the problem is different?
I have a value stored in Core Data and I want to return it as an Int.
Here is the code I am using, which does not work:
func myNumber() -> Int {
    var myUnit: NSManagedObject
    myUnit = self.getObject("EntityName") // This is working.
    return Int(myUnit.valueForKey("theNUMBER")?.intValue!)
}
Am I missing something or isn't this ridiculously easy?
let number1: Int32 = 10
let number2 = Int(number1)
The error is the ? after valueForKey; the Int initializer doesn't accept optionals.
myUnit.valueForKey("theNUMBER")?.intValue! gives you an optional value, and the ! at the end doesn't help it.
Just replace it with this:
return Int(myUnit.valueForKey("theNUMBER")!.intValue)
But you could also do it like this if you want it to be fail-safe:
return myUnit.valueForKey("theNUMBER")?.integerValue ?? 0
And to shorten your function, you can do this:
func myNumber() -> Int {
    let myUnit = self.getObject("EntityName") as! NSManagedObject
    return myUnit.valueForKey("theNUMBER")?.integerValue ?? 0
}
Swift 4.0 produces "Cannot invoke initializer for type 'Int' with an argument list of type '(() -> Int32)'"
let number1: Int32 = 10
let number2 = Int(number1)
Simply do this
Int("\(Int32value)")
I'm unable to understand why Swift is making things difficult.
Sometimes ? makes things twisted; adding ! to the Int32 and then converting it to Int works:
let number1 = someInt32!
let number2 = Int(number1)
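Rather than force-unwrapping or round-tripping through String as in the answers above, the fixed-width integer types provide conversion initializers with different failure behavior. A minimal sketch with illustrative values (not tied to Core Data):

```swift
let wide: Int64 = 5_000_000_000

// Int32(exactly:) returns nil instead of trapping when the value
// does not fit; truncatingIfNeeded keeps only the low 32 bits.
let fits = Int32(exactly: wide)               // nil
let clipped = Int32(truncatingIfNeeded: wide) // 705032704

// Widening Int32 -> Int always succeeds, since Int is at least 32 bits.
let small: Int32 = 10
let asInt = Int(small)
print(fits as Any, clipped, asInt)
```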

String value to UnsafePointer<UInt8> function parameter behavior

I found the following code compiles and works:
func foo(p: UnsafePointer<UInt8>) {
    var p = p
    for p; p.memory != 0; p++ {
        print(String(format: "%2X", p.memory))
    }
}
let str: String = "今日"
foo(str)
This prints E4BB8AE697A5 and that is a valid UTF8 representation of 今日
As far as I know, this is undocumented behavior. From the documentation:
When a function is declared as taking a UnsafePointer argument, it can accept any of the following:
nil, which is passed as a null pointer
An UnsafePointer, UnsafeMutablePointer, or AutoreleasingUnsafeMutablePointer value, which is converted to UnsafePointer if necessary
An in-out expression whose operand is an lvalue of type Type, which is passed as the address of the lvalue
A [Type] value, which is passed as a pointer to the start of the array, and lifetime-extended for the duration of the call
In this case, str is none of them.
Am I missing something?
ADDED:
And it doesn't work if the parameter type is UnsafePointer<UInt16>:
func foo(p: UnsafePointer<UInt16>) {
    var p = p
    for p; p.memory != 0; p++ {
        print(String(format: "%4X", p.memory))
    }
}
let str: String = "今日"
foo(str)
// ^ 'String' is not convertible to 'UnsafePointer<UInt16>'
Even though the internal String representation is UTF-16:
let str = "今日"
var p = UnsafePointer<UInt16>(str._core._baseAddress)
for p; p.memory != 0; p++ {
    print(String(format: "%4X", p.memory)) // prints 4ECA65E5, which is UTF-16 今日
}
This works because of one of the interoperability changes the Swift team has made since the initial launch; you're right that it looks like it hasn't made it into the documentation yet. String works where an UnsafePointer<UInt8> is required so that you can call C functions that expect a const char * parameter without a lot of extra work.
Look at the C function strlen, defined in "shims.h":
size_t strlen(const char *s);
In Swift it comes through as this:
func strlen(s: UnsafePointer<Int8>) -> UInt
Which can be called with a String with no additional work:
let str = "Hi."
strlen(str)
// 3
Look at the revisions on this answer to see how C-string interop has changed over time: https://stackoverflow.com/a/24438698/59541
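In current Swift the same conversion can be made explicit with withCString, which exposes the temporary NUL-terminated UTF-8 buffer the compiler creates for the call (a sketch; note the p.memory/p++ syntax in the question above is Swift 1.x and no longer compiles):

```swift
let str = "今日"

// withCString passes a temporary UnsafePointer<CChar> pointing at the
// string's NUL-terminated UTF-8 bytes, valid only inside the closure.
let utf8Length = str.withCString { cString -> Int in
    var length = 0
    // walk the buffer until the terminating NUL, as strlen would
    while cString[length] != 0 { length += 1 }
    return length
}
print(utf8Length) // 6: each of the two characters is 3 bytes in UTF-8
```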