How to convert 'String.Element' to 'Int'? - swift

var numsInStr = "1abc2x30yz67"
var sum = 0
for i in numsInStr {
    if i.isNumber == true {
        sum += i
    }
}
print(sum)
The problem is in the if statement where I sum the numbers. It returns: "Cannot convert value of type 'String.Element' (aka 'Character') to expected argument type 'Int'"
Is it possible to solve this problem while keeping my approach? I saw some answers, but they were very short and confusing to me.
Input: 1abc2x30yz67
Output: 100

Your solution is not working because you are adding a Character value to an Integer value; first you need to convert your Character into a String, then convert that into an Integer for the sum.
Hope this helps you, thanks 😊
var numsInStr = "1abc2x30yz67"
var sum = 0
for i in numsInStr {
    if i.isNumber == true {
        sum = sum + (Int(String(i)) ?? 0) // Int(String(i)) is optional, so unwrap with a default
    }
}
print(sum)
For the sum of all numbers in your string, the following is the best solution; you will get the sum of 100, as required in your question.
let numsInStr = "1abc2x30yz67"
.components(separatedBy: .letters)
.compactMap(Int.init)
.reduce(0, +)
print(numsInStr)

What's the issue with sum += i? As the error says, i is a Character, and sum is an Int.
Can you make addition between bananas & apples? It's the same logic here.
So you might want to get its Int equivalent, with Int(String(i)).
It returns an optional value, because there is no guarantee that i is a valid number. You check isNumber beforehand, but the conversion line itself doesn't know that. So you can unwrap it safely, or, if you are sure, force unwrap:
sum += Int(String(i))! //Char -> String -> Int
Because there is a String.init(someChar) and an Int.init(someString), but no Int.init(someChar); that's why there is the double init().
BUT, keeping your logic, you are iterating character by character...
So, in the end you have:
1 + 2 + 3 + 0 + 6 + 7 (i.e. 19), not 1 + 2 + 30 + 67 (i.e. 100) as expected.
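As a quick check (a sketch; wholeNumberValue is a Swift 5 convenience on Character, not something from the original code):
let digitSum = "1abc2x30yz67".compactMap { $0.wholeNumberValue }.reduce(0, +)
print(digitSum) // 19, not the expected 100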
So if you want to iterate, you need to "group" the consecutive numbers...
With basic for loops, you can do this (it's a possible solution, maybe not the best one, but a working one):
let numsInStr = "1abc2x30yz67"
var lastWasNumber = false
var intStrings: [String] = []
for aCharacter in numsInStr {
    if aCharacter.isNumber {
        if !lastWasNumber {
            intStrings.append(String(aCharacter))
        } else {
            intStrings[intStrings.count - 1] = intStrings[intStrings.count - 1] + String(aCharacter)
        }
        lastWasNumber = true
    } else {
        lastWasNumber = false
    }
    print("After processing: \(aCharacter) - got: \(intStrings)")
}
print(intStrings)
var sum = 0
for anIntString in intStrings {
    sum += Int(anIntString)!
}
print("Sum: \(sum)")
At your level, never hesitate to add print() calls (but never print just the variable; always add some accompanying text for context, so you know where each line of output comes from).
The output being:
$>After processing: 1 - got: ["1"]
$>After processing: a - got: ["1"]
$>After processing: b - got: ["1"]
$>After processing: c - got: ["1"]
$>After processing: 2 - got: ["1", "2"]
$>After processing: x - got: ["1", "2"]
$>After processing: 3 - got: ["1", "2", "3"]
$>After processing: 0 - got: ["1", "2", "30"]
$>After processing: y - got: ["1", "2", "30"]
$>After processing: z - got: ["1", "2", "30"]
$>After processing: 6 - got: ["1", "2", "30", "6"]
$>After processing: 7 - got: ["1", "2", "30", "67"]
$>["1", "2", "30", "67"]
$>100
We rely on Int(someString) (and force unwrapping), but sum += Int(anIntString) ?? 0 would be safer: for too-big values, "a1234567890123456789123456789123456789" for instance, Int is not big enough to hold the number and the conversion fails. That's an edge case you need to be aware of.
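You can see that edge case directly; a quick illustrative sketch (the literals are made up):
// Int(_:) returns nil when the string is not a valid Int,
// including digit runs too large to fit in Int.
print(Int("67"))                                     // Optional(67)
print(Int("1234567890123456789123456789123456789")) // nil: overflows Int
print(Int("abc"))                                    // nil: not numeric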
With high-level methods, you can use components(separatedBy:) to get an array of strings that are only digits or only letters. Then you can filter() (if needed), or compactMap() to transform to Int where possible, then sum (with reduce(into:_:) or reduce(_:_:)).
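A minimal sketch of that pipeline, using reduce(into:_:) on the same input as above:
let total = "1abc2x30yz67"
    .components(separatedBy: .letters) // digit runs, plus empty strings for consecutive letters
    .compactMap(Int.init)              // [1, 2, 30, 67]: empty strings map to nil and are dropped
    .reduce(into: 0) { $0 += $1 }      // 100
print(total)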
As suggested, another solution without keeping a list of String could be:
var sum = 0
var lastWasNumber = false
var currentIntString = ""
for aCharacter in numsInStr {
    if aCharacter.isNumber {
        if !lastWasNumber {
            sum += Int(currentIntString) ?? 0
            currentIntString = String(aCharacter) // Overwrites the buffer, so no explicit reset is needed
        } else {
            currentIntString += String(aCharacter)
        }
        lastWasNumber = true
    } else {
        lastWasNumber = false
    }
    print("After processing: \(aCharacter) - got: \(currentIntString) - current sum: \(sum)")
}
sum += Int(currentIntString) ?? 0
print("Sum: \(sum)")
Here, we keep currentIntString as a "buffer".
This could also be simplified by removing lastWasNumber and checking currentIntString instead:
var sum = 0
var currentIntString = ""
for aCharacter in numsInStr {
    if aCharacter.isNumber {
        if currentIntString.isEmpty {
            currentIntString = String(aCharacter)
        } else {
            currentIntString += String(aCharacter)
        }
    } else {
        sum += Int(currentIntString) ?? 0
        currentIntString = ""
    }
    print("After processing: \(aCharacter) - got: \(currentIntString) - current sum: \(sum)")
}
sum += Int(currentIntString) ?? 0
print("Sum: \(sum)")

Related

Swift - Using stride with an Int Array

I want to add the numbers together and print the sum of every 4 elements. However, I cannot wrap my head around using the stride function; if I am using the wrong approach, please explain a better method.
var numbers = [1,2,3,4,5,6,7,8,9,10,11,12,13]
func addNumbersByStride(){
    var output = Stride...
    //first output = 1+2+3+4 = 10
    //second output = 5+6+7+8 = 26 and so on
    print(output)
}
It seems you would like to use stride ...
let arr = [1,2,3,4,5,6,7,8,9,10,11,12,13]
let by = 4
let i = stride(from: arr.startIndex, to: arr.endIndex, by: by)
var j = i.makeIterator()
while let n = j.next() {
    let e = min(n.advanced(by: by), arr.endIndex)
    let sum = arr[n..<e].reduce(0, +)
    print("sum of arr[\(n)..<\(e)]", sum)
}
prints
sum of arr[0..<4] 10
sum of arr[4..<8] 26
sum of arr[8..<12] 42
sum of arr[12..<13] 13
You can first split the array into chunks, and then add the chunks up:
extension Array {
    // Split array into chunks of n
    func chunked(into size: Int) -> [[Element]] {
        return stride(from: 0, to: count, by: size).map {
            Array(self[$0 ..< Swift.min($0 + size, count)])
        }
    }
}
// add each chunk up:
let results = numbers.chunked(into: 4).map { $0.reduce(0, +) }
If you would like to discard the last sum if the length of the original array is not divisible by 4, you can add an if statement like this:
let results: [Int]
if numbers.count % 4 != 0 {
    results = Array(numbers.chunked(into: 4).map { $0.reduce(0, +) }.dropLast())
} else {
    results = numbers.chunked(into: 4).map { $0.reduce(0, +) }
}
This is quite a basic solution and maybe not so elegant. First, calculate and print the sum of every group of 4 elements:
var sum = 0
var count = 0
for n in stride(from: 4, to: numbers.count, by: 4) {
    sum = 0
    for i in n-4..<n {
        sum += numbers[i]
    }
    count = n
    print(sum)
}
Then calculate the sum of the remaining elements
sum = 0
for n in count..<numbers.count {
    sum += numbers[n]
}
print(sum)

Int() doesn't convert from String to Optional Integer (Swift)

I'm new at programming and started with Swift. The first issue I came along with is the following:
I have 4 variables
var a = "345"
var b = "30.6"
var c = "74hf2"
var d = "5"
I need to compute the sum of the Integers (if a value is not an integer, it turns to nil)
if Int(a) != nil {
    var aNum = Int(ar)!
}
if Int (b) != nil {
    var bNum = Int (b)!
}
and so on..
As far as I understand, the Int() should convert each element into an Optional Integer.
Then I should use forced unwrapping by converting the Int? to Int, and only then can I use it for my purposes. But instead, when I count the sum of my variables, the compiler sums them as Strings.
var sum = aNum + bNum + cNum + dNum
Output:
34530.674hf25
Why didn't my variables, which were declared as strings and then converted into optional integers with Int(), work?
Your code has typos that make it hard to tell what you are actually trying to do:
Assuming your 2nd variable should be b, as below:
var a = "345"
var b = "30.6"
var c = "74hf2"
var d = "5"
// Then you can use code like this:
var sum = 0
if let aVal = Int(a) { sum += aVal }
if let bVal = Int(b) { sum += bVal }
if let cVal = Int(c) { sum += cVal }
if let dVal = Int(d) { sum += dVal }
print(sum)
That prints 350 since only 345 and 5 are valid Int values.
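You can verify each conversion; a quick sketch:
// Int(_:) only succeeds when the whole string is a valid integer.
print(Int("345"))   // Optional(345)
print(Int("30.6"))  // nil: not an integer
print(Int("74hf2")) // nil: contains letters
print(Int("5"))     // Optional(5)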

How to get a specific character from index of a string in swift

I am trying to build a Binary to Decimal calculator for the Apple Watch using Swift 4.
The code I am having trouble is this:
var i = 0
var labelInputInt = 0
let labelOutputString = "10010" // Random number in binary
let reverse = String(labelOutputString.reversed()) // Reversing the original string
while i <= reverse.count {
    let indexOfString = reverse.index(reverse.startIndex, offsetBy: i)
    if reverse[indexOfString] == "1" {
        labelInputInt += 2^i * 1
    }
    i += 1
}
I am using a while loop to get the index indexOfString and check whether the string reverse at that specific index is equal to "1".
The problem is that I get a runtime error when the if statement is executed.
The error looks like this:
2 libpthread.so.0 0x00007fc22f163390
3 libswiftCore.so 0x00007fc22afa88a0 _T0s18_fatalErrorMessages5NeverOs12StaticStringV_A2E4fileSu4lines6UInt32V5flagstFTfq4nnddn_n + 96
4 libswiftCore.so 0x00007fc22afb3323
5 libswiftCore.so 0x00007fc22afdf9a2
6 libswiftCore.so 0x00007fc22aedca19 _T0SS9subscripts9CharacterVSS5IndexVcfg + 9
7 libswiftCore.so 0x00007fc22f591294 _T0SS9subscripts9CharacterVSS5IndexVcfg + 74139780
8 swift 0x0000000000f2925f
9 swift 0x0000000000f2d402
10 swift 0x00000000004bf516
11 swift 0x00000000004ae461
12 swift 0x00000000004aa411
13 swift 0x0000000000465424
14 libc.so.6 0x00007fc22d88d830 __libc_start_main + 240
15 swift 0x0000000000462ce9
Stack dump:
0. Program arguments: /home/drkameleon/swift4/usr/bin/swift -frontend -interpret tmp/XfwP0oM7FJ.swift -disable-objc-interop -suppress-warnings -module-name XfwP0oM7FJ
Illegal instruction (core dumped)
So, how can I get a specific character of a String and compare it with another character without getting this crash?
Your approach to getting a specific character from a string is actually correct; there are two other problems in your code:
The index i should run up to, but not including, reverse.count.
This is conveniently done with the "half-open range" operator (..<).
^ is the bitwise-xor operator, not exponentiation. Exponentiation is done with the pow() function, in your case
labelInputInt += Int(pow(2.0, Double(i)))
or with the "shift-left" operator << if the base is 2.
So this would be a working variant:
for i in 0 ..< reverse.count {
    let indexOfString = reverse.index(reverse.startIndex, offsetBy: i)
    if reverse[indexOfString] == "1" {
        labelInputInt += 1 << i
    }
}
But you can simply enumerate the characters of a string in reverse order instead of subscripting (which is also more efficient):
let binaryString = "10010"
var result = 0
for (i, char) in binaryString.reversed().enumerated() {
    if char == "1" {
        result += 1 << i
    }
}
print(result)
Even simpler with forward iteration, no reversed() or << needed:
let binaryString = "10010"
var result = 0
for char in binaryString {
    result = 2 * result
    if char == "1" {
        result += 1
    }
}
print(result)
Which suggests using reduce():
let binaryString = "10010"
let result = binaryString.reduce(0) { 2 * $0 + ($1 == "1" ? 1 : 0) }
print(result)
But why reinvent the wheel? Just use init?(_:radix:) from the Swift standard library (with error-checking for free):
let binaryString = "10010"
if let result = Int(binaryString, radix: 2) {
    print(result)
} else {
    print("invalid input")
}

for loop over odd numbers in swift

I am trying to solve the task
Using a standard for-in loop add all odd numbers less than or equal to 100 to the oddNumbers array
I tried the following:
var oddNumbers = [Int]()
var numbt = 0
for newNumt in 0..<100 {
    var newNumt = numbt + 1; numbt += 2; oddNumbers.append(newNumt)
}
print(oddNumbers)
This results in:
1,3,5,7,9,...199
My question is: Why does it print numbers above 100 although I specify the range between 0 and <100?
You're making a mistake:
for newNumt in 0..<100 {
    var newNumt = numbt + 1; numbt += 2; oddNumbers.append(newNumt)
}
The variable newNumt defined inside the loop does not affect the variable newNumt declared in the for statement. So the loop collects the first 100 odd numbers (1 through 199), not the odd numbers between 0 and 100.
If you need to use a for loop:
var odds = [Int]()
for number in 0...100 where number % 2 == 1 {
    odds.append(number)
}
Alternatively:
let odds = (0...100).filter { $0 % 2 == 1 }
will filter the odd numbers from an array with items from 0 to 100. For an even better implementation, use the stride function:
let odds = Array(stride(from: 1, to: 100, by: 2))
If you want all the odd numbers between 0 and 100 you can write
let oddNums = (0...100).filter { $0 % 2 == 1 }
or
let oddNums = Array(stride(from: 1, to: 100, by: 2))
Why does it print numbers above 100 although I specify the range between 0 and <100?
Look again at your code:
for newNumt in 0..<100 {
var newNumt = numbt + 1; numbt += 2; oddNumbers.append(newNumt)
}
The newNumt used inside the loop is different from the loop variable; the var newNumt declares a new variable whose scope is the body of the loop, so it gets created and destroyed each time through the loop. Meanwhile, numbt is declared outside the loop, so it keeps being incremented by 2 each time through the loop.
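A minimal sketch of that shadowing (the values here are just for illustration):
for newNumt in 0..<3 {
    var newNumt = 42 // a new variable that shadows the loop constant
    newNumt += 1
    print(newNumt)   // prints 43 three times; the range only controls how many iterations run
}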
I see that this is an old question, but none of the answers specifically address looping over odd numbers, so I'll add another. The stride() function that Luca Angeletti pointed to is the right way to go, but you can use it directly in a for loop like this:
for oddNumber in stride(from: 1, to: 100, by: 2) {
    // your code here
}
stride(from:to:by:) creates a sequence of any strideable type up to, but not including, the to: parameter, in increments of the by: parameter, so in this case oddNumber starts at 1 and includes 3, 5, 7, 9...99. If you want to include the upper limit, there's a stride(from:through:by:) form where the through: parameter is included.
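For example, a small sketch of the inclusive form:
for evenNumber in stride(from: 0, through: 100, by: 2) {
    print(evenNumber) // 0, 2, 4, ..., 100; with to: 100 it would stop at 98
}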
If you want all the odd numbers between 0 and 100 you can write
for i in 1...100 {
    if i % 2 == 1 {
        continue
    }
    print(i - 1)
}
For Swift 4.2
extension Collection {
    func everyOther(_ body: (Element) -> Void) {
        let start = self.startIndex
        let end = self.endIndex
        var iter = start
        while iter != end {
            body(self[iter])
            let next = index(after: iter)
            if next == end { break }
            iter = index(after: next)
        }
    }
}
And then you can use it like this:
import UIKit

class OddsEvent: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        (1...900000).everyOther{ print($0) } // Odds
        (0...100000).everyOther{ print($0) } // Evens
    }
}
This is more efficient than:
let oddNums = (0...100).filter { $0 % 2 == 1 } or
let oddNums = Array(stride(from: 1, to: 100, by: 2))
because it supports larger Collections
Source: https://developer.apple.com/videos/play/wwdc2018/229/

Simple Swift Fibonacci program crashing (Project Euler 2)

I am trying to solve the second problem on Project Euler. The problem is as follows:
Each new term in the Fibonacci sequence is generated by adding the previous two terms. By starting with 1 and 2, the first 10 terms will be:
1, 2, 3, 5, 8, 13, 21, 34, 55, 89, ...
By considering the terms in the Fibonacci sequence whose values do not exceed four million, find the sum of the even-valued terms.
I think I've written a solution, but when I try to run my code it crashes my Swift playground and gives me this error message:
Playground execution aborted: Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)
var prev = 0
var next = 1
var num = 0
var sum = 0
for var i = 1; i < 400; i++ {
    num = prev + next
    if next % 2 == 0 {
        sum += next
    }
    prev = next
    next = num
}
print(sum)
The weird thing is, if I set the counter on my loop to less than 93, it works fine. Explicitly typing the variables as Double does not help. Anyone know what's going on here?
There is nothing weird about this at all. Do you know how large the 400th Fibonacci number is?
176023680645013966468226945392411250770384383304492191886725992896575345044216019675
A Swift Int64 or UInt64 simply cannot handle a number that large. The latter can go up to 18446744073709551615 at max; not even close.
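You can check those limits directly:
print(Int64.max)  // 9223372036854775807
print(UInt64.max) // 18446744073709551615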
If you change your variables to Double it works, but the result will be inaccurate:
var prev : Double = 0
var next : Double = 1
var num : Double = 0
var sum : Double = 0
will yield
2.84812298108489e+83
which is kind of close to the actual value of
1.76e+83
Luckily, you do not need to get values that big. I would recommend writing not a for loop but a while loop that calculates the next Fibonacci number until the break condition is met: whose values do not exceed four million.
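A minimal sketch of that while-loop approach (variable names are my own):
var prev = 1
var next = 2
var evenSum = 0
while next <= 4_000_000 { // stop once terms exceed four million
    if next % 2 == 0 {
        evenSum += next
    }
    (prev, next) = (next, prev + next)
}
print(evenSum) // sum of the even-valued terms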
The Fibonacci numbers become very large quickly. To compute large Fibonacci numbers, you need to implement some kind of BigNum. Here is a version that makes a BigNum, implemented internally as an array of digits. For example, 12345 is represented internally as [1, 2, 3, 4, 5]. This makes it easy to represent arbitrarily large numbers.
Addition is implemented by making the two arrays the same size, then map is used to add the elements, finally the carryAll function restores the array to single digits.
For example 12345 + 67:
[1, 2, 3, 4, 5] + [6, 7] // numbers represented as arrays
[1, 2, 3, 4, 5] + [0, 0, 0, 6, 7] // pad the shorter array with 0's
[1, 2, 3, 10, 12] // add the arrays element-wise
[1, 2, 4, 1, 2] // perform carry operation
Here is the implementation of BigNum. It is also CustomStringConvertible which makes it possible to print the result as a String.
struct BigNum: CustomStringConvertible {
    var arr = [Int]()

    // Return BigNum value as a String so it can be printed
    var description: String { return arr.map(String.init).joined() }

    init(_ arr: [Int]) {
        self.arr = carryAll(arr)
    }

    // Allow BigNum to be initialized with an `Int`
    init(_ i: Int = 0) {
        self.init([i])
    }

    // Perform the carry operation to restore the array to single
    // digits
    func carryAll(_ arr: [Int]) -> [Int] {
        var result = [Int]()
        var carry = 0
        for val in arr.reversed() {
            let total = val + carry
            let digit = total % 10
            carry = total / 10
            result.append(digit)
        }
        while carry > 0 {
            let digit = carry % 10
            carry = carry / 10
            result.append(digit)
        }
        return result.reversed()
    }

    // Enable two BigNums to be added with +
    static func +(_ lhs: BigNum, _ rhs: BigNum) -> BigNum {
        var arr1 = lhs.arr
        var arr2 = rhs.arr
        let diff = arr1.count - arr2.count
        // Pad the arrays to the same length
        if diff < 0 {
            arr1 = Array(repeating: 0, count: -diff) + arr1
        } else if diff > 0 {
            arr2 = Array(repeating: 0, count: diff) + arr2
        }
        return BigNum(zip(arr1, arr2).map { $0 + $1 })
    }
}
// This function is based upon this question:
// https://stackoverflow.com/q/52975875/1630618
func fibonacci(to n: Int) {
    guard n >= 2 else { return }
    var array = [BigNum(0), BigNum(1)]
    for i in 2...n {
        array.append(BigNum())
        array[i] = array[i - 1] + array[i - 2]
        print(array[i])
    }
}

fibonacci(to: 400)
Output:
1
2
3
5
8
...
67235063181538321178464953103361505925388677826679492786974790147181418684399715449
108788617463475645289761992289049744844995705477812699099751202749393926359816304226
176023680645013966468226945392411250770384383304492191886725992896575345044216019675