How to get a specific character at an index of a string in Swift

I am trying to build a Binary to Decimal calculator for the Apple Watch using Swift 4.
The code I am having trouble with is this:
var i = 0
var labelInputInt = 0
let labelOutputString = "10010" // Random number in binary
let reverse = String(labelOutputString.reversed()) // Reversing the original string
while i <= reverse.count {
    let indexOfString = reverse.index(reverse.startIndex, offsetBy: i)
    if reverse[indexOfString] == "1" {
        labelInputInt += 2^i * 1
    }
    i += 1
}
I am using a while loop to compute the index indexOfString and check whether the character of the string reverse at that index is equal to "1".
The problem is that I get a runtime error when the if statement is executed.
The error looks like this:
2  libpthread.so.0 0x00007fc22f163390
3  libswiftCore.so 0x00007fc22afa88a0 _T0s18_fatalErrorMessages5NeverOs12StaticStringV_A2E4fileSu4lines6UInt32V5flagstFTfq4nnddn_n + 96
4  libswiftCore.so 0x00007fc22afb3323
5  libswiftCore.so 0x00007fc22afdf9a2
6  libswiftCore.so 0x00007fc22aedca19 _T0SS9subscripts9CharacterVSS5IndexVcfg + 9
7  libswiftCore.so 0x00007fc22f591294 _T0SS9subscripts9CharacterVSS5IndexVcfg + 74139780
8  swift           0x0000000000f2925f
9  swift           0x0000000000f2d402
10 swift           0x00000000004bf516
11 swift           0x00000000004ae461
12 swift           0x00000000004aa411
13 swift           0x0000000000465424
14 libc.so.6       0x00007fc22d88d830 __libc_start_main + 240
15 swift           0x0000000000462ce9
Stack dump:
0. Program arguments: /home/drkameleon/swift4/usr/bin/swift -frontend -interpret tmp/XfwP0oM7FJ.swift -disable-objc-interop -suppress-warnings -module-name XfwP0oM7FJ
Illegal instruction (core dumped)
So, how can I get a specific character of a String and compare it with another character without getting this crash?

Your approach to getting a specific character from a string is actually correct; there are two other problems in your code:
The index i should run up to, but not including, reverse.count.
This is conveniently done with the "half-open range" operator (..<).
^ is the bitwise-xor operator, not exponentiation. Exponentiation is done with the pow() function, in your case
labelInputInt += Int(pow(2.0, Double(i)))
or with the "shift-left" operator << if the base is 2.
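To see the difference concretely, here is a minimal sketch (with n = 3):
import Foundation // for pow()

let n = 3
let xor = 2 ^ n                       // 1: bitwise XOR of 0b010 and 0b011, not 8
let viaPow = Int(pow(2.0, Double(n))) // 8: floating-point exponentiation
let viaShift = 1 << n                 // 8: shift-left, pure integer arithmetic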
So this would be a working variant:
for i in 0 ..< reverse.count {
    let indexOfString = reverse.index(reverse.startIndex, offsetBy: i)
    if reverse[indexOfString] == "1" {
        labelInputInt += 1 << i
    }
}
But you can simply enumerate the characters of a string in reverse order instead of subscripting (which is also more efficient):
let binaryString = "10010"
var result = 0
for (i, char) in binaryString.reversed().enumerated() {
    if char == "1" {
        result += 1 << i
    }
}
print(result)
Even simpler with forward iteration, no reversed() or << needed:
let binaryString = "10010"
var result = 0
for char in binaryString {
    result = 2 * result
    if char == "1" {
        result += 1
    }
}
print(result)
Which suggests using reduce():
let binaryString = "10010"
let result = binaryString.reduce(0) { 2 * $0 + ($1 == "1" ? 1 : 0) }
print(result)
But why reinvent the wheel? Just use init?(_:radix:) from the Swift standard library (with error-checking for free):
let binaryString = "10010"
if let result = Int(binaryString, radix: 2) {
    print(result)
} else {
    print("invalid input")
}

Related

How to convert 'String.Element' to 'Int'?

var numsInStr = "1abc2x30yz67"
var sum = 0
for i in numsInStr {
    if i.isNumber == true {
        sum += i
    }
}
print(sum)
The problem is in the if statement where I sum the numbers: it returns "Cannot convert value of type 'String.Element' (aka 'Character') to expected argument type 'Int'".
Is it possible to solve this problem in a style like mine? I saw some answers, but they are very short and hard to understand.
Input: 1abc2x30yz67
Output: 100
Your solution is not working because you are adding a Character value to an Int. You first need to convert the character to a String, and then convert that String to an Int for the sum.
Hope this helps you, thanks 😊
var numsInStr = "1abc2x30yz67"
var sum = 0
for i in numsInStr {
    if i.isNumber {
        sum = sum + Int(String(i))! // isNumber was checked above, so the conversion succeeds
    }
}
print(sum)
For the sum of all numbers in your string, the following is a better solution; it yields the sum of 100 as required in your question.
let sum = "1abc2x30yz67"
    .components(separatedBy: .letters)
    .compactMap(Int.init)
    .reduce(0, +)
print(sum) // 100
What's the issue with sum += i? As the error says, i is a Character and sum an Int.
Can you add bananas and apples? It's the same logic here.
So you might want its Int equivalent, with Int(String(i)).
It returns an optional value, because there is no guarantee that the conversion is valid. You check isNumber beforehand, but the line itself doesn't know that. So you can unwrap it softly, or force unwrap if you are sure:
sum += Int(String(i))! //Char -> String -> Int
Because there is a String.init(someChar) and an Int.init(someString), but no Int.init(someChar); that's why there is the double init().
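As a side note (an addition, not part of the original answer): since Swift 5 a Character has a wholeNumberValue property, which avoids the double conversion:
var sum = 0
for i in "1abc2x30yz67" {
    if let digit = i.wholeNumberValue { // nil for non-digit characters
        sum += digit
    }
}
print(sum) // 19: still summing digit by digit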
BUT, keeping your logic, you are iterating character by character...
So, in the end you have:
1 + 2 + 3 + 0 + 6 + 7 (i.e. 19), not 1 + 2 + 30 + 67 (i.e. 100) as expected.
So if you want to iterate, you need to "group" the consecutive digits...
With basic for loops, you can do this (it's a possible solution, maybe not the best one, but a working one):
let numsInStr = "1abc2x30yz67"
var lastWasNumber = false
var intStrings: [String] = []
for aCharacter in numsInStr {
    if aCharacter.isNumber {
        if !lastWasNumber {
            intStrings.append(String(aCharacter))
        } else {
            intStrings[intStrings.count - 1] = intStrings[intStrings.count - 1] + String(aCharacter)
        }
        lastWasNumber = true
    } else {
        lastWasNumber = false
    }
    print("After processing: \(aCharacter) - got: \(intStrings)")
}
print(intStrings)
var sum = 0
for anIntString in intStrings {
    sum += Int(anIntString)!
}
print("Sum: \(sum)")
At your level, never hesitate to add print() calls (but never print just the variable; always add some accompanying text as context, so you know where the output comes from).
The output being:
$>After processing: 1 - got: ["1"]
$>After processing: a - got: ["1"]
$>After processing: b - got: ["1"]
$>After processing: c - got: ["1"]
$>After processing: 2 - got: ["1", "2"]
$>After processing: x - got: ["1", "2"]
$>After processing: 3 - got: ["1", "2", "3"]
$>After processing: 0 - got: ["1", "2", "30"]
$>After processing: y - got: ["1", "2", "30"]
$>After processing: z - got: ["1", "2", "30"]
$>After processing: 6 - got: ["1", "2", "30", "6"]
$>After processing: 7 - got: ["1", "2", "30", "67"]
$>["1", "2", "30", "67"]
$>100
We rely on Int(someString) (and force unwrapping), but sum += Int(anIntString) ?? 0 would be safer: for too-big values, e.g. "a1234567890123456789123456789123456789", Int will not be big enough to hold the number and the conversion will fail. These are edge cases you need to be aware of.
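A quick illustration of that edge case (a sketch; the literal is just an arbitrary over-long digit run):
let tooBig = "1234567890123456789123456789" // 28 digits, exceeds Int64.max
print(Int(tooBig) as Any) // nil: Int(_:) fails on overflow instead of trapping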
With high-level methods, you can use components(separatedBy:) to split the string into runs of digits and runs of letters. Then you can filter() (if needed) or compactMap() to transform to Int where possible, and finally sum with reduce().
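A minimal sketch of that pipeline (assuming, as in the sample input, that everything between numbers is a non-digit):
import Foundation // for CharacterSet

let numsInStr = "1abc2x30yz67"
let total = numsInStr
    .components(separatedBy: CharacterSet.decimalDigits.inverted) // ["1", "", "", "", "2", ...]
    .compactMap(Int.init) // drops the empty pieces, keeps [1, 2, 30, 67]
    .reduce(0, +)
print(total) // 100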
As suggested, another solution without keeping a list of Strings could be:
var sum = 0
var lastWasNumber = false
var currentIntString = ""
for aCharacter in numsInStr {
    if aCharacter.isNumber {
        if !lastWasNumber {
            sum += Int(currentIntString) ?? 0
            currentIntString = String(aCharacter) // starts a new buffer, implicitly resetting it
        } else {
            currentIntString += String(aCharacter)
        }
        lastWasNumber = true
    } else {
        lastWasNumber = false
    }
    print("After processing: \(aCharacter) - got: \(currentIntString) - current sum: \(sum)")
}
sum += Int(currentIntString) ?? 0
print("Sum: \(sum)")
Here, we keep currentIntString as a "buffer".
This could be simplified too by removing lastWasNumber and checking currentIntString instead:
var sum = 0
var currentIntString = ""
for aCharacter in numsInStr {
    if aCharacter.isNumber {
        if currentIntString.isEmpty {
            currentIntString = String(aCharacter)
        } else {
            currentIntString += String(aCharacter)
        }
    } else {
        sum += Int(currentIntString) ?? 0
        currentIntString = ""
    }
    print("After processing: \(aCharacter) - got: \(currentIntString) - current sum: \(sum)")
}
sum += Int(currentIntString) ?? 0
print("Sum: \(sum)")

Splitting a UInt16 into 2 UInt8 bytes and getting the hex string of both in Swift

I need 16383 to be converted to 7F7F, but I can only get it converted to 3fff or 77377.
I can convert 8192 to the hexadecimal string 4000, which is essentially the same thing.
If I use let firstHexa = String(format: "%02X", a), it stops at hexadecimal 3fff for the first number and at hexadecimal 2000 for the second number. Here is my code:
public func intToHexString(_ int: Int16) -> String {
    var encodedHexa: String = ""
    if int >= -8192 && int <= 8191 {
        let int16 = int + 8192
        // convert to two unsigned Int8 bytes
        let a = UInt8(int16 >> 8 & 0x00ff)
        let b = UInt8(int16 & 0x00ff)
        // convert the 2 bytes to hexadecimals
        let first1Hexa = String(a, radix: 8)
        let second2Hexa = String(b, radix: 8)
        let firstHexa = String(format: "%02X", a)
        let secondHexa = String(format: "%02X", b)
        // combine the 2 hexas into 1 string with 4 characters... adding 0 to the beginning if only 1 character.
        if firstHexa.count == 1 {
            let appendedFHexa = "0" + firstHexa
            encodedHexa = appendedFHexa + secondHexa
        } else if secondHexa.count == 1 {
            let appendedSHexa = "0" + secondHexa
            encodedHexa = firstHexa + appendedSHexa
        } else {
            encodedHexa = firstHexa + secondHexa
        }
    }
    return encodedHexa
}
Please help ma'ams and sirs! Thanks.
From your test cases, it seems like your values are 7 bits per byte.
You want 8192 to convert to 4000.
You want 16383 to convert to 7F7F.
Note that:
(0x7f << 7) + 0x7f == 16383
Given that:
let a = UInt8((int16 >> 7) & 0x7f)
let b = UInt8(int16 & 0x7f)
let result = String(format: "%02X%02X", a, b)
This gives:
"4000" for 8192
"7F7F" for 16383
To reverse the process:
let str = "7F7F"
let value = Int(str, radix: 16)!
let result = ((value >> 8) & 0x7f) << 7 + (value & 0x7f)
print(result) // 16383
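Wrapped up as a pair of helpers (a sketch; the names encode7BitHex and decode7BitHex are mine, not from the original answer):
import Foundation // for String(format:)

// Packs a 14-bit value into two 7-bit groups, rendered as 4 hex characters.
func encode7BitHex(_ value: Int16) -> String {
    let a = UInt8((value >> 7) & 0x7f) // high 7 bits
    let b = UInt8(value & 0x7f)        // low 7 bits
    return String(format: "%02X%02X", a, b)
}

// Reverses the encoding; returns nil for non-hex input.
func decode7BitHex(_ str: String) -> Int? {
    guard let raw = Int(str, radix: 16) else { return nil }
    return (((raw >> 8) & 0x7f) << 7) + (raw & 0x7f)
}

print(encode7BitHex(16383))         // "7F7F"
print(decode7BitHex("7F7F") ?? -1)  // 16383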

Why can't I get a negative number using bit manipulation in Swift?

This is a LeetCode question. I wrote 4 answers for different versions of that question; when I tried to use bit manipulation, I got this error. Since no one on LeetCode could answer my question, and I can't find any Swift docs about this, I thought I would ask here.
The task is to get the majority element (> n/2) in a given array. The following code works in other languages like Java, so I think it might be a general question about Swift.
func majorityElement(nums: [Int]) -> Int {
    var bit = Array(count: 32, repeatedValue: 0)
    for num in nums {
        for i in 0..<32 {
            if (num >> (31-i) & 1) == 1 {
                bit[i] += 1
            }
        }
    }
    var ret = 0
    for i in 0..<32 {
        bit[i] = bit[i] > nums.count/2 ? 1 : 0
        ret += bit[i] * (1 << (31-i))
    }
    return ret
}
When the input is [-2147483648], the output is 2147483648, but in Java it successfully outputs the right negative number.
The Swift docs say:
Even on 32-bit platforms, Int can store any value between -2,147,483,648 and 2,147,483,647, and is large enough for many integer ranges.
Well, the maximum is 2,147,483,647, and my output is 1 larger than that number. When I ran pow(2.0, 31.0) in a playground, it showed 2147483648. I got confused. What's wrong with my code, or what did I miss about Swift's Int?
A Java int is a 32-bit integer. The Swift Int is 32-bit or 64-bit
depending on the platform. In particular, it is 64-bit on all OS X
platforms where Swift is available.
Your code handles only the lower 32 bits of the given integers, so that
-2147483648 = 0xffffffff80000000
becomes
2147483648 = 0x0000000080000000
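As a side note (using the current API name, not part of the original answer): reinterpreting the lower 32 bits as an Int32 recovers the sign.
let truncated = 2147483648 // only the low 32 bits carry information here
let signed = Int(Int32(truncatingIfNeeded: truncated)) // sign-extends back to -2147483648
print(signed)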
To solve the problem, you can either change the function to take 32-bit integers as arguments:
func majorityElement(nums: [Int32]) -> Int32 { ... }
or make it work with arbitrarily sized integers by computing the actual size and using that instead of the constant 32:
func majorityElement(nums: [Int]) -> Int {
    let numBits = sizeof(Int) * 8
    var bit = Array(count: numBits, repeatedValue: 0)
    for num in nums {
        for i in 0..<numBits {
            if (num >> (numBits-1-i) & 1) == 1 {
                bit[i] += 1
            }
        }
    }
    var ret = 0
    for i in 0..<numBits {
        bit[i] = bit[i] > nums.count/2 ? 1 : 0
        ret += bit[i] * (1 << (numBits-1-i))
    }
    return ret
}
A more Swifty way would be to use map() and reduce():
func majorityElement(nums: [Int]) -> Int {
    let numBits = sizeof(Int) * 8
    let bitCounts = (0 ..< numBits).map { i in
        nums.reduce(0) { $0 + ($1 >> i) & 1 }
    }
    let major = (0 ..< numBits).reduce(0) {
        $0 | (bitCounts[$1] > nums.count/2 ? 1 << $1 : 0)
    }
    return major
}

Swift 3 for loop with increment

How do I write the following in Swift 3?
for (f = first; f <= last; f += interval) {
    n += 1
}
This is my own attempt:
for _ in 0.stride(to: last, by: interval) {
    n += 1
}
Swift 2.2 → 3.0: Strideable's stride(...) replaced by global stride(...) functions
In Swift 2.2, we can (as you've tried in your own attempt) make use of the blueprinted (and default-implemented) functions stride(through:by:) and stride(to:by:) from the protocol Strideable
/* Swift 2.2: stride example usage */
let from = 0
let to = 10
let through = 10
let by = 1
for _ in from.stride(through, by: by) { } // from ... through (steps: 'by')
for _ in from.stride(to, by: by) { } // from ..< to (steps: 'by')
Whereas in Swift 3.0, these two functions have been removed from Strideable in favour of the global functions stride(from:through:by:) and stride(from:to:by:); hence the equivalent Swift 3.0 version of the above follows as
/* Swift 3.0: stride example usage */
let from = 0
let to = 10
let through = 10
let by = 1
for _ in stride(from: from, through: through, by: by) { }
for _ in stride(from: from, to: to, by: by) { }
In your example you want to use the closed-interval alternative stride(from:through:by:), since the condition in your for loop uses a less-than-or-equal comparison (<=). I.e.
/* example values of your parameters 'first', 'last' and 'interval' */
let first = 0
let last = 10
let interval = 2

var n = 0
for f in stride(from: first, through: last, by: interval) {
    print(f)
    n += 1
} // 0 2 4 6 8 10
print(n) // 6
Naturally, we use your for loop only as an example of the passage from a for loop to stride; for your specific example you can just compute n without a loop: n = 1 + (last - first) / interval.
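A quick check of that closed form, using the example values above:
let first = 0, last = 10, interval = 2
let n = 1 + (last - first) / interval
print(n) // 6, matching the loop count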
Swift 3.0: an alternative to stride for more complex increment logic
With the implementation of evolution proposal SE-0094, Swift 3.0 introduced the global sequence functions:
sequence(first:next:),
sequence(state:next:),
which can be an appropriate alternative to stride for cases with a more complex iterate increment relation (which is not the case in this example).
Declaration(s)
func sequence<T>(first: T, next: @escaping (T) -> T?) -> UnfoldSequence<T, (T?, Bool)>

func sequence<T, State>(state: State, next: @escaping (inout State) -> T?) -> UnfoldSequence<T, State>
We'll briefly look at the first of these two functions. The next argument takes a closure that applies some logic to lazily construct the next sequence element given the current one (starting with first). The sequence terminates when next returns nil, or is infinite if next never returns nil.
Applied to the simple constant-stride example above, the sequence method is a bit verbose and overkill w.r.t. the fit-for-this-purpose stride solution:
let first = 0
let last = 10
let interval = 2

var n = 0
for f in sequence(first: first,
                  next: { $0 + interval <= last ? $0 + interval : nil }) {
    print(f)
    n += 1
} // 0 2 4 6 8 10
print(n) // 6
The sequence functions become very useful for cases with a non-constant stride, however, e.g. as in the example covered in the following Q&A:
Express for loops in swift with dynamic range
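For instance, a minimal sketch of a geometric (doubling) stride, which a constant-step stride cannot express:
// Doubling stride: the closure returns nil once the next value would exceed 100.
for value in sequence(first: 1, next: { $0 * 2 <= 100 ? $0 * 2 : nil }) {
    print(value)
} // 1 2 4 8 16 32 64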
Just take care to terminate the sequence with an eventual nil return (otherwise the element generation is "infinite"), or, when Swift 3.1 arrives, make use of its lazy generation in combination with the prefix(while:) method for sequences, as described in evolution proposal SE-0045. The latter, applied to the running example of this answer, makes the sequence approach less verbose, clearly including the termination criterion of the element generation.
/* for Swift 3.1 */
// ... as above
for f in sequence(first: first, next: { $0 + interval })
    .prefix(while: { $0 <= last }) {
    print(f)
    n += 1
} // 0 2 4 6 8 10
print(n) // 6
With Swift 5, you may choose one of the following 5 examples to solve your problem.
#1. Using stride(from:to:by:) function
let first = 0
let last = 10
let interval = 2
let sequence = stride(from: first, to: last, by: interval)
for element in sequence {
    print(element)
}
/*
prints:
0
2
4
6
8
*/
#2. Using sequence(first:next:) function
let first = 0
let last = 10
let interval = 2
let unfoldSequence = sequence(first: first, next: {
    $0 + interval < last ? $0 + interval : nil
})
for element in unfoldSequence {
    print(element)
}
/*
prints:
0
2
4
6
8
*/
#3. Using AnySequence init(_:) initializer
let anySequence = AnySequence<Int>({ () -> AnyIterator<Int> in
    let first = 0
    let last = 10
    let interval = 2
    var value = first
    return AnyIterator<Int> {
        defer { value += interval }
        return value < last ? value : nil
    }
})
for element in anySequence {
    print(element)
}
/*
prints:
0
2
4
6
8
*/
#4. Using CountableRange filter(_:) method
let first = 0
let last = 10
let interval = 2
let range = first ..< last
let lazyCollection = range.lazy.filter({ $0 % interval == 0 })
for element in lazyCollection {
    print(element)
}
/*
prints:
0
2
4
6
8
*/
#5. Using CountableRange compactMap(_:) method
let first = 0
let last = 10
let interval = 2
let range = first ..< last
let lazyCollection = range.lazy.compactMap({ $0 % interval == 0 ? $0 : nil })
for element in lazyCollection {
    print(element)
}
/*
prints:
0
2
4
6
8
*/
Simply, working code for Swift 3.0:
let (first, last, interval) = (0, 100, 1)
var n = 0
for _ in stride(from: first, to: last, by: interval) {
    n += 1
}
We can also use a while loop as an alternative:
var f = first
var n = 0
while f <= last {
    n += 1
    f += interval
}

Generate random number of certain amount of digits

Hi,
I have a very basic question: how can I create a random number with 20 digits, no floats, no negatives (basically an Int), in Swift?
Thanks for all answers XD
Step 1
First of all we need an extension of Int to generate a random number in a range.
extension Int {
    init(_ range: Range<Int>) {
        let delta = range.startIndex < 0 ? abs(range.startIndex) : 0
        let min = UInt32(range.startIndex + delta)
        let max = UInt32(range.endIndex + delta)
        self.init(Int(min + arc4random_uniform(max - min)) - delta)
    }
}
This can be used this way:
Int(0...9) // 4 or 1 or 1...
Int(10...99) // 90 or 33 or 11
Int(100...999) // 200 or 333 or 893
Step 2
Now we need a function that receives the requested number of digits, calculates the range of the random number, and finally invokes the new initializer of Int.
func random(digits: Int) -> Int {
    let min = Int(pow(Double(10), Double(digits-1)))   // smallest value with `digits` digits
    let max = Int(pow(Double(10), Double(digits))) - 1 // largest value with `digits` digits
    return Int(min...max)
}
Test
random(1) // 8
random(2) // 12
random(3) // 829
random(4) // 2374
Swift 5: Simple Solution
func random(digits: Int) -> String {
    var number = "\(Int.random(in: 1...9))" // the leading digit must not be 0
    for _ in 1..<digits {
        number += "\(Int.random(in: 0...9))" // the remaining digits may be 0
    }
    return number
}
print(random(digits: 1)) //3
print(random(digits: 2)) //59
print(random(digits: 3)) //926
Note: it will return the value as a String; if you need an Int value, you can do it like this:
let number = Int(random(digits: 1)) ?? 0
Here is some simple code that should do what you want:
generateRandomNumber(20)

func generateRandomNumber(_ numDigits: Int) -> Int {
    var place = 1
    var finalNumber = 0
    for _ in 0..<numDigits {
        let randomNumber = Int(arc4random_uniform(10))
        finalNumber += randomNumber * place
        place *= 10 // move to the next decimal place after using the current one
    }
    return finalNumber
}
It's pretty simple. You generate 20 random digits and multiply each by the respective ones, tens, hundreds... place it should occupy. This way you are guaranteed a number of the correct size, while the digit used in each place is generated randomly.
Update
As said in the comments, you will most likely get an overflow exception with a number this long, so you'll have to be creative in how you store the number (String, etc...), but I merely wanted to show you a simple way to generate a number with a guaranteed digit length. Also, given the current code, there is a small chance your leading digit could be 0, so you should protect against that as well.
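To make the overflow concern concrete (a quick sketch; the 20-digit literal is arbitrary):
print(Int64.max) // 9223372036854775807: Int64 tops out at 19 digits
print(Int("12345678901234567890") as Any) // nil on 64-bit platforms: 20 digits overflow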
You can create the number as a string, then convert that string to whatever numeric type you require.
func generateRandomDigits(_ digitNumber: Int) -> String {
    var number = ""
    for i in 0..<digitNumber {
        var randomNumber = arc4random_uniform(10)
        while randomNumber == 0 && i == 0 { // re-roll so the leading digit is never 0
            randomNumber = arc4random_uniform(10)
        }
        number += "\(randomNumber)"
    }
    return number
}
print(Int(generateRandomDigits(3))!)
For 20 digits, keep the result as a String: 20-digit values overflow Int, and Double cannot represent all of them exactly.
Here is 18 decimal digits in a UInt64:
(Swift 3)
let sz: UInt32 = 1000000000
let ms: UInt64 = UInt64(arc4random_uniform(sz))
let ls: UInt64 = UInt64(arc4random_uniform(sz))
let digits: UInt64 = ms * UInt64(sz) + ls
print(String(format:"18 digits: %018llu", digits)) // Print with leading 0s.
16 decimal digits with leading digit 1..9 in a UInt64:
let sz: UInt64 = 100000000
let ld: UInt64 = UInt64(arc4random_uniform(9)+1)
let ms: UInt64 = UInt64(arc4random_uniform(UInt32(sz/10)))
let ls: UInt64 = UInt64(arc4random_uniform(UInt32(sz)))
let digits: UInt64 = ld * (sz*sz/10) + (ms * sz) + ls
print(String(format:"16 digits: %llu", digits))
Swift 3
appzyourlifz's answer updated to Swift 3
Step 1:
extension Int {
    init(_ range: Range<Int>) {
        let delta = range.lowerBound < 0 ? abs(range.lowerBound) : 0
        let min = UInt32(range.lowerBound + delta)
        let max = UInt32(range.upperBound + delta)
        self.init(Int(min + arc4random_uniform(max - min)) - delta)
    }
}
Step 2:
func randomNumberWith(digits: Int) -> Int {
    let min = Int(pow(Double(10), Double(digits-1))) // smallest value with `digits` digits
    let max = Int(pow(Double(10), Double(digits)))   // exclusive upper bound
    return Int(Range(uncheckedBounds: (min, max)))
}
Usage:
randomNumberWith(digits:4) // 2271
randomNumberWith(digits:8) // 65273410
Swift 4 version of Unome's answer above, plus:
Guarding it against overflow and 0-digit requests
Adding support for Linux, where the "arc4random*" functions don't exist
On Linux, don't forget to do
#if os(Linux)
srandom(UInt32(time(nil)))
#endif
once before calling random().
/// This function generates a random number of type Int with the given number of digits
///
/// - Parameter digit: the number of digits
/// - Returns: the randomly generated number, or nil for an invalid parameter
func randomNumber(with digit: Int) -> Int? {
    guard 0 < digit, digit < 20 else { // 0-digit numbers don't exist and 20-digit values are too big for Int
        return nil
    }

    /// The final randomly generated Int
    var finalNumber: Int = 0

    for i in 1...digit {
        /// The newly generated digit's contribution, to be added to the final number
        var randomOperator: Int = 0
        repeat {
            #if os(Linux)
            randomOperator = Int(random() % 10) * Int(pow(10.0, Double(i - 1)))
            #else
            randomOperator = Int(arc4random_uniform(10)) * Int(pow(10.0, Double(i - 1)))
            #endif
        } while Double(randomOperator) + Double(finalNumber) > Double(Int.max) // make sure adding the contribution cannot overflow Int
        finalNumber += randomOperator
    }
    return finalNumber
}
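Hypothetical usage of that function (the printed values are illustrative):
if let n = randomNumber(with: 5) {
    print(n) // e.g. 83012; may display fewer digits if the top place draws a 0
}
print(randomNumber(with: 25) as Any) // nil: more digits than Int can hold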