Why does ("0"+"1").toInt()! return 1 as an Int, rather than 0 - swift

Was playing around with the reduce function on collections in Swift.
// The reduce method should return an Int with the value 3141.
let digits = ["3", "1", "4", "1"]
    .reduce(0) { (total: Int, digit: String) in
        return ("\(total)" + digit).toInt()!
    }
The function is giving the correct output, but why does ("0"+"1").toInt()! return 1 as an Int, rather than 0? The string combined to be turned into an Int is "01". I assume this is a String that the function cannot convert to an Int directly. Does it just default to the second character instead?

"0"+"1" == "01". You're doing concatenation not addition. You lose the 0 when you convert to int because it's a leading zero.
Leading zero's are usually dropped as meaningless but in some contexts they actually signal that you're expressing an octal based number. Even if that's the case here it'd still end up evaluating to 1.
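For illustration, here is a minimal sketch of what happens, using the current Int(_:) initializer in place of the old toInt() (that substitution is mine, not from the question):
let concatenated = "0" + "1"   // string concatenation, not addition
print(concatenated)            // "01"
// Parsing drops the meaningless leading zero, so the result is the Int 1.
print(Int(concatenated)!)      // 1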

Flutter int value if first digit is not zero I want to get first value

I want to get the first digit of an int if that digit is not zero. For example:
int x = 123; // result = 1
int y = 234; // result = 2
If the first digit is zero, I want to get the second digit instead. For example:
int a = 023; // result = 2
int b = 098; // result = 9
How can I do this using Dart?
There surely is a more mathematical way, but one way to do this is to convert the int to a String, get the first character, and parse it back to an int:
var numbers = [
  1234,
  567,
  89,
];
for (var number in numbers) {
  var firstNumber = int.parse(number.toString()[0]);
  print(firstNumber);
}
Output:
1
5
8
This will give you the first non-zero digit, assuming you get your number as a String. Otherwise it is pointless, because int value = 0123 is the same as int value = 123.
String yourNumber = "0123";
final firstNonZero = int.tryParse(yourNumber.split('').firstWhere((element) => element != '0', orElse: () => ''));
print(firstNonZero);

How to generate 15 digit random number using Scala

I am new to Scala programming and I want to generate a random number with 15 digits, so can you please share some examples? I have tried the code below to get an alphanumeric string with 10 characters.
var ranstr = s"${(Random.alphanumeric take 10).mkString}"
print("ranstr", ranstr)
You need to pay attention to the return type. You cannot have a 15-digit Int because that type is a 32-bit signed integer, meaning that its maximum value is a little over 2B. Even getting a 10-digit number means you're at best getting a number between 1B and the maximum value of Int.
Other answers go into the details of how to get a 15-digit number using Long. In your comment you mentioned between, but because of the limitation I mentioned before, using Ints will not allow you to go beyond the 9 digits in your example. You can, however, explicitly annotate your numeric literals with a trailing L to make them Long and achieve what you want as follows:
Random.between(100000000000000L, 1000000000000000L)
Notice that the documentation for between says that the last number is exclusive.
If you're interested in generating arbitrarily large numbers, a String might get the job done, as in the following example:
import scala.util.Random
import scala.collection.View
def nonZeroDigit: Char = Random.between(49, 58).toChar
def digit: Char = Random.between(48, 58).toChar
def randomNumber(length: Int): String = {
  require(length > 0, "length must be strictly positive")
  val digits = View(nonZeroDigit) ++ View.fill(length - 1)(digit)
  digits.mkString
}
randomNumber(length = 1)
randomNumber(length = 10)
randomNumber(length = 15)
randomNumber(length = 40)
Notice that when converting an Int to a Char, what you get is the character encoded by that number, which isn't necessarily the same as the digit represented by the Int itself. The numbers you see in the functions come from the ASCII table (odds are it's good enough for what you want to do).
If you really need a numeric type, for arbitrarily large integers you will need to use BigInt. One of its constructors allows you to parse a number from a string, so you can re-use the code above as follows:
import scala.math.BigInt
BigInt(randomNumber(length = 15))
BigInt(randomNumber(length = 40))
You can play around with this code here on Scastie.
Notice that in my example, in order to keep it simple, I'm forcing the first digit of the random number to not be zero. This means that the number 0 itself will never be a possible output. If you want that to be the case if one asks for a 1-digit long number, you're advised to tailor the example to your needs.
A similar approach to Alin's foldLeft, based here on scanLeft, where the intermediate random digits are first collected into a Vector and then concatenated into a BigInt, while ensuring the first random digit (see the initialization value passed to scanLeft) is greater than zero:
import scala.util.Random
import scala.math.BigInt
def randGen(n: Int): BigInt = {
  val xs = (1 to n - 1).scanLeft(Random.nextInt(9) + 1) {
    case (_, _) => Random.nextInt(10)
  }
  BigInt(xs.mkString)
}
Note that Random.nextInt(9) delivers a random value between 0 and 8, thus we add 1 to shift the possible values to the range 1 to 9. Thus,
scala> (1 to 15).map(randGen(_)).foreach(println)
8
34
623
1597
28474
932674
5620336
66758916
186155185
2537294343
55233611616
338190692165
3290592067643
93234908948070
871337364826813
There are a lot of ways to do this.
The most common way is to use Random.nextInt(10) to generate a digit between 0 and 9.
When building a number with a fixed count of digits, you have to make sure the first digit is never 0.
For that I'll use Random.nextInt(9) + 1, which guarantees a number between 1 and 9, a sequence with the other 14 generated digits, and a foldLeft operation with the first digit as the accumulator to build the number:
val number =
  Range(1, 15).map(_ => Random.nextInt(10)).foldLeft[Long](Random.nextInt(9) + 1) {
    (acc, cur_digit) => acc * 10 + cur_digit
  }
Normally, for such big numbers it's better to represent them as a sequence of characters instead of a numeric type, because numbers can easily overflow. But since a 15-digit number fits in a Long and you asked for a number, I used one instead.
In Scala we have scala.util.Random to get a random value (not only numeric). For a numeric value, Random has nextInt(n: Int), which returns a random non-negative number less than n. Read more about Random.
First example:
val random = new Random()
val digits = "0123456789".split("")
var result = ""
for (_ <- 0 until 15) {
  val randomIndex = random.nextInt(digits.length)
  result += digits(randomIndex)
}
println(result)
Here I create an instance of Random and use the digits 0 to 9 to generate a random number string of length 15.
Second example:
val result2 = for (_ <- 0 until 15) yield random.nextInt(10)
println(result2.mkString)
Here I use the yield keyword to get an array of random integers from 0 to 9 and use mkString to combine the array into a string. Read more about yield

Swift 5: String prefix with a maximum UTF-8 length

I have a string that can contain arbitrary Unicode characters and I want to get a prefix of that string whose UTF-8 encoded length is as close as possible to 32 bytes, while still being valid UTF-8 and without changing the characters' meaning (i.e. not cutting off an extended grapheme cluster).
Consider this CORRECT example:
let string = "\u{1F3F4}\u{E0067}\u{E0062}\u{E0073}\u{E0063}\u{E0074}\u{E007F}\u{1F1EA}\u{1F1FA}"
print(string) // 🏴󠁧󠁢󠁳󠁣󠁴󠁿🇪🇺 (flag of Scotland, flag of the EU)
print(string.count) // 2
print(string.utf8.count) // 36
let prefix = string.utf8Prefix(32) // <-- function I want to implement
print(prefix) // 🏴󠁧󠁢󠁳󠁣󠁴󠁿 (flag of Scotland)
print(prefix.count) // 1
print(prefix.utf8.count) // 28
print(string.hasPrefix(prefix)) // true
And this example of a WRONG implementation:
let string = "ar\u{1F3F4}\u{200D}\u{2620}\u{FE0F}\u{1F3F4}\u{200D}\u{2620}\u{FE0F}\u{1F3F4}\u{200D}\u{2620}\u{FE0F}"
print(string) // ar🏴‍☠️🏴‍☠️🏴‍☠️ ("ar" followed by three pirate flags)
print(string.count) // 5
print(string.utf8.count) // 41
let prefix = string.wrongUTF8Prefix(32) // <-- wrong implementation
print(prefix) // ar🏴‍☠️🏴‍☠️🏴 (the last pirate flag is cut off)
print(prefix.count) // 5
print(prefix.utf8.count) // 32
print(string.hasPrefix(prefix)) // false
What's an elegant way to do this? (besides trial&error)
You've shown no attempt at a solution, and SO doesn't normally write code for you. So instead here are some algorithm suggestions for you:
What's an elegant way to do this? (besides trial&error)
By what definition of elegant? (like beauty it depends on the eye of the beholder...)
Simple?
Start with String.makeIterator, write a while loop, and append Characters to your prefix as long as the byte count stays ≤ 32.
It's a very simple loop; the worst case is 32 iterations and 32 appends.
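For illustration, a minimal sketch of that loop (the function name utf8PrefixByIteration is my own, and I use a plain for-in loop rather than makeIterator(); the 28-byte result matches the question's first example):
extension String {
    // Append Characters one at a time while the accumulated UTF-8 length stays within maxLength.
    func utf8PrefixByIteration(_ maxLength: Int) -> String {
        var prefix = ""
        var byteCount = 0
        for character in self {
            let characterBytes = String(character).utf8.count
            if byteCount + characterBytes > maxLength { break }
            prefix.append(character)
            byteCount += characterBytes
        }
        return prefix
    }
}
let string = "\u{1F3F4}\u{E0067}\u{E0062}\u{E0073}\u{E0063}\u{E0074}\u{E007F}\u{1F1EA}\u{1F1FA}"
print(string.utf8PrefixByIteration(32).utf8.count) // 28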
"Smart" Search Strategy?
You could implement a strategy based on the average byte length of each Character in the String, using String.prefix(Int).
E.g. for your first example the character count is 2 and the byte count is 36, giving an average of 18 bytes/character. 18 goes into 32 just once (we don't deal in fractional characters or bytes!), so start with prefix(1), which has a byte count of 28 and leaves 1 character and 8 bytes. The remainder therefore has an average byte length of 8, and you are seeking at most 4 more bytes; 8 goes into 4 zero times, so you are done.
The above example shows the case of extending (or not) your prefix guess. If your prefix guess is too long you can just start your algorithm from scratch using the prefix character & byte counts rather than the original string's.
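A rough sketch of that idea follows; the name utf8PrefixByEstimate is mine, and instead of restarting the estimate from scratch it simply corrects the initial guess one Character at a time, so treat it as an illustration rather than the suggested algorithm verbatim:
extension String {
    // Guess a Character count from the average UTF-8 bytes per Character, then adjust.
    func utf8PrefixByEstimate(_ maxLength: Int) -> Substring {
        guard self.utf8.count > maxLength else { return Substring(self) }
        let averageBytes = Swift.max(1, self.utf8.count / Swift.max(1, self.count))
        var prefix = self.prefix(maxLength / averageBytes) // initial guess
        // Guess too long? Drop Characters until it fits.
        while prefix.utf8.count > maxLength {
            prefix = prefix.dropLast()
        }
        // Guess too short? Append further Characters while they still fit.
        var end = prefix.endIndex
        while end < self.endIndex,
              prefix.utf8.count + String(self[end]).utf8.count <= maxLength {
            end = self.index(after: end)
            prefix = self[..<end]
        }
        return prefix
    }
}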
If you have trouble implementing your algorithm ask a new question showing the code you've written, describe the issue, and someone will undoubtedly help you with the next step.
HTH
I discovered that String and String.UTF8View share the same indices, so I managed to create a very simple (and efficient?) solution, I think:
extension String {
    func utf8Prefix(_ maxLength: Int) -> Substring {
        if self.utf8.count <= maxLength {
            return Substring(self)
        }
        var index = self.utf8.index(self.startIndex, offsetBy: maxLength + 1)
        self.formIndex(before: &index)
        return self.prefix(upTo: index)
    }
}
Explanation (assuming maxLength == 32 and startIndex == 0):
The first case (utf8.count <= maxLength) should be clear, that's where no work is needed.
For the second case we first get the utf8-index 33, which is either
A: the endIndex of the string (if it's exactly 33 bytes long),
B: an index at the start of a character (after 33 bytes of previous characters)
C: an index somewhere in the middle of a character (after <33 bytes of previous characters)
So if we now move our index back one character (with formIndex(before:)), this will jump to the first extended grapheme cluster boundary before index, which in cases A and B is one character before, and in case C the start of that character.
In any case, the utf8-index will now be guaranteed to be at most 32 and at an extended grapheme cluster boundary, so prefix(upTo: index) will safely create a prefix with length ≤ 32.
…but it's not perfect.
In theory this should also always be the optimal solution, i.e. the prefix's UTF-8 count is as close as possible to maxLength. But sometimes, when the string ends with an extended grapheme cluster consisting of more than one Unicode scalar, formIndex(before: &index) goes back one character more than would be necessary, so the prefix ends up shorter. I'm not exactly sure why that's the case.
EDIT: A less elegant but in exchange completely "correct" solution would be this (still only O(n)):
extension String {
    func utf8Prefix(_ maxLength: Int) -> Substring {
        if self.utf8.count <= maxLength {
            return Substring(self)
        }
        let endIndex = self.utf8.index(self.startIndex, offsetBy: maxLength)
        var index = self.startIndex
        while index <= endIndex {
            self.formIndex(after: &index)
        }
        self.formIndex(before: &index)
        return self.prefix(upTo: index)
    }
}
I like the first solution you came up with. I've found it works more correctly (and is simpler) if you take out the formIndex:
extension String {
    func utf8Prefix(_ maxLength: Int) -> Substring {
        if self.utf8.count <= maxLength {
            return Substring(self)
        }
        let index = self.utf8.index(self.startIndex, offsetBy: maxLength)
        return self.prefix(upTo: index)
    }
}
My solution looks like this:
extension String {
    func prefix(maxUTF8Length: Int) -> String {
        if self.utf8.count <= maxUTF8Length { return self }
        var utf8EndIndex = self.utf8.index(self.utf8.startIndex, offsetBy: maxUTF8Length)
        while utf8EndIndex > self.utf8.startIndex {
            if let stringIndex = utf8EndIndex.samePosition(in: self) {
                return String(self[..<stringIndex])
            } else {
                self.utf8.formIndex(before: &utf8EndIndex)
            }
        }
        return ""
    }
}
It takes the highest possible utf8 index, checks if it is a valid character index using the Index.samePosition(in:) method. If not, it reduces the utf8 index one by one until it finds a valid character index.
The advantage is that you could replace utf8 with utf16 and it would also work.

Number validation and formatting

I want to format, in real time, the number entered into a UITextField. Depending on the field, the number may be an integer or a double, may be positive or negative.
Integers are easy (see below).
Doubles should be displayed exactly as the user enters with three possible exceptions:
If the user begins with a decimal separator, or a negative sign followed by a decimal separator, insert a leading zero:
"." becomes "0."
"-." becomes "-0."
Remove any "excess" leading zeros if the user deletes a decimal point:
If the number is "0.00023" and the decimal point is deleted, the number should become "23".
Do not allow a leading zero if the next character is not a decimal separator:
"03" becomes "3".
Long story short, one and only one leading zero, no trailing zeros.
It seemed like the easiest idea was to convert the (already validated) string to a number then use format specifiers. I've scoured:
https://developer.apple.com/library/content/documentation/Cocoa/Conceptual/Strings/Articles/formatSpecifiers.html
and
http://www.cplusplus.com/reference/cstdio/printf/
and others but can't figure out how to format a double so that it does not add a decimal when there are no digits after it, or any trailing zeros. For example:
x = 23.0
print (String(format: "%f", x))
//output is 23.000000
//I want 23
x = 23.45
print (String(format: "%f", x))
//output is 23.450000
//I want 23.45
On How to create a string with format?, I found this gem:
var str = "\(INT_VALUE) , \(FLOAT_VALUE) , \(DOUBLE_VALUE), \(STRING_VALUE)"
print(str)
It works perfectly for integers (which is why I said integers are easy above), but for doubles it appends a ".0" onto the first character the user enters. (It does work perfectly in a Playground, but not in my program. Why?)
Will I have to resort to counting the number of digits before and after the decimal separator and inserting them into a format specifier? (And if so, how do I count those? I know how to create the format specifier.) Or is there a really simple way or a quick fix to use that one-liner above?
Thanks!
Turned out to be simple without using NumberFormatter (which I'm not so sure would really have accomplished what I want without a LOT more work).
let decimalSeparator = NSLocale.current.decimalSeparator! as String
var tempStr: String = textField.text ?? ""
var i: Int = tempStr.count
//remove leading zeros for positive numbers (integer or real)
if i > 1 {
    while (tempStr[0] == "0" && tempStr[1] != decimalSeparator[0]) {
        tempStr.remove(at: tempStr.startIndex)
        i = i - 1
        if i < 2 {
            break
        }
    }
}
//remove leading zeros for negative numbers (integer or real)
if i > 2 {
    while (tempStr[0] == "-" && tempStr[1] == "0") && tempStr[2] != decimalSeparator[0] {
        tempStr.remove(at: tempStr.index(tempStr.startIndex, offsetBy: 1))
        i = i - 1
        if i < 3 {
            break
        }
    }
}
Using the following extension to subscript the string:
extension String {
    subscript (i: Int) -> Character {
        return self[index(startIndex, offsetBy: i)]
    }
}

Left side of mutating operator isn't mutable: 'gpa' is a 'let' constant

I am currently struggling with an error in a homework assignment for my coding class. We are creating a loop that loops through an array of GPA values and adds each one to a variable named totalGradePoints. The problem is that I am coming across an error when the loop runs:
Left side of mutating operator isn't mutable: 'gpa' is a 'let' constant
The error is on this line:
var totalGradePoints = Double()
for gpa in gpaValues {
    let averageGPA: Double = gpa += totalGradePoints
}
Here is my full code:
//: Playground - noun: a place where people can play
import UIKit
// You are the university registrar processing a transfer student's transcripts that contains grades that are a mix of letters and numbers. You need to add them to our system, but first you need to convert the letters into grade points.
// Here's an array of the student's grades.
var transferGrades: [Any] = ["C", 95.2, 85, "D", "A", 93.23, "P", 90, 100]
// To prepare for converting the letters to numerical grades, create a function that returns a double, inside which you create a switch that will convert an A to a 95, B to 85, C to 75, D to 65, P (for passing) to 75. Everything else will be a zero.
func gradeConverter(letterGrade: String) -> Double {
    switch letterGrade {
    case "A":
        return 95
    case "B":
        return 85
    case "C":
        return 75
    case "D":
        return 65
    case "P":
        return 75
    default: // Is this where everything else is zero?
        return 0
    }
}
// Create a new array called convertedGrades that stores doubles.
var convertedGrades: [Double] = [98.75, 75.5, 60.0, 100.0, 82.25, 87.5]
// Loop through the transferGrades array, inspecting each item for type and sending strings (your letter grades) to the function you just made and storing the returned double in your convertedGrades array. If your loop encounters a double, you can place it directly into the new array without converting it. If it encounters an int, you will need to convert it to a double before storing it. Print the array. (You may notice that some of your doubles are stored with many zeros in the decimal places. It's not an error, so you can ignore that for now.)
for grade in transferGrades {
    if let gradeAsString = grade as? String {
        gradeConverter(letterGrade: gradeAsString)
    } else if let gradeAsDouble = grade as? Double {
        transferGrades.append(gradeAsDouble)
    } else if let gradeAsInt = grade as? Int {
        Double(gradeAsInt)
        transferGrades.append(gradeAsInt)
    }
}
print(transferGrades)
// Now that we have an array of numerical grades, we need to calculate the student's GPA. Create a new array called GPAValues that stores doubles.
var gpaValues: [Double] = [2.5, 3.0, 4.0, 3.12, 2.97, 2.27]
// Like with the letter conversion function and switch you created before, make a new function called calculateGPA that takes a double and returns a double. Inside your function, create another switch that does the following conversion. Grades below 60 earn zero grade points, grades in the 60s earn 1, 70s earn 2, 80s earn 3, and 90s and above earn 4.
func calculateGPA(gpaValue: Double) -> Double {
    switch gpaValue {
    case 0..<59:
        return 0
    case 60...69:
        return 1
    case 70...79:
        return 2
    case 80...89:
        return 3
    case 90..<100:
        return 4
    default:
        return 0
    }
}
// Loop through your convertedGrades array and append the grade point value to the GPAValues array. Because your calculateGPA function returns a value, you can use it just like a variable, so rather than calculate the grade points and then put that variable in your append statement, append the actual function. i.e. myArray.append(myFunction(rawValueToBeConverted))
for gpa in gpaValues {
    gpaValues.append(calculateGPA(gpaValue: gpa))
}
// Finally, calculate the average GPA by looping through the GPA and using the += operator to add it to a variable called totalGradePoints. You may need to initialize the variable before using it in the loop. i.e. var initialized = Double()
var totalGradePoints = Double()
for gpa in gpaValues {
    let averageGPA: Double = gpa += totalGradePoints
}
// Count the number of elements in the array (by using the count method, not your fingers) and store that number in a variable called numberOfGrades. Pay attention to creating your variables with the right types. Swift will tell you if you're doing it wrong.
var numberOfGrades: Int = gpaValues.count
// Divide the totalGradePoints by numberOfGrades to store in a variable called transferGPA.
var transferGPA: Double = Double(totalGradePoints) / Double(numberOfGrades)
// Using code, add one numerical grade and one letter grade to the transferGrades array that we started with (i.e. append the values rather than manually writing them into the line at the beginning of this file) and check that your transferGPA value updates. You'll need to append the new grades on the line below the existing transferGrades array so that your changes ripple through the playground.
transferGrades.append(97.56)
transferGrades.append("B")
averageGPA must be defined using the var keyword to make it mutable later on, when summing up the values.
var averageGPA: Double = 0
for gpa in gpaValues {
    averageGPA += gpa
}
averageGPA = averageGPA / Double(gpaValues.count)
Recall that the average is calculated by summing up the scores and dividing by the number of scores.
Defining something with let means that it will be a constant.
let answer: Int = 42
answer = 43 /* Illegal operation. Cannot mutate a constant */
Left side of mutating operator isn't mutable: 'gpa' is a 'let' constant
The problem is that gpa is a constant; you can't modify its value. The += operator means "increase gpa's value by totalGradePoints", so it is trying to mutate gpa. What you probably mean to do is make averageGPA equal the sum of gpa and totalGradePoints. For that you would write:
let averageGPA: Double = gpa + totalGradePoints
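And for the assignment's actual goal (the comment in the question says to accumulate into totalGradePoints with += and then divide), a minimal sketch would be:
var totalGradePoints = Double()
for gpa in gpaValues {
    totalGradePoints += gpa // accumulate into the mutable variable, not into gpa
}
let averageGPA = totalGradePoints / Double(gpaValues.count)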