Swift: how to convert readLine() input " [-5,20,8...] " to an Int array - swift

I already ran a search today and found a similar question, but it did not fully fix my issue. In my case, I want to convert the readLine() input string "[3,-1,6,20,-5,15]" to an Int array [3,-1,6,20,-5,15].
I'm doing an online coding challenge on a website, which requires reading the test case from readLine().
For example, if I input [1,3,-5,7,22,6,85,2] in the console, then I need to convert it to an Int array. After that I can deal with the algorithm part to solve the challenge. Well, I think it is not wise to limit the input to readLine(), but I simply can do nothing about that :(
My code is below. It can only deal with arrays of positive numbers smaller than 10. For the array [1, -3, 22, -6, 5,6,7,8,9] it gives nums as [1, 3, 2, 2, 6, 5, 6, 7, 8, 9], so how can I correctly convert the readLine() input?
print("please give the test array with S length")
if let numsInput = readLine() {
    let nums = numsInput.compactMap { Int(String($0)) }
    print("nums: \(nums)")
}

Here is a one-liner to convert the input into an array of integers. Of course you might want to split this up into separate steps if some validation is needed:
let numbers = input
    .trimmingCharacters(in: .whitespacesAndNewlines)
    .dropFirst()
    .dropLast()
    .split(separator: ",")
    .compactMap { Int($0) }
dropFirst/dropLast can be replaced by a regular-expression replacement:
.replacingOccurrences(of: "[\\[\\]]", with: "", options: .regularExpression)
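A minimal end-to-end sketch of the same pipeline (the input string is hard-coded here in place of readLine(), and each piece is additionally trimmed so that inputs with spaces after the commas also parse):

```swift
import Foundation

// Stands in for readLine() in this sketch
let input = " [3,-1,6,20,-5,15] "

let numbers = input
    .trimmingCharacters(in: .whitespacesAndNewlines)
    .replacingOccurrences(of: "[\\[\\]]", with: "", options: .regularExpression)
    .split(separator: ",")
    .compactMap { Int($0.trimmingCharacters(in: .whitespaces)) }

print(numbers) // [3, -1, 6, 20, -5, 15]
```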

Use the split method to get a sequence of substrings from the input string (dropping the surrounding brackets first):
let nums = numsInput.dropFirst().dropLast().split(separator: ",").compactMap { Int($0) }

Related

Swift 5 split string at integer index

It used to be that you could use substring to get a portion of a string. That has been deprecated in favor of String.Index. But I can't seem to make a string index out of integers.
var str = "hellooo"
let newindex = str.index(after: 3)
str = str[newindex...str.endIndex]
No matter what the string is, I want the second 3 characters, so str would contain "loo". How can I do this?
Drop the first three characters and then take the first three of the remaining characters:
let str = "helloo"
let secondThreeCharacters = String(str.dropFirst(3).prefix(3))
You might add some code to handle the case where there are fewer than 6 characters in the string.
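That guard can be sketched like this (the function name secondThree is mine; dropFirst and prefix themselves never trap, the count check simply rejects strings without a full second group of three):

```swift
// Returns the 4th through 6th characters, or nil for strings shorter than 6
func secondThree(of str: String) -> String? {
    guard str.count >= 6 else { return nil }
    return String(str.dropFirst(3).prefix(3))
}

print(secondThree(of: "helloo") ?? "too short") // loo
print(secondThree(of: "hey") ?? "too short")    // too short
```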

Why does String.subscript(_:) require the types `String.Index` and `Int` to be equal when there is no `Int` involved?

I fail to understand the problem Xcode is confronting me with in this line:
iteration.template = template[iterationSubstring.endIndex...substring.startIndex]
template is a String and iterationSubstring and substring are Substrings of template. Xcode highlights the opening square bracket with the following message:
Subscript 'subscript(_:)' requires the types 'Substring.Index' and 'Int' be equivalent
The error message does not make any sense to me. I try to obtain a Substring by creating a Range<String.Index> with the [template.startIndex...template.endIndex] subscript. How is this related to Int? And why does the same pattern work elsewhere?
Xcode playground code reproducing the problem:
import Foundation
let template = "This is an ordinary string literal."
let firstSubstringStart = template.index(template.startIndex, offsetBy: 5)
let firstSubstringEnd = template.index(template.startIndex, offsetBy: 7)
let firstSubstring = template[firstSubstringStart...firstSubstringEnd]
let secondSubstringStart = template.index(template.startIndex, offsetBy: 10)
let secondSubstringEnd = template.index(template.startIndex, offsetBy: 12)
let secondSubstring = template[secondSubstringStart...secondSubstringEnd]
let part: String = template[firstSubstring.endIndex...secondSubstring.startIndex]
After all I have a template string and two substrings of it. I want to get a String ranging from the end of the first Substring to the start of the second Substring.
The current version of Swift works with the Substring struct, which is a sliced String.
The error seems to be misleading and occurs if you are going to assign a (range-subscripted) Substring to a String variable.
To fix the error, create a String from the Substring:
iteration.template = String(template[iterationSubstring.endIndex...substring.startIndex])
Nevertheless, you are strongly discouraged from creating ranges with indices from different strings (iterationSubstring and substring). Slice the main string instead; the indices are preserved.
The crash in the second (meanwhile deleted) example occurred because the last character of a string is at the index before endIndex; the correct full-range slice is
template[template.startIndex..<template.endIndex]
or shorter
template[template.startIndex...]
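Applied to the playground snippet from the question, the String(...) wrapper makes the last line compile; because Substring indices refer to the shared base string, they can be used to slice template directly. With the offsets from the question, the slice between the two substrings comes out as "an ":

```swift
let template = "This is an ordinary string literal."
let firstSubstring = template[template.index(template.startIndex, offsetBy: 5)...template.index(template.startIndex, offsetBy: 7)]
let secondSubstring = template[template.index(template.startIndex, offsetBy: 10)...template.index(template.startIndex, offsetBy: 12)]

// Wrapping the sliced Substring in String(_:) is what fixes the compiler error
let part: String = String(template[firstSubstring.endIndex...secondSubstring.startIndex])
print(part) // "an "
```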

Swift split strings efficiently

I have a lot of strings like this one:
"substring1:substring2:...:substring9"
So the number of substrings in the string is always 9, and some of the substrings may be empty.
I want to split the string by ":" into an array of strings, and I do it like this:
let separator = Character(":")
let arrayOfStrings = string.split(separator: separator, maxSplits: 8, omittingEmptySubsequences: false).map({ String($0) })
For example, for 13.5k strings it took about 150 ms to convert them into arrays of strings.
Is there any other method that is more efficient in terms of time for this task?
Try this:
let arrayOfStrings = string.components(separatedBy: ":")
This should improve performance as it doesn't use .map(), which isn't really required in your case.
Or
As @Martin R suggested, if you can work with an array of Substring instead, the following should perform better:
let arrayOfSubstrings = string.split(separator: ":", omittingEmptySubsequences: false)
split returns [Substring], which only references the original string's storage instead of allocating new Strings, and should therefore be faster.
Also, .split is a method on String (unlike .components, which is a method on NSString), so there is no bridging conversion, as pointed out by @JeremyP.
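As a quick sanity check that both APIs agree on the 9 fields (empty ones included), a small sketch with a made-up sample line:

```swift
import Foundation

let line = "a::c:d:::g:h:" // 9 fields, some empty

// Foundation bridge: allocates a new String per component
let viaComponents = line.components(separatedBy: ":")

// Stdlib: returns [Substring] sharing storage with `line`
let viaSplit = line.split(separator: ":", omittingEmptySubsequences: false)

print(viaComponents.count)                        // 9
print(viaSplit.map(String.init) == viaComponents) // true
```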

How do I format a string from a string with %# in Swift

I am using Swift 4.2. I am getting extraneous characters when formatting one string (s1) from another string (s0) using the %@ format specifier.
I have searched extensively for details of string formatting but have come up with only partial answers, including the code in the second line below. I need to be able to format s1 so that I can customize output from a Swift process. I ask because I have not found an answer while searching for ways to format a string from a string.
I tried the following three statements:
let s0:[String] = ["abcdef"]
let s1:[String] = [String(format:"%@",s0)]
print(s1)
...
The output is shown below. It may not be clear, here, but there are four leading spaces to the left of the abcdef string.
["(\n abcdef\n)"]
How can I format s1 so it does not include the brackets, the \n escape characters, and the leading spaces?
The issue here is that you are using an array, not a string, in s0.
Subscripting the array will help you:
let s0: [String] = ["abcdef"]
let s1: [String] = [String(format: "%@", s0[0])]
I am getting extraneous characters when formatting one string (s1) from another string (s0) ...
The s0 is not a string. It is an array of strings (i.e. the square brackets of [String] indicate an array; it is the same as saying Array<String>). And your s1 is also an array, but one that has a single element, whose value is the string representation of the entire s0 array of strings. That's obviously not what you intended.
How can I format s1 so it does not include the brackets, the \n escape characters, and the leading spaces?
You’re getting those brackets because s1 is an array. You’re getting the string with the \n and spaces because its first value is the string representation of yet another array, s0.
So, if you’re just trying to format a string, s0, you can do:
let s0: String = "abcdef"
let s1: String = String(format: "It is ‘%@’", s0)
Or, if you really want an array of strings, you can call String(format:) for each using the map function:
let s0: [String] = ["abcdef", "ghijkl"]
let s1: [String] = s0.map { String(format: "It is ‘%@’", $0) }
By the way, in the examples above, I didn't use a string format of just %@, because that doesn't accomplish anything at all, so I assumed you were formatting the string for a reason.
FWIW, we generally don’t use String(format:) very often. Usually we do “string interpolation”, with \( and ):
let s0: String = "abcdef"
let s1: String = "It is ‘\(s0)’"
Get rid of all the unneccessary arrays and let the compiler figure out the types:
let s0 = "abcdef" // a string
let s1 = String(format:"- %@ -",s0) // another string
print(s1) // prints "- abcdef -"
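For the array case, interpolation combines with map in the same way (sample values made up):

```swift
let inputs = ["abcdef", "ghijkl"]
let formatted = inputs.map { "It is ‘\($0)’" }
print(formatted) // ["It is ‘abcdef’", "It is ‘ghijkl’"]
```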

Include a UTF8 character literal in a [UInt8] array or Data

I would like something similar to:
let a = ["v".utf8[0], 1, 2]
The closest I have figured out is:
let a = [0x76, 1, 2]
and
"v".data(using: String.Encoding.utf8)! + [1, 2]
Note: Either [UInt8] or Data is an acceptable type.
String's UTF8View is not indexed by Int but by its own String.UTF8View.Index type; therefore, in order to include the first byte of the UTF-8 sequence of a given string in your array literal, you could use its first property instead:
let a = ["v".utf8.first!, 1, 2] // [118, 1, 2]
If there's more than one byte in the sequence, you can concatenate the UTF-8 bytes with an array literal simply by using the + operator:
let a = "😀".utf8 + [1, 2] // [240, 159, 152, 128, 1, 2]
Also note that your example to concatenate a [UInt8] to a Data could be shortened slightly to:
let a = "v".data(using: .utf8)! + [1, 2] // Data with bytes [0x76, 0x1, 0x2]
There is a specific UInt8 initializer (introduced in Swift 2.2+):
let a = [UInt8(ascii:"v"), 1 ,2]
(Some addendums to the already posted answers, regarding UnicodeScalars in particular.)
In your question you've used the literal "v" as the base instance to be converted to UInt8; we don't really know whether this is a String or e.g. a UnicodeScalar in your actual use case. The accepted answer shows some neat approaches in case you are working with a String instance.
In case you happen to be working with a UnicodeScalar instance (rather than a String), one answer has already mentioned the init(ascii:) initializer of UInt8. You should take care, however, to verify that the UnicodeScalar instance used with this initializer is indeed one that fits within the ASCII character encoding; the majority of UnicodeScalar values will not (which will lead to a runtime exception for this initializer). You may use e.g. the isASCII property of UnicodeScalar to verify this fact prior to using the initializer.
let ucScalar: UnicodeScalar = "z"
var a = [UInt8]()
if ucScalar.isASCII {
    a = [UInt8(ascii: ucScalar), 1, 2]
} else {
    // ... unexpected, but not a runtime error
}
Another approach, in case you'd like to encode a full UnicodeScalar into UInt8 format (even for UnicodeScalars that cannot be single-byte ASCII encoded), is using the encode(_:into:) method of UTF8:
let ucScalar: UnicodeScalar = "z"
var bytes: [UTF8.CodeUnit] = []
UTF8.encode(ucScalar, into: { bytes.append($0) })
bytes += [1, 2]
print(bytes) // [122, 1, 2]
// ...
let ucScalar: UnicodeScalar = "\u{03A3}" // Σ
var bytes: [UTF8.CodeUnit] = []
UTF8.encode(ucScalar, into: { bytes.append($0) })
bytes += [1, 2]
print(bytes) // [206, 163, 1, 2]
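For completeness: the whole UTF-8 view of a String converts directly to [UInt8] via Array(_:), which covers multi-byte sequences too, so the examples from the question can also be written as:

```swift
// Single-byte ASCII case
let a: [UInt8] = Array("v".utf8) + [1, 2]
print(a) // [118, 1, 2]

// Multi-byte case: all four UTF-8 bytes of the emoji are included
let b: [UInt8] = Array("😀".utf8) + [1, 2]
print(b) // [240, 159, 152, 128, 1, 2]
```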