How to convert ClosedRange<Int> to ClosedRange<Double> in Swift

I have a ClosedRange<Int> passed in as a parameter, and now I need to convert it to a ClosedRange<Double>:
let range: ClosedRange<Int>
init(range: ClosedRange<Int>) {
self.range = range
}
var body: some View {
Slider(value: doubleBinding, in: range.startIndex...range.endIndex, step: 1)
}
The Slider init function takes a ClosedRange<Double> argument. So I got this error:
Cannot convert value of type 'ClosedRange<Int>.Index' to expected argument type 'Double'
So I tried this:
let min = Double(Int(range.startIndex))
let max = Double(Int(range.endIndex))
Slider(value: doubleBinding, in: min...max, step: 1)
But got this error:
Initializer 'init(_:)' requires that 'ClosedRange<Int>.Index' conform to 'BinaryInteger'

You've used the wrong property. It's not startIndex and endIndex. It's lowerBound and upperBound:
Slider(value: doubleBinding,
in: Double(range.lowerBound)...Double(range.upperBound),
step: 1)
You can write a ClosedRange initialiser that conveniently does this:
extension ClosedRange {
init<Other: Comparable>(_ other: ClosedRange<Other>, _ transform: (Other) -> Bound) {
self = transform(other.lowerBound)...transform(other.upperBound)
}
}
Usage:
Slider(value: doubleBinding,
in: ClosedRange(range, Double.init),
step: 1)
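The same generic initialiser also works in the other direction. A quick self-contained check (the variable names here are illustrative):

```swift
extension ClosedRange {
    // Convert a ClosedRange over one Bound type to another via a transform.
    init<Other: Comparable>(_ other: ClosedRange<Other>, _ transform: (Other) -> Bound) {
        self = transform(other.lowerBound)...transform(other.upperBound)
    }
}

let intRange: ClosedRange<Int> = 1...10
let doubleRange = ClosedRange(intRange, Double.init)   // 1.0...10.0
let backToInt = ClosedRange(doubleRange) { Int($0) }   // 1...10
print(doubleRange, backToInt)
```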

Related

'(key: String, value: Any)' is not convertible to '[String : Any]'

I'm trying to refactor this code:
var indices = [String:[Int:Double]]()
apps.forEach { app in indices[app] = [Int:Double]()}
var index = 0
timeSeries.forEach { entry in
entry.apps.forEach{ (arg: (key: String, value: Double)) in
let (app, value) = arg
indices[app]?[index] = value
}
index += 1
}
so I have the signature:
var parameters = timeSeries.map{ entry in entry.apps as [String:Any] }
var indices = getIndices(with: apps, in: parameters) as? [String:[Int:Double]] ?? [String:[Int:Double]]()
and the method:
func getIndices(with source: [String], in entryParameters: [[String:Any]]) -> [String:[Int:Any]] {
var indices = [String:[Int:Any]]()
source.forEach { item in indices[item] = [Int:Any]() }
var index = 0
entryParameters.forEach { (arg: (key: String, value: Any)) in
let (key, value) = arg
indices[key]?[index] = value
index += 1
}
return indices
}
But this (only in the method, not the original, which works fine) gives: '(key: String, value: Any)' is not convertible to '[String : Any]' at the entryParameters line
The reason I must use Any is because the other source is [String:[Int:Bool]]
edit: some more details:
timeSeries is [TimeSeriesEntry]
// this will need to be defined per-model, so in a different file in final project
struct TimeSeriesEntry: Codable, Equatable {
let id: String
let uid: String
let date: Date
let apps: [String:Double]
let locations: [String:Bool]
func changeApp(app: String, value: Double) -> TimeSeriesEntry {
var apps = self.apps
apps[app] = value
return TimeSeriesEntry(id: self.id, uid: self.uid, date: self.date, apps: apps, locations: self.locations)
}
}
notes:
changed the calling signature, thanks impression; the problem remains.
The problem is that every element of entryParameters is a dictionary, so when you call entryParameters.forEach the closure receives a whole dictionary, not a (key, value) pair.
You get (key, value) pairs only when you call forEach on that inner dictionary. So your method should look something like this:
func getIndices(with source: [String], in entryParameters: [[String:Any]]) -> [String:[Int:Any]] {
var indices = [String:[Int:Any]]()
source.forEach { item in indices[item] = [Int:Any]() }
var index = 0
entryParameters.forEach { entry in
entry.forEach {(arg: (key: String, value: Any)) in
let (key, value) = arg
indices[key]?[index] = value
}
index += 1
}
return indices
}
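A quick sanity check of the fixed function, with made-up sample data standing in for the real time-series entries:

```swift
// Repeated from the answer above so this snippet runs on its own.
func getIndices(with source: [String], in entryParameters: [[String: Any]]) -> [String: [Int: Any]] {
    var indices = [String: [Int: Any]]()
    source.forEach { item in indices[item] = [Int: Any]() }
    var index = 0
    entryParameters.forEach { entry in
        // Each element of entryParameters is itself a dictionary,
        // so iterate its (key, value) pairs with an inner forEach.
        entry.forEach { (arg: (key: String, value: Any)) in
            let (key, value) = arg
            indices[key]?[index] = value
        }
        index += 1
    }
    return indices
}

// Hypothetical sample data: two entries mixing Double and Bool values.
let entries: [[String: Any]] = [["a": 1.0, "b": true], ["a": 2.0]]
let result = getIndices(with: ["a", "b"], in: entries)
print(result["a"]!)   // values for "a" at indices 0 and 1
```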
I just tested this briefly in Playground.
var double = [String:[Int:Double]]()
double["Taco"] = [2: 3.2]
func myFunc(double: [String:[Int:Double]]) {
print(double.count) //Prints 1
print(double["Taco"]!) //Prints [2:3.2]
}
func myFunc2(all: [String:Any]) {
print(all.count) //Prints 1
print(all["Taco"]!) //Prints [2:3.2]
}
myFunc(double: double)
myFunc2(all: double as [String:Any])
I have my initial [String:[Int:Double]]() and set double["Taco"] = [2: 3.2] inside it. I can use two different functions: the first takes the original type [String:[Int:Double]], so it is easy to call because the argument already has the matching type. The second, however, takes a dictionary of [String:Any]; to use that method, we must cast the variable with as [String:Any] when calling it, as you can see above.

Cannot conform to RandomAccessCollection due to Stride type

I'm trying to make a collection that wraps another and vends out fixed-size sub-collections as the elements:
struct PartitionedCollection<C: RandomAccessCollection>: BidirectionalCollection {
typealias TargetCollection = C
let collection: C
let wholePartitionCount: C.IndexDistance
let stragglerPartitionCount: C.IndexDistance
let span: C.IndexDistance
init(on c: C, splittingEvery stride: C.IndexDistance) {
let (q, r) = c.count.quotientAndRemainder(dividingBy: stride)
collection = c
wholePartitionCount = q
stragglerPartitionCount = r.signum()
span = stride
}
var startIndex: C.IndexDistance {
return 0
}
var endIndex: C.IndexDistance {
return wholePartitionCount + stragglerPartitionCount
}
subscript(i: C.IndexDistance) -> C.SubSequence {
// If `C` was only a Collection, calls to `index` would be O(n) operations instead of O(1).
let subStartIndex = collection.index(collection.startIndex, offsetBy: i * span)
if let subEndIndex = collection.index(subStartIndex, offsetBy: span, limitedBy: collection.endIndex) {
return collection[subStartIndex ..< subEndIndex]
} else {
return collection[subStartIndex...]
}
}
func index(after i: C.IndexDistance) -> C.IndexDistance {
return i.advanced(by: +1)
}
func index(before i: C.IndexDistance) -> C.IndexDistance {
return i.advanced(by: -1)
}
}
It seemed like I could make this a RandomAccessCollection. I changed the base protocol, and the playground compiler complained that the type doesn't conform to Collection, BidirectionalCollection, or RandomAccessCollection. I get offered Fix-It stubs for the last one. The compiler adds 3 copies of:
var indices: CountableRange<C.IndexDistance>
Before I can erase two of the copies, all of them are flagged with:
Type 'C.IndexDistance.Stride' does not conform to protocol 'SignedInteger'
I keep the error even if I fill out the property:
var indices: CountableRange<C.IndexDistance> {
return startIndex ..< endIndex
}
I thought C.IndexDistance is Int, which is its own Stride and should conform to SignedInteger. What's going on? Can I define a different type for indices? Should I define some other members to get random-access (and which ones and how)?
I tried adding the index(_:offsetBy:) and distance(from:to:) methods; didn't help. I tried changing indices to Range<C.IndexDistance>; didn't help either, it made the compiler disavow the type as BidirectionalCollection and RandomAccessCollection.

Swift struct extension add initializer

I'm trying to add an initializer to Range.
import Foundation
extension Range {
init(_ range: NSRange, in string: String) {
let lower = string.index(string.startIndex, offsetBy: range.location)
let upper = string.index(string.startIndex, offsetBy: NSMaxRange(range))
self.init(uncheckedBounds: (lower: lower, upper: upper))
}
}
But the last line produces a Swift compiler error:
Cannot convert value of type '(lower: String.Index, upper: String.Index)' (aka '(lower: String.CharacterView.Index, upper: String.CharacterView.Index)') to expected argument type '(lower: _, upper: _)'
How do I get it to compile?
The problem is that even though String.Index conforms to the Comparable protocol, you still need to specify which Range type you want to work with, since Range is declared as public struct Range<Bound> where Bound : Comparable {}.
Note: since NSString uses UTF-16, your initial code (and the code in the link you referred to) does not work correctly for characters consisting of more than one UTF-16 code unit. The following is an updated working version for Swift 3.
extension Range where Bound == String.Index {
init(_ range: NSRange, in string: String) {
let lower16 = string.utf16.index(string.utf16.startIndex, offsetBy: range.location)
let upper16 = string.utf16.index(string.utf16.startIndex, offsetBy: NSMaxRange(range))
if let lower = lower16.samePosition(in: string),
let upper = upper16.samePosition(in: string) {
self.init(lower..<upper)
} else {
fatalError("init(range:in:) could not be implemented")
}
}
}
let string = "❄️Let it snow! ☃️"
let range1 = NSRange(location: 0, length: 1)
let r1 = Range<String.Index>(range1, in: string) // ❄️
let range2 = NSRange(location: 1, length: 2)
let r2 = Range<String.Index>(range2, in: string) // fatal error: init(range:in:) could not be implemented
To answer the OP's comment: the problem is that an NSString object encodes a Unicode-compliant text string as a sequence of UTF-16 code units. The Unicode scalar values that make up a string's contents can be up to 21 bits long, and the longer scalar values need two UInt16 values for storage.
Therefore, some characters like ❄️ take up two UInt16 values in NSString but only one Character in String. As you pass an NSRange argument to the initializer, you should expect it to be interpreted in NSString's UTF-16 terms.
In my example, the results for r1 and r2 after you convert string to utf16 are '❄️' and a fatal error. Meanwhile, the results from your original solution are '❄️L' and 'Le', respectively. Hopefully, you see the difference.
If you insist on a solution without converting to UTF-16 yourself, you can take a look at the Swift source code before deciding. In Swift 4, this initializer ships as part of the standard library; the code is as follows.
extension Range where Bound == String.Index {
public init?(_ range: NSRange, in string: String) {
let u = string.utf16
guard range.location != NSNotFound,
let start = u.index(u.startIndex, offsetBy: range.location, limitedBy: u.endIndex),
let end = u.index(u.startIndex, offsetBy: range.location + range.length, limitedBy: u.endIndex),
let lowerBound = String.Index(start, within: string),
let upperBound = String.Index(end, within: string)
else { return nil }
self = lowerBound..<upperBound
}
}
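Since Swift 4 you don't need to define this yourself: Foundation's built-in failable Range(_:in:) initializer behaves the same way as the code above. A quick check:

```swift
import Foundation

let string = "❄️Let it snow! ☃️"

// ❄️ is one Character but two UTF-16 code units (U+2744 + U+FE0F),
// so a length-2 NSRange covers it exactly.
let nsRange = NSRange(location: 0, length: 2)
if let range = Range(nsRange, in: string) {
    print(string[range])   // ❄️
}
```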
You need to constrain your range initializer to where Bound is equal to String.Index, get your NSRange utf16 indexes and find the same position of the string index in your string as follow:
extension Range where Bound == String.Index {
init?(_ range: NSRange, in string: String) {
guard
let start = string.utf16.index(string.utf16.startIndex, offsetBy: range.location, limitedBy: string.utf16.endIndex),
let end = string.utf16.index(string.utf16.startIndex, offsetBy: range.location + range.length, limitedBy: string.utf16.endIndex),
let startIndex = start.samePosition(in: string),
let endIndex = end.samePosition(in: string)
else {
return nil
}
self = startIndex..<endIndex
}
}
The signature for that method requires a "Bound" type (at least in Swift 4).
Since Bound is just an associated type constrained to "Comparable" and String.Index conforms to it, you should just be able to cast it.
extension Range {
init(_ range: NSRange, in string: String) {
let lower : Bound = string.index(string.startIndex, offsetBy: range.location) as! Bound
let upper : Bound = string.index(string.startIndex, offsetBy: NSMaxRange(range)) as! Bound
self.init(uncheckedBounds: (lower: lower, upper: upper))
}
}
https://developer.apple.com/documentation/swift/rangeexpression/2894257-bound

How do I use a formula I have in a Dictionary of formulas in a function

I have a dictionary of formulas (in closures) that I now want to use in a function to calculate some results.
var formulas: [String: (Double, Double) -> Double] = [
"Epley": {(weightLifted, repetitions) -> Double in return weightLifted * (1 + (repetitions)/30)},
"Brzychi": {(weightLifted, repetitions) -> Double in return weightLifted * (36/(37 - repetitions)) }]
Now I'm trying to write a function that will get the correct formula from the dictionary based on the name, calculate the result, and return it.
func calculateOneRepMax(weightLifted: Double, repetitions: Double) -> Double {
if let oneRepMax = formulas["Epley"] { $0, $1 } <-- Errors here because I clearly don't know how to do this part
return oneRepMax
}
var weightlifted = 160
var repetitions = 2
let oneRepMax = Calculator.calculateOneRepMax(weightlifted, repetitions)
Now Xcode is giving me errors like 'Consecutive statements on a line must be separated by a ';' which tells me the syntax I'm trying to use isn't correct.
On a side note, I wasn't sure if I should use a dictionary for this but after a lot of homework I'm confident it's the correct choice considering I need to iterate through it to get the values when I need them and I need to know the number of key/value pairs so I can do things like display their names in a Table View.
I've searched far and wide for answers, read Apple's documentation over and over and I'm really stuck.
Thanks
formulas["Epley"] returns an optional closure which needs to be
unwrapped before you can apply it to the given numbers. There are several options you can choose from:
Optional binding with if let:
func calculateOneRepMax(weightLifted: Double, repetitions: Double) -> Double {
if let formula = formulas["Epley"] {
return formula(weightLifted, repetitions)
} else {
return 0.0 // Some appropriate default value
}
}
This can be shortened with optional chaining and the
nil-coalescing operator ??:
func calculateOneRepMax(weightLifted: Double, repetitions: Double) -> Double {
return formulas["Epley"]?(weightLifted, repetitions) ?? 0.0
}
If a non-existing key should be treated as a fatal error instead
of returning a default value, then guard let would be
appropriate:
func calculateOneRepMax(weightLifted: Double, repetitions: Double) -> Double {
guard let formula = formulas["Epley"] else {
fatalError("Formula not found in dictionary")
}
return formula(weightLifted, repetitions)
}
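Putting the accepted approach together end to end. Note one addition to the original code: the formula name is made a parameter here instead of being hard-coded as "Epley":

```swift
let formulas: [String: (Double, Double) -> Double] = [
    "Epley": { weight, reps in weight * (1 + reps / 30) },
    "Brzychi": { weight, reps in weight * (36 / (37 - reps)) }
]

// Look the formula up by name; fall back to 0.0 for an unknown name.
func calculateOneRepMax(using name: String, weightLifted: Double, repetitions: Double) -> Double {
    return formulas[name]?(weightLifted, repetitions) ?? 0.0
}

let oneRepMax = calculateOneRepMax(using: "Epley", weightLifted: 160, repetitions: 2)
print(oneRepMax)   // ≈ 170.67
```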

Convert String.CharacterView.Index to int [duplicate]

I want to convert the index of a letter contained within a string to an integer value. Attempted to read the header files but I cannot find the type for Index, although it appears to conform to protocol ForwardIndexType with methods (e.g. distanceTo).
var letters = "abcdefg"
let index = letters.characters.indexOf("c")!
// ERROR: Cannot invoke initializer for type 'Int' with an argument list of type '(String.CharacterView.Index)'
let intValue = Int(index) // I want the integer value of the index (e.g. 2)
Any help is appreciated.
edit/update:
Xcode 11 • Swift 5.1 or later
extension StringProtocol {
func distance(of element: Element) -> Int? { firstIndex(of: element)?.distance(in: self) }
func distance<S: StringProtocol>(of string: S) -> Int? { range(of: string)?.lowerBound.distance(in: self) }
}
extension Collection {
func distance(to index: Index) -> Int { distance(from: startIndex, to: index) }
}
extension String.Index {
func distance<S: StringProtocol>(in string: S) -> Int { string.distance(to: self) }
}
Playground testing
let letters = "abcdefg"
let char: Character = "c"
if let distance = letters.distance(of: char) {
print("character \(char) was found at position #\(distance)") // "character c was found at position #2\n"
} else {
print("character \(char) was not found")
}
let string = "cde"
if let distance = letters.distance(of: string) {
print("string \(string) was found at position #\(distance)") // "string cde was found at position #2\n"
} else {
print("string \(string) was not found")
}
Works for Xcode 13 and Swift 5
let myString = "Hello World"
if let i = myString.firstIndex(of: "o") {
let index: Int = myString.distance(from: myString.startIndex, to: i)
print(index) // Prints 4
}
The function func distance(from start: String.Index, to end: String.Index) -> String.IndexDistance returns an IndexDistance which is just a typealias for Int
Swift 4
var str = "abcdefg"
let index = str.index(of: "c")?.encodedOffset // Result: 2
Note: if the String contains the same character multiple times, this will just get the first occurrence from the left
var str = "abcdefgc"
let index = str.index(of: "c")?.encodedOffset // Result: 2
encodedOffset has been deprecated since Swift 4.2.
Deprecation message:
encodedOffset has been deprecated as most common usage is incorrect. Use utf16Offset(in:) to achieve the same behavior.
So we can use utf16Offset(in:) like this:
var str = "abcdefgc"
let index = str.index(of: "c")?.utf16Offset(in: str) // Result: 2
When searching for an index like this
⛔️ guard let index = (positions.firstIndex { position <= $0 }) else {
it is treated as an Array.Index. You have to give the compiler a clue that you want an integer
✅ guard let index: Int = (positions.firstIndex { position <= $0 }) else {
Swift 5
You can convert the string to an array of characters and then use advanced(by:) to convert the index to an integer.
let myString = "Hello World"
if let i = Array(myString).firstIndex(of: "o") {
let index: Int = i.advanced(by: 0)
print(index) // Prints 4
}
To perform string operations based on an index, you cannot use the traditional numeric-index approach, because a Swift String.Index is obtained from the indices property and is not of type Int. Even though a String is a sequence of characters, we still can't read an element by integer subscript.
This is frustrating.
So, to create a new string from every even-indexed character of a string, check the code below.
let mystr = "abcdefghijklmnopqrstuvwxyz"
let mystrArray = Array(mystr)
var resultStrArray: [Character] = []
for (i, c) in mystrArray.enumerated() where i % 2 == 0 {
    resultStrArray.append(c)
}
let resultString = String(resultStrArray)
print(resultString)
Output : acegikmoqsuwy
Thanks In advance
Here is an extension that will let you access the bounds of a substring as Ints instead of String.Index values:
import Foundation
/// This extension is available at
/// https://gist.github.com/zackdotcomputer/9d83f4d48af7127cd0bea427b4d6d61b
extension StringProtocol {
/// Access the range of the search string as integer indices
/// in the rendered string.
/// - NOTE: This is "unsafe" because it may not return what you expect if
/// your string contains single symbols formed from multiple scalars.
/// - Returns: A `CountableRange<Int>` that will align with the Swift String.Index
/// from the result of the standard function range(of:).
func countableRange<SearchType: StringProtocol>(
of search: SearchType,
options: String.CompareOptions = [],
range: Range<String.Index>? = nil,
locale: Locale? = nil
) -> CountableRange<Int>? {
guard let trueRange = self.range(of: search, options: options, range: range, locale: locale) else {
return nil
}
let intStart = self.distance(from: startIndex, to: trueRange.lowerBound)
let intEnd = self.distance(from: trueRange.lowerBound, to: trueRange.upperBound) + intStart
return Range(uncheckedBounds: (lower: intStart, upper: intEnd))
}
}
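A usage sketch of the same idea, condensed into a self-contained snippet; the shorter helper name intRange(of:) and the default-options-only signature are simplifications made here:

```swift
import Foundation

extension StringProtocol {
    // Condensed version of the countableRange(of:) helper above,
    // supporting only the default comparison options.
    func intRange<S: StringProtocol>(of search: S) -> CountableRange<Int>? {
        guard let found = range(of: search) else { return nil }
        let start = distance(from: startIndex, to: found.lowerBound)
        let length = distance(from: found.lowerBound, to: found.upperBound)
        return start ..< (start + length)
    }
}

let haystack = "Hello World"
print(haystack.intRange(of: "World")!)        // 6..<11
print(haystack.intRange(of: "xyz") == nil)    // true
```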
Just be aware that this can lead to weirdness, which is why Apple has chosen to make it hard. (Though that's a debatable design decision - hiding a dangerous thing by just making it hard...)
You can read more in the String documentation from Apple, but the tldr is that it stems from the fact that these "indices" are actually implementation-specific. They represent the indices into the string after it has been rendered by the OS, and so can shift from OS-to-OS depending on what version of the Unicode spec is being used. This means that accessing values by index is no longer a constant-time operation, because the UTF spec has to be run over the data to determine the right place in the string. These indices will also not line up with the values generated by NSString, if you bridge to it, or with the indices into the underlying UTF scalars. Caveat developer.
In case you get an "index is out of bounds" error, you may try this approach (working in Swift 5):
extension String {
    /// Returns the offset of the first occurrence of `char`, or -1 if it is not found.
    func countIndex(_ char: Character) -> Int {
        var count = 0
        for c in self {
            if c == char {
                return count
            }
            count += 1
        }
        return -1
    }
}