How to extend Double and Int in Swift with the same function

I want to create an extension for Swift's Double, Int, and the other numeric types that support the random(in:) function, like so:
extension Double {
    // Generate `count` random values of this type within `range`.
    static func random(in range: ClosedRange<Self>, count: Int) -> [Self] {
        var values = [Self]()
        if count > 0 {
            for _ in 0..<count {
                values.append(Self.random(in: range))
            }
        }
        return values
    }
}
How do I do this without creating a separate extension for each type?

You can't do it with a single extension, since there is no common protocol that declares random(in:). But as far as I understand, the types that have a random(in:) method can be grouped under two protocols, so you only need two implementations:
// Double & Float
extension BinaryFloatingPoint {
    static func random(in range: ClosedRange<Self>, count: Int) -> [Self] where Self.RawSignificand: FixedWidthInteger {
        var values = [Self]()
        if count > 0 {
            for _ in 0..<count {
                values.append(Self.random(in: range))
            }
        }
        return values
    }
}

// Int & UInt
extension FixedWidthInteger {
    static func random(in range: ClosedRange<Self>, count: Int) -> [Self] {
        var values = [Self]()
        if count > 0 {
            for _ in 0..<count {
                values.append(Self.random(in: range))
            }
        }
        return values
    }
}
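With both extensions in place, every conforming numeric type picks up the batch API. A quick sketch (the actual values differ on each run, of course):

let doubles = Double.random(in: 0...1, count: 3)  // e.g. [0.287, 0.918, 0.542]
let floats  = Float.random(in: 0...1, count: 2)
let ints    = Int.random(in: 1...6, count: 3)     // e.g. [4, 1, 6]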


Non-nominal type "Self" doesn't support explicit initialisation (protocol extension for enums)

I'm trying to find a way of adding a count variable to enums that quantifies the number of cases. This is straightforward and works well when hard-coded into each enum, but the logical extension is to provide it via a protocol extension... and there I run into the above error.
Reading other posts suggests that the compiler doesn't have enough information to instantiate the enum. I've tried adding associated types, but the error persists. Is this just something I'm going to have to give up on? It's not hugely important, but I'd like to understand what I'm doing wrong.
Code as follows:
protocol CountableEnum: RawRepresentable {
    static var count: Int { get }
}
extension CountableEnum {
    static var count: Int {
        var max = 0
        while let _ = self(rawValue: max) { max += 1 } // error here
        return max
    }
}
You need to use the proper Self reference instead of self, and constrain your extension to types whose RawValue is Int:
protocol CountableEnum: RawRepresentable {
    static var count: Int { get }
}
extension CountableEnum where RawValue == Int {
    static var count: Int {
        var max = 0
        while let _ = Self(rawValue: max) { max += 1 }
        return max
    }
}
RawRepresentable has an associated type RawValue, and your example doesn't specify that it is an Int. If you constrain the protocol extension with a where clause, as in the example below, it should work fine.
protocol CountableEnum: RawRepresentable {
    static var count: Int { get }
}
extension CountableEnum where RawValue == Int {
    static var count: Int {
        var max = 0
        while let _ = self.init(rawValue: max) { max += 1 }
        return max
    }
}
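For illustration, a hypothetical Int-backed enum then picks the property up automatically (note that this technique relies on the raw values being contiguous from 0):

enum Direction: Int, CountableEnum {
    case north, south, east, west
}

print(Direction.count) // 4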

`sum` extension on Measurement Sequence

Here's what I've currently got. It has at least these problems:
- It crashes when used on an array of Measurement<UnitType> whose units are mixed, such as grams and milligrams.
- Using an instance zero property isn't as good as a static alternative, which would allow the return value for an empty sequence to be the type's zero instead of nil. I can't figure out whether this is avoidable.
- I use the first extension because Sequence.first doesn't exist in the current version of Swift. 🤔
import Foundation

public extension Sequence {
    var first: Iterator.Element? {
        return self.first { _ in true }
    }
}

public extension Sequence where Iterator.Element: SummableUsingInstanceZero {
    var sum: Iterator.Element? {
        guard let zero = first?.zero
        else { return nil }
        return self.reduce(zero, +)
    }
}

public protocol SummableUsingInstanceZero {
    static func + (_: Self, _: Self) -> Self
    var zero: Self { get }
}

extension Measurement: SummableUsingInstanceZero {
    public var zero: Measurement {
        return Measurement(
            value: 0,
            unit: unit
        )
    }
}
This question is old, but the solution is still not built-in. Conforming to AdditiveArithmetic is the way to go! Constraining the conformance to UnitType: Dimension also fixes the mixed-unit crash, because the Dimension overload of + converts both operands to the base unit instead of trapping on unequal units.
import Foundation

extension Measurement: AdditiveArithmetic where UnitType: Dimension {
    public static var zero: Self {
        Self(value: 0, unit: .baseUnit())
    }
    public static func += (measurement0: inout Self, measurement1: Self) {
        measurement0 = measurement0 + measurement1
    }
    public static func -= (measurement0: inout Self, measurement1: Self) {
        measurement0 = measurement0 - measurement1
    }
}
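With the conformance in place, mixed units sum without crashing, since everything converts to the dimension's base unit (kilograms for UnitMass). A quick sanity check:

let masses = [
    Measurement(value: 1, unit: UnitMass.grams),
    Measurement(value: 500, unit: UnitMass.milligrams),
]
let total = masses.reduce(.zero, +)
print(total) // 0.0015 kg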

Adopting CollectionType (Collection) in Swift

I'm writing a graphics library to display data in a graph. Since most of my projects tend to have a large learning component, I decided to create a generically typed struct to manage my data set: DataSet<T: Plottable> (note that Plottable is also Comparable).
In trying to conform to MutableCollection (formerly MutableCollectionType), I've run across an error. I'd like to use the default implementation of sort(), but the compiler gives the following error when I try to use the sorting function:
Ambiguous reference to member 'sort()'
Here's a code example:
var data = DataSet<Int>(elements: [1,2,3,4])
data.sort() //Ambiguous reference to member 'sort()'
The compiler suggests two candidates, but won't actually display them to me. Note that the compiler error goes away if I explicitly implement sort() on my struct.
But the bigger question remains: what am I not seeing that I expected the default implementation to provide? Or have I run across a bug in Swift 3? (This is rarely the case... usually I have overlooked something.)
Here's the balance of the struct:
struct DataSet<T: Plottable>: MutableCollection, BidirectionalCollection {
    typealias Element = T
    typealias Iterator = DataSetIterator<T>
    typealias Index = Int

    /**
     The list of elements in the data set. Private.
     */
    private var elements: [Element] = []

    /**
     Initialize the data set with an array of data.
     */
    init(elements data: [T] = []) {
        self.elements = data
    }

    //MARK: Sequence Protocol
    func makeIterator() -> DataSetIterator<T> {
        return DataSetIterator(self)
    }

    //MARK: Collection Protocol
    subscript(_ index: DataSet<T>.Index) -> DataSet<T>.Iterator.Element {
        set {
            elements[index] = newValue
        }
        get {
            return elements[index]
        }
    }
    subscript(_ inRange: Range<DataSet<T>.Index>) -> DataSet<T> {
        set {
            elements.replaceSubrange(inRange, with: newValue)
        }
        get {
            return DataSet<T>(elements: Array(elements[inRange]))
        }
    }

    // Required indices for MutableCollection and BidirectionalCollection.
    var endIndex: Int {
        return elements.count
    }
    var startIndex: Int {
        return 0
    }
    func index(after i: Int) -> Int {
        return i+1
    }
    func index(before i: Int) -> Int {
        return i-1
    }

    mutating func append(_ newElement: T) {
        elements.append(newElement)
    }

//    /**
//     Sorts the elements of the DataSet from lowest value to highest value.
//     Commented because I'd like to use the default implementation.
//     - note: This is equivalent to calling `sort(by: { $0 < $1 })`
//     */
//    mutating func sort() {
//        self.sort(by: { $0 < $1 })
//    }
//
//    /**
//     Sorts the elements of the DataSet by an arbitrary block.
//     */
//    mutating func sort(by areInIncreasingOrder: (T, T) -> Bool) {
//        self.elements = self.elements.sorted(by: areInIncreasingOrder)
//    }

    /**
     Returns a `DataSet<T>` with the elements sorted by a provided block.
     This is the default implementation of `sorted()` modified to return `DataSet<T>` rather than `Array<T>`.
     - returns: A `DataSet<T>` sorted by the provided block.
     */
    func sorted(by areInIncreasingOrder: (T, T) -> Bool) -> DataSet<T> {
        return DataSet<T>(elements: self.elements.sorted(by: areInIncreasingOrder))
    }
    func sorted() -> DataSet<T> {
        return self.sorted(by: { $0 < $1 })
    }
}
Your DataSet is a BidirectionalCollection. The sort() you're trying to use requires a RandomAccessCollection. The most important thing you need to add is an Indices typealias:
typealias Indices = Array<Element>.Indices
Here's my version of your type:
protocol Plottable: Comparable {}
extension Int: Plottable {}

struct DataSet<Element: Plottable>: MutableCollection, RandomAccessCollection {
    private var elements: [Element] = []

    typealias Indices = Array<Element>.Indices

    init(elements data: [Element] = []) {
        self.elements = data
    }

    var startIndex: Int {
        return elements.startIndex
    }
    var endIndex: Int {
        return elements.endIndex
    }
    func index(after i: Int) -> Int {
        return elements.index(after: i)
    }
    func index(before i: Int) -> Int {
        return elements.index(before: i)
    }

    subscript(position: Int) -> Element {
        get {
            return elements[position]
        }
        set {
            elements[position] = newValue
        }
    }

    subscript(bounds: Range<Int>) -> DataSet<Element> {
        get {
            return DataSet(elements: Array(elements[bounds]))
        }
        set {
            elements[bounds] = ArraySlice(newValue.elements)
        }
    }
}
var data = DataSet(elements: [4,2,3,1])
data.sort()
print(data.elements) // [1,2,3,4]
You don't actually need an Iterator if you don't want one. Swift will synthesize the Sequence conformance automatically when you implement Collection.
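That synthesized conformance means the usual Sequence operations come along for free; for example, continuing with the data value above:

let evens = data.filter { $0 % 2 == 0 }  // [2, 4]
let doubled = data.map { $0 * 2 }        // [2, 4, 6, 8]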

Multiple index types in Swift

In Swift (a language I'm still fairly new to), I'm trying to define a class that allows indexing using either an Int or Range<Int>. For example:
var m = Matrix(rows: 10, cols: 10)
var v = m[1, 0..<10]
The two ways I can think of to allow this are either to add an extension to the Range type:
extension Range {
    init(_ intValue: Element) {
        self.init(start: intValue, end: intValue.successor())
    }
}
or to create an enum that allows for either type:
enum IndexType {
    case value(Int)
    case range(Range<Int>)

    init(_ v: Int) {
        self = .value(v)
    }
    init(_ r: Range<Int>) {
        self = .range(r)
    }
}
However, neither solution works as intended, as I still need to specify the type manually. For example, given the function:
func slice(indices: IndexType...) -> [Double] {
    // ...
}
I can't then just do:
slice(1, 3...4, 5)
but instead have to do:
slice(IndexType(1), IndexType(3...4), IndexType(5))
Is there a way to accomplish this in Swift? I'm used to C++, which would do this kind of type inference automatically, but the same doesn't appear to work in Swift.
One similar question I found was:
Swift Arrays of Multiple Types
However, I really would like to avoid using Any, since I know the two types it should be.
protocol MatrixIndex {
    var Matrix_range: Range<Int> { get }
}

extension Int: MatrixIndex {
    var Matrix_range: Range<Int> {
        get {
            return Range<Int>(start: self, end: self+1)
        }
    }
}

extension Range: MatrixIndex {
    var Matrix_range: Range<Int> {
        get {
            return Range<Int>(start: self.startIndex as! Int, end: self.endIndex as! Int)
        }
    }
}

class Matrix {
    subscript(row: MatrixIndex, column: MatrixIndex) -> () {
        get {
            print("getting \(row.Matrix_range) \(column.Matrix_range)")
        }
        set(newValue) {
            print("setting \(row.Matrix_range) \(column.Matrix_range)")
        }
    }
}
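With both Int and Range conforming to MatrixIndex, the call-site syntax from the question works without any wrapping. A quick sketch against the toy Matrix above, which only prints the resolved ranges:

let m = Matrix()
m[1, 0..<10]  // getting 1..<2 0..<10
m[0..<3, 5]   // getting 0..<3 5..<6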

Swift: Function returning an Array of IntegerType

I would like to create a function that takes an NSData parameter and, depending on what it reads from the NSData, returns an Array<Int8>, Array<Int16>, Array<Int32>, or Array<Int64>.
Basically, I need to return an array of IntegerType, with the specific subtype determined at runtime.
I am stuck at the declaration of the function's signature. (The inside would just be a simple switch that creates the specific array type and returns it.)
The following very basic test does not compile:
class Test {
    func test(data: NSData) -> Array<IntegerType> {
        return [1, 2, 3]
    }
}
EDIT: It seems this is currently not possible, not because of having to return an array of a protocol type, but because the IntegerType protocol uses Self. Here is an interesting related question.
IntegerType is a protocol, so a generic function constrained to it compiles (note, though, that this means the caller picks T at compile time rather than the data deciding at runtime):
class Test {
    func test<T: IntegerType>(data: NSData) -> Array<T> {
        return [1, 2, 3]
    }
}
You can use an enum with associated values for that:
enum IntArray {
    case Int8([Swift.Int8])
    case Int16([Swift.Int16])
    case Int32([Swift.Int32])
    case Int64([Swift.Int64])
}

class Test {
    func test(data: NSData) -> IntArray {
        return IntArray.Int8([1, 2, 3])
    }
}
On the user's side:
let obj = Test()
let array = obj.test(dat)
switch array {
case .Int8(let ary):
    // here, ary is [Int8]
    ...
case .Int16(let ary):
    // here, ary is [Int16]
    ...
case .Int32(let ary):
    // here, ary is [Int32]
    ...
case .Int64(let ary):
    // here, ary is [Int64]
    ...
}
As others have said in the comments, this won't work if you're trying to determine the function's return type at runtime. Swift generics are resolved at compile time, so changing the return type based on what's in an NSData won't work.
If you can determine the return type at compile time, then you can use a generic function declaration like so:
func test<T: IntegerType>(data: NSData) -> Array<T> {
    return [1, 2, 3]
}
Note: if you don't specify the type explicitly in your function somehow, then you'll need to annotate the type of the variable that receives the return value, like so:
var int8Array: Array<Int8> = test(NSData())
Since none of the generic-based solutions work, why don't you try returning [Any] and checking the element type as follows:
func test(data: NSData) -> [Any] {
    var value1: Int8 = 1
    var value2: Int8 = 2
    return [value1, value2]
}

var x = test(NSData())
for xin in x {
    if let intxin = xin as? Int8 {
        println(intxin)
    }
}
The way I solved my problem was by defining the following protocol:
protocol IntegerArrayProtocol {
    init(rawData: NSData!, length: Int)
    func count() -> Int
    subscript(index: Int) -> Int { get }
}
This deals with all the operations I need to perform on the array:
- read it from raw memory,
- count how many elements it has,
- access its elements by index, always returning Ints, regardless of the underlying integer type.
Then, I created a parameterized class that implements the protocol:
final class IntegerArray<T: ConvertibleToInteger>: IntegerArrayProtocol {
    let arr: [T]

    init(rawData: NSData!, length: Int) {
        // Create the array and allocate memory for all elements...
        arr = Array<T>(count: length, repeatedValue: T.zero())
        // ...then read it from the NSData source.
        // ....
    }

    func count() -> Int {
        return arr.count
    }

    subscript(index: Int) -> Int {
        get {
            return arr[index].asInt()
        }
    }
}
The parameter type T should be able to convert itself to Int, and should have a zero value (used when I initialize the array). For that, I created the ConvertibleToInteger protocol, which is used above to restrict the possible Ts:
protocol ConvertibleToInteger {
    class func zero() -> Self
    func asInt() -> Int
}
Then, I extended every type that I would like to create arrays of:
extension Int8: ConvertibleToInteger {
    static func zero() -> Int8 {
        return 0
    }
    func asInt() -> Int {
        return Int(self)
    }
}

extension Int16: ConvertibleToInteger {
    static func zero() -> Int16 {
        return 0
    }
    func asInt() -> Int {
        return Int(self)
    }
}

extension Int32: ConvertibleToInteger {
    static func zero() -> Int32 {
        return 0
    }
    func asInt() -> Int {
        return Int(self)
    }
}

extension Int64: ConvertibleToInteger {
    static func zero() -> Int64 {
        return 0
    }
    func asInt() -> Int {
        return Int(self)
    }
}
Finally, to read an array from NSData, I created the following function:
func readArray(rawData: NSData, length: Int) -> IntegerArrayProtocol? {
    let qBytes = // read from NSData how many bytes each element is
    switch qBytes {
    case 1:
        return IntegerArray<Int8>(rawData: rawData, length: length)
    case 2:
        return IntegerArray<Int16>(rawData: rawData, length: length)
    case 3 ... 4:
        return IntegerArray<Int32>(rawData: rawData, length: length)
    case 5 ... 8:
        return IntegerArray<Int64>(rawData: rawData, length: length)
    default:
        return nil
    }
}
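Callers then work purely through the protocol: the element width is discovered at runtime, but indexing always yields plain Ints. A hypothetical usage sketch (someData stands in for real input):

let someData = NSData() // hypothetical placeholder input
if let array = readArray(someData, length: 10) {
    for i in 0..<array.count() {
        println(array[i]) // always an Int, regardless of the stored width
    }
}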