I'm trying to generate seeded random numbers in Swift 4.2+ with the Int.random() function; however, there is no built-in implementation that allows the random number generator to be seeded. As far as I can tell, the only way to do this is to create a new random number generator that conforms to the RandomNumberGenerator protocol. Does anyone have a recommendation for a better way, or an implementation of a seedable RandomNumberGenerator-conforming class and how to use it?
Also, I have seen two functions, srand and drand, mentioned a couple of times while looking for a solution, but judging by how rarely they come up, I'm not sure whether using them is bad practice, and I also can't find any documentation on them.
I'm looking for the simplest solution, not necessarily the most secure or best-performing one (e.g. using an external library would not be ideal).
Update: By "seeded", I mean that I want to pass a seed into the random number generator so that if I pass the same seed on two different devices or at two different times, the generator produces the same numbers. The purpose is that I'm randomly generating data for an app, and rather than save all that data to a database, I want to save the seed and regenerate the data with it every time the user loads the app.
So I used Martin R's suggestion to use GameplayKit's GKMersenneTwisterRandomSource to make a class that conforms to the RandomNumberGenerator protocol, an instance of which I can pass to functions like Int.random(in:using:):
import GameplayKit

class SeededGenerator: RandomNumberGenerator {
    let seed: UInt64
    private let generator: GKMersenneTwisterRandomSource

    convenience init() {
        self.init(seed: 0)
    }

    init(seed: UInt64) {
        self.seed = seed
        generator = GKMersenneTwisterRandomSource(seed: seed)
    }

    // Note: GKRandom's nextInt() only produces 32-bit values, so these
    // conversions don't cover the full UInt64 domain (see the answers below).
    func next<T>(upperBound: T) -> T where T: FixedWidthInteger, T: UnsignedInteger {
        return T(abs(generator.nextInt(upperBound: Int(upperBound))))
    }

    func next<T>() -> T where T: FixedWidthInteger, T: UnsignedInteger {
        return T(abs(generator.nextInt()))
    }
}
Usage:
// Make a random seed and store it in a database
let seed = UInt64.random(in: UInt64.min ... UInt64.max)
var generator = SeededGenerator(seed: seed)

// Or, if you just need the seeding ability for testing:
// var generator = SeededGenerator() // uses a default seed of 0

let chars = ["a", "b", "c", "d", "e", "f"]
let randomChar = chars.randomElement(using: &generator)
let randomInt = Int.random(in: 0 ..< 1000, using: &generator)
// etc.
This gave me the flexibility and easy implementation I needed, combining the seeding functionality of GKMersenneTwisterRandomSource with the convenience of the standard library's random functions (like .randomElement() for arrays and .random(in:) for Int, Bool, Double, etc.).
Here's an alternative to the answer from RPatel99 that accounts for GKRandom's value range.
import GameplayKit

struct ArbitraryRandomNumberGenerator: RandomNumberGenerator {
    private let gkrandom: GKRandom

    init(seed: UInt64) {
        self.gkrandom = GKMersenneTwisterRandomSource(seed: seed)
    }

    mutating func next() -> UInt64 {
        // GKRandom produces values in the [INT32_MIN, INT32_MAX] range,
        // so we need two numbers to produce a full 64-bit value.
        let next1 = UInt64(bitPattern: Int64(gkrandom.nextInt()))
        let next2 = UInt64(bitPattern: Int64(gkrandom.nextInt()))
        return next1 ^ (next2 << 32)
    }
}
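Usage is then the same as with any RandomNumberGenerator, for example:

var generator = ArbitraryRandomNumberGenerator(seed: 42)
let roll = Int.random(in: 1...6, using: &generator)
let shuffled = ["a", "b", "c", "d"].shuffled(using: &generator)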
Simplified version for Swift 5:
struct RandomNumberGeneratorWithSeed: RandomNumberGenerator {
    init(seed: Int) { srand48(seed) }
    func next() -> UInt64 { return UInt64(drand48() * Double(UInt64.max)) }
}
@State var seededGenerator = RandomNumberGeneratorWithSeed(seed: 123)
// When deployed, use a random seed, e.g. seed: Int.random(in: 0..<Int.max)
Then to use it:
let rand0to99 = Int.random(in: 0..<100, using: &seededGenerator)
I ended up using srand48() and drand48() to generate a pseudo-random number with a seed for a specific test.
class SeededRandomNumberGenerator: RandomNumberGenerator {
    let range: ClosedRange<Double> = Double(UInt64.min) ... Double(UInt64.max)

    init(seed: Int) {
        // srand48() — seeds the 48-bit pseudo-random number generator
        srand48(seed)
    }

    func next() -> UInt64 {
        // drand48() — returns a pseudo-random Double in [0.0, 1.0)
        return UInt64(range.lowerBound + (range.upperBound - range.lowerBound) * drand48())
    }
}
So, in production the implementation uses the SystemRandomNumberGenerator but in the test suite it uses the SeededRandomNumberGenerator.
Example:
let messageFixtures: [Any] = [
    "a string",
    ["some", ["values": 456]],
]

var seededRandomNumberGenerator = SeededRandomNumberGenerator(seed: 13)

func randomMessageData() -> Any {
    return messageFixtures.randomElement(using: &seededRandomNumberGenerator)!
}
// Always returns the same elements in the same order
randomMessageData() //"a string"
randomMessageData() //["some", ["values": 456]]
randomMessageData() //["some", ["values": 456]]
randomMessageData() //["some", ["values": 456]]
randomMessageData() //"a string"
It looks like Swift's default implementation of RandomNumberGenerator.next(upperBound:) changed in 2019. This affects Collection.randomElement(using:) and causes it to always return the first element if your generator's next() -> UInt64 implementation doesn't produce values uniformly across the full domain of UInt64. The GKRandom solution provided here is therefore problematic, because its next() -> Int method states:
* The value is in the range of [INT32_MIN, INT32_MAX].
Here's a solution that works for me, using the RNG from the Swift for TensorFlow project, found here:
public struct ARC4RandomNumberGenerator: RandomNumberGenerator {
    var state: [UInt8] = Array(0...255)
    var iPos: UInt8 = 0
    var jPos: UInt8 = 0

    /// Initialize ARC4RandomNumberGenerator using an array of UInt8. The array
    /// must have length between 1 and 256 inclusive.
    public init(seed: [UInt8]) {
        precondition(seed.count > 0, "Length of seed must be positive")
        precondition(seed.count <= 256, "Length of seed must be at most 256")
        var j: UInt8 = 0
        for i: UInt8 in 0...255 {
            j &+= S(i) &+ seed[Int(i) % seed.count]
            swapAt(i, j)
        }
    }

    // Produce the next random UInt64 from the stream, and advance the internal
    // state.
    public mutating func next() -> UInt64 {
        var result: UInt64 = 0
        for _ in 0..<(UInt64.bitWidth / UInt8.bitWidth) {
            result <<= UInt8.bitWidth
            result += UInt64(nextByte())
        }
        return result
    }

    // Helper to access the state.
    private func S(_ index: UInt8) -> UInt8 {
        return state[Int(index)]
    }

    // Helper to swap elements of the state.
    private mutating func swapAt(_ i: UInt8, _ j: UInt8) {
        state.swapAt(Int(i), Int(j))
    }

    // Generates the next byte in the keystream.
    private mutating func nextByte() -> UInt8 {
        iPos &+= 1
        jPos &+= S(iPos)
        swapAt(iPos, jPos)
        return S(S(iPos) &+ S(jPos))
    }
}
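Since the seed is an array of bytes, a fixed-width seed has to be split up first. A minimal sketch (the withUnsafeBytes decomposition is just one possible approach; any deterministic byte split works):

// Split a UInt64 seed into its 8 bytes and feed them to the generator.
let seed: UInt64 = 0xDEADBEEF
var rng = ARC4RandomNumberGenerator(seed: withUnsafeBytes(of: seed) { Array($0) })
let value = Int.random(in: 0..<1000, using: &rng)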
Hat tip to my coworkers Samuel, Noah, and Stephen who helped me get to the bottom of this.
Can I create a generator in Swift?
With an iterator, I need to store intermediate results, for example:
struct Countdown: IteratorProtocol, Sequence {
    private var value = 0

    init(start: Int) {
        self.value = start
    }

    mutating func next() -> Int? {
        let nextNumber = value - 1
        if nextNumber < 0 {
            return nil
        }
        value -= 1
        return nextNumber
    }
}

for i in Countdown(start: 3) {
    print(i)
}
// prints 2 1 0
In this example, I need to store the value.
In my situation, I want to use a generator instead of an iterator, because I don't want to store the intermediate results of my sequence between each call to next().
Understanding how generators work (and why they are less important in Swift) is difficult at first when coming from Python.
Up to Swift 2.1 there was a protocol called GeneratorType. This was renamed to IteratorProtocol in Swift 3.0. You can conform to this protocol to make your own objects that do just-in-time computation, similar to what can be done in Python.
More information can be found in the Apple Documentation: IteratorProtocol
A simple example from the IteratorProtocol page:
// The Sequence type that vends the iterator (from the same documentation example):
struct Countdown: Sequence {
    let start: Int

    func makeIterator() -> CountdownIterator {
        return CountdownIterator(self)
    }
}

struct CountdownIterator: IteratorProtocol {
    let countdown: Countdown
    var times = 0

    init(_ countdown: Countdown) {
        self.countdown = countdown
    }

    mutating func next() -> Int? {
        let nextNumber = countdown.start - times
        guard nextNumber > 0
            else { return nil }
        times += 1
        return nextNumber
    }
}

let threeTwoOne = Countdown(start: 3)
for count in threeTwoOne {
    print("\(count)...")
}
// Prints "3..."
// Prints "2..."
// Prints "1..."
However, you need to think about why you are using a generator:
Swift automatically does something called "copy-on-write" for its value types. This means that many of the cases where a Python generator is used to avoid the large cost of copying collections of objects (arrays, lists, dictionaries, etc.) are unnecessary in Swift. You get this for free by using one of the types that implement copy-on-write, as the snippet below illustrates:
Which value types in Swift supports copy-on-write?
It is also possible to use a wrapper to force almost any object to be copy on write, even if it is not part of a collection:
How can I make a container with copy-on-write semantics?
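For instance, a minimal illustration of copy-on-write with arrays:

let original = Array(repeating: 0, count: 1_000_000)
var copy = original   // no elements are copied here; both arrays share one buffer
copy[0] = 42          // the buffer is duplicated only now, at the first write
print(original[0])    // prints 0; the original is unaffected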
The optimizations in Swift usually mean that you do not have to write generators. If you really do need to (usually for data-heavy, scientific calculations), it is possible, as shown above.
Based on the code you provided and the little knowledge of generators that I have, you can do something like this:
struct Countdown {
    private var _start = 0
    private var _value = 0

    init(value: Int) {
        _value = value
    }

    mutating func getNext() -> Int? {
        let current = _start
        _start += 1
        if current <= _value {
            return current
        } else {
            return nil
        }
    }
}
and then wherever you want to use it, you can do something like
var counter = Countdown(value: 5)
while let value = counter.getNext() {
    print(value)
}
Walter provides a lot of good information, and generally you shouldn't be doing this in Swift, but even if you wanted an iterator, the right way to build one is with composition, not by maintaining your own state. Swift has a lot of existing sequences that can be composed to create what you want. So in your example, you'd defer to a range's iterator:
struct Countdown: Sequence {
    private var value = 0

    init(start: Int) {
        self.value = start
    }

    func makeIterator() -> AnyIterator<Int> {
        return AnyIterator((0..<value).reversed().makeIterator())
    }
}

for i in Countdown(start: 3) {
    print(i)
}
// prints 2 1 0
Something has to keep the state; that's the nature of these kinds of functions (even in a world with coroutines). It's fine not to maintain it directly; just delegate to a more primitive type. Swift has a couple of dozen built-in Iterators you can use to build most things you likely need, and any iterator can be lifted to an AnyIterator to hide the implementation details. If you have something custom enough that it really requires a next(), then yes, storing the state is your problem. Something has to do it. But I've found this all to be extremely rare, and often suggests over-design when it comes up.
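For instance, the standard library's sequence(first:next:) can express the same kind of countdown without any custom type (a sketch):

// State lives inside the returned UnfoldFirstSequence rather than in a
// hand-written struct; each element is derived from the previous one.
for i in sequence(first: 3, next: { $0 > 1 ? $0 - 1 : nil }) {
    print(i)
}
// prints 3 2 1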
I have a solution similar to above, but with a slightly more "yield-y" feeling to it.
struct Countdown
{
    static func generator(withStart: Int) -> () -> Int?
    {
        var start = withStart + 1
        return {
            start = start - 1
            return start > 0 ? start : nil
        }
    }
}
let countdown = Countdown.generator(withStart: 5)
while let i = countdown()
{
    print("\(i)")
}
According to the documentation, Swift's Dictionary type has this property named underestimatedCount:
A value less than or equal to the number of elements in the
collection.
Does anybody know why on Westeros this is considered useful?!
It baffles me...
Summary: technically speaking, underestimatedCount belongs to Sequence and gets inherited by Collection and Dictionary. Dictionary doesn't override the default implementation, which returns zero.
Looking at the source code, it seems that underestimatedCount is used as a hint for how much a mutable collection should grow when new items are added to it.
Here's a snippet from StringCore.swift:
public mutating func append<S : Sequence>(contentsOf s: S)
    where S.Iterator.Element == UTF16.CodeUnit {
    // ...
    let growth = s.underestimatedCount
    var iter = s.makeIterator()

    if _fastPath(growth > 0) {
        let newSize = count + growth
        let destination = _growBuffer(newSize, minElementWidth: width)
        // ...
Similarly, from StringCharacterView.swift:
public mutating func append<S : Sequence>(contentsOf newElements: S)
    where S.Iterator.Element == Character {
    reserveCapacity(_core.count + newElements.underestimatedCount)
    for c in newElements {
        self.append(c)
    }
}
Or even better, from Arrays.swift.gyb:
public mutating func append<S : Sequence>(contentsOf newElements: S)
    where S.Iterator.Element == Element {
    let oldCount = self.count
    let capacity = self.capacity
    let newCount = oldCount + newElements.underestimatedCount

    if newCount > capacity {
        self.reserveCapacity(
            Swift.max(newCount, _growArrayCapacity(capacity)))
    }
    _arrayAppendSequence(&self._buffer, newElements)
}
Oddly enough, I could find only one implementation of underestimatedCount, in Sequence, and that one returns zero.
At this time it seems that underestimatedCount has more value for custom implementations of collections/sequences, since for the standard Swift collections Apple already has a good idea about their growth.
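To see where it pays off, here is a sketch of a custom sequence that reports its exact size (the type and names are hypothetical):

// A sequence that hints its size via underestimatedCount, so consumers
// like Array.append(contentsOf:) can reserve capacity up front.
struct Zeros: Sequence {
    let count: Int

    // The default implementation would return 0; we know the exact size.
    var underestimatedCount: Int { return count }

    func makeIterator() -> AnyIterator<Int> {
        var produced = 0
        let limit = count
        return AnyIterator {
            guard produced < limit else { return nil }
            produced += 1
            return 0
        }
    }
}

var buffer: [Int] = []
buffer.append(contentsOf: Zeros(count: 1_000)) // can allocate once instead of growing repeatedly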
Suppose I have a function with which I want to populate my data structure using a multi-dimensional array (e.g. a Tensor class):
class Tensor {
    init<A>(array: A) { /* ... */ }
}
While I could add a shape parameter, I would prefer to automatically calculate the dimensions from the array itself. If you know the dimensions a priori, it's trivial to read them off:
let d1 = array.count
let d2 = array[0].count
However, it's less clear how to do it for an N-dimensional array. I was thinking there might be a way to do it by extending the Array class:
extension Int {
    func numberOfDims() -> Int {
        return 0
    }
}

extension Array {
    func numberOfDims() -> Int {
        return 1 + Element.self.numberOfDims()
    }
}
Unfortunately, this won't compile (rightfully so), as numberOfDims isn't defined for most types. However, I don't see any way of constraining Element, as arrays-of-arrays make things complicated.
I was hoping someone else might have some insight into how to solve this problem (or explain why this is impossible).
If you're looking to get the depth of a nested array (Swift's standard library doesn't technically provide multi-dimensional arrays, only jagged arrays), then, as shown in this Q&A, you can use a 'dummy protocol' and typecasting.
protocol _Array {
    var nestingDepth: Int { get }
}

extension Array: _Array {
    var nestingDepth: Int {
        return 1 + ((first as? _Array)?.nestingDepth ?? 0)
    }
}
let a = [1, 2, 3]
print(a.nestingDepth) // 1
let b = [[1], [2, 3], [4]]
print(b.nestingDepth) // 2
let c = [[[1], [2]], [[3]], [[4], [5]]]
print(c.nestingDepth) // 3
(I believe this approach would've still worked when you had originally posted the question)
In Swift 3, this can also be achieved without a dummy protocol, but instead by casting to [Any]. However, as noted in the linked Q&A, this is inefficient as it requires traversing the entire array in order to box each element in an existential container.
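That variant would look something like this (a sketch; the property is renamed to avoid clashing with the version above, and the as? [Any] cast is what forces the boxing):

extension Array {
    var nestingDepthViaAny: Int {
        return 1 + ((first as? [Any])?.nestingDepthViaAny ?? 0)
    }
}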
Also note that these implementations assume you're calling them on a homogeneous nested array. As Paul notes, they won't give a correct answer for [[[1], 2], 3].
If this needs to be accounted for, you can write a recursive method that iterates through each of the nested arrays and returns the minimum depth of the nesting.
protocol _Array {
    func _nestingDepth(minimumDepth: Int?, currentDepth: Int) -> Int
}

extension Array: _Array {
    func _nestingDepth(minimumDepth: Int?, currentDepth: Int) -> Int {
        // for an empty array, the minimum depth is the current depth, as we know
        // that _nestingDepth is called where currentDepth <= minimumDepth.
        guard !isEmpty else { return currentDepth }

        var minimumDepth = minimumDepth

        for element in self {
            // if current depth has exceeded minimum depth, then return the minimum.
            // this allows for the short-circuiting of the function.
            if let minimumDepth = minimumDepth, currentDepth >= minimumDepth {
                return minimumDepth
            }

            // if element isn't an array, then return the current depth as the new minimum,
            // given that currentDepth < minimumDepth.
            guard let element = element as? _Array else { return currentDepth }

            // get the new minimum depth from the next nesting,
            // incrementing the current depth.
            minimumDepth = element._nestingDepth(minimumDepth: minimumDepth,
                                                 currentDepth: currentDepth + 1)
        }

        // the force unwrap is safe, as we know the array is non-empty, therefore
        // minimumDepth has been assigned at least once.
        return minimumDepth!
    }

    var nestingDepth: Int {
        return _nestingDepth(minimumDepth: nil, currentDepth: 1)
    }
}
let a = [1, 2, 3]
print(a.nestingDepth) // 1
let b = [[1], [2], [3]]
print(b.nestingDepth) // 2
let c = [[[1], [2]], [[3]], [[5], [6]]]
print(c.nestingDepth) // 3
let d: [Any] = [ [[1], [2], [[3]] ], [[4]], [5] ]
print(d.nestingDepth) // 2 (the minimum depth is at element [5])
Great question that sent me off on a goose chase!
To be clear: I'm talking below about the approach of using the outermost array's generic type parameter to compute the number of dimensions. As Tyrelidrel shows, you can recursively examine the runtime type of the first element, although this approach gives nonsensical answers for heterogeneous arrays like [[[1], 2], 3].
Type-based dispatch can’t work
As you note, your code as written doesn’t work because numberOfDims is not defined for all types. But is there a workaround? Does this direction lead somewhere?
No, it’s a dead end. The reason is that extension methods are statically dispatched for non-class types, as the following snippet demonstrates:
extension CollectionType {
    func identify() {
        print("I am a collection of some kind")
    }
    func greetAndIdentify() {
        print("Hello!")
        identify()
    }
}

extension Array {
    func identify() {
        print("I am an array")
    }
}

[1,2,3].identify()          // prints "I am an array"
[1,2,3].greetAndIdentify()  // prints "Hello!" and "I am a collection of some kind"
Even if Swift allowed you to extend Any (and it doesn’t), Element.self.numberOfDims() would always call the Any implementation of numberOfDims() even if the runtime type of Element.self were an Array.
This crushing static dispatch limitation means that even this promising-looking approach fails (it compiles, but always returns 1):
extension CollectionType {
    var numberOfDims: Int {
        return self.dynamicType.numberOfDims
    }
    static var numberOfDims: Int {
        return 1
    }
}

extension CollectionType where Generator.Element: CollectionType {
    static var numberOfDims: Int {
        return 1 + Generator.Element.numberOfDims
    }
}

[[1],[2],[3]].numberOfDims // returns 1 ... boooo!
This same constraint also applies to function overloading.
Type inspection can’t work
If there’s a way to make it work, it would be something along these lines, which uses a conditional instead of type-based method dispatch to traverse the nested array types:
extension Array {
    var numberOfDims: Int {
        return self.dynamicType.numberOfDims
    }
    static var numberOfDims: Int {
        if let nestedArrayType = Generator.Element.self as? Array.Type {
            return 1 + nestedArrayType.numberOfDims
        } else {
            return 1
        }
    }
}

[[1,2],[2],[3]].numberOfDims
The code above compiles — quite confusingly — because Swift takes Array.Type to be a shortcut for Array<Element>.Type. That completely defeats the attempt to unwrap.
What’s the workaround? There isn’t one. This approach can’t work because we need to say “if Element is some kind of Array,” but as far as I know, there’s no way in Swift to say “array of anything,” or “just the Array type regardless of Element.”
Everywhere you mention the Array type, its generic type parameter must be materialized to a concrete type or a protocol at compile time.
Cheating can work
What about reflection, then? There is a way. Not a nice way, but there is a way. Swift’s Mirror is currently not powerful enough to tell us what the element type is, but there is another reflection method that is powerful enough: converting the type to a string.
import Foundation

private let arrayPat = try! NSRegularExpression(pattern: "Array<", options: [])

extension Array {
    var numberOfDims: Int {
        let typeName = "\(self.dynamicType)"
        return arrayPat.numberOfMatchesInString(
            typeName, options: [], range: NSMakeRange(0, typeName.characters.count))
    }
}
Horrid, evil, brittle, probably not legal in all countries — but it works!
Unfortunately I was not able to do this with a Swift array, but you can easily convert a Swift array to an NSArray.
import Foundation

extension NSArray {
    func numberOfDims() -> Int {
        var count = 0
        if let x = self.firstObject as? NSArray {
            count += x.numberOfDims() + 1
        } else {
            return 1
        }
        return count
    }
}
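Usage then relies on bridging the Swift array literal to NSArray, e.g.:

let nested: NSArray = [[[1], [2]], [[3]]]
print(nested.numberOfDims()) // 3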
How can I iterate over an array without using a position (index) i or a for/for-in loop?
var a = [1, 2, 3]
for var i = 0; i < a.count; i++ {
//
}
for item in a {
//
}
A SequenceType (to which CollectionType, and thus all Swift collections including Array, conform) is pretty simple. It requires you to provide a generate() function that returns a type conforming to GeneratorType.
GeneratorType in turn only needs to provide one method: next(), which returns each element in turn until the elements are exhausted, yielding nil after the last element. This makes generators pretty similar to Java iterators, only with next and hasNext combined into one via optionals.
Swift’s for…in is really syntactic sugar for a combination of getting a generator and then repeatedly calling next on it:
let a = [1, 2, 3]
for i in a { print(i) }
// is equivalent to:
var g = a.generate()
// the generator, being stateful, must be declared with var
while let i = g.next() { print(i) }
If using generators like this, take note of the comment above the definition of GeneratorType in the std lib doc:
Encapsulates iteration state and interface for iteration over a
sequence.
Note: While it is safe to copy a generator, advancing one
copy may invalidate the others.
Since writing a generator for a collection often involves a lot of boilerplate, there is a helper type, IndexingGenerator, that can be used. It implements a generator that starts at startIndex and, on each call to next(), returns the value at the current index and then increments the index. A generate() that returns an IndexingGenerator is provided as the default implementation for CollectionType, which means that if this is good enough for your purposes, you don't need to implement generate() when implementing a collection:
struct Bitfield: CollectionType {
    let data: UInt
    var startIndex: UInt { return 0 }
    var endIndex: UInt { return UInt(sizeofValue(data) * 8) }

    subscript(idx: UInt) -> Bit {
        return (data >> idx) & 1 == 0
            ? Bit.Zero : Bit.One
    }

    // no need to implement generate()
}
This default was added in Swift 2.0. Prior to that, you had to provide a minimal generator that just returned an IndexingGenerator(self).
You can do it using IndexingGenerator:
var a = [1, 2, 3]
var generator = a.generate()
while let item = generator.next() {
    //
}
P.S. I created and answered my own question because I did not find anything when I tried to figure out how to use Java-like iterators in Swift.
It can be done with the help of IteratorProtocol.
let a = [1, 2, 3]
var aIterator = a.makeIterator()
while let aItem = aIterator.next() {
    // do something with the array item
}
More on IteratorProtocol in Apple's documentation here
If you use Java 8, you can use streams.
As an example:
List<User> olderUsers = users.stream().filter(u -> u.age > 30).collect(Collectors.toList());
I'm trying to port the Matrix example from the Swift book to be generic.
Here's what I got so far:
struct Matrix<T> {
    let rows: Int, columns: Int
    var grid: [T]

    init(rows: Int, columns: Int, repeatedValue: T) {
        self.rows = rows
        self.columns = columns
        grid = Array(count: rows * columns, repeatedValue: repeatedValue)
    }

    func indexIsValidForRow(row: Int, column: Int) -> Bool {
        return row >= 0 && row < rows && column >= 0 && column < columns
    }

    subscript(row: Int, column: Int) -> T {
        get {
            assert(indexIsValidForRow(row, column: column), "Index out of range")
            return grid[(row * columns) + column]
        }
        set {
            assert(indexIsValidForRow(row, column: column), "Index out of range")
            grid[(row * columns) + column] = newValue
        }
    }
}
Note that I had to pass repeatedValue: T to the constructor.
In C#, I would have just used default(T), which would be 0 for numbers, false for booleans and null for reference types. I understand that Swift doesn't allow nil on non-optional types, but I'm still curious whether passing an explicit parameter is the only way, or whether there is some equivalent of default(T).
There isn't. Swift forces you to specify the default value, just like when you declare variables and fields. The only case where Swift has a concept of a default value is for optional types, where it's nil (Optional.None).
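If you genuinely need an "empty" default, the closest built-in notion is an optional element type, where nil plays that role. A sketch using the Matrix type from the question:

var m = Matrix<Int?>(rows: 2, columns: 2, repeatedValue: nil)
m[0, 0] = 42 // every other cell is still nil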
An iffy 'YES'. You can use a protocol constraint to specify the requirement that your generic class or function will only work with types that implement the default (parameter-less) init. The ramifications of this will most likely be bad (it doesn't work the way you think it does), but it is the closest thing to what you were asking for, probably closer than the 'NO' answer.
For me, I found this helpful during development of a new generic class; eventually I remove the constraint and fix the remaining issues. Requiring only types that can take on a default value limits the usefulness of your generic data type.
public protocol Defaultable
{
init()
}
struct Matrix<Type: Defaultable>
{
let rows: Int
let columns: Int
var grid: [Type]
init(rows: Int, columns: Int)
{
self.rows = rows
self.columns = columns
grid = Array(count: rows * columns, repeatedValue: Type() )
}
}
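Many standard types already have a parameter-less init, so adopting the protocol is a one-line retroactive conformance (a sketch):

extension Int: Defaultable {}   // Int() == 0
extension Bool: Defaultable {}  // Bool() == false

let zeros = Matrix<Int>(rows: 2, columns: 3)  // grid filled with six 0s
let flags = Matrix<Bool>(rows: 2, columns: 3) // grid filled with six falses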
There is a way to get the equivalent of default(T) in Swift, but it's not free and it has an associated hazard:
public func defaultValue<T>() -> T {
    let ptr = UnsafeMutablePointer<T>.alloc(1)
    let retval = ptr.memory
    ptr.dealloc(1)
    return retval
}
Now this is clearly a hack, because we don't know whether alloc() initializes to something knowable. Is it all 0s? Stuff left over in the heap? Who knows? Furthermore, what it is today could be something different tomorrow.
In fact, using the return value for anything other than a placeholder is dangerous. Let's say that you have code like this:
public class Foo { /* implementation */ }
public struct Bar { public var x: Foo }

var t: Bar = defaultValue()
t = someFactoryThatReturnsBar() // here's our problem
At the problem line, Swift thinks that t has been initialized, because that's what Swift's semantics say: you cannot have a variable of a value type that is uninitialized. Except that it effectively is, because defaultValue<T> breaks those semantics. When you do the assignment, Swift emits a call into the value witness table to destroy the existing value. This includes code that calls release on the field x, because Swift semantics say that instances of objects are never nil. And then you get a runtime crash.
However, I had cause to interoperate with Swift from another language, and I had to pass in an optional type. Unfortunately, Swift doesn't provide a way to construct an optional at runtime (at least I haven't found one), and I can't easily mock one up, because optionals are implemented in terms of a generic enum, and enums use a poorly documented 5-strategy implementation to pack the payload.
I worked around this by passing a tuple that I'm going to call a Medusa tuple, just for grins: (value: T, present: Bool), with the contract that if present is true, then value is guaranteed to be valid, and invalid otherwise. I can now use this safely for interop:
public func toOptional<T>(optTuple: (value: T, present: Bool)) -> T?
{
    if optTuple.present { return optTuple.value }
    else { return nil }
}

public func fromOptional<T>(opt: T?) -> (T, Bool)
{
    if opt != nil { return (opt!, true) }
    else {
        return (defaultValue(), false)
    }
}
In this way, my calling code passes in a tuple instead of an optional, and the receiving code can turn it into an optional (and the reverse).
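For instance (a sketch; argument labels shown in Swift 3+ style):

let present = fromOptional(opt: Optional(42))     // (42, true)
let restored = toOptional(optTuple: present)      // Optional(42)
let absent: (Int, Bool) = fromOptional(opt: nil)  // (unspecified placeholder, false)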
In fact, generic types can have optional values.
You can do it with protocols:
import UIKit

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let someType = SomeType<String>()
        let someTypeWithDefaultGeneric = SomeType()
    }
}

protocol Default {
    init()
}

extension Default where Self: SomeType<Void> {
    init() {
        self.init()
    }
}

class SomeType<T>: Default {
    required init() {}
}