I'm translating the Objective-C code in this repo into Swift in order to learn the basics of OpenGL. I'm totally new to it. I've got a working project that compiles and produces a working NSOpenGLView with a floating rectangle, but the colors are wrong. I've narrowed the problem down to my two glVertexAttribPointer calls, which point at the vertex and color data.
Here is how I have my vertex and color data stored:
struct Vertex {
    var position: (x: GLfloat, y: GLfloat, z: GLfloat, w: GLfloat)
    var color: (r: GLfloat, g: GLfloat, b: GLfloat, a: GLfloat)
}

struct Vertices {
    var v1: Vertex
    var v2: Vertex
    var v3: Vertex
    var v4: Vertex
}

var vertexData = Vertices(
    v1: Vertex(position: (x: -0.5, y: -0.5, z: 0.0, w: 1.0),
               color: (r: 1.0, g: 0.0, b: 0.0, a: 1.0)),
    v2: Vertex(position: (x: -0.5, y: 0.5, z: 0.0, w: 1.0),
               color: (r: 0.0, g: 1.0, b: 0.0, a: 1.0)),
    v3: Vertex(position: (x: 0.5, y: 0.5, z: 0.0, w: 1.0),
               color: (r: 0.0, g: 0.0, b: 1.0, a: 1.0)),
    v4: Vertex(position: (x: 0.5, y: -0.5, z: 0.0, w: 1.0),
               color: (r: 1.0, g: 1.0, b: 1.0, a: 1.0)))
The Objective-C versions of the glVertexAttribPointer functions that I am trying to translate look like this:
glVertexAttribPointer((GLuint)positionAttribute, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (const GLvoid *)offsetof(Vertex, position));
glVertexAttribPointer((GLuint)colourAttribute , 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (const GLvoid *)offsetof(Vertex, colour ));
The Objective-C versions use the offsetof macro to compute the pointer parameter of these functions. Swift doesn't support C macros, so I'm trying to figure out what to use in its place. I've tried passing nil, like this:
glVertexAttribPointer(GLuint(positionAttribute), 4, UInt32(GL_FLOAT), UInt8(GL_FALSE), GLsizei(sizeof(Vertex)), nil)
glVertexAttribPointer(GLuint(colorAttribute), 4, UInt32(GL_FLOAT), UInt8(GL_FALSE), GLsizei(sizeof(Vertex)), nil)
But if I do that, no offset is taken into account, so the position data is used for both the position and color attributes.
I found this Stack Overflow answer, which suggests using withUnsafePointer, and tried it out like this:
withUnsafePointer(&vertexData.v1.position) { ptr in
    glVertexAttribPointer(GLuint(positionAttribute), 4, UInt32(GL_FLOAT), UInt8(GL_FALSE), GLsizei(sizeof(Vertex)), ptr)
}
withUnsafePointer(&vertexData.v1.color) { ptr in
    glVertexAttribPointer(GLuint(colorAttribute), 4, UInt32(GL_FLOAT), UInt8(GL_FALSE), GLsizei(sizeof(Vertex)), ptr)
}
But that crashes the whole display, requiring a forced shutdown and a reboot.
I'm not sure what to try next. The complete code that I'm working on is available here.
EDIT:
I've also tried taking a pointer to the first data point and advancing it by 4 GLfloats, like this:
let ptr = UnsafePointer<GLfloat>([vertexData.v1.position.x])
glVertexAttribPointer(GLuint(positionAttribute), 4, UInt32(GL_FLOAT), UInt8(GL_FALSE), GLsizei(sizeof(Vertex)), ptr)
glVertexAttribPointer(GLuint(colorAttribute), 4, UInt32(GL_FLOAT), UInt8(GL_FALSE), GLsizei(sizeof(Vertex)), ptr.advancedBy(4))
The display doesn't crash, but nothing is drawn to the screen at all.
To replicate the offsetof functionality you just need to know at which byte offset each field lies within your structure.
In your case, assuming your struct is tightly packed, the first call should receive 0 and the second 4 * sizeof(GLfloat), because that's the size of the first member. I'm not sure how to extract this offset directly from the structure, especially since you're using tuples.
To illustrate, your structure:
struct Vertex {
    var position: (x: GLfloat, y: GLfloat, z: GLfloat, w: GLfloat)
    var color: (r: GLfloat, g: GLfloat, b: GLfloat, a: GLfloat)
}
is most probably laid out like that:
GLfloat // <-- structure start // <-- position
GLfloat
GLfloat
GLfloat
GLfloat // <-- color
GLfloat
GLfloat
GLfloat
Hence position lies at offset 0, and color at offset 4 * sizeof(GLfloat) = 16.
To create a pointer with the value 16, this answer looks relevant:
let ptr = UnsafePointer<()>() + 16
So I suppose this should work as well:
let ptr = UnsafePointer<()>() + sizeof(GLfloat) * 4
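On newer Swift there is also a direct equivalent of offsetof: MemoryLayout.offset(of:) (Swift 4.2+). A minimal sketch, assuming the tuple-backed Vertex and the positionAttribute/colorAttribute handles from the question:
// MemoryLayout.offset(of:) replicates C's offsetof for stored properties.
let stride = GLsizei(MemoryLayout<Vertex>.stride)
let colorOffset = MemoryLayout<Vertex>.offset(of: \.color)! // 16 here

glVertexAttribPointer(GLuint(positionAttribute), 4, GLenum(GL_FLOAT),
                      GLboolean(GL_FALSE), stride, nil) // offset 0
glVertexAttribPointer(GLuint(colorAttribute), 4, GLenum(GL_FLOAT),
                      GLboolean(GL_FALSE), stride,
                      UnsafeRawPointer(bitPattern: colorOffset))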
I defined the Vertex structure like below:
struct Vertex {
    var x: GLfloat = 0.0
    var y: GLfloat = 0.0
    var z: GLfloat = 0.0
    var r: GLfloat = 0.0
    var g: GLfloat = 0.0
    var b: GLfloat = 0.0
    var a: GLfloat = 1.0

    init(_ x: GLfloat, _ y: GLfloat, _ z: GLfloat, _ r: GLfloat = 0.0, _ g: GLfloat = 0.0, _ b: GLfloat = 0.0, _ a: GLfloat = 1.0) {
        self.x = x
        self.y = y
        self.z = z
        self.r = r
        self.g = g
        self.b = b
        self.a = a
    }
}
and then I set the vertices
let vertices: [Vertex] = [
    Vertex( 1.0, -1.0, 0, 1.0, 0.0, 0.0, 1.0),
    Vertex( 1.0,  1.0, 0, 0.0, 1.0, 0.0, 1.0),
    Vertex(-1.0,  1.0, 0, 0.0, 0.0, 1.0, 1.0),
    Vertex(-1.0, -1.0, 0, 1.0, 1.0, 0.0, 1.0)
]

let indices: [GLubyte] = [
    0, 1, 2,
    2, 3, 0
]
I use a GLKViewController, so my glkView(view:drawInRect:) function looks like below:
override func glkView(view: GLKView, drawInRect rect: CGRect) {
    glClearColor(1.0, 0.0, 0.0, 1.0)
    glClear(GLbitfield(GL_COLOR_BUFFER_BIT))

    shader.prepareToDraw()

    glEnableVertexAttribArray(VertexAttributes.Position.rawValue)
    glVertexAttribPointer(
        VertexAttributes.Position.rawValue,
        3,
        GLenum(GL_FLOAT),
        GLboolean(GL_FALSE),
        GLsizei(sizeof(Vertex)), BUFFER_OFFSET(0))

    glEnableVertexAttribArray(VertexAttributes.Color.rawValue)
    glVertexAttribPointer(
        VertexAttributes.Color.rawValue,
        4,
        GLenum(GL_FLOAT),
        GLboolean(GL_FALSE),
        GLsizei(sizeof(Vertex)), BUFFER_OFFSET(3 * sizeof(GLfloat))) // x, y, z | r, g, b, a :: offset is 3 * sizeof(GLfloat)

    glBindBuffer(GLenum(GL_ARRAY_BUFFER), vertexBuffer)
    glBindBuffer(GLenum(GL_ELEMENT_ARRAY_BUFFER), indexBuffer)
    glDrawElements(GLenum(GL_TRIANGLES), GLsizei(indices.count), GLenum(GL_UNSIGNED_BYTE), nil)

    glDisableVertexAttribArray(VertexAttributes.Position.rawValue)
}
The BUFFER_OFFSET function is defined like below:
func BUFFER_OFFSET(n: Int) -> UnsafePointer<Void> {
    let ptr: UnsafePointer<Void> = nil
    return ptr + n
}
With that, I get the expected result.
Great, thanks @Bartek!
I have a class that has two similar-looking initializers:
public init(
contentInsets: MyInsets = .init(vertical: .level0, horizontal: .level6),
midToLeftSpacing: CGFloat = 0,
rightToMidSpacing: CGFloat = 0
)
public init(
contentInsets: MyInsets = .init(vertical: .level0, horizontal: .level6),
midToLeftSpacing: MyInsetLevel = .level0,
rightToMidSpacing: MyInsetLevel = .level0
)
Now I've run into a problem: it looks like when I try to call the initializer, the compiler can't figure out which one to choose:
init(
vertical: .level0,
left: .level22,
right: .level0
)
This gives the error Ambiguous use of 'init'.
It could easily be fixed by modifying an initializer or adding another argument with a default value, but that doesn't look right. Is there any other way to disambiguate which init is called?
Your example doesn't match your code (there is no init(vertical:left:right:) defined), so I'm going to generalize this problem a bit with a full example that avoids the extra types. I'll just use Double and Int here.
Imagine a struct X that internally stores three Doubles:
struct X {
    var x: Double
    var y: Double
    var z: Double

    init(x: Double = 0, y: Double = 0, z: Double = 0) {
        self.x = x
        self.y = y
        self.z = z
    }
}
You also want to be able to pass three Ints instead, but all three must either be Doubles or Ints, and you want default values. Every use of default values is just a shorthand for an extra method with fewer parameters. To avoid conflicts, you need to include at least one of the parameters as non-default:
extension X {
    init(x: Int, y: Int = 0, z: Int = 0) {
        self.init(x: Double(x), y: Double(y), z: Double(z))
    }
    init(y: Int, z: Int = 0) { self.init(x: 0, y: y, z: z) }
    init(z: Int) { self.init(x: 0, y: 0, z: z) }
}
Now, every combination is legal, but there are no ambiguities.
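As a quick check, these hypothetical call sites all resolve cleanly:
let a = X(x: 1.5, y: 2.0, z: 3.0) // Double initializer
let b = X(x: 1, y: 2)             // Int overload, z defaults to 0
let c = X(y: 3)                   // Int overload, x pinned to 0
let d = X(z: 4)                   // Int overload, x and y pinned to 0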
This is nice when there's just one extra way to call the init. But maybe you want there to be many. Then a protocol is useful. For example, this approach allows either Int or Double for every parameter:
protocol XParameter {
    var xParameterValue: Double { get }
}
extension Double: XParameter {
    var xParameterValue: Double { self }
}
extension Int: XParameter {
    var xParameterValue: Double { Double(self) }
}

struct X {
    var x: Double
    var y: Double
    var z: Double

    init(x: some XParameter = 0, y: some XParameter = 0, z: some XParameter = 0) {
        self.x = x.xParameterValue
        self.y = y.xParameterValue
        self.z = z.xParameterValue
    }
}
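With this version the argument types can even be mixed in a single call; a short usage sketch:
let p = X(x: 1, y: 2.5) // Int and Double together
let q = X(z: 3)         // the literal default 0 fills in x and y
print(p.x, p.y, p.z)    // 1.0 2.5 0.0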
I'm trying to figure out how to call variadic C functions that write to pointers from Swift, such as vsscanf, but I don't understand how to actually construct the list of pointers to Swift variables.
I figure that if I have a string, I can get an UnsafePointer<CChar> and call vsscanf on it, but... how do I tell it where to actually write the data? How do I construct the CVaListPointer to pass to vsscanf?
var a: Int
var b: Float
"(5, 3.14)".withCString { buffer in
    let r = vsscanf(buffer, "(%d, %f)", /* how do I put a and b here? */)
}
Basically, doing the same thing as here (C):
#include <stdio.h>
#include <stdarg.h>

int parse(const char *buffer, char *format, ...)
{
    va_list args;
    va_start(args, format);
    int result = vsscanf(buffer, format, args);
    va_end(args);
    return result;
}

int main(int argc, char const *argv[])
{
    int a;
    float b;
    char s[] = "(5, 3.14)";
    int r = parse(s, "(%d, %f)", &a, &b);
    printf("a: %d, b: %f\n", a, b);
    // "a: 5, b: 3.140000"
    return 0;
}
UPDATE: Thanks to the answers so far, I feel like I understand how this works a bit better, but what I'm still struggling with is populating the CVaListPointer in an easier way. Here's an example of parsing a triplet of Doubles out of a string:
func parseVec3(s: String) -> (Double, Double, Double)? {
    var x: Double = 0
    var y: Double = 0
    var z: Double = 0
    let r = s.withCString { buffer in
        withUnsafeMutablePointer(to: &x) { ptrx in
            withUnsafeMutablePointer(to: &y) { ptry in
                withUnsafeMutablePointer(to: &z) { ptrz in
                    withVaList([ptrx, ptry, ptrz]) { va in
                        return vsscanf(buffer, "(%lf %lf %lf)", va)
                    }
                }
            }
        }
    }
    return r == 3 ? (x, y, z) : nil
}

if let v = parseVec3(s: "(1 2 3)") {
    print(v)
}
Now, this does work. But my problem is that I'm parsing a file where the bulk of the lines (thousands upon thousands of them) are six groups of triplets of numbers. The towering structure of withUnsafeMutablePointer would look downright ridiculous. I'm sure I could parse it using some more Swift-native approach (or just regex) but I was hoping to just use vsscanf because parsing this file in C is outrageously simple:
int main(int argc, char const *argv[])
{
    char s[] = "(1 2 3) (5 9 1) (0 5 8)";
    Vec3 a, b, c = {0, 0, 0};
    sscanf(s,
           "(%f %f %f) (%f %f %f) (%f %f %f)",
           &(a.x), &(a.y), &(a.z),
           &(b.x), &(b.y), &(b.z),
           &(c.x), &(c.y), &(c.z)
    );
    printf("a: (x: %f, y: %f, z: %f)\n", a.x, a.y, a.z);
    printf("b: (x: %f, y: %f, z: %f)\n", b.x, b.y, b.z);
    printf("c: (x: %f, y: %f, z: %f)\n", c.x, c.y, c.z);
    return 0;
}
Doing this with the withUnsafeMutablePointer approach in Swift as above would result in 11 with<Whatever> scopes, and that's only half of the floats parsed...
I figure I should be able to do something like this, but I can't figure out how to get the pointer offset to the other struct members:
func parseVec3_3(s: String) -> Vector3? {
    var output = Vector3(x: 0, y: 0, z: 0)
    var r: CInt = 0
    s.withCString { buffer in
        withUnsafeMutablePointer(to: &output) { ptr in
            withVaList([ptr, /* offset pointer here */]) { va in
                r = vsscanf(buffer, "(%lf %lf %lf)", va)
            }
        }
    }
    return r == 3 ? output : nil
}
Like this:
var a: Int = 0
var b: Float = 0

withUnsafePointer(to: &a) { pointerToA in
    withUnsafePointer(to: &b) { pointerToB in
        withVaList([pointerToA, pointerToB]) { va_list in
            "(5, 3.14)".withCString { buffer in
                let r = vsscanf(buffer, "(%d, %f)", va_list)
            }
        }
    }
}
print(a)
print(b)
outputs
5
3.14
Update
The answer for the OP's edit, including the question about Vector3(x: 0, y: 0, z: 0), becomes:
public struct Vector3: Equatable {
    public var x: CGFloat
    public var y: CGFloat
    public var z: CGFloat

    public init(x: CGFloat, y: CGFloat, z: CGFloat) {
        self.x = x
        self.y = y
        self.z = z
    }
}

func parseVec3_3(s: String) -> Vector3? {
    var output = Vector3(x: 0, y: 0, z: 0)
    var r: CInt = 0
    s.withCString { buffer in
        withUnsafePointer(to: &output) { (outputPointer: UnsafePointer<Vector3>) in
            outputPointer.withMemoryRebound(to: CGFloat.self, capacity: 3) {
                withVaList([OpaquePointer($0), OpaquePointer($0.advanced(by: 1)), OpaquePointer($0.advanced(by: 2))]) { va in
                    r = vsscanf(buffer, "(%lf %lf %lf)", va)
                }
            }
        }
    }
    return r == 3 ? output : nil
}

if let vector: Vector3 = parseVec3_3(s: "(1.0 2.0 3.0)") {
    print(vector)
}
outputs
Vector3(x: 1.0, y: 2.0, z: 3.0)
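For the six-triplet case, the same rebinding trick can build the argument list programmatically instead of spelling out every OpaquePointer. A sketch under the same assumptions (a flat struct of CGFloats; parseVec3_4 is a hypothetical name, and it uses withUnsafeMutablePointer since the parse writes into the struct):
func parseVec3_4(s: String) -> Vector3? {
    var output = Vector3(x: 0, y: 0, z: 0)
    var r: CInt = 0
    s.withCString { buffer in
        withUnsafeMutablePointer(to: &output) { ptr in
            ptr.withMemoryRebound(to: CGFloat.self, capacity: 3) { base in
                // One CVarArg per CGFloat field; scales to 18 fields as easily.
                let args: [CVarArg] = (0..<3).map { OpaquePointer(base + $0) }
                withVaList(args) { va in
                    r = vsscanf(buffer, "(%lf %lf %lf)", va)
                }
            }
        }
    }
    return r == 3 ? output : nil
}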
From the documentation on CVarArgs:
To create a wrapper for the c_api function, write a function that takes CVarArg arguments, and then call the imported C function using the withVaList(_:_:) function.
Swift only imports C variadic functions that use a va_list for their arguments. C functions that use the ... syntax for variadic arguments are not imported, and therefore can’t be called using CVarArg arguments.
Your wrapper function could look like:
func vsscanfSwiftWrapper(
    buffer: UnsafePointer<CChar>,
    format: UnsafePointer<CChar>,
    _ arguments: CVarArg...
) -> CInt {
    withVaList(arguments) { vaList in
        vsscanf(buffer, format, vaList)
    }
}
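A usage sketch for this wrapper; note you still need withUnsafeMutablePointer to obtain the pointers, because & cannot be applied to a CVarArg... parameter:
var a: CInt = 0
var b: Float = 0
let r = withUnsafeMutablePointer(to: &a) { pa in
    withUnsafeMutablePointer(to: &b) { pb in
        "(5, 3.14)".withCString { buffer in
            vsscanfSwiftWrapper(buffer: buffer, format: "(%d, %f)", pa, pb)
        }
    }
}
print(r, a, b) // 2 5 3.14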
I am not sure where I got this piece of code, but here I have some code to get a series of Vertex values (or whatever) out of Data in Swift.
struct Vertex {
    var x, y, z, w: Float16
    var r, g, b, a: Float16
}

extension Data {
    func elements<T>() -> [T] {
        return withUnsafeBytes {
            Array(UnsafeBufferPointer<T>(start: $0, count: count / MemoryLayout<T>.stride))
        }
    }
}
It works fine for me, but I get this warning. I've spent some time on it but cannot figure it out, so could someone please help me out?
'withUnsafeBytes' is deprecated: use `withUnsafeBytes<R>(_: (UnsafeRawBufferPointer) throws -> R) rethrows -> R` instead
By the way, I am trying to save and load a large amount of data this way, paired with the following piece of code.
extension Array {
    var data: Data {
        var value = self
        return NSData(bytes: &value, length: MemoryLayout<Element>.stride * self.count) as Data
    }
}
Thank you,
EDIT
Here is the code for what I'd like to do. It works fine, but I'd like to get rid of the warning...
struct Vertex: Equatable {
    var x, y, z, w: Float16
    var r, g, b, a: Float16

    // assumption: no precision error
    static func == (lhs: Self, rhs: Self) -> Bool {
        return lhs.x == rhs.x && lhs.y == rhs.y && lhs.z == rhs.z && lhs.w == rhs.w &&
            lhs.r == rhs.r && lhs.g == rhs.g && lhs.b == rhs.b && lhs.a == rhs.a
    }
}
let v0 = Vertex(x: 1, y: 2, z: 3, w: 4, r: 0, g: 0.25, b: 0.5, a: 0.75)
let v1 = Vertex(x: 5, y: 6, z: 7, w: 8, r: 0.2, g: 0.4, b: 0.6, a: 0.8)
let v2 = Vertex(x: 9, y: 0, z: 1, w: 2, r: 0.5, g: 0.75, b: 0.0, a: 1.0)
let original: [Vertex] = [v0, v1, v2]
let data = original.data
print(data as NSData)
let restored: [Vertex] = data.elements()
let w0 = restored[0]
let w1 = restored[1]
let w2 = restored[2]
print(v0 == w0)
print(v1 == w1)
print(v2 == w2)
First make your struct conform to ContiguousBytes:
extension Vertex: ContiguousBytes {
    func withUnsafeBytes<R>(_ body: (UnsafeRawBufferPointer) throws -> R) rethrows -> R {
        try Swift.withUnsafeBytes(of: self) { try body($0) }
    }
}
Then create a custom initializer on ContiguousBytes to allow initializing any type that conforms to it with contiguous bytes:
extension ContiguousBytes {
    init<T: ContiguousBytes>(_ bytes: T) {
        self = bytes.withUnsafeBytes { $0.load(as: Self.self) }
    }
}
To extract the bytes/data from the types that conform to it:
extension ContiguousBytes {
    var bytes: [UInt8] { withUnsafeBytes { .init($0) } }
    var data: Data { withUnsafeBytes { .init($0) } }
}
Now you can simply make the magic happen.
Playground testing:
struct Vertex {
    var x, y, z, w: Float16
    var r, g, b, a: Float16
}
let vertex = Vertex(x: 1.2, y: 2.3, z: 3.4, w: 4.5, r: 0.5, g: 0.6, b: 0.7, a: 1)
let bytes = vertex.bytes // [205, 60, 154, 64, 205, 66, 128, 68, 0, 56, 205, 56, 154, 57, 0, 60]
let loadedVertex = Vertex(bytes)
print(loadedVertex) // Vertex(x: 1.2, y: 2.3, z: 3.4, w: 4.5, r: 0.5, g: 0.6, b: 0.7, a: 1.0)
edit/update:
to convert your bytes to a collection of vertices:
extension Array {
    var data: Data {
        var value = self
        return .init(bytes: &value, count: MemoryLayout<Element>.stride * count)
    }
}

extension ContiguousBytes {
    func objects<T>() -> [T] { withUnsafeBytes { .init($0.bindMemory(to: T.self)) } }
    var vertices: [Vertex] { objects() }
}
let vertex1 = Vertex(x: 1.2, y: 2.3, z: 3.4, w: 4.5, r: 0.5, g: 0.6, b: 0.7, a: 1)
let vertex2 = Vertex(x: 2.3, y: 3.4, z: 4.5, w: 5.6, r: 1, g: 0.8, b: 1, a: 1)
let data = [vertex1, vertex2].data
let loadedVertices = data.vertices
print(loadedVertices) // [__lldb_expr_8.Vertex(x: 1.2, y: 2.3, z: 3.4, w: 4.5, r: 0.5, g: 0.6, b: 0.7, a: 1.0), __lldb_expr_8.Vertex(x: 2.3, y: 3.4, z: 4.5, w: 5.6, r: 1.0, g: 0.8, b: 1.0, a: 1.0)]
To mirror what is being done in .data, I believe you just want to copy the bytes directly into the Array. This is all pretty dangerous, because it makes some strong assumptions about memory management (if T, or any property of T, were a class or included hidden classes internally, the reference counts could be wrong). So I'd be really uncomfortable making this a generic extension like this. But for Float16 it's probably fine, assuming you never hit an endianness issue (or an architecture that pads differently).
But to just do what you're doing, I'd recommend:
extension Data {
    func elements<T>() -> [T] {
        Array(unsafeUninitializedCapacity: count / MemoryLayout<T>.stride) { buffer, initializedCount in
            copyBytes(to: buffer)
            initializedCount = buffer.count
        }
    }
}
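Wiring this into the round-trip test from the question (same Vertex, original, and Array.data as above) would look like:
let restored: [Vertex] = original.data.elements()
print(restored == original) // true, given the Equatable conformance above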
I have something like the following
struct ht: Hashable {
    var x: Int
    var y: Int
    var z: Int

    // Edit: added the following after Ahmad's comment
    var hashValue: Int { return (x + y + z) }

    static func == (lhs: ht, rhs: ht) -> Bool {
        return lhs.x == rhs.x
            && lhs.y == rhs.y
            && lhs.z == rhs.z
    }
}
let defaultX = 4
let searchForX = 7
//let searchForX = 1000000
var ss : Set<ht> = [ ht(x:1,y:2,z:3), ht(x:4,y:5,z:6), ht(x:7, y:8, z:100), ht(x:9, y:19, z:12)]
Does Swift have LINQ-like functionality where I can search the set ss to retrieve the struct that matches searchForX = 7 as shown above, returning (7, 8, 100)? If searchForX = 1000000, I want it to fail over to returning the value where defaultX = 4, i.e. (4, 5, 6).
My current code uses loops. In C# you can use LINQ and specify .FirstOrDefault(...).
Is this possible in Swift?
Thank you
Swift isn't really my thing, but the nil-coalescing operator in code similar to the line below may move you along.
var answer = ss.first(where: { $0.x == searchForX }) ?? ss.first(where: { $0.x == defaultX })
So the quick answer to your question is
let element = ss.first { $0.x == searchForX } ?? ss.first { $0.x == defaultX }
This can be turned into a simple extension:
extension Sequence {
    func first(_ criteria: (Element) -> Bool, orDefault: (Element) -> Bool) -> Element? {
        return first(where: criteria) ?? first(where: orDefault)
    }
}
let element = ss.first({ $0.x == searchForX }, orDefault: { $0.x == defaultX })
By making this an extension on Sequence, it will work for Arrays and anything that implements Sequence.
If you are concerned about performance, you can roll your own single-pass version:
extension Sequence {
    func first(_ criteria: (Element) -> Bool, orDefault: (Element) -> Bool) -> Element? {
        var defaultElement: Element?
        for element in self {
            if criteria(element) { return element }
            if defaultElement == nil && orDefault(element) { defaultElement = element }
        }
        return defaultElement
    }
}
You could implement your own custom method to do the desired functionality.
First of all, if you want to declare a set of custom structs, note that your struct must conform to the Hashable protocol (which means that it implicitly must also be Equatable); thus it could be implemented as follows:
struct MyStruct: Hashable, Equatable {
    var x: Int
    var y: Int
    var z: Int

    // Hashable
    var hashValue: Int

    // Equatable
    static func == (lhs: MyStruct, rhs: MyStruct) -> Bool {
        return lhs.x == rhs.x
    }
}
So, you could declare a set of MyStruct.
Probably this is not the right way to assign a value to the hashValue, but for the purpose of answering the question I will assign dummy values to them.
For implementing the method for achieving the desired output, I would recommend to declare it in a set extension (if you would let it also functional for arrays, you could declare it as a Collection extension instead), constrained by the set element type:
extension Set where Element == MyStruct {
    func firstOrDefault(for expectedXValue: Int, defaultXValue: Int) -> MyStruct? {
        // expected
        if let expected = first(where: { $0.x == expectedXValue }) {
            return expected
        }
        // default
        if let defaultValue = first(where: { $0.x == defaultXValue }) {
            return defaultValue
        }
        // neither
        return nil
    }
}
Note that if there is more than one matching expected or default x value in the set, it returns the first one matched.
You'll notice that the core of firstOrDefault is the first(where:) method (which is close to what you're looking for), with a little logic to fall back to the first default value instead.
Output:
In case of there is an expected object with x = 7:
let mySet: Set<MyStruct> = [MyStruct(x: 1, y: 2, z: 3, hashValue: 101),
MyStruct(x: 4, y: 5, z: 6, hashValue: 102),
MyStruct(x: 7, y: 8, z: 100, hashValue: 103),
MyStruct(x: 9, y: 19, z: 12, hashValue: 104)]
let myObject = mySet.firstOrDefault(for: 7, defaultXValue: 4)
dump(myObject)
/*
▿ Optional(__lldb_expr_187.MyStruct(x: 7, y: 8, z: 100, hashValue: 103))
  ▿ some: __lldb_expr_187.MyStruct
    - x: 7
    - y: 8
    - z: 100
    - hashValue: 103
*/
In case of there is no expected value and the default x = 4:
let mySet2: Set<MyStruct> = [MyStruct(x: 1, y: 2, z: 3, hashValue: 101),
MyStruct(x: 4, y: 5, z: 6, hashValue: 102),
MyStruct(x: 1000, y: 8, z: 100, hashValue: 103),
MyStruct(x: 9, y: 19, z: 12, hashValue: 104)]
let myObject2 = mySet2.firstOrDefault(for: 7, defaultXValue: 4)
dump(myObject2)
/*
▿ Optional(__lldb_expr_249.MyStruct(x: 4, y: 5, z: 6, hashValue: 102))
▿ some: __lldb_expr_249.MyStruct
- x: 4
- y: 5
- z: 6
- hashValue: 102
*/
In case of there is no expected and no default value:
let mySet3: Set<MyStruct> = [MyStruct(x: 1, y: 2, z: 3, hashValue: 101),
MyStruct(x: 1000, y: 5, z: 6, hashValue: 102),
MyStruct(x: 1000, y: 8, z: 100, hashValue: 103),
MyStruct(x: 9, y: 19, z: 12, hashValue: 104)]
let myObject3 = mySet3.firstOrDefault(for: 7, defaultXValue: 4)
dump(myObject3) // - nil
I want to do set operations on co-ordinate pair elements from an x-y grid.
E.g. {(0,0),(1,4),(1,5),(2,3)} union with {(2,3),(1,4),(2,6)} = {(0,0),(1,4),(1,5),(2,3),(2,6)}
Unfortunately I can't work out a way of inserting tuples into Swift's Set commands as it says that they do not conform to the 'hashable' protocol.
Error: type '(Int, Int)' does not conform to protocol 'Hashable'
I believe I've got a work around but it involves a lot of code. Is there a simple way that I'm missing before I hit the grindstone?
Rather than using tuples to represent points, use the built-in type CGPoint. You can make CGPoint Hashable with an extension:
import UIKit
extension CGPoint: Hashable {
    public var hashValue: Int {
        return self.x.hashValue << sizeof(CGFloat) ^ self.y.hashValue
    }
}

// Hashable requires Equatable, so define the equality function for CGPoints.
public func ==(lhs: CGPoint, rhs: CGPoint) -> Bool {
    return CGPointEqualToPoint(lhs, rhs)
}
Now that CGPoint is Hashable, you can use it in sets. For example:
let point1 = CGPoint(x: 0, y: 1)
let point2 = CGPoint(x: 0, y: 2)
let point3 = CGPoint(x: 1, y: 1)
let point4 = CGPoint(x: 3, y: 3)
let point5 = CGPoint(x: 3, y: 3) // Intentionally the same as point4 to see the effect in union and difference.
let set1 = Set([point1, point2 , point5])
let set2 = Set([point4, point3])
let union = set1.union(set2) // -> {{x 0 y 2}, {x 3 y 3}, {x 0 y 1}, {x 1 y 1}}
let intersection = set1.intersect(set2) // -> {{x 3 y 3}}
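This answer predates Swift 4.1; on modern Swift, Hashable is implemented via hash(into:) (and CGPoint already provides ==), so a present-day sketch of the conformance would be:
// Swift 4.2+ sketch: Hashable via hash(into:) instead of hashValue.
extension CGPoint: Hashable {
    public func hash(into hasher: inout Hasher) {
        hasher.combine(x)
        hasher.combine(y)
    }
}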
You could make a struct as a Hashable type:
struct Point: Hashable {
    let x: Int
    let y: Int
}
Now that you have a hashable tuple, normal Set operations can be used:
let set1 = Set([
    Point(x: 0, y: 0),
    Point(x: 1, y: 4),
    Point(x: 1, y: 5),
    Point(x: 2, y: 3)
])
let set2 = Set([
    Point(x: 2, y: 3),
    Point(x: 1, y: 4),
    Point(x: 2, y: 6)
])

let setUnion = set1.union(set2)
/*
setUnion = {
    Point(x: 1, y: 5),
    Point(x: 0, y: 0),
    Point(x: 1, y: 4),
    Point(x: 2, y: 3),
    Point(x: 2, y: 6)
}
*/
Here you go:
class Pair {
    var x: Int
    var y: Int

    init(x: Int, y: Int) {
        self.x = x
        self.y = y
    }

    func isExisted(inPairs pairs: [Pair]) -> Bool {
        for p in pairs {
            if p.y == self.y && p.x == self.x {
                return true
            }
        }
        return false
    }

    static func union(pairs1: [Pair], pairs2: [Pair]) -> [Pair] {
        var pairsFinal = [Pair]()
        for p in pairs1 {
            pairsFinal.append(p)
        }
        for p in pairs2 {
            if !p.isExisted(inPairs: pairsFinal) {
                pairsFinal.append(p)
            }
        }
        return pairsFinal
    }
}
let pair1 = Pair(x: 4, y: 7)
let pair2 = Pair(x: 5, y: 2)
let pair3 = Pair(x: 4, y: 7)
let pair4 = Pair(x: 3, y: 9)

let pairs1 = [pair1, pair2]
let pairs2 = [pair3, pair4]

let f = Pair.union(pairs1, pairs2: pairs2)
And this is the result of the union: the three distinct pairs (4, 7), (5, 2), and (3, 9).