How to create a matrix of CAShapeLayers? - swift

I have this code:
var triangles: [[[CAShapeLayer]]] = Array(repeating: Array(repeating: Array(repeating: 0, count: 2), count: 15), count: 15);
But it generates a "Cannot convert value of type..." compilation error.
How can I solve that? I want to access my CAShapeLayers like this:
triangles[1][2][1].fillColor = UIColor(red: 40/255, green: 73/255, blue: 80/255, alpha: 1).cgColor;

Use optionals.
var triangles: [[[CAShapeLayer?]]] = Array(repeating: Array(repeating: Array(repeating: nil, count: 2), count: 15), count: 15)
Now there's a nil instead of a 0, which is what I think you were hinting at. But every triangles[x][y][z] is now an optional type you'll have to safely unwrap.
So now you have to do something like triangles[x][y][z] = CAShapeLayer() before you do anything to that object.
Edit for correction. Thanks @OOPer.
I thought about it some more, and realized I didn't really answer your question.
So you could use for loops to initialize everything up front (which would be a pain), or you could do something like this every time you access an index:
if triangles[x][y][z] == nil
{
    triangles[x][y][z] = CAShapeLayer()
}
let bloop = triangles[x][y][z]!
bloop.fillColor = UIColor(...
Then you could pull it out into a separate method so the call site becomes a one-liner. Like:
func tri(at x: Int, _ y: Int, _ z: Int) -> CAShapeLayer
{
    if triangles[x][y][z] == nil
    {
        triangles[x][y][z] = CAShapeLayer()
    }
    return triangles[x][y][z]!
}
Then when using it:
tri(at: 1, 2, 1).fillColor = ...
Of course, you should pull triangles out and make it a property of the class you're in, or you can include it in the parameter list of that one-liner method, as sketched below.
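A hypothetical sketch of the parameter-list variant (the inout parameter and the tri(in:at:_:_:) name are mine, not from the answer above):
func tri(in triangles: inout [[[CAShapeLayer?]]], at x: Int, _ y: Int, _ z: Int) -> CAShapeLayer
{
    if triangles[x][y][z] == nil
    {
        triangles[x][y][z] = CAShapeLayer()
    }
    return triangles[x][y][z]!
}
// Call site: tri(in: &triangles, at: 1, 2, 1).fillColor = ...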

All that nesting makes your code hard to understand, and Array(repeating:count:) can't do what you want anyway: it evaluates the repeated value once, so with a reference type like CAShapeLayer every slot would share a single layer instance.
func newGrid() -> [[[CAShapeLayer]]] {
    func newStack() -> [CAShapeLayer] {
        return (0 ..< 2).map({ _ in CAShapeLayer() })
    }
    func newRow() -> [[CAShapeLayer]] {
        return (0 ..< 15).map({ _ in newStack() })
    }
    return (0 ..< 15).map({ _ in newRow() })
}
var triangles = newGrid()
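With the grid built this way, the access pattern from the question works as-is:
triangles[1][2][1].fillColor = UIColor(red: 40/255, green: 73/255, blue: 80/255, alpha: 1).cgColor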

You cannot use "0" as the repeating value, it will be inferred to be type [[[Int]]]. Just replace "0" with "CAShapeLayer()"

Related

How to extract UnsafePointer<CGFloat> from UnsafePointer<CGPoint> - Swift

I’m playing around with pointers in Swift to learn how they work.
For instance, this code starts with a CGPoint array, creates an UnsafePointer, and then extracts all the x values into a CGFloat array:
import Foundation
let points = [CGPoint(x:1.2, y:3.33), CGPoint(x:1.5, y:1.21), CGPoint(x:1.48, y:3.97)]
print(points)
let ptr = UnsafePointer(points)
print(ptr)
func xValues(buffer: UnsafePointer<CGPoint>, count: Int) -> [CGFloat]? {
    return UnsafeBufferPointer(start: buffer, count: count).map { $0.x }
}
let x = xValues(buffer: ptr, count: points.count)
print(x)
And the expected output is:
[Foundation.CGPoint(x: 1.2, y: 3.33), Foundation.CGPoint(x: 1.5, y: 1.21), Foundation.CGPoint(x: 1.48, y: 3.97)]
0x0000556d6b818aa0
Optional([1.2, 1.5, 1.48])
Now I’d like to have the xValues function return directly UnsafePointer<CGFloat>, instead of going through [CGFloat].
How do I do that, is that possible?
It is unsafe to hand out pointers like that. As mentioned in the comments, you should use the withUnsafeBufferPointer method to access the underlying buffer:
let points = [
    CGPoint(x: 1.2, y: 3.33),
    CGPoint(x: 1.5, y: 1.21),
    CGPoint(x: 1.48, y: 3.97)
]
let xValues = points.withUnsafeBufferPointer { buffer in
    return buffer.map { $0.x }
}
If you need a pointer to the array of CGFloat, just use the same method on the result:
xValues.withUnsafeBufferPointer { buffer in
    // Do things with UnsafeBufferPointer<CGFloat>
}
Edit
Here is a working example:
let points = [
    CGPoint(x: 1.2, y: 3.33),
    CGPoint(x: 1.5, y: 1.21),
    CGPoint(x: 1.48, y: 3.97)
]
// Create, initialize, and defer deallocation
let ptr = UnsafeMutablePointer<CGFloat>.allocate(capacity: points.count)
ptr.initialize(repeating: 0.0, count: points.count)
defer {
    ptr.deallocate()
}
// Populate the pointer with the x values
points.withUnsafeBufferPointer { buffer in
    for i in 0..<buffer.count {
        ptr.advanced(by: i).pointee = buffer[i].x
    }
}
// Do things with UnsafeMutablePointer<CGFloat>, for instance:
let buffer = UnsafeBufferPointer(start: ptr, count: points.count)
for (index, value) in buffer.enumerated() {
    print("index: \(index), value: \(value)")
}
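An alternative sketch of the same idea (assuming the same Swift version as the example above): map the x values first, then copy them into the freshly allocated memory in one call with initialize(from:count:), which both copies the elements and marks the memory initialized:
let xs = points.map { $0.x }
let xPtr = UnsafeMutablePointer<CGFloat>.allocate(capacity: xs.count)
defer { xPtr.deallocate() }
xs.withUnsafeBufferPointer { src in
    // xs is non-empty here, so baseAddress is non-nil
    xPtr.initialize(from: src.baseAddress!, count: src.count)
}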

Using an SKAction to call a function with parameters

I am trying to call a function that takes one input parameter via an SKAction.
When I declare the SKAction, I get an error: "Cannot convert value of type '()' to expected argument type '() -> Void'"
self.run(SKAction.repeat(SKAction.sequence([SKAction.wait(forDuration: 1), SKAction.run(self.decreaseHealth(by: 5.0))]), count: 10))
func decreaseHealth(by amount: Double) {
    print(health)
    health -= amount
    print(health)
    let percentageToDecrease = amount / totalHealth
    let resizeAction = SKAction.resize(toHeight: CGFloat(600 * (1 - percentageToDecrease)), duration: 1)
    let redAmount: CGFloat = CGFloat(1.0 - health / totalHealth)
    let greenAmount: CGFloat = CGFloat(health / totalHealth)
    let recolorAction = SKAction.colorize(with: UIColor(red: redAmount, green: greenAmount, blue: CGFloat(0.0), alpha: CGFloat(1.0)), colorBlendFactor: 0, duration: 1)
    healthBar.run(SKAction.group([resizeAction, recolorAction]))
}
Usually when you get that error, it just means you need to enclose the function call inside curly braces. So:
self.run(SKAction.repeat(SKAction.sequence([SKAction.wait(forDuration: 1), SKAction.run( { self.decreaseHealth(by: 5.0) } )]), count: 10))
If you had split this into multiple lines, it would have been easier to spot.
This is because decreaseHealth(by: 5.0) is an expression of type () (the function is called on the spot), but { self.decreaseHealth(by: 5.0) } is of type () -> (), also written () -> Void.
(Closures are function types in Swift, and thus have parameters/arguments and a return type.)
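A minimal sketch of that difference (the function name here is just for illustration):
func sayHi() { print("hi") }
let justRanIt: Void = sayHi()         // sayHi() is a call; the expression has type ()
let closure: () -> Void = { sayHi() } // braces make a () -> Void without calling it
closure()                             // only now does sayHi() run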
I haven't used SpriteKit for 3 months, but something like the following should work:
func repeatMe() {
    let waitAction = SKAction.wait(forDuration: 1.0)
    let completionAction = SKAction.run {
        self.decreaseHealth(by: 5.0)
    }
    let seqAction = SKAction.sequence([waitAction, completionAction])
    let repeatAction = SKAction.repeat(seqAction, count: 10)
    run(repeatAction) // the action has to actually be run, or nothing happens
}

RGB Values doing strange things when colorizing? - Swift

func colorBall() {
    let colorize1 = SKAction.colorizeWithColor(UIColor.redColor(), colorBlendFactor: 1.0, duration: 0.1)
    let colorize2 = SKAction.colorizeWithColor(UIColor.greenColor(), colorBlendFactor: 1.0, duration: 0.1)
    let colorize3 = SKAction.colorizeWithColor(UIColor.blueColor(), colorBlendFactor: 1.0, duration: 0.1)
    let actions = [colorize1, colorize2, colorize3]
    let randomIndex = Int(arc4random_uniform(3))
    self.Ball.runAction(actions[randomIndex])
}

var colorBucket = [UIColor]()

func randomColor() -> UIColor {
    if colorBucket.isEmpty {
        fillBucket()
    }
    let randomIndex = Int(arc4random_uniform(UInt32(colorBucket.count)))
    let randomColor = colorBucket[randomIndex]
    colorBucket.removeAtIndex(randomIndex)
    return randomColor
}

func fillBucket() {
    colorBucket = [UIColor.redColor(), UIColor.greenColor(), UIColor.blueColor()]
}
When I run this code in my game and print out the color value of my ball, it sometimes prints numbers like this:
UIDeviceRGBColorSpace 1 2.98023e-08 2.98023e-08 1
Why does it do this? I just want it to say: UIDeviceRGBColorSpace 0 0 1 1 if it's blue, UIDeviceRGBColorSpace 1 0 0 1 if it's red, etc.
How can I keep those numbers from going higher than one, or much lower than one? What makes them do that in my code?
Based partially on zneak's answer, I've made this (no thrills or frills) extension to UIColor which could come in handy:
extension UIColor {
    func isVisuallyEqual(color: UIColor) -> Bool {
        let compareValues = CGColorGetComponents(color.CGColor)
        let values = CGColorGetComponents(self.CGColor)
        let count = CGColorGetNumberOfComponents(self.CGColor)
        if count != CGColorGetNumberOfComponents(color.CGColor) {
            debugPrint("color-parameter has mismatching colorSpace")
            return false
        }
        for index in 0..<count {
            if !fuzzyFloatCompares(values[index], float2: compareValues[index]) {
                return false
            }
        }
        return true
    }

    private func fuzzyFloatCompares(float1: CGFloat, float2: CGFloat) -> Bool {
        let difference = float1 - float2
        return difference >= -1/256 && difference <= 1/256
    }
}
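A quick usage sketch (Swift 2 syntax to match the extension; the component values are taken from the question's output):
let almostRed = UIColor(red: 1, green: 2.98023e-08, blue: 2.98023e-08, alpha: 1)
almostRed.isVisuallyEqual(UIColor.redColor()) // true: every component differs by less than 1/256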
2.98023e-08 is 0.0000000298023. If you look up the value 2.98023e-08 on Google or another search engine, you can find several examples of people getting that value because of rounding errors. Rounding errors occur because of how computers treat floating-point numbers.
It's probably a rounding error from the interpolation code that colorizeWithColor uses, and you get it instead of zero. For practical purposes, when talking about color components about to be displayed to an end user, I'd say that anything smaller than 1/256 can be considered to be zero.
You can test if two floating point numbers are "about equal" like this (typed on my phone, not really guaranteed to work):
func areAboutTheSame(a: Double, b: Double) -> Bool {
    let difference = a - b
    return difference < 1/256 && difference > -1/256
}

How to create 3D array of optionals w/ set size

How would I go about creating a 3-dimensional array of UInt16? with a set size where each element is by default set to nil?
My attempt was
var courseInfo = UInt16?(count:10, repeatedValue: UInt16?(count:10, repeatedValue: UInt16?(count:10, repeatedValue:nil)))
although that doesn't seem to work. Any ideas?
Your code fails because you weren't creating arrays; you were mixing them up with UInt16?s.
Let's start at the base case: how do you make a one-dimensional array?
Array<UInt16?>(count: 10, repeatedValue: nil)
What if we wanted a two-dimensional array? Well, now we are no longer initializing an Array<UInt16?>; we are initializing an Array of Arrays of UInt16?, where each sub-array is initialized with UInt16?s.
Array<Array<UInt16?>>(count:10, repeatedValue: Array<UInt16?>(count:10, repeatedValue:nil))
Repeating this for the 3-dimensional case just requires more of the same ugly nesting:
var courseInfo = Array<Array<Array<UInt16?>>>(count:10, repeatedValue: Array<Array<UInt16?>>(count:10, repeatedValue: Array<UInt16?>(count:10, repeatedValue:nil)))
I'm not sure if this is the best way to do it, or to model a 3D structure, but this is the closest thing to your code right now.
EDIT:
Martin in the comments pointed out that a neater solution is
var courseInfo : [[[UInt16?]]] = Array(count: 10, repeatedValue: Array(count : 10, repeatedValue: Array(count: 10, repeatedValue: nil)))
This works by moving the type declaration out front, making the repeatedValue: parameter unambiguous.
Build yourself an abstraction to allow:
var my3DArrayOfOptionalUInt16 = Matrix<UInt16?>(initial: nil, dimensions: 10, 10, 10)
using something like:
struct Matrix<Item> {
    var items: [Item]
    var dimensions: [Int]

    var rank: Int {
        return dimensions.count
    }

    init(initial: Item, dimensions: Int...) {
        precondition(Matrix.allPositive(dimensions))
        self.dimensions = dimensions
        self.items = [Item](count: dimensions.reduce(1, combine: *), repeatedValue: initial)
    }

    subscript (indices: Int...) -> Item {
        precondition(Matrix.validIndices(indices, dimensions))
        return items[indexFor(indices)]
    }

    func indexFor(indices: [Int]) -> Int {
        // Row-major: for indices (i, j, k) and dimensions (d0, d1, d2)
        // this computes (i * d1 + j) * d2 + k, generalized to any rank.
        return zip(indices, dimensions).reduce(0) { $0 * $1.1 + $1.0 }
    }

    static func validIndices(indices: [Int], _ dimensions: [Int]) -> Bool {
        return indices.count == dimensions.count &&
            zip(indices, dimensions).reduce(true) { $0 && $1.0 >= 0 && ($1.0 < $1.1) }
    }

    static func allPositive(values: [Int]) -> Bool {
        return values.map { $0 > 0 }.reduce(true) { $0 && $1 }
    }
}
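One note on the sketch above: the subscript is get-only. If you also want to assign through it (e.g. my3DArrayOfOptionalUInt16[1, 2, 3] = 42), a minimal read-write version would be:
subscript (indices: Int...) -> Item {
    get {
        precondition(Matrix.validIndices(indices, dimensions))
        return items[indexFor(indices)]
    }
    set {
        precondition(Matrix.validIndices(indices, dimensions))
        items[indexFor(indices)] = newValue
    }
}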

'(Int, Int)' is not identical to 'CGPoint'

I got this error: '(Int, Int)' is not identical to 'CGPoint'
How do I convert an (Int, Int) to a CGPoint?
let zigzag = [(100, 100),
              (100, 150), (150, 150),
              (150, 200)]
override func drawRect(rect: CGRect)
{
    // Get the drawing context.
    let context = UIGraphicsGetCurrentContext()
    // Create the shape (a vertical line) in the context.
    CGContextBeginPath(context)
    // Error is here:
    CGContextAddLines(context, zigzag, zigzag.count)
    // Configure the drawing environment.
    CGContextSetStrokeColorWithColor(context, UIColor.redColor().CGColor)
    // Request the system to draw.
    CGContextStrokePath(context)
}
CGContextAddLines() expects an array of CGPoint. If you already have an array of (Int, Int) tuples, then you can convert it with
let points = zigzag.map { CGPoint(x: $0.0, y: $0.1) }
An alternate way to avoid the boilerplate code required to create many instances of the same type is to make CGPoint implement ArrayLiteralConvertible, making it initializable from an array literal of CGFloat:
extension CGPoint : ArrayLiteralConvertible {
    public init(arrayLiteral elements: CGFloat...) {
        self.x = elements.count > 0 ? elements[0] : 0.0
        self.y = elements.count > 1 ? elements[1] : 0.0
    }
}
and then use it as follows:
let zigzag: [CGPoint] = [
    [100, 100],
    [100, 150],
    [150, 150],
    [150, 200]
]
A few notes:
stylistically, it doesn't look good - it would be good if literals could be used for tuples, but I am not aware of any way to do that
if an empty array is used, the CGPoint is initialized with x = 0 and y = 0
if an array with one element is used, it is initialized with y = 0
if more than 2 values are used, all the ones after the 2nd are ignored
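A short sketch of those edge cases with the extension above:
let empty: CGPoint = []        // x = 0.0, y = 0.0
let partial: CGPoint = [100]   // x = 100.0, y = 0.0
let extra: CGPoint = [1, 2, 3] // x = 1.0, y = 2.0; the third value is ignored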
If it tells you to use CGPoint, use it! A plain (number, number) is just a pair of Ints.
let zigzag = [CGPointMake(100, 100),
              CGPointMake(100, 150), CGPointMake(150, 150),
              CGPointMake(150, 200)]
Yet another:
func CGPoints(points: (x: CGFloat, y: CGFloat)...) -> [CGPoint] {
    return points.map { CGPoint(x: $0.x, y: $0.y) }
}
let zigzag = CGPoints(
    (100, 100), (100, 150), (150, 150), (150, 200)
)