Is it possible to make a comparison between several colors and then change one to its closest counterpart? (Swift)

For example, I have three colors:
let color1 = UIColor(red: 142/255, green: 165/255, blue: 94/255, alpha: 1)
let color2 = UIColor(red: 141/255, green: 114/255, blue: 96/255, alpha: 1)
let color3 = UIColor(red: 214/255, green: 194/255, blue: 149/255, alpha: 1)
Is it possible, in Swift, to identify that color1 is closest to color2 rather than color3, and then adjust color1 so that color1 == color2? If you compare color1 and color2, you can see that they are slightly different shades but still very similar.
Can I compare based on RGB values alone, or do I have to use hex values?
Thanks
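One way to approach this (a sketch, not the only possible metric): read each color's components with getRed(_:green:blue:alpha:) and treat the colors as points in RGB space; the candidate at the smallest Euclidean distance wins, and "adjusting" color1 then simply means replacing it with that winner. The 0-1 RGB components are all you need, so no hex conversion is required.

import UIKit

// Euclidean distance between two colors in RGB space.
// (A perceptual space such as CIELAB tracks human vision better,
// but plain RGB distance is enough to separate these three colors.)
func distance(_ lhs: UIColor, _ rhs: UIColor) -> CGFloat {
    var (r1, g1, b1, a1): (CGFloat, CGFloat, CGFloat, CGFloat) = (0, 0, 0, 0)
    var (r2, g2, b2, a2): (CGFloat, CGFloat, CGFloat, CGFloat) = (0, 0, 0, 0)
    guard lhs.getRed(&r1, green: &g1, blue: &b1, alpha: &a1),
          rhs.getRed(&r2, green: &g2, blue: &b2, alpha: &a2) else {
        return .greatestFiniteMagnitude // not expressible in RGB
    }
    let dr = r1 - r2, dg = g1 - g2, db = b1 - b2
    return sqrt(dr * dr + dg * dg + db * db)
}

var color1 = UIColor(red: 142/255, green: 165/255, blue: 94/255, alpha: 1)
let color2 = UIColor(red: 141/255, green: 114/255, blue: 96/255, alpha: 1)
let color3 = UIColor(red: 214/255, green: 194/255, blue: 149/255, alpha: 1)

// Snap color1 to whichever candidate is nearer.
let candidates = [color2, color3]
if let nearest = candidates.min(by: { distance(color1, $0) < distance(color1, $1) }) {
    color1 = nearest // color1 now equals color2, the closer of the two
}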

Related

Creating CGColor from RGB Value

I use the following code to set the background of a view controller:
view.wantsLayer = true
let myColor = NSColor(calibratedRed: 50, green: 50, blue: 50, alpha: 1.0)
view.layer?.backgroundColor = myColor.cgColor
But when I debug myColor, I get a different color than the one intended.
Per the documentation, anything above 1.0 is treated as 1.0. You need to divide each value by 255.
view.wantsLayer = true
let myColor = NSColor(calibratedRed: 50/255, green: 50/255, blue: 50/255, alpha: 1.0)
view.layer?.backgroundColor = myColor.cgColor
The range of each color component (RGBA) is [0, 1]. From the documentation of init(red:green:blue:alpha:):
Creates a color object with the specified red, green, blue, and alpha channel values. This method accepts extended color component values; if the red, green, blue, or alpha values are outside of the 0.0-1.0 range, they are interpreted in an extended color space.
So you can get the cgColor like this:
view.layer?.backgroundColor = NSColor(calibratedRed: 50.0/255.0, green: 50.0/255.0, blue: 50.0/255.0, alpha: 1.0).cgColor
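If you set colors from 0-255 values in several places, a small convenience initializer keeps the division by 255 in one spot. This helper is my own suggestion, not part of the thread; the parameter names are arbitrary.

import AppKit

extension NSColor {
    // Accepts the familiar 0-255 integer components and does the
    // division by 255 internally.
    convenience init(r: Int, g: Int, b: Int, alpha: CGFloat = 1.0) {
        self.init(calibratedRed: CGFloat(r) / 255.0,
                  green: CGFloat(g) / 255.0,
                  blue: CGFloat(b) / 255.0,
                  alpha: alpha)
    }
}

// Usage:
// view.layer?.backgroundColor = NSColor(r: 50, g: 50, b: 50).cgColor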

How to Convert a string into a UIColor (Swift)

I have several colors defined in my SKScene:
let color1 = UIColor(red: 1, green: 153/255, blue: 0, alpha: 1)
let color2 = UIColor(red: 74/255, green: 134/255, blue: 232/255, alpha: 1)
let color3 = UIColor(red: 0, green: 0, blue: 1, alpha: 1)
let color4 = UIColor(red: 0, green: 1, blue: 0, alpha: 1)
let color5 = UIColor(red: 153/255, green: 0, blue: 1, alpha: 1)
let color6 = UIColor(red: 1, green: 0, blue: 0, alpha: 1)
These colors correspond to tiles with different values in my game: the 1 tile goes with color1, and so on. When tiles are combined, I add their values and want to give the result a new color according to its value. In short, I want the tile's value to determine its color.
I have tried:
tile.color = UIColor(named: "color\(value)") ?? color1
But when I do this, it always falls back to the default value (color1).
How do I make the tile's value determine the tile's color?
Named UIColors are initialized from color sets in the xcassets catalog.
You can set the color depending on your value:
switch value {
case 2: tile.color = color2
case 3: tile.color = color3
case 4: tile.color = color4
case 5: tile.color = color5
case 6: tile.color = color6
default: tile.color = color1
}
Or you can create color sets in the xcassets catalog, as sketched below.
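For the asset-catalog route, a sketch (assuming you create color sets named color1 through color6 in Assets.xcassets): the questioner's original line then works as written, because UIColor(named:) can find the sets.

// Works once color sets "color1"..."color6" exist in Assets.xcassets:
tile.color = UIColor(named: "color\(value)") ?? color1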
UIColor(named:) only works when you define a color set asset in an asset catalog. Since your colors are defined in code, UIColor(named:) will always return nil here.
One solution is to put your colors into a dictionary instead of separate variables.
let colors: [String: UIColor] = [
    "color1": UIColor(red: 1, green: 153/255, blue: 0, alpha: 1),
    "color2": UIColor(red: 74/255, green: 134/255, blue: 232/255, alpha: 1),
    "color3": UIColor(red: 0, green: 0, blue: 1, alpha: 1),
    "color4": UIColor(red: 0, green: 1, blue: 0, alpha: 1),
    "color5": UIColor(red: 153/255, green: 0, blue: 1, alpha: 1),
    "color6": UIColor(red: 1, green: 0, blue: 0, alpha: 1),
]
Then you can get your color as:
tile.color = colors["color\(value)"] ?? colors["color1"]!
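A variant worth considering (my suggestion, assuming value is an Int starting at 1): since the keys are just sequential indices, a plain array avoids both the string building and the force unwrap.

let tileColors: [UIColor] = [
    UIColor(red: 1, green: 153/255, blue: 0, alpha: 1),            // value 1
    UIColor(red: 74/255, green: 134/255, blue: 232/255, alpha: 1), // value 2
    UIColor(red: 0, green: 0, blue: 1, alpha: 1),                  // value 3
    UIColor(red: 0, green: 1, blue: 0, alpha: 1),                  // value 4
    UIColor(red: 153/255, green: 0, blue: 1, alpha: 1),            // value 5
    UIColor(red: 1, green: 0, blue: 0, alpha: 1),                  // value 6
]

// Clamp so out-of-range values fall back to the nearest end of the palette.
tile.color = tileColors[min(max(value, 1), tileColors.count) - 1]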

Why do only the built-in UIColors work here? [duplicate]

This question already has answers here: UIColor not working with RGBA values (6 answers). Closed 3 years ago.
Having failed miserably at further attempts to solve this question on my own, I'm trying something I thought would work for certain:
func switchColor(data: UInt32) {
    switch data {
    case 1..<200:
        backgroundGeometry.firstMaterial?.diffuse.contents =
            UIColor(red: CGFloat(242), green: CGFloat(90), blue: CGFloat(90), alpha: 1.0)
    case 200..<400:
        backgroundGeometry.firstMaterial?.diffuse.contents =
            UIColor(red: CGFloat(252), green: CGFloat(162), blue: CGFloat(115), alpha: 1.0)
    case 400..<600:
        backgroundGeometry.firstMaterial?.diffuse.contents =
            UIColor(red: CGFloat(244), green: CGFloat(235), blue: CGFloat(99), alpha: 1.0)
    case 600..<800:
        backgroundGeometry.firstMaterial?.diffuse.contents =
            UIColor(red: CGFloat(110), green: CGFloat(195), blue: CGFloat(175), alpha: 1.0)
    case 800..<1000:
        backgroundGeometry.firstMaterial?.diffuse.contents =
            UIColor(red: CGFloat(91), green: CGFloat(118), blue: CGFloat(211), alpha: 1.0)
    default:
        backgroundGeometry.firstMaterial?.diffuse.contents = UIColor.green
    }
}
All the non-default cases turn the node white.
The default case does turn it green, and within each case, statements like UIColor.red and UIColor.blue work fine as well.
So why on earth don't the statements above work?
Hope you can help, I'm completely at a loss here :(
Edit: Thanks for the swift and not least correct answers! All accepted and upvoted, but I'm too much of a newbie for it to display. Thanks! :)
This should work for you:
func switchColor(data: UInt32) {
    // Unwrap the material once instead of repeating the optional chain.
    guard let material = backgroundGeometry.firstMaterial else {
        fatalError("First material is nil") // If this can legitimately be nil, replace with a plain return
    }
    switch data {
    case 1..<200:
        material.diffuse.contents = UIColor(red: 242/255, green: 90/255, blue: 90/255, alpha: 1)
    case 200..<400:
        material.diffuse.contents = UIColor(red: 252/255, green: 162/255, blue: 115/255, alpha: 1)
    case 400..<600:
        material.diffuse.contents = UIColor(red: 244/255, green: 235/255, blue: 99/255, alpha: 1)
    case 600..<800:
        material.diffuse.contents = UIColor(red: 110/255, green: 195/255, blue: 175/255, alpha: 1)
    case 800..<1000:
        material.diffuse.contents = UIColor(red: 91/255, green: 118/255, blue: 211/255, alpha: 1)
    default:
        material.diffuse.contents = UIColor.green
    }
}
The maximum value of a color component is 1.0, not 255, so you need to divide each value by 255.
According to the documentation, the red, green, blue, and alpha values are CGFloats between 0.0 and 1.0. Also, a value below 0.0 is treated as 0.0 and a value above 1.0 is treated as 1.0.
So you must construct the UIColor like this:
UIColor(red: 91/255, green: 118/255, blue: 211/255, alpha: 1)
You need to construct them like this:
UIColor(red: 242.0/255.0, green: 90.0/255.0, blue: 90.0/255.0, alpha: 1.0)
You can find the initializer in the docs: red, green, and blue values below 0.0 are interpreted as 0.0, and values above 1.0 are interpreted as 1.0.
One more note: 90/255 and 90.0/255.0 are not always the same thing. When the operands are typed as Int (for example, values held in integer variables), / is integer division and the fractional part is truncated, yielding 0. Bare literals passed directly to a CGFloat parameter are inferred as CGFloat and divide correctly, but writing 90.0/255.0 makes the intent unambiguous.
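To make the distinction concrete, a quick sketch (the variable names are mine):

import UIKit

let red = 90, maxComponent = 255
let truncated = CGFloat(red / maxComponent)          // Int division happens first: 0.0
let correct = CGFloat(red) / CGFloat(maxComponent)   // 0.3529...

// Bare literals are inferred as CGFloat here, so this is also ~0.353, not 0:
let color = UIColor(red: 90/255, green: 0, blue: 0, alpha: 1)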

Xcode is displaying colors inline in the code

How to prevent Xcode from displaying colors inline
let colors: [UIColor] = [
    #colorLiteral(red: 0.1019607857, green: 0.2784313858, blue: 0.400000006, alpha: 1),
    #colorLiteral(red: 0.1019607857, green: 0.2784313858, blue: 0.400000006, alpha: 1),
    #colorLiteral(red: 0.1019607857, green: 0.2784313858, blue: 0.400000006, alpha: 1),
    #colorLiteral(red: 0.1019607857, green: 0.2784313858, blue: 0.400000006, alpha: 1),
    #colorLiteral(red: 0.1019607857, green: 0.2784313858, blue: 0.400000006, alpha: 1)
]
renders the color values in boxes inline.
How can I prevent this from happening?
The other answers may prevent the inline color display, but they make your code run slower, because every time the initializer is used, real code runs to create an Objective-C object; #colorLiteral doesn't generate any of that. And I can't quite see why you are opposed to actually seeing the colors.
Use the UIColor(red:green:blue:alpha) initializer instead of color literals.
let color = UIColor(red: 0, green: 1, blue: 1, alpha: 0)
Use it like the example below:
static let blue: UIColor = UIColor(red: 43.0/255.0, green: 81.0/255.0, blue: 162.0/255.0, alpha: 1.0)

Extra argument in call when using var

I'm trying to use this code:
var alpha : Float
alpha = 0.5
self.view.backgroundColor = UIColor(red: 1, green:0, blue: 0, alpha:alpha)
However, I get the error:
Extra argument 'green' in call
What is wrong with this code? Moreover, why is
self.view.backgroundColor = UIColor(red: 1, green:0, blue: 0, alpha: 0)
working just fine?
Answer was: Swift UIColor initializer - compiler error only when targeting iPhone5s
Use floating-point values instead of integers, and make sure alpha is a CGFloat (the type the initializer expects) rather than a Float:
UIColor(red: 1.0, green: 0.0, blue: 0.0, alpha: alpha)
This also happens when you force-unwrap a UIColor instance that wasn't declared as optional.
Instead of:
let brokenColor = UIColor(red: 1.0, green: 1.0, blue: 1.0, alpha: 1.0)!
Use this:
let color: UIColor! = UIColor(red: 1.0, green: 1.0, blue: 1.0, alpha: 1.0)
If you are using variables, use the following:
var color: UIColor = UIColor(red: CGFloat(red), green: CGFloat(green), blue: CGFloat(blue), alpha: CGFloat(alpha))
My particular iteration of this error happened when I was trying to set the border color of a button. I was getting the "extra argument 'green' in call" error, but once I stored the color in a constant, the real error surfaced: the layer expects a CGColor, not a UIColor. So this fixed it:
let borderColor: UIColor = UIColor(red: 23/255, green: 247/255, blue: 252/255, alpha: 1)
loginButton.layer.borderColor = borderColor.cgColor
As a style point, put a space after the colons in the call (green: 0, alpha: alpha), though spacing alone will not fix the compiler error.
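For completeness, a minimal sketch of the type-level fix (my summary; the linked self-answer points at the same Float/CGFloat mismatch, which only surfaces on architectures where CGFloat is a Double):

import UIKit

// Declaring alpha as CGFloat, the type UIColor's initializer expects,
// removes the mismatch behind the misleading "extra argument" diagnostic.
var alpha: CGFloat = 0.5
view.backgroundColor = UIColor(red: 1, green: 0, blue: 0, alpha: alpha)
// `view` here stands in for the question's self.view.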