How to convert HSV(HSB) to range 0...1 - swift

I get HSB values as 29, 90, 100.
How do I convert them to the range 0...1?
UIColor can only be initialized with components in that range, hence my question.
let red: CGFloat = 1
let green: CGFloat = 0.5372
let blue: CGFloat = 0.0941
let hueOut = 29
let satOut = 90
let brightnessOut = 100
let color = UIColor(red: red, green: green, blue: blue, alpha: 1.0)
/// r 1.0 g 0.537 b 0.094 a 1.0
let color2 = UIColor(hue: hueOut, saturation: satOut, brightness: brightnessOut, alpha: 1.0)
/// r -9.0 g 0.0 b -3.0 a 1.0

Looks like your hue range is 0...360 and your saturation and brightness are 0...100. You just need to convert your integers to a floating-point type and divide by 360 or 100:
let color2 = UIColor(hue: Double(hueOut)/360, saturation: Double(satOut)/100, brightness: Double(brightnessOut)/100, alpha: 1.0)
This will result in r 1.0 g 0.535 b 0.1 a 1.0
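If you do this conversion in more than one place, you could wrap it in a small convenience initializer. This is only a sketch; the initializer and its parameter names are my own, not part of UIKit:

```swift
import UIKit

extension UIColor {
    /// Hypothetical helper: hue in degrees (0...360), saturation and
    /// brightness as percentages (0...100), as design tools report them.
    convenience init(hueDegrees: Int, saturationPercent: Int, brightnessPercent: Int, alpha: CGFloat = 1.0) {
        self.init(hue: CGFloat(hueDegrees) / 360,
                  saturation: CGFloat(saturationPercent) / 100,
                  brightness: CGFloat(brightnessPercent) / 100,
                  alpha: alpha)
    }
}

let color = UIColor(hueDegrees: 29, saturationPercent: 90, brightnessPercent: 100)
```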

Related

Check if a color is blue(ish), red(ish), or green(ish)

I would like to implement a search where images can be filtered by colors. My image model contains up to 10 UIColors that occur in that specific image; now I would like a filter for e.g. blue, green, red and yellow. How can I check (with a specified tolerance) whether a specific image contains blue/green/...?
I tried the CIE94 color difference, but it doesn't match the similarity perceived by the human eye. I also tried comparing the hue and saturation values, but that doesn't work either.
As an example:
`#23567E` should be blue
`#7A010B` should be red as well as `#FD4E57`
`#0F8801` should be found for green as well as `#85FE97`
I have a specific instance of UIColor, e.g.
[UIColor colorWithRed:0.137 green:0.337 blue:0.494 alpha:1] // #23567E
this should be "equal" to .blue
[UIColor colorWithRed:0.478 green:0.00392 blue:0.0431 alpha:1] // #7A010B
should be "equal" to .red
and so on...
If your intent is only to check which color family your UIColor belongs to, you can simply get its HSB components and compare the hue value.
You will need a helper to convert your hex string to a UIColor:
extension UIColor {
    convenience init?(hexString: String) {
        var chars = Array(hexString.hasPrefix("#") ? hexString.dropFirst() : hexString[...])
        switch chars.count {
        case 3: chars = chars.flatMap { [$0, $0] }; fallthrough
        case 6: chars = ["F", "F"] + chars   // assume full alpha
        case 8: break
        default: return nil
        }
        self.init(red: .init(strtoul(String(chars[2...3]), nil, 16)) / 255,
                  green: .init(strtoul(String(chars[4...5]), nil, 16)) / 255,
                  blue: .init(strtoul(String(chars[6...7]), nil, 16)) / 255,
                  alpha: .init(strtoul(String(chars[0...1]), nil, 16)) / 255)
    }
}
And some helpers to extract its hue component:
extension UIColor {
    enum Color {
        case red, orange, yellow, yellowGreen, green, greenCyan, cyan, cyanBlue, blue, blueMagenta, magenta, magentaRed
    }
    func color(tolerance: Int = 15) -> Color? {
        precondition(0...15 ~= tolerance)
        guard let hsb = hsb else { return nil }
        if hsb.saturation == 0 { return nil }
        if hsb.brightness == 0 { return nil }
        let hue = Int(hsb.hue * 360)
        switch hue {
        case 0 ... tolerance: return .red
        case 30 - tolerance ... 30 + tolerance: return .orange
        case 60 - tolerance ... 60 + tolerance: return .yellow
        case 90 - tolerance ... 90 + tolerance: return .yellowGreen
        case 120 - tolerance ... 120 + tolerance: return .green
        case 150 - tolerance ... 150 + tolerance: return .greenCyan
        case 180 - tolerance ... 180 + tolerance: return .cyan
        case 210 - tolerance ... 210 + tolerance: return .cyanBlue
        case 240 - tolerance ... 240 + tolerance: return .blue
        case 270 - tolerance ... 270 + tolerance: return .blueMagenta
        case 300 - tolerance ... 300 + tolerance: return .magenta
        case 330 - tolerance ... 330 + tolerance: return .magentaRed
        case 360 - tolerance ... 360: return .red
        default: break
        }
        return nil
    }
    var hsb: (hue: CGFloat, saturation: CGFloat, brightness: CGFloat, alpha: CGFloat)? {
        var hue: CGFloat = 0, saturation: CGFloat = 0, brightness: CGFloat = 0, alpha: CGFloat = 0
        guard getHue(&hue, saturation: &saturation, brightness: &brightness, alpha: &alpha) else {
            return nil
        }
        return (hue, saturation, brightness, alpha)
    }
}
Playground testing:
let blue: UIColor.Color? = UIColor(hexString: "#23567E")?.color() // r 0.137 g 0.337 b 0.494 a 1.0
let red1: UIColor.Color? = UIColor(hexString: "#7A010B")?.color() // r 0.478 g 0.004 b 0.043 a 1.0
let red2: UIColor.Color? = UIColor(hexString: "#FD4E57")?.color() // r 0.992 g 0.306 b 0.341 a 1.0
let green1: UIColor.Color? = UIColor(hexString: "#0F8801")?.color() // r 0.059 g 0.533 b 0.004 a 1.0
let green2: UIColor.Color? = UIColor(hexString: "#85FE97")?.color() // r 0.522 g 0.996 b 0.592 a 1.0
UIColor(hue: 90/360, saturation: 0, brightness: 1, alpha: 1).color() // nil
UIColor(hue: 90/360, saturation: 1, brightness: 0, alpha: 1).color() // nil
UIColor(hue: 90/360, saturation: 0, brightness: 0, alpha: 1).color() // nil
UIColor.black.color() // nil
UIColor.white.color() // nil
UIColor.lightGray.color() // nil
UIColor.darkGray.color() // nil
UIColor.red.color() // 0...15 && 346...360 = red
UIColor.orange.color() // 16...45 = orange
UIColor.yellow.color() // 46...75 = yellow
UIColor(hue: 90/360, saturation: 1, brightness: 1, alpha: 1).color() // 76...105 yellowGreen
UIColor.green.color() // 106...135 = green
UIColor(hue: 150/360, saturation: 1, brightness: 1, alpha: 1).color() // 136...165 greenCyan
UIColor.cyan.color() // 166...195 = cyan
UIColor(hue: 210/360, saturation: 1, brightness: 1, alpha: 1).color() // 196...225 cyanBlue
UIColor.blue.color() // 226...255 = blue
UIColor(hue: 270/360, saturation: 1, brightness: 1, alpha: 1).color() // 256...285 blueMagenta
UIColor.magenta.color() // 286...315 = magenta
UIColor(hue: 330/360, saturation: 1, brightness: 1, alpha: 1).color() // 316...345 = magentaRed
UIColor's cgColor has a components property that returns the color's components as [CGFloat]?.
For RGB-based colors this is [red, green, blue, alpha], which matches your requirement. It works with other colors too, but then returns a different set of values (for example, grayscale colors carry only two components) rather than r, g, b, a.
Tested with the colors you've provided:
let blueShade = UIColor(hexString: "23567E")
print(blueShade.cgColor.components)
// prints Optional([0.13725490196078433, 0.33725490196078434, 0.49411764705882355, 1.0])
let redShade = UIColor(hexString: "7A010B")
print(redShade.cgColor.components)
// prints Optional([0.47843137254901963, 0.00392156862745098, 0.043137254901960784, 1.0])
let greenShade = UIColor(hexString: "0F8801")
print(greenShade.cgColor.components)
// prints Optional([0.058823529411764705, 0.5333333333333333, 0.00392156862745098, 1.0])
Is this what you're looking for?
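Since components is only guaranteed to mean r, g, b, a for four-component (RGB-based) colors, a defensive sketch could check the count first. The function name here is my own, for illustration:

```swift
import UIKit

/// Hypothetical helper: returns RGBA only when the underlying CGColor
/// actually stores four components.
func rgbaComponents(of color: UIColor) -> (red: CGFloat, green: CGFloat, blue: CGFloat, alpha: CGFloat)? {
    guard let c = color.cgColor.components, c.count == 4 else {
        return nil  // e.g. grayscale colors such as UIColor.white carry only 2 components
    }
    return (c[0], c[1], c[2], c[3])
}
```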
You can get RGB values from UIColor instance with built-in method:
let color = UIColor(red: 0.3, green: 0.8, blue: 0.4, alpha: 0.9)
var red = CGFloat.zero
var green = CGFloat.zero
var blue = CGFloat.zero
var alpha = CGFloat.zero
color.getRed(&red, green: &green, blue: &blue, alpha: &alpha)
print("r: \(red), g: \(green), b: \(blue)")

Convert UIColor initialization to Color Literal Xcode

I have multiple different UIColor objects. Some of them are initialized with a constructor, some are shown as color literals.
static let optionsHeader = UIColor([ ]) // Xcode shows a color rect here.
static let optionButtonSelected = UIColor(red: 0.865, green: 0.804, blue: 0.0, alpha: 1.0)
How can I convert the UIColor.init(...) statements to a color literal?
An RGB color literal takes the same arguments as the UIColor initializer:
#colorLiteral(red: 1, green: 1, blue: 1, alpha: 1)
Or you can select the color from a picker after typing #colorLiteral().
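For the example above, the literal equivalent would carry the same component values (a sketch; the variable names are mine, and in a playground Xcode would render the literal as a color swatch):

```swift
import UIKit

// The same color, written once with the initializer and once as a literal.
let optionButtonSelected = UIColor(red: 0.865, green: 0.804, blue: 0.0, alpha: 1.0)
let optionButtonSelectedLiteral: UIColor = #colorLiteral(red: 0.865, green: 0.804, blue: 0.0, alpha: 1.0)
```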
You can also use an extension to work with hex colors instead of passing rgba values:
extension UIColor {
    convenience init(hexString: String, alpha: CGFloat = 1.0) {
        let hexString = hexString.trimmingCharacters(in: .whitespacesAndNewlines)
        let scanner = Scanner(string: hexString)
        if hexString.hasPrefix("#") {
            // Note: scanLocation and scanHexInt32 are deprecated as of iOS 13;
            // newer code can use Scanner.currentIndex and scanUInt64(representation:).
            scanner.scanLocation = 1
        }
        var color: UInt32 = 0
        scanner.scanHexInt32(&color)
        let mask = 0x000000FF
        let r = Int(color >> 16) & mask
        let g = Int(color >> 8) & mask
        let b = Int(color) & mask
        let red = CGFloat(r) / 255.0
        let green = CGFloat(g) / 255.0
        let blue = CGFloat(b) / 255.0
        self.init(red: red, green: green, blue: blue, alpha: alpha)
    }
    func toHexString() -> String {
        var r: CGFloat = 0
        var g: CGFloat = 0
        var b: CGFloat = 0
        var a: CGFloat = 0
        getRed(&r, green: &g, blue: &b, alpha: &a)
        let rgb = Int(r * 255) << 16 | Int(g * 255) << 8 | Int(b * 255)
        return String(format: "#%06x", rgb)
    }
}
Then you can code it with:
self.backgroundColor = UIColor(hexString: "#4A4A4A")

Converting CGFloat, CGFloat, CGFloat into UIColor?

I have an array of colours that looks like this...
var purpleShades: [(CGFloat, CGFloat, CGFloat)] = [(186.0/255.0, 85.0/255.0, 211.0/255.0), (147.0/255.0, 112.0/255.0, 219.0/255.0), (138.0/255.0, 43.0/255.0, 226.0/255.0), (148.0/255.0, 0.0/255.0, 211.0/255.0), (153.0/255.0, 50.0/255.0, 204.0/255.0), (139.0/255.0, 0.0/255.0, 139.0/255.0)]
Rather than duplicating code, I was wondering if anyone could help convert it to UIColor, so I can use it for this piece of code.
cell.tintColor = grayShades[Int(index)]
The UIColor(red:green:blue:alpha:) initializer might help you.
It accepts red, green, blue and alpha as CGFloat parameters.
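For the purpleShades array above, mapping each tuple through that initializer gives you a ready-made [UIColor] array up front (the tuples are already normalized to 0...1; the array is truncated here for brevity):

```swift
import UIKit

let purpleShades: [(CGFloat, CGFloat, CGFloat)] = [
    (186.0/255.0, 85.0/255.0, 211.0/255.0),
    (147.0/255.0, 112.0/255.0, 219.0/255.0),
    (138.0/255.0, 43.0/255.0, 226.0/255.0)
]

// Convert every (r, g, b) tuple to a UIColor in one pass.
let purpleColors: [UIColor] = purpleShades.map {
    UIColor(red: $0.0, green: $0.1, blue: $0.2, alpha: 1.0)
}

// cell.tintColor = purpleColors[Int(index)]
```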
Here's a nice extension to UIColor:
extension UIColor {
    convenience init(hex: UInt, alpha: CGFloat) {
        let red = (hex & 0xFF0000) >> 16
        let green = (hex & 0x00FF00) >> 8
        let blue = hex & 0x0000FF
        self.init(red: CGFloat(red) / 255, green: CGFloat(green) / 255, blue: CGFloat(blue) / 255, alpha: alpha)
    }
}
With that you can write:
let purple = UIColor(hex: 0x9932CC, alpha: 1)
If you have a lot of colours, another extension on UIColor gives you…
extension UIColor {
    static let darkOrchid = UIColor(hex: 0x9932CC, alpha: 1)
    static let darkMagenta = UIColor(hex: 0x8B008B, alpha: 1)
    static let indigo = UIColor(hex: 0x4B0082, alpha: 1)
}
which allows you to say, for example…
cell.tintColor = .darkOrchid

Swift Generate A Random Color On A Colorwheel

I'm using the SwiftHSVColorPicker framework and needed to generate a random color on the color wheel. My current way of doing it works, but the brightness seems off. Here is my code:
func generateRandomColor() -> UIColor {
    let lowerx: UInt32 = 0
    let upperx: UInt32 = 707
    let randomNumberx = arc4random_uniform(upperx - lowerx) + lowerx
    let lowery: UInt32 = 0
    let uppery: UInt32 = 707
    let randomNumbery = arc4random_uniform(uppery - lowery) + lowery
    let c = Colorwheel.colorWheel.hueSaturationAtPoint(CGPoint(x: Double(randomNumberx), y: Double(randomNumbery)))
    let brightness = 1.0
    return UIColor(hue: c.hue, saturation: c.saturation, brightness: CGFloat(brightness), alpha: 1.0)
}
Why don't you use something like
func getRandomColor() -> UIColor {
    let randomRed = CGFloat(arc4random()) / CGFloat(UInt32.max)
    let randomGreen = CGFloat(arc4random()) / CGFloat(UInt32.max)
    let randomBlue = CGFloat(arc4random()) / CGFloat(UInt32.max)
    return UIColor(red: randomRed, green: randomGreen, blue: randomBlue, alpha: 1.0)
}
EDIT:
Try this; it randomizes hue, saturation and brightness:
func generateRandomColor() -> UIColor {
    let hue: CGFloat = CGFloat(arc4random() % 256) / 256 // use 256 to get the full range from 0.0 to 1.0
    let saturation: CGFloat = CGFloat(arc4random() % 128) / 256 + 0.5 // from 0.5 to 1.0, to stay away from white
    let brightness: CGFloat = CGFloat(arc4random() % 128) / 256 + 0.5 // from 0.5 to 1.0, to stay away from black
    return UIColor(hue: hue, saturation: saturation, brightness: brightness, alpha: 1)
}
SwiftHSVColorPicker results

Getting hue from UIColor yields wrong result

I'm doing the following in order to retrieve the hue from a UIColor:
let rgbColour = UIColor(red: 1.0, green: 0.0, blue: 0.0, alpha: 1.0)
var hue: CGFloat = 0
var saturation: CGFloat = 0
var brightness: CGFloat = 0
var alpha: CGFloat = 0
rgbColour.getHue(&hue, saturation: &saturation, brightness: &brightness, alpha: &alpha)
print("\(hue),\(saturation),\(brightness)")
Output:
1.0,1.0,1.0
According to this link, I'm meant to be getting 0.0,1.0,1.0 for RGB (red) 1.0,0.0,0.0.
Am I doing something wrong?
First of all the range of the red/green/blue components in UIColor is 0.0 .. 1.0,
not 0.0 .. 255.0, so you probably want
let rgbColour = UIColor(red: 1.0, green: 0.0, blue: 0.0, alpha: 1.0)
But even then you get the output 1.0,1.0,1.0 and this is correct.
The hue component ranges from 0.0 to 1.0, which corresponds to the angle from 0º to 360º
in a color wheel (see for example HSL and HSV).
Therefore hue = 0.0 and hue = 1.0 describe an identical color.
If you need to normalize the hue component to the half-open interval
0.0 <= hue < 1.0 then you could do that with
hue = fmod(hue, 1.0)
To build on Martin R's answer:
If you wanted to use HSB, you need to:
Divide the hue value by 360
Use decimals for the Saturation and Brightness values
So, say for example that Sketch is telling you the colour values in HSB are: Hue: 20, Saturation: 72 and Brightness: 96
In Xcode, create the colour as follows:
let myAwesomeColour = UIColor(hue: 20/360, saturation: 0.72, brightness: 0.96, alpha: 1.0)
Whether you use RGB or HSB is a matter of preference. The results are the same as far as Xcode is concerned; they both translate to a UIColor.