UIButton Borders Function Only Gives Back White Borders - swift

I'm trying to create a button borders function in Swift to help style my UI. However, whatever RGB values I pass in (or initialize), the function only creates white borders.
Here is my function:
func buttonsWithBorders(button: UIButton, borderWidth: CGFloat, redcolour: CGFloat, greencolour: CGFloat, bluecolour: CGFloat, alpha: CGFloat?) {
    let redcolour: CGFloat = 7.0
    var greencolour: CGFloat = 3.0
    var bluecolour: CGFloat = 2.0
    var alpha: CGFloat = 1.0
    var widthOfBorder: CGFloat = borderWidth
    var theButtonWithBorders: UIButton
    var buttonBorderColour: UIColor = UIColor(red: redcolour, green: greencolour, blue: bluecolour, alpha: alpha)
    button.layer.borderWidth = widthOfBorder
    return button.layer.borderColor = buttonBorderColour.CGColor
}
And I call it using:
buttonsWithBorders(learnHomeButton, 2.0, 2.0, 5.0, 5.0, 1.0)
Also, I know that passing values in and then initializing them again is incorrect, but otherwise Xcode complains that I am not initializing them before using them.
Any help would be very much appreciated, Cheers

You aren't initializing them. You're declaring entirely new variables with the same names as the parameters you're passing in. Whenever you use let or var you are introducing a brand new variable.
When a new variable is introduced with the same name as another currently in scope, this is known as variable shadowing, and what you have here is an almost textbook case.
A better, more concise implementation of your function might look like this:
func addButtonBorder(button: UIButton, width: CGFloat, red: CGFloat, green: CGFloat, blue: CGFloat, alpha: CGFloat = 1.0) {
    button.layer.borderColor = UIColor(red: red, green: green, blue: blue, alpha: alpha).CGColor
    button.layer.borderWidth = width
}
I used a different name because buttonsWithBorders implies that one or more buttons will be returned from this function. That does not appear to be your intent. Since you are passing one button in, you could only ever get one out, but "buttons" implies more than one.
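A quick usage sketch (call syntax shown in the Swift 2 style, where every parameter after the first is labelled; the component values are illustrative, and remember that UIColor expects each component in the 0.0...1.0 range):
// learnHomeButton is assumed to exist; the colour values are just examples.
addButtonBorder(learnHomeButton, width: 2.0, red: 7.0 / 255.0, green: 3.0 / 255.0, blue: 2.0 / 255.0)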
If I were going to initialize a lot of buttons with borders, I might do something like this:
extension UIButton {
    convenience init(frame: CGRect, borderColor: UIColor, borderWidth: CGFloat = 1.0) {
        self.init(frame: frame)
        setBorder(borderColor, borderWidth: borderWidth)
    }

    func setBorder(borderColor: UIColor, borderWidth: CGFloat = 1.0) {
        layer.borderWidth = borderWidth
        layer.borderColor = borderColor.CGColor
    }
}
Then you could say UIButton(frame: frame, borderColor: borderColor, borderWidth: 2.0) to initialize a new button or button.setBorder(borderColor, borderWidth: 2.0) to set the border on an existing button.
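For example (a sketch; someFrame, borderColor, and existingButton are placeholders):
// Hypothetical values for illustration.
let someFrame = CGRect(x: 0, y: 0, width: 120, height: 44)
let borderColor = UIColor(red: 7.0 / 255.0, green: 3.0 / 255.0, blue: 2.0 / 255.0, alpha: 1.0)

let newButton = UIButton(frame: someFrame, borderColor: borderColor, borderWidth: 2.0)
existingButton.setBorder(borderColor, borderWidth: 2.0)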

UIColor expects each color component as a CGFloat between 0.0 and 1.0, so you want to divide your 0-255 RGB values by 255.0.
Here is the code I used, which works in a playground:
import Foundation
import UIKit
func buttonsWithBorders(button: UIButton, borderWidth: CGFloat, redcolour: CGFloat, greencolour: CGFloat, bluecolour: CGFloat, alpha: CGFloat) {
    let buttonBorderColour: UIColor = UIColor(red: redcolour, green: greencolour, blue: bluecolour, alpha: alpha)
    button.layer.borderWidth = borderWidth
    button.layer.borderColor = buttonBorderColour.CGColor
}
let learnHomeButton = UIButton(frame: CGRect(x: 0, y: 0, width: 50, height: 50))
buttonsWithBorders(learnHomeButton, 2.0, 177/255.0, 177/255.0, 177/255.0, 1.0)
I edited the code so you can pass the colors to the function as parameters. Hope it helps.

The colour values need to be between 0.0 and 1.0 so you should define them as:
let redcolour : CGFloat = 7.0 / 255.0
var greencolour : CGFloat = 3.0 / 255.0
var bluecolour : CGFloat = 2.0 / 255.0
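If you often work with 0-255 component values, a small convenience initializer can do the division for you. A minimal sketch (the initializer and its parameter names are my own, not part of UIKit):
extension UIColor {
    // Hypothetical helper: accepts 0-255 components and maps them into UIColor's 0.0-1.0 range.
    convenience init(red255: Int, green255: Int, blue255: Int, alpha: CGFloat = 1.0) {
        self.init(red: CGFloat(red255) / 255.0,
                  green: CGFloat(green255) / 255.0,
                  blue: CGFloat(blue255) / 255.0,
                  alpha: alpha)
    }
}
You could then write UIColor(red255: 7, green255: 3, blue255: 2) instead of dividing at every call site.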

Related

How to convert a UIColor to a black and white UIColor

I am setting the background color of my label, but I would like the color to be a black-and-white (grayscale) version of the original UIColor.
self.MyLabel.backgroundColor = self.selectedColors.color
Looks like you'll need to convert your colour to grayscale.
While you can do this by averaging the R, G and B components of the colour, Apple actually provides a nice method to grab the grayscale value:
func getWhite(_ white: UnsafeMutablePointer<CGFloat>?, alpha: UnsafeMutablePointer<CGFloat>?) -> Bool
So to use this, you would first extract the grayscale colour and then init a new UIColor:
let originalColor = self.selectedColors.color
var white: CGFloat = 0
var alpha: CGFloat = 0

guard originalColor.getWhite(&white, alpha: &alpha) else {
    // The color couldn't be converted! Handle this unexpected error.
    return
}

let newColor = UIColor(white: white, alpha: alpha)
self.MyLabel.backgroundColor = newColor
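If you would rather average the components yourself, as mentioned above, for example to apply a perceptual weighting, a minimal sketch using getRed and the common Rec. 601 luma weights looks like this (the property name is my own):
extension UIColor {
    // Hypothetical helper: weighted average of the RGB components (Rec. 601 luma).
    var luminanceGrayscale: UIColor? {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        guard getRed(&r, green: &g, blue: &b, alpha: &a) else { return nil }
        let luma = 0.299 * r + 0.587 * g + 0.114 * b
        return UIColor(white: luma, alpha: a)
    }
}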
Thanks to Sam's answer, I wrote an extension for UIColor:
extension UIColor {
    var grayScale: UIColor? {
        var white: CGFloat = 0
        var alpha: CGFloat = 0
        guard self.getWhite(&white, alpha: &alpha) else {
            return nil
        }
        return UIColor(white: white, alpha: alpha)
    }
}
You can use it like this:
let grayScaleColorOfRed = UIColor.red.grayScale ?? UIColor.gray

How to convert UIColor to SwiftUI's Color

I want to use a UIColor as the foreground color for an object, but I don't know how to convert a UIColor to a Color:
var myColor: UIColor
RoundedRectangle(cornerRadius: 5).foregroundColor(UIColor(myColor))
Starting with beta 5, you can create a Color from a UIColor:
Color(UIColor.systemBlue)
Both iOS and macOS
Color has a native initializer that takes a UIColor or NSColor as an argument:
Color(.red) /* or any other UIColor/NSColor you need INSIDE parentheses */
DO NOT call Color(UIColor.red) explicitly! That couples your SwiftUI code to UIKit. Instead, just call Color(.red); the correct module will be inferred automatically.
Also, note this difference:
Color.red /* this `red` is SwiftUI native `red` */
Color(.red) /* this `red` is UIKit `red` */
Note that Color.red and UIColor.red are NOT the same! They have different values and look different from each other, so don't treat them as interchangeable.
These are equal instead: SwiftUI.Color.red == UIKit.UIColor.systemRed
Also, you can check out How to get RGB components from SwiftUI.Color.
Extension
You can implement a computed property so the conversion reads like cgColor and ciColor:
extension UIColor {
    /// The SwiftUI color associated with the receiver.
    var suColor: Color { Color(self) }
}
so it would be like:
UIColor.red // UIKit color
UIColor.red.suColor // SwiftUI color
UIColor.red.cgColor // Core graphic color
UIColor.red.ciColor // Core image color
Note: see How to convert SwiftUI.Color to UIColor for the reverse direction.
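As a hedged aside (the linked question covers it in detail): on iOS 14 and later, UIKit also provides the reverse initializer, so no custom code is needed to go back from a SwiftUI Color.
import SwiftUI
import UIKit

// Assumes iOS 14+, where UIColor can be initialized directly from a SwiftUI Color.
let swiftUIColor = Color.red
let uiKitColor = UIColor(swiftUIColor)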
Using two helper extensions:
To extract components from UIColor:
extension UIColor {
    var rgba: (red: CGFloat, green: CGFloat, blue: CGFloat, alpha: CGFloat) {
        var red: CGFloat = 0
        var green: CGFloat = 0
        var blue: CGFloat = 0
        var alpha: CGFloat = 0
        getRed(&red, green: &green, blue: &blue, alpha: &alpha)
        return (red, green, blue, alpha)
    }
}
To init with UIColor:
extension Color {
    init(uiColor: UIColor) {
        self.init(red: Double(uiColor.rgba.red),
                  green: Double(uiColor.rgba.green),
                  blue: Double(uiColor.rgba.blue),
                  opacity: Double(uiColor.rgba.alpha))
    }
}
Usage:
Color(uiColor: .red)
In SwiftUI, you can define custom colors with a convenient extension:
extension UIColor {
    struct purple {
        static let normal = UIColor(red: 0.043, green: 0.576, blue: 0.588, alpha: 1.00)
        static let light = UIColor(red: 1, green: 1, blue: 1, alpha: 1)
        static let dark = UIColor(red: 1, green: 1, blue: 1, alpha: 1)
    }
    struct gray {
        static let normal = UIColor(red: 0.5, green: 0.5, blue: 0.5, alpha: 1.00)
        static let dark = UIColor(red: 1, green: 1, blue: 1, alpha: 1)
    }
}
Wrapping SwiftUI's Color:
extension UIColor {
    var toSUIColor: Color {
        Color(self)
    }
}
and use it like this:
var body: some View {
    Text("Hello World")
        .foregroundColor(Color(UIColor.purple.normal))
        .background(Color(UIColor.gray.normal))
        // with the wrapper:
        //.foregroundColor(UIColor.purple.normal.toSUIColor)
        //.background(UIColor.gray.normal.toSUIColor)
}
I'm a really old hobbyist. Here is one way that works for me. Yes, I do use globals for reusable statements in a Constant.swift file. This example is inline so that it is easier to see. I do not say this is the way to go, it is just my old way.
import SwiftUI

// named color from the developer's palette
let aluminumColor = Color(UIColor.lightGray)

// a custom color I use often
let butterColor = Color(red: 0.9993399978,
                        green: 0.9350042167,
                        blue: 0.5304131241)

// how I use this in a SwiftUI VStack:
Text("Fur People")
    .font(.title)
    .foregroundColor(aluminumColor)
    .shadow(color: butterColor, radius: 4, x: 2, y: 2)
    .minimumScaleFactor(0.75)
Create an extension like this:
extension Color {
    static let someColor = Color(UIColor.systemIndigo) // << Select any UIColor
}
Usage:
struct ContentView: View {
    var body: some View {
        Rectangle()
            .frame(width: 100, height: 100)
            .foregroundColor(.someColor)
    }
}

Computing complementary, triadic, tetradic, and analogous colors

I have created Swift functions that take a color value and return its triadic and tetradic values. It sort of works, but I am not happy with the color results. Can anyone help me fine-tune the formula, please?
I was following a few sources, but the returned colours were too bright or saturated compared with several online web-based color schemes. I know it's partly a matter of preference, and I kind of like the results of the code below, but for some input colors one of the returned colors ends up so close to the original that the difference is barely visible. It only happens with a few colors...
I was using the formula from here:
my code:
func getTriadColor(color: UIColor) -> (UIColor, UIColor) {
    var hue: CGFloat = 0
    var saturation: CGFloat = 0
    var brightness: CGFloat = 0
    var alpha: CGFloat = 0
    let triadHue = CGFloat(color.getHue(&hue, saturation: &saturation, brightness: &brightness, alpha: &alpha))
    let triadColor1 = UIColor(hue: (triadHue + 0.33) - 1.0, saturation: saturation, brightness: brightness, alpha: alpha)
    let triadColor2 = UIColor(hue: (triadHue + 0.66) - 1.0, saturation: saturation, brightness: brightness, alpha: alpha)
    return (triadColor1, triadColor2)
}

func getTetradColor(color: UIColor) -> (UIColor, UIColor, UIColor) {
    var hue: CGFloat = 0
    var saturation: CGFloat = 0
    var brightness: CGFloat = 0
    var alpha: CGFloat = 0
    let tetradHue = CGFloat(color.getHue(&hue, saturation: &saturation, brightness: &brightness, alpha: &alpha))
    let tetradColor1 = UIColor(hue: (tetradHue + 0.25) - 1.0, saturation: saturation, brightness: brightness, alpha: alpha)
    let tetradColor2 = UIColor(hue: (tetradHue + 0.5) - 1.0, saturation: saturation, brightness: brightness, alpha: alpha)
    let tetradColor3 = UIColor(hue: (tetradHue + 0.75) - 1.0, saturation: saturation, brightness: brightness, alpha: alpha)
    return (tetradColor1, tetradColor2, tetradColor3)
}
I also found nice, clean code for finding the complementary color, and I am very happy with its results:
func getComplementColor(color: UIColor) -> UIColor {
    let ciColor = CIColor(color: color)
    let compRed: CGFloat = 1.0 - ciColor.red
    let compGreen: CGFloat = 1.0 - ciColor.green
    let compBlue: CGFloat = 1.0 - ciColor.blue
    return UIColor(red: compRed, green: compGreen, blue: compBlue, alpha: 1.0)
}
Your screen shot is of this web page. (Wayback Machine link because, six years later, the page has been deleted.) The formulas on that page are incorrect, because they specify the use of the absolute value function instead of the modulo function. That is, for example, your screen shot defines
H1 = |(H0 + 180°) - 360°|
but consider what this gives for the input H0 = 90°:
H1 = |(90° + 180°) - 360°| = |270° - 360°| = |-90°| = 90°
Do you think that the complementary hue of H0 = 90° is H1 = 90°, the same hue?
The correct formula is
H1 = (H0 + 180°) mod 360°
where “mod” is short for “modulo” and means “the remainder after dividing by”. In other words, if the answer would be above 360°, subtract 360°. For H0 = 90°, this gives the correct answer of H1 = 270°.
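As a quick sanity check in Swift (a sketch only; the hue is expressed in degrees here to mirror the formula above, even though UIColor works with hues in 0...1):
// Hypothetical helper mirroring H1 = (H0 + 180°) mod 360°.
func complementaryHue(degrees h0: CGFloat) -> CGFloat {
    return fmod(h0 + 180, 360)
}

complementaryHue(degrees: 90)   // 270, as expected
complementaryHue(degrees: 350)  // 170, wrapping correctly past 360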
But you don't even have this problem in your code, because you didn't use the absolute value function (or the modulo function) in your code. Since you're not doing anything to keep your hue values in the range 0…1, your hue values that are less than zero are clipped to zero, and your hue values above one are clipped to one (and both zero and one mean red).
Your getComplementColor is also not at all the standard definition of the “complementary color”.
Here are the correct definitions:
extension UIColor {
    var complement: UIColor {
        return self.withHueOffset(0.5)
    }

    var splitComplement0: UIColor {
        return self.withHueOffset(150 / 360)
    }

    var splitComplement1: UIColor {
        return self.withHueOffset(210 / 360)
    }

    var triadic0: UIColor {
        return self.withHueOffset(120 / 360)
    }

    var triadic1: UIColor {
        return self.withHueOffset(240 / 360)
    }

    var tetradic0: UIColor {
        return self.withHueOffset(0.25)
    }

    var tetradic1: UIColor {
        return self.complement
    }

    var tetradic2: UIColor {
        return self.withHueOffset(0.75)
    }

    var analagous0: UIColor {
        return self.withHueOffset(-1 / 12)
    }

    var analagous1: UIColor {
        return self.withHueOffset(1 / 12)
    }

    func withHueOffset(offset: CGFloat) -> UIColor {
        var h: CGFloat = 0
        var s: CGFloat = 0
        var b: CGFloat = 0
        var a: CGFloat = 0
        self.getHue(&h, saturation: &s, brightness: &b, alpha: &a)
        return UIColor(hue: fmod(h + offset, 1), saturation: s, brightness: b, alpha: a)
    }
}
Here are some examples of complementary colors (original on top, complementary beneath):
Here are split complementary colors (original on top):
Here are triadic colors (original on top):
Here are tetradic colors (original on top):
Here are analogous colors (original in the middle):
Here is the playground I used to generate those images:
import XCPlayground
import UIKit

let view = UIView(frame: CGRectMake(0, 0, 320, 480))
view.backgroundColor = [#Color(colorLiteralRed: 0.9607843137254902, green: 0.9607843137254902, blue: 0.9607843137254902, alpha: 1)#]

let vStack = UIStackView(frame: view.bounds)
vStack.autoresizingMask = [ .FlexibleWidth, .FlexibleHeight ]
view.addSubview(vStack)
vStack.axis = .Vertical
vStack.distribution = .FillEqually
vStack.alignment = .Fill
vStack.spacing = 10

typealias ColorTransform = (UIColor) -> UIColor

func tile(color color: UIColor) -> UIView {
    let view = UIView()
    view.translatesAutoresizingMaskIntoConstraints = false
    view.backgroundColor = color
    return view
}

func strip(transforms: [ColorTransform]) -> UIStackView {
    let strip = UIStackView()
    strip.translatesAutoresizingMaskIntoConstraints = false
    strip.axis = .Vertical
    strip.distribution = .FillEqually
    strip.alignment = .Fill
    strip.spacing = 0
    let hStacks = (0 ..< transforms.count).map { (i: Int) -> UIStackView in
        let stack = UIStackView()
        stack.translatesAutoresizingMaskIntoConstraints = false
        stack.axis = .Horizontal
        stack.distribution = .FillEqually
        stack.alignment = .Fill
        stack.spacing = 4
        strip.addArrangedSubview(stack)
        return stack
    }
    for h in 0 ..< 10 {
        let hue = CGFloat(h) / 10
        let color = UIColor(hue: hue, saturation: 1, brightness: 1, alpha: 1)
        for (i, transform) in transforms.enumerate() {
            hStacks[i].addArrangedSubview(tile(color: transform(color)))
        }
    }
    return strip
}

vStack.addArrangedSubview(strip([
    { $0 },
    { $0.complement }]))
vStack.addArrangedSubview(strip([
    { $0 },
    { $0.splitComplement0 },
    { $0.splitComplement1 }]))
vStack.addArrangedSubview(strip([
    { $0 },
    { $0.triadic0 },
    { $0.triadic1 }]))
vStack.addArrangedSubview(strip([
    { $0 },
    { $0.tetradic0 },
    { $0.tetradic1 },
    { $0.tetradic2 }]))
vStack.addArrangedSubview(strip([
    { $0.analagous0 },
    { $0 },
    { $0.analagous1 }]))

XCPlaygroundPage.currentPage.liveView = view

Converting a CAShapeLayer to use in an NSImageView

I'm trying to port some iOS code for a Mac app. My code is as follows:
func innerRing() {
    let innerRing = CAShapeLayer()
    let circleRadius: CGFloat = 105.0
    innerRing.frame = InnerRingView.bounds

    func circleFrame() -> CGRect {
        var circleFrame = CGRect(x: 0, y: 0, width: 2 * circleRadius, height: 2 * circleRadius)
        circleFrame.origin.x = CGRectGetMidX(InnerRingView.bounds) - CGRectGetMidX(circleFrame)
        circleFrame.origin.y = CGRectGetMidY(InnerRingView.bounds) - CGRectGetMidY(circleFrame)
        return circleFrame
    }

    innerRing.path = UIBezierPath(ovalInRect: circleFrame()).CGPath
    innerRing.lineWidth = 3.0
    innerRing.strokeStart = 0.0
    innerRing.strokeEnd = 1.0
    innerRing.fillColor = UIColor.clearColor().CGColor
    innerRing.strokeColor = UIColor(red: 147.0/255.0, green: 184.0/255.0, blue: 255.0/255.0, alpha: 1.0).CGColor
    InnerRingView.layer.addSublayer(innerRing)
}
This code works very well, especially for adjusting the fill color, stroke color, and stroke start/end.
In my Mac app, I am effectively trying to use the same code but apply it to an NSImageView. (I want it to appear on each row of a table, and I will adjust certain parameters, such as color, based on that row's details.)
Could anyone assist with guidance on adding this simple circle to an NSImageView?
Why do you want to use an NSImageView? NSImageView is for displaying images (icons, pictures, etc).
Make yourself a custom NSView instead. Just remember that, unlike UIKit's UIView, NSView doesn't get a layer by default, so you need to tell it to by setting wantsLayer to true.
Like so:
class CircleView: NSView {
    lazy var innerRing: CAShapeLayer = {
        let innerRing = CAShapeLayer()
        let circleRadius: CGFloat = 105.0
        innerRing.frame = self.bounds

        var circleFrame = CGRect(x: 0, y: 0, width: circleRadius, height: circleRadius)
        circleFrame.origin.x = CGRectGetMidX(self.bounds) - CGRectGetMidX(circleFrame)
        circleFrame.origin.y = CGRectGetMidY(self.bounds) - CGRectGetMidY(circleFrame)

        innerRing.path = CGPathCreateWithEllipseInRect(circleFrame, nil)
        innerRing.lineWidth = 3.0
        innerRing.strokeStart = 0.0
        innerRing.strokeEnd = 1.0
        innerRing.fillColor = NSColor.clearColor().CGColor
        innerRing.strokeColor = NSColor(red: 147.0/255.0, green: 184.0/255.0, blue: 255.0/255.0, alpha: 1.0).CGColor
        return innerRing
    }()

    override func awakeFromNib() {
        super.awakeFromNib()
        wantsLayer = true
        layer = CALayer()
        layer?.addSublayer(innerRing)
    }
}
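A minimal usage sketch, assuming the view is created in code rather than loaded from a nib (so awakeFromNib never runs and the layer setup has to be repeated), with the stroke colour then adjusted per row:
// Hypothetical setup for one table row; the frame and colour are illustrative.
let circleView = CircleView(frame: NSRect(x: 0, y: 0, width: 210, height: 210))
circleView.wantsLayer = true
circleView.layer = CALayer()
circleView.layer?.addSublayer(circleView.innerRing)

// Adjust the ring colour based on this row's data.
circleView.innerRing.strokeColor = NSColor.orangeColor().CGColor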

How to update SKLabelNode color after the node is initiated?

I have an SKLabelNode in my Swift code. I need to change the label's color during an SKAction. Simply:
override func didMoveToView(view: SKView) {
    ...
    var color = UIColor(red: CGFloat(1.0), green: CGFloat(0.0), blue: CGFloat(0.0), alpha: CGFloat(0.0))
    myLabel.fontColor = color
    ...
}
It doesn't work. I still have to somehow update the node, but how? I'm a newbie to Swift and SpriteKit.
Do you need it to be in an SKAction? If not, simply use this:
myLabel.fontColor = SKColor.blueColor()
Substitute blueColor with whichever color you want, or use the generic initializer where each float is a fraction of 255 (such as 50.0 / 255.0).
myLabel.fontColor = SKColor(red: floatHere, green: floatHere, blue: floatHere, alpha: floatFrom0To1Here)
In case you do need to set the color through an SKAction, you can use this method:
myLabel.runAction(SKAction.colorizeWithColor(UIColor.blueColor(), colorBlendFactor: 1, duration: 1))
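If you specifically want the fontColor change itself to run as a step inside an action sequence (rather than using colorize), one option is to wrap the assignment in a block action. A sketch using the same Swift 2-era API as the rest of this answer:
// Sketch: change fontColor as one step of an action sequence.
let turnBlue = SKAction.runBlock {
    myLabel.fontColor = SKColor.blueColor()
}
myLabel.runAction(SKAction.sequence([SKAction.waitForDuration(1.0), turnBlue]))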
I had a similar problem a few weeks ago.
Try changing your color variable to the following (note the alpha in particular: an alpha of 0.0 makes the color fully transparent, so the label appears unchanged):
var color = UIColor(red: 1.0 / 255, green: 0.0 / 255, blue: 0.0 / 255, alpha: 1.0)