I have custom colors in my code. I use them several times, and I would like to have them allocated only once.
The situation / problem
If we take a look at the UIColor headers, we can see the following:
[...]
// Some convenience methods to create colors. These colors will be as calibrated as possible.
// These colors are cached.
open class var black: UIColor { get } // 0.0 white
open class var darkGray: UIColor { get } // 0.333 white
[...]
I've created an extension of UIColor, like so:
import UIKit

extension UIColor {

    class func colorWithHexString(_ hex: String) -> UIColor {
        print("\(#function): \(hex)")
        // Parse the hex string (e.g. "AABBCC") into an integer, then build the UIColor
        let rgbValue = UInt64(hex, radix: 16) ?? 0
        return UIColor(
            red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
            green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
            blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
            alpha: CGFloat(1.0)
        )
    }

    // Option A
    open class var myColorOne: UIColor {
        get {
            return colorWithHexString("AABBCC")
        }
    }

    // Option B
    class func myColorTwo() -> UIColor {
        return colorWithHexString("DDEEFF")
    }
}
From there I can use my colors easily, either as a property or as a function.
// A
UIColor.myColorOne
// B
UIColor.myColorTwo()
Sadly, I'm not fully happy with that: every time I use one of those colors, a new UIColor allocation is made.
What I've tried
Apple apparently managed to have their colors cached. I would like to do the same. I've tried several things, but none seems ideal.
1 - Using dispatch_once ✗
As noted in the Swift 3 migration notes, the free function dispatch_once is no longer available in Swift.
2 - Creating a constant (let) ✗
I get the following error: extensions may not contain stored properties
3 - Creating a singleton ~
It does work (each color is created only once) with the following:
import UIKit

class Colors: UIColor {
    // Singleton
    static let sharedInstance = Colors()

    let myColorOne: UIColor = {
        return UIColor.colorWithHexString("AABBCC")
    }()

    let myColorTwo: UIColor = {
        return UIColor.colorWithHexString("DDEEFF")
    }()
}
But it forces me to have one more file and to call my colors like so:
Colors.sharedInstance.myColorOne
Isn't there any way to get the colors I want as UIColor.myColorOne and have them cached, like Apple does?
You can use the same approach as in Using a dispatch_once singleton model in Swift, i.e. static constant stored properties, which are initialized lazily (and only once). These can be defined directly in the UIColor extension:
extension UIColor {
    convenience init(hex: String) {
        // ...
    }

    static let myColorOne = UIColor(hex: "AABBCC")
    static let myColorTwo = UIColor(hex: "DDEEFF")
}
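A quick playground check (assuming the init(hex:) above is filled in) shows the initializer running only once:

let first = UIColor.myColorOne    // initializer runs on this first access
let second = UIColor.myColorOne   // the cached instance is returned
print(first === second)           // true: both references point to the same object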
There might be a better way to do it, but using a global variable (like Grimxn mentioned in the comment) is one way to solve the problem.
Below is an example you can copy & paste into a playground:
import UIKit

extension UIColor {
    class func colorWithHexString(_ hex: String) -> UIColor {
        print("allocate color")
        // do your conversion here...
        return UIColor.black
    }
}

private let myPrivateColorOne = UIColor.colorWithHexString("#ffffff")

extension UIColor {
    open class var myColorOne: UIColor {
        get {
            print("get color")
            return myPrivateColorOne
        }
    }
}

UIColor.myColorOne
UIColor.myColorOne
When you execute the code, the getter will be called twice, but colorWithHexString only once.
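In a playground, where top-level code runs in order, the console output shows the allocation happening only once:

allocate color
get color
get color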
You could use your singleton to store the generated UIColor values as above and solve the Colors.sharedInstance.myColorOne naming problem by extending UIColor and putting the access in there:
extension UIColor {
    class func myColorTwo() -> UIColor {
        return Colors.sharedInstance.myColorTwo
    }
}
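At the call site, the singleton no longer shows up (view stands for any UIView):

view.backgroundColor = UIColor.myColorTwo()  // resolves to the cached singleton value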
In Swift 3:
extension UIColor {
    class func color(hexString: String) -> UIColor {
        // ...
    }

    static let myColorOne = color(hexString: "AABBCC")
}
Related
I tried the steps below to extend UIColor, but it didn't work.
First, I added a new color set in xcassets, and then added UIColor+Extension.swift to my project folder.
// UIColor+Extension.swift
import Foundation
import UIKit

extension UIColor {
    class var testColor1: UIColor {
        return UIColor(red: 210.0/255.0, green: 105.0/255.0, blue: 130.0/255.0, alpha: 1.0)
    }

    class var testColor2: UIColor? { return UIColor(named: "testColor") }
}
I want to load the custom color in AppDelegate.mm, but I got an error:
animationUIView.backgroundColor = [UIColor testColor1];
The code above doesn't work either. What am I doing wrong?
I'm sorry, but I'm making a project with React Native, so I don't know Swift and Objective-C well.
If you want to set a background to a custom color, you should use:
someView.backgroundColor = UIColor(named: "testColor1")
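Since UIColor(named:) returns an optional (the asset may be missing at runtime), a fallback is a good idea; .systemPink here is just an arbitrary stand-in:

someView.backgroundColor = UIColor(named: "testColor1") ?? .systemPink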
I use custom fonts in my iOS application and have set up the fonts like so:
private enum MalloryProWeight: String {
    case book = "MalloryMPCompact-Book"
    case medium = "MalloryMPCompact-Medium"
    case bold = "MalloryMPCompact-Bold"
}

extension UIFont {
    enum Caption {
        private static var bookFont: UIFont {
            UIFont(name: MalloryProWeight.book.rawValue, size: 1)!
        }
        private static var mediumFont: UIFont {
            UIFont(name: MalloryProWeight.medium.rawValue, size: 1)!
        }
        private static var boldFont: UIFont {
            UIFont(name: MalloryProWeight.bold.rawValue, size: 1)!
        }

        static var book: UIFont {
            return bookFont.withSize(10)
        }
        static var medium: UIFont {
            mediumFont.withSize(10)
        }
        static var bold: UIFont {
            boldFont.withSize(10)
        }
    }
}
So that at the call site I can do the following:
UIFont.Caption.bold
This works well; I have an NSAttributedString extension that takes in a UIFont and a color and returns an attributed string, so it all fits nicely.
However, I now have a requirement to set the letter spacing and line height on each of my fonts.
I don't want to update the NSAttributedString extension to take in these values; I ideally want them accessible from UIFont.
So, I tried to subclass UIFont to add my own properties to it - like so:
class MrDMyCustomFontFont: UIFont {
    var letterSpacing: Double?
}
And use it like so:
private static var boldFont: UIFont {
    MrDMyCustomFontFont(name: MalloryProWeight.bold.rawValue, size: 1)!
}
However, the compiler complains, and I am unsure how to resolve it:
Argument passed to call that takes no arguments
So my question is in two parts:
How can I add my own custom property (and set it on a per-instance basis) on UIFont?
Or, how do I properly subclass UIFont so that I can add my own properties there?
Thanks!
You can't subclass UIFont because it is bridged to CTFont via UICTFont. That's why the init methods are marked "not inherited" in the header. It's not a normal kind of class.
You can easily add a new property to UIFont, but it won't work the way you want it to. It'll be exactly what you asked for: per-instance. But it won't be copied, so the instance returned from boldFont.withSize(10) won't have the same value as boldFont. If you want the code, this is how you do it:
private var letterSpacingKey: String? = nil

extension UIFont {
    var letterSpacing: Double? {
        get {
            (objc_getAssociatedObject(self, &letterSpacingKey) as? NSNumber)?.doubleValue
        }
        set {
            objc_setAssociatedObject(self, &letterSpacingKey, newValue.map(NSNumber.init(value:)),
                                     .OBJC_ASSOCIATION_RETAIN)
        }
    }
}
And then you can set it:
let font = UIFont.boldSystemFont(ofSize: 1)
font.letterSpacing = 1
print(font.letterSpacing) // Optional(1)
But you'll lose it anytime a derived font is created:
let newFont = font.withSize(10)
print(newFont.letterSpacing) // nil
So I don't think you want that.
But most of this doesn't really make sense. What would you do with these properties? "Letter spacing" isn't a font characteristic; it's a layout/style characteristic. Lying about the font's height metric is probably the wrong tool as well; configuring that is also generally a paragraph characteristic.
What you likely want is a "Style" that tracks all the things in question (font, spacing, paragraph styles, etc) and can be applied to an AttributedString. Luckily that already exists in iOS 15+: AttributeContainer. Prior to iOS 15, you can just use a [NSAttributedString.Key: Any].
Then, instead of an (NS)AttributedString extension to merge your font in, you can just merge your Container/Dictionary directly (which is exactly how it's designed to work).
extension AttributeContainer {
    enum Caption {
        private static let boldBaseFont = UIFont(name: MalloryProWeight.bold.rawValue, size: 1)!

        private static var boldAttributes: AttributeContainer {
            var container = AttributeContainer()
            container.font = boldBaseFont
            container.expansion = 1

            let paragraphStyle = NSMutableParagraphStyle()
            paragraphStyle.lineSpacing = 1.5
            container.paragraphStyle = paragraphStyle

            return container
        }

        static var bold: AttributeContainer {
            var attributes = boldAttributes
            attributes.font = boldBaseFont.withSize(10)  // re-derive the concrete UIFont at display size
            return attributes
        }
    }
}
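On iOS 15+, the container can then be merged into an AttributedString in one call (a usage sketch):

var text = AttributedString("Caption text")
text.mergeAttributes(AttributeContainer.Caption.bold)  // applies font, expansion, and paragraph style together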
Let me explain: I extend UIColor like this:
struct MyColors {
    struct blue {
        static let light = UIColor(netHex: 0x6ABB72)
        static let normal = UIColor(netHex: 0x6ABB72)
        static let dark = UIColor(netHex: 0x6ABB72)
    }
}
Is there a way to write UIColor.MyColors.blue as a shorthand for UIColor.MyColors.blue.normal?
If you want to adapt your app for light and dark mode, you could even do the following:
Go to your Assets folder -> click the plus button in the lower-left corner -> select 'New Color Set' -> go to the inspector and set the Appearances option to 'Any, Light, Dark' -> set the color for each appearance.
You could then use the color like so:
UIColor(named: <Name of your Color Set>)
You can extend UIColor with a static property storing the type MyColors.
extension UIColor {
    static let myColors = MyColors.self
}
and then access the colors like
UIColor.myColors.blue.light
Or you can actually declare MyColors in the UIColor namespace, like:
extension UIColor {
    struct MyColors {
        struct Blue {
            static let light = ...
And then access them like this: UIColor.MyColors.Blue.light
To approximate what you want to do, consider using a function to return the color with a given shade, and then use default arguments, e.g.:
extension UIColor {
    enum MyColor {
        case blue

        enum Shade { case normal, light, dark }

        func shade(_ shade: Shade) -> UIColor {
            switch self {
            case .blue:
                switch shade {
                case .normal: return .init(netHex: 0x000099)
                case .light: return .init(netHex: 0x0066cc)
                case .dark: return .init(netHex: 0x000066)
                }
            }
        }
    }

    static func my(_ color: MyColor, shade: MyColor.Shade = .normal) -> UIColor {
        return color.shade(shade)
    }
}
The syntax for use would be slightly different:
let myBlue = UIColor.my(.blue)
As a bonus, you can add these to MyColor:
var normal: UIColor { shade(.normal) }
var light: UIColor { shade(.light) }
var dark: UIColor { shade(.dark) }
Then you can do UIColor.MyColor.blue.dark like before.
However, I would instead suggest just adding the colors themselves as UIColor extensions:
extension UIColor {
    static let myBlue = UIColor(named: "blue")
    static let myLightBlue = UIColor(named: "lightBlue")
    static let myDarkBlue = UIColor(named: "darkBlue")
}
This even allows you to use shorthand like:
label.textColor = .myBlue
I'm rewriting my app in Swift (yes, hooray) and I'm running into the following:
I inherited a class whose definition is (.h):
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>

@interface MWColor : NSObject
+ (UIColor *)gray;
+ (UIColor *)green;
+ (UIColor *)themeColor;
@end
In here, I define colors, that I can use throughout my project as such:
myView.backgroundColor = MWColor.gray()
Now, I want to do this in a proper Swift way.
What would be the best approach? Extensions?
Help me to be a good Swift citizen
You can add computed colours to UIColor in an extension like this...
extension UIColor {
    static var myRed: UIColor {
        // define your color here
        return UIColor(...)
    }
}
or even...
extension UIColor {
    static let myRed = UIColor(... define the color values here ...)
}
Then access it like...
let someColor: UIColor = .myRed
or
let otherColor = UIColor.myRed
This matches the way that standard colours are defined too..
UIColor.red
UIColor.yellow
UIColor.myRed
etc...
There are probably a thousand different ways to do this, but I use the following extension:
extension UIColor {
    convenience init(rgb: UInt) {
        self.init(
            red: CGFloat((rgb & 0xFF0000) >> 16) / 255.0,
            green: CGFloat((rgb & 0x00FF00) >> 8) / 255.0,
            blue: CGFloat(rgb & 0x0000FF) / 255.0,
            alpha: CGFloat(1.0)
        )
    }
}
Then you can set the color of any object using a common hex RGB color code. You can quickly find hex colors here: http://www.color-hex.com/
view.backgroundColor = UIColor(rgb: 0xFF0000) will set the backgroundColor of view to red.
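To tie this back to the caching discussion above, you can combine the initializer with a static constant so the color is allocated only once (brandRed is just an illustrative name):

extension UIColor {
    static let brandRed = UIColor(rgb: 0xFF0000)  // created on first access, then cached
}

view.backgroundColor = .brandRed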
Hope this helps
Is it possible for a computed property in an extension to have both a getter and a setter? Apple's guide does not mention it, and the only examples I have seen show read-only computed properties in extensions.
Is it possible for a computed property in an extension to have a getter and a setter?
Yes.
Probably one of the most common uses of computed properties in extensions, in my experience, is providing a wrapper for easier access to particular properties.
For example, when we want to modify the border width, border color, or corner radius of anything out of UIKit, we're stuck going through the layer property.
But we can extend UIView with a property with both a setter and a getter to provide a much more convenient means of changing the properties of its layer:
extension UIView {
    var borderColor: UIColor? {
        get {
            guard let color = self.layer.borderColor else {
                return nil
            }
            return UIColor(cgColor: color)
        }
        set {
            self.layer.borderColor = newValue?.cgColor
        }
    }
}
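Usage then reads naturally on any view:

let box = UIView()
box.borderColor = .red          // setter writes through to layer.borderColor
print(box.borderColor as Any)   // getter rebuilds a UIColor from layer.borderColor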
Moreover, if we really want to, we can leverage the Objective-C run time to emulate stored properties in extensions (which of course mean setting & getting). Take part of this Stack Overflow answer for example:
private var kAssociationKeyNextField: UInt8 = 0

extension UITextField {
    @IBOutlet var nextField: UITextField? {
        get {
            return objc_getAssociatedObject(self, &kAssociationKeyNextField) as? UITextField
        }
        set(newField) {
            objc_setAssociatedObject(self, &kAssociationKeyNextField, newField, .OBJC_ASSOCIATION_RETAIN)
        }
    }
}
This serves as just one example of a property in an extension with a setter & getter.
This works:
extension Bool
{
public var integerValue: Int
{
get
{
return true ? 1 : 0
}
set
{
self = (newValue > 0) ? true : false
}
}
}
So yes.
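A quick check of both accessors:

var flag = false
flag.integerValue = 3       // setter: any value greater than 0 becomes true
print(flag)                 // true
print(flag.integerValue)    // 1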