Brandon

Reputation: 2427

Swift implement LiteralConvertible protocol

I'm trying to implement the IntegerLiteralConvertible protocol on UIColor. What I'd really like to do is this:

 let black: UIColor = 0x000000

I first tried following the Swift blog post here https://developer.apple.com/swift/blog/?id=8, which doesn't work out of the box. According to Swift in Flux https://github.com/ksm/SwiftInFlux#literalconvertible-protocols-use-constructor, the literal convertible protocols no longer use convertFromIntegerLiteral and instead use an initializer. So this is what we should have:

extension UIColor: IntegerLiteralConvertible {
    public convenience init(integerLiteral value: IntegerLiteralType) {
        UIColor(red: 1, green: 0, blue: 0, alpha: 1)
    }
}

But then what goes inside the initializer? I can't set self. I'd like to say something like

self.init(red: 0, green: 0, blue: 0, alpha: 1)

but that doesn't work, and neither does anything else it seems. I get the error "initializer requirement init(IntegerLiteral) can only be satisfied by a 'required' initializer in the definition of non-final class 'UIColor'", which isn't very helpful. Any ideas on how to make this work?

Upvotes: 2

Views: 1309

Answers (4)

DevAndArtist

Reputation: 5149

Here is the (workaround) code for Swift 2.0:

public class Color: UIColor, IntegerLiteralConvertible {

    public typealias IntegerLiteralType = UInt32

    public required init(integerLiteral value: IntegerLiteralType) {

        // Interpret the literal as 0xAARRGGBB and split out the four components
        let a = CGFloat((value & 0xFF000000) >> 24) / 255.0
        let r = CGFloat((value & 0xFF0000) >> 16) / 255.0
        let g = CGFloat((value & 0xFF00) >> 8) / 255.0
        let b = CGFloat((value & 0xFF)) / 255.0

        super.init(red: r, green: g, blue: b, alpha: a)
    }

    required public init(colorLiteralRed red: Float, green: Float, blue: Float, alpha: Float) {

        super.init(red: CGFloat(red), green: CGFloat(green), blue: CGFloat(blue), alpha: CGFloat(alpha))
    }

    required public init?(coder aDecoder: NSCoder) {

        super.init(coder: aDecoder)
    }
}
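
With the 0xAARRGGBB layout implied by the masks above, usage would then look something like this (the constant names are just illustrative):

let opaqueRed: Color = 0xFFFF0000        // alpha 1.0, red 1.0
let translucentBlue: Color = 0x800000FF  // alpha ~0.5, blue 1.0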

Upvotes: 2

Matt

Reputation: 248

Unfortunately, it seems that with Swift 1.1 the only workaround for this is either subclassing or struct composition :/

class MyColor: UIColor, IntegerLiteralConvertible {
    required init(integerLiteral value: IntegerLiteralType) {
        // Split the 0xRRGGBB value into its components
        let red   = CGFloat((value & 0xFF0000) >> 16) / 255.0
        let green = CGFloat((value & 0x00FF00) >> 8) / 255.0
        let blue  = CGFloat(value & 0x0000FF) / 255.0
        super.init(red: red, green: green, blue: blue, alpha: 1.0)
    }
    required init(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }
}

var x: MyColor = 0x00FF00
println(x)

Upvotes: 0

Brandon

Reputation: 2427

It looks like, given the changes in Swift 1.1, this isn't possible anymore. Hopefully that changes in a future release.

https://devforums.apple.com/message/1057171#1057171

Upvotes: 1

Mundi

Reputation: 80273

If you just write

let black = 0x000000

there is no way Swift can know you want to define a color. There is just a literal hexadecimal number, so black will simply be an integer by type inference.

Maybe you are looking for something like

let black : UIColor = 0x000000

But I think that the way you are going about it might be unnecessarily complicated. The most natural way is to simply write an initializer that takes an Int.

extension UIColor {
    convenience init(_ hex: Int) {
        // Break the 0xRRGGBB value into r, g, b as CGFloats
        let r = CGFloat((hex & 0xFF0000) >> 16) / 255.0
        let g = CGFloat((hex & 0x00FF00) >> 8) / 255.0
        let b = CGFloat(hex & 0x0000FF) / 255.0
        self.init(red: r, green: g, blue: b, alpha: 1)
    }
}

Notice the underscore before the argument name, which lets you omit the label at the call site. Now you can create your color succinctly:

let black = UIColor(0x000000)

Upvotes: 1
