Reputation: 3443
In this code, when I pass the parameter alpha through to the UIColor initializer, I get the error "Could not find an overload for '/' that accepts the supplied arguments", but if I set alpha to 1.0 the error disappears.
What could be causing this?
import Foundation
import UIKit

extension UIColor {
    enum AlphaLevel: CGFloat {
        case Empty = 0.0
        case Low = 0.25
        case Half = 0.5
        case High = 0.75
        case Full = 1.0
    }

    class func hazeColor(alpha: AlphaLevel = .Full) -> UIColor! {
        return UIColor(red: 230/255.0, green: 235/255.0, blue: 245/255.0, alpha: alpha)
    }
}
Here is one of the attempts I've tried:
class func hazeColor(alpha: AlphaLevel = .Full) -> UIColor! {
    return UIColor(red: Float(230)/255.0, green: Float(235)/255.0, blue: Float(245)/255.0, alpha: alpha)
}
Upvotes: 0
Views: 97
Reputation: 4631
It is OK to use 230/255.0, because the type of the literals is determined when the whole expression is evaluated. If you first assigned the two numbers to variables (or constants), you would have to convert them before dividing. So there is no problem with 230/255.0. (And in fact, on 64-bit platforms CGFloat is a Double, not a Float.)
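For illustration, here is a minimal sketch of that difference; the variable names are made up for this example:

    import CoreGraphics

    // Dividing literals is fine: both are inferred as CGFloat for the whole expression.
    let fromLiterals: CGFloat = 230 / 255.0

    // With already-typed values, the operands must be converted explicitly first.
    let red = 230        // inferred as Int
    let scale = 255.0    // inferred as Double
    // let broken = red / scale              // error: '/' cannot mix Int and Double
    let converted = CGFloat(red) / CGFloat(scale)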
As an aside, there seems to be a serious bug for me when defining the AlphaLevel enum inside the extension of UIColor: Swift crashes when I do so (although it should be possible). Anyway...
The problem in your code is that the alpha you pass into the hazeColor method is an AlphaLevel enum value, not a CGFloat, so the type check fails. Just change UIColor(red: 230/255.0, green: 235/255.0, blue: 245/255.0, alpha: alpha) to UIColor(red: 230/255.0, green: 235/255.0, blue: 245/255.0, alpha: alpha.toRaw()) and you can work around it.
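Putting it together, a minimal sketch of the corrected method. Note that toRaw() is the Swift 1.0 spelling; in later Swift versions the same conversion is the rawValue property, which is what the sketch below uses:

    import UIKit

    extension UIColor {
        enum AlphaLevel: CGFloat {
            case Empty = 0.0
            case Low = 0.25
            case Half = 0.5
            case High = 0.75
            case Full = 1.0
        }

        class func hazeColor(alpha: AlphaLevel = .Full) -> UIColor! {
            // Pass the enum's underlying CGFloat, not the enum value itself.
            return UIColor(red: 230/255.0, green: 235/255.0, blue: 245/255.0,
                           alpha: alpha.rawValue)   // alpha.toRaw() in Swift 1.0
        }
    }

    // Usage:
    let haze = UIColor.hazeColor(alpha: .Half)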
Upvotes: 1