Reputation: 1319
I am having some trouble converting a hex code to an NSColor. Note this is for a Mac app (hence NSColor instead of UIColor). This is the code I have so far:
- (NSColor *)createNSColorFromString:(NSString *)string {
    NSString *hexNum = [string substringFromIndex:1];
    NSColor *color = nil;
    unsigned int colorCode = 0;
    unsigned char red, green, blue;

    if (string) {
        NSScanner *scanner = [NSScanner scannerWithString:hexNum];
        (void)[scanner scanHexInt:&colorCode];
    }
    red = (unsigned char)(colorCode >> 16);
    green = (unsigned char)(colorCode >> 8);
    blue = (unsigned char)(colorCode);
    color = [NSColor colorWithCalibratedRed:(float)red / 0xff
                                      green:(float)green / 0xff
                                       blue:(float)blue / 0xff
                                      alpha:1.0];
    return color;
}
Any help would be appreciated.
Upvotes: 15
Views: 13409
Reputation: 1316
+ (NSColor *)colorWithHexString:(NSString *)hexString alpha:(CGFloat)alpha {
    unsigned long hexValue = strtoul(hexString.UTF8String, NULL, 16);
    return [NSColor colorWithHex:hexValue alpha:alpha];
}

+ (NSColor *)colorWithHex:(unsigned long)hexValue alpha:(CGFloat)alpha {
    return [NSColor colorWithRed:((float)((hexValue & 0xFF0000) >> 16)) / 255.0
                           green:((float)((hexValue & 0x00FF00) >> 8)) / 255.0
                            blue:((float)(hexValue & 0x0000FF)) / 255.0
                           alpha:alpha];
}
You may need:
#include <stdlib.h>
Upvotes: 0
Reputation: 1
The hexTriplet struct in PocketSVG contains an interesting alternative solution. It outputs a CGColor but that's probably a good thing if you want to use it cross-platform:
https://github.com/pocketsvg/PocketSVG/blob/master/Sources/SVGEngine.mm
Upvotes: 0
Reputation: 89509
Here's the Swift 2-compatible version of Zlatan's answer above (and +1 to him!):
func getColorFromString(webColorString: String) -> NSColor? {
    var result: NSColor? = nil
    var colorCode: UInt32 = 0
    var redByte, greenByte, blueByte: UInt8

    // These two lines are for web color strings that start with a #
    // -- as in #ABCDEF; remove if you don't have # in the string.
    let index1 = webColorString.endIndex.advancedBy(-6)
    let substring1 = webColorString.substringFromIndex(index1)

    let scanner = NSScanner(string: substring1)
    let success = scanner.scanHexInt(&colorCode)
    if success {
        redByte = UInt8(truncatingBitPattern: (colorCode >> 16))
        greenByte = UInt8(truncatingBitPattern: (colorCode >> 8))
        blueByte = UInt8(truncatingBitPattern: colorCode) // masks off high bits
        result = NSColor(calibratedRed: CGFloat(redByte) / 0xff,
                         green: CGFloat(greenByte) / 0xff,
                         blue: CGFloat(blueByte) / 0xff,
                         alpha: 1.0)
    }
    return result
}
Upvotes: 4
Reputation: 5569
NSColorParser.nsColor("#FF0000",1)//red nsColor
NSColorParser.nsColor("FF0",1)//red nsColor
NSColorParser.nsColor("0xFF0000",1)//red nsColor
NSColorParser.nsColor("FF0000",1)//red nsColor
NSColorParser.nsColor(0xFF0000,1)//red nsColor
NSColorParser.nsColor(16711935,1)//red nsColor
http://stylekit.org/blog/2015/11/09/Supporting-7-Hex-color-types/
NOTE: This isn't plug-and-play; you have to dig a little in the code. But it's all there, and it's probably faster than rolling your own.
Upvotes: 0
Reputation: 301
Here are two very useful macros:
#define RGBA(r,g,b,a) [NSColor colorWithCalibratedRed:r/255.f green:g/255.f blue:b/255.f alpha:a/255.f]
#define NSColorFromRGB(rgbValue) [NSColor colorWithCalibratedRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 green:((float)((rgbValue & 0xFF00) >> 8))/255.0 blue:((float)(rgbValue & 0xFF))/255.0 alpha:1.0]
Upvotes: 8
Reputation: 698
+ (NSColor *)colorWithHexColorString:(NSString *)inColorString
{
    NSColor *result = nil;
    unsigned colorCode = 0;
    unsigned char redByte, greenByte, blueByte;

    if (nil != inColorString)
    {
        NSScanner *scanner = [NSScanner scannerWithString:inColorString];
        (void)[scanner scanHexInt:&colorCode]; // ignore error
    }
    redByte = (unsigned char)(colorCode >> 16);
    greenByte = (unsigned char)(colorCode >> 8);
    blueByte = (unsigned char)(colorCode); // masks off high bits
    result = [NSColor colorWithCalibratedRed:(CGFloat)redByte / 0xff
                                       green:(CGFloat)greenByte / 0xff
                                        blue:(CGFloat)blueByte / 0xff
                                       alpha:1.0];
    return result;
}
It doesn't take alpha values into account and it assumes values like "FFAABB", but it would be easy to modify.
Upvotes: 28