renanleandrof

Reputation: 7017

SQL: What is better, a bit or a char(1)?

Is there any performance difference when retrieving a bit or a char(1)?

Just for curiosity =]

UPDATE: Supposing I'm using SQL Server 2008!

Upvotes: 12

Views: 38561

Answers (5)

Aaron Kempf

Reputation: 588

Use a bit. ALWAYS use the smallest possible datatype; it becomes critical once you start getting large tables.

Upvotes: -1

SQLMenace

Reputation: 135171

A bit and a char(1) will both take 1 byte to store, assuming you only have 1 bit column in the table; SQL Server will store up to 8 bit columns in 1 byte. I don't think there is a difference in performance.

One thing to be aware of is that you can't use SUM on a bit column:

CREATE TABLE #test( a BIT)

INSERT #test VALUES (1)
INSERT #test VALUES (1)

SELECT sum(a) FROM #test

Msg 8117, Level 16, State 1, Line 1
Operand data type bit is invalid for sum operator.

You have to convert it first:

SELECT sum(CONVERT(INT,a)) FROM #test
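
A CAST works just as well, and if all you need is how many rows have the flag set, a plain COUNT avoids the conversion entirely - a small sketch of both alternatives:

SELECT sum(CAST(a AS INT)) FROM #test

SELECT COUNT(*) FROM #test WHERE a = 1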

Upvotes: 3

marc_s

Reputation: 755401

For SQL Server: up to 8 columns of type BIT can be stored inside a single byte, while each column of type CHAR(1) will take up one byte.

On the other hand: a BIT column can have two values (0 = false, 1 = true) or no value at all (NULL) - while a CHAR(1) can have any character value (many more possibilities)

So really, it comes down to:

  • do you really need a true/false (yes/no) field? If so: use BIT
  • do you need something with more than just two possible values? If so: use CHAR(1)

I don't think it makes any significant difference from a performance point of view - unless you have tens of thousands of such columns. Then, of course, using BIT, which can store up to 8 columns in a single byte, would be beneficial. But again: for your "normal" database case, where you have a few or a dozen of those columns, it really doesn't make a big difference. Pick the column type that suits your needs - don't over-worry about performance.
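
If you want to check the packing claim on your own instance, a rough sketch (the table names here are just placeholders) is to create one table with eight BIT columns and one with eight CHAR(1) columns and compare the record sizes reported by sys.dm_db_index_physical_stats:

CREATE TABLE dbo.Flags_Bit  (f1 BIT, f2 BIT, f3 BIT, f4 BIT, f5 BIT, f6 BIT, f7 BIT, f8 BIT)
CREATE TABLE dbo.Flags_Char (f1 CHAR(1), f2 CHAR(1), f3 CHAR(1), f4 CHAR(1), f5 CHAR(1), f6 CHAR(1), f7 CHAR(1), f8 CHAR(1))

INSERT dbo.Flags_Bit  VALUES (1, 1, 1, 1, 1, 1, 1, 1)
INSERT dbo.Flags_Char VALUES ('Y', 'Y', 'Y', 'Y', 'Y', 'Y', 'Y', 'Y')

-- the BIT table should report a noticeably smaller record size, since the eight
-- flags share a single byte while the CHAR(1) columns need one byte each
SELECT OBJECT_NAME(object_id) AS table_name, avg_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'DETAILED')
WHERE object_id IN (OBJECT_ID('dbo.Flags_Bit'), OBJECT_ID('dbo.Flags_Char'))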

Upvotes: 19

Jon Egerton

Reputation: 41579

As Adam says, it depends on the database implementing the data types properly, but in theory the following holds:

Bit:

Will store 1, 0, or NULL. Only takes a bit to store the value (by definition!). Usually used for true or false, and many programming languages will interpret a bit as a true/false field automatically.

Char[1]:

A char takes 8 bits, or one byte, so it's 8 times larger when stored. You can store (pretty much) any character in there. Will probably be interpreted as a string by programming languages. I think Char[1] will always take the full byte, even when empty, unless you use varchar or nvarchar.
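
A quick way to see the fixed-width behaviour (just a DATALENGTH sketch) is:

SELECT DATALENGTH(CAST('' AS CHAR(1)))    -- 1: CHAR(1) is padded out to its full width
SELECT DATALENGTH(CAST('' AS VARCHAR(1))) -- 0: VARCHAR only stores what you actually put in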

Upvotes: 1

Adam Matan

Reputation: 136499

It's implementation dependent. One DBMS might have the same performance, while another might have differences.

Upvotes: 2
