Reputation:
Does anyone know why a SQL SELECT query returns no rows when selecting from an empty table, but selecting the AVG of a column in an empty table returns NULL? The difference in behavior just seems odd to me. I'm using an SQLite database, if that makes any difference.
Here are the two queries:
Normal select: SELECT a FROM table1
If table1 is empty, I get no rows back.
Avg select: SELECT AVG(a) FROM table1
If table1 is empty, I get back a single row containing NULL.
Upvotes: 1
Views: 4007
Reputation: 1585
AVG is an aggregate function, similar to COUNT. If you do
SELECT COUNT(a) FROM table1
you'd expect to get back a single row containing zero. It's the same with AVG, SUM, etc.: you get one row with the result of the aggregate function (which, for AVG over an empty table, is NULL).
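A quick sketch you can paste into the sqlite3 shell (it reuses the question's table1 and column a; the INTEGER type is just an assumption about the schema):

CREATE TABLE table1 (a INTEGER);  -- empty table, nothing inserted

SELECT a FROM table1;             -- no rows: there is nothing to return
SELECT COUNT(a) FROM table1;      -- one row containing 0
SELECT AVG(a) FROM table1;        -- one row containing NULL
SELECT SUM(a) FROM table1;        -- one row containing NULL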
Upvotes: -1
Reputation: 131112
From the ANSI SQL-92 spec:

b) If AVG, MAX, MIN, or SUM is specified, then
   Case:
   i) If TXA is empty, then the result is the null value.
Read more at: http://www.contrib.andrew.cmu.edu/~shadow/sql/sql1992.txt
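A sketch of that rule in SQLite, again using the question's table1 (COALESCE is just one option if you'd rather get a default value back than NULL):

SELECT AVG(a), MIN(a), MAX(a), SUM(a) FROM table1;  -- one row: NULL, NULL, NULL, NULL
SELECT COUNT(a) FROM table1;                        -- one row: 0 (COUNT is the exception)
SELECT COALESCE(AVG(a), 0) FROM table1;             -- one row: 0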
Upvotes: 4
Reputation: 108236
You're doing an aggregate. Since the aggregate is defined over 0 to n rows (and for 0 rows the result is NULL), you will always get exactly one row back.
To put it another way, you're not asking for rows from the table; you're asking for the average of one column of the table, and that's what you're getting back. Getting anything other than one row in this case would be the odder behavior.
If you had asked for non-aggregated columns, too, e.g.
SELECT Salesperson, AVG(Sale)
FROM Sales
GROUP BY Salesperson
then you would get no rows back, because an empty table produces no groups for the aggregate to operate on.
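A sketch of both shapes against an empty table (the Sales table and its column types are hypothetical, just mirroring the example above):

CREATE TABLE Sales (Salesperson TEXT, Sale REAL);  -- hypothetical table, left empty

SELECT AVG(Sale) FROM Sales;                       -- no GROUP BY: one row containing NULL

SELECT Salesperson, AVG(Sale)
FROM Sales
GROUP BY Salesperson;                              -- GROUP BY: no rows, since there are no groups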
Upvotes: 1
Reputation: 680
I'm not positive, but to compute an average you have to divide the sum by the number of rows. If the number of rows is zero, that division is undefined, hence the NULL. Just a guess.
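For what it's worth, SQLite's own arithmetic behaves consistently with that guess: dividing by zero yields NULL rather than an error, so computing the average "by hand" on the empty table1 also comes out NULL (small sketch):

SELECT 1 / 0;                           -- NULL: SQLite returns NULL for division by zero
SELECT SUM(a) / COUNT(a) FROM table1;   -- NULL: SUM(a) is NULL and COUNT(a) is 0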
Upvotes: 3