Miguel Manjarrez

Reputation: 159

Calculate the percentage of rows that are not NULL for each column in a table, grouped by another column

I have a regular table [customer_table] with some NULL values that looks like this:

id | customer | country | col0 | col1 | col2 |
==============================================
1  | foo      | USA     | NULL | foo  | bar  | 
2  | bar      | USA     | foo  | NULL | foo  | 
3  | foo2     | CANADA  | bar  | col1 | NULL | 
4  | bar2     | GERMANY | foo  | NULL | bar  | 
5  | bar3     | CANADA  | foo  | foo  | bar  | 
6  | bar4     | UK      | bar  | foo  | bar  | 
7  | bar5     | UK      | bar  | bar  | bar  | 

And I want to calculate the percentage of non-NULL values for each column, grouped by country:

country | col0% | col1% | col2% |
=================================
USA     | 50%   | 50%   | 100%  |
GERMANY | 100%  | 0%    | 100%  |
CANADA  | 100%  | 100%  | 50%   |
UK      | 100%  | 100%  | 100%  |

This is what I tried to do:

select TOTAL.[country],
       [count_col0] * 100 / [count_total] as [col0%],
       [count_col1] * 100 / [count_total] as [col1%]
from (
    (select [country], COUNT(*) as [count_total]
     from [customer_table]
     where [country] <> ''
     group by [country]) TOTAL
    left join
    (select [country], COUNT(*) as [count_col0]
     from [customer_table]
     where [country] <> '' and [col0] <> ''
     group by [country]) T_COL0
     on T_COL0.[country] = TOTAL.[country]
    left join
    (select [country], COUNT(*) as [count_col1]
     from [customer_table]
     where [country] <> '' and [col1] <> ''
     group by [country]) T_COL1
     on T_COL1.[country] = TOTAL.[country]
)

It works, but I have a lot of columns, and I don't think it is a good solution

Upvotes: 0

Views: 1016

Answers (3)

Thailo

Reputation: 1424

You are looking for a COUNT(xxx) / COUNT(*) pattern here; COUNT(xxx) only counts the rows where xxx is not NULL.

Now, when you have lots of columns to cover, you can look them up in the INFORMATION_SCHEMA.COLUMNS system view and generate the query you want to run, like this:

SELECT
  'SELECT country'
UNION ALL
SELECT
      CONCAT(', (100 * COUNT(', COLUMN_NAME, ')) / COUNT(*) AS [', COLUMN_NAME, '%]')
FROM  INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'customer_table'
      AND TABLE_SCHEMA = 'dbo'
      AND COLUMN_NAME NOT IN ('id', 'customer', 'country')
UNION ALL
SELECT
  'FROM dbo.customer_table GROUP BY country;'

Which would result in:

SELECT country
, (100 * COUNT(col0)) / COUNT(*) AS [col0%]
, (100 * COUNT(col1)) / COUNT(*) AS [col1%]
, (100 * COUNT(col2)) / COUNT(*) AS [col2%]
FROM dbo.customer_table GROUP BY country;
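
If you want to go one step further and run the generated statement in one go, here is a minimal sketch that concatenates the pieces with STRING_AGG and executes them with sp_executesql (assuming SQL Server 2017 or later for STRING_AGG, and the same dbo.customer_table as above):

DECLARE @sql NVARCHAR(MAX), @nl NCHAR(1) = NCHAR(10);

SELECT @sql =
      'SELECT country' + @nl
      -- One "(100 * COUNT(col)) / COUNT(*)" expression per data column;
      -- CONVERT to NVARCHAR(MAX) guards against truncation on very wide tables.
    + STRING_AGG(CONVERT(NVARCHAR(MAX),
          CONCAT(', (100 * COUNT(', QUOTENAME(COLUMN_NAME), ')) / COUNT(*) AS ',
                 QUOTENAME(COLUMN_NAME + '%'))), @nl)
    + @nl + 'FROM dbo.customer_table GROUP BY country;'
FROM  INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'customer_table'
      AND TABLE_SCHEMA = 'dbo'
      AND COLUMN_NAME NOT IN ('id', 'customer', 'country');

EXEC sys.sp_executesql @sql;

The assembled string is exactly the SELECT shown above, so adding or dropping columns in the table needs no change to the script.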

Upvotes: 1

Gordon Linoff

Reputation: 1269753

Just use aggregation. COUNT(col) only counts non-NULL values, so the simplest method is:

select country,
       count(col0) * 100.0 / count(*) as [col0%],
       count(col1) * 100.0 / count(*) as [col1%],
       count(col2) * 100.0 / count(*) as [col2%]
from customer_table
group by country
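
If you prefer to see the percentage arithmetic spelled out per row, an equivalent sketch (using the question's customer_table and col0 to col2) is:

-- Each populated row contributes 100.0 and each NULL contributes 0,
-- so the per-country average is the non-NULL percentage.
select country,
       avg(case when col0 is not null then 100.0 else 0 end) as [col0%],
       avg(case when col1 is not null then 100.0 else 0 end) as [col1%],
       avg(case when col2 is not null then 100.0 else 0 end) as [col2%]
from customer_table
group by country;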

Upvotes: 2

reactnoob2019

Reputation: 79

DECLARE @customertable TABLE (country NVARCHAR(100), col1 BIGINT, col2 BIGINT, col3 BIGINT);

INSERT INTO @customertable (country, col1, col2, col3)
VALUES
     (N'USA', 0, NULL, 0)
    ,(N'USA', 0, NULL, 0)
    ,(N'USA', NULL, NULL, 0)
    ,(N'USA', 0, 0, NULL)
    ,(N'CA', 0, NULL, 0)
    ,(N'CA', 0, NULL, 0)
    ,(N'CA', NULL, NULL, 0)
    ,(N'CA', 0, 0, NULL);

-- One row per country to drive the OUTER APPLY below.
WITH DistinctCountries AS (
    SELECT DISTINCT Country
    FROM @customertable
)
SELECT DistinctCountries.Country
    , col1 / (Total * 1.0) AS [col1pct]
    , col2 / (Total * 1.0) AS [col2pct]
    , col3 / (Total * 1.0) AS [col3pct]
FROM DistinctCountries
OUTER APPLY (
    -- Non-NULL count per column plus the total row count for this country.
    SELECT
         SUM(CASE WHEN col1 IS NULL THEN 0 ELSE 1 END) AS col1
        ,SUM(CASE WHEN col2 IS NULL THEN 0 ELSE 1 END) AS col2
        ,SUM(CASE WHEN col3 IS NULL THEN 0 ELSE 1 END) AS col3
        ,COUNT(1) AS Total
    FROM @customertable AS CountApply
    WHERE CountApply.Country = DistinctCountries.Country
) AS MainCount;

If you already have a table with a unique list of countries, it is probably best to use that instead of the DistinctCountries CTE.

If you have a ton of columns, it is probably best to create a dynamic SQL query that builds each CASE expression for you, or a dynamic pivot query.
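
For example, a rough sketch of the dynamic approach (assuming SQL Server 2017+ for STRING_AGG and a real dbo.customer_table, since table variables are not visible in INFORMATION_SCHEMA; names are illustrative) could look like this:

DECLARE @cols NVARCHAR(MAX), @sql NVARCHAR(MAX);

-- Build one "SUM(CASE ...) / COUNT(*)" expression per data column.
SELECT @cols = STRING_AGG(CONVERT(NVARCHAR(MAX),
    CONCAT(', SUM(CASE WHEN ', QUOTENAME(COLUMN_NAME),
           ' IS NULL THEN 0 ELSE 1 END) / (COUNT(1) * 1.0) AS ',
           QUOTENAME(COLUMN_NAME + 'pct'))), '')
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo'
  AND TABLE_NAME = 'customer_table'
  AND COLUMN_NAME NOT IN ('id', 'customer', 'country');

SET @sql = 'SELECT country' + @cols + ' FROM dbo.customer_table GROUP BY country;';
EXEC sys.sp_executesql @sql;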

Upvotes: 0
