Reputation: 15002
I tried to group by hour, but it failed.
Here is my query:
self
.where{ name =~ 'MemoryInfo' }
.where{ sony_alarm_test_id.eq(test_id)}
.group("DATE_PART('hour', utc_time )").count
The error message:
D, [2014-06-27T15:34:39.713265 #26124] DEBUG -- : (0.6ms) SELECT COUNT(*) AS count_all, DATE_PART('hour', utc_time ) AS date_part_hour_utc_time FROM "sony_alarm_logs" WHERE "sony_alarm_logs"."sony_alarm_test_id" = 1 AND "sony_alarm_logs"."name" ILIKE 'MemoryInfo' GROUP BY DATE_PART('hour', utc_time ) ORDER BY utc_time
E, [2014-06-27T15:34:39.713421 #26124] ERROR -- : PG::UndefinedFunction: ERROR: function date_part(unknown, double precision) does not exist
LINE 1: SELECT COUNT(*) AS count_all, DATE_PART('hour', utc_time ) A...
Data
+-------+------------+------------------+--------------------+--------------------------------------------------------+
| id | name | utc_time | sony_alarm_test_id | brief_content |
+-------+------------+------------------+--------------------+--------------------------------------------------------+
| 25989 | MemoryInfo | 1403512193.57157 | 5 | {"used"=>"156572", "free"=>"150172", "shared"=>"6400"} |
| 25990 | MemoryInfo | 1403512224.60379 | 5 | {"used"=>"156572", "free"=>"150136", "shared"=>"6436"} |
| 25991 | MemoryInfo | 1403512255.63598 | 5 | {"used"=>"156572", "free"=>"150144", "shared"=>"6428"} |
| 25992 | MemoryInfo | 1403512286.66835 | 5 | {"used"=>"156572", "free"=>"150144", "shared"=>"6428"} |
| 25993 | MemoryInfo | 1403512317.70055 | 5 | {"used"=>"156572", "free"=>"150136", "shared"=>"6436"} |
| 25994 | MemoryInfo | 1403512348.73276 | 5 | {"used"=>"156572", "free"=>"150136", "shared"=>"6436"} |
| 25995 | MemoryInfo | 1403512379.76492 | 5 | {"used"=>"156572", "free"=>"150144", "shared"=>"6428"} |
| 25996 | MemoryInfo | 1403512410.7972 | 5 | {"used"=>"156572", "free"=>"150252", "shared"=>"6320"} |
| 25997 | MemoryInfo | 1403512441.82937 | 5 | {"used"=>"156572", "free"=>"150144", "shared"=>"6428"} |
| 25998 | MemoryInfo | 1403512472.86155 | 5 | {"used"=>"156572", "free"=>"150224", "shared"=>"6348"} |
| 25999 | MemoryInfo | 1403512503.89374 | 5 | {"used"=>"156572", "free"=>"150144", "shared"=>"6428"} |
| 26000 | MemoryInfo | 1403512534.92593 | 5 | {"used"=>"156572", "free"=>"150144", "shared"=>"6428"} |
| 26001 | MemoryInfo | 1403512565.95812 | 5 | {"used"=>"156572", "free"=>"150144", "shared"=>"6428"} |
| 26002 | MemoryInfo | 1403512596.99028 | 5 | {"used"=>"156572", "free"=>"150144", "shared"=>"6428"} |
| 26003 | MemoryInfo | 1403512628.02261 | 5                  | {"used"=>"156572", "free"=>"150144", "shared"=>"6428"} |
+-------+------------+------------------+--------------------+--------------------------------------------------------+
Upvotes: 1
Views: 488
Reputation: 36244
Your utc_time column is double precision. All date/time functions work on the timestamp[tz], date and time[tz] types.
You can convert your double precision values to timestamp using the function to_timestamp(double precision) -- assuming utc_time counts seconds (and not milliseconds) from the Unix epoch.
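For example, a quick sketch in SQL using the first epoch value from the data above (the exact hour returned depends on the server's time zone setting):
-- convert the epoch seconds to a timestamptz, then extract the hour from it
SELECT to_timestamp(1403512193.57157) AS ts,
       DATE_PART('hour', to_timestamp(1403512193.57157)) AS hour;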
Upvotes: 0
Reputation: 22663
There is no such function as date_part(unknown, double precision). The documentation for date_part is here - it can accept timestamp or interval. The column utc_time is not a timestamp.
You will need to use the to_timestamp() function in your query (docs), something like:
self
.where{ name =~ 'MemoryInfo' }
.where{ sony_alarm_test_id.eq(test_id)}
.group("DATE_PART('hour', to_timestamp(utc_time) )").count
Upvotes: 1