Reputation: 139
I'm trying to retrieve datetime-typed data from PostgreSQL in Django. But if the microseconds have a leading 0, like below,
| datetime_field                |
| 2021-06-07 09:22:13.099866+00 |
the result is shown as
datetime.datetime(2021, 6, 7, 9, 22, 13, 998660, tzinfo=<UTC>)
in Python.
Notice the microseconds: 099866 has been changed to 998660.
If I insert the resulting datetime object back into PostgreSQL without any change, it is stored as below.
| datetime_field               |
| 2021-06-07 09:22:13.99866+00 |
The leading 0 has disappeared.
It seems this problem comes from psycopg2 rather than Django itself, but I can't find any resolution for it.
How can I get the microseconds in their entirety?
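For reference, here is a small plain-Python check (no database involved) of the two values from the example above:

import datetime

# the value stored in PostgreSQL: fractional seconds .099866
stored = datetime.datetime(2021, 6, 7, 9, 22, 13, 99866)
print(stored.isoformat())    # 2021-06-07T09:22:13.099866

# the value that comes back: fractional seconds .998660
returned = datetime.datetime(2021, 6, 7, 9, 22, 13, 998660)
print(returned.isoformat())  # 2021-06-07T09:22:13.998660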
P.S. Edits to the English are welcome; as I'm not a native English speaker, I'm not sure whether my expressions are correct.
Addition: the exact steps.
I'm doing this in Docker containers. Two containers are in use, one for the Django server and one for the PostgreSQL server.
Version info:
Python 3.9.1
Django==3.1.4
psycopg2==2.8.6
postgres (PostgreSQL) 13.1 (Debian 13.1-1.pgdg100+1)
I'm testing with a clone of the production database.
TEST=# SELECT issued_at FROM table_name WHERE id = 153;
issued_at
-------------------------------
2021-06-18 10:10:49.075392+00
(1 row)
In python manage.py shell, I retrieve the data with a raw connections cursor:
>>> from django.db import connections, transaction
>>> with transaction.atomic():
...     query = f'''SELECT issued_at FROM table_name WHERE id = 153;'''
...     with connections['test_db'].cursor() as cursor:
...         cursor.execute(query)
...         print(cursor.fetchone())
...
(datetime.datetime(2021, 6, 18, 10, 10, 49, 753920, tzinfo=<UTC>),)
Addition 2.
When I get the data with Django's Model.objects.get(), the result comes out correct.
>>> data = TableName.objects.get(id=153)
>>> data.issued_at
datetime.datetime(2021, 6, 18, 10, 10, 49, 75392, tzinfo=<UTC>)
It seems to be some kind of problem with django.db.connections.
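To make the difference concrete, here is a minimal side-by-side check of the two paths in the same shell, using the model and connection alias shown above; the expected outputs are the values already reported:

>>> from django.db import connections
>>> TableName.objects.get(id=153).issued_at.microsecond
75392
>>> with connections['test_db'].cursor() as cursor:
...     cursor.execute('SELECT issued_at FROM table_name WHERE id = 153;')
...     print(cursor.fetchone()[0].microsecond)
...
753920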
Upvotes: 2
Views: 371
Reputation: 19742
I'm not seeing it:
CREATE TABLE public.dt_test (
id integer,
ts_fld timestamp without time zone,
tsz_fld timestamp with time zone
);
insert into dt_test values (1, '2021-06-07 09:22:13.099866+00', '2021-06-07 09:22:13.099866+00');
insert into dt_test values (2, '2021-06-07 09:22:13.99866+00', '2021-06-07 09:22:13.99866+00');
select * from dt_test ;
id | ts_fld | tsz_fld
----+----------------------------+-------------------------------
1 | 2021-06-07 09:22:13.099866 | 2021-06-07 02:22:13.099866-07
2 | 2021-06-07 09:22:13.99866 | 2021-06-07 02:22:13.99866-07
import datetime
import psycopg2

con = psycopg2.connect("dbname=test")  # placeholder connection string
cur = con.cursor()
cur.execute('select * from dt_test')
rs = cur.fetchall()
rs
Out[23]:
[(1,
datetime.datetime(2021, 6, 7, 9, 22, 13, 99866),
datetime.datetime(2021, 6, 7, 2, 22, 13, 99866, tzinfo=psycopg2.tz.FixedOffsetTimezone(offset=-420, name=None))),
(2,
datetime.datetime(2021, 6, 7, 9, 22, 13, 998660),
datetime.datetime(2021, 6, 7, 2, 22, 13, 998660, tzinfo=psycopg2.tz.FixedOffsetTimezone(offset=-420, name=None)))]
cur.execute('insert into dt_test(id, ts_fld) values(%s, %s)', [3, datetime.datetime(2021, 6, 7, 9, 22, 13, 99866)])
con.commit()
select * from dt_test ;
id | ts_fld | tsz_fld
----+----------------------------+-------------------------------
1 | 2021-06-07 09:22:13.099866 | 2021-06-07 02:22:13.099866-07
2 | 2021-06-07 09:22:13.99866 | 2021-06-07 02:22:13.99866-07
3 | 2021-06-07 09:22:13.099866 | NULL
What you are seeing is normalization over 1,000,000 (the microsecond field counts millionths of a second):
select 998660/1000000.0;
?column?
------------------------
0.99866000000000000000
select 99866/1000000.0;
?column?
------------------------
0.09986600000000000000
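The same arithmetic on the Python side, since datetime.microsecond is just an integer count of millionths of a second:

print(998660 / 1000000)  # 0.99866
print(99866 / 1000000)   # 0.099866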
Upvotes: 1