Reputation: 3178
I have allocated almost all of the space on the production server to my tablespace.
I now have a compressed dump of around 20 GB that needs to be loaded into MySQL.
The problem is that the server does not have enough free space to uncompress the file (which requires around 120 GB).
I used the command below, but it failed because it uncompresses the file to disk first instead of piping the output to mysql:
gunzip dbdump.sql.gz | mysql -u root -proot123 -S /home/mysql55/tmp/mysql.sock
Is there any way to load the compressed file without uncompressing it first?
Any suggestions are greatly appreciated.
Upvotes: 14
Views: 47444
Reputation: 114
You can import it with zcat:
zcat dbdump.sql.gz | mysql -u username -p dbname
Upvotes: 1
Reputation: 79
I would recommend getting gunzip. Here is a build for Windows: http://gnuwin32.sourceforge.net/packages/gzip.htm
Once you have unzipped the file, you can import your .sql.
Upvotes: 0
Reputation: 3178
We can achieve the same with the commands below as well. Here I am using gzip:
gzip -d < dbdump.sql.gz | mysql (args..)
Another way is (note the -d; plain gzip -c would compress rather than decompress):
gzip -dc dbdump.sql.gz | mysql (args..)
Upvotes: 3
Reputation: 61
I know this is ridiculous, but in my case the file had been gzipped twice, so extracting
filename.sql.gz
produced a filename.sql that was still compressed. Renaming it
to filename.gz
and extracting it again worked. Hope it helps.
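If you suspect the same double-compression issue, you can check whether the "decompressed" file still starts with the gzip magic bytes (1f 8b). A minimal sketch using throwaway files in /tmp (the filenames here are just for illustration):

```shell
# Build a file that has been gzipped twice, as in this answer.
printf 'SELECT 1;\n' | gzip | gzip > /tmp/twice.sql.gz

# First extraction: the result is still gzip data, not SQL text.
gunzip -c /tmp/twice.sql.gz > /tmp/twice.sql

# Every gzip stream starts with the two bytes 1f 8b.
head -c 2 /tmp/twice.sql | od -An -tx1

# A second decompression pass recovers the actual SQL.
gunzip -c < /tmp/twice.sql
```

Reading from stdin with `gunzip -c <` also sidesteps gunzip's complaint about the missing .gz suffix, so no renaming is needed.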
Upvotes: 5
Reputation: 166919
You can try uncompressing the file on the fly; for example:
$ cat dbdump.sql.gz | gzip -cd | mysql
Upvotes: 0
Reputation: 36940
You should tell gunzip to write to standard output. What you are doing right now is not going to pipe any output at all.
gunzip -c dbdump.sql.gz | mysql (args...)
Upvotes: 17
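To see why the -c flag matters, here is a small sketch with a throwaway file in /tmp: with -c, gunzip streams the decompressed data to stdout and leaves the archive untouched, so nothing extra ever hits the disk; without -c, it would instead replace the .gz with an uncompressed file on disk and print nothing, which is why the original command piped no SQL into mysql.

```shell
# Create a tiny throwaway dump to experiment with.
printf 'SELECT 1;\n' | gzip > /tmp/demo.sql.gz

# -c sends the decompressed contents to stdout (ready for a pipe)
# and leaves /tmp/demo.sql.gz in place on disk.
gunzip -c /tmp/demo.sql.gz
# prints: SELECT 1;
```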