Yanting Chen

Reputation: 11

Import data to HDFS using Sqoop2

Following the official guide (http://sqoop.apache.org/docs/1.99.2/Sqoop5MinutesDemo.html), I successfully created a job.

However, when I executed the command submission start --jid 1, I got this error message:

Exception has occurred during processing command 
Server has returned exception: Exception: java.lang.Throwable Message: GENERIC_JDBC_CONNECTOR_0002:Unable to execute the SQL statement

Here is my job configuration:

Database configuration

Schema name: invoice
Table name: ds_msg_log
Table SQL statement: 
Table column names: *
Partition column name: 
Boundary query: 

Output configuration

Storage type: HDFS
Output format: TEXT_FILE
Output directory: /user/root/ds_msg_log

Throttling resources

Extractors: 
Loaders: 

Since the official guide says nothing about how to set these values, does anyone know what is wrong with my job settings?

This is the log:

Stack trace:
     at  org.apache.sqoop.connector.jdbc.GenericJdbcExecutor (GenericJdbcExecutor.java:59)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:155)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:48)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:37)  
     at  org.apache.sqoop.framework.FrameworkManager (FrameworkManager.java:447)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:112)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:98)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:68)  
     at  org.apache.sqoop.server.v1.SubmissionServlet (SubmissionServlet.java:44)  
     at  org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:63)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:637)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:717)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)  
     at  org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)  
     at  org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)  
     at  org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)  
     at  org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:102)  
     at  org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)  
     at  org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)  
     at  org.apache.coyote.http11.Http11Processor (Http11Processor.java:859)  
     at  org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:602)  
     at  org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)  
     at  java.lang.Thread (Thread.java:724)  
Caused by: Exception: java.lang.Throwable Message: ERROR: schema "invoice" does not exist
  Position: 46
Stack trace:
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:2102)  
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:1835)  
     at  org.postgresql.core.v3.QueryExecutorImpl (QueryExecutorImpl.java:257)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:500)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:374)  
     at  org.postgresql.jdbc2.AbstractJdbc2Statement (AbstractJdbc2Statement.java:254)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcExecutor (GenericJdbcExecutor.java:56)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:155)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:48)  
     at  org.apache.sqoop.connector.jdbc.GenericJdbcImportInitializer (GenericJdbcImportInitializer.java:37)  
     at  org.apache.sqoop.framework.FrameworkManager (FrameworkManager.java:447)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:112)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:98)  
     at  org.apache.sqoop.handler.SubmissionRequestHandler (SubmissionRequestHandler.java:68)  
     at  org.apache.sqoop.server.v1.SubmissionServlet (SubmissionServlet.java:44)  
     at  org.apache.sqoop.server.SqoopProtocolServlet (SqoopProtocolServlet.java:63)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:637)  
     at  javax.servlet.http.HttpServlet (HttpServlet.java:717)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:290)  
     at  org.apache.catalina.core.ApplicationFilterChain (ApplicationFilterChain.java:206)  
     at  org.apache.catalina.core.StandardWrapperValve (StandardWrapperValve.java:233)  
     at  org.apache.catalina.core.StandardContextValve (StandardContextValve.java:191)  
     at  org.apache.catalina.core.StandardHostValve (StandardHostValve.java:127)  
     at  org.apache.catalina.valves.ErrorReportValve (ErrorReportValve.java:102)  
     at  org.apache.catalina.core.StandardEngineValve (StandardEngineValve.java:109)  
     at  org.apache.catalina.connector.CoyoteAdapter (CoyoteAdapter.java:293)  
     at  org.apache.coyote.http11.Http11Processor (Http11Processor.java:859)  
     at  org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler (Http11Protocol.java:602)  
     at  org.apache.tomcat.util.net.JIoEndpoint$Worker (JIoEndpoint.java:489)  
     at  java.lang.Thread (Thread.java:724)  
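
The "Caused by" section points at the actual problem: PostgreSQL cannot find a schema named invoice in the database the connection uses. One way to verify which schemas exist is through psql, where \dn lists all schemas and \dt invoice.* lists the tables in the invoice schema if it is present (a sketch; substitute your own user and database name):

    $ psql -U your_user -d your_database
    your_database=# \dn
    your_database=# \dt invoice.*

If the table actually lives in the default public schema, the Schema name field may need to be left empty or changed accordingly.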

Upvotes: 0

Views: 3708

Answers (2)

Devan M S

Reputation: 702

Table column names: *

You can't use *; give a comma-separated list of column names instead. You should also give one column name as the partition column; any column will do. (It is used to split the import job into multiple tasks for parallel processing.) You can leave the undocumented parameters empty. For the storage type (HDFS) and the file format (sequence file or text file), you enter the integer number of the option you want.
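
For example, creating an import job in the 1.99.x shell walks you through prompts like these (a sketch based on the linked demo; the exact prompt text can differ between releases, and the job name and the column names id, msg, created_at are hypothetical placeholders for the real columns of ds_msg_log):

    sqoop:000> create job --xid 1 --type import
    Creating job for connection with id 1
    Please fill following values to create new job object
    Name: ImportDsMsgLog
    Database configuration
    Schema name: invoice
    Table name: ds_msg_log
    Table SQL statement: 
    Table column names: id, msg, created_at
    Partition column name: id
    Boundary query: 
    Output configuration
    Storage type: 
      0 : HDFS
    Choose: 0
    Output format: 
      0 : TEXT_FILE
      1 : SEQUENCE_FILE
    Choose: 0
    Output directory: /user/root/ds_msg_log
    Throttling resources
    Extractors: 
    Loaders: 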

Here is a sample job, displayed with show job --jid your-job-id:

sqoop:000> show job --jid 146
1 job(s) to show:
Job with id 146 and name ImportJob (Created 10/10/13 3:46 PM, Updated 10/10/13 3:46 PM)
Using Connection id 149 and Connector id 1

Database configuration

Schema name: xx
Table name: xxx
Table SQL statement: 
Table column names: one, two, three
Partition column name: one
Boundary query: 

Output configuration

Storage type: HDFS
Output format: TEXT_FILE
Output directory: /devanms/

Throttling resources

Extractors: 
Loaders: 

Here is my blog post on the Sqoop Java client:

http://devslogics.blogspot.in/2013/09/sqoop-java-client.html

Upvotes: 0

Jarek Jarcec Cecho

Reputation: 1726

The value "*" in "Table column names" is not necessary as the default value is "all the columns". It would be also helpful if you could share the server logs to see what went wrong.

You can get additional information, such as the entire stack trace of the exception, by switching the shell into verbose mode:

set option --name verbose --value true
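
For example, in one shell session (the jid matches the job from the question):

    sqoop:000> set option --name verbose --value true
    sqoop:000> submission start --jid 1

With verbose on, the shell prints the full server-side stack trace instead of the one-line GENERIC_JDBC_CONNECTOR_0002 summary.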

Upvotes: 1
