Reputation: 162
I have a project where I have to run a roughly 300-line SQL script on Teradata. The script first creates a table, then populates it with a series of updates and inserts that draw on other tables, and ends with a SELECT statement that pulls the data in the columns we require.
My question is: how can I run those 300 lines from Hadoop?
I tried it with sqoop eval, which looks like:

sqoop eval -libjars /var/lib/sqoop/terajdbc4.jar,/var/lib/sqoop/tdgssconfig.jar --driver com.teradata.jdbc.TeraDriver --connect (connection parameters) --query "300 lines of query"
It throws this error:
WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
17/01/30 15:05:53 INFO manager.SqlManager: Using default fetchSize of 1000
17/01/30 15:05:54 WARN tool.EvalSqlTool: SQL exception executing statement: java.sql.SQLException: [Teradata Database] [TeraJDBC 15.10.00.14] [Error 3576] [SQLState 25000] Data definition not valid unless solitary.
What does "solitary" mean here?
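Teradata error 3576 generally indicates that a data-definition statement (such as CREATE TABLE) must be the only statement in the request it is submitted in, so a 300-line script mixing DDL and DML cannot be sent as one --query string. A minimal sketch of breaking the script up, assuming semicolons appear only as statement terminators (a script with semicolons inside string literals would need a real parser) and using placeholder table names:

```shell
# Hypothetical sketch: split a multi-statement script into individual
# statements so each DDL statement runs "solitary" (alone in its request).
# Table and column names below are placeholders, not from the real script.

cat > /tmp/script.sql <<'EOF'
CREATE TABLE mydb.target_tbl (id INTEGER, val VARCHAR(20));
UPDATE mydb.target_tbl SET val = 'x' WHERE id = 1;
SELECT id, val FROM mydb.target_tbl;
EOF

# Flatten the file to one line, split it on ';', and loop over statements.
tr '\n' ' ' < /tmp/script.sql | tr ';' '\n' | while read -r stmt; do
  [ -z "$stmt" ] && continue
  echo "would run: sqoop eval ... --query \"$stmt\""
  # In the real flow, each statement would get its own invocation:
  # sqoop eval -libjars ... --driver com.teradata.jdbc.TeraDriver \
  #   --connect "<connection parameters>" --query "$stmt"
done
```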
My plan is to run the query through sqoop eval and then import the resulting table into Hadoop with sqoop import. Advise me whether this is the right way to do this, or whether there is a better approach.
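Under that plan, the flow might look like the sketch below. The JDBC URL, credentials, and table names are placeholders (not from the original script), and each setup statement gets its own sqoop eval call so no request mixes DDL with other statements:

```shell
# Hypothetical sketch: run each setup statement on its own via sqoop eval,
# then pull the finished table into HDFS with sqoop import.
# JDBC URL, credentials, paths, and table names are all placeholders.

JARS=/var/lib/sqoop/terajdbc4.jar,/var/lib/sqoop/tdgssconfig.jar
CONN="jdbc:teradata://tdhost/DATABASE=mydb"

# 1. The CREATE TABLE must run solo ("solitary"):
sqoop eval -libjars "$JARS" --driver com.teradata.jdbc.TeraDriver \
  --connect "$CONN" --username me --password-file /user/me/td.pwd \
  --query "CREATE TABLE mydb.target_tbl (id INTEGER, val VARCHAR(20))"

# 2. Each subsequent UPDATE/INSERT as its own eval call:
sqoop eval -libjars "$JARS" --driver com.teradata.jdbc.TeraDriver \
  --connect "$CONN" --username me --password-file /user/me/td.pwd \
  --query "UPDATE mydb.target_tbl SET val = 'x' WHERE id = 1"

# 3. Finally, import the populated table into HDFS:
sqoop import -libjars "$JARS" --driver com.teradata.jdbc.TeraDriver \
  --connect "$CONN" --username me --password-file /user/me/td.pwd \
  --table target_tbl --target-dir /user/me/target_tbl -m 1
```

The final SELECT from the script is not needed in this flow, since sqoop import pulls the table contents itself.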
Upvotes: 0
Views: 359