Reputation: 115
I have installed a recent Cloudera cluster (CDH 5.5, single node) and I am facing the error below while importing a database from MySQL to HDFS. I am able to run the list-databases command successfully. Please let me know the root cause of the issue below.
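For reference, the list-databases check that works is roughly of this form (same host and credentials as the failing import; the exact invocation is paraphrased here):

sqoop list-databases --connect "jdbc:mysql://127.0.0.1/" --username root --password XXXX

The failing import and its full output follow.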
[cloudera@quickstart ~]$ sqoop import --connect "jdbc:mysql://127.0.0.1/nvegesn" --username root --password XXXX --table products
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/12/10 20:14:00 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.5.0
15/12/10 20:14:00 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/12/10 20:14:01 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/12/10 20:14:01 INFO tool.CodeGenTool: Beginning code generation
15/12/10 20:14:01 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `products` AS t LIMIT 1
15/12/10 20:14:01 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@47d0ac94 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@47d0ac94 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:934)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:931)
    at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2735)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1899)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
    at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
    at com.mysql.jdbc.StatementImpl.executeQuery(StatementImpl.java:1524)
    at com.mysql.jdbc.ConnectionImpl.getMaxBytesPerChar(ConnectionImpl.java:3003)
    at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:602)
    at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:445)
    at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
    at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
    at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
    at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)
    at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1834)
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1646)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
15/12/10 20:14:01 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
Upvotes: 0
Views: 833
Reputation: 362
In case the table you are trying to import doesn't have a primary key, try adding '-m 1' towards the end of the command. This will run the import with only one mapper.
By default Sqoop fires 4 mappers, and without a primary key it may not know which column range to use to split the data across those 4 mappers. Hope it helps.
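For example, a minimal sketch of the import from the question with a single mapper (connection string, credentials, and table name copied from the question; adjust to your environment):

sqoop import --connect "jdbc:mysql://127.0.0.1/nvegesn" \
  --username root --password XXXX \
  --table products \
  -m 1

If the table does have a suitable column to split on, passing --split-by with that column instead should let Sqoop keep multiple mappers.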
Upvotes: 0