Sqoop JDBC connection times out after the import, but before the Hive load


We are using Sqoop v1.4.4:

14/05/27 13:49:14 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.0
Sqoop 1.4.4-cdh5.0.0
git commit id 8e266e052e423af592871e2dfe09d54c03f6a0e8

When I import a table from Oracle that takes more than an hour to extract, I get the following error at the point where Sqoop tries to load the data from its temporary HDFS location into Hive:

14/05/27 13:05:51 INFO mapreduce.ImportJobBase: Transferred 47.2606 GB in 6,389.4644 seconds (6.7206 MB/sec)
14/05/27 13:05:51 INFO mapreduce.ImportJobBase: Retrieved 98235461 records.
14/05/27 13:05:51 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@566d0085
14/05/27 13:05:51 DEBUG hive.HiveImport: Hive.inputTable: WAREHOUSE.MY_BIG_TABLE
14/05/27 13:05:51 DEBUG hive.HiveImport: Hive.outputTable: WAREHOUSE.MY_BIG_TABLE
14/05/27 13:05:51 DEBUG manager.OracleManager: Using column names query: SELECT t.* FROM WAREHOUSE.MY_BIG_TABLE t WHERE 1=0
14/05/27 13:05:51 DEBUG manager.SqlManager: Execute getColumnTypesRawQuery : SELECT t.* FROM WAREHOUSE.MY_BIG_TABLE t WHERE 1=0
14/05/27 13:05:51 ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: ORA-02396: exceeded maximum idle time, please connect again

java.sql.SQLException: ORA-02396: exceeded maximum idle time, please connect again

    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:447)
    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:389)

Everything works fine with smaller tables (under one hour).
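The one-hour cut-off makes me suspect an IDLE_TIME resource limit on the Oracle profile assigned to the account Sqoop connects with (ORA-02396 is raised when that limit is exceeded). As a rough check, assuming DBA access, something like the following could confirm it; SQOOP_USER is a placeholder for the real account:

sqlplus -s / as sysdba <<'SQL'
-- Show the IDLE_TIME limit on the profile assigned to the Sqoop account
SELECT p.profile, p.limit
  FROM dba_profiles p
  JOIN dba_users    u ON u.profile = p.profile
 WHERE u.username = 'SQOOP_USER'
   AND p.resource_name = 'IDLE_TIME';
SQL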

This looks exactly like the issue described in SQOOP-934, which was supposedly fixed in version 1.4.4, but as I said, we are already running v1.4.4.

Does anyone know how to work around this?
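For reference, the workaround I am considering is to split the single --hive-import run into separate steps, so that the Hive metadata query runs on a fresh Oracle connection instead of one that sat idle through the whole MapReduce job. This is only a sketch; the connection string, username, and paths are placeholders, not our real ones:

# Step 1: plain HDFS import (this part already completes successfully).
# Delimiters are set explicitly so step 3 matches the table created in step 2.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username SQOOP_USER -P \
  --table WAREHOUSE.MY_BIG_TABLE \
  --fields-terminated-by '\001' \
  --target-dir /user/etl/staging/MY_BIG_TABLE

# Step 2: create the Hive table definition on a new, short-lived connection
sqoop create-hive-table \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username SQOOP_USER -P \
  --table WAREHOUSE.MY_BIG_TABLE \
  --fields-terminated-by '\001' \
  --hive-table WAREHOUSE.MY_BIG_TABLE

# Step 3: move the staged files into the Hive table
hive -e "LOAD DATA INPATH '/user/etl/staging/MY_BIG_TABLE' INTO TABLE WAREHOUSE.MY_BIG_TABLE"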


Welcome to Super User. Please take a moment to read How to Ask, then edit your question to make it clearer. As it stands, it is hard to tell what you are asking (since you have not actually asked a question).
CharlieRB

Answers:

