[root@cloud4 conf]# sqoop export --connect jdbc:mysql://192.168.56.1:3306/hive --username root --password root --table pv_info --export-dir /hive/hmbbs.db/result/dat2=2013-05-30 --input-fields-terminated-by '\t';
13/10/31 02:24:43 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/10/31 02:24:43 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/10/31 02:24:43 INFO tool.CodeGenTool: Beginning code generation
13/10/31 02:24:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `pv_info` AS t LIMIT 1
13/10/31 02:24:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `pv_info` AS t LIMIT 1
13/10/31 02:24:43 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-root/compile/4ab5f7e76b5428c59fb2869c08077146/pv_info.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/10/31 02:24:44 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/4ab5f7e76b5428c59fb2869c08077146/pv_info.jar
13/10/31 02:24:44 INFO mapreduce.ExportJobBase: Beginning export of pv_info
13/10/31 02:24:45 INFO input.FileInputFormat: Total input paths to process : 1
13/10/31 02:24:45 INFO input.FileInputFormat: Total input paths to process : 1
13/10/31 02:24:45 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/10/31 02:24:45 WARN snappy.LoadSnappy: Snappy native library not loaded
13/10/31 02:24:45 INFO mapred.JobClient: Running job: job_201310310028_0018
13/10/31 02:24:46 INFO mapred.JobClient: map 0% reduce 0%
13/10/31 02:24:54 INFO mapred.JobClient: map 14% reduce 0%
13/10/31 02:24:56 INFO mapred.JobClient: map 85% reduce 0%
13/10/31 02:24:56 INFO mapred.JobClient: Task Id : attempt_201310310028_0018_m_000000_0, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.util.NoSuchElementException
at java.util.AbstractList$Itr.next(AbstractList.java:350)
at pv_info.__loadFromFields(pv_info.java:198)
at pv_info.parse(pv_info.java:147)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
13/10/31 02:25:01 INFO mapred.JobClient: Task Id : attempt_201310310028_0018_m_000000_1, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.util.NoSuchElementException
at java.util.AbstractList$Itr.next(AbstractList.java:350)
at pv_info.__loadFromFields(pv_info.java:198)
at pv_info.parse(pv_info.java:147)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
13/10/31 02:25:06 INFO mapred.JobClient: Task Id : attempt_201310310028_0018_m_000000_2, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.util.NoSuchElementException
at java.util.AbstractList$Itr.next(AbstractList.java:350)
at pv_info.__loadFromFields(pv_info.java:198)
at pv_info.parse(pv_info.java:147)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
13/10/31 02:25:12 INFO mapred.JobClient: Job complete: job_201310310028_0018
13/10/31 02:25:12 INFO mapred.JobClient: Counters: 20
13/10/31 02:25:12 INFO mapred.JobClient: Job Counters
13/10/31 02:25:12 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=48806
13/10/31 02:25:12 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
13/10/31 02:25:12 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
13/10/31 02:25:12 INFO mapred.JobClient: Rack-local map tasks=7
13/10/31 02:25:12 INFO mapred.JobClient: Launched map tasks=10
13/10/31 02:25:12 INFO mapred.JobClient: Data-local map tasks=3
13/10/31 02:25:12 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
13/10/31 02:25:12 INFO mapred.JobClient: Failed map tasks=1
13/10/31 02:25:12 INFO mapred.JobClient: File Output Format Counters
13/10/31 02:25:12 INFO mapred.JobClient: Bytes Written=0
13/10/31 02:25:12 INFO mapred.JobClient: FileSystemCounters
13/10/31 02:25:12 INFO mapred.JobClient: HDFS_BYTES_READ=801
13/10/31 02:25:12 INFO mapred.JobClient: FILE_BYTES_WRITTEN=371720
13/10/31 02:25:12 INFO mapred.JobClient: File Input Format Counters
13/10/31 02:25:12 INFO mapred.JobClient: Bytes Read=0
13/10/31 02:25:12 INFO mapred.JobClient: Map-Reduce Framework
13/10/31 02:25:12 INFO mapred.JobClient: Map input records=0
13/10/31 02:25:12 INFO mapred.JobClient: Physical memory (bytes) snapshot=205475840
13/10/31 02:25:12 INFO mapred.JobClient: Spilled Records=0
13/10/31 02:25:12 INFO mapred.JobClient: CPU time spent (ms)=2780
13/10/31 02:25:12 INFO mapred.JobClient: Total committed heap usage (bytes)=40501248
13/10/31 02:25:12 INFO mapred.JobClient: Virtual memory (bytes) snapshot=2130132992
13/10/31 02:25:12 INFO mapred.JobClient: Map output records=0
13/10/31 02:25:12 INFO mapred.JobClient: SPLIT_RAW_BYTES=762
13/10/31 02:25:12 INFO mapreduce.ExportJobBase: Transferred 801 bytes in 27.6045 seconds (29.017 bytes/sec)
13/10/31 02:25:12 INFO mapreduce.ExportJobBase: Exported 0 records.
13/10/31 02:25:12 ERROR tool.ExportTool: Error during export: Export job failed!
4 replies
Just write out the full file name and it works!
I was really careless, and I also blame myself for not understanding how this works. The correct path turned out to be:
/hive/hmbbs.db/result/dat2=2013-05-30/000000_0
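For anyone hitting the same thing later, a minimal sketch of what the corrected command would look like, assuming the same connection settings as in the question and that the partition's output sits in that single 000000_0 file (using -P instead of --password, as the warning in the log itself suggests):

sqoop export --connect jdbc:mysql://192.168.56.1:3306/hive --username root -P --table pv_info --export-dir /hive/hmbbs.db/result/dat2=2013-05-30/000000_0 --input-fields-terminated-by '\t'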
OP, I ran into this problem too. I have four tables; for three of them I pointed --export-dir straight at the directory without naming a specific file (the directory holds many files), and they export with no errors and the data is all correct. Only one table throws Caused by: java.util.NoSuchElementException... so I suspect it isn't really about spelling out the full file name... this is driving me crazy.
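A general note on this error: the NoSuchElementException thrown from pv_info.__loadFromFields usually means a record split into fewer fields than the target table has columns, which is most often a field-delimiter mismatch (a Hive text table's default delimiter is '\001', not '\t') or stray delimiters/newlines inside the data of that one table. A quick way to see what the file really contains — the path here is just the one from this thread, substitute your own:

hadoop fs -cat /hive/hmbbs.db/result/dat2=2013-05-30/000000_0 | head -n 3 | od -c

If the field separator shows up as 001 in the od output rather than \t, pass --input-fields-terminated-by '\001' to the export.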
I ran into a similar error too:
Caused by: java.util.NoSuchElementException
Further down it also reports a unique-constraint violation (I created an identical table with all constraints removed, and it still throws the same error). I checked and the data in the table has no duplicates. Some posts online say it is a delimiter problem; I tried different delimiters too, but still haven't solved it (see the sketch below).
I'd appreciate any advice from those with more experience.
xieyanhevip@163.com
qq: 244212771
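For the delimiter and NULL issues described in the last reply, a hedged sketch of the export options that usually address them, assuming a Hive text table with the default '\001' delimiter and Hive's \N marker for NULLs (connection string, table, and path are simply reused from this thread):

sqoop export --connect jdbc:mysql://192.168.56.1:3306/hive --username root -P --table pv_info --export-dir /hive/hmbbs.db/result/dat2=2013-05-30 --input-fields-terminated-by '\001' --input-null-string '\\N' --input-null-non-string '\\N'

As for the unique-constraint violation: Sqoop export is not transactional, so earlier failed map attempts may already have inserted part of the data; clearing the target table before re-running the export is the usual way to rule that out.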