mettastar Posted October 6, 2016 I'm exporting data from HDFS to a Netezza table using sqoop export. My question: the Hive table in HDFS is partitioned, so the data sits in multiple folders. In sqoop export, can I mention multiple folders to export the data? I haven't seen this anywhere.. has anyone done it? The other option is to concat the files and then export, but I posted this to find out whether there is any other option besides that.
Barney_Stinson Posted October 6, 2016 If anyone replies, I'll read it and learn too... LTT
kasi Posted October 6, 2016 27 minutes ago, mettastar said: "can I mention multiple folders to export the data?" Keep all the folders in one warehouse directory and use this:
sqoop export \
--connect jdbc:oracle:thin:@enkx3-scan:1521:dbm1 \
--username wzhou \
--password wzhou \
--direct \
--export-dir '/user/hive/warehouse/test_oracle.db/my_all_objects_sqoop' \
--table WZHOU.TEST_IMPORT_FROM_SCOOP \
--fields-terminated-by '\001'
mettastar Posted October 6, 2016 Author 1 minute ago, kasi said: "Keep all the folders in a warehouse dir and use this" That's the thing, uncle.. I don't want to add any extra step (consolidating the files) before exporting. Is there no option to export from multiple folders at a time?
kasi Posted October 6, 2016 Try it like this:
--export-dir '/user/hive/warehouse/test_oracle.db/my_all_*' \
kasi Posted October 6, 2016 Never encountered this issue; honestly this is a terrible design, but try this:
--export-dir '/user/hive/warehouse/test_oracle.db/my_all_objects_sqoop' '/user/hive/warehouse/test_oracle.db/my_all_objects_sqoop1' \
not sure if it works
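Before handing either variant to sqoop, it may help to sanity-check what the wildcard actually matches. A minimal sketch, simulated with local directories under /tmp (the directory names are copied from the example above, the /tmp path is just for the demo); against HDFS the equivalent check would be `hdfs dfs -ls` on the quoted glob:

```shell
# Recreate the two partition-style directories locally to test the glob
mkdir -p /tmp/glob_demo/my_all_objects_sqoop /tmp/glob_demo/my_all_objects_sqoop1

# Count how many directories the wildcard expands to
matches=$(ls -d /tmp/glob_demo/my_all_* | wc -l)
echo "glob matched $matches directories"
```

If the count is what you expect, the glob itself is sound and any failure is on the sqoop side.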
kasi Posted October 6, 2016 Try space-delimited folder directories.
mettastar Posted October 6, 2016 Author 4 minutes ago, kasi said: "try space delimited folder directories" Yeah, I'll try that.. basically the Hive table is partitioned by date, so there is one folder per day. When pulling the data I need to fetch everything with date >= a given day.. so first I'm thinking of resolving the folders, and if there is an option, I want to pass all the folders in the export command itself.
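If sqoop ends up only taking one --export-dir, the "resolve the folders first" idea can be scripted: filter the daily partition folders by date and run one export per folder. A minimal sketch; the partition names, connection string, and table here are illustrative assumptions, and the export is echoed as a dry run rather than executed:

```shell
#!/bin/sh
# Hypothetical partition folders, one per day. In real use, list them
# from HDFS, e.g.: hdfs dfs -ls /user/hive/warehouse/mytable
partitions="dt=2016-09-03 dt=2016-09-04 dt=2016-09-05"
cutoff="dt=2016-09-04"

selected=""
for p in $partitions; do
  # expr does a lexicographic >= here, which is correct because the
  # dates are zero-padded ISO format (2016-09-04 sorts after 2016-09-03)
  if expr "$p" \>= "$cutoff" >/dev/null; then
    selected="$selected $p"
    # dry run: print the per-folder export instead of executing it
    echo "sqoop export --connect jdbc:netezza://abcdefgh:5480/ods --table MYTABLE --export-dir /user/hive/warehouse/mytable/$p"
  fi
done
```

Running one export per partition keeps each sqoop job simple, at the cost of one JDBC session per day exported.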
mettastar Posted October 6, 2016 Author One more question bro, I'm getting this error while trying to export the data.. it is related to a decimal field: java.lang.ClassCastException: org.apache.hadoop.io.BytesWritable cannot be cast to java.math.BigDecimal My data in HDFS is in Avro format.
mettastar Posted October 6, 2016 Author 22 minutes ago, kasi said: "what serde are you using?? when bringing data into hadoop" Not sure bro, but they convert the data into Avro format, place it in one location, and create Hive tables on top of it.. my team consumes the data from those Hive tables/HDFS locations.. so the data there is in Avro.. in the sqoop export I'm not mentioning any serde, since sqoop can handle Avro files. My command: sqoop export -Dsqoop.avro.logical_types.decimal.enable=true --connect jdbc:netezza://abcdefgh:5480/ods --username xyz -P --export-dir /dev/dev_ods/DTA/ASN00/custom_year=2016/custom_month=9/custom_day=4/ --table ASN00 --input-fields-terminated-by "," --batch;
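On that BytesWritable/BigDecimal error: in Avro, a decimal is stored as bytes annotated with the "decimal" logicalType, which is why the field surfaces as BytesWritable unless logical-type handling is enabled (the -Dsqoop.avro.logical_types.decimal.enable=true flag already in the command above). A small sketch of checking a schema for that pattern; the schema fragment here is a made-up example, not the thread's actual table:

```shell
#!/bin/sh
# Hypothetical Avro schema fragment for one decimal column (an assumption)
schema='{"name":"amount","type":{"type":"bytes","logicalType":"decimal","precision":18,"scale":2}}'

# With a real datafile you would dump the actual schema first, e.g.:
#   hdfs dfs -get /dev/dev_ods/DTA/ASN00/.../part-m-00000.avro .
#   avro-tools getschema part-m-00000.avro
if echo "$schema" | grep -q '"logicalType":"decimal"'; then
  echo "decimal logical type present: keep logical-type handling enabled in the export"
fi
```

If the dumped schema shows plain "bytes" with no logicalType annotation, the problem is upstream in how the data was written, not in the sqoop flag.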