sqoop2 etl java API test
Following the manual, I ran a job that had been created in the sqoop shell through the Java API. The program was packaged as an uber-jar and executed.
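For reference, the link and job do not have to come from the shell; they can also be defined through the client API. Below is a rough sketch, assuming the Sqoop 1.99.x generic-jdbc and hdfs connectors. The `CreateJobSketch` class name, connection string, and table name are hypothetical placeholders, and the config input names follow the Sqoop client API docs but vary by connector version.

```java
import org.apache.sqoop.client.SqoopClient;
import org.apache.sqoop.model.MJob;
import org.apache.sqoop.model.MLink;
import org.apache.sqoop.model.MLinkConfig;

/** Sketch: define the link and job through the client API instead of the shell. */
public class CreateJobSketch {
    public static void main(String[] args) {
        SqoopClient client = new SqoopClient("http://localhost:12000/sqoop/");

        // A JDBC link; the connection values here are placeholders, not this post's real ones.
        MLink link = client.createLink("generic-jdbc-connector");
        link.setName("oracleLink");
        MLinkConfig linkConfig = link.getConnectorLinkConfig();
        linkConfig.getStringInput("linkConfig.connectionString")
                  .setValue("jdbc:oracle:thin:@//db-host:1521/ORCL");
        linkConfig.getStringInput("linkConfig.jdbcDriver")
                  .setValue("oracle.jdbc.OracleDriver");
        client.saveLink(link);

        // A job from the JDBC link to an already existing HDFS link.
        MJob job = client.createJob("oracleLink", "hdfsLink");
        job.setName("oracle2hdfs");
        job.getFromJobConfig().getStringInput("fromJobConfig.tableName")
           .setValue("TB_SAMPLE_SOURCE"); // placeholder table name
        job.getToJobConfig().getStringInput("toJobConfig.outputDirectory")
           .setValue("/oracle/tb_sample_source");
        client.saveJob(job);
    }
}
```

The actual test class for this post, which only lists and starts what the shell already created, is below.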
```java
package com.dbility.bigdata.sqoop.oracle2hdfs;

import java.util.List;

import org.apache.sqoop.client.SqoopClient;
import org.apache.sqoop.model.MJob;
import org.apache.sqoop.model.MLink;
import org.apache.sqoop.model.MSubmission;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/**
 * Description : lists the registered links and jobs, then starts the
 * "oracle2hdfs" job that was created in the sqoop2 shell.
 *
 * @author hyperrookie@gmail.com
 * @version 1.0.0
 * @date 2018. 4. 30.
 *=======================================================================
 * Date          Name                   Revision History
 *=======================================================================
 * 2018. 4. 30.  hyperrookie@gmail.com  Creation
 *=======================================================================
 */
public class TestMain {

    private static final Logger logger = LoggerFactory.getLogger(TestMain.class);

    public static void main(String[] args) {

        for (String str : args) {
            logger.info("{}", str);
        }

        // Connect to the sqoop2 server's REST endpoint
        String url = "http://localhost:12000/sqoop/";
        SqoopClient client = new SqoopClient(url);
        logger.info("{} / {}", url, client);

        // List every link registered on the server
        List<MLink> linkList = client.getLinks();
        for (int i = 0; i < linkList.size(); i++) {
            MLink mLink = linkList.get(i);
            logger.info("{}. link : {}", i, mLink.getName());
        }

        // List every job registered on the server
        List<MJob> jobList = client.getJobs();
        for (int i = 0; i < jobList.size(); i++) {
            MJob mJob = jobList.get(i);
            logger.info("{}. job : {}", i, mJob.getName());
        }

        // Start the job created in the sqoop2 shell (asynchronous call)
        MSubmission submission = client.startJob("oracle2hdfs");
        if (submission.getStatus().isRunning() && submission.getProgress() != -1) {
            logger.info("progress : {}", String.format("%.2f %%", submission.getProgress() * 100));
        }
        logger.info("job final status : {}", !submission.getStatus().isFailure());
    }
}
```
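One thing to note about the test above: startJob is fire-and-forget, so the status and progress it logs come from the initial submission, not the finished transfer. If the caller should block until the job completes, the client API also offers a polling overload that takes a SubmissionCallback. A minimal sketch, assuming the Sqoop 1.99.x client API; the `SyncRunSketch` class name is hypothetical:

```java
import org.apache.sqoop.client.SqoopClient;
import org.apache.sqoop.client.SubmissionCallback;
import org.apache.sqoop.model.MSubmission;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/** Sketch: start the job and block until it finishes. */
public class SyncRunSketch {

    private static final Logger logger = LoggerFactory.getLogger(SyncRunSketch.class);

    public static void main(String[] args) throws InterruptedException {
        SqoopClient client = new SqoopClient("http://localhost:12000/sqoop/");

        // The callback fires on submit, on each poll, and once more on completion.
        SubmissionCallback callback = new SubmissionCallback() {
            @Override
            public void submitted(MSubmission submission) {
                logger.info("submitted : {}", submission.getStatus());
            }
            @Override
            public void updated(MSubmission submission) {
                logger.info("running : {} / progress {}", submission.getStatus(), submission.getProgress());
            }
            @Override
            public void finished(MSubmission submission) {
                logger.info("finished : {}", submission.getStatus());
            }
        };

        // Poll the server every 5000 ms until the submission reaches a final state.
        client.startJob("oracle2hdfs", callback, 5000);
    }
}
```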
Before re-running the job, the output directory left over from the previous run is removed:

```
[sqoop2@big-master ~]$ hdfs dfs -ls /oracle/tb_sample_source
Found 10 items
-rw-r--r--   3 sqoop2 supergroup   26704407 2018-04-30 13:26 /oracle/tb_sample_source/013be2c5-d677-4567-be50-ad06be261c42.txt
-rw-r--r--   3 sqoop2 supergroup   26702975 2018-04-30 13:26 /oracle/tb_sample_source/42efaf33-bc25-461f-9ada-a6080c39fb6d.txt
-rw-r--r--   3 sqoop2 supergroup   26705387 2018-04-30 13:26 /oracle/tb_sample_source/53c775cc-7400-4d2e-a691-9b6750a3a226.txt
-rw-r--r--   3 sqoop2 supergroup   26702355 2018-04-30 13:26 /oracle/tb_sample_source/679546b0-7c1b-4c64-af8f-04434ec78f94.txt
-rw-r--r--   3 sqoop2 supergroup   26703584 2018-04-30 13:26 /oracle/tb_sample_source/69fb6de9-4683-4027-baf1-5d4400c55c05.txt
-rw-r--r--   3 sqoop2 supergroup   26704262 2018-04-30 13:26 /oracle/tb_sample_source/6cecc117-9d0a-42d2-95d9-a1764e76ec8b.txt
-rw-r--r--   3 sqoop2 supergroup   26704060 2018-04-30 13:26 /oracle/tb_sample_source/bf8828a1-ec6c-4a44-885a-db94ee38f307.txt
-rw-r--r--   3 sqoop2 supergroup   26703466 2018-04-30 13:26 /oracle/tb_sample_source/f552213b-89c7-433e-8242-8d694e2abac3.txt
-rw-r--r--   3 sqoop2 supergroup   26705138 2018-04-30 13:26 /oracle/tb_sample_source/f8e95e36-8160-4a2f-b16a-d03b3ba1910f.txt
-rw-r--r--   3 sqoop2 supergroup   26704927 2018-04-30 13:26 /oracle/tb_sample_source/fc78f465-c0ba-49bf-8701-5b5ba6da43ae.txt
[sqoop2@big-master ~]$ hdfs dfs -rm -r /oracle/tb_sample_source
18/04/30 13:28:14 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 0 minutes, Emptier interval = 0 minutes.
Deleted /oracle/tb_sample_source
```

Then the uber-jar is executed:

```
[sqoop2@big-master ~]$ java -jar oracle2hdfs.jar
INFO : com.dbility.bigdata.sqoop.oracle2hdfs.TestMain - http://localhost:12000/sqoop/ / org.apache.sqoop.client.SqoopClient@7eda2dbb
WARN : org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO : com.dbility.bigdata.sqoop.oracle2hdfs.TestMain - 0. link : oracleLink
INFO : com.dbility.bigdata.sqoop.oracle2hdfs.TestMain - 1. link : test
INFO : com.dbility.bigdata.sqoop.oracle2hdfs.TestMain - 2. link : hdfsLink
INFO : com.dbility.bigdata.sqoop.oracle2hdfs.TestMain - 0. job : oracle2hdfs
INFO : com.dbility.bigdata.sqoop.oracle2hdfs.TestMain - 1. job : testJob
INFO : com.dbility.bigdata.sqoop.oracle2hdfs.TestMain - 2. job : hdfs2oracle
INFO : com.dbility.bigdata.sqoop.oracle2hdfs.TestMain - job final status : true
```
Finally, the job status is checked in the sqoop2 shell:

```
[sqoop2@big-master ~]$ sqoop2-shell
Setting conf dir: /sqoop/bin/../conf
Sqoop home directory: /sqoop
Sqoop Shell: Type 'help' or '\h' for help.

sqoop:000> status job -n oracle2hdfs
Submission details
Job Name: oracle2hdfs
Server URL: http://localhost:12000/sqoop/
Created by: sqoop2
Creation date: 2018-04-30 13:30:07 KST
Lastly updated by: sqoop2
External ID: job_1524527260801_0034
        http://0.0.0.0:8089/proxy/application_1524527260801_0034/
2018-04-30 13:34:02 KST: SUCCEEDED
Counters:
        org.apache.hadoop.mapreduce.FileSystemCounter
                FILE_LARGE_READ_OPS: 0
                FILE_WRITE_OPS: 0
                HDFS_READ_OPS: 10
                HDFS_BYTES_READ: 1463
                HDFS_LARGE_READ_OPS: 0
                FILE_READ_OPS: 0
                FILE_BYTES_WRITTEN: 3009910
                FILE_BYTES_READ: 0
                HDFS_WRITE_OPS: 10
                HDFS_BYTES_WRITTEN: 267040561
        org.apache.hadoop.mapreduce.lib.output.FileOutputFormatCounter
                BYTES_WRITTEN: 0
        org.apache.hadoop.mapreduce.lib.input.FileInputFormatCounter
                BYTES_READ: 0
        org.apache.hadoop.mapreduce.JobCounter
                TOTAL_LAUNCHED_MAPS: 10
                MB_MILLIS_MAPS: 656956416
                VCORES_MILLIS_MAPS: 641559
                SLOTS_MILLIS_MAPS: 641559
                OTHER_LOCAL_MAPS: 10
                MILLIS_MAPS: 641559
        org.apache.sqoop.submission.counter.SqoopCounters
                ROWS_READ: 1000000
                ROWS_WRITTEN: 1000000
        org.apache.hadoop.mapreduce.TaskCounter
                SPILLED_RECORDS: 0
                MERGED_MAP_OUTPUTS: 0
                VIRTUAL_MEMORY_BYTES: 21335420928
                MAP_INPUT_RECORDS: 0
                SPLIT_RAW_BYTES: 1463
                MAP_OUTPUT_RECORDS: 1000000
                FAILED_SHUFFLE: 0
                PHYSICAL_MEMORY_BYTES: 2865971200
                GC_TIME_MILLIS: 17423
                CPU_MILLISECONDS: 465300
                COMMITTED_HEAP_BYTES: 2014838784
Job executed successfully
sqoop:000>
```
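The same check the shell performs with `status job -n` can also be done from the client. A minimal sketch, assuming the Sqoop 1.99.x getJobStatus/stopJob client calls; the `StatusSketch` class name is hypothetical:

```java
import org.apache.sqoop.client.SqoopClient;
import org.apache.sqoop.model.MSubmission;

/** Sketch: query (and optionally stop) a submission by job name. */
public class StatusSketch {
    public static void main(String[] args) {
        SqoopClient client = new SqoopClient("http://localhost:12000/sqoop/");

        // Equivalent of the shell's "status job -n oracle2hdfs"
        MSubmission submission = client.getJobStatus("oracle2hdfs");
        System.out.println("status : " + submission.getStatus());
        if (submission.getStatus().isRunning() && submission.getProgress() != -1) {
            System.out.println(String.format("progress : %.2f %%", submission.getProgress() * 100));
        }

        // client.stopJob("oracle2hdfs"); // would abort the submission while it is running
    }
}
```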