Here we export data from HDFS into MySQL, using the hdfs-link and mysql-link created earlier.
2.1 Create the job
sqoop:000> create job -f 2 -t 1
Creating job for links with from id 2 and to id 1
Please fill following values to create new job object
Name: job-hdfs-to-mysql
From Job configuration
Input directory: /tmp/sqoop-input
Override null value:
Null value:
To database configuration
Schema name: sqoop
Table name: sqoop
Table SQL statement:
Table column names:
Stage table name:
Should clear stage table:
Throttling resources
Extractors: 1
Loaders: 1
New job was successfully created with validation status OK and persistent id 2
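The job reads delimited text from the input directory given above. The actual files under /tmp/sqoop-input are not shown in this walkthrough; the following is a hypothetical stand-in, assuming the HDFS From-connector's plain-text layout of one record per line with comma-separated fields in target-column order (id, value):

```shell
# Hypothetical stand-in for a file under /tmp/sqoop-input; the real
# contents are not shown in the original walkthrough. One record per
# line, fields in the same order as the target columns (id, value).
cat > part-m-00000 <<'EOF'
1,hha
2,zhang
3,hehe
4,zhen
5,u
6,inspurcloud
EOF

# Sanity check: every record carries exactly two fields.
awk -F',' 'NF != 2 { exit 1 }' part-m-00000 && echo OK
```

Depending on the connector version, string fields may need to be single-quoted; match whatever format originally wrote the data into HDFS.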
The job has now been created.
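Note that the export does not create the target table; the table named in the job configuration must already exist in MySQL. A minimal definition consistent with the data shown in 2.3 (the column types are assumptions):

```sql
CREATE TABLE sqoop.sqoop (
  id    INT,
  value VARCHAR(64)
);
```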
2.2 Run the job
sqoop:000> start job -j 2
Submission details
Job ID: 2
Server URL: http://localhost:12000/sqoop/
Created by: root
Creation date: 2016-12-22 15:38:00 CST
Lastly updated by: root
External ID: job_1481968387780_0012
http://uhadoop-penomi-master1:23188/proxy/application_1481968387780_0012/
2016-12-22 15:38:00 CST: BOOTING - Progress is not available
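While the job runs, its progress can be polled from the same shell; a sketch using Sqoop 2's status and submission commands (the exact output varies by version and job state):

```
sqoop:000> status job -j 2
sqoop:000> show submission
```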
2.3 Check the data in MySQL
mysql> select * from sqoop;
+----+-------------+
| id | value       |
+----+-------------+
|  1 | hha         |
|  2 | zhang       |
|  3 | hehe        |
|  4 | zhen        |
|  5 | u           |
|  6 | inspurcloud |
+----+-------------+
6 rows in set (0.00 sec)