Alex's Hadoop Beginner Tutorial, Lesson 8: Importing into HBase and Hive with Sqoop1
Continuing the series. Honestly, importing and exporting between MySQL and HDFS is not that useful in real project work, but it makes a good introduction. Today I'll cover how Sqoop works with HBase and Hive. I just realized the ordering of this tutorial series is rather messy: I never covered installing Hive first. My apologies for that; I'll make up for it in a later post.

Data Preparation
MySQL
Create the employee table in MySQL and insert some data:
CREATE TABLE `employee` (
  `id` int(11) NOT NULL,
  `name` varchar(20) NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
insert into employee (id,name) values (1,'michael');
insert into employee (id,name) values (2,'ted');
insert into employee (id,name) values (3,'jack');
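As a quick sanity check before running any Sqoop jobs (my suggestion, not part of the original walkthrough), query the table and confirm the rows are there; the mysql client should print the three rows (1, michael), (2, ted), (3, jack):

select * from employee;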
HBase
hbase(main):006:0> create 'employee','info'
0 row(s) in 0.4440 seconds
=> Hbase::Table - employee
Hive
No data preparation is needed here; later we'll pass --create-hive-table and Sqoop will create the table automatically.
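For reference (we'll run that import further down), the table --create-hive-table generates for this schema should be roughly equivalent to the DDL below. This is a sketch based on Sqoop's documented defaults (int(11) maps to INT, varchar to STRING, fields separated by the \001 character), not output copied from a real run:

CREATE TABLE hive_employee (
  id INT,
  name STRING
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\001'
  LINES TERMINATED BY '\n'
STORED AS TEXTFILE;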
Importing from MySQL into HBase
# sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --hbase-table employee --column-family info --hbase-row-key id -m 1
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/12/01 17:36:25 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
14/12/01 17:36:25 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/12/01 17:36:25 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/12/01 17:36:25 INFO tool.CodeGenTool: Beginning code generation
14/12/01 17:36:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 17:36:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 17:36:26 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
…… (the log in between is long; omitted)
14/12/01 17:37:12 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 37.3924 seconds (0 bytes/sec)
14/12/01 17:37:12 INFO mapreduce.ImportJobBase: Retrieved 3 records.
Now check HBase:
hbase(main):001:0> scan 'employee'
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
ROW                COLUMN+CELL
 1                 column=info:name, timestamp=1417426628685, value=michael
 2                 column=info:name, timestamp=1417426628685, value=ted
 3                 column=info:name, timestamp=1417426628685, value=jack
3 row(s) in 0.1630 seconds
All 3 rows were imported successfully.
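Because --hbase-row-key id mapped the MySQL primary key onto the HBase row key, you can also fetch a single record by key as a spot check. The output should look roughly like this (same data as the scan above):

hbase(main):002:0> get 'employee', '1'
COLUMN              CELL
 info:name          timestamp=1417426628685, value=michael
1 row(s)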
Importing from MySQL into Hive
# sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --hive-import --hive-table hive_employee --create-hive-table
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
……………………
14/12/02 15:12:13 INFO hive.HiveImport: Loading data to table default.hive_employee
14/12/02 15:12:14 INFO hive.HiveImport: Table default.hive_employee stats: [num_partitions: 0, num_files: 4, num_rows: 0, total_size: 23, raw_data_size: 0]
14/12/02 15:12:14 INFO hive.HiveImport: OK
14/12/02 15:12:14 INFO hive.HiveImport: Time taken: 0.799 seconds
14/12/02 15:12:14 INFO hive.HiveImport: Hive import complete.
14/12/02 15:12:14 INFO hive.HiveImport: Export directory is empty, removing it.
A note for real environments: don't use localhost in the MySQL JDBC URL. The import job is distributed across the Hadoop cluster, and every worker node must actually be able to reach MySQL over JDBC; otherwise you will lose data.
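As a minimal sketch of what that means in practice, create a MySQL account that every worker node is allowed to use, then point --connect at a hostname they can all resolve. The hostname, user, and password below are placeholders, not values from this lesson:

-- on the MySQL server (MySQL 5.x syntax)
GRANT ALL PRIVILEGES ON sqoop_test.* TO 'sqoop'@'%' IDENTIFIED BY 'sqoop_pw';
FLUSH PRIVILEGES;

# sqoop import --connect jdbc:mysql://mysql-host.example.com:3306/sqoop_test --username sqoop --password sqoop_pw --table employee --hive-import --hive-table hive_employee --create-hive-table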
Check Hive:
hive> select * from hive_employee;
OK
1	michael
2	ted
3	jack
Time taken: 0.179 seconds, Fetched: 3 row(s)
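The data itself lives as plain files in the Hive warehouse on HDFS. Assuming the default warehouse location (an assumption; your hive.metastore.warehouse.dir may differ), you can list the files Sqoop wrote (the stats line above reported num_files: 4):

# hadoop fs -ls /user/hive/warehouse/hive_employee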
One more caveat: at the moment Sqoop can only import from MySQL into native Hive tables (i.e., tables stored on HDFS). It cannot import into external tables, such as Hive tables backed by HBase.
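To make that distinction concrete, below is the kind of HBase-backed external Hive table that Sqoop cannot load into. This is a sketch using Hive's standard HBase storage handler, with a column mapping that mirrors the employee table from this lesson:

CREATE EXTERNAL TABLE hbase_employee (id INT, name STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,info:name")
TBLPROPERTIES ("hbase.table.name" = "employee");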
Class dismissed! Next lesson: exports!