Hadoop/Hive Built-in Access Control

Following up on "Data Warehouse: Hive Quick Start - Offline & Real-Time Data Warehouse Architecture", this article covers the access control that ships with Hadoop/Hive. Access control is a critical part of any big-data platform, since it directly concerns data security.
Background: cluster security requirements, existing solutions, Hadoop permissions, and Hive permissions.
First, create a system user:
[root@hadoop01 ~]# useradd hive
Grant the SELECT privilege on the test table to the hive user:
0: jdbc:hive2://localhost:10000> grant select on table test to user hive;
No rows affected (0.12 seconds)
0: jdbc:hive2://localhost:10000>
Switch to the hive user:
[root@hadoop01 ~]# sudo su - hive
Open an interactive session; SELECT queries run normally:
[hive@hadoop01 ~]$ beeline -u jdbc:hive2://localhost:10000 -n hive
...
0: jdbc:hive2://localhost:10000> select user_name from test;
+------------+
| user_name |
+------------+
| Tom |
| Jerry |
| Jim |
| Angela |
| Ann |
| Bella |
| Bonnie |
| Caroline |
+------------+
8 rows selected (0.075 seconds)
0: jdbc:hive2://localhost:10000>
But any other operation fails with an error saying it is not supported:
0: jdbc:hive2://localhost:10000> delete from test where user_id='f4914b91c5284b01832149776ca53c8d';
Error: Error while compiling statement: FAILED: SemanticException [Error 10294]: Attempt to do update or delete using transaction manager that does not support these operations. (state=42000,code=10294)
0: jdbc:hive2://localhost:10000>
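As an aside, the DELETE failure above is Hive's default behavior rather than a consequence of our grants: with the default transaction manager, UPDATE/DELETE are rejected outright. If row-level changes were actually needed, ACID support would have to be enabled and the table created as transactional; a sketch of the relevant hive-site.xml settings (assuming Hive 3.x, not part of this article's setup):

```xml
<!-- Required for ACID tables: concurrency plus the DbTxnManager -->
<property>
    <name>hive.support.concurrency</name>
    <value>true</value>
</property>
<property>
    <name>hive.txn.manager</name>
    <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
</property>
```

The table itself would also need to be stored as ORC and created with TBLPROPERTIES ('transactional'='true').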
This lets us restrict what a given user can do with particular tables in Hive. But as mentioned earlier, Hive has no built-in superuser: any user can run GRANT/REVOKE, which renders the permission model meaningless. To fix this, we need to implement our own authorization hook class that designates certain users as superusers.
First create an empty Maven project and add the hive-exec dependency. The full pom.xml is as follows:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>hive-security-test</artifactId>
    <version>1.0-SNAPSHOT</version>

    <dependencies>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>3.1.2</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>8</source>
                    <target>8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
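One practical note: since the jar ends up being copied into Hive's own lib directory (where hive-exec is already present), the dependency is only needed at compile time. Marking it provided, as sketched below, is an optional refinement not in the original pom:

```xml
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-exec</artifactId>
    <version>3.1.2</version>
    <!-- supplied by the Hive runtime; excluded from the packaged jar -->
    <scope>provided</scope>
</dependency>
```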
Implement the custom authorization hook class:
package com.example.hive.security;

import com.google.common.base.Joiner;
import org.apache.hadoop.hive.ql.parse.*;
import org.apache.hadoop.hive.ql.session.SessionState;

/**
 * Defines custom superusers for Hive
 *
 * @author 01
 * @date 2020-11-09
 **/
public class HiveAdmin extends AbstractSemanticAnalyzerHook {

    /**
     * The superusers; more than one can be listed
     */
    private static final String[] ADMINS = {"root"};

    /**
     * Token types guarded by this hook
     */
    private static final int[] TOKEN_TYPES = {
            HiveParser.TOK_CREATEDATABASE, HiveParser.TOK_DROPDATABASE,
            HiveParser.TOK_CREATEROLE, HiveParser.TOK_DROPROLE,
            HiveParser.TOK_GRANT, HiveParser.TOK_REVOKE,
            HiveParser.TOK_GRANT_ROLE, HiveParser.TOK_REVOKE_ROLE,
            HiveParser.TOK_CREATETABLE
    };

    /**
     * Get the name of the currently logged-in user
     *
     * @return the user name, or null if unavailable
     */
    private String getUserName() {
        boolean hasUserName = SessionState.get() != null &&
                SessionState.get().getAuthenticator().getUserName() != null;
        return hasUserName ? SessionState.get().getAuthenticator().getUserName() : null;
    }

    private boolean isInTokenTypes(int type) {
        for (int tokenType : TOKEN_TYPES) {
            if (tokenType == type) {
                return true;
            }
        }
        return false;
    }

    private boolean isAdmin(String userName) {
        for (String admin : ADMINS) {
            if (admin.equalsIgnoreCase(userName)) {
                return true;
            }
        }
        return false;
    }

    @Override
    public ASTNode preAnalyze(HiveSemanticAnalyzerHookContext context, ASTNode ast) throws SemanticException {
        // Statements whose token type is not guarded pass through unchanged
        if (!isInTokenTypes(ast.getToken().getType())) {
            return ast;
        }
        // Guarded statements are only allowed for superusers
        String userName = getUserName();
        if (isAdmin(userName)) {
            return ast;
        }
        throw new SemanticException(userName +
                " is not Admin, except " +
                Joiner.on(",").join(ADMINS)
        );
    }
}
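The access decision in preAnalyze boils down to two lookups: is the statement's token type guarded, and is the user an admin. A minimal standalone sketch of that logic (no Hive dependencies; the token constants here are hypothetical stand-ins for the real HiveParser values):

```java
// Standalone sketch of the hook's decision logic.
public class AdminCheckDemo {
    private static final String[] ADMINS = {"root"};
    private static final int TOK_GRANT = 1;   // hypothetical token id
    private static final int TOK_SELECT = 2;  // hypothetical token id
    // Token types that only admins may execute
    private static final int[] GUARDED = {TOK_GRANT};

    static boolean isGuarded(int type) {
        for (int t : GUARDED) {
            if (t == type) return true;
        }
        return false;
    }

    static boolean isAdmin(String user) {
        for (String a : ADMINS) {
            if (a.equalsIgnoreCase(user)) return true;
        }
        return false;
    }

    // True if the statement is allowed for the user
    static boolean allowed(String user, int tokenType) {
        return !isGuarded(tokenType) || isAdmin(user);
    }

    public static void main(String[] args) {
        System.out.println(allowed("root", TOK_GRANT));  // true: admin may grant
        System.out.println(allowed("hive", TOK_GRANT));  // false: blocked by the hook
        System.out.println(allowed("hive", TOK_SELECT)); // true: not a guarded statement
    }
}
```

Note that unguarded statements pass through without any admin check, which is why ordinary users can still run SELECT after the hook is installed.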
Build the jar and upload it to the server:
[root@hadoop01 ~]# ls jars/
hive-security-test-1.0-SNAPSHOT.jar
[root@hadoop01 ~]#
Copy it into Hive's lib directory:
[root@hadoop01 ~]# cp jars/hive-security-test-1.0-SNAPSHOT.jar /usr/local/apache-hive-3.1.2-bin/lib/
Add the following configuration to Hive's hive-site.xml:
[root@hadoop01 ~]# vim /usr/local/apache-hive-3.1.2-bin/conf/hive-site.xml
<configuration>
    ...
    <property>
        <name>hive.users.in.admin.role</name>
        <value>root</value>
        <description>Defines the admin users; the admin role is created automatically on startup</description>
    </property>
    <property>
        <name>hive.security.authorization.enabled</name>
        <value>true</value>
        <description>Enable authorization checks</description>
    </property>
    <property>
        <name>hive.security.authorization.createtable.owner.grants</name>
        <value>ALL</value>
        <description>The creator of a table is granted all privileges on it</description>
    </property>
    <property>
        <name>hive.security.authorization.task.factory</name>
        <value>org.apache.hadoop.hive.ql.parse.authorization.HiveAuthorizationTaskFactoryImpl</value>
        <description>Task factory used for authorization statements</description>
    </property>
    <property>
        <name>hive.semantic.analyzer.hook</name>
        <value>com.example.hive.security.HiveAdmin</value>
        <description>Hook that recognizes the superusers and enforces grant control</description>
    </property>
</configuration>
Restart Hive:
[root@hadoop01 ~]# jps
12401 ResourceManager
12898 RunJar
22338 Jps
12500 NodeManager
11948 NameNode
12204 SecondaryNameNode
12047 DataNode
[root@hadoop01 ~]# kill -15 12898
[root@hadoop01 ~]# nohup hiveserver2 -hiveconf hive.execution.engine=mr &
Connect to Hive and list the roles to check that the configuration took effect:
[root@hadoop01 ~]# beeline -u jdbc:hive2://localhost:10000 -n root
...
0: jdbc:hive2://localhost:10000> set role admin; # switch the current user's role to admin
No rows affected (0.027 seconds)
0: jdbc:hive2://localhost:10000> show roles; # list the roles
+---------+
| role |
+---------+
| admin |
| public |
+---------+
2 rows selected (0.026 seconds)
0: jdbc:hive2://localhost:10000>
Test a grant operation:
0: jdbc:hive2://localhost:10000> use hive_test;
No rows affected (0.028 seconds)
0: jdbc:hive2://localhost:10000> grant select on table bucket_table to user hive;
No rows affected (0.146 seconds)
0: jdbc:hive2://localhost:10000>
Switch to the hive user:
[root@hadoop01 ~]# sudo su - hive
Open an interactive session; now running a grant statement fails, and the error message shows it was thrown from the hook class we implemented:
[hive@hadoop01 ~]$ beeline -u jdbc:hive2://localhost:10000 -n hive
...
0: jdbc:hive2://localhost:10000> grant select on table partition_table to user hive;
Error: Error while compiling statement: FAILED: SemanticException hive is not Admin, except root (state=42000,code=40000)
0: jdbc:hive2://localhost:10000>
Source: https://blog.51cto.com/zero01/2549441