Reputation: 3367
I am trying to configure SQL Standard Based Authorization in Spark 1.4.0, the same way I did for Hive 0.13.1, by adding the following properties:
<property>
  <name>hive.security.authorization.enabled</name>
  <value>true</value>
  <description>enable or disable the Hive client authorization</description>
</property>
<property>
  <name>hive.security.authorization.manager</name>
  <value>org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdHiveAuthorizerFactory</value>
  <description>The Hive client authorization manager class name. The user defined authorization class should implement interface org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider.</description>
</property>
<property>
  <name>hive.security.authenticator.manager</name>
  <value>org.apache.hadoop.hive.ql.security.SessionStateUserAuthenticator</value>
  <description>hive client authenticator manager class name. The user defined authenticator should implement interface org.apache.hadoop.hive.ql.security.HiveAuthenticationProvider.</description>
</property>
<property>
  <name>hive.users.in.admin.role</name>
  <value>hduser</value>
  <description>Comma separated list of users who are in admin role for bootstrapping. More users can be added in ADMIN role later.</description>
</property>
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
  <description>Setting this property to true will have HiveServer2 execute Hive operations as the user making the calls to it.</description>
</property>
In Hive this works fine, but in Spark it does not: when I try to grant privileges on tables or create roles, an exception is thrown.
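For reference, the failing operations are statements along these lines (illustrative HiveQL; the exact role, user, and table names here are assumptions, not the original ones):

```sql
-- Run from a spark-sql / beeline session as a user listed in hive.users.in.admin.role
SET ROLE ADMIN;

-- Create a role and grant it to a user (names are hypothetical)
CREATE ROLE analyst;
GRANT ROLE analyst TO USER someuser;

-- Grant SELECT on a table to that role
GRANT SELECT ON TABLE mytable TO ROLE analyst;
```

In Hive 0.13.1 with the configuration above, these succeed; the same statements issued through Spark 1.4.0 raise an exception.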
Upvotes: 4
Views: 1720
Reputation: 979
Spark does not support Hive authorization as of now; see https://issues.apache.org/jira/browse/SPARK-12008
Upvotes: 1