For Hue 3.9 and BigInsights 4.1, have a look at https://developer.ibm.com/hadoop/blog/2015/10/27/how-to-install-hue-3-9-on-top-of-biginsights-4-1/
This article walks you through the steps required to deploy and set up HUE on IBM BigInsights version 4.0 and above.
HUE, or Hadoop User Experience, is a web interface for analyzing data with Apache Hadoop. With big data, you need a tool to navigate, query and even search your data, and HUE ties all of this together in one place.
To deploy HUE on BigInsights, you need an up-and-running BigInsights Version 4.x cluster. For the purposes of this article, we use the BigInsights V4 Quick Start Edition, which is available for free on the IBM website. It is assumed that your OS is Red Hat 6.x; if not, you will need to adjust the package installation commands for your Linux distribution.
HUE requires a few dependencies to run, so let's start by installing the required packages. Launch a terminal and run:
[root@rvm /]# yum install ant
[root@rvm /]# yum install python-devel.x86_64
[root@rvm /]# yum install krb5-devel.x86_64
[root@rvm /]# yum install krb5-libs.x86_64
[root@rvm /]# yum install libxml2.x86_64
[root@rvm /]# yum install python-lxml.x86_64
[root@rvm /]# yum install libxslt-devel.x86_64
[root@rvm /]# yum install mysql-devel.x86_64
[root@rvm /]# yum install openssl-devel.x86_64
[root@rvm /]# yum install libgsasl-devel.x86_64
[root@rvm /]# yum install sqlite-devel.x86_64
[root@rvm /]# yum install openldap-devel.x86_64
We will download HUE version 3.7.1, the latest release as of this writing, and extract it. In the terminal, run the following commands:
[root@rvm /]# wget https://cdn.gethue.com/downloads/releases/3.7.1/hue-3.7.1.tgz
[root@rvm /]# echo "JAVA_HOME=\"/usr/lib/jvm/java-7-openjdk.x86_64/jre\"" >> /etc/environment
(Adjust the JDK path to match the Java installation on your system. Note that the redirection must be run as root; prefixing `sudo` to `echo` does not apply to the `>>` redirection.)
[root@rvm /]# tar zxvf hue-3.7.1.tgz
Add User And Group for HUE
[root@rvm /]# groupadd hue
[root@rvm /]# useradd hue -g hue
[root@rvm /]# passwd hue
Now give user hue ownership of the extracted hue folder by executing the following command.
[root@rvm /]# chown -R hue:hue hue-3.7.1
You will also need to add user hue to the sudoers file.
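For example, a line like the following (added via visudo) grants hue full sudo rights; tailor it to your site's sudo policy:
hue ALL=(ALL) ALL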
1. As user hue, change into the extracted directory and start the installation:
[hue@rvm /]$ cd hue-3.7.1
[hue@rvm hue-3.7.1]$ sudo make install
2. By default, HUE installs to '/usr/local/hue' on your management node's local filesystem. Make user hue the owner of the /usr/local/hue folder by executing:
sudo chown -R hue:hue /usr/local/hue
Setting up hadoop properties for HUE
1. Configure properties in core-site.xml
i. Enable Webhdfs
Go to Ambari, select HDFS in the left-hand panel and then select the Configs tab.
Then scroll down and make sure WebHDFS is enabled (its checkbox is ticked).
ii. Under custom core-site.xml, add the two Hadoop proxy-user properties for user hue, each with the value "*".
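Assuming the standard Hadoop proxy-user property names for the hue user, the two entries would look like this (verify the names against your Hadoop release):
<property>
  <name>hadoop.proxyuser.hue.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hue.groups</name>
  <value>*</value>
</property>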
2. Configure properties in oozie-site.xml
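For Oozie, the equivalent proxy-user entries in oozie-site.xml would typically be as follows (property names assumed from the stock Oozie configuration; verify against your release):
<property>
  <name>oozie.service.ProxyUserService.proxyuser.hue.hosts</name>
  <value>*</value>
</property>
<property>
  <name>oozie.service.ProxyUserService.proxyuser.hue.groups</name>
  <value>*</value>
</property>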
3. Configure properties in webhcat-site.xml
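Similarly, the WebHCat proxy-user entries in webhcat-site.xml would typically be (again, verify the property names against your release):
<property>
  <name>webhcat.proxyuser.hue.hosts</name>
  <value>*</value>
</property>
<property>
  <name>webhcat.proxyuser.hue.groups</name>
  <value>*</value>
</property>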
Configure the hue.ini file to point to your Hadoop cluster
Note: In this article the cluster is a small, single-node cluster, so services such as Hive Server, Hive Metastore, HBase Master and ZooKeeper are all deployed on the same node. On a larger cluster, enter the correct node information for each of the services we edit next.
i. Edit the HDFS and WebHDFS parameters to point to your cluster. Don't forget to uncomment these parameters after adding values.
ii. Configure the YARN parameters, and again don't forget to uncomment them.
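As a sketch, the relevant hue.ini sections look like the following; <namenode_host> and <resourcemanager_host> are placeholders for your own node names, and the ports shown are the usual Hadoop defaults (verify them against your cluster):
[hadoop]
  [[hdfs_clusters]]
    [[[default]]]
      fs_defaultfs=hdfs://<namenode_host>:8020
      webhdfs_url=http://<namenode_host>:50070/webhdfs/v1
  [[yarn_clusters]]
    [[[default]]]
      resourcemanager_host=<resourcemanager_host>
      resourcemanager_api_url=http://<resourcemanager_host>:8088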
– Save all the changes.
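HUE is not started automatically by make install. Assuming the default install path, it can be started with its supervisor script, run as user hue:
[hue@rvm /]$ /usr/local/hue/build/env/bin/supervisor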
In your browser, go to http://<management_node_host>:8888 (HUE's default port).
When prompted for a user ID and password, log in as user hue with the password you created earlier.
You should see the HUE home page, which confirms that HUE is working properly.
In this article we have successfully deployed HUE 3.7.1 on top of BigInsights V4.0 using the Quick Start Edition. This setup allows an end user to browse, copy and delete HDFS files, run queries against Hive and HBase, and even create dashboards for data analysis. The interface can also serve as a front end for an enterprise search application powered by Solr.