Hue 3.9 with some general overall improvements is out!


Hi Big Data Aficionados,

 

The Hue Team is glad to thank all the contributors and to release Hue 3.9!

The focus of this release was to improve the experience everywhere (no new apps were added) and to improve stability. More than 700 commits on top of 3.8 went in, and some apps like the Notebook Editor and the Spark Job Server got a serious lift! Go grab the tarball release and give it a spin!


Download

Below is a detailed description of the main changes. For the full list, check out the release notes or the documentation.

 

Tutorials

Explore San Francisco Bike share data with a dynamic visual dashboard

Build a real time Tweet dashboard with Search and Spark

 

Main improvements

 

Spark (beta)


  • Revamp of Notebook UI
  • Support for closing session and specifying Spark properties
  • Support for Spark 1.3, 1.4, 1.5
  • Impersonation with YARN
  • Support for R shell
  • Support for submitting jars or python apps

Learn more about the Notebook and Livy, the Spark REST Job Server.
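
If you are curious about the underlying API, here is a minimal sketch of talking to Livy directly with curl. It assumes Livy's default port 8998 and that session id 0 is the first session you created:

# Create an interactive Scala session
curl -X POST -H 'Content-Type: application/json' -d '{"kind": "spark"}' localhost:8998/sessions

# Submit a statement to session 0 once it reports the idle state
curl -X POST -H 'Content-Type: application/json' -d '{"code": "1 + 1"}' localhost:8998/sessions/0/statements

# Poll for the result
curl localhost:8998/sessions/0/statements/0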

 

Search


  • Live filtering when moving on the map
  • Refresh only widgets that changed, refresh every N seconds
  • Edit document
  • Link to original document
  • Export/import saved dashboards
  • Share dashboards
  • Save and reload the full query search definition
  • Fixed or rolling time window filtering
  • Marker clustering on Leaflet Map widget
  • Support 2-letter country code in gradient map widget
  • Full mode Player display
  • Simpler Mustache integration to enhance your result style
  • Big IDs support
  • Preview of nested analytics facets

Read more in this post…

 

Stability/performance

  • Fix deadlock fetching Thrift clients and waiting for Thrift connections
  • New set of integrations tests
  • Add optional /desktop/debug/check_config JSON response
  • MariaDB support
  • Configuration check to confirm that the MySQL engine is InnoDB
  • Faster Home page
  • Series of Oracle and DB migration fixes

 

Security

 

Oozie


Read more in this post…

 

SQL

  • Metastore partition view

 

HBase


  • Upload binary into cells
  • Allow emptying a cell

Read more in this post…

 

Sentry


  • Better support of URI scope privilege
  • Support COLUMN scope privilege for finer grain permissions on tables
  • Support HA
  • Easier navigation between sections
  • Support for the new sentry.hdfs.integration.path.prefixes property in hdfs-site.xml (see the snippet below)
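
For illustration only, the property lives in hdfs-site.xml and takes a comma-separated list of HDFS path prefixes that Sentry should manage; the value below is a placeholder, not a recommendation:

<property>
  <name>sentry.hdfs.integration.path.prefixes</name>
  <value>/user/hive/warehouse</value>
</property>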

 

Indexer

  • Directly upload configurations without requiring the solrctl command

 

ZooKeeper

  • Creation of a lib for easily pulling or editing ZooKeeper information

 

Pig


  • Support %default parameters in the submission popup
  • Do not show %declare parameters in the submission popup
  • Automatically generate hcat auth credentials

 

Sqoop

  • Support Kerberos authentication

 

Conferences

It was a pleasure to present at Big Data Budapest Meetup, Big Data Amsterdam, Hadoop Summit San Jose and Big Data LA.

 

New distributions

 

Team Retreat

Hummus and yogurt were on the menu in Israel!

 

Next!

 

The next release (3.10) will focus on delivering v1 of the Spark Notebook and adding simpler Solr indexing, on top of the general improvements.

The Hue 4 design is also kicking off, with the goal of becoming the equivalent of “Excel for Big Data”. A fresh new look, a unification of all the apps, wizards for ingesting data… will let you use the full platform (Ingest, Spark, SQL, Search) in a single UI for fast Big Data querying and prototyping!

 

Onwards!

 

As usual, thank you to all the project contributors and to everyone sending feedback and participating on the hue-user list or @gethue!

 

32 Comments

  1. lonely7345 2 years ago

    We have been using Hue 3.7 for a few months and have generated a lot of data. How can I upgrade to Hue 3.9 without losing that data?

    • Hue Team 2 years ago

      We would recommend backing up your database first, in case there are any problems. Then simply run “./build/env/bin/hue migrate”, which should take care of updating the database. Hue 3.9 should then be ready for use.
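
      A minimal sketch of the sequence, assuming a MySQL-backed Hue with the database and user both named “hue” (adapt the backup step to your own engine and credentials):

      # 1. Back up the Hue database
      mysqldump -u hue -p hue > hue_backup.sql

      # 2. Run the schema migration from the Hue installation directory
      ./build/env/bin/hue migrate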

  2. bg2 2 years ago

    I got the error message below when I ran “sudo make install” on Ubuntu 12.04 alongside HDP 2.3.


    byte-compiling build/bdist.linux-x86_64/egg/tablib/packages/xlwt3/antlr.py to antlr.pyc
    SyntaxError: ('invalid syntax', ('build/bdist.linux-x86_64/egg/tablib/packages/xlwt3/antlr.py', 1946, 46, ' print(fmt % (line,col,text), file=sys.stderr)\n'))

    byte-compiling build/bdist.linux-x86_64/egg/tablib/packages/markup.py to markup.pyc

    make[2]: *** [ext-env-install] Error 1
    make[2]: Leaving directory `/usr/local/hue/desktop/core'
    make[1]: *** [.recursive-env-install/core] Error 2
    make[1]: Leaving directory `/usr/local/hue/desktop'

    I think I have completely installed all prerequisites. Do you have any ideas about this problem?

    Thank you.

    • Hue Team 2 years ago

      Seems like you have Python 2.x but the build thinks you have Python 3. What is your Python version? (python --version)

  3. Jorge 2 years ago

    I’ve installed Hue 3.9 and I have a problem when I try to run Pig code from the Pig Editor or a Job Designer example, but it works if I run the “oozie job” command. Error in oozie.log:

    2015-10-08 04:36:07,155 WARN ActionStartXCommand:523 - SERVER[m1.novalocal] USER[bbvoop] GROUP[-] TOKEN[] APP[Shell] JOB[0000001-151008034921005-oozie-bbvo-W] ACTION[0000001-151008034921005-oozie-bbvo-W@Shell] Error starting action [Shell]. ErrorType [TRANSIENT], ErrorCode [JA009], Message [JA009: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.]
    org.apache.oozie.action.ActionExecutorException: JA009: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
    at org.apache.oozie.action.ActionExecutor.convertExceptionHelper(ActionExecutor.java:454)
    at org.apache.oozie.action.ActionExecutor.convertException(ActionExecutor.java:434)
    at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:1130)
    at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1299)
    at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:250)
    at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:64)
    at org.apache.oozie.command.XCommand.call(XCommand.java:286)
    at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:175)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
    Caused by: java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
    at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:120)
    at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
    at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
    at org.apache.hadoop.mapred.JobClient.init(JobClient.java:475)
    at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:454)
    at org.apache.oozie.service.HadoopAccessorService$3.run(HadoopAccessorService.java:452)
    at org.apache.oozie.service.HadoopAccessorService$3.run(HadoopAccessorService.java:450)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.oozie.service.HadoopAccessorService.createJobClient(HadoopAccessorService.java:450)
    at org.apache.oozie.action.hadoop.JavaActionExecutor.createJobClient(JavaActionExecutor.java:1342)
    at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:1078)
    … 8 more

    Oozie configuration in hue.ini

    [liboozie]
    oozie_url=http://m1.novalocal:11000/oozie

    YARN configuration in hue.ini

    [[yarn_clusters]]

    [[[default]]]
    # Enter the host on which you are running the ResourceManager
    ## resourcemanager_host=m2.novalocal

    # The port where the ResourceManager IPC listens on
    ## resourcemanager_port=8032

    # Whether to submit jobs to this cluster
    submit_to=True

    # Resource Manager logical name (required for HA)
    logical_name=rm1

    # Change this if your YARN cluster is Kerberos-secured
    security_enabled=true

    # URL of the ResourceManager API
    resourcemanager_api_url=http://m2.novalocal:8088

    # URL of the ProxyServer API
    proxy_api_url=http://m2.novalocal:8088

    # URL of the HistoryServer API
    history_server_api_url=http://m3.novalocal:19888

    # In secure mode (HTTPS), if SSL certificates from YARN Rest APIs
    # have to be verified against certificate authority
    ## ssl_cert_ca_verify=True

    # HA support by specifying multiple clusters
    # e.g.

    [[[ha]]]
    # Resource Manager logical name (required for HA)
    logical_name=rm2
    # URL of the ResourceManager API
    resourcemanager_api_url=http://m3.novalocal:8088

    # URL of the HistoryServer API
    history_server_api_url=http://m3.novalocal:19888

    # URL of the ProxyServer API
    proxy_api_url=http://m2.novalocal:8088

    # Change this if your YARN cluster is Kerberos-secured
    security_enabled=true

    # Whether to submit jobs to this cluster
    submit_to=True

    Other Hue applications (Job Browser, File Browser, Hive editor, …) work fine. Any idea?

    • Jorge 2 years ago

      I fixed it!!
      Hue sends the logical_name variable to Oozie, which changes the job properties. I changed hue.ini:
      [[hdfs_clusters]]
      [[[default]]]
      logical_name=hdfs://hdfsha # HA HDFS cluster-id
      [[yarn_clusters]]
      [[[default]]]
      logical_name=yarnha # HA Yarn cluster-id

  4. Alex Laverty 2 years ago

    It looks like the tgz is corrupted. I downloaded Hue from this URL:

    https://dl.dropboxusercontent.com/u/730827/hue/releases/3.9.0/hue-3.9.0.tgz

    on 02/11/2015 at 11:50am. When I double-click the tgz file it just unzips recursively into another tarball… can someone have a look?

    I’m on a Mac OS Yosemite 10.10.5

    Regards Alex

    • Hue Team 2 years ago

      I just tried and it was good for me:

      69741428 bytes
      sha1 49acd9dd38f69440e7c09829cefbb3db411e667b
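
      If double-clicking keeps re-wrapping the archive, a quick check and extraction from a terminal usually works; this sketch assumes the tarball unpacks into a hue-3.9.0 directory:

      # Compare against the sha1 above
      shasum hue-3.9.0.tgz

      # Extract on the command line instead of double-clicking
      tar -xzf hue-3.9.0.tgz
      cd hue-3.9.0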

  5. Tom 2 years ago

    Hi, I have installed Hue 3.7 by following this guide http://gethue.com/hadoop-tutorial-how-to-create-a-real-hadoop-cluster-in-10-minutes/
    I now want to test out the Spark Notebook but think I need to update to a later version of Hue. How do I update my version of Hue?

    thanks

    • Hue Team 2 years ago

      This will use CDH, and CDH 5.5 will come out with the latest Hue in a week or two.
      Note that the current CDH installs Hue 3.8 even if it shows 3.7.

  6. tim fei 1 year ago

    The download link doesn’t work in mainland China. Is there a mirror or alternative site?

  7. JigarS 1 year ago

    Is there a way to download binary distribution of Hue 3.9 for Linux without going through build steps? I also cannot download from dropbox, which is blocked at work.

    • Hue Team 1 year ago

      Any reason you can’t use a packaged version of Hue? CDH 5.5 is the most recent one, for example, with Hue 3.9 plus a few more commits. There are also BigTop and others, but they currently seem to be older. The Hue project currently only offers tarballs, as we would need a lot more infrastructure to build packages and other companies are already building them. We put a copy here for you: http://gethue.com/downloads/hue-3.9.0.tgz

  8. Nikitha 1 year ago

    Hello, after installing Hue 3.9 I am getting the Hue home page full of errors. Can you please give me the configuration file /home/nikitha/hue/desktop/conf/pseudo-distributed.ini?

    There are no documents which clearly explain the configuration and connection process between Hadoop and Hue.

    Please help me out. If possible, please send the documents to my mail id [email protected]

    Also, to install Hue, what versions of Hive and Hadoop should I have?

    Please shed some light there too.

  9. anand 1 year ago

    Read one of the comments regarding Hue on Windows. Still no support for Windows? Any possibility or near-future plans?

  10. Dong Chen 1 year ago

    Hi Hue Team Guys,

    I am currently using HUE. HUE makes our work easier. I like it.

    But I want more: I found that even modifying a table cannot be done from Hue. Is this a misunderstanding, or can you please help with how to deal with it?

    Thanks!

    I changed my e-mail address

    • Hue Team 1 year ago

      Do you want to modify Hive tables or prevent users from modifying them?

      • Dong Chen 1 year ago

        Modifying a Hive table as a user.

        Change the column data type, change the location of the file, or change the attributes of a Hive table.
        These cases usually happen when you create a table on ingest and, after running a few SQL queries on it, you are looking to change your table.

        Thanks.

        • Hue Team 1 year ago

          Right now the interface only supports editing the comments of a table. Name and type editing will come at some point.
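
          In the meantime, such changes can be made with plain Hive DDL from the Hive editor; here is a sketch with hypothetical table and column names:

          -- hypothetical names; adjust to your table
          ALTER TABLE web_logs CHANGE COLUMN ip ip STRING COMMENT 'client address';
          ALTER TABLE web_logs SET LOCATION 'hdfs:///user/demo/web_logs';
          ALTER TABLE web_logs SET TBLPROPERTIES ('notes'='adjusted after ingest');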

  11. Dong Chen 1 year ago

    Hi Hue Team,

    Is it possible to incorporate different `SerDe`s and more functions, including aggregation functions, into Hue? Or provide a plugin marketplace where the ones users really need can easily be managed, configured, and updated?

    As a web-based development tool, does Hue have limitations compared to client/server development tools such as Eclipse and SQL Developer? Real developers really like functionality such as auto-complete, source code management, syntax correction, and package dependency handling, which lets them start development immediately and efficiently.

    But it is much better than logging into the backend to manage the file system, submit SQL, or control workflows. Thank you guys.

    Regards!

    Dong

    • Hue Team 1 year ago

      Hue is improving and focusing on the core of the SQL editor right now. Auto-complete and syntax correction are improving… and more is coming as it matures. Especially check out Hue 3.10, which is coming out this month. The good news is that you can already register Hive SerDes at the HiveServer2 level and they will show up in Hue.
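
      A sketch of what registering one looks like from a SQL session (the jar path and SerDe class below are examples only; a permanent registration would typically go through HiveServer2's auxiliary jars path instead):

      -- example only: make the SerDe jar visible, then reference it
      ADD JAR /opt/hive/aux/json-serde.jar;
      CREATE TABLE tweets (id BIGINT, text STRING)
      ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe';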

      Hue is also open source and welcomes any contributions in this area.

  12. Toni 11 months ago

    We migrated our Hue instances from Hue 2.6 to Hue 3.9. During the migration we had to manually create the following tables:
    `desktop_settings`
    `useradmin_userprofile`
    `useradmin_huepermission`
    `desktop_document`
    `beeswax_session`
    `beeswax_savedquery`
    `desktop_documenttag`
    `desktop_document_tags`
    `beeswax_queryhistory`
    `useradmin_grouppermission`

    The application is running and working. However, the oozie console does not allow users to save workflows. Upon investigating we realized that more tables were missing. It looks like the desktop_document2* tables were not created.

    I traced these to:
    -rw-rw-r--. 1 hue hue 13246 May 8 2015 0010_auto__add_document2__chg_field_userpreferences_key__chg_field_userpref.py
    -rw-rw-r--. 1 hue hue 8668 Jun 4 2015 0011_auto__chg_field_document2_uuid.py
    -rw-rw-r--. 1 hue hue 8715 Aug 7 2015 0012_auto__chg_field_documentpermission_perms.py
    -rw-rw-r--. 1 hue hue 9633 Aug 7 2015 0013_auto__add_unique_documenttag_owner_tag.py
    -rw-rw-r--. 1 hue hue 9756 Aug 7 2015 0014_auto__add_unique_document_content_type_object_id.py
    -rw-rw-r--. 1 hue hue 9800 Aug 7 2015 0015_auto__add_unique_documentpermission_doc_perms.py
    -rw-rw-r--. 1 hue hue 11512 Aug 7 2015 0016_auto__add_unique_document2_uuid_version_is_history.py

    Is there a way that we can invoke these scripts to get the necessary tables created?

    • Hue Team 11 months ago

      You could use
      ./build/env/bin/hue syncdb
      ./build/env/bin/hue migrate

      • Toni 11 months ago

        If I run ./build/env/bin/hue migrate or syncdb, will that overwrite the tables currently in the Hue database? I don’t want to lose the data I have now.

        • Hue Team 11 months ago

          No, it will add the missing tables and fields. We recommend backing up your DB just in case, for your peace of mind.

  13. ligang 11 months ago

    I want to use hue-3.9.0-cdh5.4.7 because our other Hadoop modules are CDH 5.4.7, but on this site http://archive.cloudera.com/cdh5/cdh/5/ Hue 3.9 starts with CDH 5.5.0. So how can I build hue-3.9.0-cdh5.4.7?
