Oozie Dashboard Improvements in Hue 3.9

Hello Oozie users,

The Hue 3.9 release comes with several improvements to the Oozie dashboard, making it more robust and scalable.

Here is a video demoing the new features:

The new feature list:

  • Paginate all the Workflow / Coordinator / Bundle dashboard lists. Now you can see them all and filter by status.

  • Paginate all the Coordinator actions.
  • Moved Coordinator action filtering to the backend. This results in more accurate filtering and easier navigation across all actions.

[Screenshot: Coordinator actions list]

  • Update the Concurrency and PauseTime of a running Coordinator (a REST sketch of these operations appears below).

[Screenshot: editing a running Coordinator]

  • Ignore a terminated Coordinator action.

[Screenshot: ignoring a Coordinator action]
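For reference, here is a minimal sketch of driving the same two operations (changing a running Coordinator and ignoring one of its actions) directly through the Oozie REST API with Python's requests library. The server URL, coordinator id, user and values are placeholder assumptions, and the exact parameters of the ignore call are worth double-checking against your Oozie version; this is an illustration, not the code Hue runs.

import requests

OOZIE = "http://localhost:11000/oozie"              # assumed Oozie server URL
COORD_ID = "0000123-160427224800779-oozie-oozi-C"   # placeholder coordinator job id
USER = "hdfs"                                       # assumed submitting user

# Update concurrency and pause time of the running coordinator
# (the REST equivalent of `oozie job -change <id> -value ...`).
resp = requests.put(
    OOZIE + "/v1/job/" + COORD_ID,
    params={
        "action": "change",
        "value": "concurrency=2;pausetime=2015-09-01T00:00Z",
        "user.name": USER,
    },
)
resp.raise_for_status()

# Ignore a terminated coordinator action (Oozie 4.1+); "scope" is the
# action number to ignore, a placeholder value here.
resp = requests.put(
    OOZIE + "/v2/job/" + COORD_ID,
    params={"action": "ignore", "type": "action", "scope": "4", "user.name": USER},
)
resp.raise_for_status()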

Next!

Our focus in the near future will be on usability improvements, like easier updates of a scheduled workflow and better Workflow action dashboard navigation. Feel free to suggest new improvements or comment on the hue-user list or @gethue!

8 Comments

  1. Hector 10 months ago

    An Oozie Hive workflow cannot start, failing with java.lang.StackOverflowError, in Hue (HA mode).
    I hit a java.lang.StackOverflowError with Hue and Oozie:
    I am using Oozie (4.1.0-cdh5.4.1) with Hue (3.7.0), but after I create a Hive workflow that just runs “select * from table”, the error occurred as below:

    2016-04-27 22:59:55,408 ERROR ActionStartXCommand:517 – SERVER[ISHDWS006] USER[hdfs] GROUP[-] TOKEN[] APP[My_Workflow] JOB[0000001-160427224800779-oozie-oozi-W] ACTION[0000001-160427224800779-oozie-oozi-W@shell-89de] Error,
    java.lang.StackOverflowError
    at java.util.Hashtable.hash(Hashtable.java:239)
    at java.util.Hashtable.get(Hashtable.java:434)
    at java.util.Properties.getProperty(Properties.java:951)
    at java.util.Properties.getProperty(Properties.java:970)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1146)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:470)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:444)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:470)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:444)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:470)

    The generated configuration is as below:
    Name Value
    hue-id-w 50001
    jobTracker yarn-rm-ha
    mapreduce.job.user.name hdfs
    nameNode dwparallelspace

    I think the cause may be that the nameNode value has no hdfs:// prefix, but I don’t know where I should set that configuration (see the submission sketch after this thread).
    How can I resolve this problem? Thanks very much.

    • Hue Team 10 months ago

      Hum, this error seems internal to Oozie. Would you have a giant list of Hadoop properties?

      • Hector 10 months ago

        Some related properties:
        core-site.xml

        fs.defaultFS
        hdfs://dwparallelspace

        hdfs-site.xml

        dfs.nameservices
        dwparallelspace

        dfs.ha.namenodes.dwparallelspace
        nn1,nn2

        dfs.namenode.rpc-address.dwparallelspace.nn1
        S001:8020

        dfs.namenode.rpc-address.dwparallelspace.nn2
        S002:8020

        dfs.namenode.http-address.dwparallelspace.nn1
        S001:50070

        dfs.namenode.http-address.dwparallelspace.nn2
        S002:50070

        yarn-site.xml

        yarn.resourcemanager.cluster-id
        yarn-rm-ha

      • Hector 10 months ago

        Any suggestions? Thanks very much.
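Regarding the nameNode question in the thread above: when a workflow is submitted to Oozie, nameNode is simply one of the job configuration properties sent to the server, and with HDFS HA it is expected to be the fully qualified nameservice URI (hdfs://dwparallelspace). Below is a minimal, hedged sketch of a direct submission through the Oozie REST API with Python's requests; the Oozie URL, user and application path are placeholder assumptions, not values taken from the thread, and it only illustrates where the property lives, not how Hue generates it.

import requests

OOZIE = "http://localhost:11000/oozie"   # assumed Oozie server URL

# Job configuration; nameNode carries the fully qualified HA nameservice URI.
# The application path below is a placeholder, not a path from this thread.
conf = """<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <property><name>user.name</name><value>hdfs</value></property>
  <property><name>nameNode</name><value>hdfs://dwparallelspace</value></property>
  <property><name>jobTracker</name><value>yarn-rm-ha</value></property>
  <property><name>oozie.wf.application.path</name>
            <value>hdfs://dwparallelspace/user/hdfs/my_workflow</value></property>
</configuration>"""

# Submit and start the workflow; Oozie replies with the new job id.
resp = requests.post(
    OOZIE + "/v1/jobs",
    params={"action": "start", "user.name": "hdfs"},
    data=conf,
    headers={"Content-Type": "application/xml;charset=UTF-8"},
)
resp.raise_for_status()
print(resp.json())   # e.g. {"id": "<new workflow job id>"}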

  2. Enn 6 months ago

    The Oozie dashboard only keeps failed jobs for a longer time in the coordinator calendar (old succeeded jobs are not shown).
    Is there a way to configure this, or some other way to see the history of old succeeded jobs?

    • Sai (Author) 6 months ago

      Using these properties in oozie-default.xml, you can control the lifetime of dashboard jobs:
      “oozie.service.AuthorizationService.authorization.enabled”
      “oozie.service.AuthorizationService.default.group.as.acl”

      One of the cases where a workflow is not cleaned up is when its coordinator is still running.
