Use the Shell Action in Oozie

The following steps show how to execute a Shell action from the Oozie Editor.

If the executable is a standard Unix command, you can enter it directly in the Shell Command field and click the Add button.


Arguments to the command can be added by clicking the Arguments+ button.


The ${VARIABLE} syntax lets you enter the value dynamically via the Submit popup.
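For illustration, suppose the Shell Command is a small script (the name greet.sh and the variable name are hypothetical) that takes one argument; wiring its argument up as ${name} in the editor makes the Submit popup prompt for "name" at run time:

```shell
#!/usr/bin/env bash
# greet.sh -- hypothetical script for a Shell action.
# In the editor: Shell Command = greet.sh, Arguments+ = ${name};
# the Submit popup will then ask for the value of "name".
greet() {
  echo "Hello, ${1:-world}"
}
greet "$@"
```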


If using a Hue version older than 4.3 (this is automated from 4.3 onward):

If the executable is a script rather than a standard UNIX command, it needs to be copied to HDFS, and its path can be specified using the File Chooser in the Files+ field.

#!/usr/bin/env bash
# Illustrative script body; any executable script works here.
echo "Hello from the Shell action"

Additional Shell action properties can be set by clicking the settings button in the top-right corner.

The next version will support a direct script path, along with standard UNIX commands, in the Shell Command field, making it even more intuitive.


  1. Sarthak Saxena 3 years ago


    I am trying to run a simple shell script that has the following content:
    spark-submit --class org.apache.spark.examples.SparkPi --master yarn-client /cloudera/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/spark/lib/spark-examples.jar 100

    In the shell command I write and then I add my file “” from my HDFS directory.

    log4j:WARN No appenders could be found for logger (
    log4j:WARN Please initialize the log4j system properly.
    log4j:WARN See for more info.

    Kindly if you can assist.

    • Author
      Sai 3 years ago

      Hi Sarthak,
      You are supposed to give ‘--class’ and ‘--master’ (with double dashes). Also, please confirm on the CLI that the command is working before running the shell action.

      • Nitish 3 years ago


        I’m trying to connect to Hive via a shell script on a Kerberos cluster. I’m passing HCat credentials as well, but the job is stuck, with continuous messages logged as

        Kindly let me know how to go forward with the same.
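As Sai's reply above notes, the flags need double dashes. Here is a sketch of the corrected invocation; the function only builds and prints the command instead of executing it, since spark-submit needs a live cluster, and the jar path (taken from the comment) will differ per installation:

```shell
#!/usr/bin/env bash
# Build (but do not run) the corrected spark-submit command line.
# The jar path below is illustrative and cluster-specific.
build_cmd() {
  echo "spark-submit --class org.apache.spark.examples.SparkPi" \
       "--master yarn-client" \
       "/opt/cloudera/parcels/CDH/lib/spark/lib/spark-examples.jar 100"
}
build_cmd
```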

  2. dante 3 years ago

    Hi Sarthak,
    I successfully ran the shell script above. In addition, can I connect to MySQL (on a remote machine) with a shell action? Pleasure to see your reply.

  3. Manoj 2 years ago

    How can I read arguments from a property file while creating a workflow using the Oozie editor?

    • Hue Team 2 years ago

      The workflow editor does not read any property file. The properties need to be entered in the settings of the workflow instead.

  4. Sushant 2 years ago

    I am trying to run a basic script in which I want to create a directory in HDFS, but I am unable to do it.

    • Hue Team 2 years ago

      Could you explain your problem in more detail?

  5. CrazyRen 2 years ago

    I had some sqoop import/export commands written in one script and made it a Shell action, then I submitted the workflow as the “admin” Hue user and got an exception like this:
    Job init does not exists: hdfs://xxxxx/user/admin/.staging/job_xxxxxxxxxxxxx_xxxx/job.splitmetainfo
    But it works well if I submit as the “yarn” Hue user. I tried adding a variable “” under “oozie.use.system.libpath” in the workflow settings (Oozie Editor UI), but it’s useless.

    • Hue Team 2 years ago

      Is the sqoop command working on the CLI as the ‘admin’ user?
      Also in non kerberos clusters, Oozie will run the shell jobs as ‘yarn’ and not the actual user, hence potential above problems.

      • CrazyRen 2 years ago

        The script of batch sqoop commands runs successfully on the ssh CLI as the ‘root’ user. I haven’t tried it as the ‘admin’ user. Actually, I’m not sure there is a user named ‘admin’ on the Linux host now.

        • Hue Team 2 years ago

          If you go into Job Designer and run the Sqoop example, does it work?

          • CrazyRen 2 years ago

            Yes. There is nothing wrong when running sqoop example from Job Designer.
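As the reply above mentions, on non-Kerberos clusters Oozie runs shell jobs as the 'yarn' user rather than the submitting user. A one-line diagnostic script (illustrative) dropped into a Shell action shows which user the action actually runs as, which is often the first thing to check with permission errors like the one above:

```shell
#!/usr/bin/env bash
# Print the effective user of the Shell action; on non-Kerberos clusters
# this is typically 'yarn', which explains permission differences.
current_user() {
  whoami
}
echo "Shell action runs as: $(current_user)"
```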

  6. Gerald Preston 2 years ago

    Hi, I’m trying to start a StreamSets pipeline via an ‘sh’ file containing “#!/bin/bash $SS/bin/streamsets cli -U http://localhost:18630 -u xxx -p xxx --dpmURL manager start -n PipeLineID53846a76-74-b0491be4d42d8789”, and I get
    “DPM Login failed, status code ‘403’: {“message”:”Authentication failed”}”. What is wrong?
    From the command line: “$SS/bin/streamsets cli -U http://localhost:18630 -u xxx -p xxx --dpmURL manager start -n PipeLineID53846a76-74-b0491be4d42d8789” I get “Connection refused”

    I see not to run it as a bash. I have no idea about the java?

    Any ideas?


    • Hue Team 2 years ago

      Depending on your cluster, could you use the full hostname instead of localhost?

  7. Mahesh 1 year ago


    Can you guys please show a slightly more advanced example that reads a properties file inside the shell script and prints the variables?

    I am trying to do this but am having real trouble, and the Oozie job always fails with an error from the Shell command.

    Appreciate your help here.

    • Hue Team 1 year ago

      How about this (the file and variable names below are illustrative)?

      #!/usr/bin/env bash
      # Load the properties file (shipped via the Files+ field) and
      # print one of its variables.
      source my.properties
      echo "$MY_VARIABLE"

      • satish 1 year ago

        Hi Hue Team,
        It would be helpful if you can provide details like source a file(present in hdfs) in the shell script and read the variable
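One way to do what satish asks (an untested sketch; it assumes the hdfs CLI is on the PATH of the action's container, and that the properties file contains simple KEY=VALUE lines) is to evaluate the output of hdfs dfs -cat:

```shell
#!/usr/bin/env bash
# Load KEY=VALUE pairs from the output of any command into the current
# shell. On a cluster: load_props hdfs dfs -cat /user/me/env.properties
# (the path is a hypothetical example).
# Note: eval executes the file's content, so only use files you control.
load_props() {
  eval "$("$@")"
}
```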

  8. yss 1 year ago

    I followed this page by creating my script in /tmpt on hdfs. But when I run my script I get the following error.
    USER[hue] GROUP[-] TOKEN[] APP[My_Workflow] JOB[0000020-171208174942737-oozie-oozi-W] ACTION[[email protected]] Exception in check(). Message[JA017: Could not lookup launched hadoop Job ID [job_1512662704502_0021] which was associated with action [[email protected]]. Failing this action!]
    org.apache.oozie.action.ActionExecutorException: JA017: Could not lookup launched hadoop Job ID [job_1512662704502_0021] which was associated with action [[email protected]]. Failing this action!
    at org.apache.oozie.action.hadoop.JavaActionExecutor.check(
    at org.apache.oozie.service.CallableQueueService$
    at java.util.concurrent.ThreadPoolExecutor.runWorker(
    at java.util.concurrent.ThreadPoolExecutor$

    Do you know how to solve this ?

  9. Cosmin Miron 1 year ago

    Hi Hue Team,

    I have a problem creating a Hue Shell workflow – I’m not able to browse for the sh script in HDFS in the first window (first picture in your tutorial) shown after I dragged the shell action into my newly created workflow. The only option I have there is to choose one of my documents from a combo box, but there is no ‘…’ button to browse for the script in HDFS.

    I tried to create a shell document (saved into my workspace); here I have the option to select a script path from HDFS, but when running the document, I encountered this error:
    Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.ShellMain], main() threw exception, Cannot run program “” (in directory “/hadoop/yarn/local/usercache/cosmin.miron/appcache/application_1513247626893_0013/container_e191_1513247626893_0013_01_000002”): error=2, No such file or directory. It is a Python script saved in my home directory (/user/cosmin.miron with drwxr-x— rights).

    Even though this document failed to run, I tried to select it in my workflow (when adding the shell action), but I did not find my document in that combo box.

    I should mention that the cosmin.miron user is an LDAP user imported into Hue with full Hue rights (like the admin user). I tried to create a workflow with the admin user, and when adding the shell action, I do have the option to browse HDFS for my shell script (as in the first picture of your tutorial).

    How can I fix this for my LDAP user cosmin.miron?

    Thank you!

    • Hue Team 1 year ago

      Document Shell action, in the Editor: probably the script needs to be added as a File parameter, and the shell command should just be its ‘name’. You need to click on the cog icon of the snippet to add one.

      Soon it will be automatic.

      • Cosmin Miron 1 year ago

        Thank you for your reply…

        I solved the problem of not having the browse button (…) to select the shell script (when dragging a shell action into my workflow) by setting enable_document_action=false in the hue.ini config file.

        What you suggested works well when configuring a shell action in a workflow… but I was talking about the case where you have just a shell script that you want to execute (without dragging it into a workflow) – in this case I set the file as you suggested and kept the script file as ‘name’ (even in the interface, the name of the edit box is ‘Script path’)… after submitting, I got this error: unsupported operand type(s) for +: ‘dict’ and ‘str’ (for this error I found this issue

        Thank you again!

  10. Priya 12 months ago

    I want to run an ssh action, log into the edge node, and run the shell script there. I set up passwordless ssh and tried running it from the CLI, but I want to try it using the Oozie UI. Can anyone help me with this?

  11. Parist 9 months ago

    Hi Hue Team,
    I am trying to run a simple shell script, but it fails with the following errors:
    1. Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.ShellMain], exit code [1]
    2. ERROR is considered as FAILED for SLA

    I have put my script in an HDFS directory.

    Thanks.

    • Hue Team 9 months ago

      You should see the stderr logs by clicking on the log icon of the action.

  12. Surender 4 months ago

    Hi Hue Team,

    I am trying to run a Hive command from the shell. It is unable to instantiate SessionHiveMetaStoreClient. I got the following error:
    Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0

    Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p2818.3063/jars/hive-common-1.1.0-cdh5.12.1.jar!/
    FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient

    Please help me to fix this issue.

    Thank You

    • Hue Team 4 months ago

      Using the Hive action, or even better the HiveServer2 action, is recommended. If that is not possible in your case, you would need to add a proper hive-site.xml into a ‘lib’ folder in the workspace of the workflow.
