Use the Shell Action in Oozie

The following steps will guide you through executing a Shell action from the Oozie Editor.

If the executable is a standard UNIX command, you can enter it directly in the Shell Command field and click the Add button.


Arguments to the command can be added by clicking the Arguments+ button.


The ${VARIABLE} syntax lets you enter the value dynamically via the Submit popup.

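For example, with the (hypothetical) command sleep entered in the Shell Command field and ${seconds} added via Arguments+, the Submit popup asks for a value of seconds at submission time; with a value of 60 the action effectively runs:

# What the Shell action executes once the Submit popup has
# filled in seconds=60 (hypothetical command and value)
sleep 60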

If the executable is a script instead of a standard UNIX command, it needs to be copied to HDFS, and its path can then be specified with the File Chooser in the Files+ field. For example, a simple script:

#!/usr/bin/env bash

sleep $1   # sleep for the number of seconds passed in via Arguments+
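The script itself has to live in HDFS before the File Chooser can pick it up; a minimal sketch of copying it there from the command line, assuming a hypothetical destination path of /user/demo:

# Upload the local script to HDFS so the Files+ File Chooser can point at it
# (/user/demo/script.sh is a hypothetical path)
hdfs dfs -mkdir -p /user/demo
hdfs dfs -put -f script.sh /user/demo/script.sh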


Additional Shell action properties can be set by clicking the settings button in the top-right corner.

The next version will support a direct script path, along with standard UNIX commands, in the Shell Command field, making it even more intuitive.

17 Comments

  1. Sarthak Saxena 2 years ago

    Hi,

    I am trying to run a simple shell script that has the following content:
    spark-submit --class org.apache.spark.examples.SparkPi --master yarn-client /cloudera/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/spark/lib/spark-examples.jar 100

    In the shell command I write script.sh and then I add my file “script.sh” from my HDFS directory.

    log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
    log4j:WARN Please initialize the log4j system properly.
    log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

    Kindly assist if you can.

    • Author
      Sai 2 years ago

      Hi Sarthak,
      You are supposed to give '--class' and '--master'. Also, please confirm on the CLI that the command works before running the shell action.

      • Nitish 1 year ago

        Hi,

        I'm trying to connect to Hive via a shell script on a Kerberos cluster. I'm passing HCat credentials as well, but the job is stuck, continuously logging:
        HeartBeat
        HeartBeat

        Kindly let me know how to go forward with the same.

  2. dante 2 years ago

    Hi Sarthak,
    I had successfully run the shell script above. In addition, can I connect to MySQL (on a remote machine) with a shell action? I look forward to your reply.

  3. Manoj 1 year ago

    How do I read arguments from a property file while creating a workflow using the Oozie editor?

    • Hue Team 1 year ago

      The workflow editor does not read any property file. The properties need to be entered in the workflow settings instead.

  4. Sushant 11 months ago

    I am trying to run a basic script that creates a directory in HDFS, but I am unable to do it.

    • Hue Team 11 months ago

      Could you explain your problem in more detail?

  5. CrazyRen 9 months ago

    I had some Sqoop import/export commands written in one script and made it a Shell action, then I submitted the workflow as the "admin" Hue user and got an exception like this:
    Job init failed: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.io.FileNotFoundException: File does not exist: hdfs://xxxxx/user/admin/.staging/job_xxxxxxxxxxxxx_xxxx/job.splitmetainfo
    But it works well if I submit as the "yarn" Hue user. I tried adding a variable "user.name=yarn" under "oozie.use.system.libpath" in the workflow settings (Oozie Editor UI), but it had no effect.

    • Hue Team 9 months ago

      Is the Sqoop command working on the CLI as the 'admin' user?
      Also, on non-Kerberos clusters, Oozie runs the shell jobs as 'yarn' and not as the actual user, hence the potential problems above.

      • CrazyRen 9 months ago

        The script of batch Sqoop commands runs successfully on the SSH CLI as the 'root' user. I haven't tried it as the 'admin' user. Actually, I'm not sure there is a user named 'admin' on the Linux host now.

        • Hue Team 9 months ago

          If you go in Job Designer and run the Sqoop example, does it work?

          • CrazyRen 8 months ago

            Yes. There is nothing wrong when running sqoop example from Job Designer.

  6. Gerald Preston 4 months ago

    Hi, I am trying to start a StreamSets pipeline with 'sh test.sh', where the file contains "#!/bin/bash $SS/bin/streamsets cli -U http://localhost:18630 -u xxx -p xxx \--dpmURL https://cloud.streamsets.com manager start -n PipeLineID53846a76-74-b0491be4d42d8789". I get
    "DPM Login failed, status code '403': {"message":"Authentication failed"}". What is wrong?
    From the command line: "$SS/bin/streamsets cli -U http://localhost:18630 -u xxx -p xxx \--dpmURL https://cloud.streamsets.com manager start -nPipeLineID53846a76-74-b0491be4d42d8789" I get "java.net.ConnectException: Connection refused"

    It seems it is not being run as bash. I have no idea about the Java error.

    Any ideas?

    Thanks!

    • Hue Team 4 months ago

      Depending on your cluster, could you use the full hostname instead of localhost?

  7. Mahesh 3 weeks ago

    Hi,

    Can you guys please show a slightly more advanced example of reading a properties file inside the shell script and printing the variables?

    I am trying to do this but having real trouble, and the Oozie job always fails with an error from the shell command.

    Appreciate your help here.

    • Hue Team 3 weeks ago

      How about?

      #!/usr/bin/env bash

      sleep $1   # sleep for the number of seconds passed in via Arguments+
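      Or, as a minimal sketch of reading a properties file and printing its variables, assuming a hypothetical env.properties KEY=VALUE file shipped alongside the script via Files+ (it then lands in the action's working directory):

      #!/usr/bin/env bash

      # env.properties is a hypothetical KEY=VALUE file added via Files+,
      # so it ends up in the shell action's current working directory
      set -e
      source ./env.properties

      # Print a couple of (hypothetical) variables defined in the file
      echo "INPUT_DIR=${INPUT_DIR}"
      echo "OUTPUT_DIR=${OUTPUT_DIR}"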
