Export and import your Oozie workflows

11 March 2015 in Scheduling - 2 minute read

August 7th 2015 update: this post is now deprecated as of Hue 3.9: https://gethue.com/exporting-and-importing-oozie-workflows/


There is no handy way to import and export your Oozie workflows until Hue 4 and HUE-1660 land, but here is a manual workaround, possible since Hue 3.8 / CDH 5.4 and its new Oozie Editor.

The previous methods were very error prone, as they required inserting data into multiple tables at the same time. Now there is only one record per workflow.


Export all workflows

./build/env/bin/hue dumpdata desktop.Document2 --indent 2 --natural > data.json


Export specific workflows

20000013 is the id you can see in the URL of the dashboard.

./build/env/bin/hue dumpdata desktop.Document2 --indent 2 --pks=20000013 --natural > data.json

You can specify more than one id, comma-separated:

./build/env/bin/hue dumpdata desktop.Document2 --indent 2 --pks=20000013,20000014 --natural > data.json


Load the workflows


./build/env/bin/hue loaddata data.json


Refresh the documents

Until we hit Hue 4, this step is required in order to make the imported documents appear:

./build/env/bin/hue sync_documents


And that’s it, the dashboards with the same IDs will be refreshed with the imported ones!



If a document with the same id already exists in the database, just set its id to null in data.json and it will be inserted as a new document.

vim data.json

then change

"pk": 16,

to

"pk": null,



If using CM, export this variable in order to point to the correct database:

export HUE_CONF_DIR=/var/run/cloudera-scm-agent/process/<id>-hue-HUE_SERVER

Where <id> is the most recent ID in that process directory for hue-HUE_SERVER.

Or even quicker:

export HUE_CONF_DIR="/var/run/cloudera-scm-agent/process/`ls -alrt /var/run/cloudera-scm-agent/process | grep HUE | tail -1 | awk '{print $9}'`"


Have any questions? Feel free to contact us on hue-user or @gethue!

