Create Dataflows with Clicks, Not Code
To use the editor, head over to the data manager and locate your dataflow on the Dataflows & Recipes tab. Open the actions menu, and click Edit.
If you’re creating a new dataflow, you’re taken to the editor automatically.
The dataflow editor visually displays the dataflow you’re editing.
The canvas (1) shows the individual nodes in your dataflow and the links between them. If you can’t find a node, enter the node name in the search box (2) to focus on that node. The node palette (3) has buttons for each type of node. Click one to add it to the canvas. Use the buttons at the top of the editor (4) to work with the dataflow JSON. Preview and download the underlying JSON or upload an existing JSON file to work with it in the editor.
The dataflow in this example extracts data from the Salesforce Opportunity, Account, and User objects, joins it together, and creates a registered dataset.
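Behind the canvas, a dataflow is defined as plain JSON: each node is a top-level key whose value names an action and its parameters. As a rough sketch of what the definition for this example might contain (node names and field lists here are illustrative, not taken from the actual example), using the standard sfdcDigest, augment, and sfdcRegister actions:

```json
{
  "Extract_Opportunities": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Opportunity",
      "fields": [
        { "name": "Id" },
        { "name": "Name" },
        { "name": "Amount" },
        { "name": "AccountId" },
        { "name": "OwnerId" }
      ]
    }
  },
  "Extract_Accounts": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Account",
      "fields": [ { "name": "Id" }, { "name": "Name" } ]
    }
  },
  "Extract_Users": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "User",
      "fields": [ { "name": "Id" }, { "name": "Name" } ]
    }
  },
  "Join_Account": {
    "action": "augment",
    "parameters": {
      "left": "Extract_Opportunities",
      "left_key": [ "AccountId" ],
      "right": "Extract_Accounts",
      "right_key": [ "Id" ],
      "relationship": "Account",
      "right_select": [ "Name" ]
    }
  },
  "Join_Owner": {
    "action": "augment",
    "parameters": {
      "left": "Join_Account",
      "left_key": [ "OwnerId" ],
      "right": "Extract_Users",
      "right_key": [ "Id" ],
      "relationship": "Owner",
      "right_select": [ "Name" ]
    }
  },
  "Register_Opportunities": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "OppsAccountsUsers",
      "name": "Opportunities with Accounts and Users",
      "source": "Join_Owner"
    }
  }
}
```

Each node's parameters reference other nodes by key, which is what draws the links you see on the canvas.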
Here Are Some Dataflow Essentials to Get You Started
Use a dataflow to extract data into Wave, transform it, and load it into a dataset. Each step in this process is performed by a node, with data flowing from one node to the next through links. Each node performs one of three main functions.
- Extract nodes bring in data from a Salesforce object, a replicated object, or an existing dataset. Start your dataflow with one or more of these. These nodes output data to other nodes. In the editor, look for the output arrow on the right of these nodes.
- Prepare nodes transform the data. For example, you can join data from different nodes, add calculated fields, and filter rows. These nodes receive data from and output data to other nodes. Look for the input and output arrows at each end of these nodes.
- The register node creates a dataset with the resulting data and makes it available for use. End your dataflow with one of these. This node receives data from one other node only. Look for the input arrow on the left of this node.
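In the JSON definition, a node's role is determined by its action. A minimal sketch showing one node of each kind (node names, fields, and the filter value are illustrative), using the sfdcDigest, filter, and sfdcRegister actions:

```json
{
  "Extract_Opportunities": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Opportunity",
      "fields": [
        { "name": "Id" },
        { "name": "Amount" },
        { "name": "StageName" }
      ]
    }
  },
  "Filter_Won": {
    "action": "filter",
    "parameters": {
      "source": "Extract_Opportunities",
      "filter": "StageName:EQ:Closed Won"
    }
  },
  "Register_WonOpps": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "WonOpportunities",
      "name": "Won Opportunities",
      "source": "Filter_Won"
    }
  }
}
```

The extract node has no source parameter, the prepare node names a source and a destination follows it, and the register node names exactly one source — matching the input and output arrows the editor shows.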
Take a Walk Through Editing a Dataflow
Imagine you’re the Wave admin at Blue Sky Solar, a company selling and installing solar panels in California and Arizona. You’re using Wave connectors to replicate opportunity data from your Arizona org, and you want to merge this with your local California data. These are the steps you take.
- You start editing your dataflow, and the first thing you see is the new visual view. Here, your dataflow is already extracting sales data from your local California org.
- In the node palette, click the digest node button. This node extracts data from replicated objects.
- Give the node a unique name and click Continue.
- Complete the node attributes and click Save.
- Add an append node in the same way. The append node has a Sources attribute where you select the input nodes. The nodes are connected on the canvas.
- Add a register node, specifying the append node as the source.
- Stand back and marvel at how quickly you edited your dataflow to extract and append external data, and register the result as a new dataset. And no code!
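Under the hood, the editor writes each of these steps into the dataflow JSON for you. A sketch of what the finished definition might contain (node names, fields, and the connection name are illustrative assumptions), using the sfdcDigest, digest, append, and sfdcRegister actions:

```json
{
  "Extract_CA_Opportunities": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Opportunity",
      "fields": [ { "name": "Id" }, { "name": "Name" }, { "name": "Amount" } ]
    }
  },
  "Digest_AZ_Opportunities": {
    "action": "digest",
    "parameters": {
      "connectionName": "AZ_Org",
      "object": "Opportunity",
      "fields": [ { "name": "Id" }, { "name": "Name" }, { "name": "Amount" } ]
    }
  },
  "Append_All_Opportunities": {
    "action": "append",
    "parameters": {
      "sources": [ "Extract_CA_Opportunities", "Digest_AZ_Opportunities" ]
    }
  },
  "Register_All_Opportunities": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "AllOpportunities",
      "name": "All Opportunities",
      "source": "Append_All_Opportunities"
    }
  }
}
```

Note that the append node's sources must share a compatible set of fields, which is why both extract nodes pull the same columns here.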
Don’t Forget to Update the Dataflow
When you’re done editing, click Update Dataflow to update the definition file with your changes. Wave validates your dataflow to ensure that you provided all the required attribute values; if it finds errors, you can’t continue until you correct them.
Updating the dataflow doesn’t run it. You see the results only the next time the dataflow runs.