How to Automate Tableau Prep Flow

Cody Schneider · 8 min read

Building a data cleaning workflow in Tableau Prep is incredibly satisfying. You take messy, inconsistent data from multiple sources and transform it into a perfectly clean, analysis-ready dataset. But running that flow manually every week? That satisfaction quickly turns into a chore. This article will show you how to automate your Tableau Prep flows, so your data stays fresh without you lifting a finger.

Why Automate Your Tableau Prep Flows?

While running a flow on demand is fine for one-off projects, automation is where you unlock the real power of Tableau Prep. If you’re still manually clicking “Run Flow” every Monday morning, here’s why you'll want to stop.

  • Saves Time and Effort: The most obvious benefit is reclaiming your time. Automating repetitive data prep tasks means you and your team can focus on analyzing the data and finding insights, not curating it. No more setting a reminder to refresh data before a big meeting.
  • Ensures Consistency: Manual processes are prone to human error. Someone might forget to run the flow, use the wrong version of a file, or apply a filter incorrectly. Automation guarantees the exact same cleaning steps are performed in the exact same order, every single time.
  • Delivers Timely, Fresh Data: In today’s fast-paced environment, decisions based on last week's data can be costly. Automated flows can run overnight, a few times a day, or even hourly, ensuring that your Tableau dashboards and reports are always powered by the most up-to-date information available.
  • Reduces Dependencies: When a single person is responsible for manually running a data flow, their sick days or vacations can create a bottleneck. Automation makes the process independent of any one person, keeping the data flowing even when you’re out of the office.

What You’ll Need to Get Started

Before you can automate a flow, you need a few things in place. There are two primary methods for automating Tableau Prep, and your setup will determine which one you can use.

  1. A Completed Tableau Prep Flow: You’ll need a finished flow (.tfl or .tflx file) that connects to your desired inputs, performs all the cleaning steps, and specifies the output locations.
  2. One of Two Automation Environments:
     • Tableau Server or Tableau Cloud with the Data Management Add-on, which includes Tableau Prep Conductor, or
     • Tableau Prep Builder installed on a machine where you can schedule tasks, so you can use its bundled command-line utility.

We'll walk through both methods step by step.

Method 1: Automating with Tableau Prep Conductor

If your organization has the Data Management Add-on for Tableau Server or Tableau Cloud, this is the most straightforward way to automate your flows. The entire process is handled through a visual interface, complete with dashboards for monitoring performance and failures.

Step 1: Publish Your Flow

First, you need to publish your flow from Tableau Prep Builder to your server environment.

  1. In Tableau Prep Builder, open the flow you want to automate.
  2. Go to the Server menu at the top and select Publish Flow.
  3. If you are not already logged in, you will be prompted to sign into your Tableau Server or Tableau Cloud account.
  4. In the publish dialog, select a Project to save the flow in, give it a name, and add a helpful description.
  5. Important: For data sources that require credentials (like a SQL database), make sure to either embed the credentials in the connection or ensure the server is configured to handle them. This allows the flow to run without needing manual password entry.
  6. Click Publish.

Step 2: Schedule the Flow in Tableau Server/Cloud

Once your flow is published, you can set its schedule through the web interface.

  1. Log in to your Tableau Server or Tableau Cloud instance and navigate to the project where you published your flow.
  2. Click on the flow to open its overview page.
  3. You should see a Scheduled Tasks tab. Click on it.
  4. Click the New Task button. This will open the scheduling dialog.
  5. In the menu, you can select from a list of predefined schedules (e.g., Daily at Midnight, Weekly on Sunday). For more control, click Create a new schedule to define a custom cadence like "Every other Tuesday at 6 AM."
  6. Select the schedule you want, and click Create Task.

That’s it! Your flow is now scheduled to run automatically. You can monitor its success and runtime from the Run History tab, where Tableau will show you a log of every time the task ran, whether it succeeded, and any errors it encountered.
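Beyond the scheduling UI, sites with the Data Management Add-on can also trigger a published flow on demand through Tableau's REST API (the "Sign In" and "Run Flow Now" endpoints), which is handy for kicking off a run from another pipeline. The sketch below is a minimal illustration, not a full client: the server URL, API version, and IDs are placeholders, and it assumes you authenticate with a personal access token. Check which API version your server supports before using it.

```python
import json
import urllib.request

# Placeholders -- substitute your own server, API version, and flow details.
SERVER = "https://your-tableau-server.example.com"
API_VERSION = "3.19"

def signin_body(token_name, token_secret, site_content_url):
    """JSON body for signing in with a personal access token."""
    return {"credentials": {"personalAccessTokenName": token_name,
                            "personalAccessTokenSecret": token_secret,
                            "site": {"contentUrl": site_content_url}}}

def run_flow_url(server, api_version, site_id, flow_id):
    """URL of the Run Flow Now endpoint for a published flow."""
    return f"{server}/api/{api_version}/sites/{site_id}/flows/{flow_id}/run"

def trigger_flow(token_name, token_secret, site_content_url, flow_id):
    """Sign in, then ask the server to run the flow immediately."""
    signin_req = urllib.request.Request(
        f"{SERVER}/api/{API_VERSION}/auth/signin",
        data=json.dumps(signin_body(token_name, token_secret, site_content_url)).encode(),
        headers={"Content-Type": "application/json", "Accept": "application/json"})
    with urllib.request.urlopen(signin_req) as resp:
        creds = json.load(resp)["credentials"]
    run_req = urllib.request.Request(
        run_flow_url(SERVER, API_VERSION, creds["site"]["id"], flow_id),
        data=b"",  # POST with an empty body starts the run
        headers={"X-Tableau-Auth": creds["token"], "Accept": "application/json"})
    with urllib.request.urlopen(run_req) as resp:
        return json.load(resp)
```

Note that this complements the schedule rather than replacing it: the scheduled task keeps data fresh on a cadence, while an API call lets upstream systems refresh the flow as soon as new data lands.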

Method 2: Automating with the Command Line Interface (CLI)

If you don't have the Data Management Add-on, you can still automate your flows for free using the command-line utility that comes bundled with Tableau Prep Builder. This method is perfect for individuals or smaller teams and offers immense flexibility for integrating with other batch processes.

Step 1: Locate the Command Line Utility

The first step is finding the command-line script. Its location depends on your operating system and where you installed Tableau Prep Builder. On Windows, it’s typically found here:

C:\Program Files\Tableau\Tableau Prep Builder <version>\scripts\

The file is named tableau-prep-cli.bat.
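Because the folder name includes the version number, the path changes every time you upgrade Prep Builder, which silently breaks scheduled scripts that hardcode it. As a hedge against that, a small helper can locate the newest installed copy at run time. This is a sketch that assumes the default Windows install location shown above; adjust the pattern if you installed elsewhere.

```python
import glob
import re

def version_key(path):
    """Extract a sortable version tuple like (2023, 2) from an install path."""
    match = re.search(r"Tableau Prep Builder ([\d.]+)", path)
    return tuple(int(p) for p in match.group(1).rstrip(".").split(".")) if match else (0,)

def newest_prep_cli(pattern=r"C:\Program Files\Tableau\Tableau Prep Builder *\scripts\tableau-prep-cli.bat"):
    """Return the CLI path for the newest installed version, or None if not found."""
    candidates = glob.glob(pattern)
    return max(candidates, key=version_key) if candidates else None
```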

Step 2: Create a Secure Credentials File

To run a flow that connects to a database or a published data source, you need to provide credentials. Hardcoding passwords directly in a script is a major security risk. Instead, Tableau lets you use a separate JSON file.

Create a simple text file and save it as something like credentials.json. The structure will look like this:

{
  "databaseConnections": [{
    "hostname": "your_database_server_address",
    "port": "5432",
    "username": "your_db_username",
    "password": "your_db_password"
  }]
}

Be aware that this file stores your connection details in plain text, so save it in a restricted folder that only you and the service account running the task can access.
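A malformed credentials file is one of the most common reasons a scheduled run fails at 2 AM. A quick pre-flight check that validates the file's shape before the flow runs can surface the problem while you are still at your desk. This is an illustrative helper, not part of Tableau's tooling; the required keys match the JSON structure shown above.

```python
import json

# Keys Tableau expects in each databaseConnections entry (per the example above).
REQUIRED_KEYS = {"hostname", "port", "username", "password"}

def check_credentials_file(path):
    """Return a list of problems found in a credentials JSON file (empty = OK)."""
    problems = []
    with open(path, encoding="utf-8") as f:
        doc = json.load(f)  # raises ValueError on invalid JSON
    connections = doc.get("databaseConnections", [])
    if not connections:
        problems.append("no databaseConnections entries found")
    for i, conn in enumerate(connections):
        missing = REQUIRED_KEYS - conn.keys()
        if missing:
            problems.append(f"connection {i}: missing {sorted(missing)}")
    return problems
```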

Step 3: Build Your Command

Now, open a text editor and build the command that will execute your flow. The command has three parts: the path to the CLI script itself, the path to the credentials file (passed with -c), and the path to the flow file (passed with -t).

A typical command will look like this:

"C:\Program Files\Tableau\Tableau Prep Builder 2023.2\scripts\tableau-prep-cli.bat" -c "C:\TableauScripts\credentials.json" -t "C:\TableauScripts\MyFlow.tfl"

Test this command by opening Command Prompt, pasting it in, and hitting Enter. If it runs correctly, you’re ready to schedule it.
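If you would rather schedule a script than a raw command line, a thin wrapper that checks the CLI's exit code and logs each run gives you a record of failures. The paths below are the hypothetical examples from the command above; substitute your own.

```python
import subprocess
import sys
from datetime import datetime

# Hypothetical paths -- substitute your own install version and script folder.
PREP_CLI = r"C:\Program Files\Tableau\Tableau Prep Builder 2023.2\scripts\tableau-prep-cli.bat"
CREDENTIALS = r"C:\TableauScripts\credentials.json"
FLOW = r"C:\TableauScripts\MyFlow.tfl"

def run_flow(cli=PREP_CLI, credentials=CREDENTIALS, flow=FLOW):
    """Run the flow via the CLI and return its exit code (0 means success)."""
    result = subprocess.run([cli, "-c", credentials, "-t", flow],
                            capture_output=True, text=True)
    stamp = datetime.now().isoformat(timespec="seconds")
    if result.returncode != 0:
        print(f"{stamp} flow FAILED (exit {result.returncode}): {result.stderr}",
              file=sys.stderr)
    else:
        print(f"{stamp} flow succeeded")
    return result.returncode

if __name__ == "__main__":
    sys.exit(run_flow())
```

Pointing Task Scheduler (or cron) at this script instead of the .bat file directly means a failed run exits non-zero and leaves a timestamped trace you can grep later.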

Step 4: Use a Scheduler to Run the Command

Now you can use your operating system's built-in scheduler to run this command automatically. Here’s how to do it with Windows Task Scheduler:

  1. Open the Start Menu and search for "Task Scheduler."
  2. In the Actions pane on the right, click Create Basic Task...
  3. Give your task a clear name (e.g., "Refresh Weekly Sales Data - Tableau Prep") and a description. Click Next.
  4. Choose your Trigger, which is how often you want the task to run (e.g., Daily, Weekly, Monthly). Set the start time and date. Click Next.
  5. For the Action, select Start a program. Click Next.
  6. In the “Program/script” box, paste the full path to the executable file: "C:\Program Files\Tableau\Tableau Prep Builder 2023.2\scripts\tableau-prep-cli.bat"
  7. In the “Add arguments (optional)” box, paste the rest of your command (the credentials and flow paths): -c "C:\TableauScripts\credentials.json" -t "C:\TableauScripts\MyFlow.tfl"
  8. Click Next, review your settings, and click Finish.

Your task is now scheduled to run. You can go back into Task Scheduler at any time to test it, view its run history, or adjust the schedule.

Best Practices for Automated Flows

Whether you use Prep Conductor or the CLI, following a few best practices will make your automated workflows more reliable and easier to manage.

  • Keep It Simple: Try to break down very complex data prep work into logical, smaller flows. A single monolithic flow that takes hours to run can be difficult to debug if it fails.
  • Document Everything: Add descriptions to your flows in Tableau Server. For CLI methods, keep a readme file in your scripts folder explaining what each flow does, where it gets data, and where it sends it.
  • Set Up Notifications: Tableau Server can send email notifications when a flow fails. For the CLI method, you can add steps to your script to send a notification (e.g., using a PowerShell command) so you know immediately if something went wrong.
  • Use Consistent Naming Conventions: Adopt a clear naming system for your flows, schedules, output files, and data sources. Something like [DataSource]_[DataContent]_[Cadence]_Flow (e.g., Salesforce_Opportunity_Daily_Flow) makes it easy to understand what's happening at a glance.
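To make the notification bullet concrete for the CLI method, here is one way to email your team when a run exits non-zero, using Python's standard smtplib instead of PowerShell. The SMTP host and addresses are placeholders; swap in your mail server, and note that many servers will also require authentication or TLS, which this sketch omits.

```python
import smtplib
import subprocess
from email.message import EmailMessage

# Placeholder addresses and server -- adjust for your environment.
SMTP_HOST = "smtp.example.com"
ALERT_FROM = "etl-bot@example.com"
ALERT_TO = "data-team@example.com"

def build_alert(flow_name, exit_code, stderr_text):
    """Build the failure-notification email for a flow run."""
    msg = EmailMessage()
    msg["Subject"] = f"Tableau Prep flow failed: {flow_name} (exit {exit_code})"
    msg["From"] = ALERT_FROM
    msg["To"] = ALERT_TO
    msg.set_content(stderr_text or "No error output captured.")
    return msg

def run_with_alert(command, flow_name):
    """Run the CLI command; email the team if it exits non-zero."""
    result = subprocess.run(command, capture_output=True, text=True)
    if result.returncode != 0:
        with smtplib.SMTP(SMTP_HOST) as smtp:
            smtp.send_message(build_alert(flow_name, result.returncode, result.stderr))
    return result.returncode
```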

Final Thoughts

By automating your flows, you transform Tableau Prep from a manual data hygiene tool into a powerful, automated ETL engine that works for you in the background. Whether you choose the elegant, integrated solution of Prep Conductor or the flexible, free command-line interface, taking the time to set up automation will pay for itself with fresher data and hours of reclaimed time.

While automating data preparation is a huge step, many teams find the next bottleneck is turning that clean data into valuable reports. With solutions from Graphed, we can automate the entire reporting process for you. Instead of spending hours in a dashboard editor, you can connect your data sources and simply ask for the reports you need in plain English. Your charts appear instantly on real-time dashboards, letting you and your team go from raw data to actionable insights in seconds, not hours.
