Scheduling delivery of Performance Analytics data

Important: Best Practice

Since Performance Analytics is an on-demand system, it is important that your scheduled requests do not adversely impact other customers. When scheduling an export of user-level data:

  1. Filter appropriately so that the result set is no larger than necessary.
  2. Select a weekend day to limit your impact on other users.

If you are interested in integrating with other business systems, we highly recommend using our Real-Time Data Streaming integrations, a service that delivers engagement events in real time via the Data Streaming API or an Airship partner integration. These integrations will scale to your needs. See: Integrations.
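Real-time streaming APIs like this commonly deliver events as a long-lived HTTP response of newline-delimited JSON, one event per line. As a rough sketch of consuming such a stream (the event fields and keep-alive behavior here are assumptions for illustration, not documented values):

```python
import json
from typing import Iterator


def parse_event_stream(lines: Iterator[str]) -> Iterator[dict]:
    """Yield one event dict per non-empty newline-delimited JSON line.

    Works against any iterable of lines, e.g. the body of a long-lived
    streaming HTTP response read line by line.
    """
    for line in lines:
        line = line.strip()
        if not line:  # blank keep-alive lines are common in NDJSON streams
            continue
        yield json.loads(line)


# Hypothetical sample payload shaped like an engagement event stream:
sample = '{"type": "OPEN", "channel": "abc"}\n\n{"type": "TAG_CHANGE"}\n'
for event in parse_event_stream(iter(sample.splitlines())):
    print(event["type"])
```

Because the parser takes any iterable of lines, the same function works in tests against a string and in production against a streaming response object.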


Excessive Exports

Event data, such as messages, clicks, and tag events, is updated about every hour, depending on the volume of data. Channel state data, such as tags and attributes associated with a channel, is updated every night. Scheduled data exports that run more frequently than these update intervals are considered excessive and may be removed.
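The update cadences above imply a minimum useful delivery interval for each data type. A small sketch of that rule (the function and the data-type names are illustrative, mirroring the text rather than any documented API):

```python
# Minimum useful delivery intervals implied by the update cadences above:
# event data refreshes roughly hourly, channel state data nightly.
MIN_INTERVAL_HOURS = {"event": 1, "channel_state": 24}


def is_excessive(data_type: str, interval_hours: float) -> bool:
    """Return True if a schedule runs more often than the data is refreshed."""
    return interval_hours < MIN_INTERVAL_HOURS[data_type]


print(is_excessive("event", 0.5))         # twice-hourly event exports gain nothing
print(is_excessive("channel_state", 24))  # nightly matches the refresh cadence
```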

Scheduling Look Delivery

You can only schedule delivery of saved Looks. If you edited a Look and want to schedule delivery, first save it as a new Look, then complete these steps. Refer to Scheduling Options below for detail.

From a Look:

  1. Hover over the Look, then click and select Schedule.
  2. Enter a name for the schedule.
  3. Select and configure the destination:
    • Email — Your account’s email address is populated by default. For each additional recipient, enter the email address and click Add. You can also include a custom message that will appear above the data in the email.
    • Webhook — Enter the webhook URL.
    • Amazon S3 — Enter the S3 values for the bucket and optional path, access key, and secret key, then select a region.
    • SFTP — Enter the host address, e.g., sftp://example.com/home/ftpuser/, and the SFTP username and password, then select your preferred key exchange algorithm.
  4. Select a data format.
  5. Set the recurrence interval, or select Datagroup update and choose a datagroup.
  6. (Optional) Edit or configure Filters: Specify the date range and project name and add more filters.
  7. (Optional) Configure Advanced options. These options control the delivery logic based on data presence and changes, limits on output, visual output, paper size, and time zone.
  8. (Optional) Click Send Test to preview the delivery before saving. The test sends according to the current settings.
  9. Click Save All.

Scheduling Dashboard Delivery

Refer to Scheduling Options below for detail.

From a Dashboard:

  1. Click and select Schedule delivery.
  2. Configure the Settings tab:
    1. Enter a name for the schedule.
    2. Set the recurrence interval, or select Datagroup update and choose a datagroup.
    3. Select and configure the destination:
      • Email — Your account’s email address is populated by default. For each additional recipient, enter the email address and press Enter or click outside the field.
      • Webhook — Enter the webhook URL.
      • Amazon S3 — Enter the S3 values for the bucket and optional path, access key, and secret key, then select a region.
      • SFTP — Enter the host address, e.g., sftp://example.com/home/ftpuser/, and the SFTP username and password, then select your preferred key exchange algorithm.
    4. Select a data format. Available options depend on your selected destination and whether you are scheduling a Look or a Dashboard.
  3. (Optional) Edit or configure the Filters tab: Specify the date range and project name and add more filters.
  4. Configure the Advanced options tab. These options control the visual output, paper size, and time zone.
  5. Click Save now.

Scheduling Options

These options are available when configuring scheduling.

Destinations

  • Email — The data is delivered to the email addresses entered.

  • Webhook — Webhooks are a modern, increasingly common way to trigger exchanges between internet-based services. They generally require some technical or developer knowledge to use, but with a product like Zapier, webhooks can deliver Performance Analytics data to a wide range of locations. Only a webhook URL is required.

  • Amazon S3 — Amazon S3 buckets are a common way to store large amounts of data. Options include:

    • Limits — If you choose Results in Table, the row limits configured in the saved Look are obeyed. If you choose All Results, all rows of the query are returned, regardless of the saved Look settings and of the typical 5,000-row limit for Performance Analytics. This can be useful for retrieving very large datasets, but use caution to ensure the query is not too large for your database.
  • SFTP — The data is uploaded to your SFTP server.
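The Webhook destination above only needs a URL that accepts an HTTP POST. A minimal sketch of a receiver, assuming the delivery body is the exported data itself (the content type and payload shape are assumptions, so check what your schedule actually sends):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def handle_delivery(content_type: str, body: bytes) -> dict:
    """Decode a delivery body; JSON payloads are parsed, anything else kept as text."""
    if "json" in content_type:
        return {"kind": "json", "data": json.loads(body)}
    return {"kind": "raw", "data": body.decode("utf-8", errors="replace")}


class DeliveryHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        result = handle_delivery(
            self.headers.get("Content-Type", ""), self.rfile.read(length)
        )
        print("received", result["kind"], "delivery")
        self.send_response(200)  # acknowledge so the sender does not retry
        self.end_headers()


if __name__ == "__main__":
    # Point the schedule's webhook URL at this host and port.
    HTTPServer(("", 8080), DeliveryHandler).serve_forever()
```

Keeping the decoding logic in a standalone function makes it easy to test without starting the server.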

Formats

Dashboards support these output formats:

  • CSV
  • PDF
  • PNG visualization

For Looks, supported formats depend on the destination:

  • CSV
  • Text
  • XLSX
  • JSON — Simple
  • HTML
  • JSON — Detailed, Inline
  • JSON — Simple, Inline
  • JSON — Label
  • Data Table¹
  • Visualization¹

¹ Sent in the body of the email.
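A JSON export like the Simple format above typically serializes each result row as an object keyed by field name. A quick way to flatten such a delivery into CSV for downstream tools (the field names in the sample are invented for illustration):

```python
import csv
import io
import json


def json_simple_to_csv(payload: str) -> str:
    """Convert a JSON export shaped as a list of row objects into CSV text."""
    rows = json.loads(payload)
    if not rows:
        return ""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()


# Hypothetical two-row export:
payload = '[{"app": "demo", "sends": 120}, {"app": "demo2", "sends": 80}]'
print(json_simple_to_csv(payload))
```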