Warehouse Sync Troubleshooting Guide

When setting up Warehouse Sync, it is possible to encounter problems with one or more components of your configuration:

  • Initial data warehouse connectivity
  • SQL query syntax
  • Pipeline issues related to security or how mParticle parses your database
  • Problems with mapping data from your warehouse to user profiles in mParticle

The sections below provide troubleshooting steps for each of these problem categories. If you are still encountering issues after following the appropriate steps, contact mParticle support.

Data warehouse connectivity

Connectivity issues are often the result of an incomplete or incorrect first-time configuration.

Common symptoms

  • Receiving an error code when using the API to create a connection to a warehouse

Before troubleshooting, verify the following:

  1. Your mParticle account representative has enabled Warehouse Sync for the account you are using.
  2. You can successfully connect to your data warehouse outside of mParticle, using the same username and password.
  3. You followed the set-up steps specific to your data warehouse. Simple mistakes or typos made during this phase may prevent Warehouse Sync from working.
  4. All of the relevant mParticle IP addresses are whitelisted.

Troubleshooting steps

  • Validate all the data warehouse parameters in the POST {baseURI}/connections API call. Are they correct for the data warehouse instance you are trying to connect to?
  • Compare your actual data location pod, organization, account, and workspace ID values with the values you are supplying in your API calls.
  • From another application, connect to your warehouse using the username and password you created or specified. Ensure those credentials are permitted to access the pertinent datasets, tables, compute warehouses, and storage integrations (see the example below).
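
For example, if your warehouse is Snowflake, you can confirm from another SQL client that the user and role you supplied to mParticle can reach the objects referenced in your configuration. The warehouse, database, table, and role names below are illustrative; substitute your own values.

-- Run as the user/role supplied to Warehouse Sync (illustrative names):
USE ROLE MPARTICLE_ROLE;
USE WAREHOUSE MPARTICLE_WH;
USE DATABASE ANALYTICS_DB;

-- Confirm the role can read the table referenced by your data model:
SELECT COUNT(*) FROM ANALYTICS_DB.PUBLIC.USER_EVENTS;

-- Review the grants on the role, including the storage integration:
SHOW GRANTS TO ROLE MPARTICLE_ROLE;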

SQL syntax

Invalid or incompatible SQL syntax in your data model will cause errors and prevent the sync from succeeding.

Common symptoms

  • The report API returns an error message indicating there is a syntax issue in the SQL statement provided in the data model.

Before troubleshooting, run the SQL query outside of mParticle. If it doesn’t run successfully or return the expected results, the issue is likely in your query, independent of Warehouse Sync.

Troubleshooting steps

  • If you receive an error after running the SQL query, remove the part of the query highlighted in the error message.
  • Verify your SQL syntax. While most data warehouses support common SQL syntax, it is possible to encounter exceptions in the SQL extensions for your warehouse. For example:

    • Snowflake doesn't match case-sensitive, explicit identifiers to case-insensitive statements. For example, the statement SELECT current_timestamp AS "tstamp" FROM tableXYZ ... in your SQL query will fail if iterator_field is tstamp in your data model.
  • Workaround 1: remove the explicit identifier quotes:

    • SELECT current_timestamp AS tstamp FROM tableXYZ ...
    • "iterator_field": "tstamp"
  • Workaround 2: force upper case:

    • SELECT current_timestamp AS "TSTAMP" FROM tableXYZ ...
    • "iterator_field": "TSTAMP"
  • If the error is related to the timestamp field in the query, ensure that:

    • You specified the correct column name and data type in the data model configuration.
    • You are not using dynamically generated timestamp values. Each data warehouse and environment may treat these values differently in terms of data type.

Pipeline issues

Pipeline issues are typically caused by security problems, the timestamp field provided in the data model, or other factors in your environment.

Common symptoms

  • The report API returns an error message. For example:

    • Error assuming the AWS_ROLE. Please verify the role and externalId are configured correctly in your AWS policy.
    • Insufficient permission to extract records
    • Insufficient privileges to operate on integration 'MP_US2_5000170_244_S3'
    • Validation Error: Missing required columns scanned_timestamp_ms in source query
    • SQL compilation error: error line 1 at position 36 invalid identifier 'TIMESTAMP'
    • The dag's data_interval_start is more than 7 days in the past. Found 14 days to back-fill in a ScheduleInterval.Hourly schedule.
    • Too many rows in the source query. Found 100000000 rows
  • The report API returns "successful_records": 0. For example:

    {
      "pipeline_id": "string",
      "status": "idle",
      "connection_status": "healthy",
      "data_model_status": "valid",
      "latest_pipeline_run": {
        "id": 0,
        "pipeline_id": "string",
        "type": "scheduled",
        "status": "success",
        "errors": [
          {
            "message": "string"
          }
        ],
        "logical_date": "2023-10-25T18:11:57.321Z",
        "started_on": "2023-10-25T18:11:57.321Z",
        "ended_on": "2023-10-25T18:11:57.321Z",
        "range_start": "2023-10-25T18:11:57.321Z",
        "range_end": "2023-10-25T18:11:57.321Z",
        "successful_records": 0,
        "failed_records": 0
      }
    }

Before troubleshooting, verify the following:

  1. You followed the configuration steps specific to your data warehouse. Any small mistake or typo will prevent Warehouse Sync from working.
  2. You specified the correct data type for the timestamps of the rows you are syncing.
  3. The timestamps of the database rows you are syncing are not set in the future (see the example query after this list).
  4. You are not exceeding the Warehouse Sync API limits.
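
A quick way to confirm your timestamps are not set in the future is to run a check like the following directly against your warehouse. The table and column names are illustrative; substitute the table and iterator column from your own data model.

-- Count rows whose iterator timestamp is in the future (illustrative names):
SELECT COUNT(*)
FROM ANALYTICS_DB.PUBLIC.USER_EVENTS
WHERE UPDATED_AT > CURRENT_TIMESTAMP();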

Troubleshooting steps

  • If you are dynamically generating timestamp values, try using a literal value in the table or view you are querying, as sketched below.
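
For example, rather than computing the iterator at query time, select a timestamp column that is physically stored with each row (or materialize one in a view) so every run reads a stable, literal value. The table and column names below are illustrative; the column you select must match the iterator_field configured for your pipeline.

-- Problematic: the iterator is generated at query time, so every run produces a new value:
--   SELECT current_timestamp AS updated_at, ... FROM USER_EVENTS
-- Preferred: select a timestamp column stored with each row:
SELECT EVENT_ID, EMAIL, UPDATED_AT
FROM ANALYTICS_DB.PUBLIC.USER_EVENTS
WHERE UPDATED_AT IS NOT NULL;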

Data import or mapping issues

Import and mapping problems usually result from incorrectly mapping data rows in the warehouse to user profiles or attributes in mParticle.

Common symptoms

  • mParticle created new profiles for users instead of updating existing profiles
  • mParticle added new attributes to a profile instead of updating existing attributes

Before troubleshooting, verify the following:

  1. The column names in your SQL query match the attribute names on your user profiles in mParticle.
  2. The column names used for identities match the reserved mParticle user or device identity column names (see the example query after this list).
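
For example, if your warehouse columns don't already use the reserved names, you can alias them in the SQL query of your data model. The source column names below are illustrative, and customer_id and email stand in for whichever reserved identity columns apply to your account; consult the reserved mParticle user and device identity column names for the full list.

-- Alias warehouse columns to mParticle's reserved identity and attribute names (illustrative):
SELECT
  external_user_id AS customer_id,  -- reserved user identity column (example)
  email_address    AS email,        -- reserved user identity column (example)
  lifetime_value   AS ltv,          -- custom user attribute (example)
  updated_at
FROM analytics_db.public.user_profiles;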

Troubleshooting steps

  1. Correlate the row to the event batch according to your profile strategy.
  2. Provide mParticle support or your account representative with the event batch JSON object from your mParticle Live Stream, or the MPID and batch ID for the event, as well as a CSV of the source data.

    • You can run the query manually against your data warehouse to simulate what mParticle extracted.
    • mParticle support can then confirm that the data lines up with the expected behavior based on your data model.

Specific table schema changes in Google BigQuery

If a table schema changes and validation errors still occur, you may need to wait 24 hours for the BigQuery cache to clear and reset before trying again.

Synchronizing a specific interval of data again for incremental pipelines

The incremental sync mode uses the specified iterator field to track which data has already been synchronized, in a monotonically increasing fashion. If you need to synchronize a specific window of time again, you can create a new full, once pipeline (a pipeline with a full sync mode and a once schedule) and use the from and until parameters to capture the desired data interval. You may reuse your existing connection, field transformation, and data model.

First, use the Get a Specific Pipeline endpoint to retrieve the details of the existing pipeline whose data you want to resynchronize. You may want to reuse the following parameters:

  • pipeline_type
  • connection_id
  • field_transformation_id
  • data_model_id
  • partner_feed_id
  • iterator_field
  • iterator_data_type
  • environment
  • data_plan_id
  • data_plan_version

Then, use the Create a Pipeline endpoint to create your new pipeline. In this example, we create a new full, once pipeline to retrieve data from 2022-07-01T16:00:00Z to 2022-08-01T16:00:00Z.

{
  "id": "sync-specific-time-window",
  "name": "Sync Specific Time Window",
  "pipeline_type": "events",
  "connection_id": "existing-connection-id",
  "field_transformation_id": "existing-field-transformation-id",
  "data_model_id": "existing-data-model-id",
  "partner_feed_id": 1234,
  "state": "active",
  "sync_mode": {
    "type": "full",
    "iterator_field": "updated_at",
    "iterator_data_type": "timestamp",
    "from": "2022-07-01T16:00:00Z",
    "until": "2022-08-01T16:00:00Z"
  },
  "schedule": {
    "type": "once"
  },
  "environment": "development",
  "data_plan_id": "example-data-plan-id",
  "data_plan_version": 2
}

    Last Updated: November 20, 2024