BigQuery (Snowplow Schema)

Prerequisites

You must grant ‘bigquery.dataViewer’ access to Analytics’ service account for your BigQuery project. The following steps require administrative access to the BigQuery console and your BigQuery database.

For this self-service integration, we also have some data requirements:

  1. All of your events must be unified into one table rather than split across separate tables per event type (see the sketch below).
  2. Aliasing supports at most one authenticated ID and one unauthenticated ID.
  3. The event timestamp must be in UTC.
  4. All joins must be done ahead of time.
  5. Date-sharded tables (BigQuery tables whose names end with a date suffix such as _MMDDYYYY) are not currently supported.

We can still support integrations that do not meet the above requirements, but you must contact a product specialist. If additional enrichments are required, such as joining with user property tables or deriving custom user_ids, please contact us.
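For requirement 1, if your events currently live in separate per-event tables, one way to unify them is a UNION ALL view. Below is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, table, and column names are all placeholders, not values from your schema:

```python
# Minimal sketch: unify separate per-event tables into a single events view.
# All names below (project, dataset, tables, columns) are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

view = bigquery.Table("my-gcp-project.analytics.unified_events")
view.view_query = """
    SELECT 'signup' AS event, user_id, event_tstamp
    FROM `my-gcp-project.analytics.signup_events`
    UNION ALL
    SELECT 'purchase' AS event, user_id, event_tstamp
    FROM `my-gcp-project.analytics.purchase_events`
"""
client.create_table(view)  # registers the view in the dataset
```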

Instructions

Adding a Data Source in Analytics

  1. In Analytics, click the gear icon and select Project Settings.
  2. Select the Data Sources tab.
  3. Select New Data Source.
  4. Select Connect via Data Warehouse or Lake.
  5. Select BigQuery as your data connection and Snowplow as the connection schema, then click Connect.
  6. You should see the Google + Snowplow Overview screen. Click Next.

Connection Information

  1. Open the BigQuery console on Google Cloud Platform and select a project.
  2. Enter the GCP Project ID containing your Snowplow data.
  3. Enter the Dataset Name.
  4. Enter the Table Name, then click Next in Analytics.
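As a quick sanity check (not part of the official flow), you can confirm that the values you entered resolve to a readable table using the google-cloud-bigquery Python client; the names below are placeholders:

```python
# Hypothetical sanity check that the Project ID, Dataset Name, and Table Name
# entered above point at a readable BigQuery table.
from google.cloud import bigquery

project_id = "my-gcp-project"  # GCP Project ID from step 2
dataset_name = "snowplow"      # Dataset Name from step 3
table_name = "events"          # Table Name from step 4

client = bigquery.Client(project=project_id)
table = client.get_table(f"{project_id}.{dataset_name}.{table_name}")
print(f"{table.full_table_id}: {table.num_rows} rows, {len(table.schema)} columns")
```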

Grant Permissions

This integration works by sharing the dataset with Analytics’ service account and requires only read-only access to that dataset. Analytics takes on the cost of the query and caches the data in its proprietary analytics engine.

  1. Within the BigQuery Console, select your Project and your dataset from the previous section.
  2. Click on Share Dataset.

  3. In the Dataset Permissions panel, in the Add Members field, enter the service account below:

integrations@indicative-988.iam.gserviceaccount.com

  4. In the Select a Role dropdown, select BigQuery Data Viewer and click Add.
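If you prefer to grant access programmatically rather than through the console, here is a sketch with the Python client. Note the assumptions: dataset-level "READER" access is the classic equivalent of the BigQuery Data Viewer role, and the project and dataset names are placeholders:

```python
# Hypothetical programmatic equivalent of the console steps above: grant the
# Analytics service account read-only access to the dataset.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")       # placeholder project
dataset = client.get_dataset("my-gcp-project.snowplow")  # placeholder dataset

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",  # dataset-level equivalent of BigQuery Data Viewer
        entity_type="userByEmail",
        entity_id="integrations@indicative-988.iam.gserviceaccount.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```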

Data Loading

  1. Start Date
    Select the date from which Analytics should begin loading your data.

    If your data history exceeds 1 billion events, a Solutions Engineer will contact you to assist with the integration.

  2. Schedule Interval
    Select how frequently new data is made available in Analytics.
  3. Processing Delay
    Select when we should start extracting your data, in UTC. Choose a time by which all of the previous day’s data is fully available in your table for extraction.
  4. Load Timestamp Field
    Select the field used to identify new data. We recommend a timestamp that denotes when the event was published to the table, not the event timestamp itself, so that late-arriving data is still collected. This does not impact your analyses, since queries reference the event timestamp. For Snowplow, we recommend etl_tstamp. If you load data every 3, 6, or 12 hours, select a load timestamp field with at least hourly precision (not a date-only field).

For example, suppose an event with an event timestamp of 12/1 is not published to the table until 12/3. A daily extract keyed on the event timestamp would miss it, because the 12/3 extract only looks for events that occurred on 12/3. Keying the extract on the publishing timestamp instead lets us pick up all new data added to the table each night.
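To make the distinction concrete, here is an illustrative sketch of a daily extract keyed on the publishing timestamp; the table path is a placeholder, and etl_tstamp is the Snowplow field recommended above:

```python
# Illustrative daily extract keyed on etl_tstamp (publishing time) rather than
# the event timestamp, so late-arriving events are still collected.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project
sql = """
    SELECT *
    FROM `my-gcp-project.snowplow.events`
    WHERE etl_tstamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
    -- An event with derived_tstamp on 12/1 but etl_tstamp on 12/3 is still
    -- captured by the 12/3 extract, because we key on when it was published.
"""
rows = client.query(sql).result()
print(f"{rows.total_rows} new rows since yesterday")
```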

Event Modeling

  1. In the Structured Event Name section, select the field that should be used to derive Analytics event names. Most customers use the se_action field, but it depends entirely on your implementation. We look at this field’s value first to use as the event name in Analytics; if it is null, we fall back to the event_name field, and if that is also null, we use the event field (see the sketch after this list). If you are not using Snowplow structured events, select none.
  2. For Timestamp, select the field that represents the time the event was performed. Analytics will use this field to run its queries. If unsure, leave it as derived_tstamp.
  • collector_tstamp - Timestamp for the event recorded by the collector.
  • dvce_created_tstamp - Timestamp for the event recorded on the client device.
  • dvce_sent_tstamp - When the event was actually sent by the client device.
  • etl_tstamp - Timestamp for when the event was validated and enriched. Note: the name is historical; the event is not loaded at this point (that happens further downstream).
  • derived_tstamp - Timestamp making allowance for inaccurate device clocks.
  • true_tstamp - User-set “true timestamp” for the event.
  3. For Vendor Name, input the Snowplow vendor names used so we can simplify your event property names.
  4. Click Next.
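The event-name fallback in step 1 behaves roughly like a SQL COALESCE. Here is a sketch, assuming a standard Snowplow atomic events table (the table path is a placeholder):

```python
# Rough equivalent of the event-name derivation: se_action first, then
# event_name, then event. Field names follow the Snowplow atomic schema.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project
sql = """
    SELECT COALESCE(se_action, event_name, event) AS analytics_event_name,
           COUNT(*) AS event_count
    FROM `my-gcp-project.snowplow.events`
    GROUP BY analytics_event_name
    ORDER BY event_count DESC
"""
for row in client.query(sql).result():
    print(row.analytics_event_name, row.event_count)
```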

After this step, we will perform a few checks on your data with the model that you provided. The checks are:

  • Valid event field (Do at least 80% of your records have a value for the event field?)
  • Valid timestamp field (Do at least 80% of your records have a value for the timestamp field?)
  • Total number of unique events. We recommend 20-300 unique events; the hard limit is 2,000.
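You can approximate these checks yourself before submitting. A sketch under the same placeholder-table assumption, using the thresholds listed above:

```python
# Hedged approximation of the pre-flight checks: 80% non-null event and
# timestamp fields, and a unique event count within the recommended range.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project
sql = """
    SELECT
      COUNTIF(event IS NOT NULL) / COUNT(*) AS pct_valid_event,
      COUNTIF(derived_tstamp IS NOT NULL) / COUNT(*) AS pct_valid_timestamp,
      APPROX_COUNT_DISTINCT(COALESCE(se_action, event_name, event)) AS unique_events
    FROM `my-gcp-project.snowplow.events`
"""
checks = next(iter(client.query(sql).result()))
assert checks.pct_valid_event >= 0.8, "under 80% of records have an event value"
assert checks.pct_valid_timestamp >= 0.8, "under 80% of records have a timestamp"
assert checks.unique_events <= 2000, "more than 2,000 unique event names"
```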

User Modeling

After some basic checks, we can define your users within your data. For more information on User Identification (Aliasing), please refer to this article.

  1. If you choose to enable Aliasing, click Enabled, then configure:
  • Type - Select the Snowplow field type:

    • Atomic - Select this option if the anonymous ID field is an atomic field.

      • Field Name - Select the field that should be used to identify anonymous users.
    • Context - Choose this option if your anonymous ID is contained within the Contexts field.

      • Field Name - Select the context field that contains your anonymous ID.
  2. If you choose to disable Aliasing, click Disabled, then configure the same Type and Field Name options as above.

If you use a non-null placeholder value to represent null UserID values, click the Show Advanced button and enter those placeholder values in the field provided.
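For example, if your pipeline writes 'anonymous' or '-1' instead of a true null, you can check how common those placeholders are before listing them under Show Advanced. The placeholder values and table path below are hypothetical:

```python
# Hypothetical check for placeholder "null" user ID values. 'anonymous',
# '-1', and '' are example placeholders, not values from your schema.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project
sql = """
    SELECT user_id, COUNT(*) AS records
    FROM `my-gcp-project.snowplow.events`
    WHERE user_id IN ('anonymous', '-1', '')
    GROUP BY user_id
"""
for row in client.query(sql).result():
    print(row.user_id, row.records)
```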

After this step, we will perform additional checks on your data with the user model that you provided. The checks are:

  • User Hotspot (Is there a single UserID that represents over 40% of your records?)
  • Anti-Hotspot (Does your data have too many unique userIDs? A good events table contains multiple events per user)
  • Aliasing
    • Too many unauthenticated IDs for a single authenticated userID
    • Too many authenticated IDs for a single anonymous ID
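The hotspot check, for instance, amounts to asking whether any single user ID dominates the table. A sketch under the same placeholder-table assumption:

```python
# Rough version of the User Hotspot check: does one user_id account for more
# than 40% of all records?
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project
sql = """
    SELECT user_id, COUNT(*) / SUM(COUNT(*)) OVER () AS share
    FROM `my-gcp-project.snowplow.events`
    WHERE user_id IS NOT NULL
    GROUP BY user_id
    ORDER BY share DESC
    LIMIT 1
"""
top = next(iter(client.query(sql).result()))
if top.share > 0.4:
    print(f"Hotspot: {top.user_id} accounts for {top.share:.0%} of records")
```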

Assisted Modeling

You should see a summary of your data, based on the last 7 days, in three main blocks.

You should only be concerned if the margin of error is significant; if so, reach out to a product specialist.

  1. Events Summary
    You should see a daily breakdown of your Total Event Count and the number of Unique Event Names. If there are certain events to exclude, click the Exclude checkbox for those events.

If you would like to exclude any events by regex or property value, please contact a product specialist.

If this section looks good, click Next.

  2. Properties Summary
    Here you will see the number of Unique Event Property Names. If there are certain properties to exclude, click the Exclude checkbox for those properties.

If you require more advanced configurations such as parsing out JSON fields, creating derived properties, or excluding properties based on regex, please contact a product specialist.

If this section looks good, click Next.

  3. Users Summary
    This section lists the number of unique users seen. If the numbers do not look correct, go back to the User Modeling section and confirm that the correct ID was chosen. Note that the counts may not exactly match what is loaded into Analytics, due to aliasing and other event modeling configurations.

If this section looks good, click Next.

Waiting For Data

If you see this screen, you’re all done! You should see your data in Analytics within 48-72 hours and will be notified by email.


    Last Updated: December 20, 2024