mParticle Command Line Interface

Overview

The mParticle Command Line Interface (CLI) can be used to communicate with various mParticle services and functions through simple terminal commands.

Through the CLI, an engineer can interface with many of mParticle's services without crafting requests manually (for example, via cURL or Postman). These commands can also be integrated into Continuous Integration/Continuous Deployment (CI/CD) systems.

Installation

The CLI is distributed as an npm package. Install it globally, then run mp.

$ npm install -g @mparticle/cli
$ mp [COMMAND]
running command...
$ mp (-v|--version|version)
@mparticle/cli/1.X.X darwin-x64 node-v10.XX.X
$ mp --help [COMMAND]
USAGE
  $ mp COMMAND

Verifying installation

To verify your installation and version, run mp --version:

$ mp --version
@mparticle/cli/1.X.X darwin-x64 node-v10.XX.X

Getting Started

Simply use mp help to view a list of the available commands.

$ mp help
mParticle Command Line Interface

VERSION
  @mparticle/cli/1.X.X darwin-x64 node-v10.XX.X

USAGE
  $ mp [COMMAND]

COMMANDS
  autocomplete  display autocomplete installation instructions
  help          display help for mp
  planning      Manages Data Planning

Setting up autocomplete

As a convenience, the CLI provides an autocomplete feature: type part of a command, then press <TAB> to complete it.

Simply type mp autocomplete for instructions on configuring this feature.
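
For example, running the command prints setup instructions for your shell:

$ mp autocomplete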

Staying up to date

Simply use npm install -g @mparticle/cli to upgrade to the latest version.

Configuration

To run CLI commands, you pass in flags such as authentication credentials or record identifiers. Some of these parameters can be added to an optional configuration file, mp.config.json, which can be shared between commands or other mParticle applications.

The CLI automatically searches the current working directory for a valid JSON file named mp.config.json.

Alternatively, a configuration file can be passed in with the --config=/path/to/config flag.

For example, if you maintain configurations for multiple projects, you could keep them in a central location and pass a relative or absolute path to the CLI:

$> mp planning:data-plan-versions:fetch --config=~/.myconfigs/custom.config.json

We recommend keeping a single mp.config.json file at the root of your project and always running the CLI from the root. If you are using our data planning linters, you must name the file mp.config.json and keep it at the project root.

Example mp.config.json file

{
  "global": {
    "workspaceId": "XXXXX",
    "clientId": "XXXXXX",
    "clientSecret": "XXXXXXXXX"
  },
  "planningConfig": {
    "dataPlanVersionFile": "./path/to/dataPlanVersionFile.json"
  }
}

global

This section contains settings pertaining to your account credentials and application.

  • workspaceId: The workspace identifier for your team’s workspace
  • clientId: A unique Client Identification string provided by your Customer Success Manager
  • clientSecret: A secret key provided by your Customer Success Manager

We recommend always including these three credentials in your configuration, as they are used by other Platform API services, such as Data Planning.

planningConfig

This section contains configurations pertaining to your project's Data Master resources, such as data plans and data plan versions. planningConfig is required if you use our data plan linting tools. Note that the JSON you download from the UI under Data Master > Plans is a specific data plan version. An example configuration follows the list below.

  • dataPlanVersionFile: A relative or absolute path to a file containing your desired data plan version (used in place of dataPlanFile and versionNumber)
  • dataPlanId: The ID of your current data plan
  • dataPlanFile: A relative or absolute path to your data plan file (must be used with versionNumber below)
  • versionNumber: The current version number of your data plan (must be used with dataPlanFile)
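
For example, a minimal sketch that combines the global credentials with a planningConfig pointing at a local data plan file (IDs, paths, and the version number are illustrative):

{
  "global": {
    "workspaceId": "XXXXX",
    "clientId": "XXXXXX",
    "clientSecret": "XXXXXXXXX"
  },
  "planningConfig": {
    "dataPlanId": "my_data_plan",
    "dataPlanFile": "./path/to/dataPlanFile.json",
    "versionNumber": 1
  }
}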

Workflow

At its core, the CLI exposes services in a manner consistent with our REST APIs. Each command offers its own set of subcommands, as well as arguments and flags.

The CLI also provides universal command flags for global functions, such as --help or --outFile.

The CLI command structure is as follows:

mp [COMMAND]:[SUBCOMMAND]:[subcommand] --[flag]=[value] [args...]

By default, every command writes its output to standard out. By adding the --outFile=/path flag, you can write the response to a log or JSON file, depending on the use case.

The CLI provides a --help flag that reveals all acceptable parameters and flags for a command, as well as a list of commands. Furthermore, top-level commands reveal their respective subcommands when run on their own.
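
For example, you could inspect a command's flags with --help, or write a fetch response to a file with --outFile (the ID and path below are illustrative):

$ mp planning:data-plans:fetch --help
$ mp planning:data-plans:fetch --dataPlanId=XXXXXX --outFile=./data-plan.json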

Authentication

Any CLI command that accesses mParticle HTTP API resources supports two authentication options: passing credentials as command-line flags, or providing an mp.config.json file in the root of your project.

Both of these methods internally generate a bearer token on your behalf, as described in Platform API Authentication.

Credentials Required:

  • Workspace ID: see Managing Workspace
  • Client ID & Client Secret: in Managing Workspace, click the specific workspace; a pop-up displays a Key and Secret. Use these as your Client ID and Client Secret.

via CLI

Simply pass your authentication credentials via the following CLI flags:

$ mp [COMMAND]:[SUBCOMMAND] --workspaceId=XXXX --clientId=XXXXX --clientSecret=XXXXXX

via Configuration File

To integrate with various services, we recommend adding an mp.config.json file to the root of your project. This allows you to set properties related to your mParticle account as well as other project settings, such as your data plan directory.

For more information, see the Configuration section above.

For example, to authenticate, make sure the following is in your mp.config.json file:

// mp.config.json
{
  "global": {
    "workspaceId": "XXXXX",
    "clientId": "XXXXXX",
    "clientSecret": "XXXXXXXXX"
  }
}

This configuration file can then be referenced via the --config flag. Additionally, the CLI searches your current working directory for mp.config.json.

Services

Data Planning

For customers subscribed to Data Master, the CLI exposes commands for creating, fetching, updating, and deleting data plans, as well as validating your events against a downloaded data plan.

All of these services require Platform API authentication credentials (clientId, clientSecret, and workspaceId), provided via mp.config.json or CLI arguments, as well as Data Planning access.

Fetching Data Plans and Data Plan Versions

Fetching a data plan requires that the plan already exists on the server; otherwise the command fails. Pass the dataPlanId as a flag to fetch the resource.

$ mp planning:data-plans:fetch --dataPlanId=XXXXXX

To fetch a Data Plan Version, simply use mp planning:data-plan-versions:fetch and pass a dataPlanId and versionNumber.
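
For example, using the dataPlanId and versionNumber flags described above (values are illustrative):

$ mp planning:data-plan-versions:fetch --dataPlanId=XXXXXX --versionNumber=2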

Creating a Data Plan and Data Plan Versions

Use the following command to create a Data Plan Resource (or Data Plan Version) on the server.

$ mp planning:data-plans:create --dataPlan="{ // Data plan as string //}"

You can also replace dataPlan with dataPlanFile to use a path to a locally stored data plan if that is more convenient.

For example:

$ mp planning:data-plans:create --dataPlanFile=/path/to/dataplan/file.json

To create a Data Plan Version, simply use mp planning:data-plan-versions:create and pass a dataPlanId as a secondary flag.
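
For example, a sketch of a version create, assuming the version payload is passed as a string the same way a data plan is passed to planning:data-plans:create (the --dataPlanVersion flag name is an assumption borrowed from the validation command below; run the command with --help to confirm the exact flags):

$ mp planning:data-plan-versions:create --dataPlanId=XXXXXX --dataPlanVersion="{ // Data plan version as string //}"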

Editing a Data Plan and Data Plan Versions

To edit an existing Data Plan (or Data Plan Version) on the server, use the following:

$ mp planning:data-plans:update --dataPlanId=XXXX --dataPlan="{ // Data plan as string //}"

You can also replace dataPlan with dataPlanFile to use a path to a locally stored data plan if that is more convenient.

For example:

$ mp planning:data-plans:update --dataPlanId=XXXXX --dataPlanFile=/path/to/dataplan/file

To update a Data Plan Version, use mp planning:data-plan-versions:update with --dataPlanVersionFile, and pass a dataPlanId as a secondary flag.
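
For example, combining the flags named above (IDs and paths are illustrative):

$ mp planning:data-plan-versions:update --dataPlanId=XXXXX --dataPlanVersionFile=/path/to/dataplanversion/file.json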

Deleting a Data Plan and Data Plan Versions

To delete a data plan, simply pass the dataPlanId into the delete command.

$ mp planning:data-plans:delete --dataPlanId=XXXX

Deleting a Data Plan Version is similar, only requiring an additional versionNumber flag:

$ mp planning:data-plans:delete --dataPlanId=XXXXX --versionNumber=XX

Validating against a Data Plan

Validating an event batch is a more complex task, and the CLI provides flexibility by allowing validation to be run either locally or on the server, depending on your needs. Local validation does not make a request to our servers, and is therefore faster and ideal for a CI/CD environment.

$ mp planning:batches:validate --batch="{ // batch as string}" --dataPlanVersion="{ // data plan version as string }"

This will locally run your batch against a data plan version and return any validation results to the console.

This command also supports an --outFile flag that writes the validation results to a file in your local directory, in case you'd like to save the results for future use.

Both batch and dataPlanVersion have file-based counterparts, batchFile and dataPlanVersionFile (as well as dataPlan/dataPlanFile with versionNumber), for less verbose validation commands.
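
For example, a file-based validation using the parameters above, with results written to a file (paths are illustrative):

$ mp planning:batches:validate --batchFile=/path/to/batch.json --dataPlanVersionFile=/path/to/dataPlanVersionFile.json --outFile=./validation-results.json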

    Last Updated: December 5, 2024