Product and Solution Marketing Metrics
North Star Metrics (a CEO (Sid) ask)
Every marketing function should have at least one north star metric that they aspire to improve over time to help GitLab’s business growth. Below is a list of metrics by different marketing functions in Product and Solution Marketing.
| Team | Primary Metric | Secondary Metric | Eventual Metric Goal |
|---|---|---|---|
| PMM | Views | Content Created (pages, docs, talks, blogs, etc.) | IACV Attribution |
| TMM | Views | Content Created (demos, videos, etc.) | IACV Attribution |
| AR | Views | Analyst Coverage (reports, mentions, MQs, Waves, etc.) | IACV Attribution |
| Cust Ref | Views | Content Created (case studies, quotes, references, etc.) | IACV Attribution |
| Comp Intel | Views | Content Created (comparison pages, etc.) | IACV Attribution |
Working Definitions
- Impressions: A total count of how many people viewed our content.
  - Web pages: page views (source: Google Analytics?)
  - Downloadable assets: downloads (source: ??)
  - YouTube videos: views (source: YouTube?)
  - Webinars, workshops, etc.: live views (source: ?) (note: a webinar later becomes a YouTube video and accrues "views")
  - In-person events where we are speaking: audience count
- Content Created: A total count of content that was newly created or significantly improved/updated. (source: SM Inventory)
- IACV Attribution: our contribution to pipeline (source: Sisense attribution dashboard??). Note: campaigns contain a Product and Solution Marketing field that should be used to document our campaign contribution.
- SM Field in SFDC: a new field has been created in SFDC campaigns to measure our "active contribution" to campaigns. We use the following rubric to determine our contribution:
  - None or Blank: no active contribution
  - Low: reviewing content, validating messaging
  - Medium: revising and/or customizing existing content
  - High: developing net-new content and/or delivering content (whether new or existing)
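The "Impressions" roll-up defined above is just a per-asset sum of view-like counts across channels. A minimal sketch of that aggregation, assuming a simple `(asset, source, count)` event shape (the asset names, source labels, and numbers below are illustrative placeholders, not real data):

```python
from collections import defaultdict

def total_impressions(events):
    """Sum view-like counts per asset across channels.

    `events` is an iterable of (asset, source, count) tuples, where
    `count` is page views, downloads, YouTube views, live webinar
    views, or in-person audience size, depending on the source.
    """
    totals = defaultdict(int)
    for asset, source, count in events:
        totals[asset] += count
    return dict(totals)

# Hypothetical example data
events = [
    ("devops-whitepaper", "page_views", 3000),
    ("devops-whitepaper", "downloads", 450),
    ("ci-demo", "youtube_views", 1200),
    ("ci-demo", "webinar_live_views", 80),
]
print(total_impressions(events))
# {'devops-whitepaper': 3450, 'ci-demo': 1280}
```

In practice each source would be pulled from its own system (Google Analytics, YouTube, etc.), which is exactly why the source column in the definitions above still carries open questions.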
Product and Solution Marketing (Ashish) thoughts on metrics we should consider
Multiple lenses of metrics are needed to build the narrative such as:
- WHAT do we produce?
- Example: 5 WPs, 16 Demos, 2 customer case studies
- Assign to the right category (Outbound, Campaign, Internal)
- Content created vs. curated
- WHERE does it show up?
- What are the various ways we activate content?
- List of activation methods, for example: website, SFDC, email campaigns, event booths, campaign design, conferences, enablement sessions
- How MANY people see (view) it, and how frequently?
- Example: 3000 website visitors per quarter, 45 sales people/month, 25,000 prospects for demo x in 12 events in a quarter, Handbook views, How many sales/channel got enabled?
- Why: Indicates overall activity for a given asset in a given time frame, and is an indicator of the effectiveness of the title, description, landing page, and campaigns related to the asset. It does not tell us ANYTHING about the quality or effectiveness of the content itself. What it will tell us: the degree to which a given asset is active or not.
- What HAPPENS with the content?
- Linear attribution of content in programs?
- How many converted?
- How many MQLs did the asset create?
- Why: Indicates how an asset is contributing to MQLs, i.e., the frequency with which a given asset contributes to the overall marketing qualification of leads. What it will tell us: this metric probably does not provide much more insight than asset views; an MQL is simply a cumulative set of activity by a given lead/contact reaching the MQL threshold, so MQLs may be just a refined view of views.
- How many SAOs did the asset create?
- How many opportunities did the asset help create?
- How much $ (ARR) did the asset help create?
- Why: Indicates that a given asset contributed to revenue. We might be able to infer something about the quality of the asset's content, because end users consumed the asset and then continued the buying process; however, this will require more analysis of other data points. What it will tell us: which assets are most connected with generating revenue and should therefore be promoted and re-used.
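The linear attribution mentioned above splits credit for an opportunity equally across every asset the buyer touched on the way to it. A minimal sketch of that split, with hypothetical asset names and deal values:

```python
from collections import defaultdict

def linear_attribution(opportunities):
    """Split each opportunity's value equally across its touched assets.

    `opportunities` is an iterable of (value, [assets touched]) pairs.
    Returns the total attributed value per asset.
    """
    credit = defaultdict(float)
    for value, assets in opportunities:
        share = value / len(assets)  # equal credit per touchpoint
        for asset in assets:
            credit[asset] += share
    return dict(credit)

# Hypothetical example: two opportunities, three assets
opps = [
    (90_000, ["whitepaper", "demo", "case-study"]),  # each asset gets 30,000
    (40_000, ["demo"]),                              # demo gets the full 40,000
]
print(linear_attribution(opps))
# {'whitepaper': 30000.0, 'demo': 70000.0, 'case-study': 30000.0}
```

The same shape works whether "value" is IACV, pipeline, or MQL counts; what changes is only which funnel stage feeds the input.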
Challenge: Why do we measure? What are our next steps / actions based on what we learn?
Other considerations:
- Program Budget spend analytics
CMO (Todd) Challenge on how to measure
- View of output metrics:
- Views/reads - Google Analytics dashboard is great because I can set goals. Some suggested fixes:
- Tagging YouTube and blog content so you can filter
- Changing Use case pages to public website pages
- Linear Attribution of content in programs and campaigns
- Value of Product and Solution Marketing to other teams
- Impact to sales (Survey?)
- Rest of marketing (Survey?)
- Other metrics to consider:
- Website first (MRs?)
- Metrics on views, etc.
- Maybe update BOM colors/links at every stand-up
Asset Inventory
Product and Solution Marketing tracks an inventory of what it creates and publishes. This inventory originated as part of the Learn@GitLab project but is being extended to other groups, as the more complete our inventory is, the more efficiently we can use these assets as a company. The goal is to have other groups inside and outside of SM also adding to and consuming from this inventory as the SSoT of available assets.
- Product and Solution Marketing Content Inventory Issue
- MVC1 on Google Sheets
- Google DataStudio Dashboard
- YAML based inventory model (MVC) - Instructions for making additions/changes are on the asset_inventory page. If you'd like to add your own team's inventory file, please follow the information on that page.
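For a sense of what a single entry in a YAML-based inventory might look like, here is a hypothetical sketch. The field names and values below are illustrative guesses only; the actual schema is defined on the asset_inventory page.

```yaml
# Illustrative only - the real field names and required attributes
# are defined on the asset_inventory page.
- name: DevOps Platform Whitepaper
  type: whitepaper
  owner: PMM
  status: published
  last_updated: 2020-06-15
  url: https://example.com/assets/devops-platform-whitepaper
  tags: [devops, platform, outbound]
```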
Web Traffic Analysis
- Website and handbook pages
- YouTube Metrics
- Learning Links:
Marketing Attribution Model
Customer Reference Analytics
Current metric: number of case studies published per month (current target: 2/month)
- This metric is not the most efficient and is not fully in the control of the Customer Reference team
- The number of case studies published is not always in our control; for example, several "completed" case studies are stuck in the pipeline for reasons such as a hold at the pre-publish stage for PR purposes (example: EMEA), or pending approval by the customer's legal or marketing department
- It does not reflect the true value to the business. Examples:
- References help obtain results in important AR reports - Gartner MQs or Forrester Waves
- References help close opportunities (sales references)
- References help generate pipeline (example - customer speakers for webinars or conferences)
Need additional metrics to measure value of customer reference activities and value delivered to the business.
- Ideas to consider:
- Overall growth of number of champions/logos included in program
- Number of reference requests
- Revenue influenced in pipeline
- Closed Won revenue influenced
Additional sources of metrics to consider:
Competitive Intelligence Analytics
- Ideas to consider:
- Track usage of DevOps Landscape comparisons page
- Page visits
- Number of sessions
- New visitors
- Return visitors
- Avg. Time on page
- Bounce Rate
- Click Through Rate
- Conversion Rate
- Track usage of competitive resources page
  - Number of people accessing
- Number of inputs/comments
- Number of inbound requests
- Competitive Intelligence Material attribution to sales
- Impact of competitive resources on sales close - deal support and ROI analysis
  - Number of deal-specific inquiries responded to
  - Opportunity value of deal-specific inquiries
  - Number of ROI analyses conducted
  - Opportunity value of ROI analyses conducted
  - Opportunity value of deals won that included an ROI analysis
- Outbound (Blogs, videos):
- Visits (bounce rate, time on page)
- Engagement (low threshold CTAs clicked)
- Leads (clicked on a register/signup)
- Campaign (gated asset)
  - Visits (to landing page) - also a valid measure
- Leads
- Internal (tied to sales ops)
- Opportunities influenced (decks, showing up, ROI, cust ref, etc)
- Internal views/downloads (enablement)
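Several of the page metrics listed above (bounce rate, click-through rate, conversion rate) are ratios derived from raw counts. A minimal sketch of those derivations, with simplified definitions that may differ from Google Analytics' exact formulas and with made-up numbers:

```python
def engagement_rates(sessions, bounces, clicks, conversions):
    """Derive rate metrics from raw page counts.

    Simplified definitions (assumptions, not GA's exact formulas):
      bounce rate        = single-page sessions / total sessions
      click-through rate = CTA clicks / total sessions
      conversion rate    = conversions / total sessions
    """
    return {
        "bounce_rate": bounces / sessions,
        "click_through_rate": clicks / sessions,
        "conversion_rate": conversions / sessions,
    }

# Hypothetical quarter of traffic on a comparison page
print(engagement_rates(sessions=2000, bounces=900, clicks=240, conversions=60))
# {'bounce_rate': 0.45, 'click_through_rate': 0.12, 'conversion_rate': 0.03}
```

Tracking these as rates rather than raw counts makes pages comparable even when their traffic volumes differ widely.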
Analyst References Analytics
What do we measure today?
- Baseline survey data at FY2021 SKO on sales usage of analyst assets
- Gartner reprint info (slides and historical views, etc.); Forrester reprint info (views, downloads, dwell time)
- Customer Reference metrics - slide 2 is an example of current metrics (pool and attributes, requests and attributes, starting to measure some revenue attribution/impact, etc)
- Ideas to consider:
- Analyst Newsletter metrics
- Number of briefings and inquiries led
- Coverage heat map - overall
- Coverage heat map - changes over time
- Analyst Relations web traffic
- Impact on sales (pipeline, conversion, win/loss) - SFDC touch points
- TEI Metrics
- Satisfaction - a regular internal and external surveying/measurement metric
Examples to consider
- ChangeLog
- Surveys (Sales team)
- Analytics on issues managed - open/closed, internal vs. external
- Pathfactory
- SM Issues Analysis
- SM Budget Spend analysis
- CAB Data (to consider)
- x% that have case studies
- y% are active in sales
- z% are speaking in events (GitLab, industry, etc) for/with GitLab
- a% are active in analyst reference calls
- b% provide messaging, positioning, and persona feedback
- c% that decline our requests?
- STRETCH Goal: Can we analyze whether we have been able to sell more into these organizations since they joined the CAB?
Useful links
- Marketing KPIs
- Slack Channel - #keymonthlymarketingmetric
- Tech Stack