Vivek Baskaran, VP of Revenue at ChargeBee, provides the following high-level framework.
So let's move forward and take a look at the Data Tools and Infrastructure required to accomplish all these goals.
It's worth noting that Google Analytics primarily serves Marketing Managers, and its reports focus mainly on how the user got to your site. The tool offers seamless optional integration with SEO tooling and Google AdWords, and it can provide valuable information about the devices, geographies, and demographics your users belong to. However, keep in mind that some people classify Pageviews as a vanity metric. If you're new to Google Analytics, it might be helpful to engage a consulting company to help you get the most out of the tool. And the best part? It's free!
On the other hand, Product Managers and Growth Managers rely on product analytics tools such as Amplitude and MixPanel to identify different paths users take within the product, and pinpoint where these users may be dropping off. The ultimate goal is to improve the product and remove any barriers to entry that may prevent users from experiencing the full value of the product. These tools also help make onboarding easier and faster for new users, ultimately leading to that "aha!" moment that keeps them engaged.
Segment, Amplitude, and MixPanel are three such companies, offering tools designed to help you build a great product by tracking specific user actions (events).
When you are tracking these events, you can create a retention cohort like the one shown here: users who sign up in a particular period (typically a week) and perform certain events are put into the same cohort, and you then track how many of these users come back each week. This allows you to see what is working and what is not.
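As a sketch of that bookkeeping (the users, dates, and events below are made up; real product analytics tools do this for you at scale), a weekly retention cohort can be computed directly from an event log:

```python
from datetime import date

# Hypothetical data: signup date per user, plus "active" events afterwards.
signups = {"u1": date(2024, 1, 1), "u2": date(2024, 1, 1), "u3": date(2024, 1, 8)}
activity = [
    ("u1", date(2024, 1, 9)),   # u1 returns 1 week after signup
    ("u2", date(2024, 1, 16)),  # u2 returns 2 weeks after signup
    ("u3", date(2024, 1, 15)),  # u3 returns 1 week after signup
]

def retention_cohorts(signups, activity):
    """Map signup week -> {weeks since signup: set of returning users}."""
    cohorts = {}
    for user, day in activity:
        signup = signups[user]
        cohort_week = signup.isocalendar()[:2]     # (ISO year, ISO week) of signup
        weeks_since = (day - signup).days // 7
        cohorts.setdefault(cohort_week, {}).setdefault(weeks_since, set()).add(user)
    return cohorts

cohorts = retention_cohorts(signups, activity)
for week, retained in sorted(cohorts.items()):
    print(week, {w: len(users) for w, users in sorted(retained.items())})
```

Reading the output row by row gives exactly the cohort grid described above: each signup week, and how many of its users came back in week 1, week 2, and so on.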
This user event data is the magic juice that needs to be fed into other tools for the three things we talked about earlier: correlation analytics, entitlements, and triggering actions.
The two most popular tools here are Segment and RudderStack. Both let you instrument events once and pipe the data to multiple analytics tools, avoiding the need to re-instrument as your needs evolve, thanks to an easy-to-configure system that routes each event to multiple destinations.
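A minimal sketch of the instrument-once, fan-out idea behind these tools (the destinations below are stand-in functions; the real Segment and RudderStack SDKs handle batching, retries, and delivery):

```python
class EventRouter:
    """Toy fan-out router: call track() once, deliver to every destination."""

    def __init__(self):
        self.destinations = []

    def add_destination(self, send_fn):
        self.destinations.append(send_fn)

    def track(self, user_id, event, properties=None):
        payload = {"user_id": user_id, "event": event,
                   "properties": properties or {}}
        for send in self.destinations:   # same event, every configured sink
            send(payload)
        return payload

# Stand-ins for Amplitude / MixPanel / warehouse sinks.
received = {"amplitude": [], "mixpanel": []}
router = EventRouter()
router.add_destination(received["amplitude"].append)
router.add_destination(received["mixpanel"].append)

router.track("u42", "Report Exported", {"format": "pdf"})
```

The point of the pattern is that adding a new analytics destination is one line of configuration, with no change to the code that emits events.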
Also, to further improve the user experience, a product manager can use a heat-map tool such as FullStory. It's easy to use, provides real-time data, and offers actionable insights. If you spot an anomalous experience, you can report it to the QA and customer success teams through direct links to Jira and Slack. And because it records the user's entire session, it adds valuable context to the event data you are already capturing.
In addition to this, survey tools like Delighted or Uservoice can be employed to capture qualitative metrics. Delighted is simple to use and can be utilized to get Net Promoter Scores (NPS), whereas Uservoice is more versatile in nature.
Another good option to consider is integrating Intercom and spending some time understanding customer issues during the onboarding period. The number of data points may be small, but they can provide valuable insights that help product managers understand customer pain points.
When it comes to capturing events, I want to underscore the importance of instrumenting early and smartly.
A smart idea is to start instrumenting based on a few hypotheses. This approach is much more efficient than trying to achieve 100% instrumentation, which can take weeks, months, or even years to accomplish. Full instrumentation also produces so much data that you then have to filter it down to the events that are actually significant.
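One lightweight way to keep instrumentation hypothesis-driven is a tracking plan: an explicit allowlist of events, each tied to a hypothesis, with everything else dropped at capture time (the event names and hypotheses below are hypothetical):

```python
# Tracking plan: only events tied to an explicit hypothesis are captured.
TRACKING_PLAN = {
    "Signup Completed":     "H1: faster signup improves week-1 retention",
    "First Report Created": "H1: faster signup improves week-1 retention",
    "Teammate Invited":     "H2: collaboration drives upgrades",
}

def track(event, captured):
    """Record an event only if the tracking plan covers it; drop the rest."""
    if event in TRACKING_PLAN:
        captured.append(event)
    return captured

log = []
track("Signup Completed", log)
track("Button Hovered", log)   # not in the plan: silently dropped
print(log)
```

This keeps the event stream small and every captured event answerable to a question someone actually asked.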
At Adobe Photoshop, a dedicated telemetry team was set up with a PM and engineers who were responsible for creating data collection frameworks and policies, as well as coordinating the implementation of tracking.
Having a dedicated PM is not common practice; often the weakest member of the engineering team is handed the instrumentation task with close to zero business context about what they are doing, why they are doing it, and what positive or negative impact their work can have on conversion and growth metrics.
Therefore, I highly recommend having a PM with the appropriate business context lead the instrumentation effort.
Databases and BI Tools are the go-to tools for investigating correlations.
As Growth Advisor Hila Qi from Reforge notes, "Product-Led can also be called Data-Led."
That's why a variety of cloud-based warehouses are available to help businesses store and manage vast amounts of data effectively. Popular options include Google BigQuery, Amazon Redshift, and Snowflake, among many others.
Of course, raw data alone is not enough to drive insights. That's where Business Intelligence (BI) tools come in, offering powerful visualization and data analysis capabilities. Common BI tools like PowerBI and Tableau have traditionally been owned and operated by dedicated groups within a company, generating reports for everyone else.
However, newer tools like Mode Analytics, ThoughtSpot, Looker, and Periscope Data have emerged, aiming to democratize data correlation analysis for everyone. These tools make it easy for someone without SQL knowledge to perform sophisticated correlation analysis. ThoughtSpot, for instance, is based on NLP, so you can ask questions of enterprise data the same way you ask questions on Google.
To be able to predict in a reliable manner you need to specify exactly what you want to predict.
As depicted here, the holy grail of the PLG GTM motion is to identify casual users and move them to become core users, and core users to become power users. We talked about this earlier, in Module 2.
To make this more tangible, one recommendation is to start with the pricing page of your PLG company: identify the cohorts of users that sit at the boundaries of each of your pricing tiers.
Almost all PLG companies have pricing tiers based on a usage metric: for example, number of emails sent, number of API calls, or storage used. You want to find out how many users are above the 90th percentile of this usage metric within each pricing tier.
You should then also find out how many users are at the bottom (say, below the 20th percentile); the rest fall in between.
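As a rough sketch of that bucketing (usage numbers and cut points below are made up; in practice this would run as SQL against the warehouse), a 20th/90th-percentile split might look like:

```python
from statistics import quantiles

# Hypothetical monthly usage (e.g. emails sent) for users on one pricing tier.
usage = {"u1": 120, "u2": 950, "u3": 80, "u4": 2400, "u5": 300,
         "u6": 1800, "u7": 60, "u8": 2100, "u9": 500, "u10": 2600}

def bucket_users(usage, low_pct=20, high_pct=90):
    """Split users into low / mid / high buckets by usage percentile."""
    cuts = quantiles(usage.values(), n=100)          # cuts[p-1] ~ p-th percentile
    low_cut, high_cut = cuts[low_pct - 1], cuts[high_pct - 1]
    buckets = {"low": [], "mid": [], "high": []}
    for user, value in usage.items():
        if value >= high_cut:
            buckets["high"].append(user)             # campaign / upsell targets
        elif value < low_cut:
            buckets["low"].append(user)              # churn-risk candidates
        else:
            buckets["mid"].append(user)
    return buckets

print(bucket_users(usage))
```

The "high" bucket is the cohort pressing against the ceiling of its tier, which is exactly the group the next step targets.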
Now you configure an outbound campaign offering a 14-day free trial to all users in the 90th-percentile-and-above bucket!
Next, you want to build a predictive model that identifies the features commonly used by users in the 90th-percentile-and-above bucket. This can help you identify similar users who can be nudged to use more.
Creating personas based on which features correlate with an uptick in usage, and which do not, are both good pieces of predictive intelligence to build. Payment data or the plan/tier name, when available, should be used as signals to train the models properly.
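As a deliberately simple first pass at such a model (all feature names and numbers here are invented), per-feature lift, i.e. how much more likely users of a feature are to land in the top usage bucket, can surface candidate features before you reach for a full classifier:

```python
# Hypothetical per-user feature-usage flags, plus a "power" label marking
# whether the user sits in the 90th-percentile-and-above usage bucket.
users = [
    {"exports": 1, "integrations": 1, "dark_mode": 1, "power": 1},
    {"exports": 1, "integrations": 1, "dark_mode": 0, "power": 1},
    {"exports": 1, "integrations": 0, "dark_mode": 1, "power": 0},
    {"exports": 0, "integrations": 0, "dark_mode": 1, "power": 0},
    {"exports": 0, "integrations": 1, "dark_mode": 0, "power": 1},
    {"exports": 0, "integrations": 0, "dark_mode": 0, "power": 0},
]

def feature_lift(users, label="power"):
    """P(power | feature used) / P(power): lift > 1 means the feature
    correlates with being a power user and is a candidate nudge target."""
    base = sum(u[label] for u in users) / len(users)
    lifts = {}
    for feat in users[0]:
        if feat == label:
            continue
        used = [u for u in users if u[feat]]
        lifts[feat] = (sum(u[label] for u in used) / len(used)) / base
    return lifts

print(feature_lift(users))
```

In this toy data, "integrations" shows the strongest lift, so it would be the first feature to nudge casual users toward. Note that lift is correlation, not causation; an A/B-tested nudge is still needed to confirm the effect.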