Why revenue-oriented metrics are the ultimate KPIs, and should always be present when talking about conversion rate.

Introduction

I have been asked several times to create “dashboards” (mind the quotes) containing powerless metrics. Powerless because the client has not thought about the business question(s) behind the dashboard in the right context, so the data can easily be misinterpreted.

Context is so important. For example, if I don't know the impact of an improvement in the conversion rate (CR) on revenue, then I cannot know whether that's good or bad news. In theory a higher CR leads to more revenue, but that's not necessarily true. That's why CR should go together with a revenue metric, which can be revenue itself or at least something like average order value (AOV).

I think it was at Congreso Web in Zaragoza where I heard one of the best analysts I know (Xavier Colomes) say that he can “believe” any excuse we analysts offer to justify our lack of action, with one exception: he cannot believe that your boss does not want to make more money (more revenue). And I agree with him.

Why is revenue that important?

Because it tells you whether the business (or business unit, etc.) is improving over time or not. No other metric is that powerful, especially CR. I have seen the conversion rate go up while revenue goes down, and the other way around, far too many times.

Revenue gives you the right context to analyse conversion rate.

The explanation is simple. If you sell, say, theatre tickets, customers normally buy two tickets (a couple) or four (a group of friends). Either way it is one transaction, but units and AOV (average order value) will be twice as big in the second case.

The same happens when you sell, say, an offer for a cheap meal that improves transactions and CR by 10%, while in terms of AOV or revenue the improvement could be just 2% if that product is much cheaper than your average. Still good, as any improvement is, but the spike is not as good as we would think if we only looked at the CR.
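To make the arithmetic concrete, here is a minimal sketch in Python. All the numbers (sessions, orders, prices) are invented for illustration:

```python
# Compare a CR lift with its actual revenue impact, using invented numbers.

def conversion_rate(orders, sessions):
    """Conversion rate = orders / sessions."""
    return orders / sessions

# Before the cheap-meal offer: 1,000 sessions, 50 orders at 40 each.
sessions_before, orders_before = 1000, 50
revenue_before = 50 * 40.0

# After: transactions up 10%, but the five extra orders are much cheaper (10 each).
sessions_after, orders_after = 1000, 55
revenue_after = 50 * 40.0 + 5 * 10.0

cr_lift = conversion_rate(orders_after, sessions_after) / conversion_rate(orders_before, sessions_before) - 1
revenue_lift = revenue_after / revenue_before - 1

print(f"CR lift: {cr_lift:+.1%}")            # +10.0%
print(f"Revenue lift: {revenue_lift:+.1%}")  # +2.5%, far smaller than the CR lift
```

A +10% CR headline hiding a +2.5% revenue reality is exactly why the two metrics should always be reported together.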

This does not happen in every company. I don't think anyone buys six pairs of boots in different colours at the same time. But units (number of products sold) is a very important metric in some industries.

The key idea is that focusing only on improving the conversion rate (without any context on the real impact on the business) is like focusing only on improving the bounce rate.

Why should we work to optimize the revenue and not the conversion rate?

Optimizing the conversion rate should be a means to an end, not the end itself: a means to improve revenue. Ask your CEO if in doubt 🙂

Some products (or categories, packages, etc.) may have a lower conversion rate but generate more revenue. We should identify them by segmenting our data, and then try to improve the conversion rate in those specific segments.

That's why I think that when we create a dashboard or talk about CR, we should look at the evolution of the CR and its impact on the final goal of the company (that is, making money).

I am not a football fan, but I will use it for a clear example.

What´s the ultimate “conversion” KPI in football?

Goals. Full stop.

What micro conversions lead to more conversions, or goals?
Let's mention a few key football metrics: corner kicks, shots on target, ball possession, dangerous attacks, avg. kilometres per player, etc.

Passes are very important and, as a KPI, they can be segmented.

These metrics might make us think that there wasn't a big difference between the two teams. Let's take a look at goals, which is the ultimate “Key Football Metric”.

Something to say? 🙂

Back to Analytics

It would have been grotesque (even more so… 🙂) if, after the match, Brazil had given any importance to the total shots metric, or claimed that their players ran more kilometres than in the previous match, so there's a positive trend there…

Nobody cares if you have improved your “shots on target” rate when you lose 7-1. And nobody cares about an improvement in <insert your fave metric here> if revenue goes down. Especially the CR.

Data needs context. And part of the context of a micro conversion is how the macro conversion is affected. Thus, part of the context for, say, a customer retention metric is how revenue is affected. Same thing in a football match: goals need to be related to the dangerous attacks metric. Anything else is pointless and may make us think that something is going well when actually it's not (or the other way around).

We should care about revenue and a few KPIs, like CR. But these KPIs need to be segmented and measured together with their impact on revenue.

Two benefits of giving special treatment to revenue are:
– We can know very quickly whether we are performing better or worse as a business,
and then segment and look at the other KPIs in order to understand why, and which groups of customers / products we should focus on.
– We will catch and keep the attention of the stakeholders more easily,
because we are talking about what they care about. The language of money is always understood.

Last thought

This same idea applies when we focus our analysis on a specific part of the funnel. We should look at the CR, or the next step, to get the whole context.

It may happen that the Transition Rate (TR) from step A to step B works better for a specific product, device, etc., but the Conversion Rate (CR) is worse. Or the other way around. And that's something we need to know.
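To illustrate, here is a small Python sketch with invented funnel counts, comparing the two rates per device segment:

```python
# Funnel counts per segment: users entering step A, reaching step B, converting.
# All numbers are invented, for illustration only.
funnel = {
    "desktop": {"step_a": 1000, "step_b": 600, "converted": 60},
    "mobile":  {"step_a": 1000, "step_b": 700, "converted": 42},
}

for segment, f in funnel.items():
    tr = f["step_b"] / f["step_a"]      # Transition Rate from A to B
    cr = f["converted"] / f["step_a"]   # overall Conversion Rate
    print(f"{segment}: TR {tr:.0%}, CR {cr:.1%}")

# Here mobile has the better TR (70% vs 60%) but the worse CR (4.2% vs 6.0%):
# exactly the situation the funnel analysis needs to surface.
```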

First steps with the tagging solution Signal (formerly BrightTag)

There are three basic concepts that you need to understand to start working with Signal: data elements, inputs and outputs.

  • The Data Elements are the variables we want to collect.
    The set of Data Elements is the Data Dictionary.
  • The Inputs are what we want to know about the user and their visit in order to answer our business questions (in plain English).
    They can be based on page loads or user interactions, and they apply to a web browser or an app.
  • The Outputs are the tags we use to send data to digital analytics tools, like Google or Adobe Analytics.

Let’s see how they work 🙂

1) Data Elements

The data elements are the variables: the data we want to get from our web or app and send to our analytics tool. They are exactly what other tools call “variables”.

The way to define them and create the tagging map is what Signal calls Data Binding.
That's how we tell Signal: for this input, you have to collect these data elements (variables).

For example, for every page load (that is an input) we want to collect (among other things):
– The name of the page.
– The page's primary category.
– The page's subcategory.

So, in the Data Dictionary, we have to create a Data Element for each of these variables.

2) Inputs

The inputs are what we want to know about the navigation and behaviour of our users: the pages being loaded or the key interactions during the visit.

For example:
– I want to know when product pages are loaded
> I create an input “product pages”

– I want to make sure that, as the page loads, we collect the product name, category and subcategory
> I create a Data Element for each one (see the previous example) and include them within the input “product pages”

We should create an input for each type of page load or user interaction.
I mean that we need an input for product pages, another for the home page, another for the checkout page and so on. The reason is that some data elements are common to every page type (e.g. page name), while others apply only to a specific type, or to some types but not all (e.g. product name is something we want to collect on the product, category and checkout pages, but not on the homepage).

To track user interactions, we just follow the same process as for page loads. If we want to know whether users sign in or sign up (and both have two steps: first click + actual success), we would create an input for each of those interactions.
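To picture the relationship between inputs and data elements, here is a hypothetical sketch in Python. The mapping below is only an illustration of the concept; it is not Signal's real configuration format, and all the input and element names are invented:

```python
# Hypothetical data binding: which data elements each input collects.
DATA_BINDING = {
    "all pages":       ["page name", "page category", "page subcategory"],
    "product pages":   ["product name", "product category", "product subcategory"],
    "sign up success": ["account type"],
}

def elements_for(fired_inputs):
    """Return the data elements to collect for the inputs fired on a hit."""
    collected = []
    for name in fired_inputs:
        collected.extend(DATA_BINDING.get(name, []))
    return collected

# A product page load fires the generic input plus the product-specific one,
# while the homepage only fires the generic input (no product name there).
print(elements_for(["all pages", "product pages"]))
print(elements_for(["all pages"]))
```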


3) Outputs

Keeping in mind what has been explained so far, Signal now knows which actions we want to collect (product page loads, sign-ups, etc.) and what we want to know about them (the name of the product, the category, etc.).

The third step is actually sending the data to the analytics tool we use. To do so, we need to create an output for each input, specifying the vendor (in this example, Adobe Analytics).


Last idea

There are two steps:
– Signal getting data from the browser or app (data elements & inputs)
– Signal sending the data to the analytics tool (outputs)

Either one can be the reason for not having data in the analytics tool, so we need to keep this in mind in case we detect data collection issues during the QA process.

Also, each data element needs its corresponding inputs and outputs. I will explain these three concepts and other Signal functionalities in more detail in another post.

Any idea? Any comment? Any complaint? Leave your comment and I will get back to you. You can also contact me via email at geekeandoenanalytics@gmail.com or through my LinkedIn and Twitter profiles.

 

Data QA with Charles Debugging Proxy (basic level)

Digital Analytics is meant to transform data into actionable insights. Sure, you have heard it a million times. However, if the data collection has not been previously audited (so we can validate it), we may be making decisions based on data that is just wrong.

This first and necessary step is forgotten in many cases, and it is one of my geeky obsessions and priorities in every analytics project. It's something we should use to make sure we are properly collecting the data we want. But it can also be used to check what other companies track, e.g. the side filters the user may apply when looking for some content.

1 – Tag Assistant: check easily and quickly if there’s something wrong in your Google Analytics implementation

You just need to install the Google Tag Assistant plugin for Chrome (developed by Google itself) and click the blue label appearing in the right corner of the browser. This label will be red if any error is detected, green if everything is OK, and blue if there are things that can be improved.

Let’s see an example:


Oh, it seems that the GA code has been placed outside the <head> tag. This means that the “tracking beacon” may not have sent its data before the user left the page, with an obvious negative impact on the quality of the data we are collecting. To fix it, you just need to click on ‘more info’ and follow the recommendations from Google Analytics Support. In this case, the solution would be simply pasting the GA code inside the head tag.

It goes without saying that the UA code shown by Tag Assistant and the one in your GA property should be the same.

Just by taking this little step, we would be making a basic but important check to understand what's wrong with our data collection and get a first idea of what we need to do to fix it.

2 – Debugging tools to check what we are actually sending to Google Analytics

My favourite tools for performing a basic analysis are Fiddler and Charles. I will use Charles in this example. You just need to download it (it's free), open the browser and visit the website you want to audit.

Then, in Charles, you need to look for “google-analytics.com” in the menu on the left (structure or sequence), and click on “request” on the right.

Let's see a basic example with a downloads marketplace called Fileplaza.


What can we see in the screenshots above?

  • What is being collected?
    A pageview.
  • What page is being collected?
    http://www.fileplaza.com (I have just visited the homepage)
    The URI (URL without domain) collected in Google Analytics will be: /
  • What's the title of the page we are collecting?
    Free Software Download and Tech News – File Plaza
  • What's the referrer sending the visit to fileplaza.com?
    google.co.uk (I am in the UK and have searched for “Fileplaza.com” in Google)
  • What's the Google Analytics UA code of the property in which FilePlaza is collecting its data?
    Actually, there are two: UA-48223160-1 & UA-23547102-20 (a different UA code in each screenshot)
  • Why are there two screenshots showing the same data but different UA codes?
    Because they are sending traffic to two GA properties.
  • Are they using Google Tag Manager?
    No, otherwise we would see GTM-XXXXXX

Now we know what is being sent. But is it correct?

This is quite simple to answer: what we see should match what we want to collect with GA.
Otherwise, there's something wrong. For example, if I perform a specific interaction that is supposed to be tracked with an event, but Charles doesn't show that event, it means the interaction is not sending the event. And that's something we should fix.

Let’s navigate a bit more …

1) I visit a product page: /home_education/teaching_tools/kid_pix_deluxe/


In the screenshots above, I see:

  • A pageview is being collected
  • The page being collected has changed
    It is now http://www.fileplaza.com/home_education/teaching_tools/kid_pix_deluxe/
    So the URI in Google Analytics will be /home_education/teaching_tools/kid_pix_deluxe/
  • The data is again being sent to the two properties.

2) I click on the ‘download’ button, which leads me to a download page: windows/home___education/teaching_tools/download/kid_pix_deluxe/

3) I click to download a software title:

Oh! Now there's something new… It looks like my click to download has triggered an event (first red arrow) that is sending the data below to the two GA properties.

1- Event category (ec): descarga
2- Event action (ea): final (I assume this is the button type, since it's the ‘final’ button to download)
3- Event label (el): Kid Pix Deluxe (the name of the software I have just downloaded)

As Fileplaza is a downloads marketplace, the download buttons are key. What Charles shows is that they use events to measure the success of this specific user interaction.

Let's summarise what Charles is showing:

  1. Fileplaza measures the success of downloads using events.
    Do they do it correctly? Yes
  2. The events contain relevant data about the interaction: the action performed by the user (download), the button clicked and the name of the software.
    Do they do it correctly? Yes
  3. For some reason, FilePlaza wants to collect the data in two different GA properties.
    Do they do it correctly? Yes

I like Charles, but which tool you use is the least important thing here. I am quite tool-agnostic, and there are others that are also very good, like WASP (recommended for a data layer), Google Analytics Debugger or Dataslayer (recommended for ecommerce). Normally I work in the browser, where you can install plugins for most tools like Google Analytics or Adobe Analytics.


 

Cohort analysis with Adobe Analytics (Spanish version)

Cohort analysis brings a long-term focus to data analysis, instead of settling for the short term with no strategic vision.

In the post I explain why this approach is important and how to carry it out with Adobe Analytics. I published the original, in English, on this same blog. For the Spanish version, Carlos Lebron has once again kindly lent me his blog www.AnalísisWeb.es

You can read it by clicking the link Análisis de Cohortes con Adobe Analytics

Adobe Analytics – Cohort Analysis

I heard for the first time about using Cohort Analysis in Adobe Analytics during the talk “The Chef’s Table” from Ben Gaines at Adobe Summit EMEA in London last May (2016). Ben explained that Cohort Analysis was one of the cool things coming with Workspace.

I immediately thought that it’s an “ingredient” that should be present in any analyst’s table in which meaningful insights are to be prepared.

What is Cohort Analysis?

Wikipedia says that “Cohort analysis is a subset of Behavioral Analytics that takes the data from a given dataset”.

To put it clearly and adapt the definition to our context, I will say that a cohort is a group of users who performed a specific action at the same time.

For example: users who came to our site from a PPC campaign and created an account in the first week of May. That's the cohort, and cohort analysis will let us dig into, say, the number of purchase orders generated by that cohort during the next ten weeks.

It will enable us to segment a bit further very easily, and learn some characteristics of the users who actually purchased (what do they have in common?) and those who didn't (again, what do they have in common?).

Why is it important?

By looking at conversions / user behaviour over time, cohort analysis helps us understand the long-term relationship with our users or customers more easily.

We can use Cohort Analysis for business questions like:
– How effective was a PPC campaign aimed at new accounts in terms of orders over time?
– What is the purchasing loyalty for a specific product category?
– Which segments (cohorts) generate more revenue over time?
– What engagement, in terms of returning visits, is generated by a specific action?

We can now easily evaluate the impact and effectiveness of specific actions (campaigns, etc.) on user engagement, conversion, content retention, etc.

And last but not least, we can apply segments, so we can focus only on a section, referrer, device etc. that is key for us.

How can Cohort Analysis be done in Workspace?

Just go to Workspace and select a project. Then select “Visualizations” and drag “Cohort Table” into a “Freeform Table”.

A table will appear containing three elements:

  • Granularity

Day, Week, Month etc.

  • Inclusion Metric

The metric that places a user in a cohort.
For example, if I choose Create Account, only users who have created an account during the time range of the cohort analysis will be included in the initial cohorts.

  • Return Metric

The metric that indicates the user has been retained.
For example, if I choose Orders, only users who performed an order after the period in which they were added to a cohort will be represented as retained.

The inclusion and return metrics can be the same, for example Orders, to watch purchasing loyalty.


Cohort Analysis in place

I am changing the example to just visits. Among the users who visited us on a specific day, how many returned during the following days? (Remember that this question can be asked about orders or any other metric.)
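Conceptually, the table that answers this question groups users by the day of their first visit (the inclusion metric) and counts who comes back on each later day (the return metric). A toy Python version with an invented visit log:

```python
from collections import defaultdict

# Toy visit log as (user, day) pairs; the day-0 cohort is everyone first seen on day 0.
visits = [("ana", 0), ("bob", 0), ("cal", 0), ("ana", 1), ("dee", 1),
          ("ana", 2), ("bob", 2), ("dee", 2)]

# First day each user was seen, i.e. the cohort they belong to.
first_seen = {}
for user, day in sorted(visits, key=lambda v: v[1]):
    first_seen.setdefault(user, day)

# cohort_table[cohort_day][offset] = distinct users returning `offset` days later.
cohort_table = defaultdict(lambda: defaultdict(set))
for user, day in visits:
    cohort_table[first_seen[user]][day - first_seen[user]].add(user)

for cohort_day in sorted(cohort_table):
    row = {offset: len(users) for offset, users in sorted(cohort_table[cohort_day].items())}
    print(f"day {cohort_day} cohort: {row}")
# day 0 cohort: {0: 3, 1: 1, 2: 2}
# day 1 cohort: {0: 1, 1: 1}
```

Workspace does all of this for you, of course; the sketch is only meant to show what each cell of the cohort table represents.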

For the question above, I have the table below.

Each cell shows a brighter or softer green colour depending on whether its number is bigger or smaller. And we can be “curious” about both groups: dig into what seems to be working well (why? who?) and, likewise, into what seems not to be working as expected.

To find out what the users behind a specific cell we want to analyse further have in common, just right-click on the cell and the segment builder will open, containing the cohort.

Save the segment and dig in, looking at the entry page, referrer, devices, etc. of this specific cohort (which, in the example, we are not engaging), to learn more about that specific type of user.

Conclusion

Cohort Analysis can help us analyse the long run. It's very common to analyse how many sales were generated by the different campaigns, and “that's all”. But Cohort Analysis can help us know what happened over time with customers who bought for the first time from campaign A versus campaign B, or who purchased product type A versus B.

And that´s an improvement 🙂

And you? How do you watch your key segments over time?


 

What to do when Google Analytics “no longer shows data”

The problem: Google Analytics stopped collecting data ‘today’.

Normally, we are told ‘we haven't changed anything, absolutely nothing!’ and / or ‘we can see that the tagging works’.

In the image below, we can see that Google Analytics shows no data after 8:00 am (of the day I took the screenshot).

The solution: find out what is going on, and how do we fix the problem?

The first thing to do is to identify the kind of problem we have:

  • A) Google Analytics is no longer collecting data.
    or
  • B) Google Analytics is no longer showing data.

Obviously this is not common, but it has to be said that it is possible for Google Analytics to simply not show the data. It has happened to me several times, because Google Analytics may keep collecting data (as usual) but stop showing the most recent data (today's).

What can we do to find out which problem we have, A or B?

We apply a filter.

That is, we force Google Analytics “to think”. If Google Analytics has collected the data correctly, it will show it to us. In the image we can see that once the filter is applied, today's data ‘suddenly’ appears, both for all traffic and for the segment we have just applied.

If our website is an e-commerce site, we can also click there.

As we can see in the image above, the e-commerce data appears as well, again for ‘today’ (the day I took the screenshot) and the previous days. So, when we force Google Analytics to think, everything falls back into place.

We can also look at the ‘real time’ data and check whether there is any data there. It is advisable to check again the day after the issue, or a few days later. When this problem happened to me, this little trick was enough to fix it.

However, if we apply the filter and the data never comes back, then we have a different and much more serious problem. We will have to look at the tagging and the data collection. I will talk about this topic soon.

To finish, I will say that I am a francophile but not a francophone. Since I left Paris I have kept reading Le Monde Diplomatique and, in general, doing other things in French 🙂 But not necessarily writing, so I hope there were no big grammar or vocabulary mistakes in the original French version of this post. Excuse my French, stp 🙂

Why I like juggling, and why you, as a Digital Analyst, should too

Even if it may sound a bit weird, analytics and circus have more in common than you would expect. At least in terms of mindset and skills to develop, I really see some interesting similarities. Probably the same thing happens with music. Unfortunately, I cannot play any instruments, but apparently I have the looks 🙂

I like juggling and analytics. And everything that surrounds them as well.

When I juggle or, say, track something complex for the first time, I can do things that at one point in the past I thought I would never be able to do. It's mainly about practice and eagerness to learn.

When I juggle (or do analytics) I can always, and it really is always, learn something new and face a new challenge: new problems, or situations I have not faced before. And this nurtures, among other things, the wish to exceed oneself, curiosity, and creativity in one's approach. At some point, dealing with new things becomes part of everyday life.

There isn't just one right way to juggle in terms of posture, the way you open your hands, etc. Same thing in analytics. There are different alternatives and approaches depending on what you are doing at a specific moment. I have learnt to be flexible depending on the context, with a focus on solving the problem/trick.

Juggling is a social thing. Most jugglers are pretty easygoing and sociable. Juggling lets you meet and interact with like-minded people, with whom you normally share more things.
Same thing in analytics, assuming you regularly attend the conferences or events for our community.

The balls fall down on me all the time. And that happens because I am constantly pushing myself, trying new stuff out of my comfort zone. But in the end I manage to learn and do the trick correctly, and each time more quickly. The willingness to progress and to do more complex things well is what defines your progress. And the same happens in analytics.

Attending a juggling convention is fun. Attending Measure Bowling, MeasureCamp or the ‘celebration’ at Adobe Summit is fun as well. Same thing with some meetups. I have always had a good time and also had the chance to meet new people, some of whom later became friends.


Anomaly Detection – Why did my conversions go up or down?

Anomaly Detection (not to be confused with simple spikes or drops) is one of the new features offered by Adobe Analytics, and it provides a statistical method for describing how a metric has changed over a period of time.

Carlos Lebron has published the Spanish version of this post on his blog Analísis Web, and you can read it by clicking the link

¿Por qué han subido o bajado mis conversiones?