An analogy about the differences and similarities between a digital analyst and a data scientist

Introduction

I have been doing analytics for a few years now, and I have made some friends who are also analysts. However, hardly anyone among my friends from my hometown, uni, etc. does something similar. Data science is a popular topic these days, and that’s why I am quite often asked whether I am a data scientist, or even why I am not one.

I always explain it using an analogy that everybody can understand. I have also discussed it with some fellow analysts, who agree with me. So I have finally decided to write about it.

Let’s think of two musicians: a violinist and a pianist.

Similarities

Both a violinist and a pianist are basically musicians. Thus, from the outside, they may seem to be doing the same thing.

Probably, they also have:

  • A similar mindset and the special creativity needed for an activity like creating music.
  • Years spent attending a music school.
  • A taste for continuous learning, always pushing their own boundaries.
  • The same talents, like identifying individual pitches by name.
  • The same basic hard skills, like music theory and reading / writing music.
  • A can-do approach to work, knowing that practice is the key to mastery.
  • The ability to play several instruments, even from different instrument families.
  • The possibility of easily learning a new instrument that will open up professional alternatives: learning the viola if you already play the violin, playing the viola with the hands if you already play it with the bow, moving into electronic music with a keyboard if you already play the piano, etc.

And as the Spanish musician Enrique Bunbury says, “we all suffer from the same ‘pain’”.

But still …
A violinist is a violinist. And a pianist is a pianist.

No violinist is going to think that a pianist is that similar to him. He will see himself as closer to other musicians within the violin family, like viola or double bass players, and only similar to pianists if we talk about music in a wider sense.

Also, try asking a violinist why he does not play the piano, or whether he plans to. The answer in most cases will be something like “because I am a violinist” or “my learning goals are getting better at the violin / learning the viola”.

Now just change the words violinist & pianist for digital analyst & data scientist. And change musician for data geek.

Closing thoughts

As a digital analyst, I know how to create insights (music) using certain tools like Adobe Analytics (i.e. the violin). Actually, more than one tool type, like tagging solutions (the bass violin). I can also do data QA and debugging (tune the instrument). I can write tracking requirements (write music). I understand, by looking at the console, what others have done (read music). I also know many other analysts from other companies (violinists from other orchestras), etc.

However, that does not mean that I am almost the same as a data scientist (i.e. the pianist). Nor that I would necessarily be good at it, at least in the short term, or that I plan to become one. Well, let’s see.

 

How to make sure your whole site is tracked, and tracked only once

Introduction

It’s very important that we track our site properly. However, it may happen that:
a) we are not tracking a part of it
or
b) we are tracking it twice.

Tracking, yes, but just once, please. Let’s see in this post how to detect a possible issue and how to fix it.

The scenario in which this issue can easily happen is when we migrate from hardcoded tracking to using a tagging solution.

Let’s see it with an example: a migration from hardcoded Google Analytics (GA) to GA via Google Tag Manager (GTM). After the migration, we need to make sure that:

  • The tracking code is included on every page
  • There are no duplicated codes (GA & GTM at the same time)
    Otherwise we will be counting everything twice wherever both codes are in place for the same page loads or user interactions.

How can we make sure we do it correctly?

We can use tools like Screaming Frog, which is widely used for SEO purposes. But we digital analysts can also use it. In fact, it’s recommended to do so whenever we perform an analytics audit or take on a new client / project.

Just go and download the tool from their website. The price is just 149 pounds per year (less than 200 euros), which is fairly cheap for the value it offers. Just think that the cost of a goofy analytics implementation can be estimated at more than that…

There’s also a free version, but it can only check 500 URLs. Depending on the size of your site, it may work for you.

How to start using Screaming Frog

If you have the paid version, once you open the tool, click on “Licence” and enter the keys. And then:

1) Customizing the configuration

This is recommended for two reasons:

  • Avoid stuff we don’t need
    It’s always distracting and makes the check slower. For example, images or CSS.
  • Subdomains are not included by default
    If your site has subdomains, we should select them.

[Screenshot: Screaming Frog configuration options]

2) Using the filters to see whether the GA & GTM codes are included or not

We can filter pages “containing” and “not containing” a given code. And we should check both options, for both GA & GTM.

The ideal result of this is:

  • Contains GA: no results
    Hardcoded GA has been removed from all pages
  • Does not contain GA: all pages
    Hardcoded GA has been removed from all pages
  • Contains GTM: all pages
    GTM is in the whole site
  • Does not contain GTM: no results
    GTM is in the whole site

The most important filter is “Does not contain GTM” (to detect immediately if some pages don’t have GTM). But as explained before, we also need to make sure pages don’t contain GA & GTM at the same time. “Does not contain GA” is not really necessary.

 

Once the filters are ready, we just need to enter the domain name and click on “Start”.
[Screenshot: starting the crawl]

Wait until it reaches 100% to look at the results.

Click on “Custom” (blue arrow) and select the filter you want to apply (red arrow); remember, it’s about containing or not containing a specific code.

Now you can get a list of all the pages on your site (including subdomains, if you included them) matching the condition applied in the filter. This list can be exported to Excel.

3) Detecting tracking issues and fixing them.

  • Pages without GTM code

This one is easy. We just need to use the “not containing” GTM filter.
If everything was done properly, we will get no results here.

– Next step if there’s something wrong: make sure you include the GTM code on these pages as well. And then check again.

  • Pages having GA & GTM at the same time

You need to select “containing” for both Google Analytics & Google Tag Manager, then export both lists to Excel, put everything together and highlight “duplicated values” to get the list of pages with duplication issues.

– Next step if there’s something wrong: remove one of the two codes. If we are migrating from hardcoded to GTM, then the hardcoded one should be removed. And then check again.
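
If you prefer to script this check, or want a second opinion on the crawler’s output, the same idea fits in a few lines of Python. This is a minimal sketch, assuming the requests library; the URLs and the “UA-” / “GTM-” markers are illustrative, so adapt them to your own site and container IDs:

```python
import requests

# Pages to check; in practice, feed this list from a sitemap or crawl export.
urls = [
    "https://www.example.com/",
    "https://www.example.com/products/",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    has_ga = "google-analytics.com/analytics.js" in html or "UA-" in html
    has_gtm = "GTM-" in html

    if not has_gtm:
        print(f"{url}: missing GTM container")          # part of the site untracked
    if has_ga and has_gtm:
        print(f"{url}: hardcoded GA and GTM together")  # risk of double counting
```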

And you? How do you make sure your site is well tracked and every page is included just once?

Any idea? Any comment? Any complaint? Leave your comment and I will get back to you. You can also contact me via email at geekeandoenanalytics@gmail.com or through my LinkedIn and Twitter profiles.

Cohort Analysis with Adobe Analytics

It was during Ben Gaines’s talk “The Chef’s Table” (Adobe Summit EMEA 2016) that I heard about cohort analysis for the first time. I immediately thought that this type of analysis is a key ingredient on a table where tasty insights are being “cooked”.

What is cohort analysis?

A cohort analysis / study is a statistical study in which the subjects forming the groups under study (the cohorts) are selected depending on the presence of a specific characteristic.

In other words, and to give a bit more context, we can say that a cohort is a group of users (or customers, etc.) who performed a specific action at a given moment or during a given period of time.

For example, the users who visit us thanks to an AdWords campaign and who created an account during the first days of month “x”. That lets us dig into, for example, the sales generated by that cohort during the following ten weeks.

This will let us segment more easily the characteristics of the users who made that purchase (what do they have in common?) and of those who didn’t (again, what do they have in common?).

Why is it important?

Because if we follow conversions and behaviour with cohort analysis, we can understand more easily the long-term relationship with our customers or visitors.

We can ask ourselves business questions, such as:
A) How effective was a PPC campaign that aimed at account creation over a given period of time?
We have to bear in mind that a user may create an account and never return, because he created it for a specific reason (like a promotion), return without converting, or return and convert.

B) What is the purchasing loyalty across the different product categories?

C) Do customers who bought product “x” usually go on to buy product “y” afterwards?

D) Which segments generate more profit in the long term?

E) What is the “engagement”, in terms of recurring visits, generated by our marketing actions?

etc.

And we can (and should) also segment. That is, we can focus on one section of our site, one traffic source or one device. I will show what to do with Adobe Analytics (but Google could be used too).

How to build a cohort analysis with Adobe Analytics Workspace?

Open Workspace, select “Visualizations” and click on “Cohort Table”.

[Screenshot: the Cohort Table visualization in Workspace]

We will then see a table containing three elements:

  • Granularity

Days, weeks, months, etc.

  • Inclusion Metric

This is the metric used to define the cohort.
For example, if we chose account creation, only the visitors who created an account will be included.

  • Return Metric

This is the metric indicating that the user has been “retained”.
For example, if we chose transactions, only those who bought something within a given period will be kept in the cohort.

Here it is also possible to segment a cohort, and include only what comes from Facebook, for example.

 

I will change the metric used in the example. Among the visitors who came on a given day, how many returned during the following days? We can build the table below.

[Screenshot: cohort table of returning visitors]

Each cell, depending on whether its values are bigger or smaller, shows a stronger or softer shade of green. Obviously, we can be “curious” about both ends: about what works well, to learn what these visitors have in common, understand it and reinforce it; and about what doesn’t work, in order to correct it.

To discover what the visitors behind a cell have in common, it’s enough to right-click on it, and the segment manager tool will open, already including the cohort.

[Screenshot: opening the segment manager from a cohort cell]

Afterwards, we keep this segment to dig deeper: landing pages, traffic sources, devices, new visitors or not, etc.

Conclusion

Cohort analysis helps us analyse the long-term relationship with our customers / visitors. It’s very common to analyse the sales of a campaign and “that’s it”. That’s why this type of analysis helps us learn about what happens afterwards.
And I believe that’s an improvement 🙂

Why revenue-oriented metrics are the ultimate KPIs, and always need to be present when talking about conversion rate

Introduction

I have been asked several times to create “dashboards” (mind the quotes) containing powerless metrics. Powerless because the client has not thought about the business question(s) behind the dashboard in the right context, so the data can easily be misinterpreted.

Context is that important. For example, if I don’t know the impact of an improvement in the conversion rate (CR) on revenue, then I cannot know whether that’s good or bad news. In theory more CR leads to more revenue, but that’s not necessarily true. That’s why CR should go together with a revenue metric, which can be revenue itself or at least something like average order value.

I think it was at Congreso Web, in Zaragoza, where I heard one of the best analysts I know (Xavier Colomes) say that he can “believe” any excuse we analysts claim to justify our lack of action. But there is an exception: he cannot believe that your boss does not want to make more money (more revenue). And I agree with him.

Why is revenue that important?

Because it can tell you whether the business (or business unit, etc.) is improving over time or not. No other metric is that powerful. Especially not CR. I have seen the conversion rate going up while revenue went down, and the other way around, too many times.

Revenue gives you the right context to analyse conversion rate.

The explanation is simple. If you sell, say, theatre tickets, customers normally buy two tickets (a couple) or, let’s say, four (a group of friends). Always one transaction, but units and AOV (average order value) are going to be twice as big in the second case.

It also happens when you sell, say, an offer for a cheap meal that produces an improvement in transactions and CR of, say, 10%, while in terms of AOV or revenue it could be just 2%, if that product is way cheaper than your average. Still good, as any improvement is, but the spike is not as good as we would think if we only looked at the CR.
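
A quick worked example makes the gap visible. This is a minimal sketch following the cheap-meal scenario above; all the numbers are made up for illustration:

```python
# Baseline month (illustrative numbers)
visits = 10_000
transactions = 500                     # CR = 5%
revenue = transactions * 40.0          # 20,000 at a 40-euro average order

# A cheap-meal offer lifts transactions by 10%,
# but the 50 extra orders are only 8 euros each.
new_transactions = 550                 # CR = 5.5%
new_revenue = revenue + 50 * 8.0       # 20,400

print(f"CR change:      {new_transactions / transactions - 1:+.0%}")  # +10%
print(f"Revenue change: {new_revenue / revenue - 1:+.0%}")            # +2%
```

Both numbers are “up”, but reporting the CR spike alone would overstate the win by a factor of five.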

This does not have to happen in every company. I don’t think anyone buys six pairs of boots in different colours at the same time. But units (the number of products being sold) is a very important metric in some industries.

The key idea is that focusing on just improving the conversion rate (without any context on the real impact on the business) is like focusing on just improving the bounce rate.

Why should we work to optimize the revenue and not the conversion rate?

Optimizing the conversion rate should be a means to an end, not the end itself. A means to improve revenue. Ask your CEO if in doubt 🙂

Some products (or categories, or packages, etc.) may have smaller conversion rates but generate more revenue. We should identify them by segmenting our data, and then try to improve the conversion rate in those specific segments.

That’s why I think that when we create a dashboard or talk about CR, we should look at the evolution of the CR together with its impact on the final goal of the company (that is, making money).

I am not a football fan, but I will use football for a clear example.

What’s the ultimate “conversion” KPI in football?

Goals. Full stop.

What micro conversions lead to more conversions, i.e. goals?
Let’s mention a few key football metrics: corner kicks, shots on target, ball possession, dangerous attacks, avg. kilometres per player, etc.

[Screenshot: match statistics for both teams]
Passes are very important, and as a KPI they can be segmented

[Screenshot: passing statistics by team]

The metrics above may make us think that there wasn’t a big difference between the two teams. Now let’s take a look at goals, which is the ultimate “Key Football Metric”.

Something to say? 🙂

Back to Analytics

It would have been grotesque (even more so… :) if, after the match, Brazil had given any importance to the total shots metric, or claimed that their players ran more kilometres than in the previous match, so there’s a positive trend there…

Nobody cares if you have improved your “shots on target” rate if you lose 7-1. And nobody cares about an improvement in <insert your fave metric here> if the revenue goes down. Especially the CR.

Data needs context. And part of the context of a micro conversion is how the macro conversion is affected. Thus, part of the context for, say, a customer retention metric is how revenue is affected. Same thing in a football match: a metric like dangerous attacks needs to be related to goals. Anything else is pointless and may make us think that something is going well when actually it’s not (or the other way around).

We should care about revenue and a few KPIs, like CR. But these KPIs need to be segmented and measured together with their impact on revenue.

Two benefits of giving special treatment to revenue are:
– We can know very quickly whether we are performing better or not as a business
And then segment and look at the other KPIs in order to understand why, and which groups of customers / products we should focus on
– We will catch & keep the attention of the stakeholders more easily
We are talking about what they care about. The language of money is always understood.

Last thought

This same idea should apply when we focus our analysis on a specific part of the funnel. We should look at the CR, or at the next step, to get the whole context.

It may happen that the transition rate (TR) from step A to step B works better for a specific product, device, etc., but the conversion rate (CR) is worse. Or the other way around. And that’s something we need to know.

First steps with the tagging solution Signal (formerly BrightTag)

There are three basic concepts that you need to understand to start working with Signal: data elements, inputs and outputs.

  • The data elements are the variables we want to collect
    The set of data elements is the Data Dictionary
  • The inputs are what we want to know about the user and their visit to answer our business questions (in plain English)
    They can be based on page loads or user interactions, and can be for a web browser or an app
  • The outputs are the tags we use to send data to digital analytics tools, like Google or Adobe Analytics

Let’s see how they work 🙂

1) Data Elements

The data elements are the variables: the data we want to get from our web or app and send to our analytics tool. Just what other tools dub “variables”.

The way to define them and create the tagging map is what Signal calls Data Binding.
That’s the way for us to tell Signal: in this input, you have to collect these data elements (variables).

For example, for every page load (that is, an input) we want to collect (among other things):
– The name of the page.
– The page primary category.
– The page subcategory.

So, in the Data Dictionary, we have to create a data element for each of these variables.
[Screenshot: data elements in the Data Dictionary]

2) Inputs

The inputs are what we want to know / get about the navigation & behaviour of our users: the pages being loaded, or the key interactions during the visit.

For example:
– I want to know that the product pages are being loaded
> I create an input “product pages”

– I want to know that, as the page loads, we collect the product name, category and subcategory
> I create a data element for each (previous example) and include them within the input “product pages”

We should create an input for each type of page load or user interaction.
I mean that we need an input for product pages, another for the home page, another for the checkout page and so on. The reason is that some data elements are common to every page type (e.g. page name), but some apply only to a specific type or to some of them, not all (e.g. product name is something we want to collect on the product, category and checkout pages, but not on the homepage…).

To track user interactions, we just follow the same process as for page loads. If we want to know whether, for instance, users sign in or sign up (and both have two steps: first click + actual success), we would create an input for each of those interactions.

[Screenshot: inputs in Signal]

3) Outputs

Keeping in mind what has been explained so far, Signal already knows which actions we want to collect (product page loads, sign-ups, etc.) and what we want to know about them (the name of the product, the category, etc.).

The third step is actually sending the data to the analytics tool we use. To do so, we need to create an output for each input, specifying the vendor (in this example, Adobe Analytics).

[Screenshot: outputs in Signal]

 

Last idea

There are two steps:
– Signal getting data from the browser or app (data elements & inputs)
&
– Signal sending the data to the analytics tool (outputs)

Either step can be the reason for not having data in the analytics tool, so we need to keep this idea in mind in case we detect data collection issues during the QA process.
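
As a purely conceptual sketch (Signal is configured through its UI, and every name below is hypothetical), the relationship between the three concepts, and the QA idea above, can be modelled like this:

```python
# Data Dictionary: the variables we want to collect (hypothetical names)
data_dictionary = {"page_name", "page_category", "product_name"}

# Inputs: one per page type or interaction, each bound to data elements
inputs = {
    "product pages": {"page_name", "page_category", "product_name"},
    "home page": {"page_name"},
    "sign up success": {"page_name"},
}

# Outputs: one per input, specifying the vendor tag receiving the data
outputs = {name: "Adobe Analytics" for name in inputs}

# QA check: every bound element must exist in the Data Dictionary,
# and every input needs an output, or data will never reach the tool.
for name, elements in inputs.items():
    assert elements <= data_dictionary, f"unknown data element in {name}"
    assert name in outputs, f"input '{name}' has no output"
```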

Also, keep in mind that each data element needs its matching input and output in place. I will explain these three concepts and other Signal functionalities in more detail in another post.

Any idea? Any comment? Any complaint? Leave your comment and I will get back to you. You can also contact me via email at geekeandoenanalytics@gmail.com or through my LinkedIn and Twitter profiles.

 

Data QA with Charles Debugging Proxy (basic level)

Digital Analytics is meant to transform data into actionable insights. Sure, you have heard it a million times. However, if the data collection has not been previously audited (so we can validate it), we may be taking decisions based on data that is just wrong.

This first and necessary step is forgotten in many cases, but it should be one of the geeky obsessions and priorities in every analytics project. It’s something we should use to make sure we properly collect the data we want. But it can also be used to check what other companies track, e.g. the side filters the user may apply when looking for some content.

1 – Tag Assistant: check easily and quickly if there’s something wrong in your Google Analytics implementation

You just need to install the Google Tag Assistant plugin for Chrome (developed by Google itself) and click on the blue label appearing in the corner of the browser. This label will be red if any error is detected, green if everything is OK and blue if there are things that could be improved.

Let’s see an example:

[Screenshot: Tag Assistant reporting an issue]

Oh, it seems that the GA code has been placed outside the <head> tag. This means that the tracking beacon might not send data before the user leaves the page, with an obvious negative impact on the quality of the data we are collecting. To fix it, you just need to click on ‘more info’ and follow the recommendations from Google Analytics Support. In this case, the solution would be just pasting the GA code inside the head tag.

Needless to say, the UA code shown by Tag Assistant and the one in your GA property should be the same.

Just by taking this little step, we have done a basic but important check to understand what’s wrong with our data collection and get a first idea of what we need to do to fix it.

2 – Debugging tools to check what we are actually sending to Google Analytics

My favourite tools for performing a basic analysis are Fiddler and Charles. I will use Charles in this example. You just need to download it (it’s free), open the browser and visit the website you want to audit.

Then, in Charles, you need to look for “google-analytics.com” in the menu on the left (Structure or Sequence view), and click on “Request” on the right.

Let’s see a basic example with a downloads marketplace called Fileplaza.

[Screenshots: Charles showing the pageview request]

What can we see in the screenshots above?

  • What is being collected?
    A pageview
  • What page is being collected?
    http://www.fileplaza.com (I have just visited the homepage)
    The URI (URL without domain) collected in Google Analytics will be: /
  • What’s the title of the page we are collecting?
    Free Software Download and Tech News – File Plaza
  • What’s the referrer sending the visit to fileplaza.com?
    google.co.uk (I am in the UK and have searched for “Fileplaza.com” in Google)
  • What’s the Google Analytics UA code of the property in which FilePlaza is collecting its data?
    Actually, there are two: UA-48223160-1 & UA-23547102-20 (a different UA code in each screenshot)
  • Why are there two screenshots showing the same data but different UA codes?
    Because they are sending traffic to two GA properties.
  • Are they using Google Tag Manager?
    No; otherwise we would see GTM-XXXXXX

Now we know what is being sent. But is it correct?

This is quite simple to answer: what we see should be the same as what we want to collect with GA.
Otherwise, there’s something wrong. For example, if I perform a specific interaction that should be tracked with an event, but Charles doesn’t show that event, it means the interaction is not collecting the event. And that’s something we should fix.

Let’s navigate a bit more …

1) I visit a product page: /home_education/teaching_tools/kid_pix_deluxe/

[Screenshots: Charles showing the product page request]

In the screenshots above, I see:

  • A pageview is being collected
  • The page being collected has changed
    Now it is http://www.fileplaza.com/home_education/teaching_tools/kid_pix_deluxe/
    So the URI in Google Analytics will be /home_education/teaching_tools/kid_pix_deluxe/
  • The data is again being sent to the two properties.

2) I click on the ‘download’ button, which leads me to a download page: windows/home___education/teaching_tools/download/kid_pix_deluxe/

3) I click to download a piece of software:

[Screenshots: Charles showing the event request]

Oh! Now there’s something new… It looks like my click to download has triggered an event (first red arrow) that is sending the data below to the two GA properties.

1 – Event category (ec): descarga
2 – Event action (ea): final (I assume this is the button type, since it’s the ‘final’ button to download)
3 – Event label (el): Kid Pix Deluxe (that’s the name of the software I have just downloaded)

Fileplaza is a marketplace of downloads, so the download buttons are key. What Charles shows is that they use events to measure the success of this specific user interaction.
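
When many hits pile up in Charles, decoding them by hand gets tedious. The query-string parameters follow Google’s Universal Analytics Measurement Protocol, so a few lines of Python can unpack any captured hit. The example hit below is rebuilt from the values observed above, purely for illustration:

```python
from urllib.parse import urlparse, parse_qs

# An event hit as Charles might capture it (illustrative, not the literal request)
hit = ("https://www.google-analytics.com/collect"
       "?v=1&tid=UA-48223160-1&t=event"
       "&ec=descarga&ea=final&el=Kid%20Pix%20Deluxe")

params = {k: v[0] for k, v in parse_qs(urlparse(hit).query).items()}

print(params["tid"])   # UA-48223160-1 -> which GA property receives the hit
print(params["t"])     # event -> hit type (pageview, event, ...)
print(params["ec"], params["ea"], params["el"])  # descarga final Kid Pix Deluxe
```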

Let’s summarise what Charles is showing:

  1. Fileplaza measures the success of downloads using events.
    Do they do it correctly? Yes
  2. The events contain relevant data about the interaction: the action being performed by the user (download), the button being clicked and the name of the software.
    Do they do it correctly? Yes
  3. For some reason, FilePlaza wants to collect the data in two different GA properties.
    Do they do it correctly? Yes

I like Charles, but which tool you use is the least important thing here. I am quite tool-agnostic, and there are others that are also very good, like WASP (recommended for a data layer), Google Analytics Debugger or Data Slayer (recommended for ecommerce). Normally I use the console, where you can install plugins for most tools, like Google Analytics or Adobe Analytics.

Any idea? Any comment? Any complaint? 🙂 Leave your comment and I will get back to you. You can also contact me via email at geekeandoenanalytics@gmail.com or through my LinkedIn and Twitter profiles.

 

Cohort analysis with Adobe Analytics (Spanish version)

Cohort analysis lets us give a long-term focus to data analysis, instead of settling for the short term with no strategic vision.

In the post I explain why this approach is important and how to carry it out with Adobe Analytics. The original, in English, has been published on this same blog. And for the Spanish version, Carlos Lebron has once again kindly lent me his blog www.AnalísisWeb.es

You can read it by clicking on the link Análisis de Cohortes con Adobe Analytics

Adobe Analytics – Cohort Analysis

I heard about using cohort analysis in Adobe Analytics for the first time during the talk “The Chef’s Table” by Ben Gaines at Adobe Summit EMEA in London last May (2016). Ben explained that cohort analysis was one of the cool things coming with Workspace.

I immediately thought that it’s an “ingredient” that should be present on any analyst’s table where meaningful insights are to be prepared.

What is Cohort Analysis?

Wikipedia says that “Cohort analysis is a subset of Behavioral Analytics that takes the data from a given dataset”.

To put it clearly and adapt the definition to this context, I will say that a cohort is a group of users who performed a specific action at the same time.

For example: users who came to our site from a PPC campaign and created an account in the first week of May. That’s the cohort, and the cohort analysis will let us dig into, for example, the number of purchase orders generated by that cohort during the next ten weeks.

It will enable us to segment a bit further very easily, and learn some characteristics of those users who actually purchased (what do they have in common?) and those who didn’t (again, what do they have in common?).

Why is it important?

Looking at conversions / user behaviour over time, cohort analysis helps us understand more easily the long-term relationship with our users or customers.

We can use Cohort Analysis for business questions like:
– How effective was a PPC campaign aiming for new accounts in terms of orders over time?
– How is the purchasing loyalty for a specific product category?
– What segments (cohorts) generate more revenue over time?
– How is the engagement in terms of returning visits generated by a specific action?

We can then easily evaluate the impact and effectiveness of specific actions (or campaigns, etc.) on user engagement, conversion, content retention, etc.

And last but not least, we can apply segments, so we can focus only on a section, referrer, device etc. that is key for us.

How can cohort analysis be done in Workspace?

Just go to Workspace and select a project. Then select “Visualizations” and drag “Cohort Table” into a “Freeform Table”.

[Screenshot: dragging the Cohort Table into a project]
A table will appear containing three elements:

  • Granularity

Day, Week, Month etc.

  • Inclusion Metric

The metric that places a user in a cohort.
For example, if I choose Create Account, only users who have created an account during the time range of the cohort analysis will be included in the initial cohorts.

  • Return Metric

The metric that indicates the user has been retained.
For example, if I choose Orders, only users who performed an order after the period in which they were added to a cohort will be represented as retained.

The inclusion and return metrics can be the same, for example Orders, to watch purchasing loyalty.


Cohort Analysis in place

I am changing the example to just visits. Among the users who visited us on a specific day, how many of them returned during the following days? (Remember that this question could be about orders or any other metric/s.)

For the question above, I get the table below.

[Screenshot: cohort table of daily returning visitors]

Every cell, depending on whether its numbers are bigger or smaller, shows a brighter or softer green colour. And we can be “curious” about both groups: what seems to be working well, to dig into the “why” and the “who” (and reinforce it), and likewise what seems not to be working as expected.
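
Workspace builds this table for you, but the underlying mechanics are easy to reproduce. Here is a minimal pandas sketch with made-up data that computes the same kind of day-based cohort table, using visits as both the inclusion and the return metric:

```python
import pandas as pd

# Illustrative visit log: one row per (visitor, day) with a visit
visits = pd.DataFrame({
    "visitor": ["a", "a", "b", "b", "b", "c"],
    "date": pd.to_datetime(["2016-05-01", "2016-05-02", "2016-05-01",
                            "2016-05-03", "2016-05-04", "2016-05-02"]),
})

# Inclusion: the cohort is the first day each visitor appeared
visits["cohort"] = visits.groupby("visitor")["date"].transform("min")
# Return: days elapsed since the cohort day
visits["day"] = (visits["date"] - visits["cohort"]).dt.days

# Rows = cohorts, columns = days since inclusion, values = unique visitors
table = visits.pivot_table(index="cohort", columns="day",
                           values="visitor", aggfunc="nunique")
print(table)
```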

To find out what the users behind a specific cell we want to analyse further have in common, just right-click on that cell and the segment manager tool will open, already containing the cohort.

[Screenshot: the segment builder opening with the cohort included]

Save the segment and dig in, looking at the entry pages, referrers, devices, etc. of this specific cohort (which, in this example, we are failing to engage), to learn more about that specific type of user.

Conclusion

Cohort analysis can help us analyse the long run. It’s very common to analyse how many sales were generated by the different campaigns and “that’s all”. But cohort analysis can help us learn what happened over time with customers who bought for the first time from campaign A versus campaign B, or who purchased product type A or B.

And that’s an improvement 🙂

And you? How do you watch your key segments over time?

Any idea? Any comment? Any complaint? 🙂 Leave your comment and I will get back to you. You can also contact me via email at geekeandoenanalytics@gmail.com or through my LinkedIn and Twitter profiles.