AI Customer Voice Analysis • Leveraging AI to Unlock Insights Into User Behavior

  • Client

    Starbucks

  • Solutions

    Data, Mobile Apps, Platform, AI & Emerging Technology Consulting, Consumer Insights & Activation


Optimizing customer loyalty with innovative solutions.

To become one of the most beloved coffee brands in the world, you need a profound understanding of your customers, their preferences, behaviors and expectations. Over our years-long partnership with Starbucks, we’ve helped the brand develop a system that decodes how people interact with their loyalty app and its features—ultimately creating a user experience that nurtures that strong connection with customers.


Leveraging AI and data analytics to improve customer satisfaction.

One way we were able to generate customer insights is through our bespoke AI Customer Voice Analysis solution, a product developed by our data science and experimentation teams in EMEA. When it comes to their digital products, many brands question whether investing in innovation actually results in improved customer satisfaction and loyalty. To determine this based on data rather than assumptions, we analyzed not just GA4 but also customer voice, UX, conversion, native app experience, Mobile Order & Pay (MOP) adoption and retention data.

This approach enabled us to pinpoint the primary user frustrations and complaints that were impacting Starbucks’ native loyalty apps in four key markets. Furthermore, we identified which app features were disrupting the customer experience and hindering users from adopting the mobile ordering and payment capabilities within the app.
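The kind of review mining behind a Customer Voice Analysis can be sketched with a minimal lexicon-based sentiment scorer. The lexicon weights and review texts below are invented for illustration only; the production solution relies on far richer AI models.

```python
# Minimal lexicon-based sentiment scoring over app reviews.
# Lexicon weights and review texts are invented for this sketch; a real
# customer voice pipeline would use trained sentiment models instead.

LEXICON = {
    "love": 2, "great": 2, "easy": 1,
    "slow": -1, "crashing": -2, "confusing": -2,
}

def score(review: str) -> int:
    """Sum lexicon weights over the lowercased, punctuation-stripped words."""
    return sum(LEXICON.get(w.strip(".,!?"), 0) for w in review.lower().split())

reviews = [
    "Love the app, ordering ahead is so easy!",
    "App keeps crashing on checkout, password reset is confusing.",
]
scores = [score(r) for r in reviews]  # one aggregate score per review
```

Aggregating such scores by app feature or user journey is what lets frustration hot spots, like a broken password reset flow, surface from thousands of free-text reviews.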

Monk Thoughts: “Sentiment analysis is invaluable for monitoring brand perception, gauging customer satisfaction, and surfacing actionable insights from unstructured text data.” – Iuliana Jackson

Designing user-centric platforms that align with business objectives.

Equipped with this data and our thorough understanding of each team’s contributions, we supported Starbucks by creating an experimentation roadmap for user-centric product evolution that is deeply attuned to user feedback and the client’s business goals. Running the platform for nine years across EMEA proved to be immensely valuable, providing us with comprehensive insights into the brand’s existing technical and organizational constraints.

The process involved creating and refining the UI to deliver the best possible user experience, covering online ordering, in-store transactions, digital receipts, tipping and more—all based on carefully optimized and mapped user journeys. In particular, we highlighted an urgent need for optimization of the password reset process and the Mobile Order & Pay (MOP) journey. Additionally, we supported the client with insights regarding their native app onboarding process, user retention and their loyalty program. This involved multiple workshops and presentations with key members from the Platforms team.


Empowering data-driven improvements for long-term success.

In addition to the experimentation roadmap, we further supported the brand with a newly developed and enhanced data collection mechanism crafted by our data architects. This enabled Starbucks to implement industry-standard best practices for app tracking.

The goal is to use data to proactively suggest improvements. With an AI engine that continuously analyzes user reviews across multiple markets, Starbucks can now conduct evidence-based experiments aimed at boosting customer lifetime value and increasing adoption of their native apps across EMEA—and soon, in other regions as well. Overall, the success stemmed from all team members uniting and speaking the same language: that of the users.

Want to talk data-informed platforms? Get in touch.


Can’t get enough? Here is some related work for you!

A Modeler’s View on Google's Meridian MMM Platform

Data maturity, Measurement, Media, Media Analytics • 3 min read

Written by
Michael Cross
EVP, Measurement


As a leading marketing transformation consultancy at the forefront of marketing analytics, we have taken a deep look into Google's latest offering: Meridian, their new Market Mix Modeling (MMM) tool.

Google's Meridian builds on the foundation of the previously released RBA/LMMM materials. New developments include the ability to ingest geo experiments into the modeling, as well as more detailed reach data for YouTube. The emphasis on triangulation via A/B testing to enhance MMM accuracy is a strategy we are well versed in ourselves, and it offers a good base to start from. However, it is crucial to note that while Meridian provides a step forward in measurement, it remains just a tool: a sophisticated one that requires expert hands to wield effectively.

At Media.Monks, we pride ourselves on our robust internal platform, which is industry-leading in terms of speed and functionality. Meridian offers a step up for brands who are just starting their MMM journey, helping them move away from last-click attribution to better quantify media uplifts.
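To make the "expert hands" point concrete, here is a toy sketch of two transformations at the heart of most MMMs: geometric adstock (carryover of media effect) and a saturation curve (diminishing returns). All parameter values below are invented for illustration, and they are exactly the kind of choices an experienced modeler must estimate and justify from data.

```python
# Toy media transformations used in most MMMs. The decay and half-saturation
# parameters are invented here; in practice they are estimated from data.

def adstock(spend, decay=0.5):
    """Geometric adstock: part of each period's effect carries into the next."""
    carried, out = 0.0, []
    for s in spend:
        carried = s + decay * carried
        out.append(carried)
    return out

def saturate(x, half_point=100.0):
    """Simple diminishing-returns curve: effect approaches 1 as spend grows."""
    return x / (x + half_point)

weekly_spend = [100, 0, 0, 50]
effect = [saturate(a) for a in adstock(weekly_spend)]
```

Note how the modeled effect persists after spend stops (carryover) and how each extra unit of spend buys less effect than the last, which is why naive last-click numbers and MMM outputs can disagree.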

Monk Thoughts: “At the end of the day, a model is only as good as its modeler: you can have the best model in the world, but if it's not fed with accurate, high-quality data or delivered clearly to key stakeholders, it's not going to be trusted (and therefore, adopted) in an organization.” – Michael Cross

From an experienced modeler’s perspective, these are some of the key points to consider with Meridian:

  • The methodology behind Meridian is solid, and its emphasis on triangulation makes sense, enhancing the accuracy of the results.
  • However, experienced econometricians will be essential for operating Meridian effectively in-house. Brands must ensure their teams possess the expertise to source the right data, build the models to reflect the real world, and translate data insights into actionable ROIs and response curves, or they risk making flawed decisions from the outputs.
  • As with all MMM initiatives, data quality remains a critical factor in whether you add value and make accurate decisions. Having accurate, complete data across all drivers of sales (media, price, promotions, seasonality, climate, etc.) is critical for MMM. Strong data foundations also give a significant advantage, whether brands are utilizing Meridian or any other technology.
  • Effective communication within organizations is key to driving traction and implementation of MMM strategies, and explaining models clearly and effectively is essential to any MMM's success.
  • The launch of Meridian represents a shift away from outdated attribution models towards a more accurate, incremental media valuation approach. Even if it isn’t the best-fit tool for all brands, it is another step in the industry’s maturation, especially in the wake of cookie deprecation and changing privacy legislation.
  • Smaller clients with simpler data structures, such as ecommerce clients spending less than $2 million USD on digital media, will benefit from this tool as an entry point to the world of MMM.
  • Some clients may question running their media measurement on a platform owned by a media owner.

In conclusion, Google's Meridian offers a solid starting point for less complex brands looking to enhance their measurement capabilities via a framework. Increasing the usage of MMM can only be good for the industry as a trusted tool to measure and optimize media. That being said, hard work is still needed in attracting econometric talent into the marketing world to maintain model accuracy and increase adoption of these methodologies. At the end of the day, a model is only as good as its modeler: you can have the best model in the world, but if it's not fed with accurate, high-quality data or delivered clearly to key stakeholders, it's not going to be trusted (and therefore, adopted) in an organization.

A good step forward, but still more to do on the talent front. See our post on apprenticeships to learn what we are doing to address this.

For more information on how we can help with your marketing effectiveness measurement or Market Mix Modelling, visit our Measurement page or contact us.


Navigating Consent Mode in GA4 & BigQuery

Data, Data Analytics, Data Privacy & Governance • 3 min read

Written by
Pedro Ginel & Brianna Mersey


In today’s climate of growing privacy litigation and fines, applying Consent Mode is a step toward staying in line with privacy compliance regulations. Join us as we explore two distinct approaches to Consent Mode—Basic and Advanced—and highlight the implications for data collection in Google Analytics 4 (GA4) and BigQuery.

Basic Mode: Compliant but at the cost of data collection.

Implementing Basic Consent Mode via GTM is a straightforward path to compliance: Google tags remain dormant and are not loaded until a user grants consent. While this expedites compliance efforts, it comes with a trade-off: data generated by unconsented users is not tracked in either BigQuery or GA4, and you will not receive modeled data in your GA4 property to fill in the gaps left by unconsented traffic to your website. Though efficient in meeting compliance requirements, Basic Mode sacrifices the depth of GA4 data utilization, impacting data tracking significantly compared to an Advanced Mode implementation. Clients often see data loss of 30% or more.

Advanced Mode: Unveil deeper insights responsibly.

Advanced Mode takes a more sophisticated approach, allowing Google tags to trigger even without user consent. However, it omits identifiable client data, such as the _ga cookie used by GA4 for identifying users by browser and device. The use of Advanced Mode impacts both BigQuery and GA4 in different ways, which we’ll dive into below.
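The collection difference between the two modes can be sketched as follows. The event shapes and consent outcomes below are invented for illustration; actual behavior depends on your tag setup and Google's processing.

```python
# Sketch of which events reach analytics under Basic vs. Advanced Consent Mode.
# Event fields and consent outcomes are invented for illustration only.

def collect(events, mode):
    collected = []
    for e in events:
        if e["consented"]:
            collected.append(e)  # full event, identifiers intact
        elif mode == "advanced":
            # Cookieless ping: the event fires, but identifying fields are dropped.
            collected.append({k: v for k, v in e.items() if k != "user_pseudo_id"})
        # Basic mode: unconsented users send nothing at all.
    return collected

events = [
    {"name": "page_view", "user_pseudo_id": "a", "consented": True},
    {"name": "page_view", "user_pseudo_id": "b", "consented": False},
    {"name": "purchase",  "user_pseudo_id": "c", "consented": False},
]
basic = collect(events, "basic")        # unconsented traffic is lost entirely
advanced = collect(events, "advanced")  # every event kept, minus identifiers
```

This is the crux of the trade-off: Basic Mode simply drops unconsented traffic, while Advanced Mode keeps the events but strips what identifies the user.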

BigQuery: Track unconsented events.

When using Advanced Mode in BigQuery, unconsented events are still tracked, but they lack certain parameters used to identify users. This becomes evident when attempting to calculate metrics like user count, because the absence of the _ga cookie in events means the user_pseudo_id value (used to help GA4 identify users and calculate user metrics) is missing, resulting in an underestimation of user count. While BigQuery captures all events, the exclusion of critical information affects the accuracy of reporting, particularly in metrics relying on unique identifiers.

This concern doesn’t apply if the user has authenticated and their user ID is sent to GA4. That data will then be sent to BigQuery.

In short, BigQuery retains all events, including unconsented ones. Unfortunately, missing information influences the reporting of metrics like user count, demanding a strategic approach in data analysis.

Based on experiments run with a custom GTM container and custom GA4 tags.
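The undercount mechanics can be sketched with a few invented rows standing in for the GA4 BigQuery export (in practice you would query the export tables directly):

```python
# Why naive user counts undercount under Advanced Consent Mode: unconsented
# events arrive without user_pseudo_id, so counting distinct ids misses those
# visitors. These rows are invented stand-ins for the BigQuery export.

rows = [
    {"event_name": "page_view", "user_pseudo_id": "u1"},
    {"event_name": "purchase",  "user_pseudo_id": "u1"},
    {"event_name": "page_view", "user_pseudo_id": "u2"},
    {"event_name": "page_view", "user_pseudo_id": None},  # unconsented ping
    {"event_name": "page_view", "user_pseudo_id": None},  # another unconsented visitor
]

total_events = len(rows)  # every event is retained
reported_users = len({r["user_pseudo_id"] for r in rows if r["user_pseudo_id"]})
# The visitors behind the None rows are invisible to the user count.
```

All five events survive, but a distinct count over `user_pseudo_id` reports only two users, which is exactly the gap a strategic approach to analysis (or GA4's modeling, below in the next section's sense) has to account for.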

GA4: Model metrics beyond consent.

When using Advanced Mode in GA4, you may notice an initial drop in metrics because unconsented users and their events are not reported. However, the innovative aspect of Advanced Consent Mode lies in its ability to model data: over time, Google analyzes both consented and unconsented traffic, then builds estimations of the relevant metrics. While this modeling occurs programmatically and beyond our control, it means GA4 users are not permanently stuck with these reporting limitations: metrics like user count, initially affected by the exclusion of unconsented data, become estimable through Google's modeling efforts.

GA4’s UI modeling will become active as soon as you implement Advanced Consent Mode. You don't necessarily need to use GTM for that; you can use any other tag manager or run it directly in your banner code.

Tip: To see modeled data in your reports, choose the Blended reporting identity, under Admin > Data display > Reporting Identity > Blended, otherwise select Observed to view strictly consented data. You may switch back and forth between options without impacting data collection.

Strike the Right Balance.

Dedicate time to implementing Advanced Consent Mode to prevent complete data loss on unconsented hits. This mode provides a nuanced solution for those ready to navigate the intricacies of unconsented data tracking. Additionally, selecting a Cookie Management Platform (CMP) is essential for managing the cookie consent banner and directing the consent management process that is initiated when visitors arrive on your website and choose to allow or deny cookies. As global regulations evolve, it becomes crucial to have robust, privacy-centric measurement solutions accessible to marketers worldwide.

And finally, before you start Advanced Consent Mode implementation, get your legal team onboard and discuss any possible ramifications of collecting cookieless pings from users who declined tracking.


Leveraging Data to Elevate Personalized Experiences—Insights From Salesforce, Google and Lenovo

Data Strategy & Advisory, Industry events • 4 min read

Written by
Ashley Musumeci
Global VP of Lifecycle Marketing & CRM

Brunching Up on Personalization at CES

From the spectacular Sphere to our AI-powered alien robot Wormhole chatting it up with the press, there were numerous showstoppers at this year’s Consumer Electronics Show (CES) in Las Vegas. Futuristic attractions aside, we were especially impressed by the minds behind all the astonishing technologies and the many inspiring conversations we had with our tech partners. For one, during our “Brunching Up on Personalization: A Tasty Discussion on Data Foundations” panel, we asked experts from Salesforce, Google and Lenovo how to bridge the gap between adtech and martech to create groundbreaking experiences. Their response? Data elevates your personalization efforts. Here’s what we learned.

Start building your data culture now. 

In the spirit of first things first, the speakers wasted no time in highlighting the importance of initiating and nurturing a strong data culture. Once again, the CMO profession is evolving—while the Mad Men heyday revolved around big creative ideas, this changed when the industry started implementing martech and focusing on precision. Now, it’s all about data. “The advertising landscape has changed so much that today’s marketing professionals are almost scientists,” said Google’s Global Brand Lead Felipe Gomes.


Monk Thoughts: “Successful companies, especially in tech and media, foster a strong data culture and weave that into everything they do.” – Ashley Musumeci

However, many brands find themselves a bit stifled by how overwhelming data can be. In that case, Salesforce’s Vice President Tech Industry Strategy Lauri Palmieri argues that it’s critical to just get started—anywhere. “Choose the KPI and the segment that you care most about, obtain access to the data, and get going. Brands sometimes spend way too much time thinking about what they should do with their data instead of actually doing it,” she said. 

Always make sure to establish solid and secure data foundations. 

Once you get going, it’s crucial to govern the data. Before figuring out which sources of data to connect—think of information around clients, sales and marketing, cost and operations—brands must focus on taking care of their data. During the panel, Gomes stressed that data governance is the first thing he and his team at Google discuss with clients, raising questions such as: What are the team’s roles and responsibilities? Who is making sure the data is shared with the company’s core business units? How do you organize and secure the data, and how do you ensure compliance?

The last question in particular was echoed by the other panelists, as the importance of privacy, ethics and trust can’t be overstated—any failure to comply with current privacy laws would not just affect the brand in question, but also the ones they collaborate with. “We don’t operate in silos, but we work very closely with our partners to understand all the governance, laws and regulations and ensure we meet them,” said Chin Wu, Lenovo’s Director of North America Marketing Consumer PC, Gaming, Tablets. 

Chiming in, Gomes emphasized that education around privacy and safety is absolutely critical. “The amount of data, possibilities and, on top of that, regulations, it’s all really overwhelming. So, brands have the tendency to freeze,” he said. “Especially right now, with the main concern to transition to the cookieless future, [brands worry] what’s going to happen to the conversions or personalization strategies that are already happening with third-party data. That said, many tech partners are more than prepared to help brands on this journey towards safe and trusted data foundations.” Moreover, the good thing is that a lot of AI-powered solutions are built on first-party data.  

Differentiate your brand in the new AI economy. 

This brings us to the last topic of the day: AI and its unbreakable bond with data. We would be remiss if, at one of the top tech conferences in the world, we didn’t talk about this technology—particularly given the role it can play in advancing personalization efforts.

Establishing strong data foundations is key to the success of AI. From Google’s point of view, Gomes argues that when we talk about AI, we are essentially talking about two things that are available today: embedded solutions and applied solutions. Zooming in on the latter, this entails customized AI models that are related to factors like creative work, trends, insights and forecasting sales, to name a few. It should be noted that every brand that partners with Google has access to these solutions. 

“This brings me to the point of distinguishing yourself using data, as the data serves as fuel for all these models,” said Gomes. “The better data you feed these models, the better your output is going to be. That is why it’s so important in 2024 and beyond to really focus on how you organize your data foundations. This is going to be the competitive differentiator for your company.”

Dialing it up a notch, Palmieri argued that the experiential piece—what you do with data—will ultimately differentiate your brand from another. “Since you can get to know your customers so well through data, the main question is: what do you actually do experientially as a result of that information? Marketing has an important role to play in using data to drive even more value for consumers,” Palmieri said. When it comes to personalization, first-party data and forward-thinking AI solutions leave today’s marketers with the opportunity to tailor best-in-class experiences to each and every individual.


A Comparative Analysis of Google Analytics 4 and Adobe Analytics

Data, Data Analytics, Data maturity • 3 min read

Written by
Brianna Mersey
Director of Data, NAMER


In the dynamic landscape of data analytics, choosing the right platform is crucial for businesses seeking actionable insights. Two major players in this space, Google Analytics 4 (GA4) and Adobe Analytics, cater to distinct needs and come with their own set of features. In this article, I will compare the two based on their key differences, features, and use cases, along with ways you can identify which platform may be the best fit for your team’s needs and objectives.

Understand the fundamental differences in platforms.

One prominent distinction between each platform lies in their parent suites. GA4 is an integral part of the Google Marketing Platform (GMP) suite, focusing on advertising-driven analytics. On the other hand, Adobe Analytics is an essential component of the Adobe Experience Cloud, emphasizing the delivery of customized user experiences. This fundamental difference is reflected in their core objectives, features and functionalities.

Keep cost in mind.

One of the initial differentiators is the cost factor. GA4 offers a free entry-level product, making it accessible for businesses of all sizes. In contrast, Adobe Analytics typically starts at around $100,000, with costs influenced by features, server calls, and negotiations (ask for discounts!). The pricing structure is a crucial factor for businesses when deciding on the analytics platform that aligns with their budgetary constraints.

Be aware of differences in data collection and reporting.

Adobe Analytics excels in enterprise level data collection. Notably, it provides advanced segmentation options and robust custom reporting and tracking capabilities. Nevertheless, setting up Adobe Analytics demands a higher level of technical expertise.

In contrast, GA4 offers a more straightforward data collection process with minimal technical implementation complexities and less customization. Enhanced by machine learning and AI-powered insights, GA4 stands out for its web and app architecture and audience linking with the GMP platform. Additionally, GA4 proves adept at tracking cross-device user journeys and effectively measuring campaign ROI.

Consider ease of use.

Both Adobe Analytics and GA4 feature user-friendly interfaces, with GA4 being notably more accessible, particularly for non-technical users. GA4 boasts a straightforward and streamlined interface, enhancing ease of navigation and setup. This now includes a feature that lets you customize your home page. 

In contrast, Adobe Analytics presents a more intricate interface, which demands a greater level of technical knowledge and a steeper learning curve. I like the fact that Adobe Analytics’ menu button names can be changed in the Admin section, but I don't suggest changing the order of the buttons.


Strengths of Google Analytics 4

  1. Seamless Integration with GMP Stack: GA4 seamlessly integrates with the Google Marketing Platform stack, even at the free version level. This integration offers users a comprehensive view of channel performance, covering social, direct, organic, and paid media.

  2. Behavioural Insights from Chrome: GA4 leverages behavioural information collected from Chrome, including interests and demographic user data. This information proves valuable for building retargeting segments.

  3. Cost-Effective Entry with Free Products: Google Analytics 4 provides a cost-effective entry point with free products, making it an attractive option for businesses with budget constraints.

Strengths of Adobe Analytics

  1. Unparalleled Customization: Adobe Analytics stands out for its exceptional customization capabilities, empowering users to tailor analytics solutions according to their specific needs.

  2. Overcome Sampling Challenges: Unlike GA4, Adobe Analytics does not face data sampling issues. GA4 users seeking unsampled data must integrate with BigQuery, adding an extra step to the process.

  3. Advanced Segmentation Tool: Adobe Analytics boasts a more advanced segmentation tool, allowing users to create intricate segments with a broader array of operators.

Which platform is right for you?

In conclusion, the choice between Google Analytics 4 and Adobe Analytics hinges on various factors, including the existing tech stack, business requirements, use cases and budget considerations. For ecommerce-centric businesses heavily reliant on paid media, simple report customizations, and looking to track apps, GA4 emerges as a strong option, despite some features being in beta and occasional bugs.

On the other hand, if you are an Adobe Experience Cloud customer, opting for Adobe Analytics ensures a seamless integration within the Adobe family, offering a customizable and robust implementation tailored to your unique needs, which may include collecting data from very diverse sources.

Ultimately, the decision rests on a thorough evaluation of your organization's priorities, goals, and resources, ensuring that the chosen analytics platform aligns perfectly with your business strategy.

If you’ve already invested time and energy into one product, stick it out and keep moving forward. Switching products won't solve the core issues; it just means more time and money.


The Importance of BI Governance in Complex Data Environments

Consumer Insights & Activation, Data, Data Privacy & Governance, Data Strategy & Advisory, Data maturity • 5 min read

Written by
Monks


In today’s digital era, organizations are amassing ever-growing amounts of data, while an increasing number of employees need to access it from different locations. This surge in data accumulation has given rise to complex environments, where finding the right information can be challenging. However, having more data should not mean greater difficulty in leveraging it—quite the opposite, as long as it is managed efficiently and effectively. It is within this landscape that the significance of data governance comes to the forefront.

In essence, governance provides the rules, processes, and policies we need to securely manage data and content, through workflows that ensure their availability to the right people at the right time. We then complete the cycle with business intelligence (BI) by using that data to generate valuable insights and analyses that support strategic decision-making. “We understand BI governance as an end-to-end cycle that spans everything from data collection to visualization and consumption, with a focus on feedback and continuous management,” says Juana Masanet, our Semi Senior BI Analyst.

Monk Thoughts: “The ultimate goal is to ensure that the organization maximizes the potential of its data, thereby driving business growth.” – Juana Masanet

Indeed, this is a process of vital importance. However, BI governance is an aspect that organizations often overlook. “Within our data team, we collaborate with clients grappling with two common challenges that BI governance can solve,” says Masanet. “On one hand, we have companies with scattered data sources across various platforms that need a centralized database for their analyses. On the other hand, there are companies with dispersed dashboards among different individuals or departments, leading to discrepancies in their KPIs and other issues.”

When facing these challenges, certain BI platforms such as Looker, Power BI and Tableau have solidified their positions as reliable solutions. They empower organizations to centralize the modeling of diverse data sources, ensuring the seamless availability of information crucial for data-driven decision-making. Mastering these platforms is crucial, not only for security reasons—which are paramount in themselves—but also for scaling large-scale projects involving numerous stakeholders, teams and roles with diverse objectives, responsibilities and tasks. Now, one might wonder, what does a good BI governance strategy look like?

The image below features the essential elements of a robust BI strategy. Equally important to defining these elements is ensuring that everyone involved in the workflow understands and adheres to them. This way, they can rely on the analyses they consult to make data-driven decisions, break down silos, foster user collaboration, and reduce costs.

Graphic: the essential elements of a robust BI governance strategy

“Having experienced teams and safe, efficient, and flexible tools is essential to orchestrate workflows that prioritize BI governance,” explains our Analytics Specialist, Martin Milloschik. “Alongside my colleagues in the data team, we collaborate with various organizations to achieve this objective, leveraging two powerful tools: Looker and Tableau.”

The benefits and specifics of Looker and Tableau.

Looker and Tableau, two of the most renowned BI platforms in the industry, offer the ability to consolidate various data lifecycle processes into a single tool. Both platforms host their resources in the cloud, with Tableau providing Tableau Cloud or Server, and Looker using GCP (Google Cloud Platform). This cloud-based approach not only bolsters efficiency but also levels the playing field for smaller enterprises with limited operational capabilities. Let us delve into some of the standout features that enhance BI governance.

One of the key strengths of Looker is its centralized data management. The tool is purpose-built to seamlessly integrate data from diverse sources, creating a unified view that undergoes meticulous cleansing and consolidation through its proprietary semantic layer modeling. On the other hand, Tableau, while lacking native ETL (Extract, Transform, Load) tools in its platform, offers an alternative with Tableau Prep Builder and Tableau Prep Conductor, which serve the same purpose. 

Data collaboration is also optimized in both tools. “Teams can access, analyze and share data within a shared workspace, enhancing communication and avoiding data duplications or conflicts,” says Milloschik. Additionally, both platforms offer version control, enabling smooth collaboration and empowering teams to work simultaneously, regardless of their level of expertise in data usage.

Monk Thoughts: “Having a common data source throughout the organization assures users that the data is reliable and has adhered to specific standards of quality and governance.” – Martin Milloschik

For metadata management, Tableau offers an additional functionality called Tableau Catalog. This tool indexes all the content on your site, including dashboards, metrics, data sources, virtual connections and flows, to gather detailed metadata about the content. With Tableau Catalog, users can access features such as lineage, impact analysis, and a data dictionary, which provide them with a deeper understanding of the information they use and enable them to track where it is utilized throughout the organization.

Furthermore, Tableau provides users with an additional layer of data oversight through its Data Quality Warnings and Asset Monitoring features. These capabilities empower users to set alerts on data sources, ensuring that those who utilize dashboards or access the data directly are promptly notified of any pertinent issues. The warnings can encompass a range of concerns, including deprecated or obsolete data, ongoing maintenance, and the presence of sensitive information, among others. Additionally, these functionalities serve as a mechanism to alert users when data is not up-to-date due to failures or other factors. This helps ensure that users are informed about the quality and availability of the data.

Regarding security, both platforms offer robust features to control access to sensitive data and apply various security measures based on user roles. These measures include data source authentication and authorization, row-level filtering, permissions and encryption, among others. “It’s possible to create roles and groups that are then associated with different users, allowing them to access layers of information based on business needs and defined accessibility for each user,” explains Valentina Gonzalez, BI Analyst. This approach ensures that restrictions are in place so only authorized users can access the appropriate information.
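To make the "groups over individual permissions" pattern concrete, here is a minimal, tool-agnostic sketch in Python. Note that these names and structures are illustrative only, not Looker or Tableau APIs; both tools implement this through their own permission and access-filter features.

```python
# Row-level filtering via groups: users belong to groups, groups are granted
# regions, and queries only return rows the user is entitled to see.
# All names below are hypothetical, for illustration.

# Each group is granted access to a set of regions.
GROUP_REGIONS = {
    "emea_analysts": {"EMEA"},
    "global_admins": {"EMEA", "AMER", "APAC"},
}

# Users are assigned to groups instead of receiving individual permissions.
USER_GROUPS = {
    "valentina": ["emea_analysts"],
    "martin": ["global_admins"],
}

def allowed_regions(user: str) -> set:
    """Union of the regions granted by every group the user belongs to."""
    regions = set()
    for group in USER_GROUPS.get(user, []):
        regions |= GROUP_REGIONS.get(group, set())
    return regions

def filter_rows(user: str, rows: list) -> list:
    """Return only the rows whose region the user may access."""
    permitted = allowed_regions(user)
    return [row for row in rows if row["region"] in permitted]

sales = [
    {"region": "EMEA", "revenue": 120},
    {"region": "AMER", "revenue": 300},
]
```

The design choice this illustrates is the one from the quote above: adding a user to a group updates every report they can see at once, instead of forcing administrators to maintain per-user permissions.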

Monk Thoughts The best practice is to use groups to associate sets of users with these roles instead of individually assigning permissions for each case.
Valentina Gonzalez headshot

Both Looker and Tableau offer management and monitoring tools that enable administrators to gain insights into data usage patterns and proactively address concerns related to performance, connectivity, usage and update failures, among others. Tableau provides default administrative views using data from the Tableau Server or Cloud repository. Meanwhile, Looker offers the System Activity section in the general admin menu, providing access to a variety of LookML Dashboards and models. These resources allow analysis of user activity, data and content consumption, frequency of querying and usage of published dashboards, and even identify errors that may arise during the modeling and creation of Looks, which are predefined visualizations to answer specific business questions.

To summarize, both platforms provide all the necessary components of the feedback cycle to ensure effective governance. However, success is not determined by the choice of tool alone: user training and the implementation of suitable policies and processes within the organization are just as important, ensuring that data is managed and utilized in a manner that aligns with organizational goals.


Collect the Data You Need, Right Where You Need It


Data, Data maturity · 4 min read
Profile picture for user Julien Coquet

Written by
Julien Coquet
Senior Director of Data & Analytics, EMEA

Abstract square shapes in orange and blue tones.

So you went ahead and deployed your digital analytics solution with all the bells and whistles. Your data collection plan is exhaustive, privacy-friendly, sophisticated and will track more data points and attributes than you will ever use or need. Your data integrates seamlessly with your online marketing campaigns and you're able to gain valuable insights, optimize and activate your data. Not the case? Then get in touch, and make sure to keep reading.

In times of endless data, it is crucial to collect smarter.

As an analytics expert and practitioner, I know first-hand that collecting data across multiple digital assets and channels can be daunting. This is especially the case when the number of devices connected to the global internet exceeds 21 billion in 2023. Thankfully, our current internet addressing system can handle a lot of these devices, namely up to 3.4×10^38 (that's 34 followed by 37 zeroes).

Out of these 21 billion devices, about 66% are Internet of Things (IoT) devices, all of which generate data about their operation, features and settings. Call them connected black boxes or telemetry on steroids, but these devices are sending data home to service providers who use that data for product enhancement.

Such a scale of data collection provides not only the ideal fuel for AI and machine learning, but also the means to establish performance baselines and outliers. Feature usage models, insights and action plans can all be derived from such an unfathomable well of information.

(Re)introducing the Measurement Protocol.

How do these devices measure activity, you ask? This post is a perfect excuse to look at Google Analytics 4's Measurement Protocol as an alternative data collection method that can help you measure all the IoT data you need—and make it compatible with the flat data model you have come to adopt and love. The Measurement Protocol was introduced in the early 2010s with the former version of Google Analytics, the now-sunset Universal Analytics. Back in the day, the Measurement Protocol was used in very creative ways, so seeing it reborn for GA4 is a great opportunity to (re)discover this lesser-known yet powerful feature in Google Analytics.

In essence, the Measurement Protocol is an API that allows you to send events directly to Google Analytics servers, bypassing the need for bulky software development kits and complex integrations. The minimal software footprint of the Measurement Protocol means it is easily embeddable in every system that can call a URL. As you can imagine, this covers everything from kiosks to point-of-sale systems to IoT devices. Some clear advantages include:

  • Standard protocol, so it is compatible with a wide range of devices and platforms
  • Easy to use, even for developers with limited experience
  • Scalable, so it can be used to collect data from large numbers of users
  • Secure, thanks to the use of data collection secret keys

Because of its lightweight approach, using the Measurement Protocol means you can collect just the data you need. The lack of an explicit user consent mechanism on most IoT devices will encourage you to adopt a privacy-first approach, so focus on telemetry and not on personal data. 

Uncovering the Measurement Protocol’s inner workings.

How does it all work? Well, when creating a Google Analytics 4 (GA4) property for your IoT project, you first need to set up a web data stream, then simply click on this newly created data stream to access the Measurement Protocol API secrets panel.

 

Data streams in GA4 measurement protocol

The next step is to create a key, which you will reference in your Measurement Protocol API calls. All you need to do is provide a nickname for your key and you can use the provided ID in your API calls. As you can see from the list below, our Data.Monks use it quite a lot!

Measurement protocol API secrets

Once your keys are set up, make note of your GA4 Measurement ID for your IoT stream and use code to create a URL to the Measurement Protocol service that combines everything we need, including event parameters. In the example below, our connected fridge will send an event when the fridge door is open.

The desired URL should look like this:

https://www.google-analytics.com/mp/collect?measurement_id={your ID}&api_secret={your key}

Now we need to send the above URL as a POST request, with a JSON payload containing the event parameters we want to send along. Keep in mind that, because this is not like a GA4 event sent from a browser or a mobile app, there is no automatic detection and collection of extra elements, as with GA4’s enhanced measurement. In fact, the Measurement Protocol only measures what you send it. From there, post the request in your favorite programming language—Python, in my case.
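Here's a minimal sketch of such a request using only Python's standard library. The measurement ID, API secret and client ID below are placeholders (substitute the values from your own property), and the `door_open` event and its parameters are hypothetical:

```python
import json
import urllib.request

# Placeholder credentials -- replace with your own GA4 values.
MEASUREMENT_ID = "G-XXXXXXXXXX"
API_SECRET = "your-api-secret"

ENDPOINT = (
    "https://www.google-analytics.com/mp/collect"
    f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}"
)

def build_payload(client_id: str, event_name: str, params: dict) -> dict:
    """Assemble the JSON body the Measurement Protocol expects."""
    return {
        "client_id": client_id,  # any stable identifier for the device
        "events": [{"name": event_name, "params": params}],
    }

def send_event(payload: dict) -> int:
    """POST the event to GA4 and return the HTTP status code."""
    request = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # GA4 replies with an empty 2xx response

# Our connected fridge reports that its door was opened.
payload = build_payload(
    client_id="fridge-0042",
    event_name="door_open",
    params={"door": "main", "ambient_temp_c": 5},
)
```

Keep in mind that the collection endpoint accepts payloads silently; during development, send the same body to the validation endpoint (`/debug/mp/collect`) to surface any malformed events.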

Sure enough, the event registers in the GA4 real-time interface and subsequent hits will become part of your GA4 reports—and live on to BigQuery if you’ve linked your property to Google Cloud Platform.

And of course, as I’m sure you can already guess, creating dashboards on your devices’ activity is a breeze in Google Looker Studio. That’s all there is to it!

Time to try out the Measurement Protocol yourself.

We have seen that the Measurement Protocol, like other event-level data collection platforms, uses an API-friendly format to send data out to Google Analytics. From a technical standpoint, this is a very easy and efficient implementation, so feel free to get creative for all your IoT projects.

We’ve discussed using the Measurement Protocol primarily for IoT devices (or any device that isn’t a computer, mobile phone or gaming console). Bearing that in mind, you can also use it as a data exchange method in a cloud environment as an API callback after a process completes. This means the Measurement Protocol works great with Cloud Functions or messaging queues like Google Pub/Sub or Kafka.

Finally, circling back to the remark I made about AI, this kind of measurement is indeed an ideal way to collect fuel for an AI/ML model, but AI can also be used to trigger the right event at the right time, and with the right data payload. From there, AI can build on your intended data collection plan, triggering additional events beyond its original scope and unlocking even more insights. Coupled with Google Cloud Platform's Cloud ML, the results may surprise you!

In short, here are your key takeaways about the Measurement Protocol:

  • Simple mechanism: any system that can generate a URL can use it
  • Encourages concise, compact, privacy-friendly data collection
  • Can be used on anything, about anything
  • Leverages the power of the Google Analytics 4 flat data model
  • Small software footprint: very limited resource consumption
  • Complements an AI strategy and unlocks new opportunities

Transitioning to GA4 at Scale • Simplifying Data Measurement With Automation

  • Client

    St. James's Place

  • Solutions

    Data, Data Strategy & Advisory, Data Analytics, Measurement

Making mass data measurement easier.

As an expert in effective wealth management and financial planning, St. James's Place understands the value of a long-term investment. So, with the sunsetting of Google's Universal Analytics announced, and well aware of the time it would cost to manually measure thousands of sites, the British investment management company wanted to migrate both their corporate website and the 3,000 partner sites they manage to standard Google Analytics 4 (GA4). We made sure they got their money's worth by leveraging our homemade, Google Cloud-hosted tool.

Unleashing a tried-and-tested migrating solution to realize success.

To get this massive job done, we relied on our proprietary automation tool, built by Ahmed Tarek M., which we’ve used time and again to create Google Analytics (GA) properties and Google Tag Manager (GTM) containers at scale. With a visual interface that’s easy to use—no code needed—this solution allows us to quickly and accurately create and update these properties and containers, reducing human effort while ensuring we have top-notch first-party data quality. After assessing the account structure and developing a measurement strategy, we deployed our tool to roll out the GA4 template for the 3,000 partner properties within just a few hours. As the crowning touch, we delivered a roll-up property that hosts all partner sites, thereby creating a clear, uncomplicated overview of the results across each and every one of them.

Taking little time to save St. James’s Place a lot of hours.

We migrated the brand's corporate and thousands of partner sites from Universal Analytics to GA4 across all properties—an automation process that took 80 hours to complete, which is next to nothing compared to the 1,500 hours it would have taken to implement all of this manually. This way, we helped St. James's Place save precious time, which can now be spent on connecting customers with the right financial advisor. And speaking of impact, while our team has a lot of experience doing data analytics at scale, migrating 3,000 websites was the highest number we've hit thus far. On top of that, it was the first time we focused solely on Google Analytics 4. That said, given the ever-growing urgency to migrate to this system, we're sure we'll be welcoming many more cases like this one in no time.

Results

And it’s working…

  • Migrated 3,000 websites from Universal Analytics to GA4
  • Reduced migration time from 1,500 hours to 80 hours
  • Used proprietary automation tool to quickly and easily scale GA4 migration

Want to talk data? Get in touch.


Data

Data Analytics

Unlock audience insights and boost growth with data analytics.

Business people discussing data
A line graph spiking and falling on a laptop

Take your web and app analytics practice to the next level with our Google Analytics expertise.

Gain access to our Google Analytics experts to help you deliver on mission-critical analytics projects, improve tracking efficiencies, get actionable insights and hit growth targets. Our offering is rooted in helping clients maintain an innovative, competitive and privacy-centric approach to their data analytics implementation. As technology and consumer mindset continue to evolve, we’ve got your back by building durable, scalable solutions that apply advanced data analytics to drive top-line and bottom-line impact.

Data Analytics at Scale

Football players go up for the ball
Girls playing field hockey
A baseball player throws a pitch to a batter

Case Study

Data Analytics at Scale

By deploying Google Analytics and Tag Manager across 1,400 SIDEARM sports partner sites, our automated approach provided quality data in record time.

See Full Case Study

How can we help you innovate? Get in touch.


Geomarketing: What It Is and When You Should Use It


Data, Data maturity, Media · 4 min read
Profile picture for user gabriel.ribeiro

Written by
Gabriel Ribeiro
Marketing Head

People using tablets and smartphones

The first step in relocating, opening a new store or planning how to stand out in a particular region is to study where the best location for your business actually is. Once you've settled on where to go, the next step is to focus on attracting the attention of the public. This is where geomarketing comes in, an essential form of marketing that can help a company attract leads and increase conversions.

What is geomarketing?

Geomarketing is a technique that uses location data to optimize campaigns, helping you engage with customers at the right place and time. Geomarketing can be used for both online and offline touchpoints, making it a versatile part of your toolkit. It can take several forms: a set of information that helps you with decision-making, an analytical approach to building campaigns, or a strategic channel that helps you gather demographic data. It can even be a combination of these tools.

Why do brands use geomarketing?

Demographic surveys have long been used by brands to learn more about existing and prospective customers, and historically geomarketing has been used to help retailers choose the right region to open a physical store based on that data. Now, geomarketing is continuing to evolve along with demand for services within specific geographic areas. For example, an estimated 97.1% of users in Brazil access the internet via smartphones—and with so many customers always on the move, the need for geographically relevant messages and services has increased. There are three main advantages of geomarketing:

Audience segmentation. Geomarketing is a great way to segment your audience. This way, your campaigns can extract greater results from specific locations. Use this data to drive better placement in local searches, like “pharmacies in Rio de Janeiro.”

Increased ROI. Without a geomarketing strategy, it’s possible that your campaigns will reach people located far away—who might have no use for your services. For example, that pharmacy in Rio de Janeiro won’t want to advertise to people several cities away. By employing geomarketing, brands have the power to choose exactly where their campaigns run, meaning they’ll spend less for more effective results.

More qualified leads and higher conversion. The previous point shows how targeting more specific, engaged audiences is more cost-efficient. But it can also earn you more leads, because you’ll be reaching an audience likely to have a greater interest in your product or service—especially when taking other data, like purchasing behavior or interactions on social media, into account.

If you own an ice cream parlor in Brasília, for example, and you're on a tight marketing budget, geomarketing will help you get leads who are in Brasília, close to your neighborhood and interested in ice cream. This way, you'll get more conversions at a much lower cost than advertising to the whole city, or to the entire Federal District.

Here’s how to use geolocation marketing in your business.

Once you understand the concept of geolocation marketing and how important it is, you can use one of many pieces of software available to manage data and optimize your geomarketing efforts, like Google Analytics or Meta Ads. Here are three tactics to get the most out of geomarketing.

Geotargeting. Geotargeting is a way of showing users content based on their location. With a database that maps IP addresses onto specific locations, you can target by country, state or even ZIP code depending on your platform of choice.
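The lookup itself can be sketched in a few lines of Python. The IP prefixes and messages below are made up for illustration; a real deployment would query a maintained IP-to-location database from a geolocation provider rather than a hand-written table:

```python
# Toy IP-based geotargeting: map an incoming IP address to a city via
# prefix matching, then pick localized content. All values are fictional.

IP_PREFIX_TO_CITY = {
    "200.222.": "Rio de Janeiro",
    "189.6.": "Brasília",
}

LOCAL_CONTENT = {
    "Rio de Janeiro": "Pharmacies near you in Rio de Janeiro",
    "Brasília": "Pharmacies near you in Brasília",
}

DEFAULT_CONTENT = "Find a pharmacy anywhere in Brazil"

def content_for_ip(ip: str) -> str:
    """Return localized content for the first matching IP prefix."""
    for prefix, city in IP_PREFIX_TO_CITY.items():
        if ip.startswith(prefix):
            return LOCAL_CONTENT[city]
    return DEFAULT_CONTENT
```

Ad platforms hide this mechanism behind their location-targeting settings, but the principle is the same: the visitor's IP resolves to a region, and the region selects the creative.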

Geofencing. Geofencing is the use of technologies such as the Global Positioning System or radio frequency identification to create a virtual geofence. In other words, it involves collecting location data from electronic devices in order to take action based on it. You can use geofencing to deliver real-time content to your customers based on their GPS data. Note that geofencing requires the use of a branded application that your audience has already downloaded and authorized to track location data.
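Under the hood, a basic geofence check is just a distance calculation: is the device's GPS fix within a chosen radius of the store? Here's a small Python sketch using the haversine formula; the coordinates are approximate and purely illustrative:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(device, fence_center, radius_km):
    """True if the device's (lat, lon) falls within the fence radius."""
    return haversine_km(*device, *fence_center) <= radius_km

# Hypothetical ice cream parlor in central Brasília with a 2 km fence.
parlor = (-15.7942, -47.8822)
```

When `inside_geofence` flips to true, the app can trigger a push notification or swap in location-specific content in real time.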

Another way to serve content to customers is by leveraging third-party platforms like Waze, a collaborative traffic and navigation app. By using Waze Ads, your content can be shown to drivers within a certain vicinity.

Geotagging and check-in. Another interesting geomarketing tactic is the strategic use of the check-in feature. For example, if you create a Facebook and Instagram page that includes your business address, both apps will allow customers to check in. Marking the location helps others easily find the profile of the business, along with other useful info.

Geotagging is similar: users tag the business location in a photo or other piece of content when sharing it to social media. Again, this helps people discover the business and generates publicity for the brand. Because people tend to be influenced by their peers, this can be a great factor in analyzing consumer behavior.

You can leverage geomarketing alongside other marketing strategies, too.

Geomarketing becomes even more useful when tied to other marketing strategies. Having access to customers' location is a great way to build efficiency across your brand’s actions. You can analyze market competition in your region of choice as well as the behavior of your target audience.

Geomarketing involves large volumes of information, and you can use that additional info to optimize your processes and improve business strategies overall—like directing investments to regions with the greatest potential for conversion, or identifying areas with high demand for your products or services.  

Geomarketing truly shines when you enrich it with quality information that provides insights into consumption patterns, or with other data obtained through studies. For example, you can look into public databases of sociodemographic data; my team in Brazil uses IBGE, PNAD and Ipea.

With that, you should be ready to start putting geomarketing to work. For my team, geographical diversity is a big part of what we do, and leveraging insight into the interests and behaviors across different regions, cities and places is a fascinating way to deliver content and build your business. By using the strategies above, you're well on your way to meeting the diverse needs of your own customers.

