Salesforce Marketing Cloud Growth Edition Was Just Announced—Here’s What It Means for You


Written by
Tammy Begley
Head of Marketing Automation


Salesforce has recently dropped two big announcements that will help small to medium-sized businesses kick-start their AI transformation journey.

First, Salesforce has announced its new product, Marketing Cloud Growth Edition, which is designed to put data at brands’ fingertips to help them grow. Second, Salesforce is extending no-cost access to Data Cloud to Marketing Cloud Engagement and Marketing Cloud Account Engagement customers with Sales or Service Enterprise Edition (EE) licenses or above, giving them access to tools they can use to fuel stronger business outcomes with Einstein 1. These include faster speed to market, more relevant content, increased conversions, and the ability to connect conversations across the entire customer relationship. Both announcements roll out to the Americas this February and to the EMEA region in the second half of 2024.

Whether you’re trying Marketing Cloud for the first time or are firmly established on the platform, the news has promising implications for everyone. Read on to learn what the announcements mean for your team.

Data Cloud helps brands of any size prepare and streamline generative AI capabilities.

This is an exciting time to be a marketer; generative AI has ushered in an era of marketing-led transformation and new workflows that help us do our jobs smarter and faster. According to Salesforce’s Generative AI Snapshot Survey, 71% of marketers say generative AI will eliminate busy work and allow them to be more strategic. But AI is only as good as your data; high-quality data is essential for fueling accurate models and insights in the Einstein 1 platform.

And that’s where Data Cloud comes in. No-cost access to Data Cloud opens the door for even more businesses to use the platform for the first time, helping them build a solid data foundation that will support and streamline their AI transformation journeys.

Data Cloud is quickly becoming the leading customer data platform thanks to its ability to connect across sales, services, marketing, commerce, loyalty, third-party advertising and legacy applications. Previously, Salesforce announced no-cost access for Sales Cloud and Service Cloud customers.

Marketing Cloud Growth Edition applies marketing automation to help brands grow.

The launch of Marketing Cloud Growth Edition opens the platform to the small business market for the first time. As the name suggests, this new edition of Marketing Cloud is designed to help brands grow their business. From delivering campaigns and content faster with trusted AI to better personalizing customer relationships with data, Marketing Cloud Growth Edition applies marketing automation to help small businesses connect their teams and drive revenue on a single, intuitive platform.

Benefits of the platform include helping small teams do more with fewer resources. This means small teams can spend more time building strong data foundations—data that will be critical for successful generative AI—and leverage enterprise AI capabilities for the first time. Marketing Cloud Growth Edition removes some of the technological barriers to entry that smaller businesses face in implementing AI, helping them get up to speed with fewer headaches.

What if I’m already a Marketing Cloud user?

Marketing Cloud Growth Edition might be the cool new kid on the block, but there’s no need to move from an existing Marketing Cloud to this one. Marketing Cloud Growth Edition indeed brings new functionality to the table, but we look forward to seeing further innovations in Marketing Cloud Engagement and Marketing Cloud Account Engagement that will bring each platform into parity with the others. This means all Marketing Cloud customers can expect more good news on the horizon.

Whether you’re an existing Marketing Cloud user or are just getting started with Marketing Cloud Growth Edition, we’re happy to help you make the most of platform features. Having a seat on the Salesforce Marketing Cloud Partner Advisory Board and being part of the product’s pilot program, I'm thrilled to see the platform become more accessible to even more businesses, granting them access to robust, enterprise-level data tools for the first time—especially at a time when that is so crucial to entering the new AI economy.

For those using Marketing Cloud in any of its forms, we offer guidance on how to connect your data strategy and implement features as they go live. For Data Cloud users, we can help you realize the value of data beyond the context of customer relationship management, such as how to combine it with generative AI functions.

Unlock the power of AI with Salesforce.

Both of Salesforce’s recent announcements will be welcome news to marketers who are itching to ramp up their AI transformation journeys. While Marketing Cloud Growth Edition and no-cost access to Data Cloud are especially beneficial to smaller businesses, a strong data strategy is important for organizations of any size—and no matter your Marketing Cloud of choice, we can help you make the most of your customer data on the platform.

Got any questions about Marketing Cloud Growth Edition? Check out Salesforce’s announcement for more.


Are Your Emails Truly Compliant? How to Ensure One-Click Unsubscribe and Marketing Cloud Play Nice


Written by
Dave Teo
Sr. Marketing Cloud Consultant


As of February 2024, Gmail and Yahoo Mail have begun enforcing new email deliverability rules for bulk senders, meaning those who send 5,000 or more emails a day. These changes are designed to make it easy for recipients to unsubscribe from emails that are no longer relevant to them. And while that’s good for everyone (less irrelevant info for consumers, better engagement rates for brands), non-compliance can incur hefty fines—and that’s a big deal, because many brands that think they’re in compliance have actually missed one crucial detail.

What are the new Gmail and Yahoo Mail deliverability rules?

Gmail and Yahoo Mail now require bulk senders to do the following:

  • Meet email authentication standards, including DKIM, SPF and DMARC.
  • Implement one-click unsubscribe, and honor unsubscribe requests within two days.
  • Keep their spam complaint rate below 0.3%.

Platforms like Salesforce Marketing Cloud have done a lot of the heavy lifting here, helping brands get up to speed with the current guidelines. But it’s the responsibility of brands, and partners like us, to confirm their tech stack is properly set up. Throughout this process, we’ve found a common (but easily fixed) problem that has flown under the radar until now.

Don’t overlook this one detail.

Under the new guidelines, brands must implement one-click unsubscribe, otherwise known as the list-unsubscribe header. This shouldn’t be confused with the subscription preference center link that is typically placed at the footer of every brand’s commercial emails; this is a new and additional button that now appears within the header of an email on major email provider platforms like Gmail and Yahoo Mail.
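For reference, one-click unsubscribe per RFC 8058 boils down to two message headers: List-Unsubscribe, which points to an unsubscribe endpoint, and List-Unsubscribe-Post, which tells mailbox providers that a simple POST to that URL must complete the opt-out with no further interaction. The short Python sketch below only illustrates what those headers look like on an outgoing message; the sender, recipient and URL are placeholders, and on a platform like Marketing Cloud the headers are typically populated for you rather than set by hand.

```python
from email.message import EmailMessage

msg = EmailMessage()
msg["Subject"] = "Your monthly offers"
msg["From"] = "news@example.com"        # placeholder sender
msg["To"] = "recipient@example.com"     # placeholder recipient

# RFC 8058 one-click unsubscribe: mailbox providers send a POST to this
# URL, with no landing page or confirmation step, when the recipient
# clicks the header-level unsubscribe button.
msg["List-Unsubscribe"] = "<https://example.com/unsubscribe?token=abc123>"
msg["List-Unsubscribe-Post"] = "List-Unsubscribe=One-Click"

msg.set_content("Hello! Here are this month's offers.")
print(msg.as_string())
```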

The problem: unless you’ve accounted for it, that button won’t behave as expected. When a recipient clicks the button, your system may not be set up to honor those unsubscribes in Marketing Cloud, particularly if you utilize a Custom Preference Center connected to an external source of truth such as Salesforce CRM, a common practice for most Marketing Cloud users. Instead, the request will be stored in some obscure part of the platform, unbeknownst to you. This means that your Custom Preference Center or CRM may continue to reflect the incorrect subscription preference, putting you at risk of non-compliance. The problem is exacerbated if you are running automations that overwrite subscription preferences based on your source of truth. As a result, you may accumulate spam complaints (and fines) over time.

In short, if you are relying on an external source of truth to manage subscription preferences in Marketing Cloud, the new one-click unsubscribe button won’t automatically behave like your existing unsubscribe process. You must review and, if necessary, update your subscription process to account for this additional method that recipients may use to unsubscribe from your emails.

Ensuring compliance is easy.

Want to test whether you comply with these email authentication standards? Start by checking with your email analysis tool of choice, like Google Postmaster, Microsoft Smart Network Data Services (SNDS) or Yahoo Sender Hub. These platforms allow you to easily check email performance and message authentication, including tracing spam complaints.

If you are unsure how to get started or are looking for a quick solution to ensure compliance, especially with the one-click unsubscribe (list-unsubscribe header), don't despair. Our Marketing Automation team has developed an easily deployable solution, so you can rest assured that you won't fall short of Gmail and Yahoo Mail's new email requirements.

Brands have already done so much to meet changing modern guidelines, so it would be a shame for that effort to go to waste because of an easily overlooked disconnect. But we can get you up to speed quickly and easily if you need it.



Got any concerns about Gmail and Yahoo Mail’s new email deliverability rules? Reach out and we can help!

 


Why Market Mix Modelling Should Be Integrated Across the Whole Business


Written by
Tim Fisher
SVP Measurement - Head of EMEA


So often, Market Mix Modelling (MMM) gets put to one side as a tool purely for marketers to measure and demonstrate the role they play in the company’s ecosystem. In doing so, however, businesses underestimate the potential impact that MMM insights can have on the performance of the entire business.

I am here to say that nobody puts MMM in a corner.

MMM not only quantifies the impact of various marketing activities on key performance indicators (KPIs), but also measures the effect on business performance of other controllable and external factors, such as price or distribution changes, competitors' actions and the economic climate. It can even answer the question we all love to discuss: “How much of our performance is really down to the weather?”

As MMM provides this panoramic view of all the key drivers, it is important to ensure it never operates in a silo if you want it to deliver its full potential. Here's why:

1. Cross-Functional Collaboration: MMM involves analyzing vast amounts of data from various teams. Demonstrating the benefits of MMM in terms of additional insights and recommendations encourages greater engagement from those teams. This collaboration leads to a more comprehensive understanding of marketing effectiveness and drives better outcomes.

2. Influence Beyond Marketing: MMM has a significant role in shaping commercial decisions, particularly in pricing and promotions. Informed decision-making in these areas, such as identifying optimal price points, understanding price elasticity, and evaluating the impact of promotions on sales and revenue, empowers businesses to strike the right balance between profitability and customer demand.

3. Engaging Finance Teams: MMM often provides budget allocation recommendations across media channels, campaigns, departments, different brands in your portfolio as well as different markets. Involving finance teams ensures that recommendations are implemented and beneficial across the entire organisation. This collaboration helps quantify the impact of business decisions in both the short and long term.

In conclusion, MMM should be integrated across the whole business. By breaking down silos, fostering collaboration, and incorporating MMM into the decision-making processes, businesses gain more accurate insights, make better decisions, and achieve improved marketing performance and overall results.

Soon everyone will be holding MMM results and recommendations aloft above their head in the middle of the boardroom.

 

For more information on how we can help with your Marketing Effectiveness measurement or Market Mix Modelling, visit our Measurement page or contact us.


A Comparative Analysis of Google Analytics 4 and Adobe Analytics


Written by
Brianna Mersey
Director of Data, NAMER


In the dynamic landscape of data analytics, choosing the right platform is crucial for businesses seeking actionable insights. Two major players in this space, Google Analytics 4 (GA4) and Adobe Analytics, cater to distinct needs and come with their own set of features. In this article, I will compare the two based on their key differences, features, and use cases, along with ways you can identify which platform may be the best fit for your team’s needs and objectives.

Understand the fundamental differences in platforms.

One prominent distinction between the two platforms lies in their parent suites. GA4 is an integral part of the Google Marketing Platform (GMP) suite, focusing on advertising-driven analytics. On the other hand, Adobe Analytics is an essential component of the Adobe Experience Cloud, emphasizing the delivery of customized user experiences. This fundamental difference is reflected in their core objectives, features and functionalities.

Keep cost in mind.

One of the initial differentiators is the cost factor. GA4 offers a free entry-level product, making it accessible for businesses of all sizes. In contrast, Adobe Analytics typically starts at around $100,000, with costs influenced by features, server calls, and negotiations (ask for discounts!). The pricing structure is a crucial factor for businesses when deciding on the analytics platform that aligns with their budgetary constraints.

Be aware of differences in data collection and reporting.

Adobe Analytics excels in enterprise-level data collection. Notably, it provides advanced segmentation options and robust custom reporting and tracking capabilities. Nevertheless, setting up Adobe Analytics demands a higher level of technical expertise.

In contrast, GA4 offers a more straightforward data collection process with minimal technical implementation complexities and less customization. Enhanced by machine learning and AI-powered insights, GA4 stands out for its web and app architecture and audience linking with the GMP platform. Additionally, GA4 proves adept at tracking cross-device user journeys and effectively measuring campaign ROI.

Consider ease of use.

Both Adobe Analytics and GA4 feature user-friendly interfaces, with GA4 being notably more accessible, particularly for non-technical users. GA4 boasts a straightforward and streamlined interface, enhancing ease of navigation and setup. This now includes a feature that lets you customize your home page. 

In contrast, Adobe Analytics presents a more intricate interface, which demands a greater level of technical knowledge and a steeper learning curve. I like the fact that Adobe Analytics’ menu button names can be changed in the Admin section, but I don't suggest changing the order of the buttons.


Strengths of Google Analytics 4

  1. Seamless Integration with GMP Stack: GA4 seamlessly integrates with the Google Marketing Platform stack, even at the free version level. This integration offers users a comprehensive view of channel performance, covering social, direct, organic, and paid media.

  2. Behavioural Insights from Chrome: GA4 leverages behavioural information collected from Chrome, including interests and demographic user data. This information proves valuable for building retargeting segments.

  3. Cost-Effective Entry with Free Products: Google Analytics 4 provides a cost-effective entry point with free products, making it an attractive option for businesses with budget constraints.

Strengths of Adobe Analytics

  1. Unparalleled Customization: Adobe Analytics stands out for its exceptional customization capabilities, empowering users to tailor analytics solutions according to their specific needs.

  2. Overcome Sampling Challenges: Unlike GA4, Adobe Analytics does not face data sampling issues. GA4 users seeking unsampled data must integrate with BigQuery, adding an extra step to the process.

  3. Advanced Segmentation Tool: Adobe Analytics boasts a more advanced segmentation tool, allowing users to create intricate segments with a broader array of operators.

Which platform is right for you?

In conclusion, the choice between Google Analytics 4 and Adobe Analytics hinges on various factors, including the existing tech stack, business requirements, use cases and budget considerations. For ecommerce-centric businesses that rely heavily on paid media, need only simple report customizations and want to track apps, GA4 emerges as a strong option, despite some features being in beta and occasional bugs.

On the other hand, if you are an Adobe Experience Cloud customer, opting for Adobe Analytics ensures a seamless integration within the Adobe family, offering a customizable and robust implementation tailored to your unique needs, which may include collecting data from very diverse sources.

Ultimately, the decision rests on a thorough evaluation of your organization's priorities, goals, and resources, ensuring that the chosen analytics platform aligns perfectly with your business strategy.

If you’ve already invested time and energy into one product, stick it out and keep moving forward. Switching products won't solve the core issues; it just means more time and money.


Establishing a Good Marketing Effectiveness Practice


Written by
Michael Cross
EVP, Measurement


In times of economic uncertainty, there is often more scrutiny than usual on marketing budgets and increased pressure to cut investment. But how do you know which piece to trim? Which cost saving will be least detrimental to sales or profit? That’s why establishing a marketing effectiveness evaluation and measurement plan is crucial: a sound process will be key to defending budgets from cuts by the number crunchers.

 

But how do you ensure marketing effectiveness measurement is robust, defendable and clear? Here are some top tips for you to consider.

It has to start with objective alignment!

The first step is to be clear on your objectives: what is the campaign’s purpose? Is it to drive up knowledge of the brand? Or is it a reminder to purchase again? Are we trying to increase reliance on online channels because they’re higher margin? By being clear about what you want the campaign to do, and for which audience, you start to define which metrics you should measure against, and therefore the KPI you should track (for example, awareness if your goal is to increase brand knowledge). This article by data guru Avinash Kaushik is worth a read on further defining the right KPIs.

Set realistic targets!

Once you’ve chosen the KPI, you will need to set realistic expectations of how you expect the campaign to move the KPI. For this, look to the past to see how much the metric has moved; if there hasn't been much variation, then perhaps you shouldn’t expect a huge increase. If there is a lot of movement, what looks realistic in terms of uplift? Make sure you consider seasonality. For example, look at the three-year pattern: are there certain times of the year that are always up? If so, take that into account.

How will you measure?

With targets set, you will need to think about how you are going to measure the campaign before you deploy it. Some techniques include:

  • Random control tests (RCTs) and geo testing
  • A/B tests
  • Measuring against a historic baseline
  • Multi Touch Attribution (MTA)
  • Market Mix Modelling (MMM)

Knowing which technique you will use helps you define the shape of the campaign. For example, if you undertake geo testing, you will need to identify the most appropriate geographical area for your activity to occur within and a comparable area to use as a control. Meanwhile, for MMM, you will need to ensure you have sufficient media spend levels and variation to enable you to get a read on the impact.
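To make the geo-test read concrete, here is a minimal sketch that compares a test region against a matched control region, assuming you have sales for a pre-period and the campaign period in both regions. All figures are illustrative.

```python
# Illustrative geo-test read: scale the control region's movement to the
# test region's baseline, then compare against what actually happened.
test_sales_before = 500_000      # test region sales in the pre-period
test_sales_during = 540_000      # test region sales during the campaign
control_sales_before = 300_000   # control region sales in the pre-period
control_sales_during = 310_000   # control region sales during the campaign
media_spend = 20_000             # campaign spend in the test region

expected_test_sales = test_sales_before * (control_sales_during / control_sales_before)
incremental_sales = test_sales_during - expected_test_sales

print(f"Expected sales without the campaign: {expected_test_sales:,.0f}")
print(f"Incremental sales: {incremental_sales:,.0f}")
print(f"Revenue return per unit of spend: {incremental_sales / media_spend:.2f}")
```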

Make sure the test spend has the following key attributes:

  • Is there enough spend to move your KPI?
  • Is it shaped so you can get as clear a read as possible (i.e. bunch it up, don't spread it out)?
  • Is it scheduled at a time when it won’t be confounded by other impacts?

Execute and measure.

Once you have the right objectives and metrics, know which measurement method you are going to use, and the campaign is successfully deployed, it is time to execute and measure. When it comes to measurement, be conscious of the limitations of your measurement technique.

RCTs, geo and A/B approaches are easy to deploy, simple to understand and can be deployed internally. However, there are some limitations to these techniques, which can prevent them from giving a full read on the effectiveness of your activity.

First, there's difficulty getting a read on the "carryover" of the campaign (often called the memory effect), which is the effect that continues after the campaign has finished. These approaches also struggle to measure the impact of specific media activity on other channels; for example, running social activity can boost PPC. These findings are key when trying to get a full view of your media performance.

These methods are also unable to provide information on scaling the results. A test spend of £20k in one region will not have the same ROI as a £5M national campaign. Be aware of diminishing returns. You can get around this by upping the investment in increments and continuing to test.

MTA is great at giving detail, can be relatively easy to set up, and gives you a good relative read. It is not, however, an incremental analysis, so it is not reliable for ROI calculations.

MMM is incremental and includes a full read on all drivers of your KPI. However, it is blunt (you need to spend at least £100K on a campaign), does not give as granular detail as other techniques, and can be expensive.

Considerations for in-housing measurement.

So, as a client what could you do yourself?

  • RCTs, geo and A/B tests: Most of the time, there’s no real need for external partners.
  • Multi Touch Attribution: Give it a crack; it’s fairly straightforward and you can use techniques like Markov chains. But you can only use traditional MTA for a short while: with third-party cookies going away, there is a longer-term need to invest in cookieless solutions.
  • MMM: Great to in-house if you have the scale, but you need to keep a team busy and fulfilled. This, therefore, only works if you are either an enterprise or global company spending over £100m on media.

Be aware of a couple of watch-outs for in-housing.

  • Don’t use data scientists for MMM—use econometricians. We’ve seen time and time again that where data scientists have been tasked with building MMM models, it very rarely works. Data scientists and econometricians see things in different dimensions.
  • Econometricians are hard to find, and harder to keep hold of. You need to make sure the work is varied and interesting for them to stick around.
  • Make sure there is enough work for at least four people. Otherwise, if you are reliant on one econometrician and they leave, it's very hard to get someone else in to pick things up.
  • Career progression: A consultancy can always get bigger, but an in-house enterprise team hits a limit. Progression stalls and people leave, or you move them into non-econometric roles and are back to recruiting in a niche field.
  • Method stagnation: There is less opportunity for your econometricians to learn new techniques from larger teams working on other clients, so there is a danger that your capability starts to fall behind the curve—unless you have fairly high staff turnover, which in turn requires a big enough team to support it.

However, if you are large enough and have the right team, then in-housing can save a lot of future money on consultants, whilst keeping complete control of your data and models. To keep it moving, and to make sure you don't stagnate, consider using an external partner to help refine your MMM approach.

Summary

In conclusion, getting the start nailed down is critical in marketing evaluation. Align objectives with KPIs and how you lay down the campaign. Think carefully about which technique you will use to measure, and whether you do that internally or with external help. In these uncertain economic times, there has never been a greater need to get this right. Bonne chance!

For more information on how we can help measure drivers of growth for your business, visit our Measurement page or contact us.


The Importance of BI Governance in Complex Data Environments


Written by
Monks


In today’s digital era, organizations are amassing ever-growing amounts of data, while an increasing number of employees need to access it from different locations. This surge in data accumulation has given rise to complex environments, where finding the right information can be challenging. However, having more data should not mean greater difficulty in leveraging it—quite the opposite, as long as it is managed efficiently and effectively. It is within this landscape that the significance of data governance comes to the forefront.

In essence, governance provides the rules, processes, and policies we need to securely manage data and content, through workflows that ensure their availability to the right people at the right time. We then complete the cycle with business intelligence (BI) by using that data to generate valuable insights and analyses that support strategic decision-making. “We understand BI governance as an end-to-end cycle that spans everything from data collection to visualization and consumption, with a focus on feedback and continuous management,” says Juana Masanet, our Semi Senior BI Analyst.

Monk Thoughts (Juana Masanet): “The ultimate goal is to ensure that the organization maximizes the potential of its data, thereby driving business growth.”

Indeed, this is a process of vital importance. However, BI governance is an aspect that organizations often overlook. “Within our data team, we collaborate with clients grappling with two common challenges that BI governance can solve,” says Masanet. “On one hand, we have companies with scattered data sources across various platforms that need a centralized database for their analyses. On the other hand, there are companies with dispersed dashboards among different individuals or departments, leading to discrepancies in their KPIs and other issues.”

When facing these challenges, certain BI platforms such as Looker, Power BI and Tableau have solidified their positions as reliable solutions. They empower organizations to centralize the modeling of diverse data sources, ensuring the seamless availability of information crucial for data-driven decision-making. Mastering these platforms is crucial, not only for security reasons—which are paramount in themselves—but also for scaling large-scale projects involving numerous stakeholders, teams and roles with diverse objectives, responsibilities and tasks. Now, one might wonder, what does a good BI governance strategy look like?

The image below features the essential elements of a robust BI strategy. Equally important to defining these elements is ensuring that everyone involved in the workflow understands and adheres to them. This way, they can rely on the analyses they consult to make data-driven decisions, break down silos, foster user collaboration, and reduce costs.

Image: the essential elements of a robust BI governance strategy.

“Having experienced teams and safe, efficient, and flexible tools is essential to orchestrate workflows that prioritize BI governance,” explains our Analytics Specialist, Martin Milloschik. “Alongside my colleagues in the data team, we collaborate with various organizations to achieve this objective, leveraging two powerful tools: Looker and Tableau.”

The benefits and specifics of Looker and Tableau.

Looker and Tableau, two of the most renowned BI platforms in the industry, offer the ability to consolidate various data lifecycle processes into a single tool. Both platforms host their resources in the cloud, with Tableau providing Tableau Cloud or Server, and Looker using GCP (Google Cloud Platform). This cloud-based approach not only bolsters efficiency but also levels the playing field for smaller enterprises with limited operational capabilities. Let us delve into some of the standout features that enhance BI governance.

One of the key strengths of Looker is its centralized data management. The tool is purpose-built to seamlessly integrate data from diverse sources, creating a unified view that undergoes meticulous cleansing and consolidation through its proprietary semantic layer modeling. On the other hand, Tableau, while lacking native ETL (Extract, Transform, Load) tools in its platform, offers an alternative with Tableau Prep Builder and Tableau Prep Conductor, which serve the same purpose. 

Data collaboration is also optimized in both tools. “Teams can access, analyze and share data within a shared workspace, enhancing communication and avoiding data duplications or conflicts,” says Milloschik. Additionally, both platforms offer version control, enabling smooth collaboration and empowering teams to work simultaneously, regardless of their level of expertise in data usage.

Monk Thoughts (Martin Milloschik): “Having a common data source throughout the organization assures users that the data is reliable and has adhered to specific standards of quality and governance.”

For metadata management, Tableau offers an additional functionality called Tableau Catalog. This tool indexes all the content on your site, including dashboards, metrics, data sources, virtual connections and flows, to gather detailed metadata about the content. With Tableau Catalog, users can access features such as lineage, impact analysis, and a data dictionary, which provide them with a deeper understanding of the information they use and enable them to track where it is utilized throughout the organization.

Furthermore, Tableau provides users with an additional layer of data oversight through its Data Quality Warnings and Asset Monitoring features. These capabilities empower users to set alerts on data sources, ensuring that those who utilize dashboards or access the data directly are promptly notified of any pertinent issues. The warnings can encompass a range of concerns, including deprecated or obsolete data, ongoing maintenance, and the presence of sensitive information, among others. Additionally, these functionalities serve as a mechanism to alert users when data is not up-to-date due to failures or other factors. This helps ensure that users are informed about the quality and availability of the data.

Regarding security, both platforms offer robust features to control access to sensitive data and apply various security measures based on user roles. These measures include data source authentication and authorization, row-level filtering, permissions and encryption, among others. “It’s possible to create roles and groups that are then associated with different users, allowing them to access layers of information based on business needs and defined accessibility for each user,” explains Valentina Gonzalez, BI Analyst. This approach ensures that restrictions are in place so only authorized users can access the appropriate information.

Monk Thoughts (Valentina Gonzalez): “The best practice is to use groups to associate sets of users with these roles instead of individually assigning permissions for each case.”

Both Looker and Tableau offer management and monitoring tools that enable administrators to gain insights into data usage patterns and proactively address concerns related to performance, connectivity, usage and update failures, among others. Tableau provides default administrative views using data from the Tableau Server or Cloud repository. Meanwhile, Looker offers the System Activity section in the general admin menu, providing access to a variety of LookML Dashboards and models. These resources allow analysis of user activity, data and content consumption, frequency of querying and usage of published dashboards, and even identify errors that may arise during the modeling and creation of Looks, which are predefined visualizations to answer specific business questions.

To summarize, both platforms provide all the necessary components of the feedback cycle to ensure effective governance. However, it is crucial to recognize that success is not solely determined by the choice of tool. Equally important are user training and the implementation of suitable policies and processes within the organization. These key elements play a vital role in attaining success in data governance, ensuring that data is managed and utilized in a manner that aligns with organizational goals and objectives.


Get to Know Enhanced Conversions and Value Based Bidding


Written by
Doug Hall
VP of Data Services and Technology


Following up on an internal training session at Media.Monks, this article introduces two key tactics you can use to support and grow your business through digital marketing on the Google Marketing Platform. The audience is intentionally broad, with a view to sharing the “what” and the “why” across the full spectrum of digital marketer roles.

These techniques are exciting, as Google has published data demonstrating that double-digit percentage uplift in conversion value is possible. Results clearly depend on having the very best data, the best modeling capabilities and the best activation strategy, which is where Media.Monks teams play an essential role.

Who is this for? Everyone!

Are you in digital marketing as an “analytics person”? Primarily data focused? Technical? You’ll know about enhanced conversions (EC) and value-based bidding (VBB), but beyond the tagging, do you know what’s going on in the media systems and what it’s actually for?

Or are you a “non-technical” marketer? Your talents for campaign setup and management don’t overlap with tagging. Again, you’re across EC and VBB but where does the data come from? Why’s it so tricky to get right? What’s the hold up with the tags?

Regardless of our role specifics, we all need as full an understanding of the solutions as possible. We need to get a handle on what happens “on the other side” so we can deliver the very best solutions for clients, and for users. Here’s the scoop you need. This is relevant to people on the Search Ads/Display & Video/Campaign Manager side as well as those on the Google Analytics/Google Tag Manager side. Here’s an opportunity to share knowledge… LFG.

Set the scene.

Cookie atrophy is a poorly kept secret. Browser tech continues to erode cookie usage, and third-party cookies are being deprecated from Chrome in 2024, a browser whose dominant market share makes the change significant for marketers. That doesn’t mean we are on safe ground when it comes to first-party cookies, though; just check through the details on Cookie Status to see the reality.

As the volume of data with sufficient signal quality diminishes, we can still use modeling techniques to mitigate gaps in the data, but that’s not a robust solution in isolation. We continue to make every effort necessary to maintain data volume, whilst evolving our tactics to improve efficiency.

This is where EC becomes a playbook entry to maximize observable conversions, while VBB drives greater efficiency by enabling optimization for value rather than volume.

Maximize observable data.

If we have less data, we must have better data quality. By that, we mean clean and clear data where we can clearly see conversions and channels. This means that the data still has utility even if it’s not complete. Where we may have holes due to browser tech and cookie loss, for example, we can still use first-party data to get better conversion accuracy. Enhanced conversions help us see more conversion data, but in a privacy-safe manner.

What it does.

Basically, on the conversion/sale/thank you page, a tag will fire—let’s say a floodlight tag for simplicity. The user’s email address is hashed (encoded using the SHA-256 algorithm), and then added to the tag data which is then sent to Google. This hashed value is then used to match the user with Google’s data to recover conversions that are absent from your data set.

You can use a range of values in addition to, or instead of, the email address. The email address is normally fine. It’s hashed, so no third party (not even Google) sees the data and it’s deleted after use. Google has published in-depth details on how the data is used, and this is essential reading for your teams.
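As a quick illustration of that hashing step, the sketch below normalizes an email address and produces the SHA-256 digest in Python. The trim-and-lowercase normalization shown here is a common convention, not a statement of Google’s exact requirements, so check the enhanced conversions documentation for the rules your implementation must follow.

```python
import hashlib

def hash_email(email: str) -> str:
    # Normalize first (trim whitespace, lowercase), then hash.
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The 64-character hex digest is what travels with the tag,
# never the raw address.
print(hash_email(" Jane.Doe@Example.com "))
```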

Use best practices for tagging.

Ideally, you’d expose pre-hashed personally identifiable information (PII) in the dataLayer variable, which can be picked up easily by Google Tag Manager (GTM) and added to the floodlight.

You can scrape the Document Object Model (DOM) to extract the data, but this is not a robust, long-term solution. You can use Google tag instead of GTM if a tag management system is not available. For offline conversions (leads), you can also upload conversion data via an API.

Collaboration is key.

Tech, data, media and legal teams should work closely in order to correctly implement and then validate changes in data volumes.

This is not legal advice, so you need to get buy-in early from your legal team. Advise your teams to make sure EC usage is covered in your privacy policy and cookie policy and that consent is fully informed with a clear opt-out option.

Make sure you know the conversion page path, and that the PII variable is available. Scraping the DOM might be okay for a proof of concept, but don’t rely on it as a permanent solution.

Media teams need to make simple configuration changes and then report accurately on conversion volume changes. Use your data science teams to establish causality and validate that EC is working. Liaise with your media teams regularly after rolling out EC to maintain scrutiny on data volumes and changes. Be impatient for action (get it done!), but patient for results—manage expectations regarding timing, as change may take weeks.

Using value-based bid optimization.

As we progress along the path of digital maturity, our tactics adapt and evolve. Where it’s normal and fine to optimize for click volume in the early days, the optimization KPI changes as our business grows. We aim to reduce cost, grow revenue, build ROI and ultimately optimize for long-term profit.

Optimizing a campaign for click volume was a brute-force budget tactic. Optimizing for value (profit stems from value) is a more precise allocation of budget. How the budget is allocated is the clever part.

Optimize for value.

Consider an ecommerce site where the obvious valuable outcome is a sale. There are other outcomes that serve as signals to indicate a user may be a valuable customer: viewing a product detail page, adding to cart, starting a checkout. All actions lead to the conversion, all with varying degrees of value. As each outcome is completed, fire a floodlight to inform GMP that the user has performed a “high-value action” worth €x. These actions and values are then used to automatically optimize the bid for the user.

Previously, defining the values associated with an action was a matter of experimentation. Now you can use an online calculator to refine these numbers.
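As a concrete, purely hypothetical illustration, the sketch below maps a handful of high-value actions to the static conversion values that could accompany each floodlight. The action names and euro amounts are invented; in practice they come from your own margin data or the calculator mentioned above.

```python
# Hypothetical mapping of high-value actions to static conversion values.
ACTION_VALUES = {
    "view_product_detail": 2.0,
    "add_to_cart": 8.0,
    "begin_checkout": 20.0,
    "purchase": 60.0,
}

def conversion_value(action: str) -> float:
    # Actions without an assigned value carry none, so they
    # don't influence bidding.
    return ACTION_VALUES.get(action, 0.0)

print(conversion_value("add_to_cart"))  # 8.0
```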

This approach to value-based bidding needs a level of data volume and quality that is delivered by using EC with VBB—and is extremely powerful. It has few moving parts, but the values are static, commercial values that don’t always reflect the user’s likely behavior. To address this, let’s look back at an older solution to see how we can level up this approach.

Using coarse-grained optimization.

Previously, we’ve used machine learning to build a predictive model that will output an answer to the question, “How likely is user X to convert?” At scale, the data is imported into GMP as an audience, and we use this to guide where the budget is spent. A simple approach here is to build a set of audiences from the model output to drive bid optimizations (a minimal sketch of this mapping follows the list):

  • “No hopers” with the lowest propensity to convert: €0.
  • “Dead certs” with the highest propensity to convert: low or €0.
  • “Floating voters” with medium propensity who need convincing: €maximum.
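Here is a minimal sketch of that mapping, turning a model’s conversion propensity into one of the coarse tiers above. The thresholds and bid values are illustrative only.

```python
# Map a predicted conversion propensity to an audience tier and a bid value.
def audience_for(propensity: float) -> tuple[str, float]:
    if propensity < 0.05:
        return "no_hopers", 0.0        # lowest propensity: don't bid
    if propensity > 0.80:
        return "dead_certs", 0.0       # likely to convert anyway: low or no bid
    return "floating_voters", 1.0      # needs convincing: bid up to the maximum

for score in (0.02, 0.45, 0.92):
    print(score, audience_for(score))
```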

This technique has delivered great results in the past. There are shortcomings, however. With three audiences, the segmentation by propensity is quite coarse. As the number of audiences ramps up, there is more to compute and more to maintain in terms of infrastructure. The user needs to revisit the site to “get cookied” and be included in a remarketing audience.

There is a more modern approach that addresses the shortcomings of these techniques.

Modeled VBB optimization goes even further.

We’ll now blend these two solutions with server-side data collection (sGTM). Server-side data collection has a number of key features that make it very appropriate for use here:

  • First, it allows data enrichment in private—we can introduce profit as a value for optimization without exposing margin data to third parties.
  • Additionally, first-party cookie tenure is enhanced by server-side data collection. Your first-party cookies are set in a way that prevents third-party exposure—browsers like this and take a less harsh view of them. This is better for your first-party data quality.
  • There is no need to revisit the site to establish audience membership; all cookie-ing is done in the pixel execution.

So now, we can fire floodlights for our sales conversions, attach per-item profit data at the server level and optimize bids based on user profitability. Awesome, but what about the predictive model output?

At the server-side data collection point, sGTM can integrate with other Google Cloud Platform (GCP) components. As well as extracting profit data, we can interrogate a propensity model, and for each high-value action per user, ask what the propensity is for the user to convert. The predictive score is then attached to the floodlight to drive VBB.

This has fewer moving parts than the older solution. It solves for the coarse-grained audience feature by delivering per user scoring as the data is collected. Again, we team this up with EC to maximize conversion visibility and drive powerful marketing optimizations.

Optimize your marketing with EC and VBB.

These techniques have existed in isolation for some time. With a broader understanding of the data requirements, and the activation of the data, we’re all in a better position to use privacy-first marketing optimizations to deliver efficiencies for clients, and ultimately, a better, more useful online experience for consumers.


Collect the Data You Need, Right Where You Need It


Written by
Julien Coquet
Senior Director of Data & Analytics, EMEA


So you went ahead and deployed your digital analytics solution with all the bells and whistles. Your data collection plan is exhaustive, privacy-friendly, sophisticated and will track more data points and attributes than you will ever use or need. Your data integrates seamlessly with your online marketing campaigns and you’re able to gain valuable insights, optimize and activate your data. Not the case? Then get in touch, and make sure to keep reading.

In times of endless data, it is crucial to collect smarter.

As an analytics expert and practitioner, I know first-hand that collecting data across multiple digital assets and channels can be daunting. This is especially the case when the number of devices connected to the global internet exceeded 21 billion in 2023. Thankfully, our current internet addressing system can handle a lot of devices, namely up to 3.4×10^38 (that’s 34 followed by 37 zeroes).

Of these 21 billion devices, about 66% are Internet of Things (IoT) devices, all of which generate data about their operation, features and settings. Call them connected black boxes or telemetry on steroids, but these devices are sending data home to service providers who use that data for product enhancement.

Such a scale of data collection provides not only the ideal fuel for AI and machine learning, but also the means to establish performance baselines and outliers. Feature usage models, insights and action plans can all be derived from such an unfathomable well of information.

(Re)introducing the Measurement Protocol.

How do these devices measure activity, you ask? This post is a perfect excuse to look at Google Analytics 4's Measurement Protocol as an alternative data collection method that can help you measure all the IoT data you need—and make it compatible with the flat data model you have come to adopt and love. The Measurement Protocol was introduced in the early 2010s with the former version of Google Analytics, the now sunsetted Universal Analytics. Back in the day, the Measurement Protocol was used in very creative ways, so seeing it reborn for GA4 is a great opportunity to (re)discover this lesser-known yet powerful feature in Google Analytics.

In essence, the Measurement Protocol is an API that allows you to send events directly to Google Analytics servers, bypassing the need for bulky software development kits and complex integrations. The minimal software footprint of the Measurement Protocol means it is easily embeddable in every system that can call a URL. As you can imagine, this can be used for everything from kiosks to point-of-sale systems to IoT devices. Some clear advantages include:

  • Standard protocol, so it is compatible with a wide range of devices and platforms
  • Easy to use, even for developers with limited experience
  • Scalable, so it can be used to collect data from large numbers of users
  • Secure, through the use of data collection secret keys

Because of its lightweight approach, using the Measurement Protocol means you can collect just the data you need. The lack of an explicit user consent mechanism on most IoT devices will encourage you to adopt a privacy-first approach, so focus on telemetry and not on personal data. 

Uncovering the Measurement Protocol’s inner workings.

How does it all work? When creating a Google Analytics 4 (GA4) property for your IoT project, you first need to create a new web data stream, then click into that newly created data stream to access the Measurement Protocol API secrets panel.

 

Data streams in GA4 measurement protocol

The next step is to create a key, which you will reference in your Measurement Protocol API calls. All you need to do is provide a nickname for your key and you can use the provided ID in your API calls. As you can see from the list below, our Data.Monks use it quite a lot!

Measurement protocol API secrets

Once your keys are set up, make note of your GA4 Measurement ID for your IoT stream and use code to create a URL to the Measurement Protocol service that combines everything we need, including event parameters. In the example below, our connected fridge will send an event when the fridge door is opened.

The desired URL should look like this:

https://www.google-analytics.com/mp/collect?measurement_id={your ID}&api_secret={your key}

Now we need to send the above URL as a POST request, with a JSON payload containing the event parameters we want to send along. Keep in mind that, because this is not like a GA4 event sent from a browser or a mobile app, there is no automatic detection and collection of extra elements, as with GA4’s enhanced measurement. In fact, the Measurement Protocol only measures what you send it. From there, post the request in your favorite programming language—Python, in my case.
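Here is a minimal sketch of what that request could look like, assuming the requests package is installed. The measurement ID, API secret, client ID and event parameters are placeholders for your own values.

```python
import requests

MEASUREMENT_ID = "G-XXXXXXXXXX"   # your GA4 Measurement ID
API_SECRET = "your_api_secret"    # the key created in the API secrets panel

url = (
    "https://www.google-analytics.com/mp/collect"
    f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}"
)

payload = {
    "client_id": "fridge-0042",  # a stable identifier for this device
    "events": [
        {
            "name": "fridge_door_open",  # hypothetical event name
            "params": {"door": "left", "ambient_temp_c": 21},
        }
    ],
}

response = requests.post(url, json=payload, timeout=10)
# A 2xx response with an empty body means the hit was accepted; the
# /debug/mp/collect endpoint can be used to validate payloads during development.
print(response.status_code)
```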

Sure enough, the event registers in the GA4 real-time interface, and subsequent hits will become part of your GA4 reports—and live on in BigQuery if you’ve linked your property to it.

And of course, as I’m sure you can already guess, creating dashboards on your devices’ activity is a breeze in Google Looker Studio. That’s all there is to it!

Time to try out the Measurement Protocol yourself.

We have seen that the Measurement Protocol, like other event-level data collection platforms, uses an API-friendly format to send data out to Google Analytics. From a technical standpoint, this is a very easy and efficient implementation, so feel free to get creative for all your IoT projects.

We’ve discussed using the Measurement Protocol primarily for IoT devices (or any device that isn’t a computer, mobile phone or gaming console). Bearing that in mind, you can also use it as a data exchange method in a cloud environment as an API callback after a process completes. This means the Measurement Protocol works great with Cloud Functions or messaging queues like Google Pub/Sub or Kafka.

Finally, circling back to the remark I made about AI, this kind of measurement is indeed an ideal way to collect fuel for an AI/ML model, but AI can also be used to trigger the right event at the right time, and with the right data payload. At this point, AI can improvise and improve on your intended data collection plan, start sending events outside of the scope of its original program, and unlock even more insights. Coupled with Google Cloud Platform’s Cloud ML, the results may surprise you! 

In short, here are your key takeaways about the Measurement Protocol:

  • Simple mechanism: any system that can generate a URL can use it
  • Encourages concise, compact, privacy-friendly data collection
  • Can be used on anything, about anything
  • Leverages the power of the Google Analytics 4 flat data model
  • Small software footprint: very limited resource consumption
  • Complements an AI strategy and unlocks new opportunities

Geomarketing: What It Is and When You Should Use It


Written by
Gabriel Ribeiro
Marketing Head


The first step in relocating, opening a new store or planning how to stand out in a particular region is to study where the best location for your business actually is. Once you’ve settled on where to go, the next step is to focus on attracting the attention of the public. This is where geomarketing comes in, an essential form of marketing that can help a company attract leads and increase conversions.

What is geomarketing?

Geomarketing is a technique that uses location data to optimize campaigns, helping you engage with customers at the right place and time. Geomarketing can be used for both online and offline touchpoints, making it a versatile part of your toolkit. It can take several forms: a set of information that helps you with decision-making, an analytical approach to building campaigns, or a strategic channel that helps you gather demographic data. It can even be a combination of these tools.

Why do brands use geomarketing?

Demographic surveys have long been used by brands to learn more about existing and prospective customers, and historically geomarketing has been used to help retailers choose the right region to open a physical store based on that data. Now, geomarketing is continuing to evolve along with demand for services within specific geographic areas. For example, an estimated 97.1% of users in Brazil access the internet via smartphones—and with so many customers always on the move, the need for geographically relevant messages and services has increased. There are three main advantages of geomarketing:

Audience segmentation. Geomarketing is a great way to segment your audience. This way, your campaigns can extract greater results from specific locations. Use this data to drive better placement in local searches, like “pharmacies in Rio de Janeiro.”

Increased ROI. Without a geomarketing strategy, it’s possible that your campaigns will reach people located far away—who might have no use for your services. For example, that pharmacy in Rio de Janeiro won’t want to advertise to people several cities away. By employing geomarketing, brands have the power to choose exactly where their campaigns run, meaning they’ll spend less for more effective results.

More qualified leads and higher conversion. The previous point shows how targeting more specific, engaged audiences is more cost-efficient. But it can also earn you more leads, because you’ll be reaching an audience likely to have a greater interest in your product or service—especially when taking other data, like purchasing behavior or interactions on social media, into account.

If you own an ice cream parlor in Brasília, for example, and you're on a tight marketing budget, geomarketing will help you reach leads who are in Brasília, close to your neighborhood and interested in ice cream. This way, you'll get more conversions at a much lower cost than advertising to the whole city or to the entire Federal District.

Here’s how to use geolocation marketing in your business.

Once you understand the concept of geolocation marketing and how important it is, you can use one of many pieces of software available to manage data and optimize your geomarketing efforts, like Google Analytics or Meta Ads. Here are three tactics to get the most out of geomarketing.

Geotargeting. Geotargeting is a way of showing users content based on their location. With a database that maps IP addresses onto specific locations, you can target by country, state or even ZIP code depending on your platform of choice.
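
As a rough illustration of how that IP-to-location mapping works, here is a minimal sketch using MaxMind's free GeoLite2 City database and the geoip2 Python library. The database file path and the sample IP address are assumptions for the example; in practice, platforms like Google Ads or Meta Ads run this kind of lookup for you when you configure location targeting.

import geoip2.database

# Assumes you have downloaded MaxMind's free GeoLite2-City database locally.
reader = geoip2.database.Reader("GeoLite2-City.mmdb")

response = reader.city("203.0.113.42")   # documentation-range IP; replace with a real visitor IP
country = response.country.iso_code                  # e.g. "BR"
state = response.subdivisions.most_specific.name     # e.g. "Rio de Janeiro"
city = response.city.name

# Show the local campaign only to visitors resolved to the targeted state.
if state == "Rio de Janeiro":
    print("Serve the Rio de Janeiro pharmacy campaign to this visitor")

reader.close()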

Geofencing. Geofencing is the use of technologies such as the Global Positioning System or radio frequency identification to draw a virtual perimeter around a real-world location. In other words, it involves collecting location data from electronic devices in order to take action based on it. You can use geofencing to deliver real-time content to your customers based on their GPS data. Note that geofencing requires a branded application that your audience has already downloaded and authorized to track location data.
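
Under the hood, the trigger is usually just a radius check against the store's coordinates. Here is a minimal sketch of that logic; the store location, the radius and the push-notification stub are hypothetical stand-ins for whatever your app and push provider actually use.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two coordinates, in kilometers.
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

STORE = (-22.9068, -43.1729)   # hypothetical store coordinates
GEOFENCE_RADIUS_KM = 1.0

def send_push_notification(message):
    # Stand-in for your push provider's API call.
    print(f"PUSH: {message}")

def on_location_update(user_lat, user_lon):
    # Called whenever the branded app reports fresh, user-authorized GPS coordinates.
    if haversine_km(user_lat, user_lon, *STORE) <= GEOFENCE_RADIUS_KM:
        send_push_notification("You're nearby! Show this message in store for a discount.")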

Another way to serve content to customers is by leveraging third-party platforms like Waze, a collaborative traffic and navigation app. By using Waze Ads, your content can be shown to drivers within a certain vicinity.

Geotagging and check-in. Another interesting geomarketing tactic is the strategic use of the check-in feature. For example, if you create a Facebook and Instagram page that includes your business address, both apps will allow customers to check in. Marking the location helps others easily find the profile of the business, along with other useful info.

Geotagging is similar: users tag the business location on a photo or other piece of content when sharing it to social media. Again, this helps people discover the business and generates publicity for the brand. Because people tend to be influenced by their peers, this can be a great factor in analyzing consumer behavior.

You can leverage geomarketing alongside other marketing strategies, too.

Geomarketing becomes even more useful when tied to other marketing strategies. Having access to customers' location is a great way to build efficiency across your brand’s actions. You can analyze market competition in your region of choice as well as the behavior of your target audience.

Geomarketing involves large volumes of information, and you can use that additional info to optimize your processes and improve business strategies overall—like directing investments to regions with the greatest potential for conversion, or identifying areas with high demand for your products or services.  

Geomarketing truly shines when you enrich it with quality information, such as insights into consumption patterns or data obtained through published studies, which can further improve your geomarketing performance. For example, you can look into public databases of sociodemographic data. My team in Brazil uses IBGE, PNAD and Ipea.

With that, you should be ready to start putting geomarketing to work. For my team, geographical diversity is a big part of what we do, and leveraging insight into the interests and behaviors of audiences across different regions, cities and places is a fascinating way to deliver content that builds your business. By using the strategies above, you're well on your way to meeting the diverse needs of your own customers.

Learn how to leverage insight into the interests and behaviors of audiences across different regions, cities and places with geomarketing.

Activating Your Data with Google Cloud Platform’s Natural Language AI

Activating Your Data with Google Cloud Platform’s Natural Language AI

AI AI, Data, Data Strategy & Advisory, Data maturity 4 min read
Profile picture for user Juliana.Jackson

Written by
Iuliana Jackson
Associate Director, Digital Experience EMEA

Activating textual data

If you ever find yourself wondering why anyone in this world would collect valuable first-party and zero-party data without activating it, you’d be surprised to hear that many brands do. More often than I’d like, I see them sitting on glimmering gold in the form of surveys, feedback forms, open-ended submissions and comments. Just like the valuable metal, this textual customer data can be mined to extract meaning and insights into a customer’s attitude towards your products and services.

As a digital treasure hunter, I know better than to leave this gold in the ground—and as a Google partner, I also know how to mine it. Through Google Cloud Platform’s (GCP) Natural Language Processing (NLP) AI, digital marketing partners can help brands conduct sentiment analysis, among other methods, to gather insights into customer behavioral patterns, expectations, complaints and moods, and therefore determine the level of brand loyalty. 



The quantitative data that you obtain through this research method allows you to build dashboards and visualize brand sentiment across regions. The aim here is to discover any areas for improvement, as these data points can be used to optimize a brand’s mobile and web applications or products and services—thus informing their next steps in the experimentation process and helping them get closer to meeting their audience’s needs. 



Over the last few months, I’ve focused on integrating sentiment analysis into our experimentation offering, and it’s quickly changing the game. In the spirit of sharing learnings and making sure no brand leaves their valuable data untouched, let’s talk about why this method is as good as gold. 

Leveraging textual data to determine brand sentiment.

Imagine you're a top-tier global brand in the food and beverage industry. You've recently added new features to your app, so you're eager to find out whether customers are enjoying the enhanced experience. Right now, there are over 500 thousand reviews on the Google Play Store. Reading through them would certainly pay off, but who's got that kind of time? It's a classic case we see all the time: brands track everything, but do nothing with the information they collect. This trove of data from active customer interactions only becomes a treasure once it's activated and applied effectively.



This is where sentiment analysis comes in. Made possible by GCP’s suite of tools, this research technique analyzes digital text to determine the emotional tone of a message, such as a review. As part of experimentation, which is all about creating impactful changes to meet the needs of your customers, sentiment analysis allows you to translate qualitative textual data into quantitative numerical data. The aim is to surface key insights about brand loyalty—in the case of said brand, how customers feel about the app’s new features. And then? That’s right, much-needed data activation.  

Put your data to work to improve your business. 

Diving into the nitty-gritty of conducting sentiment analysis, you’ll see it’s very easy to adopt this method. With this AI solution, there’s no need for marketers to manually go through one review after another to get a sense of people’s opinions.



Here's the rundown. Once you have access to a Google Cloud account, you can organize your qualitative, transactional and behavioral data in Google Sheets and Google Cloud Storage. Then, use Apps Script (or another Cloud Client Library) to create a custom menu and leverage GCP's Natural Language API. Once you've enabled the API and created an API key, you can send your data to it in requests and perform sentiment analysis automatically. Ultimately, this opens the door for you to act on those insights through A/B testing campaigns, web and app optimization, brand marketing and product marketing.
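
To make that concrete, here is a minimal sketch using the Natural Language API's Python client library, one of the alternatives to Apps Script mentioned above. The sample review text is a placeholder; in practice you would loop this over rows pulled from Sheets or Cloud Storage and write the scores back next to each review to build the quantitative layer your dashboards sit on.

# pip install google-cloud-language
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()   # authenticates with your Google Cloud credentials

review = "The new rewards feature is great, but the app keeps logging me out."   # placeholder review
document = language_v1.Document(
    content=review,
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

result = client.analyze_sentiment(request={"document": document})
sentiment = result.document_sentiment
# score runs from -1.0 (negative) to 1.0 (positive); magnitude reflects overall emotional strength
print(f"score={sentiment.score:.2f}, magnitude={sentiment.magnitude:.2f}")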



GCP’s Natural Language Processing API is so powerful because it combines sentiment analysis with named-entity recognition, which is a sub-task of information extraction that seeks to locate and classify named entities mentioned in unstructured text into predefined categories. For example, in the sentence “I get a cappuccino every day and I love that I can now earn points on the app and get a discount on my favorite product” we can already identify two types of entities: the product and the platform. So, the tool not only provides information about people’s sentiment, but it also connects this sentiment to the entities in the text.
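
To see both signals together, here is a minimal, self-contained sketch of entity sentiment analysis on the example sentence above, again via the Python client library; the output format is illustrative rather than a full reporting pipeline.

# pip install google-cloud-language
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
text = ("I get a cappuccino every day and I love that I can now earn points "
        "on the app and get a discount on my favorite product")
document = language_v1.Document(content=text, type_=language_v1.Document.Type.PLAIN_TEXT)

response = client.analyze_entity_sentiment(request={"document": document})
for entity in response.entities:
    # Each entity (the product, the platform) carries its own sentiment score and magnitude.
    print(f"{entity.name} ({entity.type_.name}): "
          f"score={entity.sentiment.score:.2f}, magnitude={entity.sentiment.magnitude:.2f}")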

Monk Thoughts If you ask me, using Google Cloud Platform’s tools in conjunction with GA4 as your data collection tool is one of the coolest things that’s happened to marketing.
Iuliana Jackson headshot

Of course, this isn’t all new—it’s just become mainstream now that Universal Analytics has officially sunsetted, and we’re all moving on with GA4 (if you haven’t yet, this is your sign to do so).

Never let your customer data go to waste. 

Understanding user behavior, expectations and struggles should always be at the core of your efforts. Such critical information fuels all your experiments and supports you in fine-tuning your products and services. So, next time you're thinking of leaving reviews unread and letting that gold sit in the ground, think again: this easy, AI-powered solution, and the partners who know how to apply it, are here to help you extract meaning from your valuable first-party and zero-party data. And as a cherry on top, Google has new AI services that let you automatically reply to those reviews and comments using a Large Language Model (LLM), but more on that next time.

As a Google partner, we can help brands conduct sentiment analysis using Google Cloud Platform's AI tools to understand their customers' level of loyalty.
