Monks and Google Cloud: Powering the Future

Data, Data maturity · 2 min read

Written by
Mackenzie Gaura
Director, Partnerships

Badges displaying the specialization certification of Data Analytics and Marketing Analytics for Google Cloud Platform

I am proud to announce Monks has achieved two distinguished Google Cloud partner specializations: Data Analytics and Marketing Analytics. These accolades mark a significant milestone in our journey as long-time, strategic partners with Google, solidifying our commitment to pushing the boundaries of marketing and technology.

A Legacy of Strategic Partnership

For years, Monks has partnered with Google to elevate our clients' marketing strategies to unprecedented heights.  

Monk Thoughts: "Our collaboration has been driven by a shared vision of delivering transformative solutions that empower businesses to harness the full potential of data and technology. Securing these specializations is a testament to the depth of our expertise and the strength of our partnership with Google Cloud."

Driving Innovation Through Commitment

These certifications underscore our pursuit of excellence in expanding our Google Cloud knowledge base and adopting innovative solutions. Achieving the Data Analytics specialization demonstrates our ability to excel in data ingestion, preparation, storage and analysis using Google Cloud technology. Meanwhile, the Marketing Analytics specialization highlights our capability to transition clients from fragmented datasets to data-driven marketing strategies that yield measurable results.

"This enables Monks to support partners in developing both technical solutions and data strategies that drive their success," my colleague Mikey adds.

Setting the Stage for AI and Machine Learning

These specializations not only validate our expertise but also prepare us to leverage the latest advancements in AI and machine learning. By combining cutting-edge technologies with our deep understanding of data analytics and marketing analytics, we’re poised to unlock even greater value for our clients.

Expanding Capabilities & Innovating with Looker

As we celebrate these achievements, Monks is already looking ahead, leveraging tools like Looker to empower our clients with actionable insights and seamless data visualization. Mikey says, “Looker's semantic layer translates raw data into a language that both downstream users and LLMs can understand. By utilizing LookML to provide trusted business metrics, we create a central hub for data context, definitions and relationships, powering both BI and AI workflows to drive our clients’ businesses forward.”

A Commitment to Client Success

At Monks, our mission is simple yet profound: to empower clients with innovative, data-driven solutions that deliver tangible impact. For years, we’ve championed the importance of breaking down silos, enabling seamless data accessibility across organizations, and building pathways for activation (Optimizing Workflows with First-Party Data). 
These specializations are more than milestones—they embody our dedication to excellence and our pursuit of success for our clients.

“To put it bluntly, we focus on delivering tangible business value to our clients by applying both our technical and marketing expertise,” says Hayden Klei, VP Data Consulting. “We don’t focus on shiny vanity projects for the sake of industry accolades, rather, we define success by the impact we deliver to our clients’ top and bottom lines.”

As we continue to invest in Google Cloud and our talented teams, we are setting ourselves up for even greater achievements, driving value through expertise, innovation and a shared vision of success. Together with Google, Monks is ready to lead the charge into the future of marketing analytics and data-driven solutions.

For organizations seeking a trusted partner to navigate the complexities of data and marketing, Monks stands ready to deliver unparalleled expertise and transformative results. 


From Dreams to Reality • AI-Powered Performance Creative for Hatch

  • Client

    Hatch

  • Solutions

    Artificial IntelligencePaid SearchMedia Strategy & PlanningPerformance MediaPerformance Creative

Results

  • 50% fewer design & production hours
  • 31% better cost-per-purchase
  • +80% CTR
  • +46% more engaged site visitors

Case Study

Managing the cost of consumer education.

Hatch, a sleep wellness company that teaches families how to develop better sleep habits with its restful technology devices, sought to engage new audiences for its Restore 2 product. Operating in a unique product category, Hatch needed its ads to balance consumer education with performance. The brand opted for an audience-first approach to advertising that helps users envision an unfamiliar product slotting into their lifestyle. However, this comes with challenges: digital ad platforms require more creative assets to perform effectively, but photoshoots, customized ad ideation and design for multiple personas can be both expensive and time-consuming. To tackle this, we partnered with Hatch to leverage AI to strategize, concept, produce and launch personalized ad creative across diverse audience segments in a matter of weeks.

Monks.Flow in Action: our AI-assisted workflows took us from research and concepting to ad launch in just six weeks using Monks.Flow and Google Gemini. From AI "conversations" with our new personas, we came up with a new modular creative platform we could customize to each. Layouts and designs were crafted with Google Performance Max campaigns in mind, then our generative AI workflows produced huge quantities of assets in a matter of hours.

Incorporating AI from end to end.

We used an end-to-end AI-driven approach to tackle these challenges—from the audience research and ideation stage through the production and launch. Here’s what that process looked like:

  1. Persona Development: We leveraged Google’s AI tool, Gemini, to identify three distinct audience personas that would help us reach beyond the historical core buyer. By having “conversations” with the AI personas, we could better understand their lifestyles and preferences, from how they decorate their bedrooms to their hobbies and routines.
  2. Creative Framework: We accelerated the creative process by teasing out insights from the AI on what a good night’s rest means for our target personas. This led us to a new creative framework for the campaign that our human creative team could expand and run with quickly. The AI-assisted ideation helped us land on a fresh platform to test, positioning Restore as “the everything machine” that enhances daily performance through better sleep. Plus, this gave our team a springboard for new taglines, concepts and test themes.
  3. AI-Driven Asset Design: Building on this data-driven platform, our team crafted immersive visual environments that were tailored to each persona. Conversations with AI personas allowed us to marry visual cues from our personas’ lifestyles—from their aspirations and behaviors to their bedroom decor—with performance-first best practices, while honoring Hatch’s brand guidelines and their typical look-and-feel. Freed from the costs or logistics of location shoots and the limitations of repetitive and uncustomizable stock image libraries, generative AI helped us take relevance and personalization to a whole new level.
  4. Rapid Production: Monks.Flow, our proprietary AI workflow technology, facilitated the rapid generation of high-quality ad variations tailored to each persona. We quickly generated dozens of new ad variants in a matter of hours, designed specifically for Google’s Performance Max (PMax) campaigns.



An effective PMax strategy isn't just about imagery; it involves video and many text assets too. We used AI to create custom soundscapes using Hatch's base audio and multiple descriptive text variants to ensure the algorithm had plenty of assets to choose from when serving tailored ads to each user.

In partnership with

  • Hatch

AI-assisted creative proves efficient and effective.

By integrating AI workflows into the creative process, we produced:

  • Three original audience personas
  • One innovative creative idea
  • Three videos
  • 60 unique ad variants

…at faster speeds and lower costs than ever before. Overall, this campaign represented a 50% reduction in hours and 97% reduction in costs from legacy approaches, freeing up massive resources for creatives and marketers to focus on areas where the human touch is more critical. From ideation to campaign launch, the entire process was completed in half the time of a standard campaign, thanks to AI-powered marketing.

The Hatch team was thrilled: "Hatch is a company that is uniquely positioned to help so many different people and personalities develop better personalized sleep habits. Partnering with Monks to test AI integrations in our ad workflow that will appeal to all these different types of people and their interests, without spending a ton on net new shoots for every single one, has been incredibly exciting to test out," says Eric Pallotta, CMO. "Monks.Flow is helping our creatives focus more on being creative, and less on rote production tasks—amplifying our team’s mission to bring craftsmanship, speed, and a sense of relevance and culture to our marketing. Not to mention, more time for sleep."

Even better, these AI-generated ads are already outperforming legacy tactics. When combined with the power of Performance Max’s AI-fueled audience targeting and ad delivery, these campaigns are driving 31% better cost-per-purchase (CPA) than comparable campaigns.

Additionally, users are engaging more with the Monks.Flow-generated assets and staying engaged after the click: we're seeing 80% higher CTR and 46% higher site engagement rate than other campaigns.

Monks.Flow and Google Gemini were critical in enhancing efficiency without compromising creativity or impact. By integrating AI-assisted workflows into every step—from persona development to asset creation—we delivered an extraordinarily personalized and effective ad campaign for Hatch in record time.

Good nights indeed make great days—and exceptionally effective ad campaigns.

Want to try generative AI that performs? Get in touch.

Check out other AI-fueled client wins.

Google’s AI Overviews: FAQs & Action Items for Marketers

Paid Search, Performance Media · 10 min read

Written by
Tory Lariar
SVP, Paid Search

It has been a landmark few weeks for search advertisers: Google’s AI Overviews, formerly known as the Search Generative Experience (SGE), are moving from the experimental Search Labs into public use, as announced at Google I/O. Then, hot on the heels of that announcement, Google confirmed at Google Marketing Live 2024 that its obvious missing component, ad placements, is coming now as well. Initially rolling out in the US, with global adoption not far behind, we’ve officially arrived at an AI-powered SERP from both an organic and paid search perspective. Let's explore what this means for marketers and how they can optimize for this new landscape.

Recapping what Google is rolling out this year.

AI Overviews are designed to provide users with summarized, informative answers directly on the Search Engine Results Page (SERP). These overviews aim to offer a general response to the user's query and display clickable sources for deeper exploration. Beyond just answering queries, they will allow users to continue the interaction, refine their search and dive into specifics without leaving the SERP, or to view and click into the specific sources that informed the AI response.

Google is also bolstering AI Overviews with related features fed by its AI technology, Gemini. Among the notable new features rolling out over the coming months, their announcements included:

  • "Simplify" and "Break it Down" to make complex answers more digestible
  • Multi-step question handling, to tackle more intricate queries
  • Organizing search results into AI-curated categories
  • Enhanced planning features, to generate complex answers for leisure and travel activities
  • Integrated video search using Google Lens

AI Overviews experience example, with integrated ad placements, provided by Google.

With the announcement of ad placements for AI Overviews, marketers finally had confirmation of how ad placements would be affected by the rollout of the generative content above-the-fold. The ads will include both search and shopping results; the algorithm will coordinate the chosen ad variant or chosen SKU with the user’s query and the content generated in the AI overview, prioritizing relevance. The ads eligible to serve in AI Overviews will be selected from brands’ Broad match search campaigns and Performance Max (PMax) campaigns (which they’re terming “the Power Pair”).

Google also shared that at first, AI Overview Ads will be more focused on retail, travel, home improvement, moving, and similar consumer verticals. This is unsurprising given these industries rely heavily on feed-based data, giving the algorithm more than just ad copy and a landing page from which to assess ad relevance.

With AI Overviews rolling out for both organic and paid search results, marketers are facing several key questions about how to position their brands best and avoid getting knocked down below the fold. Here’s everything you need to know about setting your brand up for success.

Will AI Overviews push my ads lower on the SERP? Not necessarily.

The announcement of integrated ad placements within the AI Overviews section of the SERP changes the initial understanding that many marketers had after Google I/O earlier in May. We can see now that AI Overviews do not inherently push paid results lower on the page, but will fit them into a new context, as part of the generated response. Since the public launch, regular ad placements have been continuing to serve below the AI Overviews as well.

We currently see in our testing that ads appear with the same frequency as before, but we are seeing more experimentation from Google post-GML. We’ve recently seen examples in the wild of AI Overviews appearing below standard ad units (i.e., ads that are not integrated into the AI Overviews content), as well as AI Overviews content serving for mid- and lower-funnel queries (more on that below; see our example images). Marketers should expect ad positioning to keep evolving this year and beyond.

While AI Overviews are typically the highest result on the SERP (left), we're seeing Google continue to experiment with different layouts, so standard ads could still serve above AI Overviews (right).

Since ad placements are dynamic to each search, Google itself is continually gathering data about what layouts lead to more engagement and satisfaction from users. The launch of all the new ad features will keep impacting this; therefore, it remains to be seen how ad placements will be affected when AI-based search categories and multi-step answers roll out later in the year.

What about organic results? Yes, expect to appear lower and expect traffic decreases.

This has been one of the chief concerns from publishers and marketers since SGE was first announced: if users are getting information straight from the SERP, won’t that mean less traffic to my site? Yes, zero-click content from AI Overviews will take up real estate on the SERP, meeting some searchers’ needs directly.

However, this is part of a longstanding trend of SERP layout changes, from the shift of Shopping results from the right rail to the top of the page, to expansion of Knowledge Graph content like Featured Snippets, People Also Ask (PAA), and other panels. Because of this, “traditional 'Top 10' organic ranking positions no longer exist,” says Dang Nguyen, Sr. SEO Strategist at Monks. “Webpages ranking below the fifth position will be pushed even further out of searchers’ view, and inevitably will see a drastic drop off in CTRs.”

However, there will be more chances for brands to get content to appear in the AI Overviews, to balance out for the decrease. “Since Google is providing multiple links to sources in AI Overviews, there are more chances to get included compared to Featured Snippets, which only provide one source link,” says Nguyen. “If users want more than the brief AI Overview summary, they’ll click through.” This shifts the focus for SEOs and brands to optimize to what the AI Overviews algorithm seeks. More on that below.

Will this affect all queries? Mostly upper-funnel terms…for now.

Our testing shows the most impact at the top of the funnel. Our team has been monitoring key terms for our clients in both the Labs environment throughout Q1 2024 and now in the public SERP. In our live testing, we’re primarily seeing AI Overviews serve for head terms and informational queries; they’re typically mapping to users in a top-of-funnel research journey. For the last few months, we have seen less frequent use of AI Overviews when queries are more “consideration” oriented (typically aligning with mid-funnel or bottom-funnel searches), or especially when users seek out a particular brand.

However, we are seeing early shifts since AI Overviews moved from Labs to public usage: recently, more AI content is appearing for consideration queries, especially those comparing different types of product/service offerings or seeking instruction. See the example images below, captured only days after GML.

Most of the other features announced at GML 2024 focused on queries with high commercial intent; advertisers should expect that brand queries and bottom-funnel queries are more likely to be impacted by Visual Brand Profiles, for example, than AI Overviews.

During Google's Search Labs testing, we observed AI Overviews most often for informational queries, while queries with bottom-funnel intent or brand names were less likely. After GML 2024, we’re seeing more mixed usage for consideration and conversion queries (right image).

Should advertisers be concerned about brand safety?

No, not enough to justify any resistance to these changes. With the public rollout came fresh scrutiny, including inaccurate AI responses going viral and causing a stir in the marketing and tech worlds. It’s not surprising that an LLM would generate some buggy content while being brought up to scale for the first time, or that a user base expanding so broadly (from Labs users likely testing it for professional purposes, to any searcher) would surface more human elements, like satire, for the model to contend with. At the end of the day, marketers should remember:

  1. The generated content is not a black box, for a reason. Google’s search results and Featured Snippets have always relied on user input to validate that they’re serving the right content, and this is no different. No matter what, AI Overviews list sources so users have full access to the original information, and can judge what source sites they trust.
  2. Google is incentivized to improve this product. Since other tools like Microsoft Copilot or Perplexity AI are regularly making headlines and receiving favorable reactions, Google has a competitive need to ensure AI Overviews are meeting users’ needs as quickly and seamlessly as possible—otherwise they’d be jeopardizing their search revenues in the long run.

How can marketers optimize their SEO to appear as sources for AI Overviews?

With the recent SEO algorithm "leak", uncovering documentation about Google’s ranking factors, marketers everywhere are buzzing with “gotchas” about how to hack the organic ranking algorithm; however, at its heart, content SEO is still about understanding your target audience’s needs and buying journey to produce valuable content that they will seek out. Even when AI-generated content was rolled out last year as the Search Generative Experience, we were already talking about how it doesn’t change the fundamentals for a strong SEO strategy as much as people might assume.

First, technical SEO principles are still critical for Google to even be able to effectively find and interpret your site’s content, and must always be addressed first. But beyond that, here are tips to evolve your content strategy for the AI-first SERP:

  1. Focus on depth and own your niche to be considered a topical authority. While many brands are leaning on generative AI to churn out surface-level content quickly and cost-effectively for a wide variety of keywords, Google will be more likely to view your content as meaty enough for the LLM to learn from if it goes deep and offers unique value. “Now more than ever, publishers need to provide much more value-add to their content to succeed,” says Nguyen. “Brands should produce niche-relevant content that doesn't exist anywhere else, e.g., new knowledge and facts, interviews with industry experts, verification of accuracy and fact checking, etc… basically everything that good journalism represents.” Websites that show a deeper level of knowledge and expertise will thrive in the new Google search landscape.
  2. Instead of a discrete list of keywords, build content around topics in a hub-and-spoke model based on the buyer journey and natural language. Google touted at GML that searches are continually getting longer, as users query the search engine with more complexity. As searching becomes a more immersive experience, this behavior will be reinforced, until searches more closely mirror conversations. For example, instead of users searching “smartphone with best camera,” in the future they will be more likely to query “currently have a google pixel phone and looking for something better, price point max of $1,000, and a device where I can take great landscape pictures.” Basic keyword analysis won’t cut it anymore in that environment. “It’s important to understand the entire customer journey and how it relates to the user’s intent in that moment, then offer the exact information they’re looking for,” Nguyen adds. “Ensure that your content is covering the entirety of the customer journey.”
  3. Don’t neglect third-party credibility. Brands must reinforce overall site authority with good backlinks from other highly relevant and authoritative sites. This doesn’t just include press coverage and classic backlinking tactics, but also engaging where users are already having conversations. “Don’t forget to distribute content across various social platforms and UGC forums (serving the appropriate format), like Reddit.com and Quora.com,” Dang Nguyen suggests.
  4. Learn from the AI-generated content itself. Unfortunately, at this time Google Search Console won’t differentiate the traffic coming from AI Overviews specifically, but we can learn from the answers they receive when performing our own searches. Marketers should engage with AI Overviews themselves. On a recurring basis, see what content serves for your most relevant queries and analyze AI Overview responses to identify gaps and opportunities for enriched content creation.


How can advertisers serve in AI Overview Ads, and what’s the “Power Pair”?

AI Overview Ads are not fueled by their own campaign type the way Demand Gen and Performance Max campaign placements are. Instead, eligible ads come from the “Power Pair”— PMax campaigns and Broad match keywords in search campaigns. Therefore, it’s critical for advertisers to lean into both tactics, prioritizing budget allocation and testing to increase your odds of serving. To make sure you’re hitting performance goals when leaning into the Power Pair, advertisers should:

  • Pay attention to the top and the bottom. Monitor your top-performing keywords for any immediate performance losses across core KPIs as AI Overviews roll out. This would signal that users only sought a surface-level answer and didn’t get added value from clicking to your site. Also, reinforce your foundation by continuing to monitor for poor-performing queries on your Broad terms, maintaining robust negative lists.
  • Share knowledge more frequently. Continually align on cohesive keyword and topic strategies with SEO to share learnings about which paid keywords perform best and to supplement any burgeoning site content with paid ads.
  • Check ad strength, and perform more copy tests as needed. To combat expected decreases in CTR, ensure your ads have strong ad strength or you’ll risk serving even less frequently. If strength ratings aren’t as high as you’d like to see, launch new RSA tests that prioritize relevance (working the query more directly into the ad copy and landing page experience).
  • Maximize audience signals to feed the algorithm. To thrive in this new landscape, brands must feed the ever-hungry algorithm as much data as possible. First-party audience data from a CRM or CDP is worth its weight in gold, in addition to increasing the number of (meaningfully different!) creative variants, and making your product feeds as robust as possible. All of these signals will point the algorithm in the right direction, toward users most likely to convert.

What other changes do advertisers need to make to gain a competitive advantage?

Nimble advertisers who can act quickly in these other areas will have the most to gain:

  • Utilize more visual ad formats to stand out as Google makes the SERP layout even more dynamic. Enhanced personalization means marketers must compete even more for relevance.
  • Establish new performance benchmarks. With CTR and traffic volume likely to shift rapidly, it’s important to update the SEM/SEO KPIs you measure against, to proactively understand MoM and YoY differences. Google Search Console won’t necessarily shed light on this directly: “At this point, clicks and impressions shown in GSC will not discern between regular SERP clicks and AI Overviews clicks,” says Nguyen. Therefore, to quantify the impact of zero-click content, leverage incrementality testing, such as matched market tests or holdout tests. Historically, most SEM marketers have focused their holdout tests on Brand terms, but with AI Overviews impacting terms at every stage of the funnel, it's now crucial for broader categories.
  • Combat rising CACs from lower traffic with CRO. While we anticipate a decline in CTR and click volume, the clicks that do occur will likely be of higher intent, leading to better conversion rates (CVR). To bolster this, brands should invest in conversion rate optimization, A/B testing their landing pages and site experiences to reduce friction or leaks in their conversion process.
  • Consider the impact on other media channels. For example, with lower site traffic from some search terms we can expect remarketing pools to decline; to avoid oversaturating your consideration audiences if they shrink, set frequency caps on your remarketing campaigns.
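To make the holdout testing mentioned above concrete, the incremental impact of zero-click content can be estimated with a simple lift calculation. This is a simplified sketch with hypothetical numbers; real matched-market tests also control for seasonality and market comparability.

```javascript
// Simplified sketch of incrementality math for a holdout test: compare
// conversions in markets exposed to a tactic against held-out markets.
// The numbers below are hypothetical, not real campaign data.
function incrementalLift(exposedConversions, holdoutConversions) {
  // Lift = relative difference versus the holdout baseline.
  return (exposedConversions - holdoutConversions) / holdoutConversions;
}

// Hypothetical example: 1,150 conversions in exposed markets vs. 1,000
// in comparable holdout markets implies roughly 15% incremental lift.
console.log(incrementalLift(1150, 1000)); // → 0.15
```

The same arithmetic applies whether the holdout is a geography, an audience split or a time window; the hard part is choosing comparable groups, not the math.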

The SERP of 2024 and beyond is personalized and interactive; are you ready to embrace the change?

Google's rollout of AI Overviews marks a significant evolution in the search experience. This AI-driven shift promises a more personalized and interactive SERP; if it pays off with enhanced user satisfaction and engagement as Google projects, it’s ultimately a win for marketers hoping to reach engaged audiences. Benefiting from it, though, necessitates strategic recalibration for both advertisers and SEO professionals. Embracing this change with innovative approaches and enriched content will be key to thriving in the evolving search landscape. Welcome to the future of search—where AI Overviews lead the way.


Next-Level Data Management: The Evolution of Consent Mode V2

Data, Data Privacy & Governance · 4 min read

Written by
Oksana Davydenko
Analytics Manager

In an era where data privacy concerns are a hot topic, the demand for digital privacy and personalization has become increasingly important. Recognizing these pressing needs, Google made updates to the Consent Mode API that offer more nuanced consent settings.

To enhance user control over data collection and personalization, Google has introduced two new parameters in the Consent Mode arsenal. These parameters allow for more granular consent settings, specifically related to advertising data passed to Google:

  • ad_user_data: This parameter manages consent for passing user data to Google for advertising purposes.
  • ad_personalization: This parameter controls consent for personalized advertising.
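As a sketch of how these parameters appear in practice, here is a Consent Mode V2 default state in gtag.js form. The values are illustrative; a real site would set its own defaults and update them from its consent banner, and the `window` stand-in is only there so the snippet runs outside a browser.

```javascript
// Minimal sketch of declaring a Consent Mode V2 default state, modeled on
// Google's standard gtag.js bootstrap. The window stand-in lets the
// snippet run outside a browser for illustration.
var window = globalThis;
window.dataLayer = window.dataLayer || [];
function gtag() { window.dataLayer.push(arguments); }

// Deny everything by default until the user makes a choice.
gtag('consent', 'default', {
  ad_storage: 'denied',
  analytics_storage: 'denied',
  ad_user_data: 'denied',        // new in V2: user data passed to Google for ads
  ad_personalization: 'denied',  // new in V2: personalized advertising
});

console.log(window.dataLayer[0][2].ad_user_data); // → denied
```

Because gtag simply queues its arguments onto the dataLayer, these defaults are recorded before any measurement tags fire, which is exactly why Google recommends placing the consent default call as early as possible in the page.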

In the following paragraphs, we delve deeper into the importance of Consent Mode Version 2 (V2) and its implications for user privacy and data management.

Why is it important?

Google stresses the importance of obtaining consent from end users in the European Economic Area for the use of their personal data, as required by law, when incorporating these new parameters. It is important to note that these requirements also extend to cases where Google Analytics data is shared with Google Marketing Platform (GMP). For example, if you are sending Google Analytics 4 (GA4) data to GMP or importing GA4 conversions to optimize ad campaigns on the GMP side, you should collect these new consent parameters.

For customers who have implemented Consent Mode, the ad_storage parameter will be automatically mapped to the new ad_user_data parameter starting March 2024. This means that when consent is granted or denied for ad_storage, ad_user_data will respect the same setting, ensuring that the performance measurement capabilities of your original implementation continue to work as expected.
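The backfill behavior described above can be pictured as a simple inheritance rule. This is an illustrative mental model, not Google's actual implementation: when a legacy setup never sets ad_user_data explicitly, it takes the ad_storage value.

```javascript
// Illustrative sketch of the March 2024 backward-compatibility mapping:
// when only the legacy ad_storage signal is present, ad_user_data
// inherits its value. A mental model only, not Google's real code.
function applyV2Mapping(consent) {
  return {
    ...consent,
    // Keep an explicit ad_user_data if present; otherwise fall back.
    ad_user_data: consent.ad_user_data ?? consent.ad_storage,
  };
}

// A legacy Consent Mode setup that only ever set ad_storage:
console.log(applyV2Mapping({ ad_storage: 'granted' }));
// → { ad_storage: 'granted', ad_user_data: 'granted' }
```

The practical consequence is that denying ad_storage also denies ad_user_data under the automatic mapping, so existing measurement behavior is preserved without any tag changes.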

No automatic mapping applies to the new ad_personalization parameter, so it is necessary to integrate the new consent mode parameters to maintain access to tag-based audience and personalization features through Google Ads, GA4 or GMP. This can be done either directly in your Google tag (gtag.js) or through a consent management platform (CMP) that has migrated to the new version of consent mode (for example, OneTrust recently confirmed it has completed support for the two new consent parameters in its integration).

Understand Consent Mode V2’s impact on data collection.

Deciding not to implement or upgrade to Consent Mode V2 will impact audience collection in your GA4 property. The size of the audiences you use for remarketing will likely decrease, since no user consent is collected for the ad_personalization parameter.

And as we mentioned above, if you do not explicitly collect the ad_user_data parameter, which is necessary for sharing conversion data from GA4 to GMP, Google will map it to your existing ad_storage setting. This means that at least a Basic Consent Mode implementation is mandatory.

It’s equally important to emphasize that obtaining user consent extends to data tracking in mobile applications as well as data uploads to Google. Make sure that you send consent parameters with this data. Otherwise, if data sent to GA4 is not labeled as consented, it will negatively impact conversion tracking and audience sizes for remarketing. You will not be able to share conversion data with Google Ads for campaign optimization, nor leverage data modeling. And in the case of audiences, you might see a decrease in the size of your remarketing audiences if you do not collect ad_personalization consent.

Validate your Consent Mode V2 setup.

To validate your Consent Mode V2 setup, check the network requests in your browser’s developer tools. In the Network tab, the gcd parameter should appear in all requests sent to Google (GA4, Google Ads, etc.).


Important: If you are sending data to a server-side container, you should look out for a parameter called ‘sst.gcd’ in the same network request. Its value should match the gcd parameter value.


Each value starts with the number 13 and ends with the number 5. In between, you’ll find a string of letters separated by the number 3. These letters correspond to different consent states (either default or updated), and their sequence corresponds to the following signals: ad_storage, analytics_storage, ad_user_data, and ad_personalization.

If this sounds confusing, don’t worry. Using the table below, we’ll decode an example gcd string together:

Use the table to decode an example gcd string

Let’s decode the example 13v3u3v3v5:

  • ad_storage = granted (both by default and after update)
  • analytics_storage = denied (granted by default and denied after update)
  • ad_user_data = granted (both by default and after update)
  • ad_personalization = granted (both by default and after update)
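To make the structure concrete, here is a hypothetical helper (not a Google utility) that splits a gcd value into its four signal letters, following the format described above; interpreting each letter still requires the consent-state table:

```javascript
// Split a gcd value of the form 13<l>3<l>3<l>3<l>5 into the four
// consent-signal letters, in the documented order.
function decodeGcd(gcd) {
  if (!gcd.startsWith('13') || !gcd.endsWith('5')) {
    throw new Error('unexpected gcd format: ' + gcd);
  }
  // Strip the "13" prefix and "5" suffix, then split on the "3" separators.
  const letters = gcd.slice(2, -1).split('3');
  const signals = ['ad_storage', 'analytics_storage', 'ad_user_data', 'ad_personalization'];
  return Object.fromEntries(signals.map((s, i) => [s, letters[i]]));
}

console.log(decodeGcd('13v3u3v3v5'));
// → { ad_storage: 'v', analytics_storage: 'u', ad_user_data: 'v', ad_personalization: 'v' }
```

Running it on the worked example reproduces the decoding above: the letter v on ad_storage, ad_user_data and ad_personalization, and u on analytics_storage.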

Recap 

Google's release of Consent Mode V2 is crucial for businesses operating in today's privacy-focused landscape, ensuring:

  • Enhanced User Control: It empowers users with granular control over their data. This transparency builds trust and fosters positive user experiences.
  • Compliance with Data Privacy Regulations: Consent Mode V2 helps businesses comply with evolving regulations like GDPR and CCPA. 
  • Accurate Data Measurement: For businesses, Consent Mode V2 provides a clearer picture of collected data. This allows for more accurate measurement and analysis in advertising and digital analytics. This, in turn, helps optimize marketing campaigns and improve user targeting strategies.

  • Future-Proofing Data Practices: As data privacy regulations continue to develop, Consent Mode V2 positions businesses for the future and helps companies demonstrate a commitment to responsible data collection.


Build Your Data Game Plan with Insights from Superweek


1 min read

Written by
Monks

Headshots of Doug Hall and Julien Coquet

The data landscape is no stranger to tectonic shifts that curtail brands' control. From Google's announcement to push back cookie deprecation once more, to Apple's app tracking transparency, to differences in data regulation around the globe, emerging bumps in the road continue to challenge plans to provide personalized user experiences. These issues—and more—were discussed at the 2022 Superweek Analytics Summit, a gathering of digital marketing professionals, analysts and thought leaders from across the measurement industry.

Now, marketers can relive the excitement and ideas of the conference (or encounter them for the first time) in a new documentary. THE GAME features insights from speakers—including Vice President of Data Services and Technology (EMEA) Doug Hall and Senior Director of Analytics, EMEA Julien Coquet—who discuss how recent developments in data collection, activation and regulation are reshaping the strategies of brands and their partners.

For a clear understanding of where the industry is headed, find the documentary in full below. Look forward to more Superweek next year, running from January 30 to February 3 in Egerszalok, Hungary!

Monk Thoughts It's like the classic physics three-body problem, where tech, regulation and public opinion are the three bodies. The physics problem states that their orbits in the system are so complex that you cannot predict where these entities are going to go.
Doug Hall headshot

Sunset of Universal Analytics: What should businesses do for 100% data continuity?


4 min read

Written by
Jakub Otrząsek
VP of Data APAC

A checklist outlined in white

This article was written in collaboration between Jakub Otrząsek, VP of Data, APAC and Edie Cheng, Head of Digital Marketing & Analytics, Media.Monks.

Google Analytics is the most commonly used tool by organizations to get an in-depth understanding of customer behavior on their website or app. Furthermore, it integrates with Google Marketing Platform and helps connect the dots between user behavior and the digital marketing campaigns that impact brand experience.

Now Google Analytics’ third generation, Universal Analytics (UA), is approaching the end of its life cycle. Google is heralding its successor, Google Analytics 4 (GA4), as an easier and more seamless way to connect data across multiple touchpoints and provide marketers with a clearer picture of the end-to-end customer journey.

What is changing?

As GA4 matures, Google recently announced it is committed to retiring the legacy version, Universal Analytics. Users of the free version will have until the end of June 2023; premium (paying) customers will receive an additional grace period of three months.

By that time, Universal Analytics will stop collecting new data and users will need to completely migrate to GA4. As most organizations need more than one year of data to run year-on-year analysis and prepare for change, we strongly recommend brands implement tracking in GA4 before the end of June 2022. This is crucial, as Google is not planning to provide any data migration from Universal Analytics to GA4.

Why does it matter?

As data will no longer be collected by Universal Analytics, multiple metrics and measurements will come to a halt, which will hinder a holistic view of traffic and marketing activities on your website and/or app. Here are the things that require immediate attention to achieve 100% business continuity.

Metrics and goals 

As GA4 has a different data model from Universal Analytics (event-based vs. session-based), metrics will be defined differently, even if they have the same name. For example, a lot of reporting in Universal Analytics was based on Goals. Universal Analytics Goals can be mapped to GA4’s Conversions; however, the configuration is slightly different, so comparing YoY results may be challenging, as it isn’t exactly an apples-to-apples comparison.
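The difference in data models is easiest to see at the tag level. In the sketch below, the stub functions simply record calls (in production they come from the analytics.js and gtag.js loaders, and the event names are illustrative): a UA event uses the category/action/label model, while GA4 sends a named event with free-form parameters.

```javascript
// Stubs so the snippet is self-contained and inspectable.
const sent = [];
function ga() { sent.push(Array.from(arguments)); }
function gtag() { sent.push(Array.from(arguments)); }

// Universal Analytics (analytics.js): category / action / label.
ga('send', 'event', 'engagement', 'sign_up', 'footer');

// GA4 (gtag.js): a named event carrying arbitrary parameters instead.
gtag('event', 'sign_up', { method: 'footer_form' });
```

Because the two shapes do not map one-to-one, an event that was a Goal in Universal Analytics needs an explicit decision about which GA4 event name and parameters represent it.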

GA4 allows 30 Conversions per property (50 for paying customers), and while Universal Analytics only allows 20 Goals per view, a single property with multiple views could have more than 30 different types of Goals. If this is the case for your organization, we recommend reviewing your measurement approach and consolidating your Goals so that you only track the most important events as conversions. Additional conversion points can still be tracked and presented in custom reports.

Reporting and dashboarding configuration

Regardless of how your dashboards or reports were implemented, changes are required to swap to the new version of GA. In many instances, additional work is required as there is no 1:1 functionality from GA360 to GA4. Actions to take include:

  • Create new reports and dashboards using GA4 metrics and properties.
  • Make sure commonly used reporting views in GA360 have counterparts in GA4 with data collected correctly.
  • Remove user access for obsolete dashboards to ensure everyone is using the right data.
  • Secure additional data exports so that you can create multi-year comparisons and conduct trend analysis.
  • If your organization has GA360, keep in mind that you can access more reports and data through the UI than what is stored in Google BigQuery. Data coming from integrations via Campaign Manager, Display & Video 360, and Search Ads 360 containing metrics and advertising dimensions (affinity, demographics) are not available in BigQuery. Review any dependencies on UI-based reporting and identify alternatives.

Audiences and integrations

Integration of Audiences shared with other Google Marketing Platforms—from Google Ads to Optimize 360 and DV360—will stop working, as Universal Analytics will not receive any additional data. Actions to address integrations include:

  • Reconfigure available platform integrations in GA4.
  • Identify custom-built integrations using the Core Reporting API. These will need to be recreated with the GA4 API.
  • Identify important audiences for analysis and media buying and create them in GA4.

Channel and cross-device attribution

GA4 shares an Attribution module with Universal Analytics. However, the “old” conversion reports had a premium (GA360) module called Data-Driven Attribution (DDA), which has not carried over to GA4. If you are using an “old” conversion report, be sure to understand the differences in channel attribution, because GA4’s event-based model calculates attribution differently from Universal Analytics.

Cross-device attribution in GA4 is more sophisticated: if enabled, it leverages Google Signals (Google’s privacy-safe identity technology that links Google user activity across devices) across all reports, whereas Universal Analytics only leverages Google Signals for some reports. Create GA4 vs. Universal Analytics comparison dashboards that compare conversions attributed to channels and standard user metrics, so you can establish comparable GA4 baselines for decision making.

Remember, you will not be able to use Universal Analytics baselines for GA4 data analysis, so it’s important to set up a baseline GA4 implementation as early as possible.

Data collection methods

For data collection that doesn't use a typical tag manager, other considerations need to be made:

  • Measurement Protocol
  • Hard-coded snippets (not via a tag manager), or libraries (e.g. React Google Analytics Module)
  • Custom data imports 
  • Server-side tagging
  • Offline conversion importer

Simply leaving the code/scripts “as-is” may not break anything initially, but may cause issues down the line. Once you are ready, remove any old Universal Analytics code from the code base and move towards a tag manager.

Training and capability

As you can see from the points above, there are vast differences in collecting, managing, and using data between GA4 and Universal Analytics. Preparing your organization with the knowledge and skills to derive value from GA4 will require investment in time, technical resources and upskilling your workforce. Identify skill gaps across the organization and develop tailored training modules to upskill stakeholders in GA4.

Some examples of training could include:

  • Implementation and Data Collection in GA4 for developers or technical analysts.
  • GA4 Analysis Fundamentals for analysts or marketers who are concerned about website or campaign performance.
  • Foundations of GA4 Change Management and Migration for administrators, project managers, or decision makers who need to understand how migrating to GA4 will affect them and what they can do to prepare for it.

Dawn of new changes 

With Universal Analytics ebbing away, there is a lot to be done to get your organization ready. While July 2023 sounds far away now, if your team wants a year-over-year comparison of data, you will need to implement data collection by June 2022. Setting up data collection, getting familiar with the differences and benefits of GA4, and developing baseline reports are gradual and necessary projects to take on in the coming months to set yourself up for a seamless transition in 2023. Media.Monks can tailor a GA4 Migration Program for your organization to ensure business continuity and digital maturity uplift as you enter the new era of GA4.


Austria Has Not Banned Google Analytics


4 min read

Written by
Doug Hall
VP of Data Services and Technology

A title accompanied by a phone with google analytics on it and a girl on a bench

The past month has seen numerous cases of entities in the European Union being found in breach of GDPR. Local authorities have asserted that transgressors’ use of Google Analytics involves data processing that violates GDPR obligations—prompting Austria’s Data Protection Authority to issue penalties for violating GDPR norms.

But the product itself is not the subject of the ruling; it is the transfer of data, its use and the safeguarding measures that warrant scrutiny. If Google Analytics is held to be illegal, the verdict will also have an immediate impact on all products and services that transfer data outside of the EU.

While this article is not intended to be legal advice, I intend to share potential future areas of discussion for EU-US data transfer—and immediate steps to take in light of the recent decision from Austria.

The Evolution of Privacy Regulations

Understanding how and why calls for a history lesson. Long before GDPR came into effect, there was the “Safe Harbor” agreement made between the EU and the US. The 2000 agreement allowed companies to self-certify they would protect EU citizens’ data if storing it within US data centers. The agreement stood for 15 years until it was invalidated by the European Court of Justice.

Safe Harbor was followed by the “Privacy Shield” agreement in 2016, which imposed stronger restrictions on US businesses in accessing and transferring EU citizens’ data. But in 2020, the Privacy Shield met the same fate at the hands of the Court of Justice via case C-311/18—colloquially called “Schrems II,” a reference to Austrian lawyer and privacy rights advocate Max Schrems.

Schrems began his privacy battle following the 2013 testimony of Edward Snowden regarding the PRISM program, which gave the United States National Security Agency (NSA) unfettered access to data. Schrems argued that Facebook aided the NSA, violating the right of EU citizens to have their data processed fairly.

At this time, the basic principle is that when personal data leaves the EU, the law travels with it. This is referred to as the transfer of personal data to third countries: for example, the US, Australia, the UK or anywhere else outside the European Union. Recent violations of GDPR, then, are not specific to Google or even to data housed in the US; the principle applies equally to Adobe, Facebook, Amazon and all third parties who function as data collectors across geographical boundaries.

What Does This Mean for the Internet and Data?

The issue that Schrems II (PDF) raises fundamentally applies to the internet as a whole: analytics data collection uses basic internet technologies that are no different than those used when a browser loads an image. The image request still sends cookies and exposes the user’s IP address to the request endpoint.

Still, how analytics are used and how data is managed requires attention and respect for regulation. Increasingly, data protection authorities (DPAs) are ordering the suspension of personal data transfers to third countries. In March ’21, the Bavarian DPA found an unlawful transfer from Germany to the US by MailChimp. A month later, the Portuguese DPA ordered a suspension of personal data transfer to the US and other countries outside the EU by Cloudflare.

Ensure Your Data is GDPR-Compliant

How Google Analytics is used has always been subject to scrutiny and regulation. As a result, it is prudent to make sure all your data collection and activation is compliant with the most current regulations. Consider these basic steps as possible actions and repeat them at least each quarter:

  1. Anonymize IP addresses in Google Analytics. This will impact geographic reporting, but is a relatively small trade-off.
  2. Ensure your cloud data storage is located in the EU. This is an opportunity to review all data storage locations.
  3. Make sure your consent banner is compliant. Implement an automated scanning process that runs on a regular cadence to quickly identify the setting of cookies without consent.
  4. Review your cookie and privacy policies regularly for compliance.
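For step 1, a minimal gtag.js sketch of a Universal Analytics tag looks like the following (the property ID is a placeholder, and the dataLayer stub stands in for the real loader):

```javascript
// Stub standing in for the gtag.js loader snippet.
var dataLayer = [];
function gtag() { dataLayer.push(arguments); }

// anonymize_ip tells Google Analytics to truncate the last octet of the
// visitor's IP address before it is stored or used for geo lookups.
gtag('config', 'UA-XXXXXXX-1', { anonymize_ip: true });
```

The same flag can also be set per hit, but configuring it once on the tag keeps every hit from that property anonymized by default.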

Get third-party legal advice to ensure compliance or address any questions you have. A data partner like Media.Monks can also provide support in implementing changes to Google Analytics and providing automated solutions to measure and analyze data collection with respect to consent banner functionality.

Where Do We Go from Here?

The subject of the Austria DPA’s complaint is the transfer of personal data to the US that lacks adequate protection from US authorities who gain access to it. Standard Contractual Clauses (SCCs) have been used previously to allow data transfer, however, questions have been raised regarding the feasibility of SCCs with respect to FISA (PDF). New SCCs have been published that require supplementary measures that go beyond encryption, referring specifically to scrutiny of the destination country’s legal regime. Google maintains these measures have been met (PDF).

Currently, there appears to be difficulty where both encryption and transparency requirements seem to contradict each other. Revised SCCs or a successor to Safe Harbor and Privacy Shield appear to be the favored solution by Google, although the practicalities and timing of such solutions remain unclear. Until then, following the steps above to regularly review compliance goes a long way to ensure your brand remains in good graces.


What’s in a (Domain) Name, and How Does It Matter in a Cookieless World?


4 min read

Written by
Jakub Otrząsek
VP of Data APAC

Fortune cookies with a fortune coming out of one

“What’s in a name?” William Shakespeare’s famous line now has a new meaning with respect to Privacy Sandbox, an initiative led by Google to protect user privacy while giving companies the tools and insights they need to better build digital experiences. In studying recent announcements from Google about their new Privacy Sandbox feature called Topics, I’ve noticed the new feature will have implications for how brands claim their space online—in particular, it may be time to consolidate under a single domain name.

What’s topical about Chrome’s Topics?

The Chrome browser currently has a strong market foothold with above 50% market share, despite not really having had a solution following Safari’s (Apple’s) crusade to kill third-party cookies. While Chrome hasn’t marked the cookiecalypse yet, as Safari and other browsers have, it has attempted to address the needs of marketers with ideas to alter some marketing capabilities to work in a privacy-safe way.

Google has sought solutions which would enable some form of safe profiling and data exchange between martech players as ad revenues continue to be mission-critical for the health of many businesses. These solutions are going to be built into the Chrome browser, packaged as the Privacy Sandbox.  

The most recent announcement introduces Topics, an updated version of FLoC (Federated Learning of Cohorts). The initial idea behind FLoC was to create a mechanism that would classify users, based on their behavior, into cohorts that guarantee privacy (through entropy). By design, cohorts would be more generic and would remove 1:1 targeting, but at the same time would restore interest-based targeting. The main issue with the initial solution was in the mathematics: the algorithms translated domain names into numbers without a clear understanding of the “topicality” of the site.

As FLoC did not win the hearts of the industry, Google went back to the drawing board and came back with Topics. The idea of cohorts still persists, though the mechanism of translating domain names into “topics” for further targeting was updated along with some privacy assumptions. Even though the proposal is not yet fully developed, there is a consistent approach of using domain names in order to classify users into cohorts. Google envisions some form of a dictionary and a set of rules that determine which domain name translates to which topic. Current documentation points out that the usage of subdomains is encouraged to support mapping into topics.

How does the third-party cookie crumble crush my current domain name strategy? 

The main issue with third-party cookies is that they enable “foreign” actors to collect information about individuals as they travel between different sites. But as with everything in the binary world of computers, the definition of a foreigner is very black and white: all cookies set by a different domain are considered third-party. Computers do not care much about the structure of your organization, brands, subsidiaries and ownership.

Multibrand businesses that operate across multiple domains face challenges in building user profiles without third-party cookies. As data management platforms (DMPs) and many marketing solutions struggle to exist without third-party cookies, it is becoming more difficult to create a single customer view across the brands one may own. As first-party data strategies pick up steam, there are some critical decisions to be made. To operate in a first-party cookie context and be able to exchange data between their own brands, businesses need those brands to operate under the same domain name.

The most apparent manifestation of this situation is media outlets that own multiple mastheads. As publishers try to build value propositions around their audiences, every piece of information counts. Without a DMP or third-party cookies, it would be impossible to achieve scale across the different sites they own today.

So, how can I make a name for myself?

As it is possible to register your own top-level domain, or TLD (though this is expensive and takes time), and as we observe ongoing pressure toward first-party data collection (meaning you need one universal domain across your whole business), it's time to consider your new universal domain name!

Let's assume you run a business called “Example” together with two brands, “Big” and “Small.” It’s likely you have example.com, big.com and small.com as domain names. With the lack of third-party cookies, it is hard to exchange information about prospects between the sites. With the help of a customer data platform (CDP) or a good data team, you may join first-party data between the sites to research the level of cannibalization or overlap.

To simplify your life (and data), you may want to consider big.example.com and small.example.com as primary addresses. This will enable all sorts of integration and will load your first-party data strategy with rocket fuel. If you are big enough, you can go for your own top-level domain to create something like big.from.example and small.from.example. Coming back to Topics, if your brands operate with multiple categories, more subdomains enables better profiling, like automotive.big.from.example or sport.big.example.com. 
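The payoff of a shared parent domain is visible at the cookie level. As a hypothetical sketch (the helper and cookie name are illustrative, not part of any platform), a cookie scoped to .example.com is a first-party cookie on both big.example.com and small.example.com, so the two brands can share one identifier without any third-party mechanism:

```javascript
// Build a first-party cookie string scoped to the shared parent domain.
function sharedCookie(name, value, parentDomain) {
  return name + '=' + encodeURIComponent(value) +
    '; domain=.' + parentDomain + '; path=/; Secure; SameSite=Lax';
}

// On either subdomain you would assign this to document.cookie:
console.log(sharedCookie('visitor_id', 'abc123', 'example.com'));
// → visitor_id=abc123; domain=.example.com; path=/; Secure; SameSite=Lax
```

With separate domains like big.com and small.com, no equivalent Domain attribute exists that both sites could read, which is exactly the gap third-party cookies used to fill.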

How do I get started now?

Well, FLoC did not survive long enough to become a thing and Topics are still quite nascent themselves. Though everyone is pretty committed to getting rid of third-party cookies, and some businesses already operate in a world where over 80% of traffic comes from browsers that no longer support them by default. Google has postponed the moment of putting the final nail into the cookie’s coffin, so the timelines seem rather floaty. 

Today we operate with the assumption that hour 0 will come around next year or the year following. All things considered, there is not much time to prepare for such big decisions. Now, it’s time to plan.


How AdLingo Lets Brands and Consumers Talk Up a Storm


3 min read

Written by
Monks


Everyone has a favorite barista or retail associate who knows how to make the perfect recommendation. These essential workers we meet in our everyday lives are attentive to our needs and work with us to find the best solution. 

But while traditional advertising strives to anticipate how consumers are feeling, the one-size-fits-all approach makes it difficult to truly offer that level of helpfulness at scale. So, what if you could converse with an ad just like you would with a person? With AdLingo, you can.

Always-On Conversations at Scale

Developed in Area 120, Google’s in-house incubator, AdLingo enables brands to embed conversational assistants (like chatbots) within display ads. These ads are distributed on Google partner inventory by Google Display and Video 360, reaching users across the web and within web apps. The conversational platform offers brands a simple way to adopt an “always-on” customer acquisition model that engages users while they browse online, ready to meet their needs wherever your audience may find a display ad.

Monk Thoughts AdLingo ads are relatively quick to produce and scale up, but are most effective when optimized over time.

Quick to implement, they offer personalization at scale by asking users directly about their needs and preferences and dynamically providing relevant information based on the user’s inputs—making AdLingo a great way to qualify leads and gain insights on your target audience. This level of personalization also makes the format more engaging, with users spending about one minute and 17 seconds on average in conversation with the bot, according to January through December 2020 average results from AdLingo.

Continuously Improve Charisma—and Effectiveness

AdLingo ads are relatively quick to produce and scale up, going from idea to launch in only four to six weeks. But it’s important to understand that AdLingo campaigns are most effective when optimized over time, continuously enhancing the user experience. By measuring conversation, engagement, depth of outcome and more, brands can A/B test on the fly to identify how different copy or logic variations perform. Constant iteration makes it easier for users to find the best solution for their needs and take the intended action, like visiting a landing page or converting.

Best Practices for Your First AdLingo Experience

Because an AdLingo campaign should be continuously optimized to better meet consumers’ needs over time, it’s ideal to let one run for two or three months at minimum. This is critical to ensure your campaign has enough time (and engagement volume) to accurately assess performance and optimize. And due to the format’s always-on acquisition model, AdLingo should ideally support a larger campaign than stand on its own; apply measurable insights from AdLingo conversations to optimize the broader campaign.

As for the user experience, start the conversation off with a compelling hook that engages the user. Focus on empowering questions rather than taking a negative tone, and make it clear that the user is talking to a virtual assistant, not a real person. And while open dialogue is supported by the format, pre-written responses help direct the consumer and make it easier for users to answer quickly.

How Nespresso Brewed a High-Performance Bot

When Nespresso wanted to convince customers to consider Nespresso as a one-stop destination for holiday-season gifts, the coffee brand partnered with AdLingo and MediaMonks to build a display ad that would help consumers in the US and the UK discover the perfect gift. Much like a personality quiz, the chatbot asked users a series of questions that identified the best gift for their chosen recipient and budget, boosting the perception of Nespresso as a premium gifting option.


Nespresso's AdLingo ad helped users find the perfect gift for their loved ones.

The brand lift study run in the UK showed a statistically significant 119% lift in ad recall and a 75% lift in brand consideration compared to the top three competitors¹. In the UK, 0.43% of impressions led to engaged conversations², and 15% of engaged users across both countries answered all six criteria questions to receive their personalized recommendation. “The new AdLingo Ads proved a powerful solution to connect in a personalized way and at scale with consumers and position Nespresso as a great gifting option,” says Paulo R. Dias, Nespresso Global Brand Campaigns Manager. “Our global AdLingo pilot developed with MediaMonks has demonstrated strong results and impact on brand perception, that go beyond our existing channels.”

With AdLingo, brands can achieve great results from simple yet highly engaging experiences. By offering personalized and attentive service anytime, anywhere, the scalable format helps users quickly find the product or info they need—leaving them better equipped to convert. Likewise, AdLingo serves as another channel brands can draw from to capture insights that help them better understand their audience’s needs.

¹ Brand Lift Study run on mobile app among people who engaged with the ad.

² Users that engaged in the chat for 1+ interaction.

All sources: internal data from AdLingo and Nespresso.


What We Learned from Demoing Google’s New Depth API

4 min read

Written by
Labs.Monks

Get ready for an upgrade: in early December, Google revealed its Depth API, a new functionality coming to ARCore that lets virtual objects and real-world environments interact more naturally, making for more convincing and immersive mixed reality experiences. A demonstrable way Depth API achieves this is by enabling occlusion: the illusion of virtual objects becoming obstructed behind real-world ones.

Convincing occlusion has historically been difficult to achieve, though Google has put together a video portraying demos of the new API that show off its features. One of those demos, which challenges the user to a virtual food fight against a levitating robot chef, was developed in collaboration with MediaMonks.

What’s exciting about Depth API is its ability to understand the user’s surroundings with unprecedented speed and ease. “The API’s depth map is updated in real time, allowing AR apps to be aware of surfaces without complex scanning steps,” says Samuel Snider-Held, Creative Technologist at MediaMonks. This enables not only occlusion as mentioned above, but also the mimicry of real-time physics. For our virtual food fight against the AR-rendered robot, missing is part of the fun; users can take delight in the digital splatters of food on the objects around them without worrying about cleanup.

The Building Blocks to More Immersive AR

How does Depth API work, and what sets it apart from other methods of occlusion? “The Depth API uses an approach called ‘depth from motion,’ in which ARCore determines distances to objects by detecting variances between image frames while the camera is moving,” says Snider-Held. “The result is a high-resolution depth map that is updated in real time, allowing the device to better understand where objects are in relation to one another and how far away they are from the user.”
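The core idea behind depth-based occlusion can be sketched in a few lines. This is a simplified illustration, not ARCore's actual rendering pipeline: given a depth map of the real scene, a virtual pixel is drawn only where the virtual object is closer to the camera than the real-world surface at the same screen position.

```python
# Simplified sketch of depth-map occlusion (not ARCore's real API):
# compare per-pixel virtual depth against real-world depth.
def occlude(virtual_depth, real_depth):
    """Per-pixel visibility: True where the virtual object should be drawn."""
    return [[v < r for v, r in zip(vrow, rrow)]
            for vrow, rrow in zip(virtual_depth, real_depth)]

# Depths in meters for a small 2x3 screen region.
real = [[1.0, 1.0, 0.5],
        [1.0, 1.0, 0.5]]          # a real surface sits at 0.5 m on the right
virtual = [[0.8, 0.8, 0.8],
           [0.8, 0.8, 0.8]]       # virtual object placed at 0.8 m

visible = occlude(virtual, real)
# The right column is hidden: the 0.5 m real surface occludes the object.
```

With a depth map refreshed every frame, this comparison is what makes a virtual object appear to slip behind real furniture as the user moves.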

Depth API is software-based, requiring no new hardware for users with ARCore-enabled devices once it releases publicly. While convincing occlusion significantly increases the verisimilitude of virtual objects, it is the latest in a series of incremental updates that build on one another to allow for more realistic immersive experiences. Just last year, the same year ARCore debuted, Google released its Lighting Estimation API, which lights virtual objects to match the existing lighting conditions in the real-world setting, including light reflections, shadows, shading and more.


Since then, a feature called Cloud Anchors allows multiple users to view the same virtual objects anchored in a specific environment. It’s the key feature powering the multiplayer mode of Pharos AR, an augmented reality experience we made in collaboration with Childish Gambino, Wolf + Rothstein, Google and Unity—which itself served as a de facto demo of what Cloud Anchors are capable of in activating entirely new mixed reality experiences.

“We have the creative and technical know-how to use these new technologies, understand why they’re important and why they’re awesome,” says Snider-Held. “We’re not scared to take on tech that’s still in its infancy, and we can do it with a quick turnaround with the backing of our creative team.”

A Streamlined Way to Map Depth

Depth API wasn’t the first time that MediaMonks got to experiment with occlusion or spatial awareness with augmented reality. Previously, we got to experiment with other contemporary solutions for occlusion, like 6D.ai, which creates an invisible 3D mesh of an environment. The result of this method is similar to what’s achieved with Depth API, but the execution is different; translating an environment into a 3D mesh with 6D.ai is fastest with multiple cameras, whereas Depth API simply measures depth in real time without the need of scanning and reconstructing an entire environment.

Similarly, Tango—Google’s skunkworks project which was a sort of precursor to ARCore—enabled spatial awareness through point clouds. “When we had Tango from before, it used something similar to a Kinect depth sensor,” says Snider-Held. “You’d take the point clouds you’d get from that and reconstruct the depth, but the new Depth API uses just a single camera.”

Monk Thoughts We’re not scared to take on tech that’s still in its infancy, and we can do it with a quick turnaround with the backing of our creative team.
Samuel Snider-Held

In essence, achieving occlusion with a single camera scanning the environment in real time offers a leap in user-friendliness, and makes it widely available to users on their current mobile device. “If we can occlude correctly, it makes it feel more cemented to the real world. The way that they’re doing it is interesting, with a single camera,” says Snider-Held.

Adding Depth to Creative Experiences

Depth API is currently limited to invited collaborators and isn’t yet ready for a public release, but it serves as a great step toward rendering more believable scenes in real time. “It’s another stepping stone to reach the types of AR experiences that we’re imagining,” says Snider-Held. “We can make these projects without caveats.”

For example, a consistent challenge in rendering scenes in AR is that many users simply don’t have large enough living spaces to render large objects or expansive virtual spaces. Creative teams would get around this by rendering objects in miniature—perhaps just contained to a tabletop. “With Depth API, we can choose to only render objects within the available space,” says Snider-Held. “It lets us and our clients feel more comfortable in making these more immersive experiences.”
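The placement check described above can be sketched simply. This is a hypothetical illustration (the function and thresholds are invented, not ARCore's API): before rendering a large virtual object, consult the depth map to confirm the real surface behind the anchor point is far enough away to contain it.

```python
# Hypothetical sketch: gate rendering of a large virtual object on the
# free space reported by a depth map along the camera ray.
def fits_in_space(anchor_depth_m, object_extent_m, real_depth_m):
    """True if the real surface lies beyond the far edge of the object."""
    return real_depth_m >= anchor_depth_m + object_extent_m

# A virtual object 1.5 m deep, anchored 1 m from the camera, needs the
# real wall or furniture behind it to be at least 2.5 m away.
roomy = fits_in_space(1.0, 1.5, 3.0)    # wall at 3 m: enough space
cramped = fits_in_space(1.0, 1.5, 2.0)  # sofa at 2 m: too close
```

A renderer could use such a check to scale an experience down gracefully, say to a tabletop, rather than clipping a life-size object through a user's wall.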

As brands anticipate how they might use some of the newest features of fast-evolving mixed reality technology, they stand to benefit from a creative and production partner that can bring ideas to the table and quickly implement them with awareness of the current opportunities and challenges. “We bring creative thinking to the technology, with what we can do given our technical expertise but also with things like concept art, animation and more,” says Snider-Held. “We don’t shy away from new tech, and not only do we understand it, but we can truly make something fun and inventive to demonstrate why people would want it.”

