
Search Generative Experience and Its Potential Impacts on Content and SEO


AI, Media, Paid Search · 5 min read

Written by
Maria Teresa Lopes
Content Technical Lead


In May 2023, Google announced the introduction of a new search experience that’s primarily based on the use of generative AI to adapt to new search behaviors. In Google’s own words, it’s a way to “unlock entirely new types of questions you never thought Search could answer, and transform the way information is organized, to help you sort through and make sense of what’s out there.” 

Search Generative Experience (SGE), or “generative Search Engine Results Pages (SERPs)” as I call it, serves as a facilitator for those seeking to find information on the web with greater speed. Instead of relying solely on keyword-based searches, you can pose complete questions and even follow up with additional inquiries, mimicking the conversational style of interacting with a language model-based chatbot.

However, since its launch, numerous discussions have arisen regarding this innovation—its advantages, disadvantages and potential limitations. For content marketing and SEO professionals, the question remains: what does this feature mean for our work?

What are the main changes we’ll see in the SERP?

Based on the various featured snippets and enhanced results currently available, it’s evident that the SGE will indeed greatly enhance users’ access to information. And if you are thinking, “I need a quick answer to a question, and I want it in an easily accessible place that allows me to navigate through complementary pages to delve deeper,” then Google will remain unrivaled. 

It’s like getting all you need in one search. And it makes sense, right? After all, Google held a global market share of 90.6% in June 2023, as reported by Similarweb. Additionally, according to Semrush's The State of Search, about one in five searches resulted in a click on the first search result. These are significant numbers. And when we talk about clicks in top positions, we are also referring to visits to sites that have a user-first mindset, promote quality content creation, respect the best practices of E-E-A-T evaluations, conform to Google’s Helpful Content system and contribute to user engagement on the channel.

In terms of the interface, the biggest change brought about by SGE is at the top section of the search results page. Essentially, generative answers replace the traditional list of paid URLs, providing users with a more immersive and semantic experience.

The image below provides a clearer illustration—although it’s worth noting there are various result variations depending on the type of search, which I’ll explore below.

screenshot of how SGE looks

Source: Search Engine Land

Let’s look at some other examples I found.

1. Informational search for a public figure

screenshot of public figure search

The Knowledge Panel upon searching for Martin Luther King Jr. (In Portuguese)

When searching for a public figure, the Knowledge Panel still appears, and the SGE asks if you want to generate something from it. As the Knowledge Panel typically offers comprehensive information about public figures or brands, as long as the information on the corresponding Wikipedia page is reliable, SGE seems to recognize that it may not be necessary to generate or synthesize further information.

2. Transactional search

screenshot showing results after searching for smartphones

The main results page after searching for Samsung smartphones.

In the case of a transactional intent search, the SERP is predominantly taken over by ads. Upon reaching the end of the first scroll, SGE once again prompts you to generate information but doesn’t do it automatically. Since the search query involves a transactional keyword rather than an informative/transactional or commercial one (such as “best phones to buy on Black Friday”), SGE appears to comprehend that the user intends to view prices directly. For that, the Shopping results provided are deemed sufficient.

3. Geolocated search

screenshot showing results after searching for a restaurant in a specific location

The results page when searching for restaurants in a specific location.

In this example of a geolocated search, SGE does not appear in the first tab, and organic results are not altered in any way. This is probably based on the same reasoning behind the previous example.

4. Informational search about health

 

screenshot showing results when searching for health information on Google

The results page when googling health-related information.

In the case of an informative intent search, SGE generates a comprehensive answer while displaying the first three organic results as snippets on the side. This format allows users who are seeking detailed information to access a full article. This is one of the primary advantages that I envision in the Search Generative Experience scenario: users who visit blogs or similar platforms will likely be more qualified and engaged with the content they choose to explore.

5. Informational search about finance

screenshot of results when looking for financial info

The results page when googling "save money every month." The first thing that comes up is a line that reads, "Generative AI is experimental. The quality of information may vary. Find out more."

In another search involving a hybrid intent, encompassing both informative and commercial aspects, SGE is displayed in a condensed manner, while a list of Featured Snippets offers more comprehensive results just below. However, due to space limitations, SGE requires users to click on “See more” to access the complete information. Upon expanding the view, SGE provides somewhat more generalized tips compared to the Featured results—image below.

screenshot google search

Upon clicking on "Know more," the page expands and features a list of tips on how to save money and their respective sources.

6. Informational search about games

google search results page screenshot

The results page when googling "the evolution of videogames." The first thing that comes up is a line that reads, "Generative AI is experimental. The quality of information may vary. Find out more."

In a purely informative intent search, SGE once again appears in a condensed form, with the Ingram blog ranking as the top organic result. Additionally, a “People Also Ask” box is visible below. For SEO professionals, it is important to note that, according to data from Insight Partners, 57% of the links cited by SGE come from the first page of organic results. This means that if your brand appears on that page, there is a significant likelihood that it will be referenced by the AI and consequently maintain high visibility.

How can brands prepare for SGE and succeed in this new landscape?

According to the information we have so far, the SERP will change. However, this is not the first time we’ve had to get used to change, is it? As always, we prepare for Google’s next alterations. The extent to which this poses a challenge for brands depends on the quality and focus they put on SEO and content marketing.

SGE doesn’t change the core principle of SEO, which is to ensure visibility and relevance. It is important, therefore, that we see it as a new era for brands who want qualified organic visibility and are already working hard for it. It will still be very necessary to:

  • Continue evolving and seeking better practices for organic ranking, focusing on content that prioritizes being user-first and responding to search intent.
  • Increase care during article production, looking at expertise in the segment, optimizing features, and different ways to stand out in SGE.
  • Ensure continuous and in-depth understanding of SGE, delving into concepts like Answer Engine Optimization (AEO), for instance, rather than perceiving it as a “villain” in the SEO landscape.
  • Continue to adhere to the principle that content is king, and its quality will always be fundamental, with even greater emphasis on sharing authentic research and data, user-generated content (UGC), and utilizing various media formats. Incorporate linguistic inclusion, keywords from related semantic fields, and adapt to hybrid search intent. Additionally, ensure an optimized HTML structure.
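As a small, hypothetical sketch of what an optimized HTML structure can include, the snippet below builds a schema.org Article JSON-LD block, the kind of structured data that helps search engines (and answer engines, in the AEO sense) understand what a page is about. All names and values here are placeholders, and this Python helper is just one of many ways to generate the markup.

```python
import json

# Hypothetical schema.org Article payload; a real page should use its
# actual headline, author and dates.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to Save Money Every Month",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2023-07-01",
    "dateModified": "2023-07-15",
}

def jsonld_script_tag(payload: dict) -> str:
    """Render the payload as the <script> block embedded in a page's <head>."""
    return (
        '<script type="application/ld+json">\n'
        + json.dumps(payload, indent=2)
        + "\n</script>"
    )

print(jsonld_script_tag(article_schema))
```

Pasting the resulting block into the page's `<head>` gives crawlers a machine-readable summary that complements, rather than replaces, a clean semantic HTML layout.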

To succeed in this new landscape, you have to immerse yourself in it and study it thoroughly. This is precisely what my team and I have been doing, and we recommend anyone interested in the subject to do the same. If your brand has not yet prioritized SEO, content marketing, and the essential work required to attract new users and populate the top of the funnel, now is an excellent time to do so. If needed, look for a team of specialists who can help you diagnose, detect, optimize, measure, and—of course—elevate your website to the top.

Our Content Technical Lead sums up everything you need to know about Google Search's new AI feature.

Media.Monks is Named Adweek’s Inaugural AI Agency of the Year


AI, AI & Emerging Technology Consulting, AI Consulting, Consulting, Monks news · 4 min read

Written by
Henry Cowling
Chief Innovation Officer

Collage of four images: a plant-strewn wall with a window in a cartoon world, AV equipment overlooking a basketball court, virtual influencer Lil Miquela sitting on a car, and a smartphone featuring the words "Future Record."

The news is out: we’ve been named Adweek’s AI Agency of the Year!

Each year, leading marketing and advertising publication Adweek honors agencies and marketing services partners who have demonstrated exceptional creativity, innovation and success across various fields and themes. The AI Agency of the Year category is new this year, making us the inaugural winner, and is designed to honor an agency that has shown creativity and ingenuity with applying generative AI to clients’ work. In addition to the work itself, entrants must also prove how they are enabling new efficiencies with the technology. This achievement marks a major step in our becoming the premier AI-first digital marketing services partner, helping brands accelerate and scale their adoption of AI.

Brands often run into one of two common problems when making early moves in AI: they either understand the benefits of the technology but don’t know where to start, or they stitch together a Frankenstein’s monster of point solutions and tools that are ultimately disconnected from each other—inhibiting AI’s potential. Solving these operational challenges is our bread and butter as a consultative partner because we use AI each day—and have built our credentials in the technology over the last decade.

Integration lays the groundwork for AI transformation.

One of the major opportunities in AI is that it serves as a systems integrator, ingesting data and insights from across business units and customer touchpoints to enhance the customer experience and build new efficiencies. At least, that’s the ambition. Each element of the marketing mix—data, media, creativity and technology—must combine for AI to be deployed at its fullest potential.

Monk Thoughts To train a bespoke brand language model, you first need a solid data foundation and unified workstreams that bring both creative and data disciplines together.

In the past five years, we’ve made the case for an integrated marketing services partner who can unify each of these practices. That process has laid the groundwork for building an AI-powered, end-to-end workstream that extends across and merges each of our capabilities. Our recent release of Generation AI, a formative report launched in collaboration with Salesforce, offers a window into how the technology is helping teams shape the entire marketing remit from insight to idea to execution.

Breaking silos and building a culture of experimentation have accelerated our ability to build this mature AI offering for brands. In fact, experimentation with AI has long been part of our business, with Wesley ter Haar, Co-CEO, Content, telling VentureBeat in 2017, “[AI] will allow us to refocus our efforts on what’s really going to impact the project in a meaningful way––which is design vision, design thinking and real creative leadership.”

But experiments in AI have long been relegated to the realm of R&D rather than the hands of everyday talent. When the generative AI boom suddenly ignited a year ago, making the technology far more user-friendly to people regardless of their knowledge in AI, our team enthusiastically rallied around Slack channels dedicated to brainstorming ways it could help them in their work.

Screenshots of the BMW Tomorrowland experience, which features knobs and settings in a chat environment enabling users to create a song with AI.

In celebration of the Tomorrowland festival, BMW gave users the chance to create their own music with AI using a chat-based interface.

This collaborative spirit has matured from casual experiments to innovation sprints that push tech to its limits, to products and services built bespoke for brands. Just look at our work with BMW Group, in which we built an AI-powered music creation platform that celebrated the brand’s partnership with EDM festival Tomorrowland. Music fans worldwide could create their own custom festival track by selecting different options in mood, pace and feeling, enabling creative expression on an unprecedented level.

A cohesive, end-to-end workstream beats random acts of digital.

Every brand may have a different need for AI: enhancing the customer experience, overcoming production constraints, activating customer data, or even a combination of those needs. That’s why we’ve built a flexible, AI-driven pipeline that connects a wide range of proprietary and third-party microservices across a single workstream.

In addition to making it easier to develop content at scale, the workstream critically has the potential to break down siloes between marketing and adjacent parts of the business. For example, imagine if your product design team could share 3D renders used to iterate assets that are then tested for performance, ultimately identifying the product angles and other variables in creative that best fit customers’ interest. Marrying product design, creative production, market mix modeling and customer insights, this workstream results in a flywheel that can inform future product designs and content.

“Experimenting with AI is important, but what’s more important are the business workflows,” says Michael Dobell, Co-Founder and EVP Innovation. “AI has sparked the need for workflow transformation, and we guide brands through that change, unlocking performance gains, lowering costs of production and finding new ways of working.” While marketing may be the most natural place for AI to make an impact right now, it’s actually a harbinger for wider business transformation—and this view, in turn, sets a new expectation for the role that agency partners will need to play to help brands get there.

For Kraft Heinz, we took a strategic approach to helping their in-house agency, The Kitchen, identify where and how AI could drive high value by increasing efficiencies, saving costs and elevating creative output. Together with the team, we developed a roadmap that walked through four key takeaways: data security, internal adoption, testing use cases and establishing adaptable frameworks. Overall, the engagement resulted in actionable, initial steps for the brand’s own internal team to take calculated steps toward AI maturity.

Here's to many more wins in the AI race.

We’ve always called ourselves a new age, new era partner to the world’s most innovative brands. Earning the title of Adweek’s AI Agency of the Year demonstrates that we’re already ahead—and just in time at the dawn of what’s been called the fourth Industrial Revolution.

See what else we've been up to with AI here.

Media.Monks is Adweek’s first-ever AI Agency of the Year—the culmination of early integrative efforts and experimentation.

How Our Innovation Sprints with AWS and Google Push Our Talent and Tech Partners Forward


AI, Experience · 5 min read

Written by
Iran Reyes
VP, Global Head of Engineering, Experience

Innovation sprints

“Given that AI technology is evolving rapidly, it’s extremely valuable to have a safe space to experiment with these technologies at an early stage,” our Executive Technical Director Andy McDonald tells me. So, we’ve created this safe space: innovation sprints are all about learning by doing and giving our talent the opportunity to get hands-on experience with building brand-new tools and technologies—and not just for our own gain, but to help push innovation forward at the world’s most impactful tech companies.

And opportunities there are. In the last few months alone, we’ve completed three innovation sprints in collaboration with some of our key cloud partners. First, we joined forces with Amazon Web Services (AWS) to host a challenge across time zones to create internal AI tools using Amazon SageMaker. A few weeks later, Google gave us, in our capacity as a large Workspace customer, the chance to play with Vertex AI and push the technology to its limits in two multi-day events focused on experimentation.

From the outset, the purpose of these sprints has been to benefit our cloud partners alongside our own business, as we collaborate on solving key industry challenges, developing use cases that drive brand results, and strengthening our partnerships.

Seizing every opportunity to sharpen our expertise.

The setup of innovation sprints is as follows: together with our partners—AWS and Google in this case—we come up with a challenge. From there, our talent dedicates their time and creative chops to coming up with ideas and executing on them using the partner’s AI technologies.

The needs of brands are at the heart of every sprint. Privacy, for one, is a common thread, as most brands are highly concerned with making sure everything is safe and sound. Collaborating with AWS and Google to develop AI tools guaranteed we were operating in a privacy-safe environment within their cloud computing platforms. For instance, when you’re deploying a project to Vertex AI, it’s sandboxed within your own hosting environment, which means it’s only pulling data from a knowledge base that you control. As for Amazon SageMaker, the service is GDPR-compliant.

When it comes to AI-driven projects fully hosted by our partners, our Technical Architect and one of our AWS Certified Solution Architects Ben Moody says, “We used to tackle AI projects with high-level tools like Amazon Rekognition and Transcribe, among others. With Amazon SageMaker, we can be entirely flexible, covering any custom AI solution, high AI data privacy needs, and low latency requirements.” 

In developing AI-driven solutions for brands, it’s critical to know all the capabilities as well as limitations of the tools you’re working with. As our Senior Creative Technologist Angelica Ortiz highlights, “We use a lot of the latest tools from our cloud partners, and these innovation sprints are a great opportunity to formally dedicate the time towards testing their capabilities.” Such early-stage testing enables us to truly understand the limits of what we are pushing certain tools to do—and as a result, McDonald says, “New ideas get spun around all the different ways we could use a technology, which is 100% going to show up in our client work.” 

Accelerating experimentation to drive results for brands.  

As anyone who works in the field of technology knows, experimentation has a ripple effect. Whether you run into a roadblock or discover a new possibility, you’re always expanding your knowledge and skills. But these ripples have a much further reach than just the individual creative, designer or developer. By experimenting with building our own AI tools in partnership with leading technology brands, we’re able to create truly crafted, custom-made solutions for our clients. Let’s be honest, massive AI tools can’t really do that (yet).

“With tools like SageMaker and Vertex AI being made available to us, we’re really able to supercharge our experimentation processes,” says McDonald. “And then, we can feed these generative AI learnings back into our existing projects and new pitches for AWS and Google as our clients.” As it turns out, most of the solutions our Monks come up with during these innovation sprints are transferable and can be wrapped up and applied to various other scenarios. 

Feedback is a very powerful ripple. Once a sprint comes to a halt, we always share our learnings with the aim to help our cloud partners improve their tooling. For example, Moody says, “During the AI challenge with AWS, we had ten teams with members across different time zones, and so we quickly noticed that it was not easy to set up a seamless MLOps and Monitoring strategy. Since our team was lucky to have direct contact with AWS, they supported us right away and provided learning resources for future production iterations.” 

Similarly, innovation sprints allow us to offer our cloud partners an exciting new take on their technologies. “They’ve been working on their products for years and years, so we can provide fresh perspectives—and sometimes even discover bugs in the system—by applying what we know using their tech. While they help us learn more about these technologies, we give them valuable feedback on how they can improve their products and services, so it’s really a win-win situation,” Ortiz says. And as a fun bonus, it often makes our partners excited to explore uncharted territory together.

Nurturing partner relationships is an ongoing process.  

This feeling of excitement to keep experimenting has been echoed by every participant and organizer of our recent innovation sprints, including myself. Now, all that’s left for us to do is to keep carving out the time, so that we can continually develop our creative ideas inspired by the technologies that our cloud partners kindly make available to us. Our Technical Solutions Engineer Sarah Sheppard highlighted that it’s great to “finally get the time to build some momentum on something. We have so many day-to-day things that can slow us down, so to actually set aside time and create space so we can keep our ideas moving forward—I think that's the best thing these sprints do for us.” Her team, for example, had several weeks to flesh out ideas while getting trained on the AI tools. “This made the whole experience feel like a sprint, as we tried to do as much as we could in the allocated time,” adds Sheppard.

Ultimately, this time spent on experimenting with existing technologies and creating new applications allows us to not only drive technical solutions for our cloud partners (and ourselves), but also push our partnerships forward. As Sheppard tells me, “One of the best parts of these sprints has been working with our partners and seeing where their heads are at in a totally different context. Instead of reaching out about a fire that needs putting out, I was now messaging them to say I had some cool ideas and if we could work on it together.” In the end, when it comes to team play, you always want to make sure that you add some fun to the game.  

Our innovation sprints with AWS and Google enable our talent to build new AI tools and push innovation at our cloud partners forward.

Will Brands Sink or Swim in the AI Video Revolution?


AI, AI Consulting, Artists, Studio · 3 min read

Written by
Chris Hoffman
Group Creative Director

Doodle image of different cameras on a pink backdrop

As a lifelong content creator, it’s easy to get stuck in your ways—I, for one, still use QuickTime 7 to play back videos I need to review. Despite being a bit of a stickler here and there, I’ve learned firsthand the importance of being technical as an artist and continually being open to change throughout my years in the field with each passing innovation. Without that, I wouldn’t have made the leap from CPU rendering to GPU rendering, a paradigm shift that required me to learn six different render engines. Altogether, this experience and many others have made me a better creative.

Sure, I’ve seen my fair share of over-hyped duds along the way—we all remember hyped-up “innovations” like 3D TVs that promised to change the way we create and consume video. But once in a blue moon, something comes along that will undeniably change the world. Recently, that’s generative artificial intelligence, yet I still see some brands shun the technology, worried about its risks.

As a creative, I’m not worried about AI taking my job away the way others might. I’m more concerned that by not embracing AI, I risk being left behind. The same risk is true for brands who are reluctant to fold AI into their workflows. Why? AI is making creativity more accessible than ever before; cinematic, high-quality content is no longer exclusive to the skilled few.

Monk Thoughts The cat is out of the bag, giving every brand a leg up in their creative capacity. The risk lies in not keeping up.
Headshot of Chris Hoffman

The democratization of AI will make some things easier, but not without challenges.

Technology has always transformed the creative process—in some ways making it simpler, and in other ways requiring creatives to adopt new skill sets. When the Lord of the Rings trilogy pushed boundaries, it led to the creation of new technology, like motion capture and its evolution into performance capture, and new talent hotbeds designed around making the most of those innovations. Today, AI is likewise challenging all of us to adapt.

First, there is the need to scale up production. The speed of creating content with AI is raising the expectation to make more. In this respect, AI doesn’t necessarily make content production easier; it makes it more sophisticated and ups creative potential. Making a mark remains a challenge.

We’ve already seen this before with CGI. Today, you can render a scene in three minutes in Maya that once took six hours. But fire up the program and it looks more like engineering software than something creative. Cobbling a scene together requires as much of an understanding of mathematics as it does of design. Using the technology to its fullest potential required the confidence to embrace it and tinker with it.

The biggest risk is in doing nothing at all.

It’s easy for brands to default to what’s familiar. I can relate; remember what I said about being stuck in my ways? But those who rest on their laurels risk losing market share to challengers who are quicker on the uptake and embrace experimentation. Smaller brands and influencers are already leveraging the availability of advanced video tools to make their mark. Closing that gap is key to reducing the risk of being forgotten.

Throughout my career, I have witnessed the transformative power of integrating technology and experimentation into one’s own creative DNA, and I am confident that this approach will continue to drive success for creative teams who dare to embrace it. On my team, we’re elevating our already best-in-class talent by augmenting their creative process with AI. As a team, we understand that it may require getting our hands a little dirty, and sometimes going back and forth with a chatbot more than expected, but the rewards are immense. By incorporating AI tools into every stage of the creative process, from ideation to concept art and beyond, we enable ourselves—and our clients—to surpass standard limitations, supercharge our output and create captivating content that leaves a lasting impact. And we can’t wait to see how it develops even further.

Start small, but think big.

The good news for risk-averse brands is that you don’t have to choose between being too conservative or too experimental, throwing caution to the wind. There’s no need for a binary approach to whether you’re in or out with AI adoption; there’s plenty of room to experiment within guardrails. You just need to start playing with the simpler ways to enhance your output (like generating numerous backdrops with AI, or digitally replacing products to make content more dynamic and personalized) and iterate from there as your team becomes more skilled.

If a creative with 20 years in the business can confidently embrace AI without reservations, so can you! While the AI boom may feel like untrod territory, it’s not the first time we’ve needed to creatively adapt—and with new customer expectations and increased competition through the democratization of content creation, there’s no better time than now to start. Otherwise, you might just be left behind.

The democratization of AI is revolutionizing the creative process, encouraging creatives to embrace AI technology or be left behind.

Performance Max: Over a Year in, Are We Prepared for a Keyword-less Future?


AI, Media, Paid Search · 4 min read

Written by
Tory Lariar
SVP, Paid Search

Image of lips overlaid with ripple design.

Language lives at the core of our experience as humans. We will always use words in our most honest moments when seeking information, looking for a service, buying a product, or solving an immediate need. Search has thrived as a marketing medium for over 20 years thanks to our human need to express our desires with words.

While I believe search will continue to thrive and drive results for advertisers, how we buy search is undoubtedly changing, as is the search engine results page (SERP) itself. One of the largest shifts in how we buy across the Google space has been Performance Max (PMax). It’s a keyword-less, AI-powered campaign type that uses machine learning models to optimize bids and placements across Google (including search) to hit a core objective. Advertisers don’t bid on specific keywords; instead, they rely on AI to handle bidding and targeting through audience signals, the advertiser's own website/URLs, and creative assets. This serves ads across the Google network, matching search queries and browsing behavior to the users most likely to convert to the desired action. So, while the search experience looks similar to the user, how we as digital marketers buy search is changing rapidly.

Early adoption of Performance Max was mixed and vertical-specific—now the tide is shifting.

Advertisers have embraced PMax with mixed readiness, many fighting the loss of the control they have come to expect from Google Ads over the years. In the earlier days of PMax, I felt that hesitance as well, especially for non-retail verticals and for very complex advertisers. I also knew from years of navigating similar evolutions in Google Ads that there would inevitably be new features and modifications based on feedback from agencies and advertisers. Sure enough, our testing has shown that while we may not be ready to say goodbye to keywords, PMax does get us one step closer to the option of scalable keyword-less targeting.

Google is now transitioning to a more visual SERP, leaning on image-rich formats and a generative search experience. Google has bet big on Performance Max, Broad Match, and AI-driven products in general. Google has also released more insights, creative tools, targeting, and testing levers in the platform to improve the product and allow for more insightful ways to leverage data across marketing efforts, including informing audience selection and creative. Additionally, Google is testing and launching automated and generative asset creation to help advertisers with the hurdle of costly and time-consuming creative iterations.

Let’s recap how PMax has evolved in the last year.

There continue to be new features launched to improve this product and improve campaign performance. I’ve detailed a few below that are more significant in terms of our utilization.

  • Uplift experiments and PMax vs. rSC (standard shopping) testing: These tests allow advertisers to compare the performance of PMax campaigns against other types of campaigns, such as standard shopping campaigns, and against other PMax campaigns with different settings or strategies.
  • Brand exclusions and account level negatives: These features allow advertisers to prevent their PMax campaigns from showing ads for certain brands or keywords. This can be helpful for preventing ads from showing for competitors or for keywords that are not relevant to the advertiser's business.
  • Video builder upgrades and improved asset-level group reporting: These features make it easier for advertisers to create and manage their PMax campaigns. The video builder allows advertisers to create videos for their PMax campaigns without the need for any video editing experience, while the improved asset-level group reporting provides advertisers with more insights into the performance of their PMax campaigns.
  • Final URL expansion helps you optimize your PMax campaigns' performance by finding a more relevant landing page based on the user's search query and intent, and allows for a customized dynamic ad headline that matches the landing page content.
  • SA360 Floodlight bidding support, which allows advertisers to use their own conversion data to bid on PMax campaigns.
  • The ability to run shopping/feed ads through the PMax ad type without inputting assets. While it’s not 100% guaranteed that auto-generated assets won’t run, our testing shows that the majority of spend goes to shopping placements.
  • The ability to leverage scripts that break out spend by tactic (video views, etc.).
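As a rough illustration of what a spend-by-tactic script computes (sketched here in Python for readability, rather than the JavaScript that Google Ads scripts actually run; the row fields and values are hypothetical), the report boils down to a grouped sum over exported rows:

```python
from collections import defaultdict

def spend_by_tactic(rows):
    """Group campaign cost by serving tactic.

    `rows` are hypothetical exported report rows as (tactic, cost) pairs,
    e.g. from a PMax spend-by-placement breakdown.
    """
    totals = defaultdict(float)
    for tactic, cost in rows:
        totals[tactic] += cost
    return dict(totals)

rows = [("shopping", 120.50), ("video", 30.00),
        ("search", 45.25), ("shopping", 79.50)]
print(spend_by_tactic(rows))  # {'shopping': 200.0, 'video': 30.0, 'search': 45.25}
```

In practice, the community scripts referenced above do the heavy lifting of pulling these rows out of the reporting API before aggregating them like this.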

We were an early adopter and rolled out a very systematic approach to testing.

Our team has been highly committed to testing PMax since the initial product launch announcement, working to incorporate all its new functionalities and experiment options. We’ve tested PMax against standard shopping and Dynamic Search Ads (DSA). We’ve also tested new-customer bidding, multiple asset groups with distinctly different creative versus a single asset group, and more. While we tend to see higher performance across our retail clients utilizing the feed-based and local solutions, we are now seeing growth in other verticals, even with sensitive lead-gen advertisers such as healthcare, as a result of new features.

Here’s an example of our testing. A healthcare client recently did a head-to-head test with DSA and was able to scale 10x in investment and lead volume by tapping into the power of PMax. In a few short months, the shift to PMax drove a 13% increase in leads with an 8% reduction in CPA. Setting thoughtful, flexible targets; breaking out campaigns by business need; having objective targeted creative; and giving machine learning the space to find the right people at the right time with first-party data allowed the brand to scale.

Here is what we anticipate will come next:

  • Visual/image site links will be favored over text site links.
  • There is a possibility that PMax is the end goal for Google Ads, with limited to no ability to bid on selected keywords (i.e., a keyword-less future).
  • 2024 could be the year we lose other match types. I’m hoping we keep exact match, but we know broad match will stay for some time given the investment in the new broad match.

A look at the road ahead for keyword bidding.

Over the last year, Google has been an excellent partner, listening to feedback from experts in the field and investing in more data insights, testing ability, training, and targeting controls. In addition, the new generative creative levers in the platform can arm us as search marketers to harness the power of AI while utilizing our knowledge of the Google network and specific brand needs to drive client wins.

Are we ready for a fully keyword-less search buying experience today? No, but given the progress of Performance Max over the last year, it’s becoming more likely that pairing broad match and Performance Max will be in every best practice deck from Google partners for the foreseeable future.


Performance Marketers Should be at the Center of AI Transformation


AI, Data, Digital transformation, Media, Performance Media 4 min read

Written by
Adam Edwards
EVP, Performance Media

A computer generated skeleton with guidelines around it

The meteoric rise of GPT-4, as well as generative AI tech more generally, has the digital marketing world focused on the wide-reaching implications for our industry. Understandably, the majority of the attention has been on the impact of ideating and scaling creative and content more efficiently. After all, generative AI unlocks the power to generate high-quality content, and lots of it, like never before.

Performance marketers have been an underutilized resource to date, but their years of experience using AI for marketing success make them well suited to play a large role in broader AI adoption. Blind disciples of every generative AI shortcut will get burned and those resistant to change will become irrelevant. Nobody knows this more than performance marketers. 

As it relates to the digital marketing AI arms race, Google, and to a lesser extent Meta, weren’t nearly as proactive at highlighting their work relative to Microsoft (the largest investor in OpenAI, the company responsible for GPT-4). The irony is that Google and Meta had been at the forefront of incorporating their long-standing AI investments, with AI already deployed in almost every corner of the Google and Meta Ads platforms and products.

Google and Meta represent nearly half of all digital ad spending in the US, and an even larger share of the typical performance media budget. AI integration in Google and Meta has most prominently centered around machine learning algorithms for bidding and ad serving. That said, there are examples of generative AI as well (suggesting ad copy and creating distinct ad copy from permutations of existing headlines and body copy), and AI’s tentacles can be felt everywhere in the Google and Meta ad ecosystem. Prominent examples include:

  • Performance Max (Google) and Advantage+ (Meta) are effectively end-to-end automated campaigns that use AI to target, generate ads and optimize toward set goals.
  • Automated bidding sets dynamic bids in real time using machine learning to more efficiently optimize toward the highest ROI.
  • Responsive Search Ads (Google) use AI to mix and match different portions of copy to deliver the best permutation for the individual searcher (the right ad to the right audience at the right time).
  • Recent Google Marketing Live (GML) and Meta Connect 2023 conferences announced products around AI-powered assets, AI-generated images, generative AI to create ad copy and auto enhancements to text placement, brightness, etc.
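The Responsive Search Ads idea, picking the best-scoring permutation of copy parts, can be sketched in a few lines. The scoring function below is a toy stand-in for the platform's learned relevance model, and every string is invented for illustration; nothing here is Google's actual algorithm:

```python
from itertools import product

def best_permutation(headlines, descriptions, score):
    """Return the (headline, description) pair with the highest score.

    `score` stands in for a learned click-through model; real systems
    score permutations per query and per user, not globally.
    """
    return max(product(headlines, descriptions),
               key=lambda pair: score(*pair))

# Toy model: favor copy mentioning a (hypothetical) search query.
def toy_score(headline, description, query="running shoes"):
    return sum(query in text.lower() for text in (headline, description))

best = best_permutation(
    ["Shop Running Shoes", "Big Sale Today"],
    ["Free shipping on running shoes.", "Limited time offer."],
    toy_score,
)
print(best)  # ('Shop Running Shoes', 'Free shipping on running shoes.')
```

The real value, of course, is that the platform learns the scoring function from live auction data rather than relying on a hand-written rule like this one.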

In that same vein, performance marketers, most of whom earned their stripes running or overseeing Google and/or Meta Ads, are particularly well suited to guide advertisers through this next major stage in digital transformation. The nearly half decade of experience most performance marketers have both harnessing and reining in AI tools justifies their playing a central role in guiding marketing teams as they develop and deploy generative AI.

What about this experience gives performance marketers an advantage? 

  • Threading the needle between uncritical adoption and complete resistance to change
  • Understanding of the importance of high-quality data inputs 
  • Understanding the importance of setting guardrails and tweaking those over time 

Bringing healthy skepticism to the table.

Seasoned performance marketers have had to adapt and learn new types of automation many times over, and can share their war stories. From broad match keywords to Meta auto-placements to iteration after iteration of automated bidding on Google gone awry, we’ve seemingly seen it all. Google and Meta were trailblazers in incorporating AI into ad products, and reps would very earnestly push adoption of products that could be buggy and, at worst, underperform manual alternatives. However, Google and Meta were also diligent about refining those products over time, and performance marketers who weren’t willing to keep testing over the last few years were quickly left behind. Broad match keywords, automated bidding, Advantage+ shopping campaigns and many more products delivered more scale at comparable efficiency to non-AI-driven products.

As AI plays a more permanent role across creative, customer journey, audience identification and more, this balance will be crucial. Blind disciples of every generative AI shortcut will get burned and those resistant to change will become irrelevant. 

Garbage in = garbage out.

One of the biggest distinctions between a strong performance marketer and a mediocre one is her understanding that the inputs to automation can have a profound effect on outcomes. Performance marketers who press the easy button and switch from hundreds of manual bids per week to auto-pilot don’t get strong results. Worse yet, they’re quick to declare, “It doesn’t work!” Data volume and quality are the foundation of an effective AI deployment strategy. Knowing which data sources to use and exclude, and which campaigns to match with each specific type of automated bidding, is a crucial skill. Performance marketers know to incorporate lead quality data into B2B auto-bidding, initiate testing on campaigns with higher conversion volumes, and not to launch immediately after a strong holiday or back-to-school period.
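Those pre-launch checks can be expressed as a simple gate. The 30-conversion floor and the record fields below are illustrative assumptions, a rule of thumb for the sketch rather than any platform's actual requirement:

```python
def ready_for_auto_bidding(campaign, min_weekly_conversions=30,
                           post_anomaly_period=False):
    """Gate an automated-bidding test on basic data-quality conditions.

    Assumed (hypothetical) campaign fields: weekly conversion volume and
    whether quality signals (e.g. lead quality data) are wired in. The
    threshold is an illustrative rule of thumb, not a platform rule.
    """
    return (
        campaign["weekly_conversions"] >= min_weekly_conversions
        and campaign.get("has_quality_signals", False)
        and not post_anomaly_period  # don't launch right after a holiday spike
    )

print(ready_for_auto_bidding(
    {"weekly_conversions": 80, "has_quality_signals": True}))  # True
```

The point isn't the code itself but the habit it encodes: decide up front which campaigns have enough clean data to let the algorithm learn, and which don't.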

In this sense, performance marketers have years of “prompt engineering” reps without even realizing there was a name for it. Marketing organizations stand to get AI into market faster, and benefit sooner from the positive results, by tapping into that experience. 

Performance marketers are masters at fine tuning.

The last level of mastery that performance marketers have achieved has to do with learning the intricacies of the algos. We have applied max CPCs, cost caps and negative keywords to rein in the occasionally deleterious effects of unchecked AI. At a high level, AI can be fickle, and human intelligence is crucial to avoid these blips. We have seen a top-performing ad set stop delivering seemingly out of nowhere, only to have a minor 5% increase in ROAS target return it to normalcy. We’ve learned to mine for insights around how, why and where AI is working:

  • Is stronger performance because we’re seeing increased CTR or conversion rate?
  • Are we getting in front of the same audience more cost effectively or reaching a better audience?
  • Did we create better ads, or did the platforms get better at matching them to the right people?

We ask these questions daily. That curiosity bordering on paranoia allows performance marketers to squeeze the most out of AI, as well as limit downside risk. 

Performance marketers have a feel for AI’s rhythms, like a mechanic knowing just which bolt to tighten to get the rattling sound in the car to stop. This mileage, or, to put it anachronistically, “human intelligence,” is tough to replicate.

This AI mileage and its broad applications are why performance marketers should have a seat at the table. As an agency leader, I’m better equipped to weigh in on how we utilize AI across tasks, reporting, data integration and scripts, and how we implement processes around AI, because of that performance DNA.


Generate Content at a Fast Clip with Fan-Focused AI Highlights


AI, Emerging media, Experience, VR & Live Video Production 4 min read

Written by
Lewis Smithingham
SVP of Strategic Industries

VR headsets and production equipment images are collaged together

With an explosion of connected technology—from VR to virtual worlds, TikTok to Instagram brands and more—the business of broadcast is now the business of content, commerce and culture delivered fit to format. Essentially, broadcast today is all about having the best content pipeline that’s able to deliver to myriad audiences across channels.

59% of Gen Z watch longer videos they discovered on short-form video apps, demonstrating the need for broadcast rights-holders to embrace ecosystem-level thinking. We’ve worked alongside brands like Meta, Hasbro, TikTok and Verizon to evolve their broadcasting approach and meet the habits of today’s viewers through experiences that are immersive, interactive, and reach audiences where they are. Now, we’re developing an AI solution that will further revolutionize this next-generation broadcast workflow to create more engaging, personalized content for consumers with Fan-Focused AI Highlights.

Fan-Focused AI Highlights clips hyper-relevant content at speed and scale.

Fan-Focused AI Highlights, currently in development, uses AI and machine learning to instantly clip highlights in live broadcasts. The AI model is capable of segmenting individual people and objects in live broadcasts and effectively eliminates the need for manual selection and editing, a typically time-intensive process.

The speed and volume of content unlocked by Fan-Focused AI Highlights is crucial to delivering the snackable content today’s sports viewers crave. Gen Z now consumes more highlights (50%) than live content (35%), validating the appetite for a moment-based approach to content delivery that is also more personalized.

EVP, Global Head of Experience at Media.Monks and former NCAA player Jordan Cuddy offers one example of how this trend is impacting the world of sports. “With Lionel Messi now signed onto Inter Miami, many of his fans may not care to watch American soccer,” she says. “Rather than sit through a 90-minute game, they just want to see the eight minutes where he’s touching the ball.” Her point is backed up by the fact that 80% of Gen Z fans not only follow a professional athlete online but seek to watch the events those athletes participate in, as well as follow the brands they engage with. With Fan-Focused AI Highlights, you could automatically clip together a reel of the game focused on Messi’s—or any athlete’s—best plays with ease.
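Under the hood, a highlight clipper of this kind reduces to turning per-frame relevance into time ranges. The sketch below is an illustrative reconstruction of the general technique, not Media.Monks' actual model: assume some detector emits a confidence per frame that the tracked athlete is in the action, then keep the above-threshold stretches and merge near-adjacent ones into clips:

```python
def highlight_segments(scores, fps=30.0, threshold=0.5, max_gap=2.0):
    """Convert per-frame relevance scores into (start, end) times in seconds.

    `scores[i]` is a hypothetical detector confidence that frame i shows
    the tracked athlete in action. Runs above `threshold` become segments;
    segments separated by less than `max_gap` seconds are merged.
    """
    segments, start = [], None
    for i, s in enumerate(scores):
        if s >= threshold and start is None:
            start = i
        elif s < threshold and start is not None:
            segments.append((start / fps, i / fps))
            start = None
    if start is not None:
        segments.append((start / fps, len(scores) / fps))

    merged = []
    for seg in segments:
        if merged and seg[0] - merged[-1][1] < max_gap:
            merged[-1] = (merged[-1][0], seg[1])  # bridge the short gap
        else:
            merged.append(seg)
    return merged

# One "frame" per second for readability: two bursts close together merge.
print(highlight_segments([0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 1], fps=1.0))
# [(1.0, 5.0), (10.0, 12.0)]
```

The hard part in production is the detector itself (segmenting people and objects in a live feed); once you have per-frame scores, assembling a Messi-only reel is essentially this kind of thresholding and stitching, run at broadcast speed.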

Deliver on the hunger for affinity-based content.

The same approach above could apply to even more niche content and viewer interests. Imagine a basketball game that AI automatically slices into social media content focused on footwear worn by the athletes, then pushed out to an audience of sneakerheads by an athletic apparel brand. This is easily achieved with Fan-Focused AI Highlights—helping brands and broadcast rights holders alike reach audiences in more relevant ways, while also expanding the quantity and value of their broadcast rights.

We’re in a new era where people are no longer defined by demographics broken up by where they live; now it’s about identity groups. Rather than carve up territories on a map, broadcasters can creatively package up content for numerous subcultures simultaneously, leveraging the power of AI and machine learning to distribute custom highlight content to tailored interest-based audiences more accurately and effectively. This is a massive opportunity for rights holders, as 73% of sports viewers perceive rights owners’ use of fan data as “disappointing” (23.4%) or “below expectations, but catching up” (49.7%).

Adapt broadcast content to fit today’s viewing habits.

Fan-Focused AI Highlights is the latest solution within our software-defined production offering, which effectively eliminates the need for a large physical plant—like the control rooms or OB trucks that cost tens of thousands to rent per day and the dozens of crew members needed to maintain them—in favor of versatile, nimble broadcast workstreams. Single-use appliances designed for one task alone make way for NVIDIA GPUs in the cloud (or a server rack), adding efficiency, flexibility and cost savings, while remote teams allow rights holders to hire the best talent for the job regardless of their proximity to the event.

Software-defined production has even enabled us to do what was never done before. Working with UNC Blue Sky Innovations, we streamed the first sporting event in stereoscopic 3D at 60 frames per second and an 8K resolution, directly to VR headsets. The custom-designed pipeline features a RED Digital Cinema camera; RED CPUs that decode, color correct and de-warp footage directly from that camera; a Blackmagic controller for live switching and encoding (from NVIDIA GPUs for a high-quality bitrate); and a 1GB network to deliver the feed to an AWS instance on its way to VR headsets.

All this equipment took up the modest space of a standard foldout table—a small footprint for an innovative pipeline and history-making broadcast. Still, broadcast professionals are a traditionally superstitious bunch, and it’s easy to see why moving much of the equipment and processes to software could leave them wary: what if you run into connectivity issues or a data center goes down? The same data centers that AWS uses also host banks and other extremely sensitive operations, meaning there are multiple safeguards in place to ensure service isn’t interrupted. And if one does go down, we can spin it up on another one. With multiple redundancies in place, any technical difficulty with software is faster and easier to fix than if your truck generator went down.

A sustainable approach to innovation.

In addition to reduced risk and additional flexibility, software-defined production offers another important benefit: sustainability. Media.Monks won a Sustainability in Leadership award at NAB Show by greatly reducing the carbon footprint of broadcasts with AWS. In addition to avoiding travel-related emissions, the software-defined production workstream is powered by 95%+ renewable energy, further reducing environmental impact.

With Fan-Focused AI Highlights added to the mix, brands can continue to deliver even more personalized, relevant content designed for today’s audiences with lower emissions, risk and cost, and fewer people on the ground. As viewers crave a more moment-based approach to the media and entertainment they consume, this revolutionary broadcast model helps brands expand the value of their broadcast rights in innovative new ways.


Salesforce Just Announced No-Cost Access to Data Cloud—Here’s How to Get Started


AI, Data Strategy & Advisory, Industry events, Transformation & In-Housing 3 min read

Written by
Jeremy Bunch
GM, Pre-Sales and Advisory Services

cloud entering the void

Dreamforce, Salesforce’s annual user conference, is never without its fair share of exciting announcements—and this year’s event kicked off with news that showcases Salesforce’s continued investment in generative AI and its Data Cloud solution.

First off, Salesforce announced its new Einstein 1 platform. Built on Salesforce’s improved metadata framework, the platform allows companies to connect any of their data to build low-code, AI-powered apps. As Salesforce prepares the launch of its generative AI interface, Einstein Copilot, this fall, Einstein 1 will give marketers a taste of how generative AI can fuel new CRM experiences.

But that’s not all: unleashing the power of AI relies on robust enterprise data, and Salesforce is teeing up Data Cloud to become the central data hub for Einstein Copilot. To help brands find their footing in this brave new territory, Salesforce is offering no-cost access to Data Cloud for certain existing customers, meaning there’s no better time than now to build a solid data foundation in preparation for AI’s implementation in your business. Let’s dive into deeper detail of what was announced, what it means, and how you can get started.

So, what was announced at Dreamforce?

Salesforce Sales Cloud and Service Cloud customers with Enterprise or Enterprise Unlimited editions will be granted access to Salesforce Data Cloud, in addition to two Tableau creator licenses. As part of this access, Salesforce will provide 250,000 credits, which enable customers to onboard onto Data Cloud and develop Sales and Service Cloud use cases at no additional cost.

Data Cloud is the foundation that unlocks the generative AI features within the Salesforce platform, which in turn help clients drive efficiencies for their business. With this new move, Salesforce is making Data Cloud more accessible to customers and allowing them to begin leveraging the power of the platform more quickly—which is key, because the best data pipeline wins when it comes to realizing the potential of AI. 

What are the best beginner use cases for Data Cloud?

Access at no additional cost, coupled with 250,000 credits, grants customers the ability to start using the Data Cloud platform and build foundational use cases that then lead to more advanced use cases and further utilization. So, what kinds of use cases should you start with? Salesforce has identified the following two:

  1. Unify Prospects for Targeted Selling: Consolidate data across multiple Sales Cloud organizations to identify opportunities with priority customers and increase revenue.
  2. Unify Customers for Personalized Service: Consolidate data across multiple organizations to empower service agents with a unified, 360-degree view of the customer.

These use cases are geared toward customers looking to consolidate data across multiple organizations, whether because they’re a portfolio company with multiple companies or brands, or because they’ve intentionally stood up separate Salesforce instances.
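Conceptually, both use cases boil down to keyed identity resolution across sources. This sketch (the field names and merge rule are illustrative assumptions, not how Data Cloud is implemented) shows the basic idea of folding records from two orgs into unified profiles:

```python
def unify_profiles(org_a, org_b, key="email"):
    """Fold records from two orgs into one profile per match key.

    Records are hypothetical dicts. For each field, the first non-empty
    value wins; later records only fill gaps. Real identity resolution
    uses richer match rules (fuzzy names, phone normalization, etc.).
    """
    unified = {}
    for record in org_a + org_b:
        match = record.get(key)
        if not match:
            continue  # skip records we can't key on
        profile = unified.setdefault(match, {})
        for field, value in record.items():
            if value and not profile.get(field):
                profile[field] = value
    return unified

sales_org = [{"email": "ana@example.com", "name": "Ana"}]
service_org = [{"email": "ana@example.com", "phone": "555-0100"},
               {"email": "bo@example.com", "name": "Bo"}]
profiles = unify_profiles(sales_org, service_org)
```

Deciding the match key and the survivorship rule (which value wins when sources disagree) is exactly the planning work the next section argues you should do before ingesting anything.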

How can I get started?

Prior to activating any Data Cloud use case, customers should first evaluate their data management standards to ensure a proper data foundation. While zero-cost access to Data Cloud presents a relatively low-risk opportunity, unwinding a poorly thought-through integration can lead to complex and time-consuming work down the line. Therefore, it’s imperative for customers to take the time to develop a clear plan, considering not only what data to ingest, but also how to ingest and organize that data.

At Media.Monks, we aim to help customers feel confident in their ability to leverage Data Cloud to drive value for their businesses. Our team of Data Consultants and Salesforce Architects assess customers’ first-party data strategy and technical architecture in order to build a detailed implementation plan, ensuring customers are set up for success and are receiving long-term value from their Data Cloud implementation.

Gain more insights from Dreamforce.

Join us for our Dreamforce to You events happening across the globe. There, we’ll dive deeper into how to design and build an effective data strategy and pipeline and how to unlock the power of Data Cloud for your business. Learn more about the events, hosted in London and Chicago. Stay tuned for more Dreamforce to You events around the world!

