NVIDIA GTC 2026: Orchestrate the Autonomous Workforce

AI, Industry events · 5 min read

Written by
Monks

Image: the outdoor plaza at the NVIDIA GTC 2026 conference in San Jose, with large white NVIDIA letters and attendees blurred in motion.

The atmosphere at GTC 2026 was electric, defined by a move away from the speculative AI hype of previous years toward the grit of true industrialization. While 2024 and 2025 focused on the awe of discovery, 2026 is centered on the reality of implementation. Throughout the halls of the San Jose Convention Center, the conversation shifted from chatbots to token budgets and agentic workflows. NVIDIA CEO Jensen Huang set a definitive tone: the era of AI as a conversational novelty has ended, giving way to a new reality where AI is no longer just a tool we use, but a teammate embedded directly into our professional workflows.

For several years, the industry’s focus remained almost entirely on training—the massive, capital-intensive process of teaching models to understand the world. The world now prioritizes inference: the moment those models are put to work to generate actual value. In his keynote, Huang underscored this by projecting $1 trillion in AI infrastructure orders through 2027, a signal that the global economy is now betting on the sustained production of intelligence.

Since the beginning of the year, we have maintained that the industry has moved beyond the AI pilot phase. This shift fundamentally redefines the creative supply chain, moving us toward the development of AI factories. To maintain real-time relevance, CMOs must now shift from managing manual tasks to orchestrating an autonomous, high-performance workforce augmented by AI.

New architectures enable productivity at the speed of thought.

If the previous generation of hardware was the “big bang” of model creation, the new Vera Rubin architecture is about the work of model execution. This platform is a structural redesign of how AI is put to work. By integrating specialized processors—specifically the new Groq 3 LPX—NVIDIA has solved the primary bottleneck for global brands: the sluggishness of AI. While older systems felt like waiting for a high-powered calculator to finish a task, this new architecture allows AI to process information at the speed of thought.

For a brand, this technical leap translates directly into always-on productivity. In the past, AI was a “pull” technology—a tool that sat idle until a human prompted it. In contrast, the efficiency of the Vera Rubin platform changes the physics of the creative supply chain. It provides the horsepower required for AI teammates to work in the background, 24/7, without the prohibitive costs or lag times that previously stalled enterprise adoption.

Agents are executing increasingly complex enterprise tasks.

If the Vera Rubin architecture is the factory floor, then OpenClaw and NemoClaw are the workers. GTC 2026 showcased the maturation of agentic AI—systems that don't just process text, but can see, plan and act autonomously. Huang described OpenClaw as the "operating system for personal AI," a framework that allows these agents to move beyond simple chat interfaces and execute complex missions across enterprise workflows.

The challenge for any global brand is that autonomy without control is a liability. This is where NemoClaw enters the picture. While OpenClaw provides the raw capability for agents to act, NemoClaw provides the enterprise-grade "how." It’s a production-ready stack that layers in essential security sandboxes, privacy routers and policy engines. These ensure that an agent doesn't drift outside of brand guidelines or legal guardrails.

To bridge the gap between powerful technical frameworks and day-to-day brand operations, we deploy Monks.Flow, our AI ecosystem for marketing orchestration. Rather than treating agents as isolated tools, Monks.Flow creates a bespoke system of intelligent agents that reason, plan and execute across the entire marketing lifecycle. This approach transforms the traditional creative supply chain into a fluid, real-time engine, allowing brands to move from a morning strategy session to a full-scale deployment by the afternoon.

We deploy Monks.Flow as a systems integration partner, providing the connective tissue required to make this technical potential a practical reality. By orchestrating elite talent alongside agentic machines, we help brands move past fulfilling manual tasks and toward managing a high-velocity workforce that operates at the speed of social conversation.

Data is key to giving AI definitive direction.

If the hardware provides the horsepower and the agents provide the labor, data provides the direction. One of the most significant themes of GTC 2026 was the reinforcement of structured data as the definitive foundation for reliable AI. As Huang noted during the keynote, "Structured data remains the definitive ground truth for enterprise applications."

This is where many brands still face a silent bottleneck. While the industry has been enamored with the creative potential of unstructured data—images, videos, and conversational text—the reality is that autonomous agents require organized, governed data to act with precision. To address this, NVIDIA highlighted cuDF, its GPU-accelerated library that brings massive speed to data processing. By moving data analytics from CPUs to GPUs, tasks that previously took hours are now reduced to minutes, enabling the real-time feedback loops required for an agentic workforce.
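To make the CPU-to-GPU shift concrete: cuDF mirrors the pandas API, so ordinary dataframe code can be GPU-accelerated without rewrites via NVIDIA's cudf.pandas accelerator. The sketch below is plain pandas over illustrative campaign data; per NVIDIA's documentation, running the same script under `python -m cudf.pandas` (or `%load_ext cudf.pandas` in a notebook) executes these operations on the GPU where possible.

```python
# Aggregating campaign engagement data — plain pandas. Under the
# cudf.pandas accelerator, this unchanged code runs on the GPU,
# turning hours-long analytics jobs into minutes.
import pandas as pd

events = pd.DataFrame({
    "campaign": ["spring", "spring", "fall", "fall", "fall"],
    "channel":  ["social", "search", "social", "social", "search"],
    "views":    [1200, 800, 3000, 2500, 400],
})

# Group-by aggregation: the kind of workload cuDF moves from CPU to GPU.
summary = (events.groupby(["campaign", "channel"])["views"]
                 .sum()
                 .reset_index()
                 .sort_values("views", ascending=False))
print(summary)
```

The data and column names here are hypothetical; the point is that the real-time feedback loops an agentic workforce needs come from accelerating exactly this kind of routine aggregation.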

In our talent and machines model, this data layer connects brand strategy directly to market execution. By mechanizing the Four Cs—company, consumer, competitor and culture—we can provide the agents in the factory with a real-time flight simulator, allowing them to pressure-test creative concepts against cultural white space before a single dollar of media is committed.

The success of this orchestration relies on a new standard of data accountability. Because every reasoning decision, content reference, and prompt seed is drawn from a structured data layer, it becomes part of a fully auditable trail. This transforms the black box of AI into a transparent system of record, ensuring that high-stakes marketing missions are grounded in proprietary brand DNA and meet enterprise-grade standards for safety while operating at the speed of social conversation.

Orchestration will win the relevance race.

The convergence of the Vera Rubin architecture and agentic AI signals a fundamental shift in the creative supply chain. GTC 2026 provided the definitive blueprint for this new industrial reality, moving the industry beyond the novelty of discovery toward the precision of execution. For global brands, the AI pilot phase has officially transitioned into the era of the high-performance AI workflow.

This shift signals the arrival of zero-distance marketing. As agentic systems collapse the legacy gaps between brand awareness and the transaction, the traditional marketing funnel is effectively flattened into a single point of interaction. Discovery and conversion now happen simultaneously, driven by intelligent agents that identify and capture intent in the exact moment of need.

Winning the race to relevance is now a matter of orchestrating at the speed of culture. Structural advantage no longer comes from manual tasks or isolated AI experiments, but from a CMO’s ability to scale operations. The post-agency era marks a definitive shift from fulfilling individual briefs to building proprietary AI factories—environments where elite talent and agentic machines collaborate in a continuous, real-time loop. 

The question is no longer "How can AI help our teams?" but "How quickly can we build the system that orchestrates our future?" By acting as a systems integration partner, we are helping brands bridge the gap between this technical potential and practical, day-to-day application, ensuring that the factory floor is ready for the demands of a real-time world.


Driving Engagement With a Dream F1 Track

  • Client

    Google

  • Solutions

    Social Campaigns, AI & Emerging Technology Consulting, Immersive Brand Storytelling

  • 16 hours from shoot to live
  • 138+ million total views (1.6 million organic views)
  • 370+ million total impressions
  • 162 million views
  • Top 3 in organic social performance for Google in 2025

Case Study

Showcasing the speed of Gemini through a fan-fueled experiment.

By tapping into the high-performance world of Formula 1, Google sought to demonstrate the speed and creative intelligence of Nano Banana, Gemini’s AI image generator and photo editor. To do so, they needed compelling social content that showcased the model’s creative power in real-time. But with a challenging four-week deadline and a single 30-minute session with McLaren F1 driver Oscar Piastri, the challenge was significant.

Our answer was “Oscar Piastri’s Dream Track,” a social-first campaign that established Gemini as the essential co-driver in the creative process.

  • Images: a papaya-themed race track created by Oscar Piastri using Google Gemini; Oscar Piastri smiling while holding an RC car controller

A social-first race from digital prompts to physical reality.

The campaign launched with a high-speed experiment where Oscar created his personal dream track. Captured for Instagram Reels and YouTube Shorts, the video featured the driver using specific Nano Banana prompts to visualize the circuit. We then brought this AI-generated vision to life using floor projection, creating a larger-than-life track complete with Singapore Grand Prix Easter eggs that Oscar raced using a radio control car. 

The original video served as the catalyst for a multi-phase rollout: it functioned as a call to action for influencers to invite fans to use the same prompts to design and submit their own dream tracks, turning it into a community-wide creative challenge.

Oscar Piastri's AI dream track brought to life so he could race it for real.

Accelerating creativity with an agentic AI ecosystem.

To execute this campaign in a fraction of the usual time, we leaned heavily on the capabilities of Gemini and Monks.Flow, our agentic AI ecosystem for marketing orchestration. Monks.Flow excels at revealing deep audience insights and behaviors, so we fed its agents with detailed data on Singapore F1, McLaren and Oscar Piastri. This allowed us to generate unique creative angles that served as the foundation for our brainstorming.

Finally, we deployed AI models to scout the perfect creators and to reverse-engineer Gemini Nano Banana, testing the specific prompts required to get the best visual outputs. The rigorous prompt engineering phase enabled us to not only guide Oscar on set but also teach influencers and fans exactly how to prompt the tool to design their own high-quality tracks.

  • AI-generated F1 racetracks using Google's Gemini Nano Banana model

Driving high-speed engagement and community creativity.

The results show how an AI-first approach can drive high-quality engagement. Amassing 162 million total views and 138+ million total impressions, we successfully transformed a technical product demo into a viral, community-led creative session.

Oscar Piastri F1 driver standing with the Google team and Singapore Monks.

Want to talk artificial intelligence? Get in touch.


What 2025 Revealed About AI, and What It Unlocks in 2026

AI, AI & Emerging Technology Consulting · 5 min read

Written by
Monks

Image: a profile portrait of a woman with motion-blurred blonde hair, lit against a dark background of magenta and purple shapes.

2025 served as the definitive pivot point where artificial intelligence matured from a technical curiosity into a foundational organizational layer. Throughout the year, the strategic focus evolved from testing isolated tools toward architecting unified operating models that redefine the mechanics of modern work. This progression represents the shift from the "art of the possible" to the “architecture of the actual”—a transition into structured systems that deliver high-fidelity results at global scale.

The signals surfacing across 2025 have now crystallized into a strategic mandate: the industrialization of intelligence through workflow orchestration, proprietary data flywheels, and the persistent activation of brand DNA. From these signals, we can define the strategic conditions brands will navigate throughout 2026.

Marketing operations are entering the era of orchestration.

In 2025, marketing teams began moving away from isolated AI pilots to instead implement coordinated, agentic systems capable of executing work across multiple steps, continuously and at scale. These orchestrations, which redesign how collaboration is structured within the organization, connect strategy, creation, execution, and measurement within a single, connected system rather than as handoffs between silos.

This shift also presents brands with a clear exit from “pilot purgatory,” the cycle of fragmented, small-scale tests that often lack the structural weight to drive meaningful business change. By moving beyond isolated experiments and into full-scale orchestration, organizations are replacing curiosity-led pilots with a strategic architecture that connects thinking across the marketing lifecycle. This ensures that intelligence isn’t just a bolt-on tool, but a foundational component capable of dismantling legacy silos and driving high-velocity growth.

What this means for 2026: Orchestrated workflows will drive the industrialization of intelligence, serving as the bedrock for always-on marketing operations that unify creative production, commerce and optimization. Marketing teams will increasingly realign their structures, moving beyond the bottleneck of manual execution toward the strategic orchestration of agentic systems. 

Experience became the primary competitive lever.

As marketing operations became more orchestrated in 2025, experience design evolved to generate new data that could enable further personalization and consumer insights, operating as a sort of flywheel. By inviting consumers to collaborate and co-create within a generative framework, brands can capture rich, contextual signals that were previously trapped in black-box media or biased polling. This turns every interaction into a dual-purpose event: providing a meaningful consumer experience while simultaneously filling critical data gaps with owned, actionable information. When experiences are architected this way, the strategic starting point changes, leading with the fundamental question: “What data am I after?”

Under this architecture, participation is no longer just an engagement metric; it functions as a primary data-generation event, feeding high-fidelity, first-party signals directly into a brand’s agentic ecosystem.

AI serves as the connective tissue here, enabling experiences to ingest real-time data and output hyper-personalized assets without the friction of manual production. A primary example of this is our work with the Boomtown music festival, “Boomtown Unboxed,” which transformed attendee engagement into a scalable data engine and hyper-personalized creative. The platform utilized first-party event data captured throughout the festival to dynamically assemble high-fidelity recap footage unique to each individual attendee.

By treating the experience itself as a massive data-capture environment, AI became the unlock to transform attendance into insight, informing creative assembly and deepening emotional resonance. Creative automation allowed the experience to adapt to each participant at a level of granularity that legacy workflows simply cannot match.

What this means for 2026: As content saturation renders traditional engagement episodic, experience design must shift into an always-on system that continuously harvests intelligence to sustain relevance.

Authenticity emerged as a strategic asset.

In 2025, authenticity shifted from a philosophical ideal to a critical operational capability. As generative tools lowered the technical barrier to content creation, the market saw a surge in homogenized, generic outputs that lacked the distinct soul of the brands behind them. On the flip side, strategic brands sought to encode their unique visual heritage, tone of voice and proprietary audience insights into their AI systems, enabling creative at scale that is deeply authentic to the brand.

The most durable competitive advantage no longer comes from mastering off-the-shelf tools, but from training foundational models based on the brand's own history. By ingesting proprietary mascots, intellectual property, and creative principles, brands can ensure their AI-assisted work is instantly and recognizably their own. This move, from one-off prompting to a living brand brain, allows for the scaling of expression without the dilution of meaning.

Conversely, the market has seen the consequences of misalignment. When brands rely on generic public models to represent their identity, they risk falling into the uncanny valley of brand representation. You’ve likely seen a handful of high-profile missteps throughout the year, where the use of artificial, generic models felt misaligned with the brand’s core values or the diversity of its audience. Such outputs often feel like an intrusion rather than an extension, eroding the very trust the brand worked for decades to build.

What this means for 2026: As AI becomes embedded across content operations, authenticity will function as a performance driver. Governance and brand-specific foundational models will become essential components of modern marketing systems, ensuring that scale strengthens recognition rather than creating fragmentation. 

Discoverability is being redefined by AI interfaces.

As AI agents become central to everyday planning and retrieval, discoverability is no longer a matter of simple keyword ranking. Over the past year, discoverability has come to depend on branded content’s ability to be reliably retrieved, understood and cited by generative systems as a definitive source of truth.

This has birthed the era of Generative Engine Optimization (GEO). While traditional SEO optimized for visibility on a results page, GEO optimizes for inclusion within an AI-generated synthesis. This shift demands a move away from keyword density toward contextual accuracy, structured metadata, and verifiable credibility. 

Consequently, discoverability has transformed from a tactical marketing challenge into a foundational infrastructure requirement. Brands that invest in structured knowledge bases and machine-readable content ecosystems create the conditions for AI agents to reference them with confidence, reducing the risk of ambiguity or hallucination. Content must now serve two audiences simultaneously: it must remain emotionally resonant for humans while being architecturally legible for machines. Modular formats, authoritative sourcing and multimodal assets are the new table stakes for reducing inference guesswork by AI intermediaries.
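As a concrete illustration of machine-readable content, a page can embed schema.org structured data as JSON-LD, which search and generative systems parse to ground citations. The headline below comes from this document; the date and field choices are illustrative, not a prescribed schema.

```python
import json

# Illustrative schema.org Article markup. Embedding this in a page as a
# JSON-LD <script> block gives AI retrieval systems unambiguous,
# citable metadata instead of forcing them to infer it from prose.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "What 2025 Revealed About AI, and What It Unlocks in 2026",
    "author": {"@type": "Organization", "name": "Monks"},
    "datePublished": "2026-01-15",  # hypothetical date
    "about": ["agentic AI", "marketing operations"],
}

snippet = ('<script type="application/ld+json">\n'
           + json.dumps(article_jsonld, indent=2)
           + "\n</script>")
print(snippet)
```

Structured markup like this is one piece of the "architecturally legible for machines" requirement; authoritative sourcing and modular formats do the rest.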

What this means for 2026: Search strategy will expand beyond the logic of search result rankings. Success will be defined by citation and trust, as brands architect content ecosystems that serve as the primary nodes of recommendation within agentic interfaces. 

In 2026, intelligence maturation becomes a structural necessity.

The shift from 2025’s experimentation to 2026’s execution represents the final maturation of the AI-native enterprise. Competitive advantage now follows the industrialization of intelligence, moving past task-level gains toward a cohesive agentic architecture that unifies strategic intent, creative craft, and operational execution.

This evolution has transformed what was once a luxury of curiosity into a foundational structural necessity. Performance in this landscape is defined by the depth of system design and the purposeful activation of a brand’s proprietary DNA. By dissolving legacy silos and architecting unified flows, organizations can finally turn the complexity of orchestration into their most enduring source of compounding advantage.


AWS re:Invent 2025 Recap: Building the Infrastructure of the Agentic Era

AI & Emerging Technology Consulting, Industry events · 5 min read

Written by
Monks

Image: a crowded convention hall with a large, gradient-colored "Welcome to re:Invent" sign under purple and blue lighting.

Another AWS re:Invent has wrapped, leaving the industry to digest a whirlwind of announcements from Las Vegas. With over 1,000 sessions and countless product launches, it is easy for marketers to get lost in the noise of new instance types and database upgrades. However, for customers looking to stay competitive, a single, urgent narrative emerged from the chaos: the era of the passive AI assistant is ending, and the era of the autonomous AI agent has arrived.

Discussion about the potential of agentic AI isn’t particularly new. But if the beginning of 2025 was about the promise of autonomous agents, re:Invent was about implementing the plumbing required to make them work reliably, with proper governance, and at scale: the focus has moved past simply building agents to building them well. This maturation of infrastructure, from silicon to software, is a continuous effort focused on the reliability, resilience and enterprise compliance needed to support the agentic era. By simplifying these foundational layers, AWS is accelerating the work we do for global customers, allowing us to move faster from concept to secure, end-to-end autonomous workflows.

For customers, this shift requires a new strategic roadmap. Here is what you need to know about the transition to an agent-led future.

Frontier agents are transitioning from reactive chat to complex, 24/7 orchestration.

The headline coming out of AWS leadership is a strategic pivot from simple assistants to autonomous AI agents, governed with strong foundations and data-driven development. To understand the difference, think of a chatbot as a reactive tool that waits for your prompt to generate a single response. An agent, by contrast, is designed to collaborate over time, handle multi-step tasks, and work independently to achieve a goal.

AWS introduced the concept of Frontier Agents, or AI teammates capable of handling highly technical tasks like DevOps and Security. While these initial use cases are technical, the implication for marketing operations is profound. We are moving toward a reality where an AI agent can not only write a campaign email but orchestrate the entire deployment: autonomously segmenting the audience, setting up A/B tests, monitoring performance in real-time, and even adjusting ad spend based on ROI targets without needing a human to click “send” at every step.

This creates a “serverless-first” culture where the bottleneck is no longer content creation, but orchestration. To succeed, customers will manage a workforce of silicon agents executing strategy at the speed of software.

Expert agents require more than just a powerful model.

Building high-quality agents requires a closed-loop system, not just a smart LLM. It starts with trusted, permissioned data that is transformed into a rich, multi-layer context. By moving beyond basic search methods and using techniques like hybrid retrieval (combining keywords and context) and graph traversal, organizations can give agents the precision and "common sense" they need for enterprise use.
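A minimal sketch of the hybrid retrieval idea above, assuming toy documents and a stand-in "semantic" score (a production system would compare dense vector embeddings and likely add graph traversal): each document receives a keyword-overlap score and a similarity score, blended with a tunable weight.

```python
import math
from collections import Counter

# Hypothetical document store.
docs = {
    "d1": "brand guidelines for campaign imagery and logo usage",
    "d2": "quarterly revenue report for the consumer division",
    "d3": "campaign performance metrics and audience segments",
}

def keyword_score(query, text):
    """Fraction of query terms that literally appear in the document."""
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t) / len(q)

def semantic_score(query, text):
    """Stand-in for embedding similarity: cosine over term-count vectors.
    A real system would use dense embeddings instead of raw counts."""
    qc, tc = Counter(query.lower().split()), Counter(text.lower().split())
    dot = sum(qc[w] * tc[w] for w in qc)
    norm = (math.sqrt(sum(v * v for v in qc.values()))
            * math.sqrt(sum(v * v for v in tc.values())))
    return dot / norm if norm else 0.0

def hybrid_search(query, alpha=0.5):
    """Blend both signals; alpha weights the keyword side."""
    scored = {d: alpha * keyword_score(query, t)
                 + (1 - alpha) * semantic_score(query, t)
              for d, t in docs.items()}
    return sorted(scored, key=scored.get, reverse=True)

print(hybrid_search("campaign audience metrics"))
```

The blend is what buys agents "common sense": keyword matching anchors precision, while the similarity signal recalls documents that phrase things differently.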

However, data is only one piece of the puzzle. At re:Invent, AWS emphasized that agents must operate within a strict architectural contract to remain safe and predictable. This includes "least-privilege" security—giving agents only the specific tools they need—and clear decision boundaries. Observability has also become foundational; every decision an agent makes and every tool it calls must be traced and attributable back to a source. By embedding automated quality checks and human-in-the-loop safeguards, organizations can turn unpredictable AI into reliable, enterprise-grade systems of record and action.
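The architectural contract described above can be sketched as a governed tool layer: the agent only sees an explicit allow-list of tools, and every permitted call is appended to an auditable trail. The agent ID and tools here are hypothetical, purely to illustrate least-privilege plus observability.

```python
import datetime

class GovernedToolbox:
    """Exposes only allow-listed tools to an agent and logs every call."""

    def __init__(self, tools, allowed):
        # Least-privilege: the agent can never even see unlisted tools.
        self._tools = {n: fn for n, fn in tools.items() if n in allowed}
        self.audit_log = []  # traceable record of every tool invocation

    def call(self, agent_id, tool, **kwargs):
        if tool not in self._tools:
            raise PermissionError(f"{agent_id} may not use {tool!r}")
        self.audit_log.append({
            "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "agent": agent_id,
            "tool": tool,
            "args": kwargs,
        })
        return self._tools[tool](**kwargs)

# Hypothetical setup: this agent may read analytics but not send email.
tools = {
    "read_analytics": lambda campaign: {"campaign": campaign, "views": 1200},
    "send_email": lambda to, body: f"sent to {to}",
}
box = GovernedToolbox(tools, allowed={"read_analytics"})

print(box.call("agent-7", "read_analytics", campaign="spring"))
try:
    box.call("agent-7", "send_email", to="x@example.com", body="hi")
except PermissionError as e:
    print("blocked:", e)
```

Because the denied call raises before anything executes, the audit log doubles as the "system of record": every entry is attributable to an agent, a tool and a timestamp.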

Expertise delivers the AI “last mile” of value.

A consistent theme across the 2025 tracks was that while AWS provides the powerful building blocks, like Amazon Bedrock, the “last mile” of value is found in the integration. The industry is moving away from treating AI as a standalone tool and toward integrated AI services that bridge the gap between cloud infrastructure and specific business outcomes. Closing this gap is how organizations are finally escaping proof-of-concept purgatory and realizing significant gains in efficiency and engagement.

On the operational side, we are seeing the emergence of brand intelligence systems that solve the “hidden tax” of internal friction. A representative example is a solution we recently built for a global technology leader, which moved beyond a standalone tool to become a core enterprise integration. By seamlessly connecting agentic architecture with the brand’s existing data environments and daily workflows, we provided over 1,800 users with definitive, reference-backed answers instantly. This integration cleared manual bottlenecks and reduced the review cycles previously needed to approve time-sensitive assets.

On the engagement side, a focus at re:Invent was the transformation of live media and broadcast workflows. The challenge in modern media isn't just storage, but the inability to identify and extract moments of value within a live stream in real-time. Our demo at the event illustrated this industry shift through the lens of a “sneakerhead” basketball fan. By using agentic workflows to scan live footage for visual cues and automatically triggering rendering pipelines, we demonstrated how live video can evolve from a passive broadcast into a searchable, personalized experience. Such innovations show how the media supply chain is becoming a dynamic revenue engine by connecting fan interests to personalized content at scale—provided you have the integrated architecture needed to bridge the gap between cloud infrastructure and the complex, real-time demands of a live broadcast. 

The move to micro-models allows for specialized, cost-effective intelligence.

Finally, re:Invent 2025 addressed the cost barrier that has kept many customers from building bespoke AI solutions. The prevailing trend isn't just about bigger models anymore; it is about specialization.

While the “teacher-student” architecture—using massive, high-intelligence models to train and evaluate smaller micro-models—has been a known engineering strategy for some time, AWS is now making it accessible for every enterprise. Announcements like Amazon Nova 2 and Nova Forge are designed to democratize this process, lowering the barrier for organizations to build their own frontier models.

This enables marketing or technical teams to build proprietary micro-models that are hyper-specialized. You might have one small model specifically trained to write in your brand voice, another dedicated to checking legal compliance, and a third for analyzing customer sentiment. This approach reduces latency and cost while dramatically improving accuracy, as each model is an expert in its narrow lane.
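The division of labor described here can be sketched as a simple router dispatching tasks to specialized "micro-models." The functions below are stubs standing in for fine-tuned models, and the task names are hypothetical; the pattern is what matters: one narrow expert per lane, selected by task type.

```python
# Stub micro-models — in a real deployment each would be a small,
# fine-tuned model distilled from a larger "teacher" model.
def brand_voice_model(text):
    return f"[on-brand rewrite] {text}"

def compliance_model(text):
    flagged = any(w in text.lower() for w in ("guarantee", "cure"))
    return {"compliant": not flagged}

def sentiment_model(text):
    positive = any(w in text.lower() for w in ("love", "great", "amazing"))
    return "positive" if positive else "neutral"

# Task-type routing table: each lane has exactly one expert.
ROUTES = {
    "rewrite": brand_voice_model,
    "legal_check": compliance_model,
    "sentiment": sentiment_model,
}

def route(task, payload):
    """Dispatch a task to its specialized micro-model."""
    if task not in ROUTES:
        raise ValueError(f"no micro-model registered for {task!r}")
    return ROUTES[task](payload)

print(route("sentiment", "I love this product"))
print(route("legal_check", "Guaranteed to cure everything"))
```

Routing small experts this way is where the latency and cost gains come from: each call hits a model sized for one job rather than a general-purpose giant.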

Adapt to become the architect of the future.

The experimental phase of generative AI is evolving into an era of industrial-grade execution, moving past the novelty of chat interfaces and into a reality where success depends on the sophistication of your infrastructure. The ones who win in this new landscape won't just be those with the best creative ideas, but those with the most robust agentic plumbing: structured data, specialized micro-models, and autonomous workflows that run 24/7.

For customers, the mandate is to look beyond the immediate output of AI and focus on the architecture behind it. By investing in structured knowledge graphs and embracing the shift from human-in-the-loop to human-on-the-loop orchestration, organizations can unlock a level of personalization and efficiency that was previously impossible. The infrastructure is built; the agents are ready. The question is no longer what AI can do for you, but what you are prepared to let it build.


Monks and NVIDIA

Orchestration Partner for the Intelligence Era.

Images: a close-up of the NVIDIA A6000 GPU; the Monks and NVIDIA logos; Monks employees collaborating over a glass board.

Accelerate AI adoption.

Our strategic collaboration with NVIDIA empowers organizations with cutting-edge AI solutions that drive unparalleled efficiency and freedom. The bottom line is that our solutions deliver strong returns quickly, without software or vendor lock-in. This is the kind of flexibility and cost calculus companies look for in rapidly changing times.

This powerful alliance seamlessly integrates Monks' comprehensive services and solutions with NVIDIA's industry-leading accelerated compute and SDKs. Through our partnership, you gain access to trusted expertise, global scale, agile execution, and end-to-end services designed to meet the needs of the ambitious.

San Jose Convention Center decorated for GTC Event

Agentic AI Advisory Group and Monks Foundry build pioneering applications of agentic AI.

Learn more

Accelerated Solutions

  • Monks.Flow - Shift your team from operators to orchestrators, steering AI agents toward transformational outcomes. Learn more.
  • LiveVision™ - Modernize your broadcasting infrastructure with real-time AI to create next-generation viewing experiences. Learn more.
  • Email.Flow - Unlock a new level of customer engagement and loyalty by delivering hyper-personalized email communications with AI-powered consistency. Learn more.

Our end-to-end partner services

    • End-to-End AI-Driven Marketing Solutions

      Leverage NVIDIA’s AI Blueprints and our industry expertise to deliver personalized, scalable campaigns. From developing data strategies to creative execution, we ensure your marketing efforts take advantage of all the opportunities of AI and accelerated compute.

      Our capabilities include deploying NVIDIA AI blueprints that enable faster, more secure, portable and cost-effective AI model deployment. We manage the entire process—from design and integration to optimization—to help your business innovate seamlessly without lock-in constraints.

    • Scaled Content Production and Personalized Customer Engagement

      We leverage AI-powered automation to produce high-quality content at scale and deliver hyper-personalized experiences tailored to individual customer preferences. Our solutions drive higher engagement, loyalty and conversion by enabling real-time audience feedback, AI-driven selective encoding and contextual content analysis.

    • Embedded Deployment

      We seamlessly integrate NVIDIA’s AI capabilities into your existing platforms and workflows using your enterprise data. Our engineering team ensures smooth deployment, interoperability, and ongoing management, enabling you to maximize your AI investment and turn your enterprise data into actionable insights and smarter operational processes.

    • Media and Entertainment Services

      Using NVIDIA Media2 tools such as NVIDIA NIM microservices and Holoscan for Media, we help media and entertainment clients innovate with real-time insights, optimize large archives and enhance monetization efforts—creating smarter, more efficient content workflows that resonate with audiences and maximize value.

Going beyond broadcast possibilities with software-defined production.


Case Study

NBA in VR

For Meta & the NBA, we engineered the software-defined production platform to turn their VR broadcast ambition into an operational reality for 56 games across three seasons.

See Full Case Study
Monk Thoughts: "Our approach combines technical expertise with customer centricity. This enables us to deliver solutions faster and more effectively, setting a new industry standard." (Mike Dobell)
At NAB Show, the NVIDIA Holoscan and Verizon 5G solution that powered FanDuel's automated live racing broadcast achieved a 50% latency reduction and won Best of Show.

Award-winning work

Automating Live Sports with Enterprise AI

Alongside partners Verizon, NVIDIA, FanDuel, and Haivision, our AI-driven workflow optimizes broadcast production with intelligent camera switching and private 5G, enhancing live broadcasts through automation and real-time feedback.

Learn more

How can we help you innovate? Let’s talk.


3 Key Takeaways and New Tools from Google Marketing Live 2025


AI, Industry events, New paths to growth, Performance Media | 5 min read

Written by
Evan Sparling

A group photo of Monks standing in front of a banner that reads "Google Marketing Live."

Google Marketing Live 2025 showed that the way people search, shop and make decisions is shifting—and Google’s ad ecosystem is shifting with it. With AI baked deeper into Search, new transparency tools for performance reporting, and ad formats designed for faster conversion, this year’s announcements reflect a platform trying to meet users in the moment while giving marketers better ways to steer outcomes. Here’s what stood out and how to make it work for you.

These are our three takeaways that defined GML 2025.

Takeaway 1: Ad delivery is getting more flexible across Google’s network.

Google is shifting away from channel-based ad setups and leaning into more fluid, moment-driven experiences. Instead of building separately for Search, YouTube or Display, ad products are increasingly designed to find users wherever they are: scrolling, streaming or shopping. This expansion means more opportunities to reach users, but also more demand for creative that fits each touchpoint, which often requires brands to scale video, visuals and messaging quickly, frequently with the help of AI. Measurement tools are also being updated to support this shift, aiming to track how these moments connect and contribute to sales across the journey. Flexible measurement that goes beyond pixel-based attribution by incorporating incrementality testing, marketing mix modeling (MMM) and similar methods is essential, as customer paths rarely follow a straight line.

Takeaway 2: AI is now embedded in Search and how brands connect. 

The rollout of AI Mode and ads in AI Overviews marks a shift in how users navigate Search and how brands show up. These tools change not just ad placement, but the buying journey. Search is becoming more visual, more video-led and more human in tone, resulting in a search and shopping experience that’s more tailored and productive for users. For advertisers, what used to require multiple campaign types and formats continues to consolidate into a single system of outcome-based products. This year Google is positioning this as its “power pack”—Performance Max, AI Max and Demand Gen—for brands that use AI to reach consumers. To capture the relevance and performance Google says the “power pack” provides, media buyers must focus on feeding the AI high-quality inputs, in high volumes: conversion data, creative assets and more.

Takeaway 3: Google is rolling back the black box for visibility and transparency.

Advertiser pressure for more transparency is starting to pay off. Google is introducing new Performance Max insights, lower spend thresholds for incrementality testing, and agentic tools like “Your Google Ads Expert” to make results easier to explain and optimize. But blind spots remain. For example, there’s still no placement-level reporting for ads in AI Mode or Overviews. Progress, yes. Total clarity, not yet.

These are the new features our team expects to be most impactful for advertisers.

AI tools are reshaping how we search, shop and advertise.

Search is no longer just a typed query in a box. With tools like Gemini, Google Lens and AI Overviews, the buying journey is becoming more visual, conversational and context-aware. The path from awareness to purchase is increasingly possible in one scroll, without leaving Google’s ecosystem. Google’s newest tools reflect this shift:

  • Smart Bidding Exploration (in beta) blends flexible ROAS targets with new bidding logic to uncover valuable queries you may be missing.
  • AI Overviews are live on mobile in the US, with desktop and other markets coming next. These ad placements are designed to align with broader search intent.
  • AI Mode, currently in testing, introduces a conversational, multimodal search experience with an AI-powered shopping layer launching in the US soon.
  • Agentic tools like “Your Google Ads Expert” and “Your Google Analytics Expert” (in beta) aim to speed up insights and surface optimizations. “Your Marketing Advisor,” a Chrome-based AI assistant, will soon help teams manage tasks and surface recommendations across tools.

Put it into practice: These evolutions in the SERP are reshaping user behavior and redefining what ad success looks like. For advertisers, your inputs—site content, product feeds, conversion data, creative assets, etc.—matter more than ever as the content and experience will be derived automatically with AI. Invest in shoring up those foundations to make sure you’re showing up accurately and effectively in these new SERP experiences.

AI Max for Search gives you automation with a clearer view.

AI Max for Search Campaigns is a one-click upgrade that uses AI to match your landing pages, ads and keywords to real-time search intent. Google reports early tests showed up to 27% more conversions at similar CPA or ROAS, especially when using exact and phrase match. Unlike Dynamic Search Ads, which auto-generate content with limited reporting, AI Max surfaces clear insights into which queries, headlines and landing pages are driving performance. It’s still automated, but with a clearer view of what’s happening behind the scenes.

Put it into practice: Try AI Max on a campaign where broad match is performing well but hasn’t hit its ceiling. Use the new reporting to spot high-converting queries and creative, then scale what’s driving results. 

Performance Max now shows where results are coming from.

Performance Max has always prioritized automation over transparency. But Google is finally pulling back the curtain. Channel-level reporting now shows results across Search, YouTube, Shopping and other surfaces. Asset-level insights and fuller search term visibility offer more granular data to understand what’s actually working. For brands running full-funnel campaigns, this is a significant improvement.

Put it into practice: Use channel-level data to steer budget toward top-performing surfaces by shaping the signals that influence Google's spending. Update or remove underperforming assets within your campaign. If YouTube is lagging, shorten your video creative or adjust your audience signals.

Monk Thoughts: "Having channel-level visibility in PMax makes the campaign more accountable, customizable and measurable, turning it from a black box into a smarter, more collaborative tool for growth."

Video ads in Search and Shopping compress the funnel.

Video placements are now being tested directly within Search and Shopping results, giving advertisers a shot at influencing high-intent shoppers without relying on separate awareness plays. The line between discovery and purchase is disappearing, and Google wants to keep the entire journey within its ecosystem. Users aren’t skipping steps in the funnel; they’re completing all of them in a single scroll.

Put it into practice: Add horizontal and vertical video assets to your ad groups. Focus on short-form content that delivers value fast, such as how-to clips, testimonials or product highlights.

Monk Thoughts: "This is the new prime real estate. If your video doesn't stop the scroll and say something meaningful, you're wasting a huge moment."

Measurement tools are improving, but still require setup.

Google maintained its focus on measurement this year, sharing advertiser stories about the value of Meridian and unveiling updates to measurement features within Google Ads.  For example, they lowered the threshold significantly for in-platform incrementality testing, making it more accessible for brands to measure what tactics are creating incremental results. 

Additionally, Data Manager is Google Ads’ latest tool aimed at improving signal quality and measurement reliability. It helps advertisers connect and validate first-party data from websites, apps, CRMs, and in-store systems, making campaign data cleaner, more actionable, and privacy-compliant. It also supports better attribution by ensuring tags and signals are set up correctly. 

Put it into practice: Use Data Manager to set up and quality check your tagging configuration, confirm that key data sources are linked to your Google Ads account, and connect first-party data from third-party platforms like BigQuery, Salesforce, Shopify, Google Sheets, and more. A clean setup leads to better optimization and clearer insights.

Turn GML 2025 updates into real business outcomes.

GML 2025 showed that performance marketing is becoming more creative, more automated and more measurable. These updates are your chance to simplify workflows and scale impact. If you’re connecting creative, data and AI in one system, you’re going to move faster than your competitors.

Need help connecting the dots?
Let’s talk. We help brands turn updates like these into growth strategies that drive results.


March 17-21

Shaping the Future of AI at NVIDIA GTC

Discover how Monks and NVIDIA are transforming marketing—read about the Agentic AI Advisory Group and the Monks Foundry.

Lewis speaking with someone at a booth during GTC conference
Outside the GTC conference showing the main sign

NVIDIA GTC is making its grand return to the San Jose McEnery Convention Center, uniting industry leaders and innovators to explore groundbreaking advancements in AI. With workshops, a keynote presentation, and exhibits throughout the event, attendees will have the opportunity to dive deep into the latest advancements in AI technology. We’re thrilled to unveil key AI partnerships, introduce our cutting-edge Monks.Flow—an end-to-end AI Professional Managed Service—and deliver insights in multiple speaking sessions throughout the week.

Be sure to visit us at booth #3263 in the Generative AI Pavilion to learn more about how we are redefining the future of marketing and creativity through AI.

Event Details

Here's what you can expect from us this year.

    • AI on Location: Deploying Mobile Private Networks and Edge Compute for Next Gen Content Pipelines

      This session explores the integration of containerized cloud applications with NVIDIA Holoscan for Media to revolutionize broadcast workflows. Using Verizon’s 5G Private Network with Enterprise AI, attendees will learn how to implement cloud-driven workflows on deployable hardware, enhancing production in an IP-secure environment. Discover how to leverage NVIDIA NIM microservices and cutting-edge technologies like SMPTE ST 2110, 5G wireless video and AI analysis to develop innovative broadcast pipelines.

      • Date: Thursday, March 20 | Time: 9:00 AM - 9:40 AM
      • Speakers: Tony Walasik, Product Manager, Monks; Scott Connolly, Head of Media Technology, Verizon Innovation Labs
      Learn more
    • Novel Uses for AI to Drive Monetization in Large M&E Archives

      Lewis Smithingham will discuss the journey of archival media, using the rediscovery of the 1928 film "The Passion of Joan of Arc" as a backdrop. Many media companies hold vast amounts of dormant footage with minimal metadata; learn how NIM-based contextual analysis can make this content searchable by visual and contextual data. Explore how TAMS (Time-Addressable Media Store) frameworks can enhance content retrieval and drive new monetization opportunities, including applications in predictive editing and personalized narratives.

      • Date: Thursday, March 20 | Time: 3:00 PM - 3:40 PM
      • Speaker: Lewis Smithingham, EVP Media, Entertainment, Gaming and Sports, Monks
      Learn more
    • Monks.Flow | Experience the future of AI-driven marketing solutions

      Be sure to visit us at booth #3263 in the Generative AI Pavilion, where we will demo Monks.Flow, which accelerates AI-enabled marketing teams by streamlining the strategy, creation, delivery and performance stages of their workflows. This end-to-end, AI-driven marketing solution will be showcased in a brand campaign video that features NVIDIA's cutting-edge technology and agentic mesh to enhance performance, efficiency and creativity. Join us to learn more about how we are redefining the future of marketing and creativity through AI.

      Learn more about Monks.Flow
    • Sir Martian | Meet Sir Martian, our AI-powered robot with lots of personality!

      Sir Martian is a groundbreaking fusion of AI innovation and artistic expression, transforming the traditional experience of street portraiture into an interactive, otherworldly encounter. This humanoid alien robot, affectionately named after S4 Capital Executive Chairman Sir Martin Sorrell, invites visitors to engage in a dynamic conversation while it creates a personalized line art portrait. This experience is not just about capturing a likeness but also the essence of the individual, informed by the dialogue shared. Be sure to find Sir Martian at the AWS booth on the floor throughout the week.

      Learn more about Sir Martian

Subject Matter Experts

Monks on the ground

See Monks.Flow in action: request a demo.

Learn more about our AI-driven solutions for our clients.

S4 Capital’s Monks Launches NVIDIA Foundry & Agentic AI Advisory Group


Studio, Technology Consulting | 3 min read

Written by
Monks

Green building facade that says "NVIDIA GTC: What's next in AI starts here." In the foreground is a sculpture of the hashtag "#GTC25."

Monks, the global, purely digital, data-driven, unitary operating brand of S4 Capital plc, today announced its collaboration with NVIDIA to launch an Agentic AI Advisory Group and the Monks Foundry, supported by NVIDIA-certified engineers. The partnership, shared at the GTC AI Conference, is already reshaping the marketing and advertising industry by delivering smarter, more tailored and more impactful content for brands. The Monks Agentic Advisory will consist of an approximately 50-person team that operates as a nimble advancement and consulting arm for the Monks Foundry, which is on track to have 150 certified engineers fully dedicated to building and deploying custom generative AI models, tailored to enterprise data and domain-specific knowledge, within the year.

“We’ve built on our technical expertise to create an agile and efficient approach to Applied AI with a more nimble team of engineers and advisors,” says Michael Dobell, EVP of Innovation at Monks. “In today's landscape, a simple structure can put us one step ahead, allowing us to deliver with a speed and depth that legacy holding companies and networks struggle to match, setting a new standard for what’s possible in the industry.”

GTC attendees can see how agentic AI, using Monks.Flow integrated with NVIDIA Omniverse and NVIDIA NIM™, creates a high-quality 30-second creative spot in a fraction of the time required by traditional production methods. By developing with Omniverse technology and leveraging NVIDIA Blueprints, Monks can orchestrate interoperability between creative teams and tools, allowing for scalable, real-time development of hyper-realistic content. By harnessing NIM microservices on performant hardware, Monks accelerated delivery of agentic workflows by over 2.8x. NIM microservices also slotted neatly into a cloud-agnostic strategy, using Kubernetes deployments to run all of the backing models on any major cloud provider, whatever a client needs. Attendees can explore this innovation at the Monks booth in the GTC AI Pavilion, while audiences at home can view a fully AI-generated PUMA film showcasing the creative output.
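The cloud-agnostic Kubernetes approach described above can be sketched as a standard Deployment plus Service manifest. This is an illustrative outline only: the image name, service names and port below are placeholders, not the actual Monks or NVIDIA configuration, and a real NIM deployment would follow NVIDIA's published container and licensing requirements. The only non-core assumption is the NVIDIA device plugin, which exposes GPUs to pods via the `nvidia.com/gpu` resource.

```yaml
# Hypothetical sketch: serving a containerized model microservice on Kubernetes.
# Image, names and ports are illustrative placeholders, not real NVIDIA artifacts.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-nim-service
spec:
  replicas: 2
  selector:
    matchLabels:
      app: example-nim-service
  template:
    metadata:
      labels:
        app: example-nim-service
    spec:
      containers:
        - name: model
          image: registry.example.com/nim/example-model:latest  # placeholder image
          ports:
            - containerPort: 8000
          resources:
            limits:
              nvidia.com/gpu: 1  # one GPU, scheduled via the NVIDIA device plugin
---
apiVersion: v1
kind: Service
metadata:
  name: example-nim-service
spec:
  selector:
    app: example-nim-service
  ports:
    - port: 80
      targetPort: 8000
```

Because the manifest uses only core Kubernetes APIs (plus the device plugin's GPU resource), the same file applies unchanged with `kubectl apply -f` on any managed Kubernetes offering, which is the portability the paragraph describes.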

This end-to-end workflow demonstrates how cutting-edge AI, coupled with Omniverse's capabilities, is transforming the future of content creation. By consolidating traditionally segmented areas—insights, strategy, creative, execution, media, performance and assets at scale—into a seamless, end-to-end agentic pipeline, the workflow results in work that is more adaptive, efficient and visually stunning—all at unprecedented speed.

“NVIDIA Omniverse and OpenUSD are enabling developers to build AI applications that assist artists and creative teams with efficient and scalable production techniques for digital marketing assets,” says Richard Kerris, Vice President of Media and Entertainment at NVIDIA. “Monks.Flow’s integration demonstrates how AI and digital twins can transform industries and set new benchmarks for innovation.”

At GTC this week, Lewis Smithingham, EVP of Strategic Industries at Monks, will present “Novel Use Cases for AI in Large Media Archives." Additionally, Tony Walasik, Senior Product Manager at Monks, will join Scott Connolly, Head of Media Technology at Verizon Innovation Labs, to discuss “AI on Location: Deploying Mobile Private Networks and Edge Compute for Next-Gen Content Pipelines.”

The Monks booth (#3263) offers attendees a more in-depth look at Monks.Flow, which leverages advanced AI models to transform businesses. By automating workflows and linking talent, software and microservices, Monks.Flow orchestrates marketing operations for the age of agentic AI, providing solutions for every stage of the creative lifecycle and offering a seamless journey from strategy to delivery. It begins with insights and strategy, where clients gain valuable data and develop targeted plans to guide their marketing efforts. This is followed by creation and adaptation, enabling the efficient production and customization of content to meet specific needs. Finally, Monks.Flow ensures delivery and performance, optimizing content to reach the right audience with maximum effectiveness. This platform has already delivered performance boosts for brands like Headspace, Hatch and Chime.

Learn how the Monks Foundry, Agentic AI Advisory Group and NVIDIA are delivering groundbreaking content creation with groundbreaking enterprise AI solutions. AI agentic ai Nvidia Studio Technology Consulting
