How Live Remote Production Keeps Sustainability Goals Grounded

Emerging media, Experience, Sustainability, VR & Live Video Production · 4 min read

Written by
Monks

As the effects of the climate crisis become more apparent and time is running out to act, sustainability has become a key focus for brands and consumers alike. Brands are setting ambitious targets to turn the tide—and we’re no different, holding ourselves to the UN Sustainable Development Goals and formalizing our commitment to become net-zero by signing the Climate Pledge in 2021.

Among our foundational environmental, social and governance goals, we aim to become a climate-neutral, environmentally conscious business—and a catalyst for change in our industry, leading by example and helping brands become more sustainable themselves. So while programs like carbon offsets are steps in the right direction, the key to meeting sustainable goals is to design operations that limit carbon emissions to begin with, and our live remote production workstream fits the bill.

Sustainable workstreams shouldn’t be a tradeoff.

Sometimes the road to being green can feel like making a series of concessions, but working sustainably often means working smarter. When reliance on digital grew throughout the Covid-19 pandemic, we adapted by designing a remote broadcasting workflow that limited the number of people needed on location. This production pipeline, with Amazon Web Services (AWS) as its backbone, did more than help us deliver incredible, live digital experiences to people everywhere—like immersing audiences in a VR Post Malone performance that transforms the experience of listening to an album. It also significantly reduced our carbon footprint.

Our live remote production connects teams to broadcasting software, eliminating the need for gas-guzzling broadcasting trucks on location. It also cuts out the need for specialized talent to travel to shoot an event: we’ve reduced our on-site broadcast talent by 83%. Adding to those benefits, our primary workload runs out of the AWS Northern Virginia region, which was powered by over 95% renewable energy in 2021.

Who wouldn’t want to trade lugging around energy-intensive equipment prone to breaking down—be careful not to trip on a wire and shut down the whole show!—for the cloud? This pipeline recently earned us the coveted Sustainability in Leadership award at NAB Show, the leading conference dedicated to the evolution of broadcast.

Lewis Smithingham, far right, accepts the Sustainability in Leadership award at NAB Show.

Monk Thoughts We're incredibly honored and excited for this award because we believe sustainability can and should drive innovation and creativity throughout the production process.
Lewis Smithingham

The benefits extend to talent as well. Gone are the long, grueling hours spent on set. The ability to collaborate remotely broadens our talent pool, ensuring the best person is cast for each job, even if they’re working across the country—or in a different one. Our experiential team has also developed a suite of tools like LiveXP that furthers the connection between audiences and the action beyond the screen.

One key benefit of live remote production is risk mitigation. First, the greatly diminished environmental impact reduces the social risk of a brand missing its pledged environmental, social and governance (ESG) commitments. Second, cloud-based tools are more reliable: traditional broadcast production teams have only a finite set of equipment that may break down, while in the cloud, we can plan for redundancies and quickly spin up a new virtual machine should something happen to an existing one. You can learn more about what live remote production looks like on Amazon’s blog.

Let sustainability be the byproduct of innovation.

Sustainability and innovation go hand in hand, and our low-carbon live remote production uniquely enables compelling virtual and hybrid experiences—filling a crucial whitespace in broadcasting as culture shifts to more immersive and creator-led channels.

“Broadcasting is culture. It’s the vehicle by which culture spreads,” says Smithingham. Increasingly, culture is happening within immersive, interactive spaces like gaming, social feeds and metaverse worlds—behaviors that are challenging broadcasting to adapt. “A core undercurrent across all of this is if something is not interactive to younger generations, it feels broken and it feels disconnected.”

Look at award shows, for example. Exclusive and invite-only, they leave viewers watching the fun from a mediated distance, chattering in backchannels like social media. We flipped the script by partnering with Logitech For Creators to host the first music award show in the metaverse—The Song Breaker Awards—which invited everyone to not only attend but actually become part of the show in Roblox.

What about hybrid experiences? When it comes to sharing an esports experience with audiences around the world, you’d be forgiven for using traditional sports broadcast as a blueprint. But with sports facing a decline in young viewership, it’s clear the formula is worth shaking up. In celebration of Valorant’s first anniversary, we translated the game’s rich lore into a recognizable IRL environment, then pitted attendees against livestream viewers in a challenging bout. Viewers could frustrate on-site players by sharing hashtags in the chat that triggered traps—setting a new precedent for gamified broadcast experiences.

Both experiences were designed to connect with consumers in ways that weren’t possible before. If an experience calls for disruptive ways of working, why not use that as a forcing function to move a part of the operations to a low-carbon approach?

Live remote production balances efficiency and sustainability.

By leveraging live remote production, our experiential team can deliver interactive experiences that drive culture and ROI with incredible efficiency, reducing greenhouse gas emissions in the process. The best part? Those benefits contribute directly to brands’ ESG goals and industry mandates. This is a win-win as we connect with consumers in innovative ways, contribute to carbon reduction goals, and reduce risk along the way.

As consumers increasingly demand sustainability from the brands they engage with, offering environmentally friendly solutions becomes an urgent need. And because sustainability challenges us to accomplish our goals in new or different ways, it can unlock innovative ways of reaching audiences. So, whether it’s a matter of saving the planet or simply saving consumers from boredom in fresh ways, we’re down for the challenge.


How to Extend Real-World Events into Virtualized Experiences

AI & Emerging Technology Consulting, Experience, Immersive Brand Storytelling, Impactful Brand Activations, Metaverse, Web3 · 4 min read

Written by
Monks

Web3 and metaverse spaces offer exciting opportunities for brands to build incredible, new worlds and ways to interact inside them. But metaverse spaces are not reserved only for flights of fancy; they can also be stages to reimagine real-world events and experiences for global, digital-native audiences.

During the pandemic doldrums, we all saw how difficult it can be to translate the magic of in-person events from the stage to the screen—but increasingly immersive platforms solve these challenges by enabling a sense of presence within. While not a replacement for the experiences that inspire them, these activations can level up their ambition and become strategic, long-term additions to a brand’s digital strategy. Here’s how two brands were able to transform iconic IRL experiences into significant, long-term elements of their digital experience strategy.

Start with community.

 Metaverse spaces serve as environments where people can connect with one another in real time, much like events in the real world. In fact, communal experiences are a hallmark of successful metaverse-related activations and Web3 projects, making community building a key design consideration. This was the case for Macy’s Thanksgiving Day Parade, which we marched into the realm of Web3 with a virtual recreation of New York’s Fifth Avenue—the IRL route for the parade—and a series of galleries featuring NFTs inspired by the parade’s iconic balloons and other popular projects like Cool Cats, VeeFriends and more.

Throughout its history, the parade has brought more and more people together through mass media channels: originally broadcast locally in New York in 1939, it now draws more than 44 million viewers each year. By building an immersive space where people can interact and engage with one another, even more attendees could enjoy the Macy’s Thanksgiving Day Parade in a totally new way no matter where they are: the event brought in over 90,000 users.

And for some virtual attendees, that’s just the start of the journey. With the ability to buy NFTs from some of the most popular projects, each with their own communities built around them, those who make a purchase are initiated into new entrepreneurial or artistic social circles.

Add value and cohesion through interoperability.

Culture doesn’t happen in a silo, so neither should a brand activation. One of the key promises of metaverse and Web3 technology is interoperability, or the ability of two or more systems to exchange and use information. Interoperability can mean enabling compatibility between different platforms or—perhaps more compelling for virtualizing IRL experiences—seamlessly bringing together both the real and virtual worlds.

For that latter point, look at ComplexLand, a 3D digital platform inspired by ComplexCon—the cultural mecca bringing together the Complex Networks community, the hottest cultural trends and hype-fueled brands. Reimagined in a virtual format, ComplexLand provided a seamless shopping experience so attendees could fill both their physical wardrobes along with their virtual ones. Attendees could even unlock food deliveries by interacting with virtual food trucks parked throughout the world, bringing a little piece of the IRL festival experience to audiences from afar.

When it comes to interoperability in the more technical sense, the virtual Thanksgiving Day Parade was designed for attendees to take a little something with them across the metaverse. After purchasing an NFT, visitors could carry it into other worlds thanks to blockchain technology. They also had the chance to vote for which NFT project on display deserved to become a balloon in next year’s IRL show, further bridging the real and virtual worlds (Cool Cats ultimately won).

Monk Thoughts We're working closely with the Web3-focused team at Macy's to envision how, year on year, a community can be built around Macy's and Web3 partners.
Tim Dillon

Build towards the future.

As metaverse spaces mature and the hype cycle winds down, brands are beginning to look beyond one-and-done activations for ways to build meaningful interactions that fuel long-term value. From setting the foundation for new revenue streams to iterating toward increasingly sophisticated metaverse experiences, reimagining an event can become a springboard for innovation.

You can trace this concept in action through the evolution of ComplexLand, with each annual edition building on those of previous years. Originally launched in 2019, the first ComplexLand was a single-player experience, though the following year added more opportunities for attendees to engage with others: sharing drops, having one-to-one conversations and interacting with branded non-player characters. This year’s version added Web3 capabilities—like minting NFTs—to enable new forms of connection and creative expression. The journey has led ComplexLand to become Complex Networks’ second-largest source of revenue and a key part of its events strategy.

The virtual Thanksgiving Day Parade similarly builds on Macy’s earlier Web3 efforts. Last year, the retailer celebrated the 95th anniversary of the parade by launching a series of 9,500 NFTs based on classic balloons. The addition of the immersive parade route and NFT galleries not only brings the magic of the holiday season to people everywhere but also introduces new audiences to both virtual spaces and NFTs.

Monk Thoughts The program allows Macy's to continue to build deeper engagement with the community and partners while still being cause driven. Whether it’s in a Discord chat or in a virtual gallery, it opens up many new opportunities for collaboration both on-chain and off.
Viktor Bezic

Build in authenticity by seizing the spirit of the event.

Finally, consider the overall purpose of your event or activation. Reinforcing a sense of purpose helps build authenticity into the overall experience. Macy’s captured the spirit of Thanksgiving with its virtual parade by donating all proceeds from NFT purchases to Big Brothers Big Sisters of America, a non-profit organization dedicated to supporting mentoring relationships for youth.

From the abstract world of ComplexLand to a virtual Fifth Avenue, both Complex Networks and Macy’s were able to expand the reach and relevance of their iconic IRL events. At the very least, these reimagined experiences offered moments of surprise to those familiar with their original in-person iterations. But more significantly, they serve as iterative steps that symbolize both brands’ willingness to continue building maturity in Web3 and the metaverse. By folding their core values into features that are inherently unique to the space, both brands authentically set the stage to meet digital-native audiences where they’re at.


experiential

Craft unforgettable moments that bring people and brands together.


Step into the future of experiential.

Our experiential team blends creative and technical expertise to deliver experiences offline, online and everywhere in between, combining best-in-class talent and tools to work across any medium and industry to meet the needs of a new, hyper-connected generation.

We are a full, in-house team of makers, creatives, engineers, developers, artists, and technologists combining worlds through innovation and interaction.

We are physical, virtual, and hybrid.

Monk Thoughts Experiences inspire emotions and create memories. Experiential connects brands and their audiences on this unforgettable level.
Alexandre Fittipaldi

Physical

Invite people to step into the world of your brand through highly PR-able spaces and activations where they can take part and engage in new ways.

From event spaces and retail to land art and ambitious exhibition centers, we craft future-forward experiences augmented by emerging technology, drawing on our entire end-to-end experiential expertise.

An experiential look into the future of living.


Case Study

Ellinikon Experience Centre: A permanent, large-scale exhibition inviting people to experience Europe’s greatest urban redevelopment project to date.

See Full Case Study
Monk Thoughts Our focus is to understand our clients’ needs, then create tailored, immersive experiences that reach their KPIs.
Rafael Fittipaldi

Want to talk experiential? Let’s chat.


Virtual

From product launches to trade shows, our end-to-end virtual events platform drives excitement across the full experience journey. Our suite of custom tools, including our proprietary LiveXP platform, fuels connection and interactivity in real time.


Case Study

Song Breaker Awards: We teamed up with Logitech for Creators to host the first-ever music awards show experience in the metaverse.

See Full Case Study

Hybrid

The future of experiences is hybrid, so we design experiences for digital participants and in-person audiences alike. From physical set pieces that online viewers can control to inviting audiences to participate in real time, we bridge the online and offline experience together to engage with audiences everywhere.

Monk Thoughts We love to transform innovative ideas into immersive experiences, always at the intersection of art, space and technology.
Marula Vaz

What makes us unique.

  1. Skills Across the Spectrum • From the initial idea to the finishing touches on-site, we provide an end-to-end experience solution using in-house experts in every region.
  2. We’re a team of creative makers where creativity meets tech execution to launch experiences and events that drive culture.
  3. From boots on the ground to remote production, our event teams span the globe—so you can meet audiences anywhere in the physical or virtual world.
  4. Custom-built tools and proprietary technology let us push platforms to the limits and create experiences that have never been seen before.
  5. With multidisciplinary, in-house experts in every region, we amplify brand impact at scale through incredible experiences.


Get in touch.


More on experiential

Our Map of the Metaverse Worlds: Find a Virtual Home Now

AI & Emerging Technology Consulting, Experience, Metaverse · 7 min read

Written by
Monks

Given the way the metaverse has captured marketers’ imaginations for the last year, it’s easy to feel the need to make moves in the space for fear of missing out—or maybe to simply be seen as an innovator. But with roots in gaming and digital art, the lifeblood of a metaverse world is the culture that calls it home. While hopping into the hot platform of the minute may be tempting, it’s important to carefully consider what value your brand can bring to show up authentically.

Monk Thoughts The metaverse is a new canvas for creativity, but the hyped-up trend wave could snuff all the incredible out of a good thing. We must see past the hype and look to the future with purposeful creativity.
Jouke Vuurmans, Chief Creative Officer

The idea of people coming together in virtual environments isn’t new, as any fan of online games will tell you. But as these worlds become more mainstream, we’re seeing a shift in the role they play in our lives, whether it’s a pivot from competition to cooperation or enabling people to push beyond limits that hold them back IRL. This is virtualization in action: a set of new audience behaviors and cultural norms resulting from 30 years of digital transformation, hyper-accelerated over the past five years. These behaviors vary from one metaverse platform to the next, meaning an understanding of its culture is crucial to success in the space. In some ways, it’s not so different from identifying which city or neighborhood is the best location for a brick-and-mortar store.

Do you need to find your way into the metaverse? Don’t despair, because we’ve mapped out some of its preeminent worlds. Discover what differentiates one from the next with the information below. With a better understanding of each space, you’ll be able to better envision your brand’s place within the metaverse—wherever that may be.


Roblox.

Despite its quick rise to fame in recent years, Roblox dates back to September 2006, and today has a monthly average user base of 190 million. Its worlds are user-created, meaning they can vary drastically in look and feel; you never know what you’re going to find there. Users have the ability to develop their own assets (models, textures, audio and more), adding to the variety on the platform. This ability to create and sample a diverse array of activities is what makes it so appealing to players. Roblox is free and multiplatform—available on PC, mobile, Xbox One and VR platforms—and its developer tools are surprisingly accessible for those without deep coding experience.

More than a game, Roblox is a creation platform that has allowed millions of amateur developers to try their hand at making games and virtual environments for the first time. Players can both create and share individual assets, build robust games or simply play. The level of creation enabled by the platform, as well as its younger-skewing audience, makes it ripe for memes. Don’t expect avatars to mirror players’ physical likeness; fanciful avatars are the norm here. One great example of a brand embracing gamified elements in Roblox is the Song Breaker Awards, presented by Logitech For Creators. The experience reinvented the awards show format to be more accessible and interactive, inviting viewers to participate in a narrative that unfolded throughout the show.

Horizon Worlds.

Meta’s foray into the metaverse is the newest virtual environment on our map—and the only one that requires a headset to enter. Since its launch in December 2021, Horizon Worlds has grown to 350,000 monthly active users. Like Roblox, environments in Horizon Worlds are largely user-created, meaning there’s a lot of variety in the worlds you can build or step into. The Unity-powered platform has a cartoon-like look and feel, with environments ranging from the fantastical to the ordinary, like a virtual comedy club or recording studio. An Oculus headset is required to enter, although the platform’s creation tools are accessible and intuitive.

Built by Meta, developer of some of the biggest social platforms on the internet, Horizon Worlds is first and foremost a space to socialize and create. While users can build competitive environments, connection among communities is key. The space also lends itself well to cultural moments like live sports or musical performances, which users can immerse themselves in from afar. Given the platform’s connection to Facebook—users can join with an existing Facebook account, although Meta just recently announced a unique account system—avatars and identities in Horizon Worlds are meant to reflect one’s real-world identity. This also makes safety and moderation a key consideration on the platform. Speaking of identity, Meta’s Going Beyond: Women’s History Month event, made in collaboration with the NBA, is a stand-out experience: throughout an interview focused on representation, viewers had a front-row seat.


Minecraft.

Officially launched in November 2011, Minecraft is home to 170 million monthly average users. The blocky, open-world simulation game places users within a unique, procedurally generated landscape that they can explore and manipulate to their desire. What’s really driven Minecraft’s popularity over the decade is its marketplace of downloadable content and customization tools, allowing for the design of diverse worlds and environments. While Minecraft is a paid download, its wide availability on PC, consoles and mobile makes it widely accessible to audiences.

Minecraft’s culture is focused on building. Players take enjoyment in creating environments together and sharing them with the community—or even breaking apart pre-made environments. Users are afforded complete control of the virtual spaces they inhabit, allowing for a high sense of ownership and collaboration. One of our favorite Minecraft activations is the Uncensored Library, which cleverly uses the game to circumvent state censorship and offer access to articles banned around the world. Of course, we have a soft spot for the time our VP of Platforms and Products Brook Downton built our New York office in Minecraft, too.

Fortnite.

Epic Games’ wildly popular shooter, powered by the developer’s own Unreal Engine, launched in July 2017 and boasts 280 million monthly average users. Having risen to fame at the height of the “battle royale” trend in gaming, Fortnite offers a handful of different game modes that take place on an island that grows and evolves over time. It features a cartoon-like art style, similar to what you might expect from a CGI-rendered animated film. As a free-to-play, cross-platform title available on consoles, mobile and PC, Fortnite has a very low barrier of entry for players.

Fortnite is an evolving space, refreshed seasonally with new competitive modes and events featuring limited-edition skins (avatars) depicting characters and celebrities from pop culture. The promise that there’s always something new is what keeps players coming back. In late 2021, Fortnite launched Party Worlds, or social spaces where players can access minigames, concerts, movie screenings and other content. The space demonstrates the kinds of ways that metaverse worlds can uniquely bring people together in shared, persistent social spaces.

Decentraland.

Decentraland launched in February 2020 and is frequented by 330,000 users each month. There’s a big difference between Decentraland and the other platforms mentioned above: it’s a Web3-based environment built on the Ethereum blockchain. While platforms like Roblox and Horizon Worlds invite users to jump into self-contained worlds, Decentraland is a seamless, persistent landscape in which plots of land are bought, sold and redeveloped by the community—just like real-world real estate. Decentraland is accessible in a browser, though the need to connect a crypto wallet can be a technical barrier to entry for users.

The culture in Decentraland is more plugged into the Web3 space and skews very digitally mature. In addition to in-world games and activities, this environment is a place to flaunt what you’ve got: art galleries designed to show off NFT collections are popular, and a bustling marketplace allows users to trade ownership of unique digital outfits and objects. Duolingo expertly captured the playfulness of the space by dropping a giant statue of Duo, its infamous mascot, into Decentraland’s leisurely Terra Zero area. Holding a billboard that cycles through snarky push notifications reminding visitors to do their language lessons, the activation cleverly emulates Duo’s pesky habit of popping up right when the leisurely activities tee off.

The Sandbox.

Our final metaverse world covered here is also one of the oldest, having launched back in May 2012. Today, The Sandbox is enjoyed by 300,000 monthly users. Like Decentraland, it’s a Web3-based world where users can purchase land and build their own monetized environments. The platform is relatively consistent in look and feel, taking a voxel art style reminiscent of Minecraft as a nod to its 2D roots. Available on PC and mobile platforms, The Sandbox makes onboarding easy: new users can connect a crypto wallet or, if they lack one, a social account.

As far as interactions go, The Sandbox offers a mix of what you’ll find in other platforms. Like Roblox, users can easily construct their own games without coding experience. A play-to-earn model rewards creators and players, incentivizing play on the platform. And similar to Decentraland, a marketplace of NFTs in the form of avatars and unique parcels of land adds to the opportunities for users to earn real-world value from their digital creations. One cool example we love is a collaboration between Tony Hawk and Autograph to build the biggest skatepark in the metaverse. In addition to hanging out in the space, visitors can purchase NFTs inspired by Hawk’s career.

Find your place in the metaverse.

Just as the universe contains too many planets to count, the metaverse is a vast space comprising unique worlds—each with its own distinct culture. From video game worlds to Web3-native environments, each platform offers different tools for people and brands to engage with one another. Before jumping into the metaverse for the sake of it, carefully consider the audience you want to reach and how your brand can uniquely add value inside the world cultivated by its community. After finding the right culture fit, you’ll have taken the crucial first step in building impactful, authentic metaverse experiences.

The metaverse is a natural progression of the internet, and it reflects a cultural shift brought on by the ongoing process of virtualization. The metaverse is everywhere: a universal and connected experience that transcends geographical barriers and presents exciting opportunities for brands to show up. But the stakes are high and the barrier to entry is steeper than ever. Advertising in the metaverse shouldn’t look like advertising at all. Brands need to strike a balance between being present and being authentic by providing utility and meaning for people through creativity and technological innovation. In short, brands must create experiences people actually want. While this isn’t a new idea, marketers will have to stretch their thinking for a new, fully virtualized medium and a highly engaged audience quick to criticize disingenuous marketing efforts.

Want your own map of the metaverse worlds? Download it below: 

For your desktop (with stats)

For your desktop (no stats)

For your phone (no stats)


ComplexLand • An Immersive Virtualization of an Iconic Cultural Festival

  • Client

    Complex Networks

  • Solutions

    Experience, Innovation Sprints, Retail Concept Innovation, Impactful Brand Activations

Case Study

Reimagining an icon.

ComplexCon is an institution among youth culture and style icons: a cultural mecca that brings the Complex Networks community and the hottest brands together to celebrate convergence culture. Realizing that trendsetters are increasingly just as interested in their digital identities as their physical ones, we leveraged this insight into new consumer behaviors to design ComplexLand: a free, immersive 3D digital platform featuring exclusive drops, ecommerce features, performances from top-selling artists and unique brand partnerships with the likes of Gucci, Versace and more.

Balancing accessibility with exclusive experiences.

While many virtual events try (and sometimes fail) to capture the energy of a crowded room, ComplexLand stands out as a single-player experience focused on global accessibility, community and lots of shoppable merch, a key feature enabled by Shopify’s robust system. By introducing exclusive brand partnerships that make it fun for visitors to shop, the platform has become Complex Networks’ second-largest source of revenue, and the ultimate example of how an authentic, entertaining experience can drive sales.

What’s more, it’s far from complex when it comes to usability. The experience is powered by WebGL, meaning attendees can reach the fully realized virtual theme park on both mobile and desktop devices—no app or download required. Part sci-fi treasure hunt and part virtual bazaar, players are free to roam the map and discover musical performances, food deliveries, celebrity panel discussions and screenings—then brag about it with others in a persistent chat room.

Our Craft

A virtual experience that makes shopping easy.

  • An avatar visits the Complex store
  • Shoppable merch on display at the Complex store
  • An avatar in the ComplexLand landscape with a colorful mountain
  • An avatar in ComplexLand chatting with another avatar

A future-proofed partnership.

Striking a meaningful connection between game mechanics and street culture while evoking the festival atmosphere, ComplexLand provides a digital space where people can shape their virtual identities and participate in compelling branded experiences. And just like culture itself, the annual event is in constant evolution. A year after the initial launch of ComplexLand in 2020, its second edition brought even more opportunities for attendees to engage with others in a multiplayer experience, like sharing drops, having one-to-one conversations and even interacting with branded non-playable characters. In its third iteration, we introduced the ability to mint NFTs, which creators can use to build their communities and express their creative identity.

Since the start of our partnership years ago, ComplexLand has grown into a profitable media and retail platform that combines commerce and entertainment. It’s the first of its kind to condense more than 70 brands into one shared virtual experience—allowing the institution to establish new partnerships with the hottest brands driving culture today. All thanks to our joint commitment to leverage the newest Web3 technologies and create a place where people can express themselves and connect with others.

Press: “From the first virtual event in the metaverse in December 2020 came a franchise that the publisher now sees as a permanent addition to its events business. And it’s a potentially lucrative addition at that.”
Read on Digiday

Results

  • $700,000+ in sales during the 5 days of ComplexLand 1.0
  • ComplexLand 2.0’s gamified virtual shopping increased sponsorship revenue by 60%
  • Since ComplexLand’s launch, it’s brought 200+ brands to the annual event
  • 2x FWAs

  • 1x The Drum Experience Awards

Want to talk innovation? Get in touch.


Can’t get enough? Here is some related work for you!

The Creator Economy, A True Game-Changer


4 min read

Written by
Monks

Three images of people gaming, taking selfies, and singing

In case you haven’t seen the writing on the wall, the days of passive spectatorship are over and done. With the rise of social media, brands have been handed the opportunity to engage in two-way conversations with their audience, as the power of conversation shifted toward consumers. Still, as innovation never sleeps, this system may soon lose its popular standing: word has it that Web3 is slowly but surely becoming a bigger dot on the horizon, paving the way for an internet era governed by the collective. While people like you and me enjoy this strong wind in our sails, brands are figuring out how to stay afloat amid the shifting currents of our industry. After all, brands are always searching for the best strategy to connect with new and existing audiences, our Social Innovation Lab says in its latest report, The Year of Digital Creators.

It’s all in the name, because collaborating with digital creators has proven to be one very effective way for brands to work their crowd. With influencers and creators—yes, they are different and the report will tell you why—the “creator economy” has rapidly expanded to a $20 billion industry. As such, it’s a growing focus for brands looking to catch the attention of consumers. This fast-expanding segment of our industry is driven by the ease of producing high-quality creative content, forging connections over shared interests and passions, and serving the desire for an authentic community.

Gaming, our Social Innovation Lab highlights, is a great example of an industry where this creator economy can thrive, as it offers a digital space where influencers and creators, like livestreamers, are consistently strengthening connections with their communities at the intersection of content and entertainment. While the report touches on several areas where digital creators are adding depth, let’s use this space to dive into gaming and explore why it’s not only super fun, but also super efficient in speaking to your audience. 

Getting Ahead Of The Game 

As social distancing reigned these past two years, many people moved from physical to virtual worlds in search of new forms of entertainment. As such, it may not come as a surprise that the global gaming market has grown 21.9% compared to pre-pandemic levels, Statista reports. In 2020, there were nearly 2.7 billion active gamers worldwide and this number continues to grow, especially among Gen Z. The gaming industry has an incredible global reach—and leading brands are taking note, increasingly embracing this culture with open arms. 

“The gaming industry, particularly the culture, is becoming more crucial for every brand that wants to survive in the next twenty to thirty years. This has everything to do with upcoming generations, who will grow older and with time gain more buying power,” says Funs Jacobs, our Gaming Category Lead. “81% of Gen Z’ers have played video games—the highest share of any generation—so if you fail to understand this culture, you’re going to miss the connection with this audience and every following generation.” Check out our podcast episode with Funs Jacobs. 

So, collaborations between brands and gaming platforms are not just becoming more common, but also more serious, with the former owning virtual spaces inside video games or even producing their own unique, artistic gamified experiences. That said, the gaming industry is completely new territory and vastly different from what brands are used to creating in collaboration with influencers. Nevertheless, many industries don’t shy away from a challenge. Fashion and luxury, known for their innovative spirit and commitment to speaking to the moment and to shifts in culture, were quick to tap into the gaming trend. Nike, for example, recently built its metaverse store Nikeland in Roblox, allowing nearly 7 million visitors worldwide to try on virtual products and play various games. Through such gaming platform collaborations, fashion brands are able to gamify their virtual products, thereby making the shopping experience all the more exciting.

Livestream to Streamline Your Community Engagement 

Talking about excitement, live action is indispensable to a gamified brand experience in an era of digital creators. While gaming was originally a digital experience that combined gameplay, interactivity and narrative, it now also entails streaming technologies that enable the creation of online communities centered on the acts of playing and watching. Nowadays, popular game streamers are able to interact and connect with fans around the world and across platforms like Twitch, Instagram and YouTube. Brands want to reach audiences far and wide, whereas streamers want to be sponsored and earn an income from doing what they love. So, it wasn’t long before the two joined forces—it’s a true win-win. 

Community-building is central to the gaming experience. “Communities are being formed in and around gaming, which is fascinating,” says Jacobs. “However, many brands don’t have a strong community at the moment. They may have fans, but they don’t have that 360-degree relationship with their consumers—and that is something that needs to change in order for brands to survive.” Through collaborations with game streamers, brands are able to tap into diverse digital communities. Within these communities, the work of game streamers especially contributes to building an environment in which fans are not just entertained, but also gain a sense of belonging. 

Our Social Innovation Lab argues that “community” is the new version of word-of-mouth, and the opinions of people who are influential online can either boost or block sales. Belonging, information-sharing and the demand for a product are all stimulated by the powerful influence that digital communities can wield. Moreover, they provide a very useful space for brands to gather insights and feedback. So, by getting to know the digital community and looking closely at its behaviors, preferences and needs, brands have a unique opportunity to deliver tailored products, services and content. In other words, the digital community is an innovative, effective and fast-paced way for brands to sell directly to their consumers in the social media space—but more about community commerce can be found in the report. 

Forecasting The Future Of Creators  

While subcultures and tight-knit communities interacting across platforms characterize gaming culture today, it could look completely different tomorrow. The industry is growing, innovating and evolving faster every day. Fortunately, we now know that brands are paying close attention to these important developments. Always ahead of the industry, our Social Innovation Lab expects that one such development will be metaverse integration, predicting that in the next five years game streamers will interact with fans through their avatars in the metaverse while wearing virtual products and playing games together. Again, both efficient and super fun.

What else do we think might change? Explore the report and find out more about the current state and future of digital creators.


Scrap the Manual: Virtualization of Real World Objects into Live Shows


16 min read

Written by
Labs.Monks

Linear strobes cross the title of the article: virtualization of real world objects into live shows

What if you could scan an object in your environment and bring it into a live show? In this “How Do We Do this” episode of the Scrap the Manual podcast, we respond to an audience-provided question from one of you! Tune in to learn more about the differences between 3D scanning and photogrammetry, content moderation woes, and what it could take to make this idea real.

You can read the discussion below, or listen to the episode on your preferred podcast platform.


Angelica: Hey everyone! Welcome to Scrap The Manual, a podcast where we prompt “aha” moments through discussions of technology, creativity, experimentation, and how all those work together to address cultural and business challenges. My name is Angelica.

Rushali: And my name is Rushali. We are both creative technologists with Labs.Monks, which is an innovation group within Media.Monks with a goal to steer and drive global solutions focused on technology and design evolution.

Angelica: Today, we have a new segment called “How Do We Do This?” where we give our audience a sneak peek into everyday life at Labs and open up the opportunity for our listeners to submit ideas or projects. And we figure out how we can make it real. We'll start by exploring the idea itself, the components that make it unique and how it could be further developed, followed by doing a feasibility check. And if it isn't currently feasible, how can we get it there? 

Which leads us to today's idea submitted by Maria Biryukova, who works with us in our Experiential department. So Maria, what idea do you have for us today?

Maria: So I was recently playing around with this app that allows you to scan in 3D any objects in your environment. And I thought: what if I could scan anything that I have around me—let's say this mic—upload it live and see it coming to life on the XR stage during the live show?

Angelica: Awesome, thanks Maria for submitting this idea. This is a really amazing creative challenge and there's a lot of really good elements here. What I like most about this idea, and is something that I am personally passionate about, is the blending of the physical world with the digital world, because that's where a lot of magic happens.

That's where, when AR first came out, people were like, “Whoa, what's this thing that's showing up right in front of me?” Or in VR when they brought these scans of the real world into the very initial versions of Google cardboard headsets, that was like the, “Whoa! I can't believe that this is here.”

So this one's touching upon the physicality of a real object that exists in the real world…someone being able to scan it and then bring it into a virtual scene. So there's this transcending of where those lines are, which are already pretty blurred to begin with. And this idea continues to blur them, but I think in a good way, and in a way that guests and those who are part of the experience can go beyond being a passive observer to being an active participant.

We see this a little bit with WAVE, where they have a virtual space and people are able to go to this virtual concert and essentially pay like $5 to have a little heart get thrown to Justin Bieber's head. Lovingly, of course, but you get the point. Where this one, it takes it another step further in saying, “Okay, what if there's something in my environment?”

So maybe there's an object that pertains to the show in a particular way. Let's say that it's like a teddy bear. And all the people who have teddy bears around the world can scan their teddy bear and put it into this environment. So they're like, “Oh, that's my teddy bear.” Similar to when people are on the jumbotron during sports events and they're like, “Hey, that's my face up there.” And then they go crazy with that. So it allows more of a two-way interaction, which is really nice here.

Rushali: Yeah. That's the part that seems interesting to me. As we grow into this world where user-generated content is extremely useful and we start walking into the world of the metaverse, scanning and getting 3D objects that personally belong to you—or a ceramic clay thing, or a pot that you made yourself—and being able to bring it into the virtual world is going to be one of the most important things. Because right now, in Instagram, TikTok, or with any of the other social platforms, we are mostly generating content that is 2D, or generating content that is textual, or generating audio, but we haven't explored extremely fast 3D content generation and exchange the way that we do with pictures and videos on Instagram. So we discussed the “why.” It's clearly an interesting topic, and it's clearly an interesting idea. Let's get into the “What.”

Angelica: Yeah. So from what we're hearing on this idea, we have scanning the object, which will connect to 3D scanning and photogrammetry, which we can get a little bit into the differences between the two different types of technologies. And then when the scan is actually added into the environment, is it cleaned up? Is it something where it acts as its own 3D model without any artifacts from the environment that it was originally scanned in? And a part of that is also the compositing. So making sure that the object doesn't look like a large ray of sunlight when the event is very moody and dark. It needs to fit within the scene that it's within.

And we're hearing content moderation, in terms of making sure that when guests of any kind become a little bit more immature than the occasion requires, that it filters out those situations to make sure that the object that needs to be scanned in the environment is a correct one.

Rushali: Absolutely. What was interesting while you were walking through all the different components was the way that this idea comes together: it's just so instantaneous and real time that we need to break down how to do this dynamically. 

Angelica: Yeah. And I think that's arguably the most challenging part, aside from the content moderation aspect of it. Let's use photogrammetry as an example. Photogrammetry is the process of taking multiple pictures of an object from as many sides as you can. An example of this is with Apple's Object Capture API. You just take a bunch of photos. It does its thing, it processes it, it thinks about it. And then after a certain amount of time (sometimes it's quick, sometimes it's not…depends on how high quality it needs to be), it'll output a 3D model that it has put together based on those photos.

Rushali: Yeah. So the thing that I wanted to add about photogrammetry, that you described very well, was that in the last just five years, photogrammetry has progressed from something very basic to something outrageously beautiful very quickly. And that one of the big reasons for that is how the depth sensing capability came in and became super accessible.

Imagine someone standing on a turntable and taking pictures from each and every angle: we turn the turntable really slowly and end up with some 48,000 pictures to stitch together to create this 3D object. But a big missing piece in this puzzle is the idea of depth. Is this person's arm further away or closer? When that depth information comes in, the resulting 3D object suddenly becomes a lot more accurate. So iPhones gaining a depth-sensing camera over the last couple of years has really enhanced the capabilities.
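As a rough illustration of the capture-planning arithmetic behind a turntable setup, here is a minimal sketch; the step size and photo counts are hypothetical, not taken from any specific app:

```python
# Back-of-the-envelope plan for a single horizontal capture ring on a turntable.
# Given a desired angular step between photos, how many shots does one full
# rotation take, and at which angles is the camera fired?

def capture_angles(step_degrees: float) -> list[float]:
    """Return the turntable angles (in degrees) at which to take a photo."""
    if not 0 < step_degrees <= 360:
        raise ValueError("step must be in (0, 360]")
    n_shots = int(360 // step_degrees)
    return [i * step_degrees for i in range(n_shots)]

angles = capture_angles(15)   # one photo every 15 degrees
print(len(angles))            # 24 photos for a single horizontal ring
print(angles[:4])             # [0.0, 15.0, 30.0, 45.0]
```

In practice, photogrammetry tools want several vertical rings and generous overlap between neighboring photos, not just one horizontal sweep, which is how the photo counts climb into the thousands.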

Angelica: Yeah, that's a good point. There is an app that had been doing this time-intensive and custom process for a very long time. But then when Apple released the Object Capture API, they said, “Hey, actually, we're going to revamp our entire app using this API.” And they even say that it's a better experience for iPhone users because of the combination of the Object Capture API and leveraging the super enhanced cameras that are now coming out of just a phone.

Android users, you're not out of the woods here. Some Samsung phones, like the Samsung Galaxy S20 and up, have a feature embedded right in the phone software where you can do the same process that I was just mentioning earlier about a teddy bear.

There's a test online where someone has a teddy bear to the left of a room, they scan it and then they're able to do a point cloud of the area around it. So they could say, “Okay, this is the room. This is where the walls are. This is where the floor is.” And then paste that particular object that they just scanned into the other corner of the room and pin it or save it. So then if they leave the app, come back in, they can load it, and that virtual object is still in the same place because of the point cloud scanning their room, their physical room, and they put it right back where it was. So it's like you got a physical counterpart and a digital counterpart. It's more than just iPhones that are having these enhanced cameras. That experience is made possible because Samsung cameras are also getting better and better over time. 

The process I was just explaining about the teddy bear, the point cloud, and placing it into an environment…that is a great example of 3D scanning, where you can move around the object but it's not necessarily taking a bunch of photos and stitching them together to create a 3D model. The 3D scanning part is a little bit more dynamic. But it is quite light sensitive. So, for example, if you're in a very sunny area, it'll be harder to get a higher quality 3D model from that. So keeping in mind, the environment is really key there. Photogrammetry as well, but 3D scanning is especially sensitive to this.

Rushali: So children, no overexposure kindly… 

Angelica: …of photos. Yes. [laughter]

Though, that scanning process can take some time and it can vary in terms of fidelity. And then also that 3D model may be pretty hefty: it may be a pretty large file size. So then that's when we're getting into the conversation of having this be uploaded to the cloud and offloading some of that storage there. Rather than weighing down the user's phone, it goes somewhere else, and then the show could actually funnel that into the experience, after the content moderation part of course.

Rushali: You've brought up a great point over here as well, because a big, big, big chunk of this is also fast internet at the end of the day because 3D files are heavy files. They are files that have a lot of information about the textures.

The more polygons there are, the heavier the files. All of that dramatic graphics part comes into play and you are going to get stuck if you do not have extremely fast internet. And I *wink, wink* think 5G is involved in this situation. 
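To make the “more polygons, heavier files” point concrete, here is a back-of-the-envelope sketch. The per-vertex layout is one common uncompressed format, and all of the mesh sizes are hypothetical examples:

```python
# Rough estimate of how heavy a scanned mesh gets as polygon counts grow.
# Assumes each vertex stores position + normal + UV as 32-bit floats, and
# each triangle stores three 32-bit vertex indices (textures not included).

def mesh_size_mb(n_vertices: int, n_triangles: int) -> float:
    bytes_per_vertex = (3 + 3 + 2) * 4   # xyz + normal + uv, 4 bytes each
    bytes_per_triangle = 3 * 4           # three uint32 indices
    total = n_vertices * bytes_per_vertex + n_triangles * bytes_per_triangle
    return total / 1_000_000

# A quick phone scan vs. a dense photogrammetry result:
print(round(mesh_size_mb(10_000, 20_000), 2))        # 0.56 MB
print(round(mesh_size_mb(1_000_000, 2_000_000), 1))  # 56.0 MB, before textures
```

Texture maps usually dominate on top of this, which is why moving uploads over a fast connection (and offloading storage to the cloud) matters so much at concert scale.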

Angelica: Yeah, for sure. 5G is definitely a point of contention in the United States right now, because carriers are in the middle of rolling it out, which is affecting aviation and the FAA and other things like that. So it's like, yeah, the possibilities with 5G are huge, but there's some things to work out still.

Rushali: So that's the lay of the land of 3D scanning and photogrammetry. And we do have apps right now that in almost real time can give you a 3D scan of an object in their app. But the next part is the integration of this particular feature with a live show or a virtual ecosystem or putting it into a metaverse. What do you think that's going to look like?

Angelica: This will involve a few different components. One: storage in the cloud, or a server of some kind that can store not just one person’s scan but multiple people's scans. And I could easily see an overload situation if you say to an audience of Beliebers, “Hey, I want you to scan something.”

They're like, “Okay!” And you got 20,000 scans that now you dynamically have to sift through and have those uploaded into the cloud to then be able to put into the experience. I can anticipate quite an overload there.

Rushali: You're absolutely on point. You're in a concert: 20,000 to 50,000 people are in the audience. And they are all scanning something that they have either already scanned or will be scanning live. You better have a bunch of servers there to process all of this data that's getting thrown your way. Imagine doing this activity, scanning an object and pulling it up in a live show. I can 100% imagine someone's going to scan something inappropriate. And since this is real time, it's gonna get broadcasted on a live show. Which brings into the picture the idea of curation and the idea of moderation.
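The server capacity question can be sketched with simple throughput math. Every figure below (scan count, per-scan processing time, show length) is a hypothetical assumption, not a measured number:

```python
# Hypothetical capacity check: if every attendee uploads one scan during a show,
# how many processing workers keep the queue drained before the show ends?

import math

def workers_needed(n_scans: int, seconds_per_scan: float, window_seconds: float) -> int:
    """Workers required to process n_scans within the show window."""
    total_work = n_scans * seconds_per_scan
    return math.ceil(total_work / window_seconds)

# 20,000 scans, ~90 s of processing each, a 2-hour show:
print(workers_needed(20_000, 90, 2 * 3600))  # 250 parallel workers
```

Even with generous assumptions, the fleet needed is large, and that is before any human moderation time is counted, which is one argument for collecting scans well ahead of the live moment.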

Angelica: Because adults can be children too.

Rushali: Yeah, absolutely. If there's no moderation…turns out there's a big [adult product] in the middle of your concert now. And what are you going to do about it? 

Angelica: Yeah exactly. Okay, so we've talked about how there are a lot of different platforms out there that allow for 3D scanning or the photogrammetry aspect of scanning an object and creating a virtual version of it, along with a few other considerations as well.

Now we get into…how in the world do we do this? This is where we explore ways that we can bring the idea to life, tech that we can dive a bit deeper into, and then just some things to consider moving forward. One thing that comes up immediately (we've been talking a lot about scanning) is how do they scan it? There's a lot of applications that are open source that allow a custom app to enable the object capture aspect of it. We talked about Apple, but there's also a little bit that has been implemented within ARCore, and this is brought to life with the LiDAR cameras. It's something that would require a lot of custom work to be able to make it from scratch. We would have to rely on some open source APIs to at least get us the infrastructure, so that way we could save a lot of time and make sure that the app that's created is done within a short period of time. Because that's what tends to happen with a lot of these cool ideas is people say, “I want this really awesome idea, but in like three months, or I want this awesome idea yesterday.” 

Rushali: I do want to point out that a lot of these technologies have come in within the last few years. If you had to do this idea just five years ago, you would probably not have access to object capture APIs, which are extremely advanced right now because they can leverage the capacity of the cameras and the depth sensing. So doing this in today's day and age is actually much more doable, surprisingly.

And if I had to think about how to do this, the first half of it is basically replicating an app like Qlone. And what it's doing is, it's using one of the object capture APIs, but also leveraging certain depth sensing libraries and creating that 3D object. 

The other part of this system would then be: now that I have this object, I need to put it into an environment. And that is the bigger unknown. Are we creating our own environment or is this getting integrated into a platform like Roblox or Decentraland? Like what is the ecosystem we are living within? That needs to be defined.

Angelica: Right, because each of those platforms have their own affordances to be able to even allow for this way of sourcing those 3D models dynamically and live. The easy answer, and I say “easy” with the lightest grain of salt, is to do it custom because there's more that you can control within that environment versus having to work within a platform that has its own set of rules.

We learned this for ourselves during the Roblox prototype for the metaverse, where there are certain things that we wanted to include for features, but based on the restrictions of the platform, we could only do so much.

So that would be a really key factor in determining: are we using a pre-existing platform or creating a bespoke environment that we can control a lot more of those factors?

Rushali: Yeah. And while you were talking about the ecosystem parts of things, it sort of hit me. We're talking about 3D scanning objects, like, on the fly as quickly as possible. And they may not come out beautifully. They may not be accurate. People might not have the best lighting. People might not have the steadiest hands because you do need steady hands when 3D scanning objects. And another aspect that I think I would bring in over here when it comes to how to do this is pulling in a little bit of machine learning so that we can predict which parts of the 3D scan have been scanned correctly versus scanned incorrectly to improve the quality of the 3D scan.

So in my head, this is a multi-step process: figuring out how to object capture and get that information through the APIs available by ARCore or ARKit (whichever ones), bring the object and run it through a machine learning algorithm to see if it’s the best quality, and then bring it into the ecosystem. Not to complicate it, but I feel like this is the sort of thing where machine learning can be used. 
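The multi-step pipeline described here can be sketched end to end. Everything below is a hypothetical stand-in: real capture would run through ARKit or ARCore, the quality gate would be a trained model rather than a coverage threshold, and moderation would involve humans in the loop:

```python
# Minimal sketch of the capture -> quality gate -> moderation -> show pipeline.
# All names and thresholds here are illustrative placeholders.

from dataclasses import dataclass

@dataclass
class Scan:
    mesh_id: str
    coverage: float      # fraction of the object surface actually captured
    approved: bool = False

def quality_gate(scan: Scan, min_coverage: float = 0.8) -> bool:
    # Stand-in for an ML model scoring which parts scanned correctly.
    return scan.coverage >= min_coverage

def moderate(scan: Scan, blocklist: set[str]) -> bool:
    # Stand-in for content moderation (human-in-the-loop in practice).
    return scan.mesh_id not in blocklist

def ingest(scans: list[Scan], blocklist: set[str]) -> list[Scan]:
    """Only scans that pass both gates reach the live show's 3D environment."""
    accepted = []
    for scan in scans:
        if quality_gate(scan) and moderate(scan, blocklist):
            scan.approved = True
            accepted.append(scan)
    return accepted

uploads = [Scan("teddy_bear", 0.92), Scan("blurry_mug", 0.41), Scan("banned_item", 0.95)]
shown = ingest(uploads, blocklist={"banned_item"})
print([s.mesh_id for s in shown])  # ['teddy_bear']
```

The ordering matters: running the cheap quality gate first means the expensive moderation step only ever sees scans that are good enough to appear on stage.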

Angelica: Yeah, definitely. And one thing that would be interesting to consider is that the dynamic aspect of scanning something and then bringing it live is the key part in all this. But it also has the most complications and is the most technology dependent, because there's a short amount of time to do a lot of different processes.

One thing that I would recommend is: does it have to be real time? Could it be something that's done maybe a few hours in advance? Let's say that there's a really awesome Coachella event where we have a combination of a digital avatar influencer of some kind sharing the stage with the live performer. And for VIP members, if they scan an object before the show, they will be able to have those objects actually rendered into the scene.

So that does a few different things. One: it decreases the amount of processing power that's needed because it's only available for a smaller group of people. So it becomes more manageable. Two: it allows for more time to process those models at a higher quality. And three: content moderation. Making sure that what was scanned is something that would actually fit within the show.

And there is a little bit more of a back and forth. Because it's a VIP experience, you could say: “Hey, so the scan didn't come out quite as well.” I do agree with you, Rushali, that having implementation of machine learning would help in assisting this process. So maybe having it a little bit of time before the actual experience itself would alleviate some of the heaviest processing and the heaviest usage that can cause some concerns when doing it live. 

Rushali: And to add to that, I would say this experience (if we had to do it today in 2022) would probably be something on the lines of this: you take the input at the start of a live show, and use the output towards the end of it. So you have those two hours to do the whole process of moderation, to do the whole process of passing it through quality control. All of these steps that need to happen in the middle. 

Also, there's a large amount of data transfers happening as well. You're also rendering things at the same time and this is a tricky thing to do in real time as of today. You need to do it with creative solutions, with respect to how you do it. And not with respect to the technologies you use, because the technologies currently have certain constraints. 

Angelica: Yeah, and technology changes. That's why the idea is key because maybe it's not perfectly doable today, but it could be perfectly doable within the next few years. Or even sooner, we don't know what's going on behind the curtain of a lot of the FAANG companies. New solutions could be coming out any day now that enable some of these pain points within the process to be alleviated much more. 

So we've talked about the dynamic aspect of it. We've talked about the scanning itself, but there are some things to keep in mind for those scanning an object. What are some things that would help with getting a clean scan?

There's the basics, which is avoid direct lighting. So don't do the theater spotlight on it because then that’ll blow out the picture. Being uniformly lit is a really important thing here, making sure to avoid shiny objects. While they're pretty pretty, they are not super great at being translated into reliable models because the light will reflect off of them.

Those are just a few, and there are definitely others, but those are some of the things that would be part of the instructions when users are actually scanning. After the scan is done, like I mentioned, there are some artifacts that could be within the scan itself. So an auto-clean process would be really helpful here; otherwise it has to be done manually. The manual part would take a lot more time, which would hurt the feasibility aspect of it. And that's also where the machine learning aspect could maybe help.

And then in addition to cleaning it up, there's the compositing: making sure that it looks natural within the environment. So all those things would have to be done as some combination of automated and manual processes. I could see how the final models chosen for the show could go through a more manual process to make sure the lighting suits the occasion. And if we go with the route you mentioned, which is to do it at the very beginning of the show, then we have a bunch of time (and I say a bunch, but it's really two hours, optimistically) to do all of these time-intensive processes and make sure it's ready by the end of the show.

Moderation is something we've also talked about quite a bit here as well. There are a lot of different ways for moderation to happen, but it's primarily focused on image, text and video. There is a paper out of Texas A&M University that does explore moderation of 3D objects, mostly to prevent NSFW (not safe for work) 3D models from showing up when teachers just want their students to look up models for 3D printing. That's really the origin of the paper. And they suggested different ways the learning process of moderation could be done, such as what they call human-in-the-loop augmented learning. But it's not always reliable. This is an exploratory space without a lot of concrete solutions. So this would be one of the heavier things to implement, looking at the entire ecosystem of what would need to be built around the concept.

Rushali: Yeah, if you had to find a more sustainable way. And when I say sustainable, I don't mean with respect to the planet, because this project is not at all sustainable in that sense, considering the large amounts of data being transferred. But coming back to making the moderation process more sustainable, you can always open it up to the community, so the people who are attending the concert decide what goes in. Maybe there's a voting system, or maybe there's an automated AI that can detect whether someone has uploaded something inappropriate. There are different approaches within moderation that you could take. But for the prototype, let's just say: no moderation, because we are discussing, “How do we do this?” and simplifying it is one way of reaching a prototype.
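The community-voting idea could be as simple as a tally with an approval threshold. A minimal sketch, where the vote counts, labels, and thresholds are all illustrative rather than drawn from any production system:

```python
from collections import Counter

def moderate_by_vote(votes, min_votes=10, approval_ratio=0.8):
    """Approve a scanned model only if enough attendees voted and
    a large majority voted to keep it."""
    tally = Counter(votes)  # each vote is "keep" or "remove"
    total = tally["keep"] + tally["remove"]
    if total < min_votes:
        return "pending"  # not enough signal yet
    return "approved" if tally["keep"] / total >= approval_ratio else "rejected"

print(moderate_by_vote(["keep"] * 9 + ["remove"]))      # approved (9/10 >= 0.8)
print(moderate_by_vote(["keep"] * 5 + ["remove"] * 5))  # rejected
print(moderate_by_vote(["keep"] * 3))                   # pending
```

A real system would still need safeguards on top of this, since a coordinated crowd can vote inappropriate content through, which is why the hosts pair it with automated detection.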

Angelica: Right, or it could be a manual moderation.

Rushali: Yes, yes.

Angelica: Which would help, but you would need to have a team ready for the moderation process. And it could work for a smaller group of people.

So it could be for an audience of, let's say, 50 people. That's a much smaller audience to sift through scans for than a crowd of 20,000 people, which would definitely need an automated process if it has to be done within a short amount of time.
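Some back-of-the-envelope arithmetic makes that tradeoff concrete. Assuming, purely for illustration, that a human moderator needs about 30 seconds per scan:

```python
def review_hours(num_scans, seconds_per_scan=30, moderators=1):
    """Total wall-clock hours to manually review all scans."""
    return num_scans * seconds_per_scan / moderators / 3600

# 50 attendees: well under an hour for a single moderator.
print(round(review_hours(50), 2))                    # 0.42
# 20,000 attendees: about 167 hours for one person; even a team of
# ten would need most of a day, so automation becomes unavoidable
# inside a two-hour show window.
print(round(review_hours(20000), 1))                 # 166.7
print(round(review_hours(20000, moderators=10), 1))  # 16.7
```

The 30-second figure is a guess for the sake of the estimate; the conclusion holds across a wide range of plausible per-scan times.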

So in conclusion, what we've learned is that this idea is feasible…but with some caveats. Caveats pertaining to how dynamic the scan needs to be. Does it need to be truly real time or could it be something that can take place over the course of a few hours, or maybe even a few days or a few weeks? It makes it more or less feasible depending upon what the requirements are there.

The other one is thinking about the cleanup: making sure that the scan fits the environment, that it looks good, all those types of things. Then there's the moderation aspect, to make sure that the objects uploaded suit what needs to be implemented. So if we say, “Hey, we want teddy bears in the experience,” but someone uploads an orange, we probably don't want the orange, so there is a little bit of object detection needed there.

Okay, that's about it. Thanks everybody for listening to Scrap The Manual and thank you, Maria, for submitting the question that we answered here today. Be sure to check out our show notes for more information and references of things that we mentioned here. And if you like what you hear, please subscribe and share. You can find us on Spotify, Apple Podcasts, and wherever you get your podcasts. 

Rushali: And if you want to suggest topics, segment ideas, or general feedback, feel free to email us at scrapthemanual@mediamonks.com. If you want to partner with Media.Monks Labs, feel free to reach out to us over there as well. 

Angelica: Until next time…

Rushali: Thank you!

In this “How Do We Do This” episode of the Scrap the Manual podcast, we learn more about the differences between 3D scanning and photogrammetry, content moderation woes, and what it could take to make this idea real.

Get Versed in the Metaverse

AI & Emerging Technology Consulting, Experience, Extended reality, Metaverse 1 min read

Written by
Monks


The Metaverse Demystified

As the next phase of transformation, virtualization is changing how we interact with the digital touchpoints in our lives—and has set the stage for the metaverse. As more headlines espouse the benefits the metaverse will bring to digital audiences, it’s important to step back and understand the overarching concepts and virtualized behaviors that shape the space. Our report provides a straightforward overview of the metaverse with the context you need to gain a better understanding of the reinvention of the web.

Making the Metaverse report cover page

You're one download away from:

  • Understanding exactly what virtualization is and how digital transformation plays into it.
  • Learning about tech-tonic trends and themes that are driving demand for immersive, new experiences.
  • Building an understanding of where your brand fits into the metaverse.


Download Now

A Backstage Look at the Metaverse’s First Music Award Show

5 min read

Written by
Monks

Bretman Rock, GAYLE, and Lizzo Roblox avatars

Historically, award shows have been exclusive affairs: from serving looks on the red carpet to the after parties, much of the experience is mediated through the screen to viewers far away. The vicarious thrill of the glitz and glamour isn’t lost on viewers watching from home, but gains in digital have the potential to reinvent the award show format into experiences that are playful, interactive and open to anyone with an internet connection.

“Television award shows are seeing a decline in viewership, while social media and audience expectations have challenged what an award show should even be,” says Eric W. Shamlin, Media.Monks’ Global Head of Entertainment. “Today the audience is more demanding and in more control than ever before.”

Digital culture has already made its mark on mainstream ceremonies, as exhibited in TikTok’s influence over the Best New Artist category at this year’s GRAMMYs. But more broadly speaking, the internet has played a substantial role in shaping culture through music. That trend is evidenced by the Billboard Song Breaker Chart, a monthly music industry chart co-created with Logitech For Creators to spotlight trendsetting creators who are driving music consumption through content creation and positively disrupting the traditional music business model.

This year the second annual Song Breaker Awards, presented by Logitech For Creators, is putting those names on display. The ceremony will take place in Roblox on April 30 at 10 a.m. PST, pulling back the velvet rope to deliver a creator-centered, fun-first experience honoring ten individuals driving culture. As reported by Fast Company, it’s the first music award show in the metaverse and will be hosted by Bretman Rock. The experience culminates in performances by singer-songwriter GAYLE and multi-GRAMMY award-winning artist Lizzo, who is making her metaverse debut with a new single. But more importantly, the show offers a blueprint to how metaverse worlds can uniquely celebrate and enable creativity at an unprecedented scale.

"In re-imagining what a creator-focused award show could look like in the metaverse, we sought a partner that could guide us in this brand new space,” said Meridith Rojas, Global Head of Talent and Entertainment, Logitech. "In order to create for this format, you need to know this format. Media.Monks' inclusion of narrative and game-play resulted in a truly vibrant and engaging environment and production that's sure to surprise and delight the audience."

An Awards Show Starring… You

True to the Song Breaker Awards’ roots in digital creativity, Roblox is itself a robust platform for creative expression. It makes game and experience development easy for audiences with little to no programming background, which has sparked a thriving creator community that averaged 49.5 million daily active users in Q4 2021. That makes the platform an ideal space for an engaging twist on musical award shows, if you know what the community values.

We partnered with Kurt Bieg, Chief Game Designer at Simple Machine, to create an experience that feels authentic and engaging to the Roblox community. “We built the show around a story that kids will care about,” says Bieg, teasing a high-stakes plot that players will engage with in real time. But focusing on a participatory narrative doesn’t mean the award ceremony itself falls into the background. “As viewers will see, the honorees will play a key role in shaping the story. It supports Logitech For Creators’ promise, ‘Together we create,’” says Bieg.

 

Monk Thoughts: “Moving the classic award show into the metaverse allows for maximum engagement and increased levels of interactivity and storytelling. There’s no turning back, and we’re proud to support Logitech in this bold move that will set the standard for awards shows of the future,” says Eric Shamlin.

Once inside the world, attendees can explore a futuristic city tricked out with gamified and interactive elements, including larger-than-life selfies, a shop stocked with digital Lizzo and Logitech merch, and even a roller coaster that glides along the city’s borders. What makes this space unique among high-profile Roblox experiences is that the entire event takes place within one seamless environment. “Usually, Roblox experiences play out across a series of scenes like dioramas, but ours is one persistent world,” says Bieg.

Song Breaker Awards Roblox entrance

In that respect, the Song Breaker Awards might be more like a physically embodied event than initially expected. “We treated the show more like a traditional stage, where set pieces are moving in and out,” says Brett Burton, Creative Director at Media.Monks. “It seems like an old-fashioned way of doing things, but it is technically challenging.”

Interactive Environments Range from Chill to Thrill

As people explore the city, they’re met with a series of experiences that build up excitement before the big show—perhaps the flashiest being the Selfie XL, a unique innovation developed for the Song Breaker Awards. Rather than take a photo, the camera recreates players’ avatars with a clone that towers over the city skyline for everyone to see.

The Selfie XL is made possible through a quirk in the way selfies function in the platform. When you take a photo in Roblox, the program renders a copy of the scene and flattens it into a 2D image. “We thought, ‘If we’re cloning the models, why don’t we just make them ginormous and not flatten them?’” says Bieg, who expects fans will try to push the Selfie XL to its limits, like trying to fit as many people as possible in its field of view.

A roller coaster designed to look like Logitech mice makes for another way to immerse oneself in the city as it travels across various places of interest. It’s just one of many creative nods to popular Logitech products that players may recognize from their own setups. “We wanted to turn Logitech’s products into unique experiences that would surprise everyone,” says Burton. “We wanted to incorporate them in ways that didn’t feel heavy handed but fit the world.”

Taking another cue from amusement park design, these main attractions—outside of the musical performances and award ceremony itself—are cleverly spaced out to invite attendees to explore around the entire city. Burton notes that many nooks and crannies are designed to be meme-able to encourage sharing—think of them like the virtual version of selfie stations you might find at a zoo or a museum. As people wander, they may uncover a handful of Creator Coins that unlock exclusive dances and animations to uniquely express themselves throughout the show.

Doubling Down on Creative Expression

Speaking of expressing oneself, one of the most fun things about Roblox is outfitting your avatar and showing off your style. Meanwhile, one of the best parts of seeing a good show is hitting the merch store for a memento that expresses your fandom. Those who attend the Song Breaker Awards can do both by exploring an immersive shop filled with virtual Lizzo merch and gear based on Logitech products, like a Blue Yeti microphone arm or an Astro headset.

The approach to merch hearkens back to Logitech For Creators’ purpose in supporting digital content creators. “We thought around what would make the merch valuable to someone playing Roblox. The idea was to let people dress themselves like a walking influencer,” says Bieg. Some of the pieces, like a body suit in the shape of a mouse, are sillier—serving the community’s love for funniness and memes.

Inside the merch store within Roblox for the Song Breaker Awards

Of course, the headlining experience of the Song Breaker Awards is the show itself, as well as lifelike virtual performances by GAYLE and Lizzo using motion capture. Viewers will also get to see MeganPlays (known as “The Peachy Princess of Roblox”), musician and activist Jaden Smith and Twitch streamer Shroud throughout the event. But beware: we hear that a mischief maker may crash the party to throw things into disarray. Can the power of creativity and community set things right?

“From start to finish, the level of detail that went into designing a fully-immersive and interactive world inside Roblox is truly unmatched,” said Nick Cicchetti, Media.Monks Senior Producer. “Beyond that, the narrative and storytelling that ties the performances together with internet culture to bring everyone in the audience from passive viewer to active participant is something that can only be done in a new environment made possible by technology and creativity. I’m thrilled to push the boundaries of what’s possible in the metaverse to bring the Song Breaker Awards to life, and I can’t wait for everyone to experience it firsthand.” 

You can explore the Song Breaker Awards pre-show area in Roblox right now. Look forward to the main event on Saturday, April 30 at 10 a.m. PST with three additional screenings throughout the weekend.

Discover how we built the creator-centered Song Breaker Awards experience in Roblox, the first music award show in the metaverse.

Social Bites: Virtual Influencers Come Alive

4 min read

Written by
Monks

A barbie doll and a young girl stand side by side

Virtual influencers are followed by millions. Mimicking the roadmap of a real-life celebrity—albeit with a few advantages—they give music concerts, collaborate with brands and even entertain kids from their YouTube channels. They come in all shapes, sizes and forms—ranging from CGI models that mimic humans to cartoon-style characters—and they’ve become fundamental players in the multi-million dollar industry of influencer marketing.

While some may mistake them for another trend in the new era of virtualization, a mere by-product of the metaverse, or a new development to demonstrate the power of artificial intelligence, these virtual personalities have been around for a while. You may have heard about Hatsune Miku, for example, a virtual singer created by the music technology company Crypton Future Media, Inc. She was released to the world in 2007 and has since performed sold-out concerts worldwide—including venues in LA, Singapore and Tokyo. Or perhaps you remember Lightning, a character from the Final Fantasy franchise with whom Louis Vuitton partnered in 2016 to model their Series 4 collection.

Either way, the concept of a virtual celebrity is not new. Its widespread growth, however, may have been deferred by a lack of access to certain technologies like CGI, or the necessary computing power for people to interact with them, things that we now hold—quite literally—in the palm of our hands. Moreover, as virtual influencers became more realistic and our lives moved increasingly online, people began to form communities around them, thus spurring a new level of engagement.

In the latest edition of Social Bites, the Social Innovation Lab explores the opportunities that virtual influencers bring for brands today, as well as how they are challenging our concept of beauty, talent and creativity. You can find the issue of Social Bites here, and get into the swing of things with our key findings below.

A Perfect Fit for Transmedia Storytelling

At their core, virtual influencers are computer-generated characters that engage directly with an audience on social media, live-stream commerce, in video games, or even in mainstream media. They have one main purpose: to increase followers, engagement and conversion. 

That said, many characters who now operate as virtual influencers were not born digital. Barbie, the fashion doll who debuted in 1959 long before the social media era, now communicates with fans through a popular YouTube channel and Instagram account. Now a virtual influencer, she moves across platforms and formats as needed to show up for her community. 

Because they are so diverse in form, virtual influencers offer endless possibilities in transmedia storytelling. They can seamlessly transition between different virtual environments to tell a single story, all while remaining recognizable to audiences. These benefits apply to marketing campaigns as well: the presence of a virtual character representing a brand can feel authentic anywhere from the metaverse to social media. On occasion, they even outperform their real-life counterparts when it comes to engagement.

What’s more, virtual influencers are never confined to one place at a time. This great advantage extends to virtual versions of real-life celebrities. Last year, we worked with Pokémon and director Jason Zada to celebrate their 25th anniversary by hosting a computer-generated concert featuring Post Malone. The rapper performed for more than 10 million viewers on YouTube and Twitch, taking his fans on a journey “across the land”: a series of diverse biomes populated by Pokémon. We’re looking at a very scalable setup: in addition to virtual venues fitting more people than a physical stadium, it’s also possible to give the same concert several times, across multiple time zones.

Ethical Considerations for Working With Virtual Creators

Just like real-life influencers, their virtual counterparts are diverse in their personalities, but they all have one dangerous thing in common: they can be shaped into any form their creators desire. They can advance unattainable standards; they don’t grow old, they don’t get tired and they can change their looks to match new trends at a moment’s notice. And while it may be tempting to use these unique qualities to your advantage, upholding such standards runs counter to goals around diversity of representation. We recommend that brands working with virtual influencers focus on these matters as they would with their real-life counterparts.

Tech companies are working on making virtual influencers showcase a larger diversity of body types and flaws. It’s about ethics, but also relatability. After all, people need to be able to connect with a creator to be truly engaged. The good news is that we’re already seeing progress in this respect. Angie, who was named “the imperfect virtual influencer” by CNN, offers a refreshing alternative. With her creased makeup, faint acne scars and uneven skin, she is challenging beauty standards in China and beyond—showcasing her imperfections for the world to see. 

That is to say that if done right, virtual influencers have the potential to reshape digital culture and our ideals of beauty, coolness or even what it means to be human. Brands that lead this evolution in marketing can strengthen their bonds with consumers, but only as long as diverse creators are involved and provide the space for consumers to feel seen.

The Immediate Evolution of Virtual Influencers

While virtual influencers operate under no location or time zone constraints, it’s true that the APAC region is leading the way in facilitating real-time interaction between them and their followers. Dior, for instance, created a digital avatar of its regional ambassador, Chinese celebrity Angelababy. As reported by the South China Morning Post, Angela 3.0’s surprise appearance generated more than 90,000 Weibo interactions in two hours.

Meanwhile, the ecommerce giant Taobao developed a gamified community where users create and dress virtual avatars in real-world items available on the platform. These 3D avatars can interact with others, perform daily tasks and use virtual coins to purchase items. 

There’s clear evidence that these brands have found virtual influencers to be a great tool for further engaging their audiences, and there’s a lot the rest of the world can learn from these advancements in APAC. Virtual influencers are here to stay, and the doors of opportunity are wide open for brands to experiment in this space. For brands that feel working with a real-life influencer poses a risk, creating their own virtual influencer may be the perfect choice.

Looking for more social media insights? Tune in to our weekly Social Innovation Lab podcast to hear from the brightest minds in social and learn how to create winning social media campaigns.

Our Social Innovation Lab explores the opportunities virtual influencers bring for brands today and how they are challenging our concept of beauty, talent and creativity.
