Forrester Spotlights How to Power Up Production with Game Engine

4 min read

Written by Monks

With many consumers around the world still at home, brands’ need to show up for audiences digitally has never felt more urgent, particularly through immersive content and experiences that recapture some of what’s lost in interacting with a brand, loved ones or a product in person.

And with this need, a perhaps unlikely tool for marketers has emerged: game engines. The role of gaming in the marketing mix has also risen, providing new, engaging environments to meet consumers. Glu Mobile’s apartment decorating game Design Home, for example, lets players customize a home using real furniture from brands like West Elm and Pottery Barn—and is now even offering its own series of real-world products through its Design Home Inspired brand, effectively turning the game into a virtual retail showroom.

But game engines aren’t just for making content consumed in games. They also provide an environment for brands to build and develop 3D assets to use both internally and to power a variety of virtualized experiences. We recently announced receiving an Epic MegaGrant to automate virtual tabletop production using Unreal Engine, and we also use Unity—which powers 53% of the top 1,000 mobile games in the App Store and Google Play Store—to build mobile and WebGL experiences with consumer audiences in mind.

3D Adds a New Dimension to Content Production

In addition to serving increasingly digital user behavior, the use of 3D content has the potential to help brands build efficiency across the enterprise and customer decision journey. A new report from Forrester, “Scale Your Content Creation With 3D Modeling” by Ryan Skinner and Nick Barber, details how 3D content solves a critical challenge that brands face today: the need to keep up with demand for content while faced with dwindling budgets. “Particularly in e-commerce, marketers have learned they need product images to reflect every angle, every variation, and a multitude of contexts; without updated and detailed imagery, engagement and sales suffer,” the authors write.

Pick your flavor: game engines let brands tweak and change assets with ease and speed.

One solution mentioned in the report is building a CGI-powered production line. MediaMonks Founder Wesley ter Haar notes in the report that “This is very top of mind for us right now,” a sentiment that is reinforced by how we’re powering creative production at scale using tools like Unreal Engine and Unity, as mentioned above.

“What’s exciting about real-time 3D is that it ticks a bunch of boxes for a brand,” says Tim Dillon, SVP Growth at MediaMonks, whose primary focus is on our game engine-related work. “You can use it in product design, in your marketing, in retail innovation–it’s touching so many different end use cases for brands.” A CAD model used internally for product design, for example, could also be used in virtual tabletop photography, in a retail AR experience, in 3D display ads and more—reconfigured and recontextualized to accomplish several of a brand’s goals in producing content and building experiences.

When it comes to content, game engines make it easier for teams to create assets at scale through variations in lighting, environment or color—especially when augmented by machine learning and artificial intelligence. “When creating content in real time, brands not only make content faster but can react and adapt to consumer needs faster, too,” says Dillon. “Things like 3D variations, camera animation and pre-visualization become much faster to achieve—and in some cases more democratic too, by putting new 3D tools in our client teams’ hands to make these choices together,” he says.
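To make that variation fan-out concrete, here is a minimal, hypothetical sketch: one source asset enumerated across colorways, lighting presets and camera angles into a batch of render jobs. The RenderJob shape and all names are illustrative assumptions, not any engine's actual API.

```typescript
// Hypothetical sketch: fanning one source asset out into a batch of
// render variations. RenderJob and all names are illustrative.
interface RenderJob {
  asset: string;
  colorway: string;
  lightingPreset: string;
  cameraAngleDeg: number;
}

const colorways = ["oak", "walnut", "matte-black"];
const lightingPresets = ["studio", "golden-hour", "overcast"];
const cameraAngles = [0, 45, 90, 135, 180];

// Every combination of look, light and angle becomes one render job;
// a real-time engine makes each of these cheap to produce.
const jobs: RenderJob[] = [];
for (const colorway of colorways) {
  for (const lightingPreset of lightingPresets) {
    for (const cameraAngleDeg of cameraAngles) {
      jobs.push({ asset: "product_hero", colorway, lightingPreset, cameraAngleDeg });
    }
  }
}

console.log(`${jobs.length} renders from one source asset`); // 45
```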

The Genesis car configurator lets users view their customizations in real time.

A case in point is the car configurator we built in Unity for Genesis, covered in Unity’s recent report “25 Ways to Extend Real Time 3D Across Your Enterprise.” The web-based configurator offers a car customization experience as detailed and fluid as you’d find in a video game, letting consumers not only see what their custom model would look like with different features, but also under different environmental conditions like time of day—all in real time.
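The Genesis configurator itself was built in Unity; as a rough, web-flavored analogue of the same idea, the sketch below uses three.js to show how each user choice maps to a cheap in-place scene change (a material color, a light's intensity) that the next rendered frame picks up. The stand-in model, materials and day-curve math are assumptions, not the production implementation.

```typescript
import * as THREE from "three";

// Web-flavored sketch of a real-time configurator loop (not the actual
// Genesis build): user choices mutate materials and lighting in place,
// and the next rendered frame reflects them.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, 16 / 9, 0.1, 100);
camera.position.set(0, 1.5, 5);
const renderer = new THREE.WebGLRenderer({ antialias: true });

// A box stands in for the car model.
const bodyMaterial = new THREE.MeshStandardMaterial({ color: "#1b2a4a" });
const body = new THREE.Mesh(new THREE.BoxGeometry(4, 1.2, 1.8), bodyMaterial);
scene.add(body);

const sun = new THREE.DirectionalLight(0xffffff, 1);
scene.add(sun);

// Switching paint costs one material write, not one re-export.
function setPaint(hex: string) {
  bodyMaterial.color.set(hex);
}

// A crude day curve: the sun dims and warms toward the edges of the day.
function setTimeOfDay(hour: number) {
  const daylight = Math.max(Math.sin((Math.PI * hour) / 24), 0);
  sun.intensity = 0.2 + 0.8 * daylight;
  sun.color.setHSL(0.1, 0.5, 0.5 + 0.4 * daylight);
}

setPaint("#8c1d2c");
setTimeOfDay(18);
renderer.render(scene, camera);
```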

Making a Lasting Impact Through Immersive Moments

Through greater adoption of immersive storytelling technologies and ultra-fast 5G connection, we are entering a virtualized era capable of placing a persistent 3D layer across real-world environments—already made possible through Unity technology and cloud anchors by Google, which anchor augmented reality content to specific locations that people can interact with over time. Consider, for example, a virtual retail environment that never closes and provides personalized service to each customer.

These experiences have become all the more relevant with the pandemic. In the Forrester report mentioned above, ter Haar says: “With COVID, we’re seeing greater interest to demo in 3D. The tactile and physical nature of seeing something makes it easier to buy.”

But perhaps more important for brands is that immersive experiences have the power to create real, lasting memories—a focus of a recent talk by Quentin de la Martinière (Executive Producer, Extended Realities at MediaMonks) and Ron Lee (Technical Director at MediaMonks) at China Unity Tech Week.

Spacebuzz takes students on an out-of-this-world journey through VR.

“We start from the strategy of who you want to engage with,” says Lee, delving into the storytelling potential of 3D content. “From there, we try to understand the vision we want to build to grab the user’s attention and put them in the world.” This includes deciding on the best venue for a 3D experience: augmented reality, virtual reality, mixed reality or on the web? By making the right selection, brands can build experiences that explain while they entertain.

Lee and de la Martinière showed Spacebuzz as an example of how immersive experiences can have lasting impact. Through a 15-minute VR experience built in Unity, schoolchildren are taken to space where they experience the “overview effect”: a humbling shift in awareness of the earth after viewing it from a distance.

“The technology and the story bring together the vision and the message of the experience,” says Lee. “Building that immersive environment in Unity and translating this information on a deeper level creates real memories for the kids who engage with it.” Likewise, brands can leave a memorable mark on consumers through 3D content. “These extremely personalized experiences allow the brand to leave a deep impression on audiences and intensify brand favorability,” Lee says.

From streamlining production to powering experiences across a range of consumer touchpoints, the value of 3D content is building for brands. Working closely with the developers of leading game engines that enable these experiences, like Unity and Unreal Engine, we’re helping brands add an entirely new dimension to their content and storytelling for virtualized audiences.


Getting Our Hands Dirty with VR Hand Tracking

4 min read

Written by Labs.Monks

Engrossed in virtual reality, you’re surrounded by fantastic digital objects, each begging you to reach out and touch them. But most interaction in mainstream VR headsets has been limited to using a controller, which for some experiences presents a disconnect between what people feel in their hands and what they see on the screen—at least until recently.

Last month, Oculus released its Hand Tracking SDK for the Oculus Quest, allowing people to use their hands to navigate menus and applications that support the new SDK. While the update isn’t meant to replace controllers outright, it enhances users’ sense of presence within the virtual space by further blurring the barriers between real and virtual, presenting new creative opportunities for brands eager to offer assistive content in the emergent medium. “Tangibility in digital has always been equated to a click of a mouse or key, but now it’s becoming even more of a physical thing, more like a real experience,” says Geert Eichhorn, Innovation Director at MediaMonks.

This illusion of reality is intriguing for Seth van het Kaar, Unity Monk at MediaMonks. “One thing VR has shown through experience and research is that our eyes override our other senses,” he says. “So, if I appear to be putting my hand in a bucket of cold water in VR, I’ll get the placebo effect of it feeling cold. Through creativity, you can use that to your advantage.”


Exploring the creative opportunities presented by the SDK, van het Kaar served as developer on a team of Monks experimenting with hand tracking to develop a working prototype that could take best advantage of the new interface. Here’s what the team learned in the process.

Find Opportunities to Get Hands-On

“Similar to how the development of voice as an interface has prompted brands to emulate human conversation as naturally as possible, we need to make these experiences feel as intuitive as possible, as you’re using your real hands,” says Eichhorn. As part of MediaMonks Labs, our research and innovation team, he’s focused not on using the latest tech for the sake of it, but rather finding the real-world application and value that it has for end-users.

Trying to identify what type of experience would best benefit from this new input, the team wondered: what activities are highly dexterous and require careful use of one’s hands? Shaving made sense: “It’s something that’s difficult for young adults and teens who are just learning to use these devices,” says Eichhorn. “And a lot of people still get things wrong, like going against the grain.” It’s also an intriguing use case in that shaving requires an element of precision, putting the usability of hand tracking to the test.

Inspired by clay, the Monk head grows noodles of hair that you can shave and trim.

By practicing grooming in VR using one’s own hands, users would be able to try out different tools and techniques without worrying about messing up their own hair. So, the team took our bald monk mascot and blessed him with a head of hair, inviting Oculus Quest users to give him a shave and a trim in an experience inspired by the Play-Doh “Crazy Cuts” line of toys.

Start with Something Familiar

Interacting with one’s hands is incredibly intuitive; it’s one of the earliest ways that we engage with the world as infants. But that doesn’t mean any hand-tracking experience is inherently easier to use or design; experimenting with any new mode of interaction requires one to break free of any preconceived notions about design. In the case of hand tracking, how does one organize a series of options within an experience without the use of physical buttons (and in this case, no haptic feedback)?

To rise above the challenge, the team used common hand gestures as a starting point—for example, those used in rock/paper/scissors—to serve as an intuitive metaphor for interaction.  “The Oculus can track the difference between fingertips, so if I mimic scissors with them, that’s a funny interaction,” says van het Kaar. “In the app, you can select the scissors and now you’re like Edward Scissorhands,” a fictional film character whose hands made of scissors give him wild success as a hairstylist.
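The Hand Tracking SDK itself is consumed from a game engine rather than TypeScript, but an engine-agnostic sketch of such a gesture check might look like the following: a finger counts as extended when its tip sits far from its knuckle, and "scissors" means index and middle extended with ring and pinky curled. The joint names, finger length and thresholds are illustrative assumptions, not the SDK's actual API.

```typescript
// Engine-agnostic sketch of a "scissors" pose check; all names and
// thresholds are illustrative.
type Vec3 = { x: number; y: number; z: number };

interface Finger {
  tip: Vec3;
  knuckle: Vec3;
}

const dist = (a: Vec3, b: Vec3) =>
  Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);

// A finger counts as extended when its tip sits far from the knuckle
// relative to an assumed finger length.
function isExtended(f: Finger, fingerLengthMeters = 0.08): boolean {
  return dist(f.tip, f.knuckle) > 0.7 * fingerLengthMeters;
}

// Scissors: index and middle out, ring and pinky curled.
function isScissors(hand: Record<"index" | "middle" | "ring" | "pinky", Finger>): boolean {
  return (
    isExtended(hand.index) &&
    isExtended(hand.middle) &&
    !isExtended(hand.ring) &&
    !isExtended(hand.pinky)
  );
}
```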


Move Beyond Limitations and Creative Constraint

In its experiments with the SDK, the team settled on a couple of learnings that could apply to subsequent hand-activated Oculus Quest experiences. First, there’s the challenge felt in any VR environment: locomotion, or the relationship and (de)synchronization between one’s bodily movements and those of their virtual avatar.

Without haptic feedback, what should happen when the user’s hand comes in contact with a virtual object: should it move through the object, or should the object block their movement much like it would in reality? While the latter option might make sense on paper, the fact that users could still move their physical hand while the virtual one stays stationary could result in confusion. The team moved beyond the challenge by letting users push virtual objects freely—for example, the monk model that they shave—which snap back into place once released (which sounds like a fun interaction in its own right).
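A hedged sketch of that push-and-snap-back behavior, independent of any particular engine: while a hand touches the object it follows the hand, and once released it eases back to its rest pose each frame. The vector type and easing constant are assumptions for illustration.

```typescript
// Illustrative sketch of push-and-snap-back, independent of any engine.
type V3 = [number, number, number];

const lerp = (a: V3, b: V3, t: number): V3 =>
  [a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t, a[2] + (b[2] - a[2]) * t];

class PushableObject {
  // Hand position while the object is being pushed, else null.
  touchingHand: V3 | null = null;

  constructor(public position: V3, private restPosition: V3) {}

  // Called once per frame with the elapsed time in seconds.
  update(dt: number) {
    if (this.touchingHand) {
      this.position = this.touchingHand; // follow the pushing hand
    } else {
      // Frame-rate-independent ease back to the rest pose.
      this.position = lerp(this.position, this.restPosition, 1 - Math.exp(-4 * dt));
    }
  }
}
```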


The way the Hand Tracking SDK detects hands also presented a challenge: it seeks out the shape of a hand against a background, so it loses tracking once two hands overlap. “You can’t place a menu on the palm of your hand and tap an option on it, or interact with a virtual object on your wrist, for example,” says van het Kaar. To work around this, the team made the menu float beside the user’s hand. While this setup forgoes the pseudo-haptic feedback of tapping options against one’s own body, it mitigates the risk of losing tracking when hands overlap.
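That floating-menu workaround reduces to a small bit of per-frame math: offset the menu from the palm along the camera's right vector, flipped per hand so it always sits outward, away from where the other hand would cross and break tracking. A minimal sketch, with illustrative types and an assumed 15 cm offset:

```typescript
// Illustrative sketch of the floating-menu placement; vector types and
// the offset distance are assumptions.
type Vec = { x: number; y: number; z: number };

const add = (a: Vec, b: Vec): Vec => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });
const scale = (v: Vec, s: number): Vec => ({ x: v.x * s, y: v.y * s, z: v.z * s });

// Offset the menu along the camera's right vector, flipped per hand so it
// always sits outward, clear of where the other hand would reach.
function menuPosition(palm: Vec, cameraRight: Vec, isLeftHand: boolean): Vec {
  const offsetMeters = isLeftHand ? -0.15 : 0.15;
  return add(palm, scale(cameraRight, offsetMeters));
}
```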

Taking the time to experiment and apply these learnings allows us to develop increasingly realistic experiences in extended reality. From playing with hand tracking in VR to demonstrating how occlusion transforms experiences in AR, our team of makers is devoted to continually experimenting with new technologies, finding their most relevant use cases and establishing best practices for brands and our partners. As barriers continue to break down between the physical and virtual, it will be exciting to see what kinds of wholly new digital experiences emerge.


MWC Los Angeles Rolls Out the Red Carpet for 5G’s Arrival

5 min read

Written by Monks

Mobile World Congress came to Los Angeles this week, gathering the wireless industry: network operators, software companies, manufacturers and creative partners. The focus of this year’s event was intelligent connectivity, and how 5G is set to bridge several innovations (like big data, AI, the internet of things and extended reality) to reinvent the way we interact with content and each other, both in our professional and daily lives.

In his keynote kicking off the event, GSMA Director General Mats Granryd identified several real-world impacts that 5G will offer. Mental health practitioners could provide at-home therapy to lonely patients via hologram, for example; colleagues could collaborate better in real time from across the world; and students could “literally carry your classroom in your pocket” with experiences that make a greater impact than a simple video recording of a lesson.

It’s the Year 5G Finally Gets Real

At a gathering of so many innovators and mobile operators, you get the sense that anticipation for “what’s next” is high. Technologists have waited years for 5G to grow out of its status as a buzzword and into an actual offering. With its rollout to select cities in the US, the promise of the ultra-fast connection is almost upon us, and a sense of excitement permeated the conference. In conversation with Meredith Atwell Baker (President and CEO, CTIA), Ken Meyers (President and CEO, US Cellular) contrasted this attitude with the jump from 3G to 4G. “We didn’t sit back and think, ‘Oh, look at the app-based economy right in front of us,’” he said.


But that’s what’s happening now. On the panel “New Marketing Strategies: How to Make Money with XR,” RYOT Head of Content Nigel Tierney mentioned how even with 5G on the horizon, there are still limitations to solve: “We’re at the crux of unlocking possibilities.” Silkie Meixner, Partner, Digital Business Strategy at IBM, likewise mentioned how the firm is working now to help clients prepare for a future that’s ripe with opportunity and is set to change the way they work. “There are already things we have to do with our clients to think 5 years out,” she said. “You have to take 5G as a given.”

Buying in on Big Bandwidth

So, what does the five-year 5G plan look like? The simplest way to envision a 5G-infused future is to consider the significant boost in bandwidth it will provide: speeds up to 100x faster than 4G, which itself made a significant impact on services like streaming music and video years ago. And what 4G connectivity did for video, 5G could do for emerging media, including cloud-based gaming (like Google’s upcoming Stadia gaming service) or streamable AR and VR.


In his keynote presentation, Viacom CEO Bob Bakish discussed how the multinational entertainment conglomerate is looking forward to a near-future of premium video content, enhanced with the power of 5G and integrated with related media and platforms. This would usher in an “era of advanced video experiences that will truly allow mobile to distinguish itself as an entertainment medium.” It would also encourage brands and content creators to consider the many contexts in which audiences will connect to their content: on a smartphone, in a driverless car or somewhere else.

Bakish mentioned how developing such content through new partnerships would help network operators differentiate themselves by leveraging their partners’ IP. We saw the strategy in action at this year’s Comic Con, where we helped AT&T launch a VR experience that let Batman fans fly through Gotham City. The experience’s presence at the conference instilled trust in AT&T’s ability, following its acquisition of Time Warner and its DC Comics IP, to deliver the kind of content audiences desire.

Extending Extended Reality Even Further

While the Batman experience was site-specific, 5G offers the opportunity to enhance and scale up such experiences for mass audiences. One of the biggest challenges affecting AR and VR right now is that they’re not easily streamable; users must discover and download applications for fully featured experiences, which is partly why the much more limited (yet accessible) camera filter has risen as the technology’s most popular and ubiquitous use.

Managing Director of MediaMonks LA, Olivier Koelemij (right), sat on the panel to discuss the opportunities that 5G offers to extended reality.

But 5G can do away with those constraints. “More bandwidth means we can be more ambitious and artistic with the content we create,” says Olivier Koelemij, Managing Director of MediaMonks LA, who sat on the same panel. “A better, more immersive story means our strategies to amplify it will become more ambitious in lockstep.”

This means there’s opportunity for brands to relate to audiences through more sophisticated, shareable digital experiences. Tierney attributed the failures of past experiences to poor storytelling and a lack of meaning, citing a need for brands to integrate personalization and data into the creative process—in short, to be more purposeful in their ideation and delivery so they provide resonant interactive experiences.

“We don’t suggest a technological approach because it’s the hot trend,” says Koelemij. “Our content and technology must be fit for format, purpose and consumer.” He suggests viewing any creative problem through a pragmatic lens. “You should ask questions like: is extended reality helping us deliver a stronger message here? How can we integrate other digital elements to do this?” The goal is to home in on the right approach for your business goals through data and KPIs, ultimately delivering an experience that resonates with consumers.


Meixner described how IBM employed such a strategy to develop a VR-enabled training experience. The B2B solution not only makes training faster and scalable by teaching trainees skills that they’d otherwise gain in a classroom—it also collects data through interactions like motion analysis, which could be used to optimize the tool or develop new ones. The strategy shows how innovations can be developed and optimized through practical, real-world data that empowers and educates.

It’s clear from this year’s conference that 5G isn’t just about connecting people to friends or family via a wireless handset. It’s about truly integrating all of the devices and touchpoints we interact with each day, enabling truly transformative new interactions. As the technology begins to roll out, brands must be prepared to adopt it with a sense of purpose to offer audiences meaningful, impactful and differentiated experiences.


What We Learned from Demoing Google’s New Depth API

4 min read

Written by Labs.Monks

Get ready for an upgrade: in early December, Google revealed its Depth API, a new ARCore capability that allows virtual objects and real-world environments to play nicer together, making for more convincing and immersive mixed reality experiences. One demonstrable way Depth API achieves this is by enabling occlusion: the illusion of virtual objects being obscured behind real-world ones.

Convincing occlusion has historically been difficult to achieve, though Google has put together a video portraying demos of the new API that show off its features. One of those demos, which challenges the user to a virtual food fight against a levitating robot chef, was developed in collaboration with MediaMonks.

What’s exciting about Depth API is its ability to understand the user’s surroundings with unprecedented speed and ease. “The API’s depth map is updated in real time, allowing AR apps to be aware of surfaces without complex scanning steps,” says Samuel Snider-Held, Creative Technologist at MediaMonks. This enables not only occlusion, as mentioned above, but also real-time physics against real-world surfaces. For our virtual food fight against the AR-rendered robot, missing is part of the fun; users can take delight in the digital splatters of food on the objects around them without worrying about cleanup.

The Building Blocks to More Immersive AR

How does Depth API work, and what sets it apart from other methods of occlusion? “The Depth API uses an approach called ‘depth from motion,’ in which ARCore determines distances to objects by detecting variances between image frames while the camera is moving,” says Snider-Held. “The result is a high-resolution depth map that is updated in real time, allowing the device to better understand where objects are in relation to one another and how far away they are from the user.”
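In a shipping ARCore app this comparison happens per fragment on the GPU, but the core occlusion test a depth map enables is simple enough to sketch on the CPU: a virtual point is hidden whenever the real surface at its pixel is closer to the camera. The DepthMap shape and bias value below are illustrative assumptions, not ARCore's API.

```typescript
// CPU-side illustration of the per-pixel occlusion test; a production app
// would do this in a shader.
interface DepthMap {
  width: number;
  height: number;
  metersAt(x: number, y: number): number; // distance to the real surface at a pixel
}

// A virtual point is hidden when the real-world surface at its pixel is
// closer to the camera; a small bias absorbs depth-estimation noise.
function isOccluded(
  depth: DepthMap,
  x: number,
  y: number,
  virtualDepthMeters: number
): boolean {
  const biasMeters = 0.03; // ~3 cm tolerance
  return depth.metersAt(x, y) + biasMeters < virtualDepthMeters;
}
```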

Depth API is software-based, requiring no new hardware for users with ARCore-enabled devices once it releases publicly. While sufficient occlusion significantly increases the verisimilitude of virtual objects, it follows a series of incremental updates that build on one another to allow for more realistic immersive experiences. Just last year—the same year ARCore debuted—Google released its Lighting Estimation API, which lights virtual objects to match the existing lighting conditions in the real-world setting, including light reflections, shadows, shading and more.


Since then, Google has added Cloud Anchors, a feature that allows multiple users to view the same virtual objects anchored in a specific environment. It’s the key feature powering the multiplayer mode of Pharos AR, an augmented reality experience we made in collaboration with Childish Gambino, Wolf + Rothstein, Google and Unity—which itself served as a de facto demo of what Cloud Anchors are capable of in activating entirely new mixed reality experiences.

“We have the creative and technical know-how to use these new technologies, understand why they’re important and why they’re awesome,” says Snider-Held. “We’re not scared to take on tech that’s still in its infancy, and we can do it with a quick turnaround with the backing of our creative team.”

A Streamlined Way to Map Depth

Depth API wasn’t MediaMonks’ first experiment with occlusion or spatial awareness in augmented reality. Previously, we got to experiment with other contemporary solutions for occlusion, like 6D.ai, which creates an invisible 3D mesh of an environment. The result of this method is similar to what’s achieved with Depth API, but the execution is different: translating an environment into a 3D mesh with 6D.ai is fastest with multiple cameras, whereas Depth API simply measures depth in real time without the need to scan and reconstruct an entire environment.

Similarly, Tango—Google’s skunkworks project that served as a sort of precursor to ARCore—enabled spatial awareness through point clouds. “When we had Tango before, it used something similar to a Kinect depth sensor,” says Snider-Held. “You’d take the point clouds you’d get from that and reconstruct the depth, but the new Depth API uses just a single camera.”


In essence, achieving occlusion with a single camera scanning the environment in real time offers a leap in user-friendliness, and makes it widely available to users on their current mobile device. “If we can occlude correctly, it makes it feel more cemented to the real world. The way that they’re doing it is interesting, with a single camera,” says Snider-Held.

Adding Depth to Creative Experiences

Depth API is currently open to collaborators by invitation and isn’t yet ready for public release, but it marks a great step toward rendering more believable scenes in real time. “It’s another stepping stone to reach the types of AR experiences that we’re imagining,” says Snider-Held. “We can make these projects without caveats.”

For example, a consistent challenge in rendering scenes in AR is that many users simply don’t have large enough living spaces to render large objects or expansive virtual spaces. Creative teams would get around this by rendering objects in miniature—perhaps just contained to a tabletop. “With Depth API, we can choose to only render objects within the available space,” says Snider-Held. “It lets us and our clients feel more comfortable in making these more immersive experiences.”
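A hedged sketch of that "render only what fits" idea: probe the depth map for the nearest real surface in front of the user, then scale the scene so it stays inside the available room. The DepthMap interface repeats the illustrative shape from the earlier sketch; the sampling pattern is an assumption.

```typescript
// Sketch of "render only what fits", with the same illustrative DepthMap
// shape as the earlier occlusion example.
interface DepthMap {
  width: number;
  height: number;
  metersAt(x: number, y: number): number;
}

// Probe a horizontal band across the middle of the frame for the nearest
// real surface.
function availableDepthMeters(depth: DepthMap, samples = 9): number {
  let nearest = Infinity;
  const y = Math.floor(depth.height / 2);
  for (let i = 0; i < samples; i++) {
    const x = Math.floor(((i + 0.5) / samples) * depth.width);
    nearest = Math.min(nearest, depth.metersAt(x, y));
  }
  return nearest;
}

// If the scene wants 3 m of depth but the room offers 1.5 m, render at half scale.
function fitScale(desiredDepthMeters: number, depth: DepthMap): number {
  return Math.min(1, availableDepthMeters(depth) / desiredDepthMeters);
}
```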

As brands anticipate how they might use some of the newest features of fast-evolving mixed reality technology, they stand to benefit from a creative and production partner that can bring ideas to the table, quickly implementing them with awareness of the current opportunities and challenges. “We bring creative thinking to the technology, with what we can do given our technical expertise but also with things like concept art, animation and more,” says Snider-Held. “We don’t shy away from new tech, and not only do we understand it, but we can truly make something fun and inventive to demonstrate why people would want it.”


Looking Back at 2019 and the Dawn of a New Era

4 min read

Written by Monks

The decade is drawing quickly to a close, and it’s been a wild ride. From new technologies to new members of our family (we welcomed BizTech, IMA, Firewood Marketing and WhiteBalance this year), 2019 presented us with a lot of thrilling changes—and some exciting opportunities as we enter a new era. Looking back, we polled managing directors from our offices around the world for their favorite trends and technologies that have emerged in the past year—and what they’re looking forward to next.

Extended Reality Gets Real

Interest in mixed and extended reality (the combination of real and virtual objects or environments, like augmented or virtual reality, enabled by mobile or wearable devices) has been growing. At the same time, mixed reality has made strides in maturity over the past year, like Google’s efforts in making virtual objects feel truly anchored to the environment with occlusion, in which virtual objects are responsive to their surrounding environment—for example, disappearing behind real-world objects.

For Martin Verdult, Managing Director at MediaMonks London, extended reality is among the innovations he’s become most excited about going into 2020, and not just for the entertainment potential: “Virtual and augmented reality will become increasingly prevalent for training and simulation, as well as offering new ways to interact with customers.” For example, our Spacebuzz virtual reality experience gives children a unique look at the earth and environment they may typically take for granted, using the power of immersive tech to leave an indelible mark.


As the technology that powers extended reality matures, so will its potential use cases. But when a technology is still evolving significantly in short time, it can be difficult for brands to translate their ideas or goals into clear, value-added extended reality experiences. “We have introduced creative sprints for our core clients to get these ideas in a free flow,” says Verdult.

Among Verdult’s favorite examples of augmented reality projects MediaMonks has worked on this year is Unilever’s Little Brush Big Brush, which uses whimsical, virtual animal masks to teach children proper brushing habits and turn a chore into playtime. Similarly, extended reality can bring products to life in an engaging way—or if used in a customer’s research phase, it can help customers interact with a product with minimal (or no) dedicated retail shelf space.


Part of Little Brush Big Brush’s charm is that it extends beyond AR alone, connecting to a web cartoon series and a Facebook Messenger chatbot that rewards kids with stickers at key milestones. “Value comes from connecting an IP to a brand through a deeply engaging hyper reality experience,” says Olivier Koelemij, Managing Director at MediaMonks LA. “One that only a well-executed integrated production can offer, combining digital and physical in new and extraordinary ways.”

AI/Machine Learning Grows Up

One can’t reflect on past innovations and look to the future without mentioning artificial intelligence and machine learning. From programmatic delivery to enabling entirely new creative experiences—like matured extended reality powered by computer vision—to connecting cohesive experiences across the Internet of Things, artificial intelligence “will change our interaction with technology in ways we can’t imagine yet,” says Sander van der Vegte, Head of MediaMonks Labs, our research and development team that continually experiments with innovation.

The most creatively inspiring uses of AI are the ones that will help us understand the world and our fellow humans. In collaboration with Charité, for example, we programmed a 3D printer to exhibit common symptoms of Parkinson’s disease and its effect on motor skills. The result is a series of surreal art-objects that make real patients’ experiences tangible for the general population.


Social Content and Activations Build Impact

Ask Sicco Wegerif (Managing Director at MediaMonks Amsterdam) what struck him this year, and he’ll tell you it’s the elevation of social content in purchasing—for example, how Instagram made influencer posts shoppable early this year. Wegerif notes that about a quarter of consumers have made a purchase on social media, signaling new opportunities for brands to build connections with consumers.

“Looking at this from an integrated and smart production perspective, we can help brands create so many assets and storylines that tap into this trend, especially when combining this with data so we can be super personal and relevant.” When social media is prioritized early in the creative and planning process, it can enable more meaningful experiences.

For example, our “People are the Places” activation for Aeromexico used Facebook content to transform the way users discover destinations around the world. Instead of researching and booking a city, users get to learn about people around the world—then purchase a ticket to where they call home. The social content enriches the journey and builds emotion into the experience. “It’s in essence a very simple thought that can change the whole CX,” says Wegerif.

Social Activations and Digital Experiences Weave Together

Speaking of social media, it can become a powerful tool to build relevance and connection with experiential. Jason Prohaska, Managing Director at MediaMonks NY, says: “Experience and social work hand-in-hand as part of the digital plan for many brands, and are no longer below the priority line.” With live experiential—which elevates the role of the online audience to interact, take part in and build buzz around experiences—brands can achieve greater strategic impact in how they build connection with their consumers.

But doing so successfully requires a confluence of data, influencers, experiential storytelling and production. The future of this looks good to Prohaska. “We expect 2020 to deliver several use case scenarios at scale for brand identity that may set benchmarks for personalization, automation, customer journey optimization, efficacy, performance and engagement.”

Koelemij looks forward to stronger investment in digital and consumer understanding as brands begin to integrate experiences even further going into 2020. “With most good work, success and performance can now be better attributed to digital as we get more advanced in understanding what success looks like,” he says, “especially in how we can measure it across blended activations.”

And that’s exactly how we’d like to spend 2020: helping brands achieve their goals with data-backed, insights-driven creative across the customer decision journey. Through added capabilities thanks to companies like WhiteBalance, Firewood, BizTech and IMA joining the S4Capital family in 2019, we achieve this by greatly prioritizing and enhancing key elements of the marketing mix for daring brands—and as we reflect on the past year, we can’t wait to see what’s next.


Creative Considerations for Extended Reality Experiences

4 min read

Written by Monks

This week, Fast Company launched its annual Innovation Festival, featuring Fast Track sessions that take attendees behind the scenes and into the homes of some of the most innovative companies in New York City. Billed as “Fast Company’s unique take on the field trip,” Fast Track sessions engage brands and creatives through hands-on talks and experiences hosted by participating companies, including MediaMonks.

Our New York office opened the doors to the dojo, inviting brands into our home to discuss all things extended reality. In a panel session devoted to augmented reality and its application to music and entertainment, our Monks dove deep into the design and development process of Pharos AR—a mobile AR experience made in collaboration with Childish Gambino, Wieden+Kennedy, Unity and Google. Taking users on a cosmic journey set against an exclusive track from Childish Gambino, the app is notable for being the first multiplayer music video.

With a panel including Samuel Snider-Held (Creative Technologist, MediaMonks), Thomas Prevot (Senior Producer, MediaMonks) and Tony Parisi (Head of VR/AR Brand Solutions, Unity), the session served as a casual fireside chat. The conversation kicked off by establishing the state of VR and AR, often characterized by the conflicting feelings that VR is dead and that AR’s clear use case hasn’t yet been found. But both technologies are well established, each excelling at different goals within different environments.

Figures from cave paintings spring to life in Childish Gambino's trademark neon aesthetic in the environment around the user in Pharos AR.

Showcasing our Batman experience as a strong example of the immersive powers of VR, Snider-Held noted that “These experiences are still very installation-based,” and that AR’s distribution through mobile offers the potential for greater reach with a simpler experience. In explaining the process of developing Pharos AR in particular, the group explored key challenges to consider when developing an extended reality experience.

AR Can Feel Real Without Being Photoreal

Constraint prompts creativity—an adage that applies just as well to AR as any other medium for art making. Because mobile AR experiences are designed for use across a variety of devices, they must be relatively lightweight to provide a smooth experience to the widest share of users. Failure to keep technical constraints at top of mind can instead result in a lagging, stuttering experience that breaks immersion.

While this is true for any digital experience, it’s especially true for AR, a medium which Parisi says aims to “intelligently interact with the real world.” This expectation to seamlessly integrate with the surrounding environment can make stuttering graphics stick out like a sore thumb. “You want to keep the frame rate above 30 frames per second,” says Snider-Held, “because the user will compare the motion on the screen with the action happening around them in reality.”
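One common way to honor that 30 fps budget, sketched below under assumptions (the engine.setRenderScale hook is hypothetical): watch the measured frame time and shed render resolution before the experience starts dropping frames, recovering quality slowly when there's headroom.

```typescript
// Sketch of keeping the 30 fps budget by trading resolution for frame time.
const TARGET_MS = 1000 / 30;

let renderScale = 1.0;
let lastTimestamp = performance.now();

function onFrame(now: number) {
  const frameMs = now - lastTimestamp;
  lastTimestamp = now;

  if (frameMs > TARGET_MS * 1.1 && renderScale > 0.5) {
    renderScale -= 0.05; // shed pixels before shedding frames
  } else if (frameMs < TARGET_MS * 0.8 && renderScale < 1.0) {
    renderScale += 0.01; // recover quality slowly when there's headroom
  }
  // engine.setRenderScale(renderScale); // hypothetical engine hook

  requestAnimationFrame(onFrame);
}
requestAnimationFrame(onFrame);
```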


The trio took this challenge as an opportunity to discuss the highly stylized look achieved with Pharos AR. While photorealistic graphics might be impractical for a mobile device to render in real time, a stylized look presents the opportunity to differentiate your experience through a strategic choice of aesthetic; for Pharos AR, the team took visual inspiration from Childish Gambino’s laser-punctuated stage shows, ensuring the app’s look and feel naturally integrated with the rest of the artist’s visual brand.

“Stylistically, we were trying to remain within the constraints of mobile processing in a visually appealing way,” said Snider-Held. An example of this is the use of particle effects, in which sparkles of light coalesce into a ghostly image of Childish Gambino as he dances to the music, animated via motion capture. “This is the best example on why you don’t need to do photorealism,” Parisi said. “We were able to capture the essence of Donald, because it’s his dance.”

Carefully Plan the Narrative Environment

Extended reality experiences are interactive by nature, meaning they rely on a different approach than how you would plot out and plan more linear experiences. There’s a careful balancing act between giving users the reins to explore on their own versus stringing them along a narrative thread. MediaMonks Founder and COO Wesley ter Haar notes that “Narrative UI is key for onboarding and guiding the user in an AR experience,” making it incredibly important to plan out users’ interactions and use environmental cues to shape a narrative.

While Pharos AR begins and ends through open-ended user interaction, it still follows a clear narrative through the virtual performance of Childish Gambino’s single “Algorhythm.” In exploring the primary path through the experience, the team began planning it in storyboard form, much like you would for a film. “This process not only serves as visual research, but also in briefing the animation team and envisioning how actions will play out in the user’s environment,” said Snider-Held.


That makes sense—but what makes storyboarding for AR or VR different from other forms of digital storytelling? Because users could play with the app in their homes, the team had to plan for other variables: how large the virtual scene should be, and what actions would be possible for multiple users within a smaller environment—like the cramped living room of a New York apartment, an example the Fast Track attendees could relate to. The challenge demonstrates how important it is to map out the virtual scene for different scenarios and users.

An interesting insight uncovered in the panel was that the team didn’t just rely on visual planning methods like maps and storyboards. Because the background music builds up as users explore the space around them, the team also developed a musical timeline that maps how different interactions trigger the layering of the music. The step showcases how sensorial, environmental cues can shape the action within an immersive extended reality experience.
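Pharos AR was built in Unity, but the layering idea translates to a short web-flavored sketch using the Web Audio API: every stem starts playing in sync at zero volume, and each exploration milestone ramps one layer up. The stem URLs and milestone names are invented for illustration.

```typescript
// Web Audio sketch of an interaction-driven musical timeline.
// Assumes an ES module context (top-level await).
const ctx = new AudioContext();

async function loadStem(url: string): Promise<GainNode> {
  const buffer = await ctx.decodeAudioData(await (await fetch(url)).arrayBuffer());
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.loop = true;
  const gain = ctx.createGain();
  gain.gain.value = 0; // every layer starts silent but in sync
  source.connect(gain).connect(ctx.destination);
  source.start();
  return gain;
}

const stems = {
  pads: await loadStem("/audio/pads.ogg"),
  drums: await loadStem("/audio/drums.ogg"),
  lead: await loadStem("/audio/lead.ogg"),
};

// Called when the user reaches a point of interest in the AR scene; a
// two-second ramp keeps the build-up musical rather than abrupt.
function unlockLayer(name: keyof typeof stems) {
  const gain = stems[name].gain;
  gain.setValueAtTime(gain.value, ctx.currentTime);
  gain.linearRampToValueAtTime(1, ctx.currentTime + 2);
}
```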

Whether developing for VR or AR, extended reality experiences require developers to rethink the creative approach beyond the standard linear story. From considerations of setting, technical constraints and variations in the number of users across platforms, extended reality development relies on a comprehensive understanding of the building blocks that make up a total user experience. Snider-Held capped off the session with an ambition for what MediaMonks aims to achieve with brands through such experiences: “We strive to further the use of the technology from impossible to probable, and experiment in how to further that, too.”

