MWC Los Angeles Rolls Out the Red Carpet for 5G’s Arrival

5 min read

Written by
Monks

The Mobile World Congress took to Los Angeles this week, gathering together the wireless industry: network operators, software companies, manufacturers and creative partners. The focus of this year’s event was intelligent connectivity, and how 5G is set to bridge several innovations (like big data, AI, the internet of things and extended reality) to reinvent the way we interact with content and each other, both in our professional and daily lives.

In his keynote kicking off the event, GSMA Director General Mats Granryd identified several real-world impacts that 5G will offer. Mental health practitioners could provide at-home therapy to lonely patients via hologram, for example; colleagues could better collaborate with one another in real time from across the world, and students could “literally carry your classroom in your pocket” with experiences that make a greater impact than a simple video recording of a lesson.

It’s the Year 5G Finally Gets Real

At a gathering of so many innovators and mobile operators, you get the sense that anticipation for “what’s next” is high. Technologists have waited years for 5G to grow out of its status as a buzzword and into an actual offering. With its rollout to select cities in the US, the promise of the ultra-fast connection is almost upon us, and a sense of excitement permeated the conference. In conversation with Meredith Atwell Baker (President and CEO, CTIA), Ken Meyers (President and CEO, US Cellular) contrasted this attitude with the jump from 3G to 4G. “We didn’t sit back and think, ‘Oh, look at the app-based economy right in front of us,’” he said.

Monk Thoughts There are already things we have to do with our clients to think 5 years out. You have to take 5G as a given.

But that’s what’s happening now. On the panel “New Marketing Strategies: How to Make Money with XR,” RYOT Head of Content Nigel Tierney mentioned how even with 5G on the horizon, there are still limitations to solve: “We’re at the crux of unlocking possibilities.” Silkie Meixner, Partner, Digital Business Strategy at IBM, likewise mentioned how the firm is working now to help clients prepare for a future that’s ripe with opportunity and is set to change the way they work. “There are already things we have to do with our clients to think 5 years out,” she said. “You have to take 5G as a given.”

Buying in on Big Bandwidth

So, what does the 5-year, 5G plan look like? The simplest way to envision a 5G-infused future is to consider the significant boost in bandwidth it will provide: it can reach speeds up to 100x faster than 4G, which itself made a significant impact on services like streaming music and video. And what 4G connectivity has done for video, 5G could do for emerging media, including cloud-based gaming (like Google’s upcoming Stadia gaming service) or streamable AR and VR.
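To make that bandwidth jump concrete, here is a minimal back-of-the-envelope sketch. The throughput figures are illustrative assumptions (roughly 50 Mbps for a good 4G connection, and the article's "up to 100x" multiplier for 5G), not measured numbers:

```python
def download_seconds(size_mb: float, throughput_mbps: float) -> float:
    """Time to transfer size_mb megabytes at throughput_mbps megabits/s."""
    return size_mb * 8 / throughput_mbps

# Assumed illustrative figures: a 500 MB streamable AR/VR asset,
# ~50 Mbps on 4G vs. ~100x that on 5G.
asset_mb = 500
print(download_seconds(asset_mb, 50))    # 80.0 seconds on 4G
print(download_seconds(asset_mb, 5000))  # 0.8 seconds on 5G
```

The same asset that takes over a minute on 4G arrives almost instantly, which is why streamable (rather than pre-downloaded) AR and VR becomes plausible.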

Monk Thoughts 5G innovation opens up an “era of advanced video experiences that will truly allow mobile to distinguish itself as an entertainment medium.”

In his keynote presentation, Viacom CEO Bob Bakish discussed how the multinational entertainment conglomerate is looking forward to a near-future of premium video content, enhanced with the power of 5G and integrated with related media and platforms. This would usher in an “era of advanced video experiences that will truly allow mobile to distinguish itself as an entertainment medium.” It would also encourage brands and content creators to consider the many contexts in which audiences will connect to their content: on a smartphone, in a driverless car or somewhere else.

Bakish mentioned how developing such content through new partnerships would help network operators differentiate themselves by leveraging their partners’ IP. We saw the strategy in action at this year’s Comic Con, where we helped AT&T launch a VR experience that let Batman fans fly through Gotham City. The experience’s presence at the conference instilled trust in AT&T’s ability to output the kind of content that audiences desire after its acquisition of Time Warner and DC Comics IP.

Extending Extended Reality Even Further

While the Batman experience was site-specific, 5G offers opportunity to enhance and scale up such experiences for mass audiences. One of the biggest challenges affecting AR and VR right now is that they’re not easily streamable; users must discover and download applications for fully-featured experiences, which is partly why the much more limited (yet accessible) camera filter has risen as the most popular and ubiquitous use of the technology.

Managing Director of MediaMonks LA, Olivier Koelemij (right), sat on the panel to discuss the opportunities that 5G offers to extended reality.

But 5G can do away with those constraints. “More bandwidth means we can be more ambitious and artistic with the content we create,” says Olivier Koelemij, Managing Director of MediaMonks LA, who sat on the same panel. “A better, more immersive story means our strategies to amplify it will become more ambitious in lockstep.”

This means there’s opportunity for brands to relate with audiences through more sophisticated, shareable digital experiences. Tierney attributes failed experiences to poor storytelling and a lack of meaningfulness, citing a need for brands to integrate personalization and data into the creative process—in short, brands need to be more purposeful in their ideation and delivery to provide resonant interactive experiences.

“We don’t suggest a technological approach because it’s the hot trend,” says Koelemij. “Our content and technology must be fit for format, purpose and consumer.” He suggests viewing any creative problem through a pragmatic lens. “You should ask questions like: is extended reality helping us deliver a stronger message here? How can we integrate other digital elements to do this?” The goal is to home in on the right approach for your business goals through data and KPIs, ultimately delivering an experience that resonates with consumers.

Monk Thoughts Our content and technology must be fit for format, purpose and consumer.

Meixner described how IBM employed such a strategy to develop a VR-enabled training experience. The B2B solution not only makes training faster and scalable by teaching trainees skills that they’d otherwise gain in a classroom—it also collects data through interactions like motion analysis, which could be used to optimize the tool or develop new ones. The strategy shows how innovations can be developed and optimized through practical, real-world data that empowers and educates.

It’s clear from this year’s conference that 5G isn’t just about connecting people to friends or family via a wireless handset. It’s about truly integrating all of the devices and touchpoints we interact with each day, enabling truly transformative new interactions. As the technology begins to roll out, brands must be prepared to adopt it with a sense of purpose to offer audiences meaningful, impactful and differentiated experiences.


What We Learned from Demoing Google’s New Depth API

4 min read

Written by
Labs.Monks

Get ready for an upgrade: in early December, Google revealed its Depth API, a new functionality coming to ARCore that lets virtual objects and real-world environments play more nicely together, making mixed reality experiences more convincing and immersive. A demonstrable way Depth API achieves this is by enabling occlusion: the illusion of virtual objects becoming obstructed behind real-world ones.

Convincing occlusion has historically been difficult to achieve, though Google has put together a video portraying demos of the new API that show off its features. One of those demos, which challenges the user to a virtual food fight against a levitating robot chef, was developed in collaboration with MediaMonks.

What’s exciting about Depth API is its ability to understand the user’s surroundings with unprecedented speed and ease. “The API’s depth map is updated in real time, allowing AR apps to be aware of surfaces without complex scanning steps,” says Samuel Snider-Held, Creative Technologist at MediaMonks. This enables not only occlusion as mentioned above, but also the mimicry of real-time physics. For our virtual food fight against the AR-rendered robot, missing is part of the fun; users can take delight in the digital splatters of food on the objects around them without worrying about cleanup.

The Building Blocks to More Immersive AR

How does Depth API work, and what sets it apart from other methods of occlusion? “The Depth API uses an approach called ‘depth from motion,’ in which ARCore determines distances to objects by detecting variances between image frames while the camera is moving,” says Snider-Held. “The result is a high-resolution depth map that is updated in real time, allowing the device to better understand where objects are in relation to one another and how far away they are from the user.”
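Once you have a per-pixel depth map like the one Snider-Held describes, the occlusion test itself is conceptually simple: a virtual pixel is drawn only where the virtual object is closer to the camera than the real surface behind it. The sketch below is our own simplified illustration of that per-pixel comparison (using NumPy arrays as stand-ins for the camera frame and depth maps), not ARCore's actual rendering pipeline:

```python
import numpy as np

def composite_with_occlusion(camera_rgb, real_depth, virtual_rgb, virtual_depth):
    """Per-pixel occlusion test: draw a virtual pixel only where the
    virtual object is closer to the camera than the real surface."""
    visible = virtual_depth < real_depth      # boolean mask, shape HxW
    out = camera_rgb.copy()
    out[visible] = virtual_rgb[visible]       # overwrite only unoccluded pixels
    return out, visible

# Toy 2x2 frame: a real wall 2.0 m away everywhere; a virtual cube at
# 1.0 m in the left column and 3.0 m (behind the wall) in the right.
camera = np.zeros((2, 2, 3), dtype=np.uint8)       # black camera image
real_d = np.full((2, 2), 2.0)
cube = np.full((2, 2, 3), 255, dtype=np.uint8)     # white virtual cube
cube_d = np.array([[1.0, 3.0],
                   [1.0, 3.0]])

frame, mask = composite_with_occlusion(camera, real_d, cube, cube_d)
# Left column shows the cube; right column stays hidden behind the wall.
```

The hard part, of course, is producing `real_depth` in the first place, which is exactly what depth-from-motion provides without dedicated depth hardware.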

Depth API is software-based, requiring no new hardware for users with ARCore-enabled devices once it releases publicly. While sufficient occlusion significantly increases the verisimilitude of virtual objects, it follows a series of incremental updates that build on one another to allow for more realistic immersive experiences. Just last year—the same year ARCore debuted—Google released its Lighting Estimation API, which lights virtual objects to match the existing lighting conditions in the real-world setting, including light reflections, shadows, shading and more.


Since then, Google has added a feature called Cloud Anchors, which allows multiple users to view the same virtual objects anchored in a specific environment. It’s the key feature powering the multiplayer mode of Pharos AR, an augmented reality experience we made in collaboration with Childish Gambino, Wolf + Rothstein, Google and Unity—which itself served as a de facto demo of what Cloud Anchors are capable of in activating entirely new mixed reality experiences.

“We have the creative and technical know-how to use these new technologies, understand why they’re important and why they’re awesome,” says Snider-Held. “We’re not scared to take on tech that’s still in its infancy, and we can do it with a quick turnaround with the backing of our creative team.”

A Streamlined Way to Map Depth

Depth API wasn’t the first time that MediaMonks got to experiment with occlusion or spatial awareness in augmented reality. Previously, we experimented with other contemporary solutions for occlusion, like 6D.ai, which creates an invisible 3D mesh of an environment. The result of this method is similar to what’s achieved with Depth API, but the execution is different; translating an environment into a 3D mesh with 6D.ai is fastest with multiple cameras, whereas Depth API simply measures depth in real time without the need to scan and reconstruct an entire environment.

Similarly, Tango—Google’s skunkworks project which was a sort of precursor to ARCore—enabled spatial awareness through point clouds. “When we had Tango from before, it used something similar to a Kinect depth sensor,” says Snider-Held. “You’d take the point clouds you’d get from that and reconstruct the depth, but the new Depth API uses just a single camera.”

Monk Thoughts We’re not scared to take on tech that’s still in its infancy, and we can do it with a quick turnaround with the backing of our creative team.

In essence, achieving occlusion with a single camera scanning the environment in real time offers a leap in user-friendliness, and makes it widely available to users on their current mobile device. “If we can occlude correctly, it makes it feel more cemented to the real world. The way that they’re doing it is interesting, with a single camera,” says Snider-Held.

Adding Depth to Creative Experiences

Depth API is currently available to collaborators by invitation and isn’t yet ready for a public release, but it serves as a great step toward rendering more believable scenes in real time. “It’s another stepping stone to reach the types of AR experiences that we’re imagining,” says Snider-Held. “We can make these projects without caveats.”

For example, a consistent challenge in rendering scenes in AR is that many users simply don’t have large enough living spaces to render large objects or expansive virtual spaces. Creative teams would get around this by rendering objects in miniature—perhaps just contained to a tabletop. “With Depth API, we can choose to only render objects within the available space,” says Snider-Held. “It lets us and our clients feel more comfortable in making these more immersive experiences.”
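The "render only within the available space" idea reduces to a simple check against the depth map: does the distance to the nearest real surface at the anchor point leave enough room for the object? The function and numbers below are our own hypothetical illustration of that check, not an API Google provides:

```python
import numpy as np

def fits_in_room(depth_m, anchor_px, object_depth_m, margin_m=0.1):
    """Check whether an object of the given front-to-back size (meters)
    fits between the user and the real surface at pixel anchor_px."""
    y, x = anchor_px
    free_space = depth_m[y, x]  # distance to the real surface at that pixel
    return object_depth_m + margin_m <= free_space

# Hypothetical depth map: a wall 1.2 m away on the left, 4.0 m on the right.
depth = np.array([[1.2, 4.0],
                  [1.2, 4.0]])
print(fits_in_room(depth, (0, 0), 2.0))  # not enough room on the left
print(fits_in_room(depth, (0, 1), 2.0))  # fits on the right
```

A creative team could use a check like this to fall back to a tabletop-scale version of a scene when the full-size one won't fit.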

As brands anticipate how they might use some of the newest features of fast-evolving mixed reality technology, they stand to benefit from a creative and production partner that can bring ideas to the table, quickly implementing them with awareness of the current opportunities and challenges. “We bring creative thinking to the technology, with what we can do given our technical expertise but also with things like concept art, animation and more,” says Snider-Held. “We don’t shy away from new tech, and not only do we understand it, but we can truly make something fun and inventive to demonstrate why people would want it.”


Looking Back at 2019 and the Dawn of a New Era

4 min read

Written by
Monks

The decade is drawing quickly to a close, and it’s been a wild ride. From new technologies to new members of our family (we welcomed BizTech, IMA, Firewood Marketing and WhiteBalance this year), 2019 presented us with a lot of thrilling changes—and some exciting opportunities as we enter a new era. Looking back, we polled managing directors from our offices around the world for their favorite trends and technologies that have emerged in the past year—and what they’re looking forward to next.

Extended Reality Gets Real

Interest in mixed and extended reality (the combination of real and virtual objects or environments, like augmented or virtual reality, enabled by mobile or wearable devices) has been growing. At the same time, mixed reality has made strides in maturity over the past year, like Google’s efforts in making virtual objects feel truly anchored to the environment with occlusion, in which virtual objects are responsive to their surrounding environment—for example, disappearing behind real-world objects.

For Martin Verdult, Managing Director at MediaMonks London, extended reality is among the innovations he’s become most excited about going into 2020, and not just for the entertainment potential: “Virtual and augmented reality will become increasingly prevalent for training and simulation, as well as offering new ways to interact with customers.” For example, our Spacebuzz virtual reality experience gives children a unique look at the earth and environment they may typically take for granted, using the power of immersive tech to leave an indelible mark.

Monk Thoughts Value comes from connecting an IP to a brand through a deeply engaging hyper reality experience.

As the technology that powers extended reality matures, so will its potential use cases. But when a technology is still evolving rapidly, it can be difficult for brands to translate their ideas or goals into clear, value-added extended reality experiences. “We have introduced creative sprints for our core clients to get these ideas in a free flow,” says Verdult.

Among Verdult’s favorite examples of augmented reality projects MediaMonks has worked on this year is Unilever’s Little Brush Big Brush, which uses whimsical, virtual animal masks to teach children proper brushing habits and turn a chore into playtime. Similarly, extended reality can bring products to life in an engaging way—or if used in a customer’s research phase, it can help customers interact with a product with minimal (or no) dedicated retail shelf space.


Part of Little Brush Big Brush’s charm is that it extends beyond AR alone, connecting to a web cartoon series and a Facebook Messenger chatbot to reward kids with stickers at key milestones. “Value comes from connecting an IP to a brand through a deeply engaging hyper reality experience,” says Olivier Koelemij, Managing Director at MediaMonks LA. “One that only a well-executed integrated production can offer, combining digital and physical in new and extraordinary ways.”

AI/Machine Learning Grows Up

One can’t reflect on past innovations and look to the future without mentioning artificial intelligence and machine learning. From programmatic delivery to enabling entirely new creative experiences—like matured extended reality powered by computer vision—to connecting cohesive experiences across the Internet of Things, artificial intelligence “will change our interaction with technology in ways we can’t imagine yet,” says Sander van der Vegte, Head of MediaMonks Labs, our research and development team that continually experiments with innovation.

The most creatively inspiring uses of AI are the ones that will help us understand the world and our fellow humans. In collaboration with Charité, for example, we programmed a 3D printer to exhibit common symptoms of Parkinson’s disease and its effect on motor skills. The result is a series of surreal art-objects that make real patients’ experiences tangible for the general population.


Social Content and Activations Build Impact

Ask Sicco Wegerif (Managing Director at MediaMonks Amsterdam) what struck him this year, and he’ll tell you it’s the elevation of social content in purchasing—for example, how Instagram made influencer posts shoppable early this year. Wegerif notes that about a quarter of consumers have made a purchase on social media, signaling new opportunities for brands to build connections with consumers.

“Looking at this from an integrated and smart production perspective, we can help brands create so many assets and storylines that tap into this trend, especially when combining this with data so we can be super personal and relevant.” When social media is prioritized early in the creative and planning process, it can enable more meaningful experiences.

For example, our “People are the Places” activation for Aeromexico used Facebook content to transform the way users discover destinations around the world. Instead of researching and booking a city, users get to learn about people around the world—then purchase a ticket to where they call home. The social content enriches the journey and builds emotion into the experience. “It’s in essence a very simple thought that can change the whole CX,” says Wegerif.

Social Activations and Digital Experiences Weave Together

Speaking of social media, it can become a powerful tool to build relevance and connection with experiential. Jason Prohaska, Managing Director at MediaMonks NY, says: “Experience and social work hand-in-hand as part of the digital plan for many brands, and are no longer below the priority line.” With live experiential—which elevates the role of the online audience to interact, take part in and build buzz around experiences—brands can achieve greater strategic impact in how they build connection with their consumers.

But doing so successfully requires a confluence of data, influencers, experiential storytelling and production. The future of this looks good to Prohaska. “We expect 2020 to deliver several use case scenarios at scale for brand identity that may set benchmarks for personalization, automation, customer journey optimization, efficacy, performance and engagement.”

Koelemij looks forward to stronger investment in digital and consumer understanding as brands begin to integrate experiences even further going into 2020. “With most good work, success and performance can now be better attributed to digital as we get more advanced in understanding what success looks like,” he says, “especially in how we can measure it across blended activations.”

And that’s exactly how we’d like to spend 2020: helping brands achieve their goals with data-backed, insights-driven creative across the customer decision journey. Through added capabilities thanks to companies like WhiteBalance, Firewood, BizTech and IMA joining the S4Capital family in 2019, we achieve this by greatly prioritizing and enhancing key elements of the marketing mix for daring brands—and as we reflect on the past year, we can’t wait to see what’s next.


Futureproof Your IHA Through External Partnerships

3 min read

Written by
Monks

A common challenge that in-house agencies (IHAs) have always faced is difficulty in training and hiring the talent they need to pull off excellent creative. Unfortunately, this strain doesn’t seem to be going away. According to a survey by the ANA, 44% of US IHAs cite attracting top-tier talent as a primary creative content concern. And it’s not just about merely acquiring talent: an even bigger challenge they face lies in keeping their talent energized.

It’s no surprise, then, that so many external partnerships for IHAs revolve around two key capabilities: executing ideas in new and interesting ways, or offering access to specialized skillsets. Both are key in today’s digital landscape, defined as an age of hyperadoption in which users adopt and drop new behaviors at an unprecedented rate. With so many new channels cropping up, it’s hard to know which will stick around a few years down the line.

As brands gauge the next big channels they’ll use to connect with consumers, they must adopt new digital skillsets in lockstep. But given the talent concerns mentioned above, how can IHAs keep up with these shifting user behaviors? The answer lies in new breeds of partnership that give IHAs the skills and tools they need to fulfill the brand promise in ways that not only stand out and “wow” consumers, but make sense to them.

Stand Out by Innovating Strategically

In his talk at the IHAF Conference this week, which brings together and celebrates hundreds of in-house agency professionals, Forrester analyst Jay Pattisall discussed the importance of creative differentiation. Most digital experiences look and feel the same, opening an opportunity for brands to stand out through best-in-class creative. Fitting well within the conference’s theme of “Futureproof,” Pattisall set his focus on recent shifts in the creative landscape, and where IHAs fit within it.

Monk Thoughts Differentiated creative combines an understanding of culture with real, heavy-lifting business impact that drives real bottom line value.

IHAs have thrived thanks in part to their unrivaled brand knowledge; they understand the purpose, intricacies and nuances of their brand. As Darren Abbott, SVP, Creative at Hallmark said while noting the power of IHAs to their brands: “We’re not part of Hallmark, we make it Hallmark.”

Yet executing their vision in an environment that encompasses so many emerging channels can be tough. New partnership models that aim to augment in-house teams’ understanding of technology, or that push them to think in new ways, can aid in both forecasting future opportunities and identifying the best channels available today for bringing the brand experience to life.


AR, like this Snapchat game we made for Magnum Ice Cream, is loved by users and easily accessible for brands.

If you’re intrigued by some of today’s emergent technology, consider putting it through what MediaMonks Founder and COO Wesley ter Haar calls the “trend lens.” In his skill session at the IHAF Conference, “Extending Beyond the Horizon,” ter Haar described the trend lens as a strategy through which you can gauge the maturity of emerging tech as it rises up—or drops off from—the hype curve. It’s how we help brands arrive at solutions that best fit their capabilities and needs.

Let Your Brand Story Drive Tech Investment

The assessment specifically measures how a technology or platform meets user behavior (what consumers are doing with it) and distribution (how widely it’s adopted). VR, for example, isn’t distributed among consumers as well as AR is; this makes the former more ideal for installations and trade shows, while the latter serves as a popular way for consumers to simultaneously connect with brands and communicate with friends on mobile.

The trend lens works because it asks brands to really consider how their audience naturally behaves on a given channel. But brands must ensure that the creative idea is aligned with a clear business goal. At MediaMonks, for example, we don’t strive to sell brands on whatever the hot, novel technology of the day is. Instead, we experiment to push technology to its limit ourselves, then pay those learnings forward to help brands approach emerging tech strategically and tell their stories the best way they can.

Again, an IHA’s strength stems from its passion and knowledge of the brand. External partnerships that challenge their approach to creative and assess new opportunities granted by emerging tech are essential for futureproofing and connecting with consumers as the digital landscape continues to evolve.


Creative Considerations for Extended Reality Experiences

4 min read

Written by
Monks

This week, Fast Company launched its annual Innovation Festival, featuring Fast Track sessions that take attendees behind the scenes and into the homes of some of the most innovative companies in New York City. Billed as “Fast Company’s unique take on the field trip,” Fast Track sessions engage brands and creatives through hands-on talks and experiences hosted by participating companies, including MediaMonks.

Our New York office opened the doors to the dojo, inviting brands into our home to discuss all things extended reality. In a panel session devoted to augmented reality and its application to music and entertainment, our Monks dove deep into the design and development process of Pharos AR—a mobile AR experience made in collaboration with Childish Gambino, Wieden+Kennedy, Unity and Google. Taking users on a cosmic journey set against an exclusive track from Childish Gambino, the app is notable for being the first multiplayer music video.

With a panel including Snider-Held (Creative Technologist, MediaMonks), Thomas Prevot (Senior Producer, MediaMonks) and Tony Parisi (Head of VR/AR Brand Solutions, Unity), the session served as a casual fireside chat. The conversation kicked off by establishing the state of VR and AR, often characterized by the conflicting feelings that VR is dead and that the clear use case for AR hasn’t yet been found. But both technologies are well established, each excelling in achieving different goals within different environments.


Figures from cave paintings spring to life in Childish Gambino's trademark neon aesthetic in the environment around the user in Pharos AR.

Showcasing our Batman experience as a strong example of the immersive powers of VR, Snider-Held noted that “These experiences are still very installation-based,” and that AR’s distribution through mobile offers the potential for greater reach with a simpler experience. In explaining the process of developing Pharos AR in particular, the group explored key challenges to consider when developing an extended reality experience.

AR Can Feel Real Without Being Photoreal

Constraint prompts creativity—an adage that applies just as well to AR as any other medium for art making. Because mobile AR experiences are designed for use across a variety of devices, they must be relatively lightweight to provide a smooth experience to the widest share of users. Failure to keep technical constraints at top of mind can instead result in a lagging, stuttering experience that breaks immersion.

While this is true for any digital experience, it’s especially true for AR, a medium which Parisi says aims to “intelligently interact with the real world.” This expectation to seamlessly integrate with the surrounding environment can make stuttering graphics stick out like a sore thumb. “You want to keep the frame rate above 30 frames per second,” says Snider-Held, “because the user will compare the motion on the screen with the action happening around them in reality.”
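Snider-Held's 30 fps target translates into a per-frame time budget. The sketch below is a hypothetical illustration of that arithmetic (the frame-time samples and 95% tolerance threshold are our own assumptions, not anything the panel specified):

```python
def frame_budget_ms(target_fps: float = 30) -> float:
    """Milliseconds each frame may take to hit the target frame rate."""
    return 1000.0 / target_fps

def meets_target(frame_times_ms, target_fps=30, tolerance=0.95):
    """True if at least `tolerance` of measured frames finish within budget."""
    budget = frame_budget_ms(target_fps)
    within = sum(1 for t in frame_times_ms if t <= budget)
    return within / len(frame_times_ms) >= tolerance

# 30 fps leaves ~33.3 ms per frame: rendering, physics, and the
# depth/tracking update all have to fit inside that window.
print(round(frame_budget_ms(30), 1))              # 33.3
print(meets_target([30.0] * 19 + [40.0]))          # one slow frame in 20: passes
print(meets_target([40.0] * 10))                   # consistently over budget: fails
```

Keeping the whole pipeline under that budget is what prevents the stutter that, as Parisi notes, breaks the illusion of integration with the real world.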

Monk Thoughts Stylistically, we’re trying to remain within the constraints of mobile processing in a visually appealing way.

The trio took this challenge as an opportunity to discuss the highly stylized look achieved with Pharos AR. While photorealistic graphics might be impractical for a mobile device to render realistically in real time, a stylized look presents the opportunity to differentiate your experience through a strategic choice in aesthetic; for Pharos AR, the team took visual inspiration from Childish Gambino’s laser-punctuated stage shows, ensuring the app’s look and feel naturally integrated with the rest of the artist’s visual brand.

“Stylistically, we were trying to remain within the constraints of mobile processing in a visually appealing way,” said Snider-Held. An example of this is the use of particle effects, in which sparkles of light coalesce into a ghostly image of Childish Gambino as he dances to the music, animated via motion capture. “This is the best example on why you don’t need to do photorealism,” Parisi said. “We were able to capture the essence of Donald, because it’s his dance.”

Carefully Plan the Narrative Environment

Extended reality experiences are interactive by nature, meaning they demand a different approach than plotting out more linear experiences. There’s a careful balancing act between giving users the reins to explore on their own versus stringing them along a narrative thread. MediaMonks Founder and COO Wesley ter Haar notes that “Narrative UI is key for onboarding and guiding the user in an AR experience,” making it incredibly important that you plan out users’ interactions and use environmental cues to shape a narrative.

While Pharos AR begins and ends through open-ended user interaction, it still follows a clear narrative through the virtual performance of Childish Gambino’s single, “Algorhythm.” In exploring the primary path through the experience, the team began planning it in storyboard form, much like you would for a film. “This process not only serves as visual research, but also in briefing the animation team and envisioning how actions will play out in the user’s environment,” said Snider-Held.

Monk Thoughts Narrative UI is key for onboarding and guiding the user in an AR experience.
black and white photo of Wesley ter Haar

That makes sense—but what makes storyboarding for AR or VR different from other forms of digital storytelling? Because users could play with the app in their homes, the team had to plan for other variables: how large the virtual scene should be, and what actions would be possible for multiple users within a smaller environment—like the cramped living room of a New York apartment, to offer an example the Fast Track attendees could relate to. The challenge demonstrates how important it is to map out the virtual scene for different scenarios and users.

An interesting insight uncovered in the panel was that the team didn’t just rely on visual methods of planning like maps and storyboards. Because the background music builds up as users explore the space around them, the team also developed a musical timeline that maps out how different interactions trigger the layering of the music. The step showcases how sensorial, environmental cues can shape the action within an immersive, extended reality experience.
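
That musical timeline can be thought of as a mapping from in-scene interactions to audio layers. The sketch below is purely illustrative (the trigger and layer names are invented, not taken from Pharos AR), but it shows the mechanic of a mix that builds as users explore:

```python
# Hypothetical timeline: each interaction a user discovers unlocks one
# audio layer, so the mix thickens in a fixed order as the scene is explored.
MUSIC_LAYERS = [
    ("enter_cave", "ambient pad"),
    ("find_glyph", "percussion"),
    ("reach_altar", "bass line"),
    ("portal_open", "full vocal mix"),
]

def active_layers(triggered):
    """Return the layers to play, in timeline order, given the set of
    interactions the user has triggered so far."""
    return [layer for trigger, layer in MUSIC_LAYERS if trigger in triggered]

print(active_layers({"enter_cave", "find_glyph"}))
# → ['ambient pad', 'percussion']
```

Because the output preserves timeline order regardless of the order interactions fired, the music always builds coherently no matter how users wander.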

Whether developing for VR or AR, extended reality experiences require developers to rethink the creative approach beyond the standard linear story. From setting and technical constraints to variations in the number of users across platforms, extended reality development relies on a comprehensive understanding of the building blocks that make up a total user experience. Snider-Held capped off the session by describing what MediaMonks aims to achieve with brands through such experiences: “We strive to further the use of the technology from impossible to probable, and experiment in how to further that, too.”


Facebook’s New AR Ads: Get Ready for Your Close-Up

4 min read

Written by
Monks

Facebook ads are about to become a lot more fun. In the lead-up to the recent Advertising Week New York conference, Facebook announced three interactive ad formats that offer new ways for consumers to engage with brands: video poll ads, playable game ads and interactive AR ads built through Facebook’s Spark AR platform.

Each of these may look familiar, as they’re not entirely new. Video poll ads have been available in Instagram Stories for quite some time, prompting users to answer a question while watching video content. Like with Facebook’s Instant Experience formats, brands can use video polls to open up a web page in-app, allowing users to act on the video content (like downloading a mobile game as its trailer plays). Playable ads essentially function like minigames. While they were originally open to select brands, they are now available to all advertisers.

Perhaps most exciting are the interactive AR ads, which bring the familiar experience of selfie filters and Facebook Camera Effects into the mobile News Feed. AR ads became available to a subset of advertisers in July 2018 for testing, and the results, according to Facebook, look promising: the social network notes that AR ads drove a 27.6-point lift in purchases for WeMakeUp. By inviting users to try on lipstick shades, the ad also drove an average of 38 seconds of interaction—a significant increase over the 1.7 seconds users typically spend consuming content on the platform.

Monk Thoughts We’re still at a place where AR is a thing people know about, but you have to go through a lot of steps to get there.
Samuel Snider-Held headshot

They weren’t the only ones. Fellow beauty brand Bobbi Brown tested AR ads earlier this year against regular video ads. Their AR ads—also allowing users to try on new lip colors—tripled click-through rates and doubled website purchases compared to video, according to Glossy.

Bringing Together Fun and Function

One reason the examples above are so effective is that they offer a low barrier to entry for trying out and experimenting with products at the top of the funnel: just a single tap. “We’re still at a place where AR is a thing that people know about and can be used for advertising, but you have to go through a lot of steps to get there,” says Samuel Snider-Held, Creative Technologist at MediaMonks. “This is a way of removing one of the steps.”

While entertainment-based Camera Effects are a great way for brands to promote sharing and empower users to tell their own stories, AR ads focused on utility can help users really understand and research a product, driving meaningful engagements and measurable results. “Makeup and glasses have been some of the most poignant use cases for advertising with Camera Effects because people immediately see it and understand it,” says Snider-Held.

Spark AR is also an incredibly accessible platform for brands to build simple, snackable experiences, making it low-hanging fruit for experimenting with augmented reality. Developing just a single Camera Effect, for example, provides brands the opportunity to establish a direct connection with consumers at scale—especially for brands that strive to take their creative capabilities in-house, but are tight on resources.


Our Camera Effect experience for Unilever’s Signal toothbrush transforms their Little Brush Big Brush web series into a full-fledged AR experience that gets kids in the groove of establishing healthy brushing habits. With animal mask filters and gamified elements, the Camera Effect shows how AR can be made both fun and functional, and demonstrates the power of content to build brand love through differentiated, engaging digital experiences.

Building on User Behavior

It’s easy to see why AR ads are so effective for Facebook users: integrated directly within Stories and now the mobile News Feed, these effects fit seamlessly within the ways that users interact with one another on Facebook—opposed to, say, an AR experience that engages users on Facebook but ultimately pulls them away to an external microsite.

As brands invest more in digital advertising, they must evolve their strategies to react to emergent user behavior—doing so is key to remaining relevant and driving meaningful engagement with their audiences. A good creative and production partner can help brands engage through the dominant modes of interaction and communication today, and help them anticipate or overcome common challenges in the process.


With back-facing camera support, brands can make their surroundings much more fun and engaging.

For example, Snider-Held highlights one challenge that some brands—luxury ones in particular—may run into with AR ads: file size limitations, which make it tough to highlight minute details, like stitching on a bag. “If the file size stays limited, brands will have to decide if they want to use that space to entertain the user, or make a product model that looks as high quality as possible,” he says.

In a panel focused on the S4 Capital model at Advertising Week New York, Sir Martin Sorrell noted that S4 is committed to partnering with tech giants and working with in-house teams—a quality that differentiates the group from the approach that traditional agencies take. This partnership can help you forecast future opportunities that emerge with new technology. “If we can use the back-facing camera as well, you could put a model of a car in your driveway,” says Snider-Held. He compares this functionality to similar uses of AR that lack the visibility offered by the Facebook News Feed, like platform-specific AR models only available on certain apps on specific devices.

But these new means of collaborating are essential in ensuring brands are equipped to engage in step with changing means of communication, perhaps even innovating in the process. We’re excited to see how Facebook has revitalized familiar ad formats like video polls and Camera Effects by transferring each to a new environment, and look forward to how both will enable new conversations between users and brands.

Experiment with Camera Effects that really inspire.


Using Clever UI to Lead Users on a Journey, Without Leaving Them Stranded

3 min read

Written by
Monks

Using Clever UI to Lead Users on a Journey, Without Leaving Them Stranded

Earlier this summer, we released Pharos AR, a mobile app that takes Childish Gambino fans on a virtual, hallucinogenic journey through the cosmos starting from wherever they stand in the real world. Fans and press alike were impressed by the artist’s bold foray into virtual space.

VentureBeat was excited by the long-term value of the app, for example: “Between the cool visual effects and Glover’s music—which the app will apparently update with new songs over time—there’s certainly enough here to merit a download for fans.” Variety, meanwhile, applauded how well it integrated within the larger Childish Gambino universe: “The whole thing is very spacey, and stays true to Childish Gambino’s other ‘Pharos’ projects” that have expanded across performances.

As interest in augmented and mixed reality continues to build, we’re looking back on the project’s development and how multiple partners and team members came together to employ best practices in UI and design, helping everyday users ease into a mysterious–and perhaps technologically overwhelming–new interface without limiting space for play and exploration.

Made in collaboration with Google Zoo, Unity and MediaMonks, it’s the world’s first shared augmented musical experience, allowing multiple users to enjoy a unique, artistic experience together. “This app is a breakthrough for AR,” says Thomas Prevot, Sr. Producer at MediaMonks. “It serves as another outlet for Childish Gambino’s creativity, letting him update his fans with future song releases over the cloud,” he adds, explaining how the app fits within the larger Childish Gambino brand.

Pharos AR also showcases the power of the ARCore platform and how it can enable immersive, social storytelling experiences. In particular, it shows off the capabilities of Cloud Anchors, which lets multiple users interact in a shared virtual space–which also makes it fairly unique among AR experiences. The tech’s newness can be intimidating, though: how can apps cultivate an interactive, exploratory experience for those new to AR?

Integrate Brand Familiarity & Digital Ecosystems

Childish Gambino fans will recognize Pharos AR’s light-particle silhouette from a Pixel 3 TV ad, and scenes projected within an enigmatic pair of monoliths at this year’s Coachella festival. These elements are purposefully and artfully executed across touchpoints, and this integration with a pre-existing digital ecosystem helps to make the process of designing for emerging tech more intuitive.

VID_20190412_160740.00_00_12_14.Still002

Users were invited to try out the app at Coachella.

From the Pharos Festival to a TV spot to Coachella and beyond, you’ll find the same trademark, psychedelic motifs true to the nature of the artist. This goes to show the potential of AR as a powerful channel to not only engage audiences in an immersive way, but to bridge together the Childish Gambino gospel as well.

Get Intuitive

With AR still being fairly new to some, developers must think carefully about onboarding new users. “Experiences like Pharos AR provide an exciting opportunity for us to help make AR and VR more accessible to wider audiences,” explains Justina Sung, UX Designer at MediaMonks. “If people habitually use their phones a certain way, how do we break out of that to teach them new behaviors for emerging tech like AR?”

VID_20190412_160740.00_04_48_06.Still003

As users explore the surrounding cave, neon dancers come to life around the fire.

Inspired by Childish Gambino’s spirit of ambiguity and minimalism, the app integrates a sensory communication method, instead of verbal directions, to onboard. “Most app tutorials task the user with swiping through instructional cards, but we made it feel like the title sequence of a movie,” says Alex Otto, Associate Creative Director at MediaMonks. When users first open the app, for example, a pink laser beam dynamically snakes through space and into icons that illustrate next steps. “It’s designed to give off a mysterious feeling that builds up layers of suspense,” says Otto.

Direct Users Through Environmental Storytelling

Because AR rewards users through exploration, designing for it requires a careful balance between providing freedom versus direction. “It’s important that the design gives fans room to explore on their own to find meaning, using subtle environmental cues that nudge users in the right direction,” says Sung.

In Pharos AR, the scene gradually dims as users discover hidden glyphs on the cave walls. This nudges them to shift their focus to the altar, which lights up as users discover more paintings—a bit like a progress bar. Other cues include haptic feedback and the gradual, sonic buildup to Childish Gambino’s song “Algorhythm,” which was released through the app.
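
The altar cue described above can be modeled as a simple progress function. This is a hedged sketch—the glyph count and dimming curve are invented, not Pharos AR’s actual values—but it captures the mechanic of brightening one focal point while the rest of the scene fades:

```python
TOTAL_GLYPHS = 6  # assumed number of hidden glyphs in the scene

def altar_brightness(glyphs_found):
    """Brightness from 0.0 (nothing found) to 1.0 (all glyphs discovered),
    behaving like a progress bar."""
    return min(glyphs_found, TOTAL_GLYPHS) / TOTAL_GLYPHS

def scene_dim(glyphs_found):
    """Ambient light fades as discovery progresses, pulling the user's
    attention toward the brightening altar."""
    return 1.0 - 0.5 * altar_brightness(glyphs_found)

print(altar_brightness(3))  # → 0.5  (halfway through the hunt)
```

Coupling the two values to one counter keeps the environmental cues consistent: every discovery simultaneously rewards the user and redirects their gaze.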

AR is a powerful platform for brands to tell their stories and engage directly with key audiences. In the case of Pharos AR, the technology provides users a chance to embark on a wild, immersive journey that encapsulates Childish Gambino’s message of enlightenment. As the first interactive, shared musical experience in AR, the experience truly pushes the limits on what can be achieved—inspiring brands and artists alike to consider what’s possible.

Eager to experiment with easily approachable AR?


Google I/O Puts Focus on Speedy, Accessible New Interfaces

5 min read

Written by
Monks

Google I/O Puts Focus on Speedy, Accessible New Interfaces

At this year’s I/O conference, Google unveiled several new features related to its upcoming Android release and devices. Among the most exciting of these features are those that aim to change the dominant interface through which users engage with their devices: typing on a keyboard. While the touch screen revolutionized media about a decade ago, it looks like the camera and microphone are ready to take the baton, at least when it comes to accessing on-the-go info.

Google CEO Sundar Pichai’s keynote event was brimming with new features and products that are set to change how we interact with devices and each other in everyday situations. We paid tribute to this spirit of progress by producing an animated countdown video that kicked off the keynote, taking viewers on a journey through advances in tech over the decades.

Many of Google’s most interesting announcements centered on voice and visual search in particular, making the case that these new features could provide users with information much faster than if they had to type it out. While this has always been the idea with voice interfaces, this year’s I/O event delivered on the promise by showcasing instantaneous voice recognition with its Assistant and surfacing actionable information through augmented reality. Here are the features we’re most excited about—and where the value lies for users and brands alike.

Google Assistant Becomes More Human

Google announced several improvements to its Assistant, many of which center around on-device voice recognition. Previously, Google’s voice recognition model was 100GB, requiring queries to connect to the cloud before getting a response. Now, Google has managed to shrink that model down to an impressive half-gigabyte, small enough to fit on devices for rapid, offline voice recognition.

“Now that Google can do recognition on the device itself, the device can actively listen and respond without the need to go through the loop of saying ‘Hey Google’ followed by a command,” says Michiel Brinkers, Technical Director at MediaMonks. “You can simply keep talking to the Assistant with follow-up questions.” It also allows for contextual commands, such as saying “Stop” to stop media playback on the device, no “Ok Google” required.

Google’s newly unveiled Nest Hub Max device—which is a mix between the Home Hub and Nest camera—even adds physical gestures to the interface. Thanks to facial recognition, the Nest Hub Max can alert users when it notices someone in the home it doesn’t recognize, or greet them with personalized content when they’re in view. The latter solves a crucial problem faced by Internet of Things devices: when multiple users in a home share a single device, how do you target them individually with personalized content? We’re excited to see Google crack the case while alleviating privacy concerns with on-device facial recognition.

Speedy Voice Recognition Will Change Users’ Lives

The greatly improved speed achieved through offline Assistant interactions is a game-changer on mobile devices, where wait times or lack of connection can be a huge pain point. “If a voice assistant doesn’t instantly do what you want it to, or if it gets it wrong, then it becomes more effort to use that system than to accomplish the task through typing or tapping,” says Brinkers. “But what Google showed offers a huge improvement.”

Monk Thoughts Accessibility initiatives are where Google shows its value to the greater good.

On-device voice recognition will make many of our lives easier, but for some it will be life-changing: thanks to immediate transcriptions, Android devices will be able to provide users with auto-generated subtitles for any video or audio (including live content), an obvious benefit to the hard-of-hearing. In addition, Google announced its Project Euphonia program, which will provide larger data sets to train the Assistant to better understand those with speech impairments. “These initiatives are where a company like Google shows their value to the greater good,” says Brinkers.

As a Technical Director, the faster, improved speech recognition turns the creative wheels in Brinkers’ mind. “If voice does become a dominant input method, maybe we can listen to tone of voice—do more than just listen to what’s being said, but how it’s said,” he muses. “Then we could identify their emotion and design experiences around that.”

Google Lens Brings Printed & Digital Content Together

Voice isn’t the only interface Google is gunning for this year: the company also revealed several new AR features. While most consumers’ experience with AR has been focused explicitly on entertainment, I/O demonstrated how much the technology has matured in the past year to provide users with actionable, contextual information they can use in their daily lives.

Monk Thoughts We always ask ourselves what the utility use case is for AR. This is it.

One example shown in Google’s keynote is the ability to scan a restaurant menu with a phone using Google Lens. Doing so provides users with a list of a restaurant’s most popular dishes, reviews and photos. We’ve long said that the camera is the new browser, and new Lens features offer a textbook example of what that future could truly look like. “If I could read any restaurant menu in a foreign country and see what the food looks like through my phone, that would be amazing,” says Brinkers. “We always ask ourselves what the utility use case is for AR. This is it.”

In addition to providing greater contextual information, Google showcased Lens’ ability to animate traditional, static media—one of the coolest features for those who always wished they could read an animated newspaper as seen in the Harry Potter universe. One example demoed at the event is a poster depicting the Eiffel Tower. When scanned with Google Lens, clouds in the sky begin to move, bringing the image to life.

The tech isn’t just about cool visual effects, though—it also has utility, particularly with how-to content. Scan a recipe in a magazine with Lens, and a video tutorial can overlay atop it to show how the dish is prepared. What really places Lens at the forefront of AR is that the scanned media doesn’t require abstract, distracting markers or QR codes to activate; the content itself is the key, enabling a more elegant way to augment printed media.

Get Up-Close & Personal with Google Search Results Using AR

Later this year, users will find 3D models in Google search results, allowing them to examine the object or thing they’re searching for from any angle. If that’s not already cool enough, Google is upping the ante by letting users place the object in front of them using AR. This functionality offers a simple, intuitive way for users to learn about real-world objects and preview products.

Monk Thoughts You see a lot of synergy between AR and machine learning; Google is combining all these tools.

“If you searched a chair on Google, it would be neat to drop it down in your room and see how it looks,” says Brinkers. “It will be interesting to see how this competes with proprietary apps that already let you do something similar.” One benefit that searchable AR objects have over those native apps is that users can view them without having to download and install anything. Google is exploring brand partnerships for developing these models in search, signaling the potential value it can have for marketing.

What’s truly exciting about each of these developments is their potential to come together in one unified experience. Scan a sign in a foreign language with Lens, for example, and Google can verbally read it back to you in your own language through advanced text-to-speech. Marry visual and voice features with an augmented reality layer, and the way we interact with everyday devices—if not the environments around us—may radically alter in the next couple of years. “What’s interesting with this event is that you see a lot of synergy between AR and machine learning,” says Brinkers. “Google is combining all these tools that they’ve worked on separately, and we see it coming together in a way that no one anticipated.” Ok Google, what’ll it be next?


Using AR to Tell a Story from Your Façade

4 min read

Written by
Monks

Using AR to Tell a Story from Your Façade

A hallmark of the high-street shopping experience is to gaze at the various retail window displays while shuffling about. These intricate dioramas capture the attention of passersby and invite them inside through their creativity and intrigue—but these displays have long lacked a crucial element of engagement: interactivity.

A Forrester report highlighting the need for reaching consumers through digital experiences discusses the need for brands to center their marketing strategy on brand experience: “Digital is much more than marketing technology and channels; it is a way to harness technology’s enormous and unbridled capacity to reach out, connect, personalize, and engage.” The report goes on to mention how engagement through emerging digital channels in particular increases messaging salience, making it key for brands to continually bolster their understanding of emerging tech and its many applications.

To explore how brands could engage with audiences this way through one emerging channel—AR—the MediaMonks Labs team collaborated with one of the major players in the medium, Google. Together, they came up with an experimental app that transforms ground-level windows into portals to virtual worlds. Here’s how it works: take a look at a storefront window and you’ll find imaginative posters portraying friendly bees. Scan the marker with your phone’s camera, and the bee-bedecked windows transform into a splendid virtual world found inside a cartoonish beehive, the scene chosen for the tech’s prototype.


Inside the hive, busy bees work tirelessly to support their society (but not without having some fun). The idea is that a place of business could use this virtual display to invite passersby to peek “inside” their windows and get a sense of what’s going on there—in this case, workers doing their due diligence—without sacrificing privacy. Of course, the small virtual world offers numerous opportunities for digital brand storytelling as well.

What sets the app apart from your typical AR portal is that it’s designed for a specific window layout at a specific building, which allows for a more convincing virtual scene. For example, the precise measurements already programmed into the app allow the scene to continue to render even as users turn away from the AR marker that activates the experience. This means some bees can fly outside of the virtual space and into the surrounding world. Small details like this add to the scene’s realism and let the digital experience spill out into the real world. “With this technology, you can tell a story on your façade,” says Geert Eichhorn, Innovation Director at MediaMonks. “You can take anything you want to showcase from your business at that moment.”
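
Knowing the window’s real-world measurements ahead of time is what makes that spill-over effect possible. As an illustrative sketch (the dimensions are invented, not taken from the installation), the app can test whether a virtual bee is still on the glass or has flown out into the surroundings:

```python
# Hypothetical window dimensions, in meters, with the coordinate origin
# at the window's lower-left corner.
WINDOW_W, WINDOW_H = 2.0, 1.5

def outside_window(x, y):
    """True when a point has left the framed "portal" and entered the
    surrounding real-world scene."""
    return not (0.0 <= x <= WINDOW_W and 0.0 <= y <= WINDOW_H)

print(outside_window(2.5, 1.0))  # → True: this bee is out on the street
```

Because the bounds are fixed per building, the scene can keep rendering bees beyond the marker’s edges without any runtime measurement of the façade.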

Showcasing Brand Values

In addition to letting viewers “peek in” to a place of business while maintaining privacy, the tech can also help white-collar businesses maintain a friendly, active presence within the surrounding community through inviting digital experiences. With a quirky art style and delightful interactions, for example, the virtual beehive shows that the hosting office space isn’t your typical, boring place to work—and is committed to helping society as a whole, just like the friendly bees support their own. For retail businesses, similar AR experiences transform traditional window displays into memorable, interactive experiences—an objective that’s become increasingly popular for retailers in particular.


AR is a versatile technology, which means the virtual diorama doesn’t have to stick to a place of business, either. “Another aspect of this sort of experience is its portability,” says Roan Laenen, Jr. Creative at MediaMonks. “You can take the posters to other places, like a school. This digital beehive could easily be reframed as an educational tool on how bee colonies function or to raise awareness of dwindling bee populations.” As users interact with the hive, they’re treated to a selection of fun facts about the bees, delivered in an amusing environment.

While the context of interaction is different at a storefront versus a classroom, brands can consider how their AR campaign translates or integrates with larger social initiatives. For brands seeking ways to offer values-based marketing, AR provides an excellent opportunity to showcase those values in a fun, digestible way.

Engage Through Explorable Interactions

“Other storytelling mediums are linear, but this experience is personalized and reactive to the user’s exploration of the space,” says Eichhorn. Just like in a human city, life inside the beehive is broken down into various role-based areas: the nursery where bee larvae are fed and cared for, the queen’s palace, the construction site where builder bees press new honeycombs and more.


As users fix their gaze on these areas, a guide bee flies toward it and asks if they’d like to engage or learn more. “The guide bee isn’t just telling you what to do,” says Eichhorn. “His suggestions are built into the way the user looks around and explores, making him more like a real person who guides you through a space.” The guide balances the tightrope between leading users’ attention while also providing the freedom to engage however they please.

As a medium, AR naturally opens up opportunities for play and exploration. This results in a sense of intuitiveness that can’t be beat by most other forms of digital media. “This type of experience feels fun and understandable for young kids, yet it’s engaging for older people as well,” says Laenen, “even if they don’t understand the tech behind it.”

Good AR is a Multi-Discipline Effort

The guide bee buzzing to and fro isn’t the only way the experience captures user attention. “It’s a coming together of various talents and disciplines,” says Laenen. “The gamified element that attracts users to interact really flourishes with the animations and sound effects that also make up the experience.”


While much of the installation’s focus may be on the virtual world rendered via AR, one can’t forget the importance of the printed posters that activate the experience. They are, after all, what make the first impression to passersby and invite them to act. Printed collateral must not only be engaging, but consistent with the virtual experience. “The poster art easily translates to the 3D models,” says Laenen. “Close collaboration between the 3D modelers and the illustration team allowed us to realize a whole intricate world.”

By leveraging the latest digital technology, brands can reach their audiences in surprising and new ways, sometimes where consumers least expect it. By rendering imaginative virtual worlds as seen with MediaMonks Labs’ and Google’s digital beehive, brands can educate their audiences on brand values or key social issues in a way that sticks—and that sounds like a recipe sweeter than honey.

Learn how AR can bring physical spaces to life for more enchanting experiences and storytelling opportunities, like window displays that peek into alternate, fantastical worlds.

F8 2019 Teases New Ways to Start the Conversation on Facebook

4 min read

Written by
Monks

Facebook hosted its annual F8 conference this week, announcing several new features on the horizon, both big and small. With an emphasis on more responsible uses of tech and a rejuvenated focus on fostering communication between friends and loved ones, this year’s event felt a bit like the start of a new era—helped by the surprise release of an entirely new mobile app design.

Facebook’s vision has always been to bring people together, and at this year’s F8 conference the company shifted its focus away from the News Feed and toward more genuine forms of communication, like Groups, Messenger and Stories. These features not only help friends and loved ones connect in new and more engaging ways, but can help brands engage more directly as well.

Offering More Direct Forms of Communication

Let’s start with Groups: Facebook killed its standalone Groups app nearly two years ago, but this week unveiled a new core app design that places Groups at the forefront, promoting them in different areas of the platform. This means users might find buy and sell groups promoted when exploring the Marketplace section of the app, for example. In addition to providing better visibility, Facebook is also enabling features specific to group types, like a template for employers to easily list job openings in groups for job seekers.


Facebook’s approach to innovation is supported by a desire to use tech responsibly.

Facebook has also shown off more one-to-one types of interaction designed for smoother socializing and forming new relationships. Its Meet New Friends feature introduces users within shared communities, while its Dating feature offers a Tinder-like system that brings together people who share romantic interest in one another. Over on Messenger, meanwhile, friends can hang out virtually by watching videos together, or even work together with a desktop app that allows for multitasking.

What’s interesting with these announcements is that the role of the News Feed—or at least broadcasting updates to it—becomes downplayed as Facebook explores other ways to bring people together through more direct forms of communication.

Chatbots are Invading Instagram

Chatbots aren’t new—Facebook popularized the medium at its 2016 F8 conference—but they’re about to get a lot better on Messenger. The most notable new feature is integration into Instagram ad units. Brands can include CTAs in their Instagram ads that encourage users to swipe up to chat with an associate. A bot can handle the earlier stages of the conversation to qualify leads, then pass those leads along to a live agent. “It’s the perfect balance of scaling conversations for more basic interactions, while ensuring that qualified leads and complex customer service interactions are handled by real people,” says Nick Fuller, SVP of Growth at MediaMonks.
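The bot-to-agent handoff Fuller describes can be sketched as a simple flow: the bot collects qualifying answers, and once a lead meets the criteria, the conversation is routed to a live agent. This is an illustrative sketch only; the question list, threshold, and response strings are assumptions, not Messenger Platform API calls.

```python
QUALIFYING_QUESTIONS = ["budget", "timeline"]  # assumed qualification fields

class LeadBot:
    """Handles the early stages of a conversation, then hands off qualified leads."""

    def __init__(self):
        self.answers = {}

    def handle_message(self, field, value):
        """Record an answer, then either ask the next question or route the lead."""
        self.answers[field] = value
        missing = [q for q in QUALIFYING_QUESTIONS if q not in self.answers]
        if missing:
            return f"bot: could you tell me your {missing[0]}?"
        if self.is_qualified():
            return "handoff: routing you to a live agent"
        return "bot: thanks! we'll follow up by email"

    def is_qualified(self):
        # Illustrative rule: leads above a minimum budget go to a human.
        return self.answers.get("budget", 0) >= 1000
```

The point of the pattern is exactly the balance Fuller names: the scripted part scales, while anything past qualification reaches a real person.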


MediaMonks SVP of Growth Nick Fuller got a hands-on look at Facebook’s recent and upcoming features.

Another interesting thing about this functionality is how it brings different platforms in the Facebook family together: users may initiate a chat on Instagram, but the conversation happens within Messenger. “What’s important here is that Facebook is discovering more and more integration points of customer data and experience across their platforms,” says Fuller. “This means brands have the opportunity to target on one platform but easily retarget on another.”

The integration with Instagram ads highlights the success that brands have been having with its Stories format. “Stories is a really high-performing ad space for Facebook,” says Fuller. “With this chatbot integration into Instagram ads, brands can take users through the funnel with targeting, lead qualifying and conversion happening in one seamless flow—which is incredible.”

Bring Offline Spaces to Life

Facebook Camera Effects are cool and all—we wrote the book on it—but Facebook is looking beyond photo sharing to explore other opportunities where AR provides value. For example, users could scan a poster that transforms it into a three-dimensional scene or model, whether it be a portal that looks inward or an object that spills out of the surface, beyond physical constraint.


Facebook's SparkAR platform lets you pull off dizzying effects from different perspectives.

Fuller sees endless creative possibilities for brands that want to digitally engage with their customers within a physical environment. “This will be a killer feature for brands to reach users in retail or event spaces, for example,” he says. This can range from the fun (digital scavenger hunts that encourage you to explore an amusement park) to the practical (a tutorial, viewed from multiple angles, showing how to put furniture together).

“From a product education standpoint, having the ability to aim your camera at a sign to achieve this is going to be a really helpful next-level AR capability.” We’ve seen how harnessing emerging technology can have huge benefits in getting customers to explore brick-and-mortar retail, so we’re excited to see how new leaps in accessible, scalable AR will further transform the physical shopping experience.

In addition to the features mentioned above, Facebook reiterated a couple of key themes of the F8 conference: responsible use of technology (by combating fake news, making AI more inclusive and more) and a shift away from being a “social network” to a “social platform.” The new era for Facebook looks perfectly structured for enabling direct communication between brands and their fans, and we can’t wait to see how the platform further shapes up in the next year.


