How Facebook Built the Festival of the Future

5 min read

Written by
Monks

This year’s Facebook Connect, one of the biggest annual events in digital innovation, went entirely virtual, compressing what’s typically a multi-day experience into a 10-hour live event augmented with 14 hours of on-demand content, which is no easy feat. But it’s no surprise that the event, which offers a close look at digital’s potential to connect people, serves as a model of how brands can build events that not only respond to the reality we live in today but also anticipate how people will connect and collaborate digitally well into the future.

Because Facebook Connect is the only world conference dedicated to connecting people virtually through AR and VR, the event itself had to embody that promise. Partnering with Facebook, MediaMonks built an experience ecosystem that brought the event to life through livestreaming and early beta access to Oculus Venues. And through live chats and connections to developer discussion groups, attendees could interact and network throughout the event within Facebook’s social ecosystem.


DE&I were important to the event, which featured a panel on diversity (photo above), a panel on accessibility and more.

Most importantly, the event was an example to other brands of how online events can turn digital into a true destination to meet, connect and play. While many have managed to create bombastic product reveals and virtual presentations despite the pandemic, Facebook and the MediaMonks team saw Facebook Connect as an opportunity to acknowledge the reality many of us are living and working within.

The event was a celebration of the work-from-home reality, and of how connection, collaboration and productivity are still achievable. And for the first time, Facebook Connect was open to attendees far and wide for free. Diversity and inclusion were key pillars in ensuring the event lived up to future-forward standards, with features like live captions and speakers on topics relevant to the social climate.

Together, these elements show that building a virtual event today isn’t just about translating a series of touchpoints to digital, but rather about maintaining the essence of an event’s goals within an entirely new context and experience. Here’s how it happened at Facebook Connect.

Reimagining the Product Showroom

The event kicked off with an early product reveal: the Oculus Quest 2. In a typical tradeshow setting, attendees would be able to get up close and personal to view (or even try on) the product. But this wasn’t a typical event; without an in-person showroom floor, MediaMonks’ team of live experiential experts drip-fed exclusive, timed-release AR filters that activated on Instagram, allowing each viewer and attendee to explore new product features virtually. Invitations to “try on” the headset appeared via QR codes in interstitial segments between panels and talks.

Experience Facebook Connect yourself.

The product’s reveal inspired coverage from outlets like The Verge and TechCrunch, and even analysis from The Motley Fool, which reported on Facebook’s belief in connecting people virtually via emerging technology. In addition to the new Oculus headset, Facebook announced a slew of other news, including a VR office solution, research into a future pair of AR-enabled glasses, game announcements including Star Wars and Assassin’s Creed, and more.

By using emerging tech to highlight the features and possibilities of the technologies Facebook is building, the event achieved a new level of brand virtualization: essentially, building distinct environments and ecosystems that translate brand promise into digital experiences. While events are only an initial step toward virtualization, this kind of tangible digital product showcase offers a peek at how brands can differentiate their product reveals.

Enabling Excitement and Exclusivity Through Engagement

In-person events thrive on engagement and making connections. But digital events often lack this energy, relegating interaction to just a chat box. “We aimed for a level of two-way interaction and built that into the system, feeding back on the energy of the audience,” says Ciaran Woods, EP Experiential & Virtual Solutions at MediaMonks. “That’s always something we’ve been pushing for in a livestream.”


Viewers had the chance to select the last question that panelists and speakers were asked.

One of the key ideas behind Facebook Connect was to make the broadcast a real moment for audiences, rewarding those who took the time to sit down and participate live. This inspired the “one last question” segment at the end of talks and panels. Audiences were presented with three questions they could vote on to ask the speaker or panel. As viewers voted, an on-screen tally showed results in real time, made possible by LiveXP, MediaMonks’ live storytelling tool for enabling a truer sense of interaction beyond just participating in the live chat.
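LiveXP itself is proprietary and its internals aren’t described here, but the mechanic it enables is straightforward to picture: each viewer’s vote is deduplicated and tallied as it streams in, and the running percentages drive the on-screen graphic. The C# fragment below is a hypothetical illustration of that tallying logic only, with every name invented; it is not LiveXP’s actual code.

using System.Collections.Concurrent;
using System.Collections.Generic;

// Hypothetical sketch of a live "one last question" tally; not LiveXP's real API.
public class LastQuestionPoll
{
    private readonly ConcurrentDictionary<string, int> votes = new ConcurrentDictionary<string, int>();
    private readonly ConcurrentDictionary<string, bool> voters = new ConcurrentDictionary<string, bool>();

    public LastQuestionPoll(IEnumerable<string> questionIds)
    {
        foreach (var id in questionIds) votes[id] = 0;
    }

    // Called for every incoming vote event; each viewer may vote once.
    public void CastVote(string viewerId, string questionId)
    {
        if (!votes.ContainsKey(questionId)) return;     // unknown option
        if (!voters.TryAdd(viewerId, true)) return;     // duplicate vote
        votes.AddOrUpdate(questionId, 1, (key, count) => count + 1);
    }

    // Snapshot rendered as the real-time on-screen tally.
    public IReadOnlyDictionary<string, double> Percentages()
    {
        int total = 0;
        foreach (var n in votes.Values) total += n;
        var result = new Dictionary<string, double>();
        foreach (var kv in votes)
            result[kv.Key] = total == 0 ? 0 : 100.0 * kv.Value / total;
        return result;
    }
}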

Other immersive elements helped make Facebook Connect feel more tangible. One of the fun things about attending any event in person is taking some swag home with you. Shortly before and after the event, attendees could snag an exclusive Instagram filter that rewarded them with a personalized AR lanyard serving as a memento of the experience. Finally, the event capped off with an exclusive talk from influential game developer John Carmack and an immersive Jaden Smith performance in VR.

Again, these features strive to put attendees “in the now.” A key challenge for digital events is evoking excitement and the feeling of being present in a shared experience. What’s the difference between watching live and watching an on-demand recording? How does the event experience differentiate itself from just another livestream or video call? Brands and event organizers must consider these questions to ensure touchpoints build on excitement, promote a sense of presence and add some exclusivity to the live experience.

Connecting a Cohesive Journey

A final challenge that digital events face is building a cohesive journey across the experience. Brands often rely on external platforms and tools to host their events, with the consumer journey sometimes spread across different environments (for example, registering through a form on one page, accessing the schedule on a different platform and watching the event on a social channel). Brands serve their audiences best by building an events ecosystem that connects the experience–from lead-up to sign-up to aftercare–through a cohesive thread.


The crew worked behind-the-scenes and across borders with impressive setups to ensure things ran smoothly.

While Facebook Connect took place exclusively on Facebook platforms, bouncing between different touchpoints like Oculus Venues, Facebook Groups and AR filters on Instagram could have felt jarring if not done with elegance and skill. An impactful visual identity designed by MediaMonks made for a connected and cohesive journey from start to finish. The visual identity included not only the Facebook Connect logo, but also interstitials, animations, soundscapes and a hub page that helped attendees find what they needed.

Together, these features culminate in an experience that turns digital into a destination, inspiring and drawing together Facebook’s community of developers as they envision the future of technology. Connecting various examples of emerging technology into a cohesive experience, Facebook Connect offers a glimpse of the festival of the future capable of activating communities and strengthening brand-consumer relationships.


Getting Our Hands Dirty with VR Hand Tracking

4 min read

Written by
Labs.Monks


Engrossed in virtual reality, you’re surrounded by fantastic digital objects, each begging you to reach out and touch it. But until recently, most interaction in mainstream VR headsets was limited to using a controller, which for some experiences creates a disconnect between what people feel in their hands and what they see on the screen.

Last month, Oculus released its Hand Tracking SDK for the Oculus Quest, allowing people to use their hands to navigate menus and applications that support the new SDK. While the update isn’t meant to replace controllers outright, it enhances users’ sense of presence within the virtual space by blurring the barriers between real and virtual even further, presenting new creative opportunities for brands eager to offer assistive content in the emerging medium. “Tangibility in digital has always been equated to a click of a mouse or key, but now it’s becoming even more of a physical thing, more like a real experience,” says Geert Eichhorn, Innovation Director at MediaMonks.

This illusion of reality is intriguing for Seth van het Kaar, Unity Monk at MediaMonks. “One thing VR has shown through experience and research is that our eyes override our other senses,” he says. “So, if I appear to be putting my hand in a bucket of cold water in VR, I’ll get the placebo effect of it feeling cold. Through creativity, you can use that to your advantage.”

Monk Thoughts: “Tangibility in digital has always been equated to a click of a mouse or key, but now it’s becoming even more like a real experience.” – Geert Eichhorn

Exploring the creative opportunities presented by the SDK, van het Kaar served as developer on a team of Monks experimenting with hand tracking to develop a working prototype that could take best advantage of the new interface. Here’s what the team learned in the process.

Find Opportunities to Get Hands-On

“Similar to how the development of voice as an interface has prompted brands to emulate human conversation as naturally as possible, we need to make these experiences feel as intuitive as possible, as you’re using your real hands,” says Eichhorn. As part of MediaMonks Labs, our research and innovation team, he’s focused not on using the latest tech for the sake of it, but rather on finding the real-world application and value it has for end users.

Trying to identify what type of experience would best benefit from this new input, the team wondered: what activities are highly dexterous and require careful use of one’s hands? Shaving made sense: “It’s something that’s difficult for young adults and teens who are just learning to use these devices,” says Eichhorn. “And a lot of people still get things wrong, like going against the grain.” It’s also an intriguing use case in that shaving requires an element of precision, putting the usability of hand tracking to the test.


Inspired by clay, the Monk head grows noodles of hair that you can shave and trim.

By practicing grooming in VR using one’s own hands, users would be able to try out different tools and techniques without worrying about messing up their own hair. So, the team took our bald monk mascot and blessed him with a head of hair, inviting Oculus Quest users to give him a shave and a trim in an experience inspired by the Play-Doh “Crazy Cuts” line of toys.
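The production experience isn’t open source, but the Play-Doh-style noodles suggest a simple model: each strand is a chain of points extruded from a follicle, and a trim removes whatever falls on the blade side of the scissors’ cutting plane. Below is a hypothetical Unity sketch of that idea, not the team’s actual code; all names and values are invented.

using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: each "noodle" of hair is a chain of points extruded from a
// follicle on the scalp. Trimming removes every point on the blade side of the
// cutting plane swept by the virtual scissors.
public class HairNoodle : MonoBehaviour
{
    public Vector3 follicleNormal = Vector3.up; // direction the noodle grows in
    public float growSpeed = 0.02f;             // metres per second
    public float maxLength = 0.25f;
    public float segmentSpacing = 0.01f;        // distance between stored points

    private readonly List<Vector3> points = new List<Vector3>();
    private Vector3 tip;

    void Start()
    {
        points.Add(transform.position);         // the root stays fixed to the scalp
        tip = transform.position;
    }

    void Update()
    {
        // Keep extruding the tip outward until the noodle reaches its maximum length.
        if ((points.Count - 1) * segmentSpacing < maxLength)
        {
            tip += follicleNormal.normalized * growSpeed * Time.deltaTime;
            if (Vector3.Distance(tip, points[points.Count - 1]) >= segmentSpacing)
                points.Add(tip);
        }
        // Feed `points` to a LineRenderer or tube mesh here to actually draw the noodle.
    }

    // Called when the virtual scissors close across the noodle.
    public void Trim(Vector3 bladePoint, Vector3 bladeNormal)
    {
        points.RemoveAll(p => Vector3.Dot(p - bladePoint, bladeNormal) > 0f);
        if (points.Count == 0) points.Add(transform.position); // never lose the root
    }
}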

Start with Something Familiar

Interacting with one’s hands is incredibly intuitive; it’s one of the earliest ways we engage with the world as infants. But that doesn’t mean any hand-tracking experience is inherently easier to use or design; experimenting with any new mode of interaction requires breaking free of preconceived notions about design. In the case of hand tracking, how does one organize a series of options within an experience without physical buttons (and, in this case, without haptic feedback)?

To rise above the challenge, the team used common hand gestures, like those in rock/paper/scissors, as a starting point to serve as an intuitive metaphor for interaction. “The Oculus can track the difference between fingertips, so if I mimic scissors with them, that’s a funny interaction,” says van het Kaar. “In the app, you can select the scissors and now you’re like Edward Scissorhands,” the fictional film character whose scissor hands bring him wild success as a hairstylist.
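Because the Hand Tracking SDK exposes the positions of the hand’s joints, a pose like “scissors” can be read with a simple heuristic rather than machine learning: index and middle finger extended, ring and pinky curled. The sketch below is a hypothetical illustration of that heuristic; in practice the fingertip and wrist positions would come from the SDK’s hand skeleton, and the names and thresholds here are invented.

using UnityEngine;

// Hypothetical sketch of a "scissors" pose check: index and middle extended,
// ring and pinky curled. Positions would come from the Hand Tracking SDK's
// hand skeleton; here they are passed in as plain vectors.
public static class ScissorsGesture
{
    const float ExtendedDistance = 0.08f; // tip-to-wrist distance (m) treated as "extended"
    const float CurledDistance   = 0.06f; // tip-to-wrist distance (m) treated as "curled"

    public static bool IsScissors(Vector3 wrist, Vector3 indexTip, Vector3 middleTip,
                                  Vector3 ringTip, Vector3 pinkyTip)
    {
        bool indexExtended  = Vector3.Distance(indexTip,  wrist) > ExtendedDistance;
        bool middleExtended = Vector3.Distance(middleTip, wrist) > ExtendedDistance;
        bool ringCurled     = Vector3.Distance(ringTip,   wrist) < CurledDistance;
        bool pinkyCurled    = Vector3.Distance(pinkyTip,  wrist) < CurledDistance;
        return indexExtended && middleExtended && ringCurled && pinkyCurled;
    }
}

A real implementation would likely also require the pose to hold for a handful of frames before triggering, since tracking can flicker when fingers briefly occlude one another.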


Move Beyond Limitations and Creative Constraint

In its experiments with the SDK, the team settled on a couple of learnings that could apply to subsequent hand-activated Oculus Quest experiences. First, there’s moving past a challenge felt in any VR environment: locomotion, or the relationship and (de)synchronization between one’s bodily movements and those of one’s virtual avatar.

Without haptic feedback, what should happen when the user’s hand comes into contact with a virtual object: should the hand move through the object, or should the object block its movement much like it would in reality? While the latter option might make sense on paper, the fact that users could still move their physical hand while the virtual one stays stationary could result in confusion. The team moved beyond the challenge by letting users push virtual objects freely (for example, the monk model they shave), with the objects snapping back into place once released, which sounds like a fun interaction in its own right.
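In Unity terms, that behaviour can be as simple as letting contact displace the object and easing it home once the hand lets go. A hypothetical sketch of the snap-back idea (not the team’s actual code) follows.

using UnityEngine;

// Hypothetical sketch of the snap-back behaviour described above: while a hand is
// pushing the object it follows the push freely, and once released it eases back
// to its rest pose instead of blocking the user's real hand.
public class SnapBackObject : MonoBehaviour
{
    public float returnSpeed = 6f;            // higher = snappier return
    private Vector3 restPosition;
    private Quaternion restRotation;
    private bool beingPushed;

    void Awake()
    {
        restPosition = transform.position;
        restRotation = transform.rotation;
    }

    // Called by whatever detects hand/object contact (e.g. a trigger collider).
    public void SetPushed(bool pushed) => beingPushed = pushed;

    void Update()
    {
        if (beingPushed) return; // the hand is free to displace the object
        float t = Time.deltaTime * returnSpeed;
        transform.position = Vector3.Lerp(transform.position, restPosition, t);
        transform.rotation = Quaternion.Slerp(transform.rotation, restRotation, t);
    }
}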

Monk Thoughts: “We need to make these experiences feel as intuitive as possible, as you’re using your real hands.” – Geert Eichhorn

The way the Hand Tracking SDK detects hands also presented a challenge: it seeks out the shape of a hand against a background, so it loses tracking once the two hands overlap. “You can’t place a menu on the palm of your hand and tap an option on it, or interact with a virtual object on your wrist, for example,” says van het Kaar. To work around this, a menu floats beside the user’s hand. While this doesn’t allow for the tactile feedback of selecting options against one’s own body, it mitigates the risk of losing tracking when the hands overlap.
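The resulting pattern is a menu that hovers at a fixed offset beside the tracked hand, turns to face the user and disappears whenever tracking is lost, so the two hands never need to cross. Here is a hypothetical Unity sketch of that layout logic; the actual tracking-confidence check would come from the SDK, and all names are invented.

using UnityEngine;

// Hypothetical sketch of the floating menu: it hovers at a fixed offset beside the
// tracked hand (never on top of it, so the hands don't have to overlap), faces the
// user, and hides whenever hand tracking is lost.
public class FloatingHandMenu : MonoBehaviour
{
    public Transform handAnchor;      // tracked hand pose supplied by the SDK
    public Transform headAnchor;      // the user's camera / centre-eye transform
    public GameObject menuRoot;       // the panel with the selectable tools
    public Vector3 offset = new Vector3(0.15f, 0.05f, 0f); // beside, not on, the hand

    // Set each frame from the SDK's hand-tracking confidence (assumed available).
    public bool HandIsTracked { get; set; }

    void LateUpdate()
    {
        menuRoot.SetActive(HandIsTracked);
        if (!HandIsTracked) return;

        // Place the menu beside the hand and keep it facing the user.
        menuRoot.transform.position = handAnchor.position + handAnchor.rotation * offset;
        menuRoot.transform.rotation =
            Quaternion.LookRotation(menuRoot.transform.position - headAnchor.position);
    }
}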

Taking the time to experiment and apply these learnings allows us to develop increasingly realistic experiences in extended reality. From playing with hand tracking in VR to demonstrating how occlusion transforms experiences in AR, our team of makers is devoted to continually experimenting with new technologies, finding their most relevant use cases and establishing best practices for brands and our partners. As barriers continue to break down between the physical and virtual, it will be exciting to see what kinds of wholly new digital experiences emerge.

