the Libre Virtual Reality Meeting is a gathering of people interested in the development of free/libre tools and resources for creating Virtual and Augmented Reality experiences

The 2022 edition is hosted at the Université Libre de Bruxelles by the LISA laboratory, on the 6th of October, 16:30-22:00 (Brussels time)

in direct connection with the FOSSXR.dev conference

supported by educode.be

Program

A-Frame workshop

by Fabien Benetou

A hands-on session on how to quickly build an XR scene on the Web
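
For a flavour of what the workshop covers, here is a minimal TypeScript sketch (assuming the A-Frame library is already loaded on the page; the scene content is purely illustrative) that builds a small scene programmatically:

    // Minimal sketch: build a simple A-Frame scene from script.
    // Assumes A-Frame (e.g. aframe.min.js) is already loaded so the
    // <a-scene> and <a-box> custom elements are registered.
    const scene = document.createElement('a-scene');

    // A coloured box placed in front of and slightly to the left of the viewer.
    const box = document.createElement('a-box');
    box.setAttribute('position', '-1 0.5 -3');
    box.setAttribute('rotation', '0 45 0');
    box.setAttribute('color', '#4CC3D9');
    scene.appendChild(box);

    // A ground plane so the scene has a floor.
    const ground = document.createElement('a-plane');
    ground.setAttribute('position', '0 0 -4');
    ground.setAttribute('rotation', '-90 0 0');
    ground.setAttribute('width', '6');
    ground.setAttribute('height', '6');
    ground.setAttribute('color', '#7BC8A4');
    scene.appendChild(ground);

    document.body.appendChild(scene);

The same scene can also be written directly as HTML markup, which is the simpler starting point the workshop is likely to use.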

16:30

A distributed multiuser metaverse server infrastructure, connected by the fediverse protocol.

By William Murphy - Immers Space

Be connected; not confined. Immers Space makes self-hosted libre software that connects independent immersive experiences using open Web standards. With portable identity via OAuth 2.0 & OpenID Connect, a single account can be used across all different sites, providing consistent identity and giving people control over their data. With federated messaging using ActivityPub, people from different worlds and hosts can freely communicate without platform lock-in. Together, these enable a decentralized friends list where you can connect with people from many different virtual worlds, see their current locations in the metaverse when they're online, and join them instantly in live, shared experiences with the click of a link.

We'll talk about how Immers Space works, show some examples of how it is being used now, and demonstrate how you can connect your project to the metaverse with just one line of code.
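
To make the federation idea concrete, here is a hedged TypeScript sketch of plain ActivityPub messaging, the open Web standard Immers Space builds on; the actor URLs and outbox endpoint are hypothetical, and this is not Immers Space's client API (the actual one-line integration is shown in the talk):

    // Illustrative only: a standard ActivityPub "Follow" activity posted to a
    // (hypothetical) actor's outbox. Federated friends lists and presence are
    // built on top of this kind of server-to-server messaging.
    const follow = {
      '@context': 'https://www.w3.org/ns/activitystreams',
      type: 'Follow',
      actor: 'https://example-immer.org/u/alice',   // hypothetical actor
      object: 'https://another-immer.org/u/bob',    // hypothetical actor on another host
    };

    async function sendFollow(): Promise<void> {
      // In a real deployment the request would carry the OAuth 2.0 / OpenID Connect
      // token obtained when the user signed in with their portable identity.
      const response = await fetch('https://example-immer.org/u/alice/outbox', {
        method: 'POST',
        headers: { 'Content-Type': 'application/activity+json' },
        body: JSON.stringify(follow),
      });
      console.log('outbox accepted activity with status', response.status);
    }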

17:30

Get the slides from the talk

The state of xrdesktop on SoCs

By Christoph Haag, Collabora

xrdesktop aims to provide an XR desktop experience not just on typical "VR ready" PCs but also on smaller systems. What does it take to run xrdesktop's wxrd standalone client, or its gnome-shell or kwin integration, on an NVIDIA Jetson board? On a Raspberry Pi? After an overview of how xrdesktop brings desktop windows into XR, this talk will present the solved and still-open challenges of running xrdesktop on SoCs and other small systems, from performance considerations to GPU drivers.

18:00

WebXR, what's new since FOSSXR 2019, metaverse and more

by Fabien Benetou

Since FOSSXR 2019, both XR and the Web have evolved. The latest buzzword, introduced in late 2021, seems like an intangible abstraction. What actually is the metaverse, and why is the Web the perfect place for it to start? This talk will clarify what has changed since 2019, in both software and hardware, with a specific focus on FLOSS efforts from all teams. The goal is to highlight gaps in the ecosystem for anybody who wants to both visit and build the metaverse thanks to FLOSS components. It is possible, and it can be done while keeping freedom in mind. Practically speaking, we will touch on the WebXR specifications, the current browsers available per device, and the networking stack that allows independent XR experiences to interconnect, so that participants can build independently while still letting users bring their own avatars, and more.
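
As a concrete reference point for the WebXR part of the talk, a feature check with the WebXR Device API looks roughly like the sketch below (TypeScript, assuming WebXR type definitions such as @types/webxr are available; requestSession must be triggered from a user gesture):

    // Minimal WebXR Device API sketch: detect support and start an immersive session.
    async function enterVR(): Promise<void> {
      if (!navigator.xr) {
        console.log('WebXR is not available in this browser');
        return;
      }
      const supported = await navigator.xr.isSessionSupported('immersive-vr');
      if (!supported) {
        console.log('immersive-vr sessions are not supported on this device');
        return;
      }
      // Must be called from a user gesture, e.g. a click on an "Enter VR" button.
      const session = await navigator.xr.requestSession('immersive-vr');
      session.addEventListener('end', () => console.log('XR session ended'));
    }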

18:35

Panel discussion: GDPXR: the metaverse and personal data, or why FLOSS is key to respecting user privacy and current or upcoming privacy laws

Intro by Olivier Meunier, ozmovr.eu
& Fabien Benetou

Presentation - context: in Europe, the GDPR has been in place since 2016, with enforcement since 2018 (a two-year grace period). In 2020, the EU Court of Justice issued a ruling that scrapped the "safe harbor" agreement, challenging the use of US tech services in the countries of the Union. In this context, and with other privacy-protecting laws likely coming in the US and elsewhere, the development of VR technologies needs to take data sharing very seriously. FLOSS tools are particularly well suited to this situation, with transparency, auditability and interoperability at their core.

Discussion: how can we ensure that data privacy is safeguarded through interoperable platforms and tools? What to do now that the "safe harbor" is down? How to build "the metaverse" while respecting user privacy?

19:15

Plenoptic Imaging: Capture to Display and Use Cases

by Prof. Mehrdad Teratani, ULB

Realistic visualization of content captured by 3D imaging systems has recently attracted much attention. One of the means of capturing a 3D scene is a plenoptic camera, which consists of an array of micro-lenses between the main lens and the image sensor. Such an optical configuration allows capturing 3D content with a single camera. Content captured by such cameras fits well with existing 3D display technologies. Plenoptic imaging technologies have found applications in several scenarios, such as storage- or communication-based systems. This talk will focus on potential plenoptic imaging systems and use cases, and will give a summary of the related technologies: 3D acquisition, processing, and display.
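
As background (a standard light-field result, not taken from the abstract): a plenoptic camera samples the 4D light field in the two-plane parameterization,

    L = L(u, v, s, t)

where (u, v) indexes the position on the main-lens (aperture) plane and (s, t) the position on the sensor behind the micro-lenses; refocusing and depth estimation then amount to integrating or shearing this 4D function.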

20:00

Stream:

Live from the Auditorium:

join the chat on Matrix

Practical:

Free access: from 16:00 until 22:00, Brussels time

in Auditorium B1.315 - Avenue Franklin Roosevelt 50, 1050 Bruxelles.

Access: SNCB: Etterbeek station, then Tram 25,
or Boondael station, then Tram 25 or 8
Metro: Line 1 - 'Delta' station, then Bus 71 or 72
Bus: 71 or 72 - 'ULB' stop, or 95 - 'Cimetière d'Ixelles' stop
Tram: 25 or 8 - 'ULB' stop

contact.lvrm [at] vrse.be