A Mixed Reality (MR) simulation for volumetric analysis, providing a gamified, interactive laboratory experience for students and educators. It allows users to interact with digital lab equipment in a real-world setting, offering a safe, cost-effective, and engaging way to conduct chemical experiments. The core features include interaction with apparatus, a wobbly liquid that changes color, interactive UIs, and a feedback system.
This was the Week 1 prototyping-phase submission; the idea was selected by our mentors from among 6 other ideas.
Conducting chemical experiments in reality involves risks due to the handling of potentially hazardous substances and can incur high costs for materials. This MR simulation provides a safe platform where students can repeat experiments multiple times without wasting real chemicals or risking accidents. Additionally, it gamifies the learning process, increasing engagement and motivation, and makes the experiment accessible to a broader audience, particularly beneficial for institutions with limited resources or students unable to participate in traditional lab environments.
I created everything in a week.
Meta SDKs (MRUK, Scene Understanding, Grab interactions), Blender, and Figma.
This was my first time implementing the Meta SDKs. Coming from Unity's XR Interaction Toolkit (XRI), the philosophy felt a bit different: everything is broken down into a component, no matter how custom it is. Behavior that you would customize in the XRI Toolkit through inheritance gets its own dedicated component here.
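To make the contrast concrete, here is a rough sketch, not code from this project. The XRI side uses the real `XRGrabInteractable` base class; the Meta side uses `PointableUnityEventWrapper` from the Interaction SDK's `Oculus.Interaction` namespace, whose event signatures vary between SDK versions, so treat the exact wiring as approximate:

```csharp
using Oculus.Interaction;
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// XRI style: customize grab behaviour by inheriting from the interactable class.
public class GlowOnGrabXRI : XRGrabInteractable
{
    protected override void OnSelectEntered(SelectEnterEventArgs args)
    {
        base.OnSelectEntered(args);
        // custom behaviour here, e.g. highlight the grabbed flask
    }
}

// Meta Interaction SDK style: no inheritance. You compose stock components on
// the GameObject (e.g. Grabbable + HandGrabInteractable) and attach your custom
// logic as a separate listener component.
public class GlowOnGrabMeta : MonoBehaviour
{
    // Wraps the interaction's pointer events as UnityEvents (set in Inspector).
    [SerializeField] private PointableUnityEventWrapper _events;

    private void OnEnable()
    {
        // Event payload types differ across SDK versions; adapt as needed.
        _events.WhenSelect.AddListener(_ => Highlight());
    }

    private void Highlight()
    {
        // highlight the grabbed flask
    }
}
```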
There are also things called Building Blocks, which help you rapidly set up interactions and behaviours by drag and drop. Just imagine: you can assemble a fully working Mixed Reality app purely by dragging and dropping.
In this app I used Scene Understanding, Grab interactions, Unity Events, and a few other features.
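As a loose sketch of how those pieces can fit together, and not code from this project: `RegisterSceneLoadedCallback`, `GetCurrentRoom`, and `MRUKRoom` are real MRUK API names, but the prefab, event, and placement logic below are hypothetical, and anchor queries differ across MRUK versions:

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;
using UnityEngine.Events;

public class LabBenchSpawner : MonoBehaviour
{
    [SerializeField] private GameObject labBenchPrefab; // hypothetical prefab
    public UnityEvent onLabReady;                       // wired up in the Inspector

    private void Start()
    {
        // MRUK invokes this once the room scan (Scene Understanding data) is loaded.
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    private void OnSceneLoaded()
    {
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();

        // Place the virtual lab bench relative to the scanned room; a real
        // implementation would query the room's surface anchors instead.
        Instantiate(labBenchPrefab, room.transform.position, Quaternion.identity);

        // Notify the rest of the app (UI, feedback system) via Unity Events.
        onLabReady.Invoke();
    }
}
```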
One of my colleagues had started creating a VR version of it, so I plan to share this project with her so that she can merge the two.
Supports Meta Quest 2 and Quest 3; tested on Quest 3.