Six papers to add to your reading list from AI researchers at Meta at #CVPR2024.
• PlatoNeRF: Discerning Reality in Plato's Cave from Single-View Two Bounce Time of Flight ➡️ https://go.fb.me/tju5fo
• Nymeria: A Massive Collection of Multimodal Egocentric Daily Motion in the Wild ➡️ https://go.fb.me/0wcu84
• Relightable Gaussian Codec Avatars ➡️ https://go.fb.me/gdtkjm
• URHand: Universal Relightable Hands ➡️ https://go.fb.me/1lmv7o
• RoHM: Robust Human Motion Reconstruction via Diffusion ➡️ https://go.fb.me/ogm92y
• HybridNeRF: Efficient Neural Rendering via Adaptive Volumetric Surfaces ➡️ https://go.fb.me/tzik3j
I do not believe that we live in a simulation but this work is making that belief more brittle 😅🥽🌎
Can we figure out 3D movement and activity detection using only IMUs and a SMALL Motion Model? We’ve got to bring real-time processing on device. Thoughts?
Is "Hashtag" a different program, or was it a typo in the copy? 🤔 P.S. Nymeria and RoHM look promising.
NeRFs are here to stay!
Refreshing to see something other than LLM BS coming out of Google.
NeRF and Gaussian splatting technology is simply mind-blowing. It's here to stay and will change the way we interact with the objects and environments around us. The entertainment industry is about to see a huge shift!
Added to weekend learning 🔥
got a lot to learn about AI
Very informative
This sheds new light on the topic!