Bridging the Gap between
Real World and Simulation
Perception and Action
"You shouldn't play video games all day, so shouldn't your AI." Gibson is a virtual environment built with real world complexity, using RGBD scanning.
"We must perceive in order to move, but we must also move in order to perceive." --- James J. Gibson Gibson enables training AI algorithms that explores both perception and action.
Use our open-source platform to explore active perception. If you find it interesting, we encourage you to read about how we made it.
Up to 1000 RGBD models of indoor scenes, collected with an RGBD camera. State-of-the-art model complexity.
Learn about our deep view synthesis pipeline, physics integration, and experimental results in the Gibson Environment.
We propose a domain adaptation mechanism called "Goggle", which ensures that results in Gibson transfer to the real world.
Call To Action
Participate in our real-environment challenge! To facilitate the next generation of powerful perceptive algorithms, we propose the Gibson Challenge: a set of predefined tasks including navigation, object retrieval, collision avoidance, and locomotion. Try them out with your algorithms.
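The observe-then-act cycle behind these tasks can be sketched as a minimal loop. This is a hypothetical toy, not Gibson's actual API: `StubNavEnv` and `random_policy` are stand-in names, and the observation here is a single number rather than the rendered RGBD frame Gibson would provide.

```python
# A minimal, hypothetical sketch of a gym-style perception-action loop.
# StubNavEnv and random_policy are illustrative stand-ins, NOT part of
# Gibson's real API; Gibson's observations would be rendered RGBD frames.
import random

class StubNavEnv:
    """Toy 1-D navigation task: reach position `goal` starting from 0."""
    def __init__(self, goal=5):
        self.goal = goal
        self.pos = 0

    def reset(self):
        self.pos = 0
        return self._observe()

    def step(self, action):  # action is -1 or +1
        self.pos += action
        done = self.pos == self.goal
        reward = 1.0 if done else -0.01  # small step penalty
        return self._observe(), reward, done

    def _observe(self):
        # Stand-in observation: signed distance to the goal.
        return self.goal - self.pos

def random_policy(obs):
    # Placeholder agent: a learned policy would map observations to actions.
    return random.choice([-1, 1])

env = StubNavEnv()
obs = env.reset()
for t in range(1000):
    obs, reward, done = env.step(random_policy(obs))
    if done:
        break
```

The point of the loop is the coupling Gibson emphasizes: each action changes what the agent perceives next, so perception and action are trained together rather than in isolation.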