Gibson Environment

For Embodied Real-World Active Perception

(page under construction -- Gibson Environment will be released in February 2018)

Get Started | View on GitHub

Overview


Bridging the Gap between

Real World and Simulation

Perception and Action

"You shouldn't play video games all day, and neither should your AI." Gibson is a virtual environment built from RGBD scans of real spaces, preserving real-world complexity.

"We must perceive in order to move, but we must also move in order to perceive." --- James J. Gibson. Gibson enables training AI agents that learn both perception and action.

Get Started

Use our open-source platform to explore active perception. If you find it interesting, we encourage you to read about how we built it.

Gibson Platform

Gibson Environment for real-world perception learning, open-sourced on GitHub. Check it out, deploy it, and start training perceptual agents.
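Training a perceptual agent in an environment like Gibson follows the standard perception-action loop: the agent receives an observation (e.g., an RGBD frame), a policy chooses an action, and the environment returns the next observation and a reward. Below is a minimal sketch of that loop. The `ToyEnv` class is a hypothetical stand-in used only to illustrate the pattern, not Gibson's actual API; see the GitHub repository for real usage.

```python
import random

class ToyEnv:
    """Hypothetical stand-in environment; observations carry an 'rgbd' field."""
    def __init__(self, episode_len=10):
        self.episode_len = episode_len
        self.t = 0

    def reset(self):
        """Start a new episode and return the first observation."""
        self.t = 0
        return {"rgbd": [0.0] * 4}  # placeholder for an RGBD frame

    def step(self, action):
        """Advance one timestep; return (obs, reward, done, info)."""
        self.t += 1
        obs = {"rgbd": [random.random() for _ in range(4)]}
        reward = 1.0 if action == "forward" else 0.0
        done = self.t >= self.episode_len
        return obs, reward, done, {}

def run_episode(env, policy):
    """Roll out one episode: perceive (obs) -> act (policy) -> repeat."""
    obs, total, done = env.reset(), 0.0, False
    while not done:
        action = policy(obs)          # act based on what was perceived
        obs, reward, done, _ = env.step(action)
        total += reward
    return total

# A trivial policy that always drives forward.
print(run_episode(ToyEnv(), lambda obs: "forward"))  # prints 10.0
```

The same reset/step structure is what gym-style environments expose, so a policy written against this loop transfers directly to richer simulators.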

Model Database

Up to 1000 models of indoor scenes, reconstructed from RGBD camera scans. State-of-the-art model complexity.

How We Did it

Learn about our deep view synthesis pipeline, physics integration, and experimental results in the Gibson Environment.

Domain Adaptation

We propose a domain adaptation mechanism called "Goggle", which helps ensure that results in Gibson transfer to the real world.