
Gym-UnrealCV: Realistic virtual worlds for visual reinforcement learning

Introduction

This project integrates Unreal Engine with OpenAI Gym, via UnrealCV, for visual reinforcement learning. It lets you run (multi-agent) reinforcement learning algorithms in a variety of realistic UE4 environments without requiring any knowledge of Unreal Engine or UnrealCV.

A number of environments have been released for robotic vision tasks, including active object tracking, object searching, and robot arm control.

[Demo GIFs: Tracking in UrbanCity with distractors | Tracking in Garden | Tracking in SnowForest | Tracking in Garage with distractors | Searching in RealisticRoom | Robot Arm Control]

The framework of this project is shown below: [framework diagram]

  • UnrealCV is the basic bridge between Unreal Engine and OpenAI Gym.
  • OpenAI Gym is a toolkit for developing RL algorithms, compatible with most numerical computation libraries, such as TensorFlow and PyTorch.

Installation

Dependencies

  • UnrealCV
  • Gym
  • OpenCV (cv2)
  • Matplotlib
  • NumPy
  • Docker (optional)
  • Nvidia-Docker (optional)

We recommend using Anaconda to install and manage your Python environment. OpenCV is used for image processing, such as extracting object masks and bounding boxes; Matplotlib is used for visualization.
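
For example, you can create and activate a dedicated environment first (the environment name and Python version below are only suggestions):

conda create -n gym_unrealcv python=3.7
conda activate gym_unrealcv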

Install Gym-UnrealCV

Installing gym-unrealcv is straightforward; just run:

git clone https://github.com/zfw1226/gym-unrealcv.git
cd gym-unrealcv
pip install -e . 

Installing gym-unrealcv also installs its dependencies, including OpenAI Gym, unrealcv, numpy, and matplotlib. OpenCV must be installed separately. If you use Anaconda, you can run

conda update conda
conda install --channel menpo opencv

or

pip install opencv-python
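
To verify the installation, you can check that importing the package registers the Unreal environments with Gym. The snippet below is a minimal sketch and assumes the classic Gym registry API (gym.envs.registry.all()); newer Gym/Gymnasium versions expose the registry differently.

import gym
import gym_unrealcv  # importing the package registers the Unreal environments

# List the registered Unreal environment IDs (classic Gym registry API)
unreal_ids = [spec.id for spec in gym.envs.registry.all() if 'Unreal' in spec.id]
print('%d Unreal environments registered, e.g. %s' % (len(unreal_ids), unreal_ids[:3]))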

Prepare Unreal Binary

Before running the environments, you need to prepare the Unreal binaries. You can download them from the cloud by running load_env.py:

python load_env.py -e {ENV_NAME}

ENV_NAME can be RealisticRoom, RandomRoom, Arm, etc. The script automatically downloads the corresponding environment binary to the UnrealEnv directory.

Please refer to binary_list in load_env.py for more available example environments.
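
For example, to fetch the binary used by the object-searching demo below, you can run (assuming RealisticRoom is the corresponding entry in binary_list):

python load_env.py -e RealisticRoom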

Usage

1. Run a Random Agent

Once gym-unrealcv is installed successfully, you can watch an agent walking randomly in first-person view, trying to find a door, by running:

cd example/random
python random_agent.py -e UnrealSearch-RealisticRoomDoor-DiscreteColor-v0

If all goes well, the pre-defined gym environment UnrealSearch-RealisticRoomDoor-DiscreteColor-v0 will be launched, and you will see the agent moving around the room at random.

The pre-defined environments for object searching and active object tracking are listed on this page.
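
For reference, the core of such a random agent is just the standard Gym interaction loop. The sketch below is illustrative and assumes the classic Gym API with a four-value step return; the actual example/random/random_agent.py may differ in its details.

import gym
import gym_unrealcv  # importing the package registers the Unreal environments

env = gym.make('UnrealSearch-RealisticRoomDoor-DiscreteColor-v0')
obs = env.reset()
done = False
while not done:
    action = env.action_space.sample()          # sample a random action
    obs, reward, done, info = env.step(action)  # advance the environment one step
env.close()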

2. Learning RL Agents

To demonstrate how to train an agent in gym-unrealcv, we provide DQN (Keras) and DDPG (Keras) examples in ./example.

Moreover, you can also refer to some recent projects for more advanced usages, as follows:

  • craves_control provides an example for learning to control a robot arm via DDPG (PyTorch).
  • active_tracking_rl provides examples for learning active visual tracking via A3C (PyTorch). The training framework can be used for single-agent RL, adversarial RL, and multi-agent games.
  • pose-assisted-collaboration provides an example for learning multi-agent collaboration via A3C (PyTorch) in environments with multiple PTZ cameras and a single target.
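
Whichever algorithm you choose, training code plugs into the environment through the same interaction loop as the random agent above. The skeleton below is a framework-agnostic sketch; the RandomAgent placeholder is hypothetical and only marks where a real algorithm's action selection and parameter updates (e.g. DQN or DDPG) would go.

import gym
import gym_unrealcv  # importing the package registers the Unreal environments


class RandomAgent(object):
    """Placeholder agent: replace act/learn with a real learning algorithm."""

    def __init__(self, action_space):
        self.action_space = action_space

    def act(self, obs):
        return self.action_space.sample()  # a trained policy would act on obs here

    def learn(self, obs, action, reward, next_obs, done):
        pass  # a learning agent would update its parameters here


env = gym.make('UnrealSearch-RealisticRoomDoor-DiscreteColor-v0')
agent = RandomAgent(env.action_space)

for episode in range(10):  # the number of episodes here is arbitrary
    obs, done = env.reset(), False
    while not done:
        action = agent.act(obs)
        next_obs, reward, done, info = env.step(action)
        agent.learn(obs, action, reward, next_obs, done)
        obs = next_obs
env.close()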

Customize an Environment

We provide a set of tutorials to help you get started with Gym-UnrealCV.

1. Modify the pre-defined environment

You can follow the modify_env_tutorial to modify the configuration of the pre-defined environment.

2. Add a new Unreal environment

You can follow the add_new_env_tutorial to add a new Unreal environment for your RL task.

Papers Using Gym-UnrealCV

🎉 Please feel free to open a pull request or an issue to add papers.

Cite

If you use Gym-UnrealCV in your academic research, we would be grateful if you could cite it as follows:

@misc{gymunrealcv2017,
    author = {Fangwei Zhong and Weichao Qiu and Tingyun Yan and Alan Yuille and Yizhou Wang},
    title = {Gym-UnrealCV: Realistic virtual worlds for visual reinforcement learning},
    howpublished = {Web Page},
    url = {https://github.com/unrealcv/gym-unrealcv},
    year = {2017}
}

Contact

If you have any suggestions or are interested in using Gym-UnrealCV, get in touch at zfw1226 [at] gmail [dot] com.
