GSoC/GCI Archive
Google Summer of Code 2015

JdeRobot - Universidad Rey Juan Carlos

License: GNU General Public License version 3.0 (GPLv3)

Web Page: http://jderobot.org/Collaborate#Ideas_List

Mailing List: jde-developers@gsyc.es

JdeRobot is a software development suite for robotics, home-automation and computer vision applications. These domains include sensors (for instance, cameras), actuators, and intelligent software in between. It has been designed to help in programming such intelligent software. It is mainly written in C++ and provides a distributed component-based programming environment where the application program is made up of a collection of concurrent, asynchronous components. Components may run on different computers and are connected using the ICE communication middleware. Components may be written in C++, Python, Java, etc., and all of them interoperate through explicit ICE interfaces.
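
As a rough illustration of this component model, the sketch below shows how a Python client could obtain a proxy to a remote component through ICE and invoke one of its interface methods. The endpoint string, the jderobot module and the CameraPrx/getImageData names are assumptions made for illustration, not necessarily the exact interfaces shipped with JdeRobot.

    # Minimal sketch of an ICE client talking to a JdeRobot-style component.
    # The proxy string, module name and interface names are illustrative
    # assumptions, not the exact interfaces shipped with JdeRobot.
    import sys
    import Ice
    import jderobot  # assumed module generated from the project's Slice files

    ic = Ice.initialize(sys.argv)
    try:
        # Resolve the remote component (host and port are just examples).
        base = ic.stringToProxy("CameraA:default -h localhost -p 9999")
        camera = jderobot.CameraPrx.checkedCast(base)
        if camera is None:
            raise RuntimeError("Invalid Camera proxy")
        # Remote invocation through the explicit ICE interface; from the
        # caller's point of view it looks like a local call.
        image = camera.getImageData()  # assumed interface method
        print("Received one frame from the camera component")
    finally:
        ic.destroy()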

JdeRobot simplifies access to hardware devices from the control program. Getting sensor measurements is as simple as calling a local function, and ordering motor commands is as easy as calling another local function. The platform maps those calls to remote invocations on the components connected to the sensor or actuator devices. These can be real or simulated sensors and actuators, connected either locally or remotely over the network. Those functions make up the API of the Hardware Abstraction Layer; the robotic application gets its sensor readings and issues its actuator commands through it to unfold its behavior (a usage sketch follows the device list below). Several driver components have been developed to support different physical sensors, actuators and simulators. The drivers run as components that can be installed at will depending on your configuration, and they are included in the official release. Currently supported robots and devices:

  • RGBD sensors: Kinect from Microsoft, Asus Xtion
  • Pioneer robot from MobileRobots Inc.
  • Kobuki robot (TurtleBot) from Yujin Robot
  • Nao humanoid from Aldebaran
  • AR.Drone quadrotor from Parrot
  • Firewire cameras, USB cameras, video files (mpeg, avi...), IP cameras (like Axis)
  • Pantilt unit PTU-D46 from Directed Perception Inc.
  • Laser Scanners: LMS from SICK and URG from Hokuyo
  • EVI PTZ camera from Sony
  • Gazebo and Stage simulators
  • Wiimote
  • X10 home automation devices
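
As a sketch of what the Hardware Abstraction Layer is meant to feel like from the application side, the loop below reads a laser scan and orders motor commands through proxies obtained as in the previous sketch. The LaserPrx/MotorsPrx names and the getLaserData/setV/setW methods are assumptions used only to illustrate the pattern, not the definitive JdeRobot interfaces.

    # Sketch of a control loop on top of the Hardware Abstraction Layer.
    # Proxy and method names (LaserPrx, MotorsPrx, getLaserData, setV, setW)
    # are illustrative assumptions, not the exact JdeRobot interfaces.
    import sys
    import time
    import Ice
    import jderobot  # assumed generated module

    ic = Ice.initialize(sys.argv)
    try:
        laser = jderobot.LaserPrx.checkedCast(
            ic.stringToProxy("Laser:default -h localhost -p 9998"))
        motors = jderobot.MotorsPrx.checkedCast(
            ic.stringToProxy("Motors:default -h localhost -p 9997"))

        for _ in range(100):
            scan = laser.getLaserData()  # sensor reading, looks like a local call
            # Trivial reactive behavior: stop if something is closer than 1 m.
            nearest = min(scan.distanceData) if scan.distanceData else float("inf")
            motors.setV(0.2 if nearest > 1.0 else 0.0)  # linear speed (m/s)
            motors.setW(0.0)                            # angular speed (rad/s)
            time.sleep(0.1)
    finally:
        ic.destroy()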

JdeRobot includes several robot programming tools and libraries. First, viewers and teleoperators for several robots, their sensors and motors. Second, a camera calibration component and a tuning tool for color filters. Third, the VisualHFSM tool for programming robot behavior using hierarchical finite state machines. It includes many sample components using OpenCV, PCL, OpenGL, etc. In addition, it also provides a library to develop fuzzy controllers, a library for projective geometry and some computer vision processing.
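
To give an idea of the kind of behavior VisualHFSM targets, the fragment below hand-codes a tiny flat state machine with hypothetical states and transitions; VisualHFSM lets such machines, including nested (hierarchical) ones, be designed graphically instead of written by hand.

    # Conceptual sketch of a state machine like those built with VisualHFSM.
    # States and transition conditions are hypothetical.
    state = "SEARCH"

    def step(obstacle_ahead):
        """Run one iteration of the behavior and return the next state."""
        global state
        if state == "SEARCH" and obstacle_ahead:
            state = "AVOID"    # obstacle detected, switch behavior
        elif state == "AVOID" and not obstacle_ahead:
            state = "SEARCH"   # path clear again
        return state

    # Example run with fake sensor readings.
    for reading in [False, False, True, True, False]:
        print(step(obstacle_ahead=reading))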

Each component may have its own independent Graphical User Interface or none at all. Currently, GTK and Qt libraries are supported, and several examples of OpenGL for 3D graphics with both libraries are included.

JdeRobot is open-source software, licensed under GPL and LGPL. It also uses third-party software such as the Gazebo simulator, OpenGL, GTK, Qt, Player, Stage, GSL, OpenCV, PCL, Eigen and Ogre.

JdeRobot is a project developed by the Robotics Group of Universidad Rey Juan Carlos (Madrid, Spain).

Projects

  • Interconnection between JdeRobot and ROS: A proposal for the "Interconnection with ROS" project for the JdeRobot organization.
  • Interconnection of JdeRobot with Android Wear: Android Wear devices were the best-selling wearable gadgets last year. With the introduction of ICE for Android, Android devices can communicate with JdeRobot components through ICE interfaces, allowing them to be part of distributed systems. This project extends that idea to connect a physical Android Wear device with JdeRobot and use it for the remote control of aerial and ground vehicles.
  • Structure reconstruction using the Kinect2 device: Two core components will be implemented. The first is Visual Odometry, responsible for tracking the Kinect to get an accurate pose for every frame (including loop closure). The second is Dense Mapping, in which all the depth and color frames are aligned to a global coordinate frame, using a pose graph (possibly TORO) to achieve the best results through optimization. Once the point cloud reconstruction is done, meshing and other post-processing can be applied to obtain high-quality models (a minimal sketch of the alignment step follows below).
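
A minimal sketch of the dense-mapping idea described in the last proposal: given an estimated 4x4 camera-to-world pose for each frame (the output of visual odometry), every frame's points are transformed into one global coordinate frame and accumulated. Pose-graph optimization and meshing are omitted; NumPy and the toy data are assumptions made for illustration.

    # Sketch of aligning per-frame point clouds into a global map using
    # estimated camera poses (the core of the dense-mapping step).
    import numpy as np

    def to_global(points_cam, pose):
        """points_cam: (N, 3) points in the camera frame.
        pose: (4, 4) camera-to-world transform from visual odometry."""
        homogeneous = np.hstack([points_cam, np.ones((points_cam.shape[0], 1))])
        return (pose @ homogeneous.T).T[:, :3]

    # Toy example: two frames, second camera translated 0.5 m along x.
    frame0 = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 1.2]])
    frame1 = np.array([[0.0, 0.0, 1.0]])
    pose0 = np.eye(4)
    pose1 = np.eye(4)
    pose1[0, 3] = 0.5

    global_map = np.vstack([to_global(frame0, pose0), to_global(frame1, pose1)])
    print(global_map)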