Development and testing are now complete for the Relay Server software addition, which allows remote operation of Oculus Prime mobile robots connected to the internet via mobile 3G/4G/LTE networks. It also allows operation behind NAT firewalls, or on any network where the ports necessary for general internet remote access can't be configured and forwarded.
All that is required is to run an instance of the Oculusprime Server application on a device connected to an unconstrained network. This will act as the ‘relay’ server, which you then configure the robot to connect to. When you want to remotely connect to the robot, you connect to the relay server instead, which relays commands and video from the robot seamlessly. The server can be running on any Linux system, including a Raspberry Pi, or within a virtual Linux environment on a Windows or OS X PC.
Now you can equip Oculus Prime SLAM Navigator or Pi Explorer with a smartphone or mobile wifi hotspot, and see how far you get driving around outside, free from the limited range of a wifi network (in fair weather, of course!)
This addition to the Oculusprime Server application has been on the to-do list for a long time, but kept being put off because, well, it required a lot of boring network programming. (There always seemed to be something more exciting to work on, like testing out the newly open-sourced Google Cartographer ROS package.)
Summary of enhancements to Oculusprime Server version 0.8:
Expanded network menu
Wifi Access Point Manager upgraded to version 0.914
Red5 streaming media server upgraded to version 1.07
Apache Tomcat web server upgraded to version 8.0.33
Updated power PCB firmware (auto power-off at 30% battery, reduced false-positive errors)
Not too long ago, the availability of low-cost depth sensors suitable for mobile robot auto-navigation and SLAM mapping became a problem. Apple bought PrimeSense, and along with it the intellectual property behind the original Microsoft Kinect and Asus Xtion sensors. The excellent Asus Xtion RGBD camera, which was to be the main SLAM sensor for Oculus Prime, was discontinued. The Kinect 1 was still available in quantity, but it was bigger and heavier, and required separate 12V and 5V power.
And it somehow just looks wrong with a Kinect mounted to Oculus Prime:
So, we decided to explore using stereo vision as a possible option. A prototype robot was conjured, sporting two Lifecam Cinema cameras:
OpenCV’s Semi-Global Block Matching (SGBM) algorithm yielded decent-looking depth data from the combined images. Below left is the left camera view, and on the right is the disparity image generated with the cameras separated by a 60mm baseline – pixel intensity encodes distance from the camera:
Comparing the depth images from the stereo setup vs the Asus Xtion camera was looking promising (stereo image on top, Xtion image on bottom, left camera 2D image inset):
In practice, however, with this stereo setup as the data source for ROS SLAM mapping, there were issues. The depth data was quite noisy for some surface textures, and depth accuracy wasn’t very good beyond a few meters. The SGBM algorithm also tends to omit data for large texture-less surfaces.
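A purely local matcher makes it easy to see where that noise comes from. The toy sketch below is a local-only SAD (sum of absolute differences) block matcher in Python/NumPy, written for illustration — it is not OpenCV's SGBM (which adds semi-global smoothness-cost aggregation along several scanline directions, exactly what suppresses noise on weak textures) and not the Java code that ran on the robot:

```python
import numpy as np

def box_sum(a, k):
    """k x k sliding-window sum via a summed-area table (edge-padded)."""
    p = k // 2
    a = np.pad(a, p, mode="edge")
    c = a.cumsum(axis=0).cumsum(axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))
    return c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]

def disparity_sad(left, right, max_disp=16, block=5):
    """Naive local stereo matcher: for each pixel, try every disparity d
    and keep the one whose block x block window has the lowest SAD cost.
    On a texture-less wall every d scores about the same, so the winner
    is essentially random -- the omission/noise problem described above."""
    h, w = left.shape
    L = left.astype(np.float32)
    R = right.astype(np.float32)
    cost = np.full((h, w, max_disp), np.inf, dtype=np.float32)
    for d in range(max_disp):
        # compare left pixel at column x with right pixel at column x - d
        diff = np.abs(L[:, d:] - R[:, : w - d])
        cost[:, d:, d] = box_sum(diff, block)
    return np.argmin(cost, axis=2).astype(np.uint8)
```

On well-textured scenes this already recovers a plausible disparity map; the real SGBM cost aggregation is what makes the result usable for mapping.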
The image below shows a comparison of a plan view map of the same area, generated using the stereo setup on the left, and using the Asus Xtion depth camera on the right (using data from the horizontal plane only, and pure odometry to align the scans):
The stereo scan noise occasionally projected a small false obstacle that would wreak havoc with the ROS path-planner, and the inaccuracy or omission of distant features would cause weak localisation (and a lost robot).
Another problem was the slow speed of the system: the OpenCV Java SGBM processing, with all the other robot functions fighting for CPU time, yielded only 2 frames per second (the prototype stereo-bot had an Atom N2800 CPU), so the navigating robot’s speed had to be slowed way down to reduce errors.
In retrospect, an integrated stereo solution like the Zed camera, with some on-board processing and tightly-calibrated cameras, would have been much more effective (if money were no object).
In the end, we stuck with the Asus Xtion, dwindling supply and all, in the hope that the near future would deliver a new low-cost depth sensor with stable supply.
Luckily the Orbbec Astra came along just in time.
Since the last 0.709 release, the Oculusprime Server Java application has been updated to 0.713 with the following enhancements:
On-board video recording
Follower function added (for SLAM Navigator version)
Auto-docking minor reliability improvements
Added reverse arc moves
ARM-supported avconv video streaming added
720p photos, sound detection, and stability improvements
High current drain detection disables motion until resumed by forward command
Power PCB firmware: less strict checking on minor warnings
Video recording on/off toggle button has been added to the main menu:
Videos are saved under 'oculusPrime/webapps/oculusPrime/streams' with the current date/time as the file name, in FLV format.
Optional – Converting Video Format
If you want to convert to another format after recording, an easy way is to use avconv. If you’re running the Raspberry Pi equipped Oculus Prime Explorer version, it’s already installed. For SLAM Navigator versions, it can be installed by opening a robot terminal session and entering:
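The exact commands depend on the robot’s OS image; assuming a Debian/Ubuntu base of this vintage (where avconv is packaged as libav-tools — the package name is an assumption), installation and a sample FLV-to-MP4 conversion would look like:

```shell
# install avconv (packaged as libav-tools on Debian/Ubuntu of this era)
sudo apt-get install libav-tools

# example: convert a recorded FLV to MP4 (file names are illustrative)
avconv -i myrecording.flv myrecording.mp4
```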
Follower mode can also be launched the traditional ROS way, from the command line with:
$ roslaunch oculusprime follower.launch
If you auto-update Oculus Prime from the server menu, it should grab the latest ROS code and compile automatically. Compiling is required for this node, since it’s coded in C++ instead of Python. If you’re having any problems, try compiling again with:
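Assuming the standard catkin workspace layout used by ROS (the workspace path here is an assumption about the robot’s setup), a manual rebuild would be:

```shell
# rebuild the catkin workspace from its root (path is an assumption)
cd ~/catkin_ws
catkin_make
```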
The radiation emitted during operation is dangerous for humans, so a reliable ROV like Oculus Prime is the way to go. It has been equipped with a radiation badge and a dosimeter, and a couple extra charging docks have been placed in the tunnel to optimize the inspection schedule. The unit’s overlapping wifi network auto-switching capability supplies uninterrupted connectivity throughout the entire 1/2-mile stretch.
The X-ray radiation is expected to degrade Oculus Prime’s unshielded electronics over time, but so far so good after a few months. Battery charging within the tunnel was initially producing errors, possibly because the massive nearby magnets were throwing off current sensing in the power PCB. We released a firmware update to increase the allowable current-calibration tolerance, and that seems to have done the trick.
UPDATE 2016-4-30: The guys at Cornell provided this great shot of Oculus Prime at work in the synchrotron tunnel.
The latest 0.709 update to the Oculusprime Server Application includes the following enhancements:
Beta ARM/Raspbian support: should work with most ARM single-board computers with at least 1GB RAM; see setup notes
Optional Camera/mic capture via avconv/ffmpeg (enabled with setting 'useflash false')
Carpet auto-sense during navigation (sensed carpet will disable arc-turn mode)
Measured route distance added to navigation log
arcmovecomp setting added (use a lower value if arc turns are too wide)
Relaxed unnecessary power warnings; no_battery_connected and no_host_detected are now ignored
Wait for power board ping response while auto-docking (in case of board reset)
The full package is available on the downloads page, or existing clients can be auto-updated from the remote web browser UI:
MENU > server > check for software update
We’re offering non-depth-camera systems pre-assembled and configured with Raspberry Pi, now up for pre-order! We’re dubbing it the Oculus Prime Pi Explorer. Here’s a rendering of a possible future decal option for that unit:
We haven’t tried auto navigation on the Raspberry Pi 3 yet (with only 1GB of RAM it may not work reliably without running part of the ROS navigation stack on a supporting PC). However, one of our customers has reported full auto-navigation functionality running on an Odroid XU4 single-board computer, using an Xtion sensor.
The fully autonomous SLAM Navigator version of Oculus Prime has just had a system hardware upgrade: it now comes standard with the new ASUS N3150IC mini-ITX mainboard, equipped with an Intel Braswell 14nm N3150 CPU.
Autonomous navigation using the ROS Navigation Stack is highly CPU intensive. The older Intel Atom N2800 boards we were previously using were up to the task, but the Intel N3150 offers a serious performance boost:
Navigation system start-up is 3X faster
Fewer localization errors caused by high CPU load while driving
Faster and more accurate mapping (fewer missed scan matches)
Faster live video encoding and image processing
RAM expandable to 8GB
The board meets Oculus Prime’s requirements, with the CPU rated at a power-sipping 6W TDP, so the battery still lasts 3-4 hours like before, and it has a spare mini-PCIe slot for Wifi.
Overclockers rejoice: that massive gold heatsink barely gets warm with the GPU unused!