Updated MALG PCB version 3

xaxxon MALG PCB rev3

We’ve once again updated our multi-function differential-drive robot MALG PCB — it retains the same basic functionality, with slightly improved microcontroller power decoupling/smoothing and a few layout changes.

Gone is the ancient 4-pin Molex jack found in the previous MALG — apparently the original Molex ‘Mate-N-Lok’ style dates back to 1963! The Molex cable was used simply as a way to get 5V power from a motherboard without being limited to the 500mA maximum supplied through the USB cable. Instead, with the J4205-ITX and J5005-ITX motherboards supplied with Oculus Prime SLAM Navigator robots, we’re now running a single lead from the motherboard’s +5V pin, found in its chassis-speaker header.

For flexibility we’re staying with Pololu daughter boards for the gyro, and there is now an alternate gyro header for optional use of the latest generation of Pololu gyros and IMUs.
The photo below is of a heavily modified SLAM Navigator sporting a second MALGv3, equipped with a 9-DOF Pololu AltIMU-10:

xaxxon MALG with AltIMU-10 v4 IMU

It’s running our modified version of the Razor_IMU_9dof ROS package with Attitude Heading Reporting System (AHRS) firmware.

Other changes are:

  • Audio jack placement no longer interferes with the USB cable
  • Changed to the more common micro-B USB jack
  • Omitted the unused ‘FWD FLD’ LED circuit
  • Changed the location of one mounting hole

For more details and to buy the MALGv3 PCB, with or without gyro daughter board, go to the MALGv3 product page.

Posted by xaxxon on October 15, 2018

MALG PCB Back In Stock with New Gyro

xaxxon MALG PCB revB

The MALG (Motors-Audio-Lights-Gyro) multi-function, Arduino-compatible, differential-drive robot PCB is back in stock!

Our remaining old MALGs had quality issues with their MAX21000 gyro ICs, so the otherwise perfectly good boards have been resurrected with Pololu daughter boards sporting the L3GD20 three-axis angular-rate sensor.

Performance-wise there seems to be no significant difference between the two gyros: on the specs the newer L3GD20 sensor has the edge, but both ultimately provide super-accurate rotational odometry when used in mapping and auto-navigation. (The MALG PCB is standard equipment in our Oculus Prime mobile robots.)
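Under the hood, the rotational odometry either gyro provides comes down to integrating the angular-rate signal over time. A minimal sketch of the idea in Python (fixed sample rate and made-up values — this is an illustration, not the MALG firmware):

```python
# Integrate z-axis angular-rate samples (degrees/second) into a heading
# estimate, with samples taken at a fixed interval dt (seconds).
def integrate_heading(rates_dps, dt, initial_heading=0.0):
    heading = initial_heading
    for rate in rates_dps:
        heading = (heading + rate * dt) % 360.0
    return heading

# A constant 90 deg/s turn sampled at 100 Hz for one second lands on ~90 degrees.
samples = [90.0] * 100
print(integrate_heading(samples, dt=0.01))
```

In practice the firmware also has to deal with gyro bias and drift, which is where differences in sensor quality would actually show up.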

Expansion header pins that are occupied by the daughter board are still available via pass-thru pads on the underlying interface board, as detailed in the connection diagram:

xaxxon MALG PCB revB connection diagram
(click to enlarge)

More information on the MALG PCB (revB), including the schematic and datasheet, is available here. Firmware source code can be found in the GitHub repo.

Posted by xaxxon on November 21, 2017

Oculus Prime Software Updated with 4G/LTE Connectivity

Development and testing is now complete for the Relay Server software addition, which allows remote operation of Oculus Prime mobile robots connected to the internet via mobile 3G/4G/LTE networks. It also allows operation behind NAT firewalls, or on any network where there is no ability to configure and forward ports necessary for general internet remote access.

All that is required is to run an instance of the Oculusprime Server application on a device connected to an unconstrained network. This will act as the ‘relay’ server, which you then configure the robot to connect to. When you want to remotely connect to the robot, you connect to the relay server instead, which relays commands and video from the robot seamlessly. The server can be running on any Linux system, including a Raspberry Pi, or within a virtual Linux environment on a Windows or OS X PC.
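The core relay mechanism is ordinary socket splicing: the robot dials out to the relay (so NAT and firewalls don’t get in the way), the operator connects to the same relay, and bytes are forwarded verbatim in both directions. A toy single-session sketch of the concept in Python (illustrative only — the actual Oculusprime Server is a Java application and does far more):

```python
import socket
import threading

def pipe(src, dst):
    # Forward bytes from src to dst until src closes its end.
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

def run_relay(robot_port, operator_port, ready=None):
    # Listen on two ports: the robot dials out to robot_port (which works
    # from behind NAT), and the operator connects to operator_port. Once
    # both sides are connected, splice the two byte streams together.
    servers = []
    for port in (robot_port, operator_port):
        srv = socket.socket()
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        servers.append(srv)
    if ready is not None:
        ready.set()          # hook so callers know both ports are listening
    robot, _ = servers[0].accept()
    operator, _ = servers[1].accept()
    threading.Thread(target=pipe, args=(operator, robot), daemon=True).start()
    pipe(robot, operator)    # blocks until the session ends
```

The real server also multiplexes commands and video and handles authentication; the sketch only shows why an outbound connection from the robot sidesteps NAT and port forwarding entirely.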

Now you can equip Oculus Prime SLAM Navigator or Pi Explorer with a smartphone or mobile wifi hotspot, and see how far you get driving around outside, free from the limited range of a wifi network (in fair weather, of course!)

This addition to the Oculusprime Server application has been on the to-do list for a long time, but was put off because, well, it required a lot of boring network programming. (There always seemed to be something more exciting to work on, like testing out the newly open-sourced Google Cartographer ROS package.)

Summary of enhancements to Oculusprime Server version 0.8:

  • Relay server
  • Expanded network menu
  • Wifi Access Point Manager upgraded to version 0.914
  • Red5 streaming media server upgraded to version 1.07
  • Apache Tomcat web server upgraded to version 8.0.33
  • Updated power PCB firmware (auto power-off at 30%, reduced false-positive errors)
  • Force disable navigation before auto-dock
  • Added calibraterotation command
  • Added no-battery-connected indicator to web browser UI
  • Minor bug fixes, optimisations

For software update instructions, see here.
For more details on running the relay server, see here.
Java 8 is now recommended, which has to be installed manually, see here for upgrade instructions.

Posted by xaxxon on October 25, 2016

Video: SLAM Navigator Autonomous Driving in Warehouse at Night

Watch an Oculus Prime SLAM Navigator unit drive itself around a warehouse at night:

The video shows the view through the remote web browser interface – the only user interaction is three clicks: selecting two waypoints, then returning to the charging dock.

At around the 2:15 mark you can watch it close in and dock with its charging dock, which is tightly placed in a narrow slot between racks.

The map was created in one attempt by manually driving around the warehouse for 10 minutes, and was used in its original (un-edited) state.

Thanks to Shodor Industries for the video!

Posted by xaxxon on September 30, 2016

An Exploration of Stereo Vision SLAM Mapping - Early Prototype

Not too long ago, the availability of low-cost depth sensors suitable for mobile robot auto-navigation and SLAM mapping became a problem. Apple bought PrimeSense, and along with it the intellectual property behind the original Microsoft Kinect and Asus Xtion sensors. The excellent Asus Xtion RGBD camera, which was to be the main SLAM sensor for Oculus Prime, was discontinued. The Kinect 1 was still available in quantity, but it was bigger and heavier, and required separate 12V and 5V power.

And it somehow just looks wrong with a Kinect mounted to Oculus Prime:

kinect oculus prime mobile robot SLAM

So, we decided to explore using stereo vision as a possible option. A prototype robot was conjured, sporting two Lifecam Cinema cameras:

stereo SLAM mobile robot oculus prime

OpenCV’s Semi-Global-Block-Matching (SGBM) algorithm yielded decent looking depth data from combined images. Below left is the left camera view, and on the right is the disparity image generated with the cameras separated by a 60mm baseline – pixel intensity is proportional to distance from camera:

oculus prime stereo depth image test
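SGBM adds smoothness constraints across the whole image, but the core block-matching step is easy to illustrate: for each patch in the left image, search for the horizontal shift (disparity) that best matches the right image — nearer objects shift more. A toy 1-D sum-of-absolute-differences version on synthetic scanlines (nothing like production SGBM, just the idea):

```python
def sad(a, b):
    # Sum of absolute differences between two equal-length patches.
    return sum(abs(x - y) for x, y in zip(a, b))

def disparity_row(left, right, block=3, max_disp=8):
    # For each pixel in a 1-D left scanline, find the horizontal shift d
    # whose window in the right scanline matches best.
    half = block // 2
    disps = []
    for x in range(half, len(left) - half):
        patch = left[x - half:x + half + 1]
        best_d, best_cost = 0, float("inf")
        for d in range(max_disp + 1):
            if x - half - d < 0:
                break  # shifted window would fall off the image
            cost = sad(patch, right[x - half - d:x + half + 1 - d])
            if cost < best_cost:
                best_d, best_cost = d, cost
        disps.append(best_d)
    return disps

left  = [1, 1, 1, 1, 1, 9, 2, 7, 1, 1, 1, 1]
right = [1, 1, 9, 2, 7, 1, 1, 1, 1, 1, 1, 1]  # same feature, shifted 3 px
print(disparity_row(left, right, block=3, max_disp=5))
```

The textured feature matches cleanly at a disparity of 3, while the flat background pixels match ambiguously at several shifts — the same failure mode SGBM hits on large texture-less surfaces.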

Comparing the depth images from the stereo setup vs the Asus Xtion camera was looking promising (stereo image on top, Xtion image on bottom, left camera 2D image inset):

stereo vs RGBD depth camera

In practice, however, with this stereo setup as the data source for ROS SLAM mapping, there were issues. The depth data was quite noisy for some surface textures, and depth accuracy wasn’t very good beyond a few meters. Also, the SGBM algorithm tends to omit data for large texture-less surfaces.
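The poor accuracy at range is inherent to stereo geometry: depth is Z = f·B/d for focal length f (in pixels), baseline B and disparity d, so a fixed one-pixel matching error costs roughly Z²/(f·B) meters of depth error, growing quadratically with distance. A quick sanity check with the 60mm baseline (the pixel focal length here is an assumed round number, not measured from the Lifecams):

```python
def depth_error(Z, f_px, baseline_m, disp_err_px=1.0):
    # From Z = f*B/d, a disparity error dd maps to a depth error of
    # approximately dZ = Z**2 / (f*B) * dd.
    return Z ** 2 / (f_px * baseline_m) * disp_err_px

f_px = 700.0  # assumed focal length in pixels (illustrative value)
B = 0.060     # 60mm baseline, as in the prototype
for Z in (1.0, 3.0, 6.0):
    err = depth_error(Z, f_px, B)
    print(f"at {Z:.0f} m: roughly +/- {err:.2f} m per pixel of disparity error")
```

At a few meters out, a single pixel of matching error already costs tens of centimeters of depth, which squares with the mapping problems described here.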

The image below shows a comparison of a plan view map of the same area, generated using the stereo setup on the left, and using the Asus Xtion depth camera on the right (using data from the horizontal plane only, and pure odometry to align the scans):

Stereo map on left, Xtion on right (click to enlarge)
stereo vs RGBD SLAM map

The stereo scan noise occasionally projected a small false obstacle that would wreak havoc with the ROS path-planner, and the inaccuracy or omission of distant features would cause weak localisation (and a lost robot).

Another problem was the slow speed of the system: the OpenCV Java SGBM processing, fighting all the other robot functions for CPU time, yielded only 2 frames per second (the prototype stereo-bot had an Atom N2800 CPU), so the robot’s navigation speed had to be slowed way down to reduce errors.

In retrospect, an integrated stereo solution like the Zed camera, with some on-board processing and tightly-calibrated cameras, would have been much more effective (if money was no object).

In the end, we stuck with the Asus Xtion, dwindling supply and all, in the hope that the near future would deliver a new low-cost depth sensor with stable supply.
Luckily the Orbbec Astra came along just in time.

Posted by colin on June 23, 2016

Oculus Prime Software Updated With Follower Mode and Video Recording

Since the last 0.709 release, the Oculusprime Server Java application has been updated to 0.713 with the following enhancements:

  • On-board video recording
  • Follower function added (for SLAM Navigator version)
  • Auto-docking minor reliability improvements
  • Added reverse arc moves
  • ARM-supported avconv video streaming
  • 720p photos, sound detection, and stability improvements
  • High current drain detection disables motion until resumed by a forward command
  • Power PCB firmware: less strict checking on minor warnings

Video Recording

Video recording on/off toggle button has been added to the main menu:

Oculus Prime Mobile Robot Video Recording Menu

Videos are saved under 'oculusPrime/webapps/oculusPrime/streams' with the current date/time as the file name, in FLV format.

Optional – Converting Video Format

If you want to convert to another format after recording, an easy way is to use avconv. If you’re running the Raspberry Pi-equipped Oculus Prime Explorer version, it’s already installed. For SLAM Navigator versions, it can be installed by opening a robot terminal session and entering:

$ sudo apt-get update
$ sudo apt-get install libav-tools

To convert the file to the MKV format, change to the streams folder and enter the avconv command (replace [filename] with the right filename):

$ cd ~/oculusPrime/webapps/oculusPrime/streams/
$ avconv -i [filename].flv -codec copy [filename].mkv

Video Recording as Navigation Route Waypoint Action

For SLAM Navigator Versions, you can now record videos at route waypoints. ‘Record Video’ has been added to the list of waypoint actions:

Oculus Prime SLAM Navigator Robot waypoint video record menu

It will record for the same time as the ‘Stay here for’ waypoint duration, and download links to videos will be posted to the navigation log.

Follower Mode

Follower mode on/off has been added to the navigation menu:

Oculus Prime SLAM Navigator Robot follow me mode menu

This will enable Oculus Prime, equipped with a depth-camera, to autonomously start following any object that comes within a meter or so in front of it.

The new ROS follower node, added to the Oculusprime ROS package, is a port of the Turtlebot Follower node, with code modified so it works with Oculusprime’s skid steering.
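The follower logic is essentially a proportional controller on the centroid of nearby depth points: too far away, drive forward; off to one side, turn toward it. A simplified sketch of that control law in Python (the gains and thresholds are invented for illustration — they are not the node’s actual parameters):

```python
def follow_cmd(points, goal_dist=0.7, max_range=1.2,
               lin_gain=0.5, ang_gain=1.5):
    # points: (x, y) depth points in the robot frame, x forward, y left.
    # Returns a (linear, angular) velocity command toward the centroid of
    # points within max_range, or (0, 0) if nothing is close enough.
    near = [(x, y) for x, y in points if 0.0 < x <= max_range]
    if not near:
        return (0.0, 0.0)
    cx = sum(p[0] for p in near) / len(near)
    cy = sum(p[1] for p in near) / len(near)
    linear = lin_gain * (cx - goal_dist)   # close the range gap
    angular = ang_gain * cy                # turn toward the centroid
    return (linear, angular)

# A target about 1 m ahead and slightly left: drive forward, turning left.
print(follow_cmd([(1.0, 0.1), (1.0, 0.12)]))
```

The actual node publishes the resulting command as a cmd_vel twist; the skid-steering modification matters because Oculus Prime cannot translate sideways, so everything has to be expressed as forward velocity plus rotation.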

Follower mode can also be launched the traditional ROS way, from the command line with:

$ roslaunch oculusprime follower.launch

If you auto-update Oculus Prime from the server menu, it should grab the latest ROS code and compile automatically. Compiling is required for this node, since it’s coded in C++ instead of Python. If you’re having any problems, try compiling again with:

$ roscd
$ cd ../
$ catkin_make

Posted by xaxxon on June 13, 2016

Oculus Prime ROV On The Job at Cornell Synchrotron

Oculus Prime Robot in Cornell Synchrotron CESR Tunnel

The folks maintaining the CHESS Synchrotron at Cornell University are using an Oculus Prime ROV to inspect the equipment in the 768-meter CESR tunnel.

The radiation emitted during operation is dangerous for humans, so a reliable ROV like Oculus Prime is the way to go. It has been equipped with a radiation badge and a dosimeter, and a couple of extra charging docks have been placed in the tunnel to optimize the inspection schedule. The unit’s ability to auto-switch between overlapping wifi networks supplies uninterrupted connectivity throughout the entire 1/2-mile stretch.

The X-ray radiation is expected to degrade Oculus Prime’s unshielded electronics over time, but so far so good after a few months. Battery charging within the tunnel was initially throwing errors, possibly because the massive nearby magnets were throwing off the current sensing in the power PCB. We released a firmware update to increase the allowable current calibration tolerance, and that seems to have done the trick.

UPDATE 2016-4-30: The guys at Cornell provided this great shot of Oculus Prime at work in the synchrotron tunnel.

Posted by xaxxon on March 23, 2016

© 2018 Xaxxon Products