PC Side

Because the RCX Brick has limited processing capacity, the PC side performs most of the calculation and processing required by the system before any instructions are sent to the RCX.  The language used for this processing is Java.

As mentioned earlier, Matlab is used as the interface to the system; however, for the purposes of this project it provides very little further functionality.  It does provide a few scripts for running the system, including one for automatic calibration.
Note that using Matlab for image processing is beyond the scope of this project.

The PC-side Java code handles calibration, receives user commands from Matlab, processes those commands as necessary, and sends commands to and receives messages from the RCX.  It also incorporates imaging, but this is discussed separately below.
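
To make this division of responsibilities concrete, the sketch below shows one possible arrangement of the PC-side components; the names Calibrator, RcxCommunicator and CommandProcessor, the opcode, and the two-element ratio array are illustrative assumptions rather than the project's actual classes.

// Illustrative sketch only: one possible arrangement of the PC-side components.
// None of these names are taken from the project's actual source.
public class PcSideStructure {

    /** Produces the yaw and pitch conversion ratios (see calibration below). */
    interface Calibrator {
        double[] calibrate();          // returns {yawRatio, pitchRatio}
    }

    /** Low-level link to the RCX Brick over the IR tower. */
    interface RcxCommunicator {
        void send(int opcode, int value);
        int receive();
    }

    /** Receives commands from Matlab, converts them, and talks to the RCX. */
    static class CommandProcessor {
        private final double[] ratios;
        private final RcxCommunicator rcx;

        CommandProcessor(Calibrator calibrator, RcxCommunicator rcx) {
            this.ratios = calibrator.calibrate();
            this.rcx = rcx;
        }

        /** Handle a "move to yaw angle" request given in degrees. */
        void moveYaw(double degrees) {
            int sensorTarget = (int) Math.round(degrees * ratios[0]);
            rcx.send(1, sensorTarget);   // opcode 1 = hypothetical "set yaw"
        }
    }
}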

Manual calibration instantiates a GUI that guides the user through the calibration process.  This process results in two ratios that are used to convert yaw and pitch sensor values into yaw and pitch angles in degrees.
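
By way of illustration, each ratio could be computed as sensor counts per degree from readings taken at the start and end of a known sweep during the guided calibration; the figures and helper below are assumptions, not the project's exact arithmetic.

// Hypothetical calculation of the calibration ratios: the user sweeps each axis
// through a known angle, and the ratio is sensor counts per degree.
public class CalibrationRatios {

    static double countsPerDegree(int sensorAtStart, int sensorAtEnd,
                                  double degreesSwept) {
        return (sensorAtEnd - sensorAtStart) / degreesSwept;
    }

    public static void main(String[] args) {
        // Example figures only: a 90 degree yaw sweep moving the sensor by 405 counts.
        double yawRatio = countsPerDegree(100, 505, 90.0);    // 4.5 counts/degree
        double pitchRatio = countsPerDegree(200, 470, 45.0);  // 6.0 counts/degree
        System.out.println("yaw: " + yawRatio + ", pitch: " + pitchRatio);
    }
}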


RCX Side

As mentioned above, the RCX has limited capacity, so our implementation minimises the processing performed on the brick and focuses on fairly low-level tasks.

The code is designed around two low-level classes, one for receiving and sending messages and one for controlling the RCX motors and sensors, together with a higher-level class that manages the two lower classes and passes data between them.
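
Assuming a Java environment on the brick (for example leJOS) and purely illustrative class names, the structure described above might look roughly like the following skeleton; the message format and the motor/sensor interface are placeholders rather than the project's code.

// Skeleton of the RCX-side structure described above. Class names, the message
// format, and the motor/sensor API are placeholders, not the project's code.
public class RcxSideSketch {

    /** Low-level class 1: receives and sends messages over the IR link. */
    interface MessageLink {
        int receiveValue();            // blocks until a value arrives from the PC
        void sendValue(int value);     // reports a raw sensor value back
    }

    /** Low-level class 2: drives the motors and reads the rotation sensors. */
    interface MotorControl {
        void driveYawTo(int sensorTarget);    // all targets are raw sensor values
        int readYawSensor();
    }

    /** Higher-level class: passes data between the two low-level classes. */
    static class Coordinator {
        private final MessageLink link;
        private final MotorControl motors;

        Coordinator(MessageLink link, MotorControl motors) {
            this.link = link;
            this.motors = motors;
        }

        void run() {
            while (true) {
                int target = link.receiveValue();    // raw sensor value from the PC
                motors.driveYawTo(target);
                link.sendValue(motors.readYawSensor());
            }
        }
    }
}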

As part of this reduction in processing, the RCX side deals solely in sensor values, so all conversions between degrees and sensor values occur on the PC side.
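
For example, if the calibration ratio is expressed as sensor counts per degree, the PC-side conversions in both directions reduce to a single multiplication or division, as in this hypothetical helper:

// Hypothetical PC-side conversion helpers, assuming the calibration ratio is
// expressed as sensor counts per degree.
public class AngleConversion {

    static int degreesToSensor(double degrees, double countsPerDegree) {
        return (int) Math.round(degrees * countsPerDegree);
    }

    static double sensorToDegrees(int sensorValue, double countsPerDegree) {
        return sensorValue / countsPerDegree;
    }

    public static void main(String[] args) {
        double yawRatio = 4.5;                                  // example ratio only
        System.out.println(degreesToSensor(60.0, yawRatio));    // 270
        System.out.println(sensorToDegrees(270, yawRatio));     // 60.0
    }
}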


Imaging

The web camera is effectively the core of the system, since it is what the mount is designed around.  The imaging section streams images from the web camera both to the calibration GUI and to a frame in which the user can view the scene while executing commands, takes snapshots of the camera’s current view, and lets the user save those images to disk.
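
As one example of the save-to-disk step, a captured snapshot held as a BufferedImage can be written out with the standard ImageIO class; the assumption that snapshots are held as BufferedImage objects, along with the file name and format below, is illustrative rather than taken from the project.

import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

// Sketch of the "save snapshot to disk" step, assuming the snapshot has already
// been grabbed into a BufferedImage (the streaming itself is shown separately).
public class SnapshotSaver {

    static void saveSnapshot(BufferedImage snapshot, File destination)
            throws IOException {
        // Write the image out as a JPEG; the format choice is an assumption.
        ImageIO.write(snapshot, "jpg", destination);
    }

    public static void main(String[] args) throws IOException {
        BufferedImage dummy = new BufferedImage(320, 240,
                BufferedImage.TYPE_INT_RGB);          // placeholder image
        saveSnapshot(dummy, new File("snapshot.jpg"));
    }
}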

To stream the data, the Java Media Framework (JMF) is used to recognise the camera device, set the properties of the streaming device to match those of the camera, and create a frame in which to display the streamed image.  A separate frame is also instantiated to display each snapshot when it is taken.
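
A rough sketch of those JMF steps is given below: locating a capture device, creating a player for it, and adding its visual component to a frame. Error handling and format negotiation are simplified, and this is not the project's actual code.

import java.awt.Component;
import java.awt.Frame;
import java.util.Vector;
import javax.media.CaptureDeviceInfo;
import javax.media.CaptureDeviceManager;
import javax.media.Manager;
import javax.media.Player;
import javax.media.format.VideoFormat;

// Simplified sketch of streaming a web camera with JMF: find a video capture
// device, create a realized player for it, and show its visual component.
public class CameraViewerSketch {

    public static void main(String[] args) throws Exception {
        // Ask JMF for devices that can capture video in a common format (YUV here).
        Vector devices = CaptureDeviceManager.getDeviceList(
                new VideoFormat(VideoFormat.YUV));
        if (devices.isEmpty()) {
            System.err.println("No capture device found");
            return;
        }
        CaptureDeviceInfo camera = (CaptureDeviceInfo) devices.get(0);

        // Create a realized player for the camera and start streaming.
        Player player = Manager.createRealizedPlayer(camera.getLocator());
        player.start();

        // Display the live video in a simple AWT frame.
        Frame frame = new Frame("Camera view");
        Component video = player.getVisualComponent();
        if (video != null) {
            frame.add(video);
        }
        frame.pack();
        frame.setVisible(true);
    }
}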