Archive for June, 2007

Andii – Phase II

It’s been quite a while since I’ve posted anything on the Andii navigating robot project. For most of 2006, as often happens with hobby projects, I shelved the robot. I picked it up a couple of times, to present to the Orlando .NET Users Group and for a Seminole Community College (SCC) open house event. But otherwise, the project was dormant.

So, when I started work at the University of Central Florida in the Modeling & Simulation program, I had an opportunity to continue this effort. This latest iteration picks up where I’d left off and adds video and GPS telemetry, plus a much more complete Windows client.

The software is broken into three primary components: the client, the server, and the shared interfaces. The server runs as a Windows service and is composed of three independent “servers”: one each for vehicle control, video telemetry, and GPS telemetry. The client can connect to each of these servers individually or all at the same time.

The shared interfaces define the GPS telemetry and vehicle control contracts, which are exposed over .NET Remoting. The video server instead serves its data through a custom TCP/IP server, since capturing and transmitting video is resource-heavy enough that Remoting isn’t a good fit for it.
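Just to give a flavor of the shared contracts, here’s a rough sketch of what the interfaces and the server-side registration could look like. All the names, members, and the port number below are illustrative, not the project’s actual code:

```csharp
using System;
using System.Runtime.Remoting;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Tcp;

// Shared assembly: both the client and the Windows service reference
// these contracts.
public interface IVehicleControl
{
    void SetThrottle(double value);   // -1.0 (full reverse) .. 1.0 (full forward)
    void SetSteering(double value);   // -1.0 (hard left) .. 1.0 (hard right)
}

public interface IGpsTelemetry
{
    double Latitude { get; }
    double Longitude { get; }
    double SpeedInKnots { get; }
}

// Server side: implementations derive from MarshalByRefObject so they can
// be remoted over a TCP channel.
public class VehicleControl : MarshalByRefObject, IVehicleControl
{
    public void SetThrottle(double value) { /* forward to the servo controller */ }
    public void SetSteering(double value) { /* forward to the servo controller */ }
}

public static class VehicleServer
{
    public static void Start()
    {
        ChannelServices.RegisterChannel(new TcpChannel(8085), false);
        RemotingConfiguration.RegisterWellKnownServiceType(
            typeof(VehicleControl), "VehicleControl",
            WellKnownObjectMode.Singleton);
    }
}

// Client side: obtain a transparent proxy through the shared interface.
// IVehicleControl vehicle = (IVehicleControl)Activator.GetObject(
//     typeof(IVehicleControl), "tcp://andii:8085/VehicleControl");
```

Because the client only references the interfaces, the server implementation can change freely without redeploying anything on the client side.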

The service software employs the BrainStem.NET library for servo control, along with two external libraries for GPS and video management: GPS.NET from GeoFrameworks (~$279) and the open-source DirectShow.NET, which wraps Microsoft’s DirectShow (part of DirectX) for use in .NET applications. The client software also uses DirectX for joystick input (DirectInput) and the GPS.NET library for its speed and compass controls.
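On the joystick side, polling a stick through Managed DirectX’s DirectInput goes roughly like this. This is just a sketch (the names are mine and error handling is omitted), and the scaling assumes DirectInput’s default 0–65535 axis range:

```csharp
using Microsoft.DirectX.DirectInput;

public class JoystickPoller
{
    private Device joystick;

    public void Attach(System.Windows.Forms.Control owner)
    {
        // Take the first attached game controller found.
        foreach (DeviceInstance di in Manager.GetDevices(
            DeviceClass.GameControl, EnumDevicesFlags.AttachedOnly))
        {
            joystick = new Device(di.InstanceGuid);
            break;
        }

        joystick.SetDataFormat(DeviceDataFormat.Joystick);
        joystick.SetCooperativeLevel(owner,
            CooperativeLevelFlags.Background | CooperativeLevelFlags.NonExclusive);
        joystick.Acquire();
    }

    // Map the raw axes to -1..1 steering/throttle values.
    public void Poll(out double steering, out double throttle)
    {
        joystick.Poll();
        JoystickState state = joystick.CurrentJoystickState;
        steering = state.X / 32767.5 - 1.0;
        throttle = 1.0 - state.Y / 32767.5;   // stick forward = positive throttle
    }
}
```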

Here’s a screenshot of the client. If anyone’s interested in the source code, let me know and I’ll make it available (though you’ll need the GPS.NET controls).

Color Matching (XYZ to RGB)

Last semester I wrote a color matching application for a graphics class, using C# and XNA (XNA only for rendering the output texture, which is overkill, but no matter). I’ll let Wikipedia describe the CIE color space, the eye, color vision and so on; I definitely wouldn’t do those topics justice. But the general idea is that, in order to display a sample of light on a computer monitor, we have to convert its spectral values to RGB values.

For this project, I was supplied spectral values for the Macbeth Color Checker. These were stored in a comma-delimited text file with wavelength samples between 380 and 780 nanometers in 5nm increments. Using the CIE 1964 standard observer color matching functions (also stored in a text file, in 5nm increments between 360 and 830nm) and the NTSC chromaticity coordinates:

                x       y
R               0.67    0.33
G               0.21    0.71
B               0.14    0.08
White Point     0.313   0.329

we can go from spectrum samples → XYZ coordinates → RGB.
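The whole pipeline is short in code. Here’s a sketch (not the project’s exact code) that integrates a spectrum against the color matching functions, derives the XYZ-to-RGB matrix from the chromaticities above, and solves for RGB, with gamma correction left out:

```csharp
using System;

static class ColorMatch
{
    // X = sum over wavelengths of S(l) * xbar(l), and likewise for Y and Z.
    // The spectrum and the colour-matching arrays must be aligned on the
    // same wavelength grid (here 380-780nm in 5nm steps).
    public static double[] SpectrumToXyz(
        double[] spectrum, double[] xBar, double[] yBar, double[] zBar)
    {
        double x = 0, y = 0, z = 0;
        for (int i = 0; i < spectrum.Length; i++)
        {
            x += spectrum[i] * xBar[i];
            y += spectrum[i] * yBar[i];
            z += spectrum[i] * zBar[i];
        }
        return new double[] { x, y, z };
    }

    // Chromaticity (x, y) expanded to an XYZ vector with Y = 1.
    static double[] Primary(double x, double y)
    {
        return new double[] { x / y, 1.0, (1.0 - x - y) / y };
    }

    // XYZ (normalised so white has Y = 1) to linear NTSC RGB.
    public static double[] XyzToRgb(double[] xyz)
    {
        double[] r = Primary(0.67, 0.33);
        double[] g = Primary(0.21, 0.71);
        double[] b = Primary(0.14, 0.08);
        double[] w = Primary(0.313, 0.329);

        // Scale the primaries so R = G = B = 1 reproduces the white point.
        double[] s = Solve(r, g, b, w);
        for (int i = 0; i < 3; i++) { r[i] *= s[0]; g[i] *= s[1]; b[i] *= s[2]; }

        // RGB = M^-1 * XYZ; solving M * rgb = xyz avoids an explicit inverse.
        double[] rgb = Solve(r, g, b, xyz);

        // Clamp out-of-gamut values into [0, 1] for display.
        for (int i = 0; i < 3; i++)
            rgb[i] = Math.Max(0.0, Math.Min(1.0, rgb[i]));
        return rgb;
    }

    // Solve [a|b|c] * t = d by Cramer's rule (a, b, c are matrix columns).
    static double[] Solve(double[] a, double[] b, double[] c, double[] d)
    {
        double det = Det(a, b, c);
        return new double[]
            { Det(d, b, c) / det, Det(a, d, c) / det, Det(a, b, d) / det };
    }

    // Determinant of the 3x3 matrix with columns a, b, c.
    static double Det(double[] a, double[] b, double[] c)
    {
        return a[0] * (b[1] * c[2] - b[2] * c[1])
             - b[0] * (a[1] * c[2] - a[2] * c[1])
             + c[0] * (a[1] * b[2] - a[2] * b[1]);
    }
}
```

A real implementation would also normalize the integrated XYZ values (so the brightest patch comes out with Y ≈ 1) and apply the display gamma before writing pixels.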


The output is a reproduction of the color checker as a set of 24 textures. The coding is pretty straightforward, and I’ve included it here. Kind of an obscure topic, but I hope it comes in handy for someone :)
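For the curious, building those patch textures is about this much work. Note that this sketch uses the simplified Texture2D constructor from later XNA versions, and the names are hypothetical:

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

static class PatchRenderer
{
    // One 1x1 solid-colour texture per Macbeth patch, scaled up when drawn.
    public static Texture2D MakePatch(GraphicsDevice device, double[] rgb)
    {
        Texture2D tex = new Texture2D(device, 1, 1);
        tex.SetData(new[]
        {
            new Color((float)rgb[0], (float)rgb[1], (float)rgb[2])
        });
        return tex;
    }

    // In Draw(): lay the 24 patches out in a 6x4 grid, e.g.
    // spriteBatch.Draw(patches[row * 6 + col],
    //     new Rectangle(col * 100, row * 100, 96, 96), Color.White);
}
```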

Paul