
UROC job offers

Create Video Capture wrapper for Canon Digital SLR camera

Use the Canon digital camera SDK to allow a digital SLR to be used as a video feed for the video capture libraries developed in our lab. Depending on student interest, this project could continue in a variety of ways:
  • Mount a tracker system onto the SLR and use it for Augmented Reality.
  • Restructure our video libraries to support multiple simultaneous cameras of different kinds.

Set up a base station for a differential GPS

We have a GBX-Pro differential base station in TSRB. The antenna has been mounted on the roof, and the base station prepared for use. We would like someone to help set up the base station for use with our GPS units, communicating the differential data over IP.
This task will involve:
  • surveying the antenna's location (with help from students and hardware in the mobile robotics lab)
  • setting up the base station (e.g., reading the manual, configuring it with the survey information to output RTCM differential information over its serial port)
  • writing a simple serial/TCP server to send differential information to any client that connects via TCP
  • modifying our GPS client (that currently talks to a GPS receiver over a serial connection) to receive the differential information over IP and pass it on to the GPS (to increase its accuracy).
  • demonstrating the improved capabilities of the GPS.
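The serial/TCP server in the third step can be sketched in a few lines of Python. This is a minimal single-client sketch, not the lab's implementation: `serve_differential` and `read_chunk` are hypothetical names, and `read_chunk` stands in for reading the base station's serial port (e.g. with pyserial). A production relay would loop back to accept() and serve several GPS clients concurrently.

```python
import socket

def serve_differential(server_sock, read_chunk):
    """Wait for one TCP client, then forward raw RTCM correction bytes to it.

    `read_chunk` is a stand-in for the base station's serial port (e.g.
    serial_port.read(64) with pyserial).  A real relay would loop back to
    accept() and serve multiple clients, e.g. one thread per connection.
    """
    conn, _ = server_sock.accept()      # block until a GPS client connects
    try:
        while True:
            chunk = read_chunk()
            if not chunk:               # empty read: correction source finished
                break
            conn.sendall(chunk)         # relay the differential bytes verbatim
    finally:
        conn.close()
```

The GPS client from the fourth step would open a TCP connection to this server and write whatever it receives straight to the receiver's serial port.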

Multi-perspective Support For Video And Sketch Content

In augmented reality (AR) applications, users can move around and view a dramatic character from many different visual perspectives. To achieve this with video, the actor must be filmed from many angles simultaneously (think of the Matrix “bullet-time” shots). We have footage of the AR experience “Four Angry Men” filmed from multiple angles, and we are looking for a talented student to develop software that loads the appropriate frame based on time and viewing angle. The student will develop a DCR Xtra in Director for use in DART. The software should generalize to other multi-camera video footage and sketched content.
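The core lookup the software has to perform can be sketched as follows. This is an illustrative sketch, not the DCR Xtra itself (which would be written in C++ against the Director API): it assumes the cameras are keyed by their angle around the actor, all started recording together, and share one frame rate.

```python
def select_frame(cameras, t, viewer_angle, fps=30):
    """Pick the camera whose angle best matches the viewer, and the frame for time t.

    `cameras` maps a camera's angle in degrees (its position on the circle
    around the actor) to that camera's footage.  Assumes synchronized capture
    at a common `fps`.  Returns (chosen camera angle, frame index).
    """
    def angular_dist(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)    # wrap around the circle

    best = min(cameras, key=lambda a: angular_dist(a, viewer_angle))
    frame_index = int(round(t * fps))
    return best, frame_index
```

With four cameras at 0/90/180/270 degrees, a viewer at 350 degrees one second in would get camera 0, frame 30. A smoother variant could cross-fade between the two nearest cameras instead of snapping to one.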

Partially Synthetic Video As Characters For Augmented Reality

One interesting feature for augmented reality would be the ability to generate content for a video “actor” without having to actually videotape all of it. Prof. Irfan Essa and his students have developed algorithms that take a small amount of video and create longer sections of realistic computer-generated video. This project will be co-advised by Blair MacIntyre and Irfan Essa and will involve integrating/programming this software into DART.

Spatialized (3d) Audio For Augmented Reality

Using the OpenAL 3D audio engine and basic head-tracking technology, we are able to simulate sound sources in 3D applications. We have integrated a basic version of this software into our toolkit for augmented reality, but we need a student who is interested in this area to continue integrating the technology. The student should compare our current method with another that uses VRPN’s 3D sound engine. This is a challenging project for the right C/C++/Lingo programmer.
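Whatever engine is used, the head-tracking side reduces to transforming each world-space sound source into the listener's (head) coordinate frame before handing it to the audio library. A minimal sketch of that transform, assuming a simplified head pose of position plus yaw about the vertical axis (a real version would use the tracker's full orientation quaternion, and the sign convention here is just one possible choice):

```python
import math

def listener_relative(source, listener_pos, yaw_deg):
    """Express a world-space source position in listener (head) coordinates.

    Simplifying assumption: the head pose is a position plus a yaw angle
    about the vertical (y) axis.  One possible convention: +z stays "in
    front of" the listener after the rotation.
    """
    dx = source[0] - listener_pos[0]
    dy = source[1] - listener_pos[1]
    dz = source[2] - listener_pos[2]
    yaw = math.radians(yaw_deg)
    # rotate the offset by -yaw into the head frame
    rx = dx * math.cos(yaw) - dz * math.sin(yaw)
    rz = dx * math.sin(yaw) + dz * math.cos(yaw)
    return (rx, dy, rz)
```

The resulting vector is what gets passed to the engine's per-source position call each frame as the head tracker updates.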

Augmented Reality Applications And Geographic Information Systems

Geographic Information Systems (GIS) are essentially a convention for keeping track of information about a specific place. There can be many layers of information serving many different purposes (commercial, entertainment, accessibility, etc.). For this project, the student will research GIS servers and engineer a method to dynamically download complete AR experiences to a personal device (such as a wearable computer with a heads-up display). The student should understand basic client/server architecture, and be able to program and research this topic independently.

DART Design, Development, Evaluation, And Support

The Designer’s Augmented Reality Toolkit (DART) is a system that integrates AR technology (live video feeds, 6DOF trackers, computer-vision tracking, and other sensor data) into Macromedia Director, where it is encapsulated in Lingo “behaviors” for quick application prototyping. We are looking for students to help design, develop, evaluate, and support DART. The student will help develop Lingo behaviors, set up hardware for AR, design cool demonstrations in AR, help with evaluations of DART, and support existing users.

Visual Marker Tracking For Augmented Reality

We are looking for someone to improve our algorithms for tracking visual markers in an environment using computer vision techniques. We also want to develop a “video pipeline” which allows the video to be analyzed and/or manipulated in many different ways. This is a complicated and challenging project.
The right student has taken a computer vision course, is willing to learn (or already knows) complex packages such as OpenCV, and can program in C/C++.
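The “video pipeline” idea amounts to chaining independent analysis/manipulation stages over each frame. A minimal sketch of that structure (illustrative only; the real pipeline would be C/C++ over camera frames, and the stage names here are made up):

```python
def run_pipeline(frame, stages):
    """Run a frame through an ordered list of (name, fn) stages.

    Each fn takes a frame and returns (possibly modified frame, info),
    where info is whatever the stage extracted (e.g. detected markers).
    Returns the final frame plus a dict of per-stage results.
    """
    info = {}
    for name, fn in stages:
        frame, out = fn(frame)
        info[name] = out
    return frame, info
```

With this shape, a marker-tracking stage can sit alongside stages that only manipulate pixels, and stages can be reordered or swapped without touching the capture code.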

Previous UROC projects

Tracking And Controlling Toy Model Trains For Augmented Reality Applications

The Augmented Environments Lab has a computer-controlled model train set that is intended to be used as a prototype/demonstration of AR applications (such as the Poultry Inspection project). The goal of this project is to create a VRPN server that accurately and precisely tracks the location (3D position and 3D orientation) of train engines and cars on the tracks and makes this information available to clients in real time. The project will involve developing a backend server application that allows the train layout to be configured and controlled. Tracking the trains is a challenging real-time programming problem that should be interesting to CS students.


  • Old Misc Pages last edited on 15 May 2008 at 2:02 pm by