Projects – Spring 2003


The proposed project is to enhance the freely available Ethereal software to support the Internet Storage Name Service (iSNS) protocol.

Ethereal is a free, open source network protocol analyzer for Unix and Windows. It allows one to examine data from a live network or from a capture file on disk. You can interactively browse the capture data, viewing summary and detail information for each packet. Ethereal has several powerful features, including a rich display filter language and the ability to view the reconstructed stream of a TCP session.

iSNS is an open, comprehensive discovery and management protocol that provides the generic framework and naming service for storage entities in an IP Storage network. iSNS is designed to be a lightweight discovery protocol that can be deployed in iSCSI adapters and target devices, iSNS servers, and IP Storage switches. iSNS, which accommodates all IP Storage protocols, minimizes the manual administration of IP Storage networks by automating facilities for registration, discovery, zoning, and management of IP Storage resources.

NAS Performance Load Generating Tools


A Network Attached Storage (NAS) system can best be described by the following architectural model:

The NAS front end takes network requests for files, converts those requests into storage lookup requests, passes these requests onto the storage back end, and returns the data in the proper protocol for a request. The storage back end takes the request and performs the necessary read and write operations to retrieve or write the data. The back end also takes care of any data restructuring (for example RAID, striping, etc.). The cache memory is used as an intermediary between these two subsystems.

The overall performance of a NAS device can be impacted by many things: the speed of the front end, the back end, the network, the cache, software efficiency, etc. A major activity of any performance evaluation group is the measurement of actual system performance under simulated workloads. These workloads can be customer environment simulations or "in house" workloads designed to stress the system in certain ways.

Project Description

This project involves providing libraries, APIs, and protocol support to enhance our "in house" load generation tool. For this project it will not be necessary to learn and use our tool. The output from this effort will be integrated into our "in house" tool over the summer. The work is being isolated in this way to keep ramp up time low and to allow for independent development by the NCSU team using non-specialized equipment.

We currently use code from the Samba open source project to generate Common Internet File System (CIFS) protocol requests from the load generation clients to the NAS device being tested. Doing so has several drawbacks. First, it forces our test clients to be UNIX based. Second, the overhead to port & integrate new versions of Samba is increasing in cost, time, & complexity as our user base grows (150% growth this past year). Third, the Samba code does not provide the level of granularity & control needed to add measurement and performance-debugging tools.


We desire a library, APIs, test framework, & tests to accomplish the following:

  • Provide a CIFS code library that can generate & respond to CIFS packets placed directly onto the network wire.
  • Provide this library such that it compiles and works with UNIX and Windows environments (W2K, Win/XP Pro, or Win/XP Server).
  • Provide a set of generic APIs that the load generation program can interface with, such that OS differences are masked. This same set of calls should tie together the CIFS library mentioned above and a pre-existing NFS library we have. A partial subset of the APIs includes:
    • Open, close, read, write, Seek
    • GetAttr, SetAttr, lock, unlock
    • Create, delete, createDir, deleteDir
  • Provide support for (or at least identify) work required to support NFS V4 from this library / API set.
  • Provide a test suite and framework such that this library and API set can be checked for accuracy outside of our performance tools suite.
  • Provide a complete set of user documentation outlining library architecture, library design, API design, the test framework, and an API user's guide.
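One possible shape for the OS-masking API layer above is sketched below. All names are illustrative assumptions, not the sponsor's actual interface; Java is used only for illustration (the real library would likely be C), and an in-memory stub stands in for the CIFS/NFS back ends so the sketch is self-contained:

```java
import java.util.HashMap;
import java.util.Map;

/** Illustrative sketch of the generic API set listed above (names are assumptions). */
interface FileOps {
    void create(String path);
    void delete(String path);
    void write(String path, byte[] data);  // whole-file write, for brevity
    byte[] read(String path);              // returns null if the file does not exist
}

/** In-memory stub standing in for the CIFS and NFS back ends. */
class InMemoryFileOps implements FileOps {
    private final Map<String, byte[]> files = new HashMap<>();
    public void create(String path)             { files.put(path, new byte[0]); }
    public void delete(String path)             { files.remove(path); }
    public void write(String path, byte[] data) { files.put(path, data); }
    public byte[] read(String path)             { return files.get(path); }
}
```

The load generator would program against `FileOps` only, with a CIFS-backed and an NFS-backed implementation selected at run time.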

CAIR Work Orders


WebWorks is a part of the Information Systems department within John Deere's Commercial and Consumer Equipment Division (C&CE) that develops internet, extranet, and intranet applications for John Deere. They work as a set of internal consultants to the division. WebWorks focuses on providing cost effective solutions written using solid industry techniques and open standards. WebWorks is a small group of programmers practicing Extreme Programming.

The CAIR work order system will be an interactive, online workflow application for managing the part measurement request process. (CAIR stands for Computer Aided Inspection Reporting.) It will be written as a Java web application, using Java Server Pages (JSP), Servlets, and a JDBC database.


Workflow is the flow of data through a process. For example, the ordering process goes through many steps during its cycle. A customer places an order. Then a warehouse fills the order. The shipping department ships the order. Finally the customer receives the order. At any given state, certain activities may be performed, such as updating the document or notifying people involved. The document can either be forwarded to the next state, or rejected back to the previous state, until it reaches some terminal state (i.e., order completed).
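The ordering example above can be sketched as a simple linear state machine; the state names come from the example, and the `forward`/`reject` methods are illustrative:

```java
/** Sketch of the order workflow described above as a linear state machine. */
enum OrderState {
    PLACED, FILLED, SHIPPED, RECEIVED;  // RECEIVED is the terminal state

    /** Forward the document to the next state; the terminal state stays put. */
    OrderState forward() {
        OrderState[] states = values();
        return ordinal() + 1 < states.length ? states[ordinal() + 1] : this;
    }

    /** Reject the document back to the previous state; the first state stays put. */
    OrderState reject() {
        return ordinal() > 0 ? values()[ordinal() - 1] : this;
    }
}
```

In the CAIR application the states would instead track a measurement request (submitted, in progress, reported, etc.), with notifications fired on each transition.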

Internet applications are being developed to make it easier to track a document through its many states. We get many requests to develop such systems.

The proposed system will enable the CAIR department at our Horicon, Wisconsin factory to track the requests from engineers for work. CAIR is responsible for measuring parts to ensure that they match the blueprint specifications. Engineers or factory personnel bring parts to the department for measurement. CAIR measures the parts and reports the results back to the requestors. They also measure supplier parts to report on whether the supplier's results are repeatable or if the parts are not made consistently. Thus, when a part is put on the tractor and it fails, knowing the measurements will help determine if it was the part itself that failed or the way the part was installed on the unit.

The objective of this project is to create a tool that tracks the entire request process. Engineering, Supply Management, and factory personnel will submit requests for part measurement. The CAIR department will process the request, perform the work, and produce a report to the requestor.

The application should be written in Java, utilizing Java Server Pages (JSP) and Servlet technology, and a database. It should use a thin-client architecture so the users are only exposed to HTML and JavaScript. The servlets communicate with the domain, a layer of business objects. The domain communicates with the databases through a level of database brokers. The brokers must utilize JDBC to send data to and from the database.
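The layering above (servlet → domain → broker) can be sketched as follows. Class, method, and column names are assumptions, and an in-memory broker stands in for the JDBC one so the sketch is self-contained:

```java
import java.util.ArrayList;
import java.util.List;

/** Broker interface; the production broker would use JDBC PreparedStatements,
 *  e.g. "INSERT INTO request (part_number, requestor) VALUES (?, ?)". */
interface RequestBroker {
    void save(String partNumber, String requestor);
    int count();
}

/** In-memory stand-in for the JDBC broker, used here only to keep the sketch runnable. */
class InMemoryRequestBroker implements RequestBroker {
    private final List<String> rows = new ArrayList<>();
    public void save(String partNumber, String requestor) { rows.add(partNumber + "|" + requestor); }
    public int count() { return rows.size(); }
}

/** Domain object: business rules live here, not in the servlet or the broker. */
class MeasurementRequests {
    private final RequestBroker broker;
    MeasurementRequests(RequestBroker broker) { this.broker = broker; }

    /** Called by the servlet layer with form fields already extracted from the HTTP request. */
    void submit(String partNumber, String requestor) {
        if (partNumber == null || partNumber.trim().isEmpty())
            throw new IllegalArgumentException("part number is required");
        broker.save(partNumber.trim(), requestor);
    }
}
```

Keeping the broker behind an interface also makes the domain layer testable without a live database, which fits the XP testing practices listed below.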

As WebWorks works in an XP environment, we suggest that the Extreme Programming methodology be followed when developing this application:

  • Plan and scope out each release. Prioritize tasks.
  • Small releases - every release should be as small as possible and should contain the most important business requirements. Release often.
  • Simple design
  • Testing
    • Test cases should be developed that can be used in acceptance testing.
    • Test first coding can be used to integrate Unit testing into the design. (JUnit)
  • Pair programming
  • Collective ownership - anyone can change any code anywhere
  • Coding standards - document code, format code so it's easier to read and follow
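Test-first coding, mentioned above, means the test is written before the class it exercises. A minimal sketch, using a hypothetical `RequestValidator` for the CAIR request form (plain checks are used instead of JUnit so the sketch stands alone; in JUnit these would be assertTrue/assertFalse calls in a TestCase):

```java
/** Hypothetical validator for the CAIR request form; its design is driven by the test below. */
class RequestValidator {
    static boolean isValid(String partNumber) {
        return partNumber != null && !partNumber.trim().isEmpty();
    }
}

/** The test is sketched first and the validator is written to make it pass. */
class RequestValidatorTest {
    static void run() {
        if (RequestValidator.isValid(null))   throw new AssertionError("null must be rejected");
        if (RequestValidator.isValid("   "))  throw new AssertionError("blank must be rejected");
        if (!RequestValidator.isValid("AM123456"))
            throw new AssertionError("a real part number must pass");
    }
}
```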

Web Auction


WebWorks is a part of the Information Systems department within John Deere's Commercial and Consumer Equipment Division (C&CE) that develops internet, extranet, and intranet applications for John Deere. They work as a set of internal consultants to the division. WebWorks focuses on providing cost effective solutions written using solid industry techniques and open standards. WebWorks is a small group of programmers practicing Extreme Programming.

The Web Auction will be an interactive, online auction application for managing the selling of items to employees. It will be written as a Java web application, using Java Server Pages (JSP), Servlets, and a JDBC database.


The proposed system will allow the John Deere Commercial & Consumer Division to conduct auctions of tractors, computer equipment, and any other item that comes up for employee bidding. The application should be designed to handle more than one auction at a time for different sets of audiences. Each auction may contain one or many items of differing quantities. For example, you may have one auction that offers three laptops, two printers, and one desktop for employees at the Cary, North Carolina, office. At the same time, the Augusta, Georgia, factory may be auctioning one tractor to its employees.

The application should be written in Java, utilizing JSP and Servlet technology, and a database. It should use a thin-client architecture so the users are only exposed to HTML and JavaScript. The servlets communicate with the domain, a layer of business objects. The domain communicates with the databases through a level of database brokers. The brokers must utilize JDBC to send data to and from the database.

As WebWorks works in an XP environment, we suggest that the Extreme Programming methodology be followed when developing this application:

  • Plan and scope out each release. Prioritize tasks.
  • Small releases - every release should be as small as possible and should contain the most important business requirements. Release often.
  • Simple design
  • Testing
    • Test cases should be developed that can be used in acceptance testing.
    • Test first coding can be used to integrate Unit testing into the design. (JUnit)
  • Pair programming
  • Collective ownership - anyone can change any code anywhere
  • Coding standards - document code, format code so it's easier to read and follow

Test Results Automation II

The Network Quality Lab (NQL) test team provides a testing service for hard and soft modems developed by the Voice Band Modem (VBM) group. Currently, our testing produces several log files, which must be manually converted to readable results in Excel. The NQL team would like a web front-end to automate and simplify the processes, using a database backend to store the data.

Last semester, a team of students completed this task for a type of test we execute that involves emulating phone line impairments. The equipment that emulates the impairments is known as Telecom Analysis System (TAS). They were able to write a module that parses the log files, stores the data in a database, and generates graphs. Though they implemented this process only for TAS tests, they designed this application to be extensible to other types of test results. For this semester's project, we would like more test types implemented in this application, depending on the students' estimate of feasibility. Additional test types include fax function testing, ISP operational verification, and modem-on-hold verification. Each of these test types produces a log file which needs to be processed in a manner similar to the previous semester's TAS logs.
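The parse step for a new test type might look like the following. The actual log formats are not given here, so this assumes hypothetical `key=value;` lines (and the plan below calls for Perl for the parsing module; Java is used here only to match the other sketches in this listing):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Sketch: one result line -> key/value map, ready for insertion into the database. */
class LogLineParser {
    /** Assumes a made-up format of semicolon-separated "key=value" pairs. */
    static Map<String, String> parse(String line) {
        Map<String, String> fields = new LinkedHashMap<>();
        for (String pair : line.split(";")) {
            String[] kv = pair.split("=", 2);       // split on first '=' only
            if (kv.length == 2) fields.put(kv[0].trim(), kv[1].trim());
        }
        return fields;
    }
}
```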

The skills required to be successful at implementing this type of system are web interface design, ASP, database design and implementation in MySQL, Perl (for the parsing module), and written communication (to clearly document requirements, design, and user documentation). Students must be able to work well as a team, dividing the work among themselves with little supervision from the Intel sponsor.

Project Description:


Digital Rights Management (DRM) and its associated security & cryptography software are rapidly evolving and will be very much a part of the future of the internet (read: Job Opportunities). It is estimated that the music industry will lose $10 Billion (with a "B") in revenue due to piracy in 2003. There are multiple large initiatives addressing how to implement DRM in the music and video industries. Another smaller, more focused, and often overlooked area of DRM is collaboration on engineering design information. Almost all design is electronic today. Everything from a toaster oven to a BMW is designed with Computer Aided Design (CAD). Product manufacturers are collaborating with distinct areas of domain expertise, often external to themselves, in order to improve design and reduce costs. They are sending sophisticated electronic designs over the internet to partners, contractors, manufacturers, consultants, and experts, all of whom have aligned interests at the time of this collaboration. However, after the collaboration, how is the owner of the engineering design assured that their design will not be used for other purposes? These digital rights are what we are interested in protecting.

Project Requirements

We would like our current client-based Design Application extended. This project seeks to create a system that minimizes the risk of "design leakage" associated with collaborating on Engineering Design Documents over the web by developing the following components:

  • Application layer encryption using MS-PKI for Engineering Design Documents
  • A rules-based web server which controls permissions for:
    • Cut/Copy/Paste
    • Save and SaveAs
    • Print
    • Others
  • Secure communication between Server and Design Application
  • Client updates when new security patches are placed on the server
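As an illustration of the first component (application-layer encryption of a design document), here is a minimal symmetric encrypt/decrypt round trip. The project specifies MS-PKI; Java's JCE is used here only as a stand-in sketch, and all names are assumptions:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

/** Sketch of application-layer encryption for a design document (JCE stand-in, not MS-PKI). */
class DocumentCrypto {
    static SecretKey newKey() throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);                     // 128-bit AES key
        return kg.generateKey();
    }

    static byte[] encrypt(SecretKey key, byte[] document) throws Exception {
        Cipher c = Cipher.getInstance("AES");
        c.init(Cipher.ENCRYPT_MODE, key);
        return c.doFinal(document);
    }

    static byte[] decrypt(SecretKey key, byte[] blob) throws Exception {
        Cipher c = Cipher.getInstance("AES");
        c.init(Cipher.DECRYPT_MODE, key);
        return c.doFinal(blob);
    }
}
```

In the full system the key itself would be protected and distributed by the PKI layer, and the permission rules (cut/copy/save/print) would be enforced by the client after decryption.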

The ideal team will be composed of those seeking to become involved in encryption, security, and high- and low-level MS application development, plus someone interested in hacking into secure systems. High-quality, state-of-the-art UML design documentation using TogetherSoft will be stressed. The ideal group will have individuals who have experience, or want to gain experience, with:

  • Web Services and/or IBM Websphere
  • SQL, Oracle or DB2
  • MS-PKI
  • TogetherSoft

Challenge and Opportunity

Numerous challenges exist. Not the least of these is the technology, but more important is developing the above group of diverse individuals into a team to accomplish the goal of keeping the system secure at Posters & Pies!

We envision making the system available to the CSC department at Posters & Pies and challenge the community to hack the system. Those that are successful will be awarded prizes!

Successful completion of this goal offers opportunities at I-Cubed as well as significant experience in an area of the software industry that is growing rapidly.

Abstract: IT Requirements / Costing Project

The process that the business analyst uses to gather business requirements and provide cost estimates for an IT project is not always consistent, and often key requirements are not communicated at the onset of the project. By not gathering all of the requirements up front, the time to deliver a workable IT project lengthens and cost overruns become commonplace.


The IT Requirements / Costing Project consists of developing an IT tool to gather requirements for an IT project. In addition, the tool should determine ballpark pricing of the application to be developed. Lastly, the tool should allow for comparing building an application in-house versus purchasing an application from two separate vendors.

Currently, the DataFlux web site is merely an electronic brochure where our prospects, customers, and partners come to find out what’s new with DataFlux. As DataFlux creates more in-depth relationships with our customers and partners, a new level of expectations and services is required. Due to this evolving model, the DataFlux web site is going to be redesigned into three separate, yet equally important sections.

  1. Prospect Section- This is very much the web site in its current form. All product data will remain; the only major addition will be ‘cookies’. A prospect area will be created where a visitor can register and view DataFlux white papers, methodologies, etc. The prospect can then return to the DataFlux web site in the future to browse additional resources without having to sign in while giving the DataFlux marketing department the ability to see which pages our prospects like and don’t like.
  2. Customer Section- This will be a customer only section, where customers can log on (with their customer data and some verification information) to view FAQ’s, download any patches, and update their information. Registration codes will be available for the software packages they have licensed. A technical support section will be included here as well for online submission.
  3. Partner Section- This section will be very similar to the Customer section with minor enhancements geared toward our Partners.

In a nutshell, our desire is to take our static web site and make it a very personalized experience for our customers. By making these front and back end changes, we will improve our external web site interaction with our internal sales and customer databases.

This is only a brief overview of the ideas we have. More detail is available, as needed. Students should use their imaginations and creativity to take these basic concepts wherever they see the potential.


Bally Refrigerated Boxes manufactures custom refrigerated units such as: walk-in coolers; walk-in freezers; refrigerated buildings; modular structures; mortuary coolers; and blast chillers. Over 50 distributors located throughout the United States sell Bally's products worldwide. Currently, these distributors with 3 to 5 sales staff each (a total of 150-250 people) use a Microsoft Access application to generate price quotations for customers.

Bally Refrigerated Boxes, Inc. proposes a Senior Project problem to re-design and convert our current Quotation System from an individual PC based system to a web based system with very tight security and controls. The current Quotation System is written in Microsoft Access 2000 Visual Basic using menu-driven displays. The web-based system needs to interface with MS/SQL database tables and be written in a combination of JavaScript / Active Server Pages (ASP) / HTML. Other programming languages may be considered.

The current MS/Access application stores quotes on distributed local systems. The web-based system needs to allow all the quotes to be stored on a central Bally web server. These quotes can then be accessed based on various security levels:

  1. Administrator Level - can do anything, including file maintenance.
  2. Bally Company Level - can view and update any quote in the system.
  3. Company Level - allows individual companies to set and control access to their quotes, so that their sales staff can create new quotes and maintain existing quotes for their company.
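The three levels above might map to a check like the following; the names are assumptions, and a real system would also enforce quote ownership in the database layer:

```java
/** The three access levels described above. */
enum AccessLevel { ADMINISTRATOR, BALLY_COMPANY, COMPANY }

/** Sketch of a permission check for updating a quote. */
class QuoteAccess {
    /** May a user at this level update a quote owned by ownerCompany? */
    static boolean canUpdate(AccessLevel level, String userCompany, String ownerCompany) {
        switch (level) {
            case ADMINISTRATOR:
            case BALLY_COMPANY:
                return true;                            // any quote in the system
            case COMPANY:
                return userCompany.equals(ownerCompany); // own company's quotes only
            default:
                return false;
        }
    }
}
```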

Fast response time on the web-based system is critical to the success of this project. Due to design issues and limitations of MS/Access, certain functions of the existing software are slow.

The goal of this project is to create a web-based job costing/reporting system. IT consultants working for Shark Technology are assigned client projects. It is necessary to track the time each consultant allocates to each client and project. This data is used for billing clients for services and for determining payroll for consultants. Input to the payroll/billing system is a CSV (comma-separated values) file.
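Parsing one line of that CSV feed might look like the following sketch. The actual column layout is not specified here, so consultant, client, project, hours is an assumption, and the split is deliberately naive (real CSV quoting would need more care):

```java
/** One time-entry row from the CSV feed (assumed layout: consultant,client,project,hours). */
class TimeEntry {
    final String consultant, client, project;
    final double hours;

    TimeEntry(String consultant, String client, String project, double hours) {
        this.consultant = consultant;
        this.client = client;
        this.project = project;
        this.hours = hours;
    }

    /** Naive parse of one CSV line; does not handle quoted fields containing commas. */
    static TimeEntry parse(String csvLine) {
        String[] f = csvLine.split(",", -1);
        if (f.length != 4)
            throw new IllegalArgumentException("expected 4 fields: " + csvLine);
        return new TimeEntry(f[0].trim(), f[1].trim(), f[2].trim(),
                             Double.parseDouble(f[3].trim()));
    }
}
```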

Clients and consultants of Shark Technology are geographically dispersed, so a client/server architecture is required. Transactions must be secure, since this data is sensitive. Also, various permissions must be implemented to permit client/consultant relationships to be securely defined and to permit secure client verification of consultant data.

Meeting Maker is a program used to reserve rooms for meetings, invite attendees, etc. Last semester, a student team created an open source replacement for Meeting Maker, correcting serious impediments to its use. The new "Meeting Maker" is browser based, uses Java web programming, and has a back-end database.

The goal of the project for this semester is, in the best spirit of open source, to enhance the functionality of this product. The scope of this semester's project, like last time, will be determined largely by the students participating. However, the following are interesting avenues of pursuit for us:

  • Geographical localization
  • iCalendar type support for PDA and desktop calendar syncing
  • LDAP (allow Evolution and/or mozilla to connect and sync calendars, global address book)
  • Project hosted for community development (e.g., code and docs posted to SourceForge or an NCSU homepage) under a license approved by the Open Source Initiative's Open Source Definition (for example, the GPL).
  • And/or anything the students and advisor(s) may care to explore, under the time/resource constraints of CSC 492.


There is currently a prototype site that communicates the purpose of this Lab. This site was developed using Dreamweaver 3, Fireworks 3, Adobe Photoshop 6, QuickTime, Adobe Premiere 5, Automate 4.5 and Surfer 6. Due to other demands on Lab personnel, this site has not been completed as planned.


The goal of this project is to create an interactive, server-based, multimedia, client-customized web site that delivers content in a clear, targeted, and visually exciting manner. This may be done by enhancing or replacing the existing site with one that will:

  • Allow for transition from the Unity servers to on-site server, or third party server.
  • Set up a contributions page using secure server technologies to process credit card and other forms of online payments for web visitors.
  • Data subscription service - allow individuals or companies to purchase subscriptions for password-secured databases available only to these subscribers. This may, or may not, include e-mail notification when new data becomes available to them.
  • Incorporate a real-time update from field data currently coming into the Lab hourly (currently averaged over each 6-hour period). Make use of timeline plots, etc.
  • Make use of a large repository of photos, from microorganisms to site photos.
  • Allow for easy content change.
  • Allow for distribution of customizable GIS maps online
  • Overhaul the interfaces, using drop-down menus (?) and redundant navigation of the site.
  • Allow for full-screen display when brought up on larger screens, and minimize the need to scroll to the bottom of the page, especially when only the credits remain. Keep 'clicks' to a minimum, and properly handle 'back' and 'forward' navigation.
  • Display the large amount of scientific data in the form of graphs, colorized maps, etc.
  • Use cookies and ASP / PHP technologies to identify return users to the site and customize data according to their preferences.
  • Allow for streaming video of field research trips and remote live cameras.
  • Set up linkages to other sites, and keep it simple.
  • Allow for easy updates of Lab background, projects, and other information.
  • Leave 'hooks' for children's interactive pages, a possible link to class material, and others.
  • We would prefer the use of Dreamweaver for the main site construction; programming languages are to be discussed.

Due to interest in water quality, this site has the potential to draw a large cross section of the population. The site must be snappy, not too cluttered, and be able to catch a person's attention.

The US Environmental Protection Agency is interested in raising public awareness of the dangers involved in sun exposure. There is a large body of scientific evidence that too much sun exposure, especially at a young age, can have drastic negative effects later in life. Depletion of the ozone layer exacerbates this problem. This project is a continuation of a three-year senior design effort to develop computer based systems that contribute to raising this awareness.

The goal of this semester's project is to continue design and implementation of a computer game to illustrate to children ages 7-11 the effects of sun-exposure. The basic idea that has been discussed (which can be used or not) is a 3rd person game looking down on a "baby" playing in the sun. The player guides the baby to pick up treasures and/or get to a goal location (i.e. the umbrella where "Mom" is) while the sun beats down. Over time the baby becomes more and more exposed to the sun (shown visually by coloring the baby in various shades of red). The sun exposure can be reduced in the game by having the baby put on a hat, slather on sunscreen and the like. The overall idea is to illustrate sun-savvy behavior that would include the effects of covering up, using sunblock, etc. The game should provide simple play for younger kids with lots of rewards to reinforce basic concepts. This game can be done in 2D or 3D.
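The exposure mechanic described above could be sketched as a per-tick accumulator; the class name, rates, and tint mapping are all made-up tuning values, not a specified design:

```java
/** Sketch of the sun-exposure mechanic: exposure rises each game tick in the sun,
 *  more slowly with protection, and is shown by tinting the baby sprite red. */
class SunExposure {
    private double level = 0.0;          // 0.0 = pale, 1.0 = fully sunburned

    /** Advance one game tick. Rates are illustrative tuning values. */
    void tick(boolean inShade, boolean hasHat, boolean hasSunscreen) {
        if (inShade) return;             // no exposure under the umbrella
        double rate = 0.05;              // base exposure per tick in direct sun
        if (hasHat)       rate *= 0.5;   // hat halves exposure
        if (hasSunscreen) rate *= 0.2;   // sunscreen cuts it further
        level = Math.min(1.0, level + rate);
    }

    /** Shade of red (0..255) used to tint the baby sprite. */
    int rednessByte() {
        return (int) Math.round(level * 255);
    }
}
```

Tying the visual feedback directly to the protective choices (hat, sunscreen, shade) is what reinforces the sun-savvy behavior the game is meant to teach.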

The goal of the Fourth Annual Computer Society International Design Competition (CSIDC) is to advance excellence in education by having student teams design and implement computer-based solutions to real-world problems. The theme of this year's CSIDC is Added Value: Turning Computers Into Systems. Teams are encouraged to take a PC, laptop, hand-held computer (or similar device) and use additional low-cost hardware/software to create a computer-based solution to a problem that is socially valuable. Add-on cost must be less than $400 (this cost will be paid by the Senior Design Center). First Prize is $15,000 to be distributed among the students on the team. Total of all prizes to be awarded to students is $37,000.

For more information check

Computer Science senior design students will work on a team with Chemical Engineering senior design students on this multidisciplinary project. This semester, the CHE student team assigned to this project is available on Mondays from 1:30-3:30. The multidisciplinary team will receive extra guidance concerning multidisciplinary teaming, documents, and presentations.

The specific objective of this project is to execute a preliminary engineering design of a process, automation and controls system, and paperless manufacturing execution system (MES) using Aspergillus niger for the production of citric acid via fungal fermentation and its purification using acid precipitation. Students will perform experiments to determine necessary technical and operating specifications, and will use the apparatus to evaluate an MES developed by the design team.

Citric acid is used in a variety of industries such as food, pharmaceuticals, cosmetics, plastics, and biodegradable detergents. Citric acid was originally produced in the early 19th century using fruits and the technology has since advanced to using the liquid submerged tank method. This more recent method of production has a shorter fermentation time and higher yield compared to other recent methods such as the Koji process and liquid surface process. As a result, the most accepted method of production uses the submerged tank fermentation of Aspergillus niger to produce citric acid.

Aspergillus niger is a filamentous fungus that grows in a wide-range of temperatures. This organism is used most frequently for industrial production in the submerged culture system in the West because of its ability to grow under lenient temperature conditions. However, other factors such as water, pH, and gas composition also have a large influence on the growth of this fungus. Therefore, these conditions must be monitored carefully during the fermentation process.

The production of citric acid in a facility is complex due to the many operating conditions that must be monitored during the process. A manufacturing execution system (MES) and automated control system provide a solution to effectively regulate the fermentation cycle. The manufacturing execution system documents validation protocols and equipment, batch records and reports, maintenance logs, and standard operating procedures (SOPs), thus maintaining a comprehensive history of the production of citric acid. An automated control system improves product quality by regulating the process parameters and adjusting them closely to optimal conditions. It can also lower production costs because of decreased labor requirements. Automation is helpful in regulating fermentation because it is a continuous process. The NCSU Fermentation MES project will focus on designing a manufacturing execution system and control system for the fermentation of citric acid using Aspergillus niger. Isolating and purifying the citric acid, however, is a more difficult task to manage using automation because it is a discrete process. The NCSU Fermentation MES project will investigate control technologies associated with this portion of the process, but as a simplification, the data collected from the purification will be manually entered in the manufacturing execution system.

Pack Tracker

Marine and mammal biologists have been using devices of one sort or another for decades to track wildlife and report related environmental activity. This tracking research adds substantially to our understanding of different species' habits and patterns. All tracking methods in use today have various drawbacks (satellite expense, geographic precision, etc.). The goal of this project is to investigate the use of Mica Motes (a new pervasive computing technology - see below) in a wildlife tracking application to see if this new technology can overcome some of the drawbacks of existing technologies. A new tracking system will be designed and implemented based on the properties of these devices. The application on which we will focus is the Red Wolf Recovery Project in the coastal plains of eastern North Carolina. Wolves recently re-introduced into the wild will (ultimately) be collared with production versions of our design. As the wolves roam their territory, sensor units in their collars will collect relevant data. Wolves will periodically enter an area covered by a network of data receptors, and collected data will be offloaded from the collars and forwarded to a centralized collection node for storage, display, and subsequent processing. The network of data receptors will be based on motes, which basically form a special purpose, autonomous, distributed, wireless network of computers.

Mica Motes are small, approximately 2x1x1 inch, microprocessor based computer units specialized for the collection and dissemination of data. They have built-in radio transceivers which can communicate point-to-point or, more interestingly, in networks over short ranges. A Mote can function in various roles. It may act strictly as a communication element in a network or, with the addition of one or more piggyback sensor modules, as a data gathering unit, or in any combination of these capabilities.

Motes are designed to very specific criteria. Low power consumption (with power-down and sleep modes between readings), tight integration of hardware and software, and the smallest, lightest possible package are some of the prime considerations. The Motes we will use are second generation units. Development efforts are underway to reduce the size of a Mote, literally, to that of dust particles (hence the term "Smart Dust").

Mote programs are written in nesC. This is an object-oriented-like language that looks much like "C" but deals mostly with high level constructs and exhibits many of the qualities of OO languages such as Java. A collection of library functions interface with and control all functions of the Mote board as well as any attached sensor boards. Application logic must be developed to query the sensors, store data, and forward data toward the final destination.

At that destination, the data will need to be retrieved by a PC and displayed in a user-friendly format. The PC interacts with the base station Mote (the egress point of the network) via serial port and Java programs.

The need for security in web applications is becoming a prime consideration as the internet grows. One level of security is addressed by cryptography, i.e., encode transmissions to guarantee data integrity. Much work has been done in this arena. Another type of security might be concerned with guaranteeing that a given application operates in a predefined way. For example, a credit card transaction uses encryption to assure your card number and other personal data are not compromised. However, how can we guarantee that the application does not (outside of its specification) extract several pennies per transaction and hide it away somehow to benefit the (unscrupulous) programmer? Or, in another example, if we were to create an online course evaluation system, how can we guarantee that students' responses can never be traced back to them (to be used in an unintended way by either unscrupulous faculty or students)? Current technology depends on code and documentation inspection by a disinterested party for these types of guarantees.

The goal of this project is to investigate possible software solutions (perhaps based on a middleware layer) that can provide guarantees that such subversions are not possible. One approach might be to define a prototype application (credit card and/or course evaluation system simply defined above), demonstrate subversive hacks, and create blocks to such hacks by appropriate middleware abstractions and implementation of blocking rules of operation.

Part I:

Experiment with PDAs (latest handheld Compaq Ipaqs w/ XScale CPU) to demonstrate the ability to combine quality-of-service with frequency scaling.

Background: We have a Helix media server generating RTSP packets suitable for 802.11b wireless communication. We also have code that changes the frequency of the Ipaq dynamically.

Objective: We would like to show the effect of frequency scaling on MPEG video decoding.

Method: Modify the Berkeley Mpeg_Tools player to handle frame losses while preserving synchronization in time to guarantee quality levels for replays. This can be complemented by frequency scaling and frame drops on the server side as well.

Platform: Windows 2000 and CE, embedded Visual C++ and SDK 2000.

Part II:

Experiment with motion detection and image feature detection using a WebCam and interface with one or more embedded devices (RCX Lego Mindstorms).

Background: We have motion detection software that we would like to use together with RCX navigation experiments.

Objective: Track objects (RCXs), determine quadrants of objects, react to visual events by sending messages to RCXs via infrared communication.

Methods: Enhance Motion package under Linux, modify LNPD for infrared to operate as blackboard database and broadcast messages.

Platform: Linux, LegOS, gcc cross-compilation.

Back Pack Web Auction

Description coming soon.