Projects – Fall 2007

Click on a project to read its description.

Navigating Retail Space Planning Diagrams Over the Web

SAS sponsored a senior design project in the spring of 2007 called "Merchandise Planning in a Virtual World". That project researched and developed a software extension to the Second Life environment that let the user lay out retail merchandise on particular fixtures (shelves, tables, etc.). The project was a success in many ways, but the eventual conclusion was that the Second Life environment lacks sufficient control to allow individual product placement with the required ease of use.

As a follow-on to this work, SAS is proposing a senior design project that builds upon the previous progress with a somewhat more limited scope. This project will involve researching and, ultimately, developing a system for the interactive exploration of a static environment built for retail space planning. Rather than focusing on product layout on shelves, pegboard, and custom displays, the entire retail space (warehouse, store, kiosk) would be modeled and built, and individual products placed, outside of the proposed virtual environment. The resulting system would then be used to share the results over the internet in a 3D walkthrough environment. Second Life may be one of the potential platforms, but the initial research should also include other open-source and proprietary 3D graphics rendering engines such as Sun Game Server, QUAQ, and OctLight.

One of the primary design considerations should be the software footprint required for end-user deployment. Ideally, a small installed application would be deployed on the client machine(s), while the back-end server(s) would house the models and textures. This would allow a centrally managed repository of retail spaces to be built, updated, and shared with a very large end-user community. This community would interact with the environments via 3D navigation and walkthrough, but would not design or lay out any of the internal 3D models.

Meeting Management System

The Cisco Enterprise Portal (CEP) is a general purpose portal application platform. The platform will be host to a variety of portal applications including departmental dashboards, team spaces and personal workspaces.

CEP is implemented using IBM’s WebSphere Portal. Portal applications will make use of off-the-shelf and custom portlets that integrate with a variety of back-end enterprise information systems.

The CEP team is interested in developing a Portal enabled Meeting Management system for use with Cisco project team spaces. The core of this system will implement a meeting information capture interface. The input interface should be implemented with a series of portlets for capturing:

  1. Meeting Notes
  2. Attendee List
  3. Action Items

  • The Meeting Notes portlet will integrate with the Atlassian Confluence Wiki system.
  • The Meeting Notes portlet should allow text input using Confluence markup tags.
  • The Action Items portlet will integrate with the Atlassian JIRA Issue Tracking system.
  • Users need to be able to enter information quickly and simply to keep up with the typical flow of information during meetings.
  • Meeting information will be auto-emailed to all attendees and selected email aliases.
  • Information from completed meetings shall be accessible through the Portal interface as well as directly through the back end integrated systems.

System Technologies and Integration Methods

  • WebSphere Portal
  • Atlassian Confluence
  • Atlassian JIRA
  • Custom JSR168 Portlets implemented using Java
  • SOAP
  • AJAX

Develop New Features For NFSv4.1

NFSv4.1 Background

The proposals included here concern the NFSv4.1 protocol, the latest version of the NFS protocol, which is currently still in the specification stages. The latest draft of the protocol can be found here:

NetApp has an open-source Linux client and server prototype for some of the NFSv4.1 features. The proposals below involve adding features to this prototype. Work on these projects is expected to give the team an understanding of the NFS protocol and of real-world issues in distributed file systems, and to introduce them to open-source contribution.

Proposal 1: Dynamic Slot table resizing and trunking for sessions in NFSv4.1


Traditionally, the NFS protocol has assumed that there would be only one connection between the client and the server; there was no way to aggregate multiple connections into one 'transport'. NFS also had no reliable way to distinguish duplicate requests from legitimate ones. Servers often made a best-effort attempt by maintaining a Duplicate Request Cache (DRC), basically a record of the last 'n' requests seen. Since 'n' is in principle unbounded but in practice must be finite, the solution was always best effort. To address these requirements, NFSv4.1 introduced the sessions feature.

Sessions introduce the concept of a slot table. The size of this table is negotiated between the client and the server at session setup time. Each entry in the slot table corresponds to a token. To issue an RPC to the server, the client must select a slot (token) and include the slot number in the RPC it sends. Since tokens cannot be shared, it is illegal to have two outstanding RPCs on the same slot. In this way, the size of the DRC is bounded. Sessions also introduce the concept of a channel. A channel can consist of one or more connections; RPCs are issued to the channel, which decides how to distribute them across those connections.


NFSv4.1 allows the size of the slot table to be modified during operation. For instance, a server that is low on resources can tell the client to shrink its slot table, thereby reducing the size of the DRC the server needs to manage.
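As a rough illustration of the mechanics described above, here is a minimal sketch in plain Python (not the actual Linux client code, which lives in the kernel RPC layer) of a slot table that bounds the number of in-flight RPCs and can be resized while in use:

```python
# Illustrative sketch only: a slot table whose size can be renegotiated
# during operation, as NFSv4.1 sessions allow.

class SlotTable:
    def __init__(self, size):
        self.size = size
        self.free = list(range(size))      # tokens not currently in flight

    def acquire(self):
        if not self.free:
            raise BlockingIOError("all slots busy; caller must wait")
        return self.free.pop()             # slot id is carried in the RPC

    def release(self, slot):
        if slot < self.size:               # retired slots are simply dropped
            self.free.append(slot)

    def resize(self, new_size):
        # Server asked us to shrink (or allowed growth). Slots >= new_size
        # are retired as their in-flight RPCs complete; newly added slots are
        # usable immediately. The server's DRC now needs only new_size entries.
        self.free = [s for s in self.free if s < new_size]
        self.free += list(range(self.size, new_size))  # empty when shrinking
        self.size = new_size
```

The key property the real protocol relies on is visible here: at most one RPC per slot is outstanding, so the server's duplicate request cache needs exactly one entry per slot rather than an unbounded history.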

The team will need to perform the following tasks:

Trunking

  • Understand the workings of the NFSv4.1 protocol and client implementation.
  • Implement a channel layer to abstract away the concept of a connection from the NFS client.
  • Implement RPCs on the client and server side that allow a connection to be associated with a channel.
  • Develop test plans and test the code for correctness.

Slot table resizing

  • Understand the workings of the NFSv4.1 protocol and client implementation.
  • Understand the workings of the slot table implementation.
  • Implement dynamic resizing of the slot table on client and server.
  • Develop test plans and test the code for correctness.


Demonstrate trunking and slot table resizing working together.

Proposal 2: Implementation of directory delegations


A large portion of NFS traffic consists of looking up files and attributes. To alleviate this, NFS has traditionally cached file attributes and directory information. While this works well when the directory entry is found in the cache, misses incur the penalty of an RPC to the server. Compilation, which performs many repeated lookups of headers and sources, is a typical example. Too many of these RPCs limit the WAN scalability of NFS.

Directory caching has existed in other distributed file systems like AFS. 'Directory Delegations' is NFSv4.1's version of directory caching.
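The caching idea can be illustrated with a small sketch in plain Python. All names here are hypothetical (the real work is in the Linux NFS client and server); the point is that cached results, including negative lookups, are trustworthy only while the delegation is held, and a server recall forces a fallback to LOOKUP RPCs:

```python
# Conceptual sketch of client-side directory caching under a delegation.

class FakeServer:
    """Stand-in for the NFS server; a real client issues READDIR/LOOKUP RPCs."""
    def read_dir(self, path):
        return {"a.c": 1, "b.h": 2}        # name -> file handle (toy values)

    def lookup(self, path, name):
        return {"a.c": 1, "b.h": 2, "new.c": 3}.get(name)

class DirDelegation:
    def __init__(self, server, path):
        self.server, self.path = server, path
        self.entries = server.read_dir(path)   # one RPC fills the cache
        self.valid = True                      # server granted a delegation

    def lookup(self, name):
        if not self.valid:
            # Delegation recalled: fall back to a LOOKUP RPC per miss.
            return self.server.lookup(self.path, name)
        return self.entries.get(name)          # a cached miss is reliable too

    def recall(self):
        self.valid = False                     # server callback on a change
```

The traffic reduction comes from the `valid` branch: while the delegation holds, repeated lookups (hits and misses alike) cost no RPCs at all.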


Implement the directory delegations feature on the client and the server.

The team will need to perform the following tasks:

  • Understand the workings of the NFSv4.1 protocol and client implementation.
  • Implement RPCs on the client and server to request and call back directory delegations.
  • Develop test plans and test the code for correctness.


Demonstrate a decrease in network traffic due to directory caching.

Infrastructure requirements

A lab setup which includes at least 2 machines running any Linux distribution. Serial debugging support is helpful, but not required.

Skillset To Develop

  • Knowledge of operating system internals.
  • Knowledge of concurrency control mechanisms.
  • Basic understanding of Unix syscalls.
  • Basic understanding of file systems.
  • Working knowledge of distributed systems.

Design, implementation, and testing of either of these projects is acceptable. Accomplishing both would be outstanding.

Faculty Mentor: Dr. Vince Freeh (freeh (at)

Translation Tool

Fujitsu Transaction Solutions is one of the top three suppliers of retail systems and services worldwide. Its integration of Microsoft DNA (Distributed InterNet Architecture) creates a high performance yet open platform that retailers as diverse as Nordstrom and Payless Shoe Source are able to customize.

The goals of the Spring 2007 Fujitsu Senior Design project were to migrate existing Fujitsu language translation tools from VB6.0 to Visual Studio 2005 and to support Unicode. Both goals were achieved in that semester. The NCSU senior design team delivered a set of language translation tools to Fujitsu. The tools parse input from various text files and present the translator with an intuitive UI to error-proof the process. The translation tool kit is made up of two programs: TransEditor and TranslateUtil. TranslateUtil extracts text from Fujitsu’s GlobalSTORE product into a couple of text files and, after translation, applies those text files back to the GlobalSTORE product. TransEditor interacts with users to translate from English to a foreign language.

In this semester, we would like to move onto Phase II of the translation tools. The goals are to:

  1. Fix program bugs from Phase I.
  2. Add more enhancements for Phase II.

The enhancements for TransEditor will include:

  • Translation Dictionary
  • Consistent translation within a file
  • Glossary of Terms (Operator may mean Cashier, etc.)
  • SpellChecker
  • Filter by Section
  • Filter by Tag
  • Bookmarks
  • Find/Replace
  • Power User
  • Persist settings
  • Allow editing Language Code when input file is DB*.txt

The enhancements for TranslateUtil will include:

  • A parameter file in XML format.
  • When processing UI*.txt, provide a replacement of FontName/FontType (e.g., Japanese can only be displayed with the MS Gothic font).

The project team will be given a release of GlobalSTORE 3.2.1 to install on a lab system. The source code of translation tools will be supplied to the project team. They can be applied to the GlobalSTORE 3.2.1 system for system testing.

(Technologies Used: C#, Visual Basic.Net, Windows XP, XML)

MP3 Profiler


In previous semesters, Concert Technology senior design teams built software that can profile an MP3 collection on a client's hard drive and report that metadata to a centralized server for storage in a database. The solution provides a downloadable scanning client, a server for receiving metadata information, a web server for providing browser-based functionality to manage and access the stored information, and a web service for programmatically accessing the managed information. A framework for a music algorithm experimentation tool and for visualization of algorithm results was established. This tool provides the capability to analyze the contents of a user’s music database. For example, this analysis can be based on user-provided profiles, logs of use, recommendations gathered from other users, and metadata extracted from actual MP3 files on the user’s hard drive. The overall goal of the music algorithm experimentation tool is to facilitate the establishment of user networks based on common music interests. The tool is intended for internal use by Concert Technology.

Project Overview

This semester’s project goal is to expand upon the Spring 2007 music cataloging project by enhancing the music algorithm experimentation tool. Enhancements to this tool must include:

  1. Assess available 3rd-party charting tools and recommend the best one. Upon approval, purchase a license and integrate charting into the existing framework from previous semesters.
  2. Develop algorithm classes for several different types of experimental algorithms.
  3. Develop test algorithms in each algorithm class that can be run against a test library of music (to be provided).

Executive Dashboard

Students will develop an executive dashboard to facilitate a 360-degree view of Duke Energy’s customer base, highlighting the energy-efficiency programs they participate in. The dashboard should be web-based and designed with creativity, flexibility, and extensibility in mind. Mashups (such as Google Earth) and other emergent technologies are encouraged for use within the dashboard, while standard charting and reporting services will be required. Various categories of the dashboard should be displayable, with the ability to drill down to specifics as needed. Microsoft Reporting Services will be used for generating reports to be made available via the dashboard. Open-source charting/graphing libraries may also be used (various choices may be made, depending on the language chosen for the dashboard). SQL Server will be the underlying RDBMS. Schemas for this database are already defined. Appropriate security and documentation of the dashboard application will be required.

The first Duke Energy sponsor meeting is scheduled for Wednesday, September 5, from 10:40-11:30 in EBII-3300.

Automatic Website Generation

Our company specializes in making medical data available to anyone at anytime. One product is a web-based PACS (Picture Archiving Computer System) that permits related patient data to be archived along with actual echocardiogram video clips. These clips, accompanied by patient-related data, permit the physician to observe cardiac function and record observations and/or a diagnosis.

A project has been initiated to enhance the existing system by creating a customizable web interface to allow physician reporting of echocardiograms. This system would allow user physicians to customize their reports on the fly with preset patient info and study data fields. The reporting system must also be individualized for each physician group.

Last semester, we created a prototype of an enhanced system based on the Ruby on Rails (RoR) development framework. This prototype demonstrated that RoR can meet our needs and produced a base system we can build on. The new system, based on Ajax and RoR development tools, allows for an accelerated development cycle and provides a more robust, upgradeable system solution.

More important features need to be added to the base system. The project for this semester is to continue this development by adding the following new features:

  1. XML Data parsing to Database
  2. Widget Creation
  3. Template personalization (adding/removing comments per group or individual)
  4. Robust QuickTime viewing templates
  5. Customer data QA/QC module

The justification for this system is that iCardiogram employees spend approximately 15 to 20 hours per new client setting up a customized web-based PACS. The majority of this time is spent on template customization and setup. The new system will position us above our competitors in the marketplace. Reducing the time to create a customized web site by automating template selection will save time and money for both the client and iCardiogram on future installations.

A demonstration of a sample target system is available online. A demo username and password for the web-based system are included below. You will need the QuickTime player to view sample echocardiograms; download this free viewer using the links below. You can also review the past project binder to see what was accomplished last semester.

For Windows:
For Macintosh:
The URL for the iCardio site is

Once you login you will see a sample study list. For help on using the website go to

Executive Dashboard

  • Phase 1: An Executive Dashboard which enables Executives, Program Managers of strategic programs and organizational administrators to track progress of strategic initiatives globally.
  • Eventual Phase 2: leverage same infrastructure to deliver business intelligence info to Knowledge Services. (Data we can use to manage our business)
  • Figure 1 below depicts the suggested system architecture.

Figure 1. System Architecture

Requirements for Executive Dashboard

  • The latest levels of Oracle, MS .NET, and MS SharePoint are possibilities for the architecture
  • Norpass authentication (password protected)
  • Ability to express criteria for target audience based on variety of HR data elements such as country, employee type, employment status, etc.
  • Ability to freeze target audience (or not) based on programs.
  • Ability for users to view data as Employee (only see their own), Manager (see their direct reports all the way down their organization) or Executive and Program Managers (see all global data related to their program)
  • Ability for all users to download data from dashboard into Excel spreadsheet
  • Real-time or close to real-time data update frequency
  • May include pulling data from other external databases
  • Ensure security/legal requirements are considered: German Works Council concerns

Tools and Techniques for High Efficiency Software Support

Our Goal

Our large-market HR Payroll Product Quality Assurance team faces a number of challenges owing to the complexity of the product it is charged with maintaining (testing and enhancing). The team is looking for ways to improve efficiency and scale. The HRP QA team is composed of approximately 23 US associates, primarily located in RTP, NC, and 12 off-shore associates in Bangalore, India. The team's goal is to increase efficiency by at least 10% within the next year.

Our Challenge

We work with a highly customized Oracle HRP solution that has been adapted differently for each of our 7 large clients. Our QA team is responsible for testing monthly installs that update the HRP application with enhancements and bug fixes. Each enhancement requires lengthy analysis and research to develop test cases. Approximately 70% of our team members are new within the last 6 months, and there is little experience on the team with much of the functionality.

The Project

Perform analysis and design for a prototype "solution" or "solution set" to help us improve our efficiency. We expect the student team to interview our associates and analyze our current training and tools. We leave the solution open-ended to allow for a broad variety of approaches: recommendations ranging from improved team training to efficient knowledge-management tools and/or test automation are all acceptable. The technology used in the prototype solution is open, per the student team's recommendation.

Parse Performance Data Produced by the IBM System z New Virtualization Engine TS7700

Background: The IBM Virtualization Engine TS7700 provides tape virtualization for the System z environment. As the follow-on product to the highly successful IBM TotalStorage Virtual Tape Server, the TS7700 Virtualization Engine is designed to provide improved performance and capacity to help lower the total cost of ownership for tape processing. It introduces a new modular, scalable, high-performing architecture for mainframe tape virtualization.

This device maintains a significant amount of statistical performance data regarding the health and state of the managed drives. This information can be retrieved from the device by a series of commands. When instructed, the device will dump its statistical information, in binary format, to a file. A description of the format of the records within this file will be provided.

Objectives: The first objective of this project is to read the binary performance data output from a TS7700 and format it in a text file in a way that humans can understand. The second objective is to create a searchable, memory-resident data structure. A constraint on input processing is that a generalized approach to lexicographical token recognition and parsing must be adopted; that is, use of lex and yacc is expected. The end product must be a lexer to be compiled by Lex and a grammar file to be compiled by Yacc on System z (note: compiling under z/OS Unix System Services is probably the easiest and is acceptable). Access to computers running System z will be provided. Parsed output must be directed to a text output file or to the appropriately keyed memory-resident data structure. Key definitions will be defined in concert with the project sponsors.
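The step from binary records to a keyed, memory-resident structure can be sketched as follows. This is a plain-Python illustration with an invented three-field record layout; the actual deliverable is a Lex lexer plus a Yacc grammar in C, compiled on System z, working against the record format description that will be provided.

```python
# Sketch of the record-keying step. The record layout below is made up
# purely for illustration: drive id (2 bytes), timestamp (4), status (1),
# all big-endian, as mainframe data typically is.
import struct

REC = struct.Struct(">HIB")   # hypothetical fixed-size 7-byte record

def parse_records(data):
    """Decode fixed-size binary records into a dict keyed by drive id,
    each value a list of (timestamp, status) samples for that drive."""
    by_drive = {}
    for off in range(0, len(data), REC.size):
        drive, ts, status = REC.unpack_from(data, off)
        by_drive.setdefault(drive, []).append((ts, status))
    return by_drive
```

The searchable structure here is just a dictionary keyed on one field; the actual key definitions, per the objectives, will be worked out with the project sponsors.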

Student Benefits: Through this project, the students will gain knowledge and acquire skills in the following areas: state machines, parser generation, IBM hardware and mainframe technologies.

Prerequisite background: C programming.


Faculty Mentor: Dr. Purush Iyer (purush (at)

More information related to this project can be found at the CSC 492 course website under Class Notes.

A Real Time Control System for Electron and Ion Beam Imaging and Nanofabrication

The ability to fabricate (create) and image (visualize) structures at the nanometer scale (10^-9 meter) is of increasing importance to research and industry. Applications of nanofabrication include integrated circuit mask repairs, construction of nanometer-scale biological and MEMS devices, fabrication of 3D wiring structures, and artistic creations (e.g., wine glasses and an 8.8-micrometer model of the Star Ship Enterprise). These tasks are performed using instrumentation in which an electron beam or an ion beam is positioned using computer hardware and software (see Matsui*.pdf on the CSC 492 course website). Creation of complex, often multilayered structures, via deposition of material onto a sample surface or via etching the surface of a sample, or both, is accomplished by the use of special-purpose hardware. This hardware, controlled by custom software, manipulates a beam in a manner that allows the appropriate deposition or etching of material. Similarly, imaging (i.e., creating a visual image of such a structure) is accomplished by positioning a beam and reading resultant gray-scale data values derived from the beam position.

The overall goal of this project is to develop a software system to control electron and ion beam instruments for imaging and also for nanofabrication purposes. The goal of the initial phase (imaging) is to develop software to move the beam of an electron or ion beam instrument in a raster pattern. At each (X,Y) beam position in the raster scan, a voltage that provides a gray-scale quantification of the sample at the beam position (a pixel value) will be read and mapped onto a display space on the PC screen. The end result of this process is an image of the sample being scanned. The second phase of the project will allow fabrication of complex 3D structures by additional control of the beam residence time or “dwell”. Here “dwell” refers to the time that the beam spends at any one (X,Y) coordinate position. The dwell time of the ion beam dictates the amount of material etched, thus creating a raised relief image.

An example of a simple structure nanomachined using a focused ion beam is shown in the figure below. The NCSU Wolf was machined into a silicon substrate by software conversion of the BMP image of the wolf, below left. The software algorithm was written such that the dwell time of the ion beam at any one position on the substrate was proportional to the gray-scale intensity of the corresponding position on the image. Note the scale bar in the lower left of the scanning electron microscopy image of the result of the machining. The scale is in micrometers, or units of 10^-6 meters. The wolf is less than 10 millionths of a meter tall!
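The gray-scale-to-dwell conversion used for the wolf can be sketched in a few lines. This is a plain-Python illustration; the maximum dwell value is a placeholder, since the real scale depends on the instrument, beam current, and material.

```python
# Sketch: map 8-bit gray values to per-pixel beam dwell times, so that
# brighter pixels get longer dwell and hence deeper etching.

def dwell_times(pixels, max_dwell_us=100.0):
    """Convert a 2D image of gray values (0-255) into a 2D grid of
    dwell times in microseconds, linear in intensity."""
    return [[max_dwell_us * g / 255.0 for g in row] for row in pixels]

# A 1x3 gradient: black gets zero dwell, white the maximum.
grid = dwell_times([[0, 128, 255]])
```

A real implementation would quantize these values to whatever time resolution the DSP's position clock supports rather than using floats directly.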

This project will utilize a PC outfitted with a special-purpose PCI plug-in board supplied by Innovative Integration: their “Conejo” (Rabbit) board, which provides the scanning and control hardware. Software on the PC, written using C++ or another suitable language, will provide a GUI for input of image acquisition and display parameters. The Conejo board is driven by a Texas Instruments TMS320C6711 floating-point Digital Signal Processor (DSP). The board includes the high-speed data streaming capability required to send the compiled DSP code, positional information, and other data from the PC to the DSP board, and to stream measured sample-generated intensity values, status values, etc. back to the PC. The DSP and its associated on-board memory control the high-speed, high-precision digital-to-analog converters (DACs, output devices) and analog-to-digital converters (ADCs, input devices) on board the Conejo. The DSP is programmed in C, all under a special-purpose IDE.

The DACs on board the Conejo will convert programmed digital (X,Y) position information into voltages suitable for beam positioning by a beam instrument. One of the 4 ADCs on the Conejo board will be used to convert a voltage signal originating from the sample into a digital value. This digital signal intensity measured by the ADC will then be streamed to the PC and displayed as a brightness level for each (X,Y) position. The result is a gray-scale image of the sample, such as the image of the wolf above. The end result of the first phase of the project will be a system that, when interfaced to suitable beam instrumentation available in the AIF, can be used to image a sample. If time allows, this software will then be extended to write simple 2D patterns. A further extension will add the ability to write patterns having variable dwell times for each pixel, such as the wolf above.
The final product will allow fabrication of complex 3D structures via deposition of material or via etching.
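The phase-one imaging loop amounts to a raster scan that reads one intensity value per beam position; a minimal sketch follows. The `read_pixel` callback is a stand-in for the DAC-position/ADC-read cycle the DSP actually performs.

```python
# Sketch of the raster-scan imaging loop: visit each (x, y) position in
# raster order and collect one gray-scale value per position.

def raster_scan(width, height, read_pixel):
    """Return a 2D image (list of rows) by sampling read_pixel(x, y)
    at every position of a width x height raster."""
    return [[read_pixel(x, y) for x in range(width)]
            for y in range(height)]

# Stand-in sample whose brightness increases to the right.
image = raster_scan(4, 2, lambda x, y: 64 * x)
```

In the real system the inner call would write (X,Y) to the DACs, wait for the beam to settle, and read the sample's response voltage from an ADC, with the resulting stream sent to the PC for display.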

More information related to this project can be found at the CSC 492 course website under Class Notes.

Qualcomm IP Browser


Qualcomm is working on developing an internal software solution to manage its System-on-a-Chip (SoC) Intellectual Property (IP). This project is called the IP Repository. The IP Repository consists of two main pieces. The first is a database of source code currently managed in a tool called DesignSync. This piece is out of scope for this senior design project. The second piece is the IP Browser, which will be the main user interface into the IP Repository.

The IP Browser is an interactive web-based application that, when fully featured, will provide more than just the ability to view IP datasheets: it will be a starting point for building an SoC design. The IP Browser will allow the architect to select the best combination of IP blocks to meet the specifications for the design, such as features, power, area, and cost.


Here is a list of features to implement:

  • Search Features
    • Ability to find all IP supporting a clock frequency greater than a selected frequency
    • Ability to find all IP with a specific bus port or bus slave port or bus master port
    • Ability to find all IP with a term or phrase in the description
  • Reporting Features
    • Compare a list of selected IP with regards to power consumption
      • Requires application of a selected frequency and throughput added to the IP-specific power equation
      • Generate bar graph to compare power consumption
  • Chip Creation (Composite IP)
    • Create a new piece of IP in the IP Repository
    • Ability to add other IP to chip through search methods
    • Ability to run reports on composite IP (i.e. Power Comparison reports)
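The power-comparison report above reduces to evaluating each IP block's power equation at a chosen operating point and ranking the results for a bar graph. A hedged sketch follows; the linear model and field names here are invented, since each real IP entry would carry its own equation.

```python
# Sketch of the power-comparison report. The power model is a made-up
# linear equation: static power plus frequency and throughput terms.

def power_mw(ip, freq_mhz, throughput_mbps):
    """Evaluate one IP block's (hypothetical) power equation."""
    return (ip["static_mw"]
            + ip["mw_per_mhz"] * freq_mhz
            + ip["mw_per_mbps"] * throughput_mbps)

def compare(ips, freq_mhz, throughput_mbps):
    """Return (name, power) pairs sorted ascending, ready to feed a
    bar-graph charting component."""
    rows = [(ip["name"], power_mw(ip, freq_mhz, throughput_mbps))
            for ip in ips]
    return sorted(rows, key=lambda r: r[1])
```

A composite IP (chip) report would simply sum the evaluated power of the chip's member blocks at the same operating point.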

Technical Details

Source code for a prototype version of the IP Browser, written in Perl using CGI and AJAX web technologies, is available as a starting point. The prototype runs on an Apache web server on Linux with MySQL as its database backend. These are hard requirements unless there is a compelling reason to change them. The look-and-feel, UI, and DB schema are fair game for change.

Faculty Mentor: Ms. Carol Miller (miller (at)

Custom Menu Ordering System


Established in 1969, the Duke Diet and Fitness Center (DFC) has a long-established reputation as one of the most highly respected treatment centers in the world for individuals who are overweight and have a sedentary lifestyle. The program at DFC is designed to teach clients about healthy eating and exercise. Ours is an “immersion” approach to lifestyle change: people seeking lasting changes in their lives join us for two to four weeks or longer to experience healthful eating and safe exercise.

The goal of this project is to design a menu ordering system for our clients. Our approach to menu management is not accommodated by any off-the-shelf package. We believe there is a future in the restaurant industry for a menu ordering system that helps consumers manage their diets.

About the Application

The goal of the proposed package is to automate our menu ordering system. Our menus run in a four-week cycle that changes twice annually. Each meal provides a different main entree along with a constant selection of back-of-the-menu items available at every meal. The menu system must be browser-based with security features. Only clients who have been pre-cleared will have access to the system. We anticipate an average of about 130 people using the program at any given time, ranging from as low as 80 to as high as 160.

General Criteria for Use

  • Clients will obtain clearance from accounting and receive a password for the program.
  • Clients will enter their personal ranges for total calories and grams of each food component for each meal/day.
  • Food components are broken down by Starch, Protein, Fruit, Vegetable, Dairy, and Fat servings for each meal/day.
  • Upon entry, a client’s calories and food components will be tallied for each day.
  • If entries are out of range, the system should alert the client.
  • The client will repeat this process for all seven days.
  • When the client has finished entering menu data for the week, the menu should be sent to the Nutrition Dept. where it is reviewed for accuracy.
  • Once approved, the menus are all sent in a package to the kitchen. Counts for each meal are tallied and can be printed.
  • When a client comes through for a meal he/she can swipe his/her badge and a readout of his/her selections for the meal will be produced for the kitchen staff.
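The tally-and-alert steps above can be sketched as follows. Component names are taken from the criteria; the calorie range passed in is a placeholder the Nutrition Dept. would supply per client.

```python
# Sketch of the per-day tally and out-of-range check.

COMPONENTS = ("starch", "protein", "fruit", "vegetable", "dairy", "fat")

def check_day(meals, calorie_range):
    """Tally one day's meals and flag out-of-range calorie totals.

    `meals` is a list of dicts, each with a 'calories' key plus optional
    serving counts per food component. Returns (total, servings, alert),
    where alert is None when the total is within the client's range."""
    total = sum(m["calories"] for m in meals)
    servings = {c: sum(m.get(c, 0) for m in meals) for c in COMPONENTS}
    lo, hi = calorie_range
    alert = None if lo <= total <= hi else (
        f"daily calories {total} outside range {lo}-{hi}")
    return total, servings, alert
```

The full system would run this check as entries are made, repeat it for all seven days, and forward the week's menus to the Nutrition Dept. for review.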

This general overview can be enhanced with details established by the team in conjunction with the sponsor. Go Pack!

The NC State student team will be joined by counterparts at Michigan Tech. Team interaction will be accomplished remotely. Internet based technology such as VoIP, videoconferencing, Wiki, etc. will be used.