# 29: Interactive Projection System on Arbitrary Surfaces

Team Members: Jie Xu, Jing Weng, Yuqi Tang, Zibo Dai

TA: Liangjing Yang
# Problem

Most current smart devices rely on fixed-size screens for human-computer interaction, which limits the available display area and hinders ad-hoc collaboration and natural input. Projection technology can extend interfaces into the physical environment, but conventional projectors provide visual output only and cannot support stable direct-touch interaction across surfaces of different shapes, sizes, and materials.

Our project aims to develop a system that projects an interactive user interface onto arbitrary physical surfaces and supports direct touch input on the projected area. This is a meaningful and technically challenging problem because the system must address not only projection, but also surface detection, projector-sensor calibration, touch localization, and real-time interaction feedback. We will begin by validating the first prototype on a normal wall, and then extend the design toward more general surfaces such as desks, paper, and other physical objects.


# Solution Overview

We propose to build an interactive projection system that integrates projection hardware, vision-based sensing, and embedded control. A projection module will display a graphical user interface on the target surface, while a camera or depth-based sensing module will monitor the surface and detect the position of a user’s finger during interaction.

The sensed position will then be mapped into the projected interface coordinate system so that the system can recognize basic actions such as clicking and dragging, forming a complete display-sensing-recognition-feedback loop. The first implementation will be validated on a flat and stable wall surface; however, the overall architecture will be designed for extension to arbitrary surfaces, with attention to surface size variation, pose variation, and adaptive interface placement.
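As a concrete sketch of this mapping step, a planar homography can relate camera pixels to projected-interface coordinates once four or more calibration correspondences are known. The function names and sample correspondences below are illustrative assumptions, not part of our implementation:

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography H mapping src[i] -> dst[i] via the
    direct linear transform (DLT), given 4+ point correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A (last right-singular vector).
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_point(H, pt):
    """Map a detected touch point from camera pixels to interface pixels."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical calibration: four camera-pixel corners of the projected area
# paired with the corresponding corners of an 800x600 interface.
cam = [(100, 80), (540, 90), (530, 420), (110, 410)]
ui = [(0, 0), (800, 0), (800, 600), (0, 600)]
H = homography(cam, ui)
```

With the calibration fixed by the mechanical support structure, `map_point` then turns every sensed fingertip position into an interface coordinate for hit-testing.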

Prior research shows that the key technical problems of arbitrary-surface interactive projection include surface segmentation and tracking, projector-camera calibration, and interaction area definition, which directly motivates our design.


# Solution Components

The proposed system consists of the following major components:

## Projection Display Module
Projects a graphical user interface onto the target surface and adjusts the displayed area according to surface size, position, and orientation.

## Surface Sensing Module
Uses a camera or depth/vision sensor to capture image or depth information from the target surface, detect surface geometry, and identify the available interactive area.

## Touch Detection and Interaction Recognition Module
Detects whether the user’s finger is touching the projected surface and recognizes basic interaction events such as tapping and dragging.
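One common way to realize this with a depth sensor is to compare the fingertip's depth against a calibrated background depth of the empty surface: a fingertip within a small threshold of the surface is treated as a touch. This is a minimal sketch under that assumption; the threshold and synthetic values are illustrative:

```python
import numpy as np

def detect_touch(depth_map, surface_depth, finger_xy, touch_mm=10.0):
    """Touch if the fingertip pixel lies within touch_mm in front of the surface.

    depth_map:     current per-pixel depth (mm) from the sensor
    surface_depth: background depth (mm) captured with no hand present
    finger_xy:     (x, y) pixel of the detected fingertip
    """
    x, y = finger_xy
    gap = surface_depth[y, x] - depth_map[y, x]  # distance of finger in front of surface
    return 0.0 <= gap <= touch_mm

# Synthetic check: a flat wall at 1000 mm everywhere
surface = np.full((480, 640), 1000.0)
touching = surface.copy(); touching[240, 320] = 995.0  # 5 mm in front
hovering = surface.copy(); hovering[240, 320] = 950.0  # 50 mm in front
```

Tap and drag events would then be built on top of this boolean signal, e.g. by tracking touch onset, movement, and release over consecutive frames.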

## Coordinate Calibration and Mapping Module
Establishes the spatial relationship between the sensing system and the projector so that detected touch points can be accurately mapped to interface locations.

## Embedded Control and System Integration Module
Executes control logic, coordinates sensing and projection data flow, and manages communication and power across the system.

## Mechanical Support Structure
Provides stable mounting for the projector, sensors, and control hardware so that the relative geometry remains fixed and repeatable during calibration and testing.


# Criteria of Success

The project will be considered successful based on the following criteria.

1. The system must project a stable and visible interactive interface onto at least one physical surface and maintain usable operation during demonstration.
2. It must detect direct touch input within the projected area and correctly trigger at least one basic interaction event, such as a click.
3. The touch localization accuracy must be sufficient for users to complete simple interface tasks such as button selection or menu navigation.
4. The system must demonstrate extensibility toward arbitrary surfaces by supporting interaction on at least one additional surface beyond a wall.
5. The complete prototype must support a demonstrable application scenario, such as a numeric keypad, simple control panel, or menu-based interface, showing that the full interaction loop has been implemented.

These success criteria match the course expectation that requirements should be clear and verifiable, and they are also consistent with prior evaluation methods for click detection and drag interaction in projected interactive systems.

A Micro-Tribotester to Characterize the Wear Phenomenon

Shuren Li, Boyang Shen, Sirui Wang, Ze Wang


**Problem**

Many research efforts have been made to understand the complex wear mechanisms in sliding systems, so that wear, and the industrial losses it causes, can be reduced. To characterize the wear process, the coefficient of friction needs to be measured “not only after completion of the wear test but also during the wear test to understand the transitional wear behavior that led to the final state” (Penkov). To improve the effectiveness and efficiency of this research, the instrument used to characterize wear must be improved to better measure the friction coefficient of the material. Although the instrument can be applied to any solid sample, we will use a silicon wafer coated with SiO2 as our target specimen.

**Solution Overview**

The objective of the experiment is to evaluate the wear phenomenon of the sample during a sliding test and thus obtain wear information about the material. We will design a planar positioning and force sensing system to record the motion and forces acting on the specimen. To collect vertical load and horizontal friction data, two force sensors are mounted on linear rails to minimize radial forces and ensure that only axial forces are collected. The coefficient of friction can then be calculated by the equation:

![](https://courses.grainger.illinois.edu/ece445zjui/pace/getfile/18615)

To determine the relationship between the coefficient of friction and the state of wear, we use a microscope to monitor a fixed location in the wear track and evaluate the wear process after each sliding cycle. In this way, we can investigate the wear transition processes with respect to sliding distance and then transfer the data to a computer. Finally, we will design a data processing method on the computer to obtain results within an acceptable margin of error.
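Under the standard definition μ = F_friction / F_load, the per-sample calculation is straightforward; the guard against a zero-load reading is our own addition for robustness:

```python
def coefficient_of_friction(friction_n, load_n):
    """Per-sample mu = F_friction / F_load (both in newtons).
    Returns NaN for samples where the measured load is zero."""
    return [f / n if n else float("nan") for f, n in zip(friction_n, load_n)]

# Illustrative readings: friction rising over three samples at 1 N load
mu = coefficient_of_friction([0.10, 0.12, 0.30], [1.0, 1.0, 1.0])
```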

**Solution Components**

1. Motion Platform: This subsystem includes a linear actuator that moves the sample in reciprocating motion along X-axis, a stationary counter surface that applies constant vertical load onto the sample, and another actuator that compresses the spring and provides a vertical load to the counter sample.

2. Specimen and Counter Surface: We will test the wear and friction between the specimen and the counter surface during the sliding test. A 10 × 10 mm^2 silicon (Si) wafer coated with 50 nm thick SiO2 will be used as the specimen, and a stainless-steel ball with a diameter of 1 mm will be used as the counter surface.

3. Sensors: This subsystem includes two force sensors that measure the vertical load and horizontal friction. The load sensor is assembled along with the Z-axis actuator. To measure friction without the effect of the load, we mount the load sensor and friction sensor on the linear rails, as the attached photo shows. Since the sensors are strain gauges whose output is only a small change in resistance, amplifiers and an ADC are needed to condition the signal and send the converted data to the computer.
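After amplification and analog-to-digital conversion, each channel still needs a calibration from raw counts to newtons. A minimal linear-calibration sketch follows; the gain and offset values are hypothetical and would come from dead-weight calibration:

```python
def adc_to_force(counts, gain_n_per_count, offset_counts):
    """Linear strain-gauge calibration: force (N) = gain * (counts - offset)."""
    return gain_n_per_count * (counts - offset_counts)

# Hypothetical 12-bit channel: zero load reads 1024 counts, 0.01 N per count
force_n = adc_to_force(2048, 0.01, 1024)  # -> 10.24 N
```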

4. Data Processing: This subsystem includes acquiring raw data of load and friction on the computer, applying necessary filters to reduce noise and improve accuracy, and plotting the result that reflects the relationship between the sliding cycles and coefficient of friction for our sample.

![](https://courses.grainger.illinois.edu/ece445zjui/pace/getfile/18611)
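For the filtering step in the Data Processing subsystem, a simple centered moving average is one candidate noise filter for the coefficient-of-friction trace; the window length below is illustrative and would be tuned against the sensor noise:

```python
import numpy as np

def smooth(cof, window=25):
    """Centered moving-average filter over a coefficient-of-friction trace;
    window is in samples. Edge samples are partially averaged with zeros."""
    kernel = np.ones(window) / window
    return np.convolve(cof, kernel, mode="same")

# A constant trace passes through unchanged away from the edges
trace = np.full(200, 0.5)
filtered = smooth(trace)
```

The filtered trace can then be plotted against sliding cycles to expose wear transitions.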

**Criterion for Success**

1. The motion platform can perform precise reciprocating motion, and the control system can effectively control the number and speed of reciprocating cycles.

2. The acquisition unit can collect data effectively and transfer it to the computer in a form suitable for processing.

3. On the computer, the raw data can be processed into a readable graph by the algorithms we set up. By analyzing the graph, the relationship between the measured data and the expected results can be correctly identified.

**References**

Penkov OV, Khadem M, Nieto A, Kim T-H, Kim D-E. Design and Construction of a Micro-Tribotester for Precise In-Situ Wear Measurements. Micromachines. 2017; 8(4):103. https://doi.org/10.3390/mi8040103