# Project
| # | Title | Team Members | TA | Documents | Sponsor |
|---|---|---|---|---|---|
| 29 | Interactive Projection System on Arbitrary Surfaces | Jie Xu, Jing Weng, Yuqi Tang, Zibo Dai | Liangjing Yang | | |
# Problem

Most current smart devices rely on fixed-size screens for human-computer interaction, which limits display area, temporary collaboration, and natural input. Projection technology can extend interfaces into the physical environment, but conventional projectors usually provide visual output only and cannot support stable direct touch interaction across surfaces with different shapes, sizes, and materials.

Our project aims to develop a system that projects an interactive user interface onto arbitrary physical surfaces and supports direct touch input on the projected area. This is a meaningful and technically challenging problem because the system must address not only projection but also surface detection, projector-sensor calibration, touch localization, and real-time interaction feedback. We will begin by validating the first prototype on a normal wall, and then extend the design toward more general surfaces such as desks, paper, and other physical objects.

# Solution Overview

We propose to build an interactive projection system that integrates projection hardware, vision-based sensing, and embedded control. A projection module will display a graphical user interface on the target surface, while a camera or depth-based sensing module will monitor the surface and detect the position of a user's finger during interaction. The sensed position will then be mapped into the projected interface coordinate system so that the system can recognize basic actions such as clicking and dragging, forming a complete display–sensing–recognition–feedback loop.

The first implementation will be validated on a flat, stable wall surface; however, the overall architecture will be designed for extension to arbitrary surfaces, with attention to surface size variation, pose variation, and adaptive interface placement.
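For the initial flat-wall case, mapping a sensed fingertip position from camera pixels into projected-interface pixels reduces to a plane-to-plane homography estimated from a few calibration correspondences. The sketch below shows the standard direct linear transform (DLT) formulation; the function names `fit_homography` and `cam_to_ui` are our own illustrative choices, and in practice a library routine such as OpenCV's `cv2.findHomography` would likely be used instead:

```python
import numpy as np

def fit_homography(cam_pts, proj_pts):
    """Estimate a 3x3 homography H mapping camera pixels to projector/UI
    pixels via the direct linear transform (DLT). Needs >= 4 point pairs
    (e.g., from projecting known calibration markers onto the wall)."""
    A = []
    for (x, y), (u, v) in zip(cam_pts, proj_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # H (up to scale) is the right singular vector of A with the
    # smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def cam_to_ui(H, x, y):
    """Map one detected fingertip position (camera pixels) to UI pixels."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

With noise-free correspondences the estimate is exact up to scale; with real camera detections, more point pairs and a robust estimator (e.g., RANSAC) would be needed.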
Prior research shows that the key technical problems of arbitrary-surface interactive projection include surface segmentation and tracking, projector-camera calibration, and interaction-area definition, which directly motivates our design.

# Solution Components

The proposed system consists of the following major components.

## Projection Display Module

Projects a graphical user interface onto the target surface and adjusts the displayed area according to surface size, position, and orientation.

## Surface Sensing Module

Uses a camera or depth/vision sensor to capture image or depth information from the target surface, detect surface geometry, and identify the available interactive area.

## Touch Detection and Interaction Recognition Module

Detects whether the user's finger is touching the projected surface and recognizes basic interaction events such as tapping and dragging.

## Coordinate Calibration and Mapping Module

Establishes the spatial relationship between the sensing system and the projector so that detected touch points can be accurately mapped to interface locations.

## Embedded Control and System Integration Module

Executes control logic, coordinates sensing and projection data flow, and manages communication and power across the system.

## Mechanical Support Structure

Provides stable mounting for the projector, sensors, and control hardware so that the relative geometry remains fixed and repeatable during calibration and testing.

# Criteria of Success

The project will be considered successful based on the following criteria.

1. The system must project a stable, visible interactive interface onto at least one physical surface and maintain usable operation during demonstration.
2. It must detect direct touch input within the projected area and correctly trigger at least one basic interaction event, such as a click.
3. Touch localization accuracy must be sufficient for users to complete simple interface tasks such as button selection or menu navigation.
4. The system must demonstrate extensibility toward arbitrary surfaces by supporting interaction on at least one additional surface beyond a wall.
5. The complete prototype must support a demonstrable application scenario, such as a numeric keypad, simple control panel, or menu-based interface, showing that the full interaction loop has been implemented.

These success criteria match the course expectation that requirements should be clear and verifiable, and they are also consistent with prior evaluation methods for click detection and drag interaction in projected interactive systems.
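To make the click-detection criterion above concrete: with a depth sensor, a common approach is to treat a fingertip as "touching" when its measured depth falls within a small band of the calibrated surface depth, and to debounce over a few frames so sensor noise does not fire spurious clicks. The class, band width, and frame counts below are illustrative assumptions, not measured system parameters:

```python
# Hypothetical sketch: depth-band touch detection with frame debouncing.
SURFACE_BAND_MM = 10   # fingertip within 10 mm of the surface counts as contact
DEBOUNCE_FRAMES = 3    # contact must persist this many frames to register a tap

class TouchDetector:
    def __init__(self, surface_depth_mm):
        # Calibrated depth of the projection surface at the touch point
        # (per-pixel in a real system; a scalar here for simplicity).
        self.surface = surface_depth_mm
        self.contact_frames = 0

    def update(self, fingertip_depth_mm):
        """Feed one frame's fingertip depth; return True on a confirmed tap."""
        touching = abs(self.surface - fingertip_depth_mm) < SURFACE_BAND_MM
        if touching:
            self.contact_frames += 1
        else:
            self.contact_frames = 0
        # Fire exactly once, the first frame contact has persisted long enough.
        return self.contact_frames == DEBOUNCE_FRAMES
```

For example, a fingertip approaching an 800 mm surface over several frames would trigger a single tap event once it stays inside the band for three consecutive frames; extending this to drags would mean reporting held contact plus the homography-mapped position each frame.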