# Project

| # | Title | Team Members | TA | Documents | Sponsor |
|---|-------|--------------|----|-----------|---------|
| 69 | Paint Color and Gloss Classification Device | Charis Wang, James Lee, Victoria Lee | Chihun Song | proposal1.pdf | |
# Title
Paint / Sheen Analysis Device

# Team Members:
- James Lee (jl212)
- Victoria Lee (vlee33)
- Charis Wang (cwang274)

# Problem
Homeowners, renters, and especially college students frequently face the challenge of matching the color and finish of existing wall paint for touch-ups or repairs, often without access to the original paint can. It is possible to peel a physical chip off the wall and have it scanned, but doing so damages the wall and is inconvenient. Mobile apps exist, but they rely on smartphone cameras that apply automatic white balance and are heavily influenced by ambient lighting. These solutions also do not account for sheen (e.g., matte vs. eggshell), so even the best color match can look off once applied. The result is wasted time and materials and a poor color match.

# Solution
We propose a non-destructive "Paint/Surface Analysis Device" that accurately identifies both wall color and sheen without removing a physical paint chip. Our device uses a controlled lighting environment and a spectral color sensor to determine the precise color composition (hex code) of the wall. To address gloss, the device integrates a secondary computer vision subsystem that uses "raking light" (low-angle side lighting); this illumination technique reveals the paint finish (e.g., gloss vs. semi-gloss).

## Subsystem 1: Microcontroller and Processing
Coordinates sensor data acquisition, executes the matching algorithm, and manages system timing. It converts the spectral sensor data into a standard color space and then matches the measured color against a paint color database stored in memory.
Components: STM32F7 Series Microcontroller (High-performance with DCMI for camera support)
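The proposal does not specify the matching algorithm in detail; below is a minimal sketch of how the nearest database entry could be found, assuming measured colors are converted to CIELAB and compared with the CIE76 Delta-E metric. The types `lab_t` and `paint_entry_t` and the function `find_closest_paint` are illustrative names, not part of the proposal.

```c
#include <math.h>
#include <stddef.h>

/* Illustrative types; the real firmware structures are not specified in the proposal. */
typedef struct { float L, a, b; } lab_t;
typedef struct { const char *brand; const char *name; lab_t lab; } paint_entry_t;

/* CIE76 Delta-E: Euclidean distance in CIELAB space. */
static float delta_e76(lab_t x, lab_t y) {
    float dL = x.L - y.L, da = x.a - y.a, db = x.b - y.b;
    return sqrtf(dL * dL + da * da + db * db);
}

/* Return the index of the database entry closest to the measured color,
 * or -1 if the database is empty. */
static int find_closest_paint(lab_t measured, const paint_entry_t *db, size_t n) {
    int best = -1;
    float best_de = 1e9f;
    for (size_t i = 0; i < n; i++) {
        float de = delta_e76(measured, db[i].lab);
        if (de < best_de) { best_de = de; best = (int)i; }
    }
    return best;
}
```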
## Subsystem 2: Sheen Analysis
We intend to shine an LED at the surface at a 60-degree angle and measure how much light reflects back. A large amount of reflected light indicates a glossy surface; a small amount indicates a matte finish.
Components: Low-angle "Raking Light" LED array, AS7341 11-Channel Spectral Sensor, calibrated neutral-white LED, Photodiode
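As a rough illustration of this classification step, the sketch below assumes the photodiode's specular reading is normalized against a reference reading and compared to thresholds; the threshold values, type names, and function name are placeholders to be replaced by calibrated values, not figures from the proposal.

```c
/* Illustrative sheen classifier: the photodiode measures specular reflection
 * at 60 degrees, normalized by a reference (e.g., calibration-tile) reading.
 * Threshold values are placeholders; real cutoffs must be calibrated
 * against known gloss, semi-gloss, and flat samples. */
typedef enum { SHEEN_FLAT, SHEEN_SEMI_GLOSS, SHEEN_GLOSS } sheen_t;

static sheen_t classify_sheen(float specular_adc, float reference_adc) {
    float ratio = (reference_adc > 0.0f) ? specular_adc / reference_adc : 0.0f;
    if (ratio > 0.60f) return SHEEN_GLOSS;       /* strong specular bounce */
    if (ratio > 0.25f) return SHEEN_SEMI_GLOSS;  /* moderate bounce */
    return SHEEN_FLAT;                           /* mostly diffuse scatter */
}
```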


## Subsystem 3: Spectral Sensing
Measures the absolute color composition of the sample under calibrated internal lighting.
Components: AS7341 11-Channel Spectral Sensor, calibrated neutral-white LED
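One possible way to turn the AS7341's visible-channel counts into a device-independent color is a linear calibration matrix fitted against known reference targets. The sketch below assumes a 3x8 matrix mapping channels F1-F8 to CIE XYZ; the zero coefficients are placeholders standing in for values obtained during calibration under the internal neutral-white LED.

```c
/* Illustrative conversion from AS7341 visible channels (F1-F8) to CIE XYZ
 * using a 3x8 calibration matrix. The coefficients below are placeholders;
 * a real matrix would be fitted by measuring reference color targets
 * under the device's internal neutral-white LED. */
#define NUM_CHANNELS 8

static const float CAL_MATRIX[3][NUM_CHANNELS] = {
    /* X row */ {0},
    /* Y row */ {0},
    /* Z row */ {0},
};

static void channels_to_xyz(const float counts[NUM_CHANNELS], float xyz[3]) {
    for (int row = 0; row < 3; row++) {
        xyz[row] = 0.0f;
        for (int ch = 0; ch < NUM_CHANNELS; ch++) {
            xyz[row] += CAL_MATRIX[row][ch] * counts[ch];
        }
    }
}
```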
## Subsystem 4: User Interface
Displays the identified paint brand, color name, and recommended applicator type.
Components: 2.8" TFT LCD Display, Rotary Encoder for menu navigation
## Subsystem 5: Power Management
Regulates external power for sensitive analog sensors and high-current LED subsystems.
Components: 12V DC Wall Adapter, Buck Converters (5V), and Low-Noise LDO Regulators (3.3V)
## Subsystem 6: Enclosure
Blocks outside light and fixes the spectral sensor's position and angle for reproducible results.
Components: Cardboard Box with fixed cutouts for reproducible measurements

# Criterion for Success
- Color Accuracy: Achieve a color match with a Delta-E < 3.0 across multiple measurements, which represents a commercially acceptable match for consumer-grade applications (see "How Is Color Measured? Calculating Delta E," ALPOLIC®).
- Sheen Classification: Correctly distinguish between "Gloss," "Semi-Gloss," and "Flat" finishes with at least 90% accuracy.
- Ambient Isolation: Maintain consistent color readings regardless of external room lighting conditions.
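
For reference, the Delta-E figure above is assumed to refer to the CIE76 color difference, i.e., the Euclidean distance between two colors in CIELAB space:

$$\Delta E_{76} = \sqrt{(L_2 - L_1)^2 + (a_2 - a_1)^2 + (b_2 - b_1)^2}$$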
