Project #43: Autonomous Transport Car
Sponsor: Chushan Li
Documents: other2.pdf, other3.pdf, other4.pdf
Team Members:
Ma Jingyuan (674072315)
Xubin Shen (677258677)
Tao Yiqi (670182981)
Zhang Haotian (676598571)

Problem Overview: In recent years, the demand for autonomous goods-transport systems has been growing, and people are seeking ways to improve efficiency in logistics. Traditional retrieval methods are largely manual: workers deliver packages by hand, which is exhausting, time-consuming, and unreliable. Existing solutions often depend heavily on human operation and lack automation in identifying, selecting, and transporting items, which limits the further development of the logistics industry. In addition, training a transport device to find and follow a proper path accurately is a significant challenge, and a convenient way for users to issue instructions and receive feedback is also necessary. To address these problems, we propose an autonomous transport car that can grab and deliver goods, with intelligent color-based object recognition and an accurate navigation system. This project aims to design an autonomous system for searching, grabbing, and transporting designated items, improving efficiency and reducing dependence on human labor.

Solution Overview: The Autonomous Transport Car project aims to develop an intelligent vehicle capable of autonomously searching for and transporting specified goods. This solution integrates advanced technologies such as autonomous driving, motor systems, mechanical manipulation, and computer vision to achieve efficient and reliable operation.

Autonomous Driving System: The design includes an autonomous driving system that navigates the environment using sensors and control algorithms. The vehicle follows a preset ground trajectory and detects and avoids obstacles on its way to the designated platform.
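As a rough illustration of this line-following behavior (the component list later in this proposal specifies three IR sensors with PID control), here is a minimal Python sketch of the control loop. It is only a sketch: in the actual design this logic would run on the Arduino Mega in C, and read_ir_sensors() and set_motor_speeds() are hypothetical placeholders for the real sensor and motor-driver interfaces.

```python
import time

# PID gains and speeds are placeholder values; they would be tuned on the real vehicle.
KP, KI, KD = 0.8, 0.0, 0.15
BASE_SPEED = 0.5              # nominal forward speed, normalized to [0, 1]
SENSOR_WEIGHTS = [-1, 0, 1]   # left, center, right IR sensors

def line_error(ir_readings):
    """Weighted average of the active IR sensors -> signed offset from the line."""
    active = [w for w, hit in zip(SENSOR_WEIGHTS, ir_readings) if hit]
    return sum(active) / len(active) if active else 0.0

def follow_line(read_ir_sensors, set_motor_speeds):
    integral, prev_error, prev_t = 0.0, 0.0, time.time()
    while True:
        now = time.time()
        dt = max(now - prev_t, 1e-3)
        error = line_error(read_ir_sensors())       # e.g., (False, True, False)
        integral += error * dt
        derivative = (error - prev_error) / dt
        correction = KP * error + KI * integral + KD * derivative
        # Positive error -> line is to the right -> slow the right wheel to steer right.
        set_motor_speeds(BASE_SPEED + correction, BASE_SPEED - correction)
        prev_error, prev_t = error, now
        time.sleep(0.02)                            # ~50 Hz control loop
```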
Motor Power System: The vehicle is equipped with an efficient and stable motor power system, utilizing power electronic components and control algorithms.
Gripping Structure: A mechanical gripping structure, driven by motors and actuators, is designed and assembled to pick up goods from shelves. It adjusts its gripping force according to the size and shape of the items.
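As a rough sketch of this force-limited grip, using the pressure feedback named in the component list, the gripper can be closed in small steps until the sensed force crosses a threshold. read_pressure() and set_gripper_angle() are hypothetical placeholders, and all constants are illustrative.

```python
import time

GRIP_FORCE_THRESHOLD = 1.5          # Newtons; placeholder value, tuned per item type
OPEN_ANGLE, CLOSED_ANGLE = 90, 10   # servo angles in degrees (assumed geometry)
STEP = 2                            # degrees per control step

def grasp(read_pressure, set_gripper_angle):
    """Close the gripper in small steps until sufficient contact force is sensed."""
    angle = OPEN_ANGLE
    set_gripper_angle(angle)
    while angle > CLOSED_ANGLE:
        if read_pressure() >= GRIP_FORCE_THRESHOLD:
            return True              # firm grip without over-squeezing the item
        angle -= STEP
        set_gripper_angle(angle)
        time.sleep(0.05)             # let the servo settle before re-reading the force
    return False                     # fully closed without reaching the threshold
```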
Camera Recognition: The vehicle is equipped with a camera recognition system that identifies the types and colors of goods on the shelves and locates and selects the specified items.
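A minimal sketch of this color-based recognition, using OpenCV on the Raspberry Pi. The HSV range shown is an illustrative example for red items and would need calibration, and the camera is assumed to be exposed as a standard video device to cv2.VideoCapture.

```python
import cv2
import numpy as np

# Example HSV range for red items (illustrative; real ranges must be calibrated).
LOWER_RED = np.array([0, 120, 70])
UPPER_RED = np.array([10, 255, 255])

def find_target(frame):
    """Return the pixel center (x, y) of the largest red region, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_RED, UPPER_RED)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(c)
    return (x + w // 2, y + h // 2)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)        # assumes the Pi camera appears as /dev/video0
    ok, frame = cap.read()
    if ok:
        print("Target center:", find_target(frame))
    cap.release()
```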
Solution Components:

1. Autonomous Driving System
   - Follows preset trajectories via IR sensors and PID control.
   - Avoids obstacles using ultrasonic sensors and reroutes dynamically.
2. Camera Recognition System
   - Raspberry Pi Camera Module V2 with OpenCV for color-based item detection.
3. Gripping Mechanism
   - 2-DOF servo-driven gripper with pressure feedback, tailored for lightweight boxes.
4. Communication & Control
   - Arduino Mega handles motor control and sensor data.
   - HC-05 Bluetooth module enables app-based commands (e.g., “return to base”); a command-handling sketch follows this list.
5. User Interface
   - Mobile app with minimalistic UI for issuing commands and receiving status updates.
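For the app-based commands in items 4 and 5, a simple newline-terminated text protocol over the HC-05 serial link keeps parsing trivial on both ends. The sketch below uses pyserial; the port name and baud rate are assumptions, and handle_command() is a hypothetical hook into the navigation and gripper logic.

```python
import serial

# Assumptions: the HC-05 is bound to /dev/rfcomm0 at its default 9600 baud,
# and the app sends one newline-terminated text command at a time.
PORT, BAUD = "/dev/rfcomm0", 9600

def command_loop(handle_command):
    """Read text commands from the Bluetooth link and acknowledge each one."""
    with serial.Serial(PORT, BAUD, timeout=1) as link:
        while True:
            raw = link.readline().decode("ascii", errors="ignore").strip()
            if not raw:
                continue                             # timeout with no data
            status = handle_command(raw.lower())     # e.g., "retrieve red item"
            link.write(f"ACK {raw}: {status}\n".encode("ascii"))
```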

Project Goals: Successful outcomes will include:

1. Functional Hardware Prototype with Technical Specifications
   - Chassis: 4-wheel modular chassis using 12V geared DC motors, controlled by an L298N motor driver.
   - Computational Units: Raspberry Pi 4B (4GB RAM) for computer vision (OpenCV-based color detection) and high-level navigation logic; Arduino Mega for low-level motor control, sensor interfacing (e.g., ultrasonic, IR), and gripper actuation.
   - Sensors: 3x TCRT5000 IR sensors for line following; 2x HC-SR04 ultrasonic sensors for obstacle detection; FlexiForce pressure sensors on the gripper for force feedback.
   - Actuators: 2-DOF servo-based gripper (SG90 servos) optimized for lightweight (≤500g), box-shaped items.
   - Power: Dual 7.4V LiPo batteries (separate power supply for motors and logic units).
   - Communication: HC-05 Bluetooth module for app integration.
2. Reliable Software Implementation
   - Autonomous Navigation: PID-controlled line tracking using IR sensors; obstacle avoidance via ultrasonic sensors with dynamic path recalculation.
   - Object Recognition: Color-based identification (targeting specific HSV ranges) using the Raspberry Pi Camera Module V2; localization within a 1m x 1m shelf area.
   - App Integration: Basic command interface (e.g., “retrieve red item”) with real-time status feedback via Bluetooth/Wi-Fi.
3. Scope and Success Metrics
   - Functional Limitations: Gripping mechanism designed for standardized, rigid items (no fragile/irregular shapes); navigation restricted to flat indoor environments with clear line markings.
   - Demonstrated Outcomes: End-to-end operation in a 5m x 5m test area (identify target item → plan path → grasp → transport → deliver) with a ≥85% success rate across 20 trials; see the mission-sequence sketch after this list.
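The end-to-end demonstration in item 3 (identify → plan → grasp → transport → deliver) maps naturally onto a small mission sequence on the Raspberry Pi. The sketch below is illustrative only; each step function is a hypothetical placeholder for the subsystems described above.

```python
from enum import Enum, auto

class Mission(Enum):
    IDENTIFY = auto()
    PLAN = auto()
    GRASP = auto()
    TRANSPORT = auto()
    DELIVER = auto()

def run_mission(steps):
    """Advance through the delivery task; each step callable returns True on success."""
    order = [Mission.IDENTIFY, Mission.PLAN, Mission.GRASP,
             Mission.TRANSPORT, Mission.DELIVER]
    for state in order:
        if not steps[state]():        # e.g., steps[Mission.GRASP] closes the gripper
            return False              # abort (or retry) if any stage fails
    return True                       # a completed trial toward the >=85% success metric
```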

Expectations for Team Members:

- Attend all meetings prepared (e.g., review agendas, complete assigned tasks).
- Communicate progress, blockers, or delays proactively (no “radio silence”).
- Respect deadlines; renegotiate timelines early if conflicts arise.
- Provide constructive feedback during reviews and respond openly to critiques.
- Document work thoroughly for seamless handovers.
- Escalate risks (e.g., technical hurdles, miscommunication) immediately.

Interactive Proximity Donor Wall Illumination

Featured Project

Team Members:

Anita Jung (anitaj2)

Sungmin Jang (sjang27)

Zheng Liu (zliu93)

Link to the idea: https://courses.engr.illinois.edu/ece445/pace/view-topic.asp?id=27710

Problem:

The Donor Wall on the southwest side of the first floor of ECEB celebrates and shows appreciation for everyone who helped and donated to ECEB.

However, because of poor lighting and the low color contrast between the copper and the wall behind it, donor names are not noticed as much as they should be, especially after sunset.

Solution Overview:

Here is the image of the Donor Wall:

http://buildingcampaign.ece.illinois.edu/files/2014/10/touched-up-Donor-wall-by-kurt-bielema.jpg

We are going to design and implement a dynamic and interactive illuminating system for the Donor Wall by installing LEDs on the background. LEDs can be placed behind the names to softly illuminate each name. LEDs can also fill in the transparent gaps in the “circuit board” to allow for interaction and dynamic animation.

Our project’s system would contain two basic modes:

Default mode: When there is nobody near the Donor Wall, the names are softly illuminated from the back of each name block.

Moving mode: When the sensors detect a stimulus such as a person walking nearby, the LEDs are controlled to animate “current” or “pulses” flowing through the “circuit board” into the name blocks.

Depending on the progress of our project, we have some additional modes:

Pressing mode: When someone physically presses on a name block, detected by pressure sensors, the LEDs are controlled to animate the scattering of outgoing light, just as if a wave of light were emitted from that name block.
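As a rough illustration of how these “pulse” and outgoing-wave animations could be generated in software, one LED chain can be treated as an array of brightness values with a bright spot that moves over time. push_frame() is a hypothetical stand-in for whatever LED driver interface is eventually chosen, and all constants are illustrative.

```python
import math
import time

NUM_LEDS = 60          # length of one LED chain along a "trace" (assumed)
PULSE_WIDTH = 4.0      # how many LEDs wide the bright spot is
SPEED = 30.0           # LEDs per second the pulse travels

def pulse_frame(t, origin=0):
    """Brightness (0-255) for each LED: a Gaussian bump moving away from the origin."""
    center = origin + SPEED * t
    return [int(255 * math.exp(-((i - center) / PULSE_WIDTH) ** 2))
            for i in range(NUM_LEDS)]

def animate_pulse(push_frame, duration=2.0):
    start = time.time()
    while time.time() - start < duration:
        push_frame(pulse_frame(time.time() - start))
        time.sleep(1 / 60)           # ~60 FPS update rate
```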

Solution Components:

Sensor Subsystem:

IR sensors (PIR modules, or IR LEDs with phototransistors) or ultrasonic sensors to detect the presence and proximity of people in front of the Donor Wall.

Pressure sensors to detect if someone is pressing on a block.

Lighting Subsystem:

A large number of LEDs needs to be installed on the PCBs to form our lighting subsystem. These are hidden as much as possible so that people focus on the names instead of the LEDs.

Controlling Subsystem:

The main part of the system is the controlling unit. We plan to use a microprocessor to process the signals from the sensors and send control signals to the LEDs. Because the system has different modes, switching between them correctly is also important for the project.
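A minimal sketch of this mode-switching logic, assuming hypothetical motion_detected() and pressed_block() helpers for the proximity and pressure sensors: pressing takes priority over moving, which takes priority over the default state.

```python
import time
from enum import Enum, auto

class Mode(Enum):
    DEFAULT = auto()    # soft static backlight behind each name
    MOVING = auto()     # animate pulses when someone is nearby
    PRESSING = auto()   # scatter light outward from a pressed name block

def select_mode(motion_detected, pressed_block):
    """Pick the active mode; pressing overrides moving, which overrides default."""
    if pressed_block() is not None:
        return Mode.PRESSING
    if motion_detected():
        return Mode.MOVING
    return Mode.DEFAULT

def control_loop(motion_detected, pressed_block, renderers):
    while True:
        mode = select_mode(motion_detected, pressed_block)
        renderers[mode]()            # each renderer draws one animation frame
        time.sleep(0.02)             # ~50 Hz refresh
```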

Power Subsystem:

An AC (wall outlet; 120 V, 60 Hz) to DC power adapter (providing a DC voltage and current suitable for our circuit design), or possibly a custom AC-DC converter circuit.

Criterion for success:

The whole system should work correctly in each mode and switch between the different modes correctly. The names should be highlighted in a comfortable and aesthetically pleasing way. Our project is suitable for senior design because it contains both hardware and software parts, dealing with signal processing, power, control, and circuit design with sensors.