More Intelligent Access: AI-Driven Doors and Ramps for Inclusive Transit
Accessibility, Object Detection, Deep Learning, Public Transit, YOLOv8n, Ramp Deployment, Inclusive Transit Design, VGG16
August 2025
Under peer review at JRHS, 11/2025; Published in RARs, 08/2025
Public transportation is expanding rapidly across the United States, driven by urbanization, sustainability concerns, and growing recognition that car-centric infrastructure is unsustainable. Despite these strides, accessibility features remain outdated: people using wheelchairs, walkers, and other mobility aids still rely on decades-old systems of slow, manual ramps and uniformly timed doors. This project aims to modernize accessibility features in public transit vehicles by using computer vision, specifically a YOLOv8n-based convolutional neural network (CNN), to analyze CCTV footage and detect mobility aids in real time. The study benchmarks this approach against a VGG16-based classifier in single-label scenarios to demonstrate YOLO's robustness for multi-object detection tasks. The YOLOv8n model achieved superior performance, with a precision of 0.9946, a recall of 0.9846, and an F1 score of 0.9893, outperforming the VGG16 baseline across all metrics. Upon identifying a mobility aid, the system would signal transit vehicle doors and ramps to remain open or deploy automatically without delay, reducing human error and safety risks while improving accessibility for the nearly 70 million Americans living with disabilities.
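The control logic downstream of the detector can be sketched briefly. This is a minimal illustration, not the study's implementation: the class names, confidence threshold, and function name below are illustrative assumptions, and the detector itself (e.g. a YOLOv8n model producing per-frame label/confidence pairs) is abstracted away.

```python
# Sketch of the door/ramp decision logic that would sit downstream of a
# YOLOv8n detector. Class names and the confidence threshold are
# illustrative assumptions, not values from the study.

MOBILITY_AIDS = {"wheelchair", "walker", "mobility_scooter", "crutches"}
CONF_THRESHOLD = 0.5  # assumed minimum detection confidence

def door_ramp_signal(detections):
    """Given (label, confidence) pairs for one frame, return the actions
    the transit vehicle should take."""
    aids = [label for label, conf in detections
            if label in MOBILITY_AIDS and conf >= CONF_THRESHOLD]
    return {
        "hold_doors": bool(aids),   # keep doors open while an aid is present
        "deploy_ramp": bool(aids),  # deploy the ramp automatically
        "detected": aids,
    }

# Example frame: a wheelchair user and another pedestrian are detected.
signal = door_ramp_signal([("wheelchair", 0.97), ("person", 0.88)])
print(signal)
```

In a deployed system, this check would run on every CCTV frame, so the doors and ramp respond as soon as a mobility aid appears rather than on a fixed timer.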