Abstract
<jats:p>Artificial intelligence has played a significant role in improving accessibility for visually impaired individuals by enabling real-time environmental understanding and multimodal interaction. This survey examines the design, methodology, and system components of AURA, an AI-powered virtual assistant developed to support individuals with visual impairments. The study evaluates modules including speech processing, object detection, navigation, sound recognition, wearable technology, and guardian monitoring that collectively aim to provide a safe, intelligent, and accessible user experience. The survey further explores the challenges of current assistive solutions and how AURA bridges those gaps by integrating deep learning models, sensor-based feedback, multilingual communication, and real-time danger detection. The paper organizes these technical considerations into IEEE template-style sections aligned with AURA's practical AI framework.</jats:p>