An ongoing project in the Sinha Laboratory for Vision Research at MIT.
The code featured in this project:
- MATLAB with the Image Processing Toolbox, for histogram equalization, generation of the color conditions, and analysis of Tobii Eye Tracker ClearView software output.
- Processing, for automating the stimuli array layouts.
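The MATLAB code itself is not included here, but the core of histogram equalization (the operation MATLAB's `histeq` performs) is compact enough to sketch. The following is an illustrative pure-Python version, not the project's actual code; the function name and the 8-bit, 256-level default are my assumptions.

```python
def hist_equalize(pixels, levels=256):
    """Remap intensities so their cumulative histogram is roughly linear.

    pixels: flat list of integer intensities in [0, levels).
    Returns a new list of equalized intensities (illustrative sketch).
    """
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # Cumulative distribution function of the histogram.
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    if n == cdf_min:  # flat image: nothing to equalize
        return list(pixels)
    # Standard lookup table: stretch the CDF over the full intensity range.
    lut = [round((c - cdf_min) / (n - cdf_min) * (levels - 1)) for c in cdf]
    return [lut[p] for p in pixels]
```

For example, an image whose intensities cluster in 100–103 is stretched to span the full 0–255 range, which is why equalization makes low-contrast stimuli easier to inspect.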
In the Sinha Laboratory for Vision Research, I am exploring how normally sighted individuals perform directed visual search: the task of finding a target object in a cluttered environment of non-target distractors. Directed search is fundamental not only to everyday tasks but also to the use of visual aids such as diagrams, sketches, drawings, and charts.

In contrast to past work, which typically uses synthetic stimuli such as lines and focuses on low-level saliency cues, this study examines directed visual search in heterogeneous arrays of natural objects. These objects attract attention according to specific attributes (for example, color, orientation, and size) that together constitute the visual salience of the stimuli. Visual saliency is a bottom-up, stimulus-driven phenomenon; however, it can be modulated by the top-down goals of the searcher.

In this study I examine the roles of color and image degradation in the dynamics of scan paths during directed search. Specifically, I am looking at how search strategies differ with and without color when the images are degraded. Preliminary results suggest that for normally developing adults, low-level visual cues such as color play a more significant role in successful search as viewing conditions become more degraded. I further compared search efficiency and strategy between color and gray-scale images as a function of image resolution, to emulate the conditions experienced by subjects with degraded vision.
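The two degradation manipulations described above (gray-scale conversion and reduced resolution) can be illustrated with a short sketch. This is a hypothetical stand-in for the actual MATLAB pipeline: the luminance weights are the standard Rec. 601 coefficients, and block averaging stands in for whatever resampling method the experiments actually used.

```python
def to_grayscale(rgb_row):
    """Convert a row of (R, G, B) triples to luminance (Rec. 601 weights)."""
    return [round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in rgb_row]

def downsample(img, factor):
    """Degrade resolution by averaging non-overlapping factor x factor blocks.

    img: 2D list of intensities whose dimensions are multiples of factor.
    Returns a smaller 2D list; upscaling it back would yield a blocky,
    low-resolution version of the original stimulus.
    """
    out = []
    for i in range(0, len(img), factor):
        row = []
        for j in range(0, len(img[0]), factor):
            block = [img[i + di][j + dj]
                     for di in range(factor) for dj in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
```

Crossing the two manipulations (color vs. gray-scale, by several downsampling factors) yields the grid of viewing conditions the study compares.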
This project is licensed under the MIT License - see the LICENSE.md file for details.