Project Overview
This project investigates cognitive load during CAD modeling tasks performed in two different interaction environments: a traditional desktop CAD workflow and AR-CAD, an augmented-reality CAD system developed for Meta Quest 3. The work was designed not only to compare tools, but also to build a reliable EEG acquisition and processing workflow for design cognition research.
The project connects HCI, AR/VR, engineering design, cognitive neuroscience, and signal processing. EEG was used as a time-resolved measure that can complement post-task instruments such as NASA-TLX and SUS.
Regions of interest considered: frontal, central, parietal, and occipital.
Key EEG bands emphasized in the analysis: theta, alpha, and beta.
A fully instrumented pilot participant was used to validate feasibility and workflow quality.
Why This Work Matters
Traditional CAD typically requires users to perform 3D modeling through a 2D screen, using a keyboard, mouse, and menus, which forces indirect spatial manipulation. AR-CAD changes the interaction style by allowing users to create and manipulate geometry directly in a passthrough AR environment.
The central question is whether this shift from screen-mediated CAD to embodied AR interaction changes cognitive demand during design. EEG makes this question measurable by capturing task-related neural activity during the design process rather than only after the task is complete.
Research Significance
The work demonstrates how EEG can be integrated into immersive AR-CAD evaluation through careful baselines, event markers, signal-quality monitoring, segmentation, and artifact-aware preprocessing.
My Role and Contributions
I designed and implemented the study workflow, including the experimental structure, baseline strategy, EEG preprocessing pipeline, marker-based segmentation, ROI-level analysis, and pilot interpretation plan.
I also integrated the EEG workflow with the broader AR-CAD research program by aligning EEG acquisition with task performance, usability evaluation, workload assessment, screen/audio recording, and future Bloom-coded think-aloud analysis.
Method and Workflow
The study used an Emotiv FLEX 2 EEG system with Meta Quest 3 for the AR-CAD condition and a desktop workstation for the traditional CAD condition. EEG was recorded during baseline and task periods, with event markers used to identify task boundaries and support synchronized segmentation.
Baseline Recording
Collected eyes-open and task-relevant baselines, including an AR-specific baseline inside Meta Quest 3 passthrough.
Signal Quality Monitoring
Used hardware and software checks to monitor electrode quality before and during task recording.
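As a minimal illustration of a software-side quality check, the sketch below flags channels whose variance is implausibly low (likely poor contact) or implausibly high (likely motion or muscle artifact). The thresholds and channel layout are illustrative assumptions, not values from the actual Emotiv FLEX 2 workflow.

```python
import numpy as np

def flag_bad_channels(data, flat_thresh=0.2, noise_thresh=100.0):
    """Flag channels that look flat (no contact) or artifact-dominated.

    data: array of shape (n_channels, n_samples), assumed in microvolts.
    flat_thresh / noise_thresh: illustrative standard-deviation bounds (uV).
    """
    stds = data.std(axis=1)
    flat = stds < flat_thresh     # near-zero variance: likely disconnected
    noisy = stds > noise_thresh   # very high variance: motion/muscle/loose electrode
    return np.flatnonzero(flat | noisy)

# Example: 4 simulated channels, 2 s at 256 Hz
rng = np.random.default_rng(0)
sfreq = 256
good = rng.normal(0, 10, size=(2, 2 * sfreq))    # typical EEG-scale activity
flat = np.zeros((1, 2 * sfreq))                  # disconnected channel
noisy = rng.normal(0, 500, size=(1, 2 * sfreq))  # artifact-dominated channel
data = np.vstack([good, flat, noisy])
print(flag_bad_channels(data))  # flags channels 2 and 3
```

In practice such automated checks complement, rather than replace, visual inspection and the vendor's contact-quality indicators.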
Marker-Based Segmentation
Used event markers to identify task boundaries and synchronize EEG with design activity.
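The segmentation step can be sketched as slicing the continuous recording at marker timestamps, with an edge trim to discard transition artifacts at segment boundaries (the edge-trimming step noted in the processing workflow). The marker format and trim duration here are illustrative assumptions.

```python
import numpy as np

def segment_by_markers(data, sfreq, markers, trim=1.0):
    """Cut continuous EEG into labeled task segments.

    data:    (n_channels, n_samples) array.
    sfreq:   sampling rate in Hz.
    markers: list of (start_s, end_s, label) tuples from event markers.
    trim:    seconds removed from each segment edge (assumed value).
    """
    segments = {}
    for start_s, end_s, label in markers:
        lo = int((start_s + trim) * sfreq)
        hi = int((end_s - trim) * sfreq)
        segments[label] = data[:, lo:hi]
    return segments

# Example: 8 channels, 60 s of simulated data, two hypothetical task markers
sfreq = 256
data = np.random.default_rng(1).normal(size=(8, 60 * sfreq))
markers = [(5.0, 25.0, "sketch"), (30.0, 55.0, "extrude")]
segs = segment_by_markers(data, sfreq, markers)
print(segs["sketch"].shape)  # (8, 4608): 18 s remain after trimming 1 s per edge
```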
Frequency-Domain Analysis
Extracted spectral bandpower and summarized changes across task segments and regions of interest.
The processing workflow included channel mapping, 1–40 Hz filtering, 60 Hz notch filtering, artifact inspection, ICA-based cleaning, average re-referencing, marker-based segmentation, edge trimming, baseline normalization, spectral bandpower extraction, and ROI-level summary analysis.
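The bandpower-extraction and baseline-normalization steps above can be sketched with a Welch power spectral density estimate integrated over each band, followed by percent change relative to baseline. This is a simplified NumPy/SciPy illustration, not the project's actual pipeline; the band boundaries are common conventions and are assumed here.

```python
import numpy as np
from scipy.signal import welch

# Assumed band definitions (Hz); conventions vary across labs.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def bandpower(data, sfreq, bands=BANDS):
    """Per-channel band power via Welch PSD integration.

    data: (n_channels, n_samples) array; sfreq in Hz.
    """
    freqs, psd = welch(data, fs=sfreq, nperseg=int(2 * sfreq))
    out = {}
    for name, (lo, hi) in bands.items():
        mask = (freqs >= lo) & (freqs < hi)
        out[name] = np.trapz(psd[:, mask], freqs[mask], axis=1)
    return out

def baseline_percent_change(task_bp, base_bp):
    """Baseline-normalized change; positive values mean power above baseline."""
    return {b: 100.0 * (task_bp[b] - base_bp[b]) / base_bp[b] for b in task_bp}

# Example with simulated 8-channel, 30 s baseline and task segments
sfreq = 256
rng = np.random.default_rng(2)
baseline = rng.normal(size=(8, 30 * sfreq))
task = rng.normal(size=(8, 30 * sfreq))
change = baseline_percent_change(bandpower(task, sfreq), bandpower(baseline, sfreq))
print({b: v.shape for b, v in change.items()})
```

ROI-level summaries would then average these per-channel values over the frontal, central, parietal, and occipital channel groups.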
Pilot Outcomes and Significance
The pilot showed that EEG acquisition during Meta Quest 3 AR-CAD interaction is feasible when careful cap fitting, signal-quality checks, baselines, event markers, and preprocessing steps are used. The workflow produced analyzable task segments and descriptive spectral outputs.
Regional alpha patterns provided early evidence that AR-CAD and desktop CAD may engage visuospatial and attentional processing differently. These results are not presented as statistical evidence, because the completed pilot included only one fully instrumented participant, but they justify the full study design.
A key methodological lesson was that AR-CAD introduces more motion and muscle artifacts than the desktop condition, owing to headset wear and controller-based interaction. This makes artifact-aware cleaning and future motion-reference strategies important next steps.
Research Value
This project contributes a practical EEG protocol for immersive design research. It shows how AR/VR interaction, CAD task analysis, subjective workload measures, behavioral markers, and neurophysiological data can be combined into a single evaluation framework.
In the full study, this workflow can support comparisons between novice and experienced CAD users, evaluate how AR-CAD changes cognitive load, and connect neural signals with Bloom-coded think-aloud segments from the design process.