Lost in Space

Multisensory Conflict yields Adaptation in Spatial Representations across Frames of Reference

Johannes Lohmann & Martin V. Butz

Overview


These slides contain some supplementary information regarding the

  • experimental setup
  • dependent measures
  • data analysis

Just hit [ESC] to switch to the global view and to jump to the slide you are interested in. Depending on your browser settings, content might not be loaded on the fly. If a video slide is empty or a visualization is not showing up, then just refresh your browser window (in most cases [F5]).


Experimental Setup


  • two blocks with different hand models
  • each trial consisted of three stages
    • pre-localization: self and external
    • bimanual object interaction for 40 seconds; used to introduce visual offset
    • post-localization: self and external
  • after each trial, participants were asked to move their hands out of the tracking range; after this, the visual offset was removed
  • at the end, the Igroup Presence Questionnaire (IPQ) was administered

Experimental Setup: VR Setup



Experimental Setup: VR Setup Overview


[Figure: VR setup overview]

Experimental Setup: Hand Models


[Figure: hand models]

Experimental Setup: Task Space I


  • the asymmetrical layout of the task space was due to the sensor
  • the tracking range is wider than it is deep
  • hence, the basket was placed at the outer right

Experimental Setup: Task Space II


[Figure: task space]

Experimental Setup: Tasks


Object Interaction

Complex bimanual task (picking petals) during which a visual offset was applied to the hands. The offset increased slowly and only during interaction; the maximal amplitude was small (about 6.7 cm).

Localization

Pointing with both hands to oneself (with the thumbs) or to an external reference (with the index fingers), i.e., the basket. Localization required the hands to be held still and parallel. Fifty data samples, consisting of palm and finger positions, were recorded and used to obtain the dependent measures. In one block, participants performed localization with the hands visible; in the other block, the hands were invisible.


Experimental Setup: Object Interaction I


  • this sample shows the object interaction; the transparent overlay shows the actual hand position
  • please note that the participants only saw the solid, shifted hand
  • the shift only commenced while the hands were moving and the flower was grasped; it took about ten seconds of interaction to introduce the full shift (see the sketch below)
  • in general, participants complied with the task: overall, the mean number of picked petals was 4.5 (SD = 1.4), that is, one flower per trial
  • please note that this is a lower bound, since only petals that landed in the basket were counted
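
A minimal sketch of such a gradual offset ramp follows below. The ramp duration (about ten seconds) and the maximal amplitude (about 6.7 cm) are taken from the description above; the update logic and all names are assumptions, not the original implementation.

```python
# Illustrative sketch of the gradual visual offset ramp (not the original code).
# Note: interaction_time_s should only accumulate while the hands are moving
# and the flower is grasped, as described above.

RAMP_DURATION_S = 10.0   # approximate time of active interaction to reach the full offset
MAX_OFFSET_M = 0.067     # maximal offset amplitude (about 6.7 cm)

def current_offset_amplitude(interaction_time_s: float) -> float:
    """Linearly ramp the offset amplitude up to its maximum over the ramp duration."""
    progress = min(interaction_time_s / RAMP_DURATION_S, 1.0)
    return progress * MAX_OFFSET_M

# Example: after 5 s of active interaction the offset is about 3.35 cm.
print(current_offset_amplitude(5.0))  # 0.0335
```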

Experimental Setup: Object Interaction II



Experimental Setup: Localization I


  • this sample shows the localization, which had to be performed before and after the interaction phase
  • due to the error checks, the task was quite demanding (a sketch of such checks follows below)
  • hence, participants were given ample time to practice this task (about 10 minutes)
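
Purely for illustration, checks of the kind described above (hands still and parallel before samples are accepted) might look like the following sketch; the threshold values and function names are assumptions, not the original implementation.

```python
import numpy as np

STILLNESS_THRESHOLD_M = 0.005   # maximal palm displacement between frames (assumed)
PARALLEL_THRESHOLD_DEG = 10.0   # maximal angle between the two palm normals (assumed)

def hands_still(prev_palms: np.ndarray, cur_palms: np.ndarray) -> bool:
    """True if neither palm moved more than the stillness threshold since the last frame."""
    return bool(np.all(np.linalg.norm(cur_palms - prev_palms, axis=1) < STILLNESS_THRESHOLD_M))

def hands_parallel(normal_left: np.ndarray, normal_right: np.ndarray) -> bool:
    """True if the two palm normals deviate by less than the parallelism threshold."""
    cos_angle = np.dot(normal_left, normal_right) / (
        np.linalg.norm(normal_left) * np.linalg.norm(normal_right))
    angle_deg = np.degrees(np.arccos(np.clip(abs(cos_angle), -1.0, 1.0)))
    return angle_deg < PARALLEL_THRESHOLD_DEG
```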

Experimental Setup: Localization II



Experimental Setup: Factors


Hand Visibility during localization (per block)
  • visible
  • invisible
Localization Reference (per trial, random)
  • self
  • external
Visual Offset (per trial, random)
  • left
  • right
  • toward and left
  • toward and right

Experimental Setup: Offset Conditions


[Figure: offset conditions]

Experimental Setup: Measures Part I


Palm Drift

The distance between the mean palm centroids in the pre- and post-localization may reveal an adaptation of the center of hand space due to the visual offset.

Angular Disparity

When performing the localization, a drift of the hand position might be compensated by rotating the palm. This rotation can reveal whether information from other frames of reference is used to correct the error in hand space.

Positional Discrepancy

Complete compensation by rotation should yield a similar positional estimate in the post-localization as in the pre-localization. The remaining discrepancy reflects the success of this correction: a discrepancy would imply that the conflict in hand space could not be completely compensated.
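
The following minimal sketch illustrates how these three measures could be derived from the averaged [x, z] samples. The exact vector definitions and variable names are assumptions, not the original analysis code.

```python
import numpy as np

def centroid(samples: np.ndarray) -> np.ndarray:
    """Average the recorded [x, z] samples (here: 50 per localization) into a single 2D centroid."""
    return samples.mean(axis=0)

def palm_drift(pre_palm: np.ndarray, post_palm: np.ndarray) -> float:
    """Palm Drift: distance between the mean palm centroids of pre- and post-localization."""
    return float(np.linalg.norm(centroid(post_palm) - centroid(pre_palm)))

def hand_angle(palm: np.ndarray, finger: np.ndarray) -> float:
    """Orientation (degrees) of the palm-to-finger vector in the [x, z] plane."""
    v = centroid(finger) - centroid(palm)
    return float(np.degrees(np.arctan2(v[1], v[0])))

def angular_disparity(pre_palm, pre_finger, post_palm, post_finger) -> float:
    """Angular Disparity: change of hand orientation from pre- to post-localization."""
    return hand_angle(post_palm, post_finger) - hand_angle(pre_palm, pre_finger)

def positional_discrepancy(pre_finger: np.ndarray, post_finger: np.ndarray) -> float:
    """Positional Discrepancy: distance between pre- and post-localization pointing estimates."""
    return float(np.linalg.norm(centroid(post_finger) - centroid(pre_finger)))
```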


Experimental Setup: Measures Part II


  • the next slide provides an interactive overview of the measures
  • you may need to refresh your browser ([F5]); if it still does not work, your browser settings do not allow loading local JavaScript
  • the yellow hands indicate the physical hands above the sensor, the red hands indicate the visual hands in the VR
  • with the buttons on the right you can switch between the dependent measures
  • with the 'x-offset' slider you can manipulate the extent of the left / right offset; please note that this will make the yellow (actual) hands move, since the visual hands had to be kept in the center to allow a successful interaction with the flower
  • the 'visual capture' slider is basically a scaling factor that exaggerates or attenuates the offset effect (see the sketch below)
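
A minimal sketch of what the two sliders control, under assumed names and an assumed mapping (this is not the visualization's actual code):

```python
def displayed_hand_x(physical_x: float, x_offset: float, visual_capture: float) -> float:
    """Displayed (red) hand position: physical position plus the scaled visual offset."""
    return physical_x + visual_capture * x_offset

def physical_hand_x_for_centered_display(target_x: float, x_offset: float, visual_capture: float) -> float:
    """Physical (yellow) hand position required so that the displayed hand stays at the target,
    e.g. the flower in the center of the task space."""
    return target_x - visual_capture * x_offset
```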

Experimental Setup: Measures Part III



Participants, Procedure and Data Preparation


  • Participants
    • 33 students, 14 right-handed, 11 females
    • one left-handed participant, no glasses
    • mean age: 21.7 years (SD = 2.5)
  • 2 × 12 trials
    • two blocks: localization with visible or invisible hands; order balanced across participants
    • 12 trials each: four offset conditions, each repeated three times
    • self- and external localization before and after 40 seconds of object interaction
  • Data Preparation
    • per localization, 50 data samples of the palms, index fingers, and thumb tips were recorded
    • data were averaged into a single 2D [x,z]-centroid to compute the dependent measures
    • signs were assigned depending on the offset condition; hence, measures remained centered around zero (see the sketch below)
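
A minimal sketch of this sign assignment (coordinate conventions and names are assumptions): flipping the sign for leftward offsets aligns all measures with the offset direction, so that opposite offset conditions do not cancel out and the measures remain centered around zero.

```python
OFFSET_SIGN = {"left": -1.0, "right": 1.0, "toward_left": -1.0, "toward_right": 1.0}

def signed_measure(raw_x_component: float, offset_condition: str) -> float:
    """Positive values mean a shift in the direction of the visual offset."""
    return OFFSET_SIGN[offset_condition] * raw_x_component
```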

Results: Data Analysis


  • Drift, Disparity, and Discrepancy
    • linear mixed-effects model analysis using the R statistical software (a rough Python analogue is sketched below)
    • model identification with fixed effects (and interactions) for
      • hand visibility (visible vs. invisible)
      • localization reference (self vs. external)
      • visual offset (left, right, forward right, forward left)
    • maximal converging random-effects structure
    • checks against zero at the level of the reference × visibility interaction
  • IPQ Scores
    • results were compared with reference data from the IPQ database
    • t-tests to check for a significant improvement on the different scales
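
The actual analysis was run in R; the following Python analogue using statsmodels and scipy only illustrates the model structure. The data file, column names, and the IPQ reference value are hypothetical, and statsmodels does not offer the full maximal random-effects structure available in lme4.

```python
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("localization_measures.csv")  # hypothetical data file

# Fixed effects (and interactions) for visibility, reference, and offset,
# with by-participant random intercepts and a random slope for visibility.
model = smf.mixedlm(
    "drift ~ visibility * reference * offset",
    data=df,
    groups=df["participant"],
    re_formula="~visibility",
)
print(model.fit().summary())

# IPQ scores: one-sample t-test against a reference value (value here is hypothetical).
ipq_general = df.drop_duplicates("participant")["ipq_general"]
print(stats.ttest_1samp(ipq_general, popmean=3.5))
```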

Results: Model Identification


Palm Drift
  • fixed effect for visibility (F(1,33.19) = 39.03, p < .001)
  • visible hands increased drift by 1.5 cm ± 0.24 cm
Angular Disparity
  • fixed effects for visibility (F(1,33.01) = 24.00, p < .001) and reference (F(1,74.34) = 24.98, p < .001)
  • visible hands increased disparity by 2.0° ± 0.41°; disparity was reduced by 1.4° ± 0.31° when pointing to oneself
Positional Discrepancy
  • fixed effects for reference (F(1,39.00) = 4.70, p = .036) and visibility (F(1,1486.40) = 13.51, p < .001); interaction between visibility and offset (F(3,1486.40) = 3.24, p = .021)
  • visible hands increased discrepancy by 2.4 cm ± 0.66 cm; discrepancy was reduced by 0.9 cm ± 0.42 cm when pointing to oneself

Results: Visualization


  • the next slide provides an interactive overview of the data, aggregated over the different offset conditions
  • you may need to refresh your browser ([F5]); if it still does not work, your browser settings do not allow loading local JavaScript
  • the y axis shows the dependent measure
  • the x axis displays the four reference × visibility combinations
  • you can switch between boxplots and individual data via the 'bar' and 'value' buttons
  • you can switch between the dependent measures with the lower buttons
  • all means differ from zero, except for the discrepancy value in case of invisible hands and pointing towards oneself

Results: Reference × Visibility



Results: IPQ


[Figure: IPQ results]

Results: Summary


  • visibility affected all measures
  • reference only affected disparity and discrepancy
  • more pronounced effects in case of visible hands and external localization
  • type of offset only affected discrepancy
  • measures differed from zero in all but one condition

Main Results


Palm Drift

Compensation of the visual offset persisted in the localization task, especially when the hands were visible. Apparently, the center of hand space shifted.

Angular Disparity

Partial compensation of the drift, which depended on visibility and the target reference. In general, participants showed stronger compensation in case of stronger drift.

Positional Discrepancy

When the hands remained visible, a systematic discrepancy was observed even for self-localization: the manipulated visual information biased the positional estimate grounded in proprioceptive information.

The results on disparity and discrepancy show that adaptation was not restricted to hand space.