BottiniLab/action-hippo

action-hippo

Code repository for the action hippo project

Code is split into separate directories for analysing behavioural, fMRI and eyetracking data. Files are numbered sequentially, so the analyses can be replicated by running the files in order. File paths are specific to the original user's machine and will need to be changed to the relevant local directories.

All code assumes that the fMRI data adhere to the BIDS format; behavioural and eyetracking data are saved separately in a pseudo-BIDS format. See the code comments for details.

Most code was run on a compute cluster managed by SLURM. The code needed for this is labelled with a letter and 'submit_slurm', but it can easily be adapted to any other cluster, or the individual files can be run from the terminal by passing arguments via 'sys.argv'.
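As a minimal sketch of this pattern (the script and function names here are hypothetical, not the repository's actual files), each per-subject script reads its subject ID from the command line, so the same file works under a SLURM array job or run by hand:

```python
import sys

def run_subject(subject_id: str) -> str:
    """Placeholder for a per-subject analysis step."""
    label = f"sub-{subject_id}"
    # ... load data for `label`, run the analysis, save outputs ...
    return label

if __name__ == "__main__" and len(sys.argv) > 1:
    # Called as e.g. `python run_analysis.py 01`; the argument is supplied
    # either by hand from the terminal or by a SLURM submission script.
    print(run_subject(sys.argv[1]))
```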

For any further information, feel free to contact Alex (alex.eperon@gmail.com).

fMRI analysis

  1. Convert raw files to BIDS format and move to working directories
  2. Preprocess using fmriprep
  3. Create events files and ROIs for future analyses; define event files based on planned analysis (RSA)
  4. Run first-level GLM for 4 main conditions using nilearn
  5. Create a subject-specific version of the Juelich atlas maximum probability maps to use for ROI analysis
  6. Segment the Juelich atlas into ROIs
  7. Create neural RDMs
  8. Compare model RDMs in entorhinal ROIs, excluding the effects of other models using partial correlation
  9. Run a searchlight analysis in subject space (T1w)
  10. Convert searchlight maps to MNI space
  11. Run cluster correction to check for significant clusters in searchlight maps
  12. Use a permutation-based method to create whole-brain voxel reliability maps for a single subject
  13. Create intersected ROIs using the voxel reliability maps created in the previous step
  14. Plot the results of the RSA analyses
  15. Predict eye position using the deepMReye toolbox (Frey, Nau and Doeller, 2021)
  16. Extract events from deepMReye-predicted gaze positions for use in further analyses
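The RDM comparison in step 8 can be sketched in plain NumPy/SciPy (the function names are illustrative, not the repository's actual API): vectorise each RDM's upper triangle, rank-transform the vectors, regress the control model out of both the neural and the candidate model vectors, and correlate the residuals.

```python
import numpy as np
from scipy import stats

def upper_tri(rdm):
    """Vectorise the upper triangle of a square RDM, excluding the diagonal."""
    i, j = np.triu_indices(rdm.shape[0], k=1)
    return rdm[i, j]

def partial_spearman(neural, model, control):
    """Spearman correlation between a neural and a model RDM vector,
    partialling out a control model RDM: regress the control's ranks out
    of both rank-transformed vectors, then correlate the residuals."""
    neural_r, model_r, control_r = (stats.rankdata(v)
                                    for v in (neural, model, control))
    X = np.column_stack([np.ones_like(control_r), control_r])
    residuals = []
    for v in (neural_r, model_r):
        beta, *_ = np.linalg.lstsq(X, v, rcond=None)
        residuals.append(v - X @ beta)
    r, _ = stats.pearsonr(residuals[0], residuals[1])
    return r
```

Libraries such as rsatoolbox or pingouin offer equivalent partial-correlation routines; the explicit regression above is only meant to show what is being computed.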

eye analysis

  1. Data preprocessing; blink removal
  2. Create an events file to categorise data by condition
  3. Visualise the data and combine it into a single dataframe organised by subject and condition
  4. Test if eye movements are skewed right or left in x and y
  5. Test if deepMReye-predicted eye movements are skewed left or right in x
  6. Test if deepMReye-predicted eye movements are skewed left or right in y
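The skew tests in steps 4–6 amount to asking whether gaze positions deviate systematically from the screen centre along one axis. A minimal version (the function name and the choice of a one-sample t-test are illustrative; the repository's actual statistics may differ) is:

```python
import numpy as np
from scipy import stats

def horizontal_bias(x_positions, screen_centre=0.0):
    """One-sample t-test of mean horizontal gaze position against the
    screen centre: a significantly positive t indicates rightward skew,
    a significantly negative t leftward skew."""
    t, p = stats.ttest_1samp(np.asarray(x_positions, dtype=float),
                             popmean=screen_centre)
    return float(t), float(p)
```

The same function applies to the vertical axis by passing y positions and the vertical centre instead.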
