Here is the Qualtrics link for this session:
https://qimr.az1.qualtrics.com/jfe/form/SV_eA90nL27Ww43rIa

Here are the commands, also in a txt file if that is easier to work with:

# Today we will be working in scratch, as the files we're using are big
mkdir -p /scratch/${USER}/{raw,mds,qc,saige}

# First we're going to copy the PLINK files we'll be using as input
cp /faculty/sarah/2023/Imputation/* /scratch/${USER}/saige/
mv /scratch/${USER}/saige/myers19.* /scratch/${USER}/raw/
# (an optional sanity check on these input files is sketched at the end of this file)

# Next we're going to start the Singularity container
singularity shell --hostname localhost --bind /scratch/${USER}:/data /usr/local/lib/singularity/ImputationProtocol.sif

# and load the applications we'll be using
setup-hadoop --n-cores 2
setup-imputationserver

# Next we're going to set up the files for imputation
# We start by making some MDS components (which are conceptually similar to PCs)
# This is not technically part of an imputation protocol, but it is good to check the ancestry of your sample before imputing
cd /data/mds
enigma-mds --bfile ../raw/myers19

# Then we're going to do one final QC step
cd /data/qc
enigma-qc --bfile /data/raw/myers19 -c 19 --study-name myers19

# Now we're going to start imputing
imputationserver --study-name myers19 --population eur

# OH NO!!! We have strand flips - luckily we can easily fix this
check-flip --bfile ../raw/myers19

# We need to remake the VCF files
cd /data/qc
enigma-qc --bfile /data/raw/myers19.check-flip -c 19 --study-name myers19

# Then try the imputation again
imputationserver --study-name myers19 --population eur

#########################################################################
# Once the imputation is finished, we will exit the container and process the data
exit
cd /scratch/${USER}/output/myers19/local/

# Unzipping the data
7za x -ppassword chr_19.zip
tabix -f -p vcf -C chr19.dose.vcf.gz
# (a couple of optional checks on this VCF are sketched at the end of this file)

#########################################################################
# First we'll load the SAIGE container
singularity shell --hostname localhost --bind /scratch/${USER}:/data /usr/local/lib/singularity/saige.sif
cd /data/saige

# Next we'll run step 1 of the SAIGE analysis; in this step we fit the LMM and create a pre-processed R data file
./SAIGE_step1AD.sh

# Next we'll run step 2 of the SAIGE analysis; in this step we run the association analyses
./SAIGE_step2AD.sh
exit
#########################################################################
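# OPTIONAL EXTRAS (not required for the practical)

# A quick sanity check on the input PLINK files before imputing: the .bim file
# has one line per variant and the .fam file has one line per person, so wc -l
# gives you the variant and sample counts. This is just a generic shell sketch;
# the file names assume the myers19 files copied into /scratch/${USER}/raw above.
wc -l /scratch/${USER}/raw/myers19.bim /scratch/${USER}/raw/myers19.fam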
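# A quick look at the imputed dosage VCF after unzipping. These are standard
# shell commands run from /scratch/${USER}/output/myers19/local/ and assume the
# file is called chr19.dose.vcf.gz, as above.
# Count the imputed variants (every line that is not a header line)
zcat chr19.dose.vcf.gz | grep -vc '^#'
# Peek at the header and the first few records
zcat chr19.dose.vcf.gz | head -n 40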
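# If you want to drop poorly imputed variants before the association analysis,
# and bcftools is available on the cluster, something like the following would
# keep variants with imputation R2 above 0.3. This is a sketch, not part of the
# protocol above; it assumes a Minimac-style R2 field is present in the INFO
# column of the dosage VCF.
bcftools view -i 'INFO/R2>0.3' chr19.dose.vcf.gz -Oz -o chr19.dose.R2filt.vcf.gz
tabix -f -p vcf -C chr19.dose.R2filt.vcf.gz
#########################################################################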