ConTrack
This page describes how to generate fibers using ConTrack, a probabilistic fiber-tracking algorithm that identifies the most likely white matter path between two points in the brain. The method is introduced in Sherbondy et al. (2008); there is also one application paper that shows how to find the optic radiation.
The ConTrack executable (included with mrVista) is implemented only for 64-bit linux machines. If you need to build the executables for your system see the Build QUENCH page and follow the directions there (ConTrack and QUENCH are part of the same package).
The ConTrack algorithm is used to find the most likely path between two ROIs in the brain. To do this the algorithm needs access to the diffusion data (dt6.mat file) as well as the regions between which you would like it to find a pathway (the ROI pair). Once you have the diffusion data and the ROI pair you are ready to track your pathways using ConTrack's pathway sampling algorithm (contrack_gen.glxa64). This algorithm will generate a sample of candidate pathways. Those "raw" pathway samples can then be scored using a separate scoring algorithm (contrack_score.glxa64).
The ability to run the ConTrack algorithm requires a few things:
- The executables - obtaining these is easy: if you're using VISTASOFT you already have them. They are located in /vistasoft/trunk/mrDiffusion/contrack
- You also need to have the ConTrack executables in your linux path - the Before You Begin section will help with this.
- A dt6.mat file - which requires that you have processed your DTI data using dtiRawPreprocess.
- At least one ROI pair - defined using either MrDiffusion or ItkGray.
- A linux 64-bit machine (preferably with a lot of power).
Running ConTrack consists of the following Steps:
- Run CtrInit (GUI for use with single subjects) or ctr_makeConTrackFiles.m (batch script for use with multiple subjects) to create the required files for use with the pathway generation algorithm (contrack_gen.glxa64).
- Run the tracking algorithm using the shell script generated in step 1 - this will be done in a linux shell and will result in a "raw" set of candidate pathways in .pdb format.
- Run the scoring algorithm (contrack_score.glxa64) in a linux shell by feeding it the "raw", un-scored .pdb file. Scoring can also be done for multiple subjects using ctr_batchConTrackScore.m (see comments for usage).
- Visualize your fibers using MrDiffusion or Quench.
1. Make sure you are running everything on a 64-bit linux machine.
- If you are a Windows user, you should logon to a 64-bit linux machine, like teal, to run ConTrack.
- You need the executables in your path to run the generation script as well as to run the scoring.
- Type:
which contrack_gen.glxa64
- If that command returns a statement saying not found, you need to add the path to the contrack executables to your .bashrc file on white as directed below:
cd ~
gedit .bashrc
- Add the following line. Your path should look like this; if it doesn't, change the path to point to your vistasoft folder:
export PATH="$PATH:$HOME/matlab/svn/vistasoft/trunk/mrDiffusion/fiber/tractography/contrack/"
- Logout of your terminal and login again or source your .bashrc.
- Now you should be able to run which contrack_gen.glxa64 and it will find it.
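The steps above can also be sketched directly in a shell. Note that the vistasoft location below is an assumption; point it at your own checkout:

```shell
# Append the ConTrack directory to PATH for the current session.
# VISTA is an assumed location -- adjust it to your own vistasoft checkout.
VISTA="$HOME/matlab/svn/vistasoft"
export PATH="$PATH:$VISTA/trunk/mrDiffusion/fiber/tractography/contrack/"

# Verify: prints the full path of the executable if it is now on PATH,
# otherwise reports that it is still missing.
command -v contrack_gen.glxa64 || echo "contrack_gen.glxa64 still not found"
```

To make the change permanent, put the export line in your .bashrc as described above.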
ConTrack expects the following directories and files:
- ''subjectDirectory''/fibers/conTrack
- ''subjectDirectory''/''dt6directory''/bin
- ''subjectDirectory''/''dt6directory''/dt6.mat
- The bin directory should have the following contents:
- tensors.nii.gz
- pddDispersion.nii.gz
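A quick shell check for this layout might look like the following sketch. The subject path and the dt6 directory name (dti30) are placeholders; the temp-dir setup exists only so the sketch is self-contained and runnable:

```shell
# Stand-in subject directory, created only so this sketch runs end-to-end.
# In practice, set SUBJ to your real subjectDirectory.
SUBJ=$(mktemp -d)
mkdir -p "$SUBJ/fibers/conTrack" "$SUBJ/dti30/bin"
touch "$SUBJ/dti30/bin/tensors.nii.gz" "$SUBJ/dti30/bin/pddDispersion.nii.gz" \
      "$SUBJ/dti30/dt6.mat"

# The actual check: complain about anything ConTrack expects but cannot find.
for f in fibers/conTrack dti30/bin dti30/dt6.mat \
         dti30/bin/tensors.nii.gz dti30/bin/pddDispersion.nii.gz; do
  [ -e "$SUBJ/$f" ] || echo "missing: $SUBJ/$f"
done
echo "layout check done"
```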
- First, we use the ctrInit GUI to create the necessary files for running the conTrack bash shell script in a terminal window.
- From Matlab, change directory to your subject's dti directory (e.g,. /biac1/subjDir/dti30). Then type:
ctrInit
The GUI shown below will pop up:
Box: Input file names - Click on the DT6, ROI1, and ROI2 push buttons to browse for the relevant files. ROI1 is arbitrarily referred to as "start" in the resulting txt file, and ROI2 is arbitrarily referred to as "end" in the resulting txt file.
Box: Parameters - Specify all of the following parameters:
- desired number of fibers to generate
- max fiber length allowed
- min fiber length allowed
- step size in millimeters (length is in number of nodes of this step size)
- Y/N, do you want to seed ROI1
- Y/N, do you want to seed ROI2
- Y/N, do you want to create a new PDF file if it already exists
- Y/N, do you want to create a new WM prob file if it already exists
- For more information on the parameters, what they mean, and how they are translated into the ctrSampler.txt file, jump to this section
Button: Set time stamp - When you have set all values in the input file names and parameters box and are ready to generate the conTrack files, click this button. This will update the time stamp. This string with the date/time will be added to all the file names that you create.
yyyymmddThhmmss: military time
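You can reproduce the same timestamp string in a shell, which is handy when hunting for the files ctrInit wrote:

```shell
# Print a timestamp in the yyyymmddThhmmss format used in ConTrack file names.
date +%Y%m%dT%H%M%S
```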
Button: Create ConTrack Files! - This is the button that actually creates all the files you need to create in order to go ahead and run the bash shell script (see below).
ctrInit will create all of the following files; check that they all exist and look right.
- subjDir/fibers/conTrack/ctrSampler_timestamp.txt: text with parameters read by contrack_gen/score
- subjDir/fibers/conTrack/ctrScript_timestamp.sh: script we execute to conduct contrack_gen
- subjDir/dt6dir/bin/pdf.nii.gz: probability density function file used to guide probablistic tracking
- subjDir/dt6dir/bin/wmProb.nii.gz: map of the probability that each voxel contains white matter, created by a call to dtiFindWhiteMatter
- subjDir/dt6dir/bin/roi1_roi2_timestamp.nii.gz: nifti file that is basically a mask with each ROI given a different mask value.
- The .txt/.sh files can be viewed and edited in any text editing program. They are created by ctrInitParamsFile, ctrSave, and ctrScript.
- All nifti files (except pdf.nii.gz) can be loaded for visual inspection in dtiFiberUI: File --> Add Nifti. You can also readFileNifti and showMontage the resulting data.
- Take extra special care to inspect the wmProb.nii.gz file!!! (for anything in particular?)
- Open a terminal window (bash shell).
- Change to the directory where the .sh and .txt files are located:
subjDir/fibers/conTrack
- Type the name of the .sh file into the command line to run the script. NOTE: You may have to change the permissions of the file to allow group users to execute the file. For example, use chmod a+x * within the directory to grant execute privileges for all files in the directory to the group.
./ctrScript_20080603T181312.sh
- This should result in fibers that can be scored and eventually loaded into mrDiffusion and analyzed. Running the command ps will allow you to see if it is running. You can end the script by typing "Ctrl+c" or using the command kill if it's running in the background.
- If it works, output to command line screen will say:
- "Fetching parameters from disk ..."
- "Loading Tract Params File"
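The run/monitor/kill cycle described above can be sketched with a stand-in background job (a `sleep` here, since the real ctrScript name varies per subject and timestamp):

```shell
# Stand-in for ./ctrScript_<timestamp>.sh: a background job we can monitor.
sh -c 'sleep 30' &
PID=$!

# ps tells you whether the job is still alive...
ps -p "$PID" > /dev/null && echo "job $PID is running"

# ...and kill ends it if it is running in the background
# (Ctrl+c works instead when the job is in the foreground).
kill "$PID"
```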
Required arguments:
-i xxx.txt: CTR parameters text file. Can this be the same as the text file used for pathway generation, or should it be different?
-p xxx.pdb: Name of the output file for the pathways. Can this take on other formats, like .mat? What is the consequence of a .pdb format -- which programs can read it, and what special properties does it have that .mat doesn't have?
--thresh # [--sort]: where '#' is a value. Not sure what values this can take on in the current scoring system. However, if the flag --sort is also passed, then the value means: keep the top-scoring # pathways.
xxx.pdb: Must have at least one pathway as an input pathway. Not sure what other formats can be taken on.
To save and score only pathways that have a score greater than -1.5
contrack_score.glxa64 -i ''ctrSampler.txt'' -p scoredFgOut_-1p5.pdb --thresh -1.5 ''fgIn.pdb''
To save and score a particular number of pathways (e.g., 5000):
contrack_score.glxa64 -i ''ctrSampler.txt'' -p scoredFgOut_top5000.pdb --thresh 5000 --sort ''fgIn.pdb''
You can see the help for this executable by typing:
contrack_score.glxa64 --help
The ctrInitBatch pipeline consists of three functions: ctrInitBatchParams, ctrInitBatchTrack, and ctrInitBatchScore. These three functions work in concert to use conTrack to create pathway samples for multiple subjects and/or multiple ROI pairs. We discuss the use of these functions in the sections below.
- The basic workflow for multi-subject tractography is as follows:
- Run ctrInitBatchParams.m in matlab to set the tracking parameters and subject-specific variables (paths, roi pairs etc.)
- Run ctrInitBatchTrack.m in matlab. This will produce a shell script which you can run in your terminal to do the actual tractography (be sure that the contrack executable files are in your path, if you're unsure if this is the case please see the section above which will help you with this).
- Run ctrInitBatchScore.m in matlab. This will produce another shell script which you can execute in your linux terminal (or within matlab).
- The first step in running ConTrack on multiple subjects is to make sure that each of your subjects has:
- The same directory structure.
- The same ROIs - with the same naming conventions (i.e., if you want to track from the corpus callosum to the cingulate all of your subjects should have a corpus callosum ROI as well as a cingulate ROI, and they should all be located in the ROIs directory for that subject).
- ROIs should be located in /baseDirectory/subjectDirectory/ROIs.
ctrParams = ctrInitBatchParams;
[cmd, infoFile] = ctrInitBatchTrack(ctrParams);
system(cmd);
batchFileName = ctrInitBatchScore(infoFile);
system(batchFileName);
I. Initialize tracking parameters and variables using ctrInitBatchParams.m
- What you put in:
- You will need to initialize the structure and set the fields (see comments section below) to reflect the location of your data, the subjects you wish to run, the ROIs you wish to track between etc.
- What you get out:
- ctrParams - The output structure containing all the information needed to run ConTrack on your data. The output structure is then fed into ctrInitBatchTrack to create the files needed to run the algorithm.
>> ctrParams = ctrInitBatchParams
ctrParams =
projectName: 'myConTrackProj'
logName: 'myConTrackLog'
baseDir:
dtDir:
roiDir:
subs: {}
roi1: {}
roi2: {}
nSamples: 50000
maxNodes: 240
minNodes: 10
stepSize: 1
pddpdfFlag: 0
wmFlag: 0
roi1SeedFlag: 'true'
roi2SeedFlag: 'true'
multiThread: 0
executeSh: 0
ctrInitBatchParams.m Comments
The comments from ctrInitBatchParams.m are copied below and should be read carefully before running the function.
OVERVIEW:
ctrParams = ctrInitBatchParams([varargin])
Initialize the parameters structure used with ctrInitBatchTrack when
doing multi-subject tractography using the conTrack algorithm.
FIELD DESCRIPTIONS:
ctrParams ...
.projectName = ['myConTrackProj'];
    This variable will be used in the name of all the files that are created.
    E.g., outFile = ['fg_',projectName,'_',roi1,'_',roi2,'_',timeStamp,'.pdb'];
.logName = ['myConTrackLog'];
    This will be used to provide a unique name for easy ID in the log directory.
.baseDir = Top-level directory containing your data. The level below baseDir
    should have each subject's data directory.
    E.g., ctrParams.baseDir = '/biac/wandell/data/dti/';
.dtDir = The name of the directory containing the dt6.mat file, relative to
    the subject's directory. E.g., ctrParams.dtDir = 'dti40trilinrt';
.roiDir = Directory containing the ROIs, relative to the subject's directory.
    E.g., ctrParams.roiDir = 'ROIs';
.subs = Cell array that will contain the names of all the subject directories
    to be processed. E.g., ctrParams.subs = {'sub1','sub2','sub3'};
    NOTE: This script also supports loading a list of subjects from a text
    file. If you wish to do this, simply leave the cell empty; you will be
    prompted to select a text file that contains a list of subjects. Please
    ensure that this list is a simple text file with only subject names
    separated by new lines or spaces. This can also be a path to a text file
    with subject IDs.
.roi1/.roi2 = These two cell arrays should contain the names of each ROI to be
    used in tracking. The script will track from roi1{1} to roi2{1}, roi1{2}
    to roi2{2}, etc. Use .mat files, but DO NOT include file extensions. If
    you wish to track from multiple ROIs (roi1) to the same ROI, place the
    name of that one ROI in roi2 and each ROI in roi1 will be tracked to it.
    E.g., ctrParams.roi1 = {'Roi1a','Roi1b'}; ctrParams.roi2 = {'Roi2a','Roi2b'};
.nSamples = [50000]; % Number of pathway samples to generate.
.maxNodes = [240]; % Max length of the samples.
.minNodes = [10]; % Minimum length of the samples.
.stepSize = [1]; % Step size.
.pddpdfFlag = [0]; % 0 = Only compute if file does not already exist. 1 = Always recompute.
.wmFlag = [0]; % 0 = Only compute if file does not already exist. 1 = Always recompute.
.roi1SeedFlag = ['true']; % We usually want to equally seed both ROIs, so both flags = 'true'.
.roi2SeedFlag = ['true']; % For speed you can choose not to seed the second ROI.
.multiThread = [0]; % Spawn many tracking jobs at once.
.executeSh = [0]; % Execute the script on the current host immediately using an xterm.
EXAMPLE USAGE:
ctrParams = ctrInitBatchParams;
ctrParams.baseDir = '/Directory/Containing/Data';
ctrParams.dtDir = 'dti40trilin';
ctrParams.roiDir = 'ROIs'; % Relative to the subject dir. Contains the ROIs.
ctrParams.subs = {'subDir1','subDir2'};
ctrParams.roi1 = {'Roi1a','Roi1b'}; % assumes '.mat'
ctrParams.roi2 = {'Roi2a','Roi2b'}; % assumes '.mat'
ctrParams.nSamples = 10000;
[cmd] = ctrInitBatchTrack(ctrParams);
WEB RESOURCES:
http://white.stanford.edu/newlm/index.php/ConTrack
mrvBrowseSVN('ctrInitBatchParams');
See Also: ctrInitBatchTrack.m, ctrInitBatchScore.m
This section will help you run ctrInitBatchTrack.m. Copied below are the comments for the script, which should be read in detail before you run the script.
- What you put in:
- ctrParams - this structure, created with ctrInitBatchParams.m (see previous section) contains all the information needed to run this function.
- What you get out:
- cmd - this is the path to the shell script that will be run in your linux terminal, which calls contrack_gen for each of your subjects with the supplied parameters and actually tracks the fibers.
- infoFile - this .mat file will be used in the next step (scoring). It contains all your parameters and makes the scoring process a snap.
ctrInitBatchTrack.m Comments
The comments from ctrInitBatchTrack.m are copied below and should be read carefully before running the function.
function [cmd] = ctrInitBatchTrack(ctrParams)
OVERVIEW:
This script takes functions from ctrInit and makes the sampler.txt
and .sh files (used by conTrack to generate fibers) for a large
group of subjects using as many pairs of ROIs as the user desires.
The logFile: Reports the results of the process as well as the
parameters used to setup the tracking script.
The infoFile: (info structure) Created for use with
ctr_conTrackBatchScore.m and saved in the log dir with the same
name as the log file.
What you end up with here is: (1) a log.mat file (for use with
ctr_conTrackBatchScore (path = infoFile), and (2) a .sh shell
script that will be displayed in the command window, which will run
tracking for all subjects and ROIs specified. The resulting .sh
file should be run on a 64-bit linux machine with plenty of power.
USAGE NOTES:
The user should use ctrInitBatchParams to initialize the structure
that will contain all variables and algorithm params. see
ctrInitBatchParams.m - mrvBrowseSVN('ctrInitBatchParams');
After the script has completed the user will see instructions
appear in the command window telling the user to copy and paste a
provided line of code into their terminal in order to initiate
tracking. They will also see the full path to the log file that was
created by this script.
The directory in which the fibers will be saved is: subDir/fibers/conTrack/
INPUTS:
ctrParams - a structure containing all variables needed to run this
code. Get this struct by running 'ctrInitBatchParams'
OUTPUTS:
cmd - the command that can be run from your terminal to initiate
tracking. ** Why don't we allow the user to execute the
command through matlab? Because it takes up matlab
licenses for the duration of the tracking - which is not
good.
infoFile - Path to a file containing a struct with all parameters
used for tracking. infoFile can be passed in to
ctrBatchScore to initiate scoring.
VARIABLES:
** See ctrInitBatchParams to create these variables **
projectName = This variable will be used in the name of all the files
that are created. E.g.,
outFile = ['fg_',projectName,'_',roi1,'_',roi2,'_',timeStamp,'.pdb'];
logName = This will be used to provide a unique name for easy ID in the log directory.
baseDir = Top-level directory containing your data.
The level below baseDir should have each subjects data directory.
dtDir = This should be the name of the directory containing
the dt6.mat file. E.g., dti40trilinrt.
logDir = This directory will contain the log files for this project.
scrDir = This directory will contain the .sh files used for tracking in linux.
subs = This is the cell array that will contain the names
of all the subject's directories that will be
processed. ( e.g. subs = {'sub1','sub2','sub3'}; )
NOTE: This script also supports the ability to load a
list of subjects from a text file. If you wish to do
this simply comment out the subs variable in section
I or leave the cell empty. You will be prompted to
select a text file that contains a list of subjects.
Please ensure that this list is a simple text file
with only subject names separated by new lines or
spaces.
ROI1 & ROI2 = These two cell arrays should contain the names of
each ROI to be used in tracking. The script will
track from ROI1{1} to ROI2{1} and ROI1{2} to ROI2{2}
etc... In case that you wish to track from
multiple rois (ROI1) to the same roi (ROI2) you can
just place the name of one roi in ROI2 and each roi
in ROI1 will be tracked to the single roi in ROI2. **
Code assumes '.mat' file extensions. - which you
don't have to include
E.g., ROI1 = {'Roi1a','Roi1b'}; ROI2 =
{'Roi2a','Roi2b'};
EXAMPLE USAGE:
ctrParams = ctrInitBatchParams;
ctrParams.baseDir = '/Directory/Containing/Data';
ctrParams.dtDir = 'dti40trilin';
ctrParams.subs = {'subDir1','subDir2'};
ctrParams.roi1 = {'Roi1a','Roi1b'};
ctrParams.roi2 = {'Roi2a','Roi2b'};
ctrParams.nSamples = 10000;
[cmd] = ctrInitBatchTrack(ctrParams);
WEB RESOURCES:
http://white.stanford.edu/newlm/index.php/ConTrack
mrvBrowseSVN('ctrInitBatchTrack');
mrvBrowseSVN('ctrInitBatchParams');
SEE ALSO:
ctrInitBatchParams
Once you have run ctrInitBatchTrack.m and have executed the resulting batch .sh shell script you are well on your way to examining your pathways. First, however, you want to score the resulting pathway samples. Doing this is very simple once you have the log.mat file that was created by ctrInitBatchTrack.m.
To score your pathway samples:
- run ctrInitBatchScore.m in matlab
- In the dialog box - select the log.mat file created by ctrInitBatchTrack.m
- In the next dialog box (see image below) choose the number of pathways you would like returned from your original sample and choose if you would like all of the scoring operations to happen in parallel (1) or serially (0).
- Execute the resulting shell script in your linux terminal. (The command will appear in, and can be copied directly from, your matlab command window.)
ctrInitBatchScore - Comments
batchFileName = ctrInitBatchScore([infoFile], ...
[numPathsToScore], [multiThread=0], [executeScript]);
OVERVIEW
This function allows the user to score multiple conTrack fiber sets
across subjects. The user must point to an infoFile that was created with
ctrInitBatchTrack.m. That script will place all relevant data into a
structure which this function will read in.
We save out the top numPathsToScore from the original fiber group. This
is the --thresh and --sort option in contrack_score.glxa64. See
http://white.stanford.edu/newlm/index.php/ConTrack#Score_paths for more
info.
INPUTS:
infoFile - Created by ctrInitBatchTrack. Contains all relevant
data
in a struct that is read here. If the user does not
define this variable then the gui flag is tripped
and they are prompted for the info file and the other
variables as well.
numPaths... - Number of paths to score.
multiThread - If multiThread == 1 all scoring commands will be executed
in parallel.
executeScript - If executeScript==1 (default=0) then a terminal will be
launched and the batch script will be run immediately
on this machine. If == 0 then the script location will
be thrown to the command window and the user can
execute it where he/she pleases.
OUTPUTS:
batchFileName - Name of the shell script that is run to do the
scoring.
WEB RESOURCES:
mrvBrowseSVN('ctrInitBatchScore');
http://white.stanford.edu/newlm/index.php/ConTrack
- Now your pathway samples are scored and ready to be loaded into QUENCH.
This section provides instruction for those wishing to use conTrack to create pathway samples for multiple subjects as well as multiple ROI pairs. We discuss the use of scripts that were developed specifically for this purpose and how to modify them so that they will work for you.
- The basic workflow for multi-subject tractography is as follows:
- Run ctrBatchCreateContrackFiles.m in matlab.
- Execute the resulting shell script in your linux terminal (be sure that the contrack executable files are in your path, if you're unsure if this is the case please see the section above which will help you do just this).
- Run ctrBatchScore.m in matlab.
- Execute the resulting shell script in your linux terminal.
- The first step in running ConTrack on multiple subjects is to make sure that each of your subjects has:
- The same directory structure.
- The same ROIs - with the same naming conventions (i.e., if you want to track from the corpus callosum to the cingulate all of your subjects should have a corpus callosum ROI as well as a cingulate ROI, and they should all be located in the ROIs directory for that subject).
- ROIs should be located in /baseDirectory/subjectDirectory/dti*/ROIs.
This section will help you run ctrBatchCreateContrackFiles.m. Copied below are the comments for the script, which should be read in detail before you run the script.
- What you put in:
- Directory structure - you must tell the script where your data lives.
- A list of subjects - you can edit the script and add your subjects into the subs variable, or leave the variable empty and a dialog box will appear allowing you to feed the script a text file with a list of your subjects.
- ROIs - you must edit the script and enter your ROIs into the ROI1 and ROI2 variables.
- Parameters - edit the script and set the params in section III.
- What you get out:
- Shell Script - this script will be what you run in your linux terminal, which calls contrack_gen for each of your subjects with the supplied parameters and actually tracks the fibers.
- Log File - this log file will help you keep track of the subjects, ROIs and parameters you used in your project.
- Mat File - this .mat file will be used in the next step (scoring). It contains all your parameters and makes the scoring process a snap.
OVERVIEW
This script takes functions from ctrInit and makes the sampler.txt and
.sh files (used by conTrack to generate fibers) for a large group of
subjects using as many pairs of ROIs as the user desires.
The logFile: Reports the results of the process as well as the parameters
used to setup the tracking script.
The infoFile: (info structure) Created for use with
ctr_conTrackBatchScore.m and saved in the log dir with the same name as
the log file.
What you end up with here is: (1) log file (2) log.mat file (for use with
ctr_conTrackBatchScore, and (3) .sh shell script that will be displayed
in the command window, which will run tracking for all subjects and ROIs
specified. The resulting .sh file (3) should be run on a 64-bit linux
machine with plenty of power.
USAGE NOTES:
The user should only edit the lines of code within sections I, II
and III. These sections include variables that will change for each
user and should be set before the script is run.
After the script has completed the user will see instructions
appear in the command window telling the user to copy and paste a
provided line of code into their terminal in order to initiate
tracking. They will also see the full path to the log file that was
created by this script.
The directory in which the fibers will be saved is: subDir/fibers/conTrack/
VARIABLE DESCRIPTIONS:
projectName = This variable will be used in the name of all the files
that are created. E.g.,
outFile = ['fg_',projectName,'_',roi1,'_',roi2,'_',timeStamp,'.pdb'];
logName = This will be used to provide a unique name for easy ID in the log directory.
baseDir = Top-level directory containing your data.
The level below baseDir should have each subjects data directory.
dtDir = This should be the name of the directory containing
the dt6.mat file. E.g., dti40trilinrt.
logDir = This directory will contain the log files for this project.
scrDir = This directory will contain the .sh files used for tracking in linux.
subs = This is the cell array that will contain the names
of all the subject's directories that will be
processed. ( e.g. subs = {'sub1','sub2','sub3'}; )
NOTE: This script also supports the ability to load a
list of subjects from a text file. If you wish to do
this simply comment out the subs variable in section
I or leave the cell empty. You will be prompted to
select a text file that contains a list of subjects.
Please ensure that this list is a simple text file
with only subject names separated by new lines or
spaces.
ROI1 & ROI2 = These two cell arrays should contain the names of each ROI to be used
in tracking. The script will track from ROI1{1} to ROI2{1} and
ROI1{2} to ROI2{2} etc... Use .mat files, but DO NOT include file
extensions. In case that you wish to track from multiple rois (ROI1) to the
same roi (ROI2) you can just place the name of one roi in ROI2 and
each roi in ROI1 will be tracked to the single roi in ROI2.
E.g., ROI1 = {'Roi1a','Roi1b'}; ROI2 = {'Roi2a','Roi2b'};
HISTORY:
08.27.2009: LMP Wrote the thing
07.23.2010: LMP adapted from ctr_makeConTrackFiles and made the code more
general.
Once you have run ctrBatchCreateContrackFiles.m and have executed the resulting batch .sh shell script you are well on your way to examining your pathways. First, however, you want to score the resulting pathway samples. Doing this is very simple once you have the log.mat file that was created by ctrBatchCreateContrackFiles.m.
To score your pathway samples:
- run ctrBatchScore.m in matlab
- In the dialog box - select the log.mat file created by ctrBatchCreateContrackFiles.m
- In the next dialog box (see image below) choose the number of pathways you would like returned from your original sample and choose if you would like all of the scoring operations to happen in parallel (1) or serially (0).
- Execute the resulting shell script in your linux terminal. (The command will appear in, and can be copied directly from, your matlab command window.)
Loading Fibers Using MrDiffusion or Quench
To quickly view your scored fiber pathways you can either load them into Quench or import them using MrDiffusion in matlab.
MrDiffusion: File --> Fibers --> Import fibers ...
Quench: File > Load Pathways ...
... Select the .pdb paths you just created from contrack_score
A super-basic bash shell primer is located here.
You can see what is inside the script by using simple bash commands:
more ctrScript_20080603T181312.sh
cat ctrScript_20080603T181312.sh
You can edit the script using a terminal text editor like emacs or nano:
nano ctrScript_20080603T181312.sh
If you run into permissions problems, make sure to make the script executable: chmod a+x ctrScript_20080603T181312.sh
#!/bin/bash
- Bash script syntax for setting up this kind of file
- CMD is a variable that stores the command string that we want to execute in order to generate paths using conTrack. This line doesn't do anything yet, it just stores the command into the CMD variable.
- nohup: means "no hang up" and ensures that the path generation will continue even if you log out of the terminal
- contrack_gen.glxa64: this is the actual tract generation code, which has been compiled to run on 64-bit linux (teal). This means that it has been converted to machine code and is no longer human readable, unlike a matlab .m file. You can see the help for this executable by typing:
contrack_gen.glxa64 --help
- -i ctrSampler_timestamp.txt: -i is a flag that lets contrack_gen know the next string specifies the text file containing the parameters that conTrack_gen will look in for tract generation parameters. Since this text file is in the same directory the script is called from, we don't provide any additional path info.
- -p roi1_roi2_timestamp.pdb: this second argument sets the output file name where the generated fibers will be saved. The location will be here, where the ctrScript is called from, in the subjDir/fibers/conTrack directory. These fibers are in Bfloat format, which cannot be loaded into mrDiffusion; you could change the extension to .pdb, but in any case we should not look at unscored fibers in mrDiffusion, says Tony.
- 2>ctrLog_timestamp.txt: the 2> syntax means to write anything printed to the command line providing information about status/errors/etc to a log file, whose name is specified right after.
- The command echo is equivalent to using fprintf in MATLAB and will print a copy of the command executed above to the command line so you can look at it and make sure there are no shenanigans.
- This actually tells the bash shell to execute the command.
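Putting the pieces above together, the generated script has roughly the following shape. This is a sketch for orientation only: the filenames come from the examples earlier on this page, and the real file written by ctrInit may differ in detail.

```shell
#!/bin/bash
# Illustrative shape of a generated ctrScript_<timestamp>.sh (filenames are examples).
# Store the tracking command (with nohup and the error-log redirect) in CMD...
CMD="nohup contrack_gen.glxa64 -i ctrSampler_20080603T181312.txt -p roi1_roi2_20080603T181312.pdb 2>ctrLog_20080603T181312.txt"
# ...echo it so you can eyeball it...
echo $CMD
# ...then execute it.
$CMD
```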
The ctrSampler_timestamp.txt file has the following fields.
- Params: This should be set to 1, it is the version number that Tony's functions need to know about.
- Image Directory: This is the full path to where the image data are stored, usually inside the subject's dt6/bin directory. For example: /biac1/subjDir/dti30/bin/
- WM/GM Mask Filename: This should be wmProb.nii.gz, and path listed relative to the image directory (so just a bare filename since it is located in the bin directory already). This probability image is used as a mask, depending on the threshold parameter below. This means that conTrack will NOT track over voxels with values below the threshold. We therefore want the threshold to be extremely lenient. Tony says it's okay if CSF is included, etc.
- PDF Filename: This stands for probability density function and should be pdf.nii.gz, with path listed relative to the image directory (so just a bare filename since it is located in the bin directory already). If tensors.nii.gz and pddDispersion.nii.gz look okay, this will look okay. This is the critical file for probabilistic tracking, as it gives the probability distribution over fiber directions at each voxel. The PDF on the PDD reflects tensor uncertainty. For example, an elongated cigar shape is very certain to be along the length axis of the cigar, whereas a sphere shape reflects equal probability in all directions. This is shown in Figure 4 of Tony's paper.
- ROI MASK Filename: Example=rffa_rlo_20080624T144612.nii.gz. This path is also given relative to the image directory. This is created by ctrInit based on the two ROIs that you specify and is basically a mask with each ROI given a different mask value.
- Desired Samples: Tony's suggested default value is 100,000. According to Bob, anything in the thousands will be sufficient (e.g., 5,000). The more, the better: if there are multiple pathways, one of which is more likely (fewer intersecting fibers, etc.), then a larger number of desired samples will increase your ability to detect the less likely pathways. In addition, too low a number risks getting stuck in a local minimum, meaning you would get totally different fibers each time you ran conTrack with that number of samples. The way to detect this problem is to run the program multiple times and compare the results.
- Relationship between number of samples and running time: Each set of ROIs will have a different running time depending on how "difficult" it is for the algorithm to find paths between them (shorter running time if paths are easy to find, infinite if no paths exist). However, if you know how long the program takes to generate a certain number of fibers, the time to track more fibers will increase linearly. For example, if it takes 5 minutes to track 500 fibers, it will take 10 minutes to track 1,000 fibers.
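The linear scaling described above can be used for a back-of-envelope runtime estimate. A minimal sketch, using the example figures from the text above (the baseline numbers are illustrative, not measured values):

```shell
# Rough linear runtime estimate, assuming generation time scales
# linearly with the number of desired samples once a baseline is timed.
baseline_fibers=500      # fibers generated in a short timed test run
baseline_minutes=5       # wall-clock minutes that run took
desired_fibers=100000    # the "Desired Samples" you plan to request
estimated=$(( desired_fibers * baseline_minutes / baseline_fibers ))
echo "Estimated runtime: ${estimated} minutes"
```

Remember that this only holds per ROI pair; a harder ROI pair has a different (and possibly much larger) baseline.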
- Max Pathway Nodes: This number is in units of step size. Pathways with a greater number of nodes than this will not be returned. You want it low enough that you won't get a fiber that loops around the whole brain multiple times (anatomically impossible). Tony's suggested default value is 240. Ideally, you would calibrate this to the known distance between your ROIs.
- Min Pathway Nodes: This is also in units of stepsize, and will depend on the minimum distance between your ROIs. Pathways with fewer nodes than this number will not be returned. Right now, I am using a default value of 3 for fibers between LO and FFA, as these are close together.
- Step Size (mm): Default value is 1. You want this value to be less than the resolution of your data. For example, our DTI data voxel resolution is 2x2x2, so a step size of 1 works well. If our voxel resolution changes, we may want to revisit this issue and come up with a new step size.
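Since both node limits are in units of step size, they can be sanity-checked against the ROI separation. A hypothetical calibration sketch (the 4x margin on the max is an illustrative assumption, not a ConTrack rule; the distance value is made up):

```shell
# Relate Min/Max Pathway Nodes to ROI separation and step size.
# A path between ROIs roughly d mm apart needs at least d/step nodes;
# the 4x factor allows curved paths (assumption for illustration only).
roi_distance_mm=60       # approximate straight-line distance between ROIs
step_size_mm=1           # the Step Size parameter
min_nodes=$(( roi_distance_mm / step_size_mm ))
max_nodes=$(( 4 * roi_distance_mm / step_size_mm ))
echo "min nodes >= ${min_nodes}, max nodes ~ ${max_nodes}"
```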
- Start Is Seed VOI: If you check the box "Seed ROI 1" in the ctrInit gui, that will set the value of this to true. If you want symmetric tracking, which is usually the case, then both this field and the next field should be set to true. In some special cases (e.g., tracking to a fiber bottleneck like the corpus callosum), seeding both a gray matter ROI and the white matter bottleneck will be extremely inefficient, so you would want to set seeding in the white matter ROI to false.
- End Is Seed VOI: Same as above, but reflects checkbox status of "Seed ROI 2" in the ctrInit gui.
- Save Out Spacing: Tony's suggested default is 50. This has to do with how contrack_gen saves out the data. Rather than waiting until all requested paths are generated, it holds only up to this many paths in memory before writing them to file.
- Threshold for WM/GM specification: Tony's suggested default is 0.01. See discussion of WM/GM mask file above.
- Absorption Rate WM: Tony's suggested default is 0. Absorption refers to the penalty/reward parameter for length. In his 2008 JOV paper, Figure 5A, it is ln(lambda). It states on p.7 that "the user may want to adjust lambda depending on the distance between the anatomical regions of interest, though we never observed a large effect."
- Absorption Rate NotWM: Tony's suggested default is 0.
- Local Path Segment Smoothness Standard Deviation: Tony's suggested default is 14. This is sigma-subscript-c in Figure 5B on p.7 of the Sherbondy paper. He did many analyses and found that setting this parameter to this value resulted in the most consistent conTrack scoring. It is in units of degrees.
- Local Path Segment Angle Cutoff: Tony's suggested default is 130. This is the amount that a path is allowed to turn between tensors and is in units of degrees.
- ShapeFunc Params (LinMidCl,LinWidthCl,UniformS): Tony's suggested default is [0.175,] This is shown in Figure 4 of the paper and specifies how pathway likelihood varies as a function of the tensor probability distribution's shape (from more prolate/cigar-like to more oblate/pancake-like or spherical). These tensor probability density functions are computed from pdf.nii.gz.
- Number of samples generated: 200,000 for SFN; samples of 100,000 reported in prior Sherbondy papers.
- Step size: 1 mm (less than the voxel size is typical).
- Procedure for restricting generated fibers: SFN -- we took the top-scoring 10% of fibers, which are estimated to be the most likely (top 20,000 of 200,000). NSF -- top-scoring 5% of fibers (top 10,000 of 200,000).
- Bootstrapping: pdf.nii.gz uses bootstrapping of the tensor values in addition to the eigenvalues computed during the tensor fit (dtiRawFitTensorMex). We should report how many samples and how many iterations are used for the bootstrapping.
- Scoring parameters: Sherbondy et al, 2007 -- there are parameters that set conTrack's "priors":
- sigma-sub-c: prior that smoother paths are more likely
- In ctrSampler file: Local Path Segment Smoothness Standard Deviation: 14 (see above for more info)
- Empirically defined from a set of STT pathways (Sherbondy et al, 2007, p.7)
- lambda: length penalty / prior that shorter pathways are more likely
- In ctrSampler file: Absorption Rate WM: 0 (see above for more info)
- What does setting this to 0 mean? Does this mean that conTrack does not apply any kind of length penalty?
- Why does the geometric mean score ignore lambda? This was written here.
- eta: data weighting / prior that pathways that depend on the data are more likely
- In ctrSampler file: ShapeFunc Params (LinMidCl,LinWidthCl,UniformS): [0.175,] (see above for more info)
- Empirically defined from examination of the CL value of voxels in regions of suspected fasciculi crossings.
- Can setting this to a different value basically tell conTrack to ignore the data?
- Turning threshold
- In ctrSampler file: Local Path Segment Angle Cutoff: 130
- Not sure how this is used by conTrack, find out...
- data uncertainty: Sherbondy et al, 2007, p.5, after equation (6): this was set to the maximum level of uncertainty according to the Watson distribution, which was the equivalent of 54.4 degrees on a sphere. This is further described in Schwartzman, Dougherty, & Taylor, 2005.
- shape uncertainty: original tensors are replaced by tensors in which all eigenvalues are equal (sphere), though the trace (volume, MD) is preserved to match the original value. We did not set all tensor matrix traces to be equal to each other and to some particular value. Although absolute eigenvalue magnitudes are passed into conTrack via the pdf.nii.gz files, Tony says that only ratios are used. I can't verify this because the conTrack code is compiled/not human-readable.
Samir Menon has rewritten the ConTrack scoring algorithm in Matlab. Instructions on using his implementation can be found on the ConTrack Score page.
Notes related to script development.
- ctrCreate: all defaults set here should be rewritten as calls to ctrSet
- Menu: Edit --> Full Parameter list: this should eventually list all the parameters and allow the user to interact with/edit all the parameters (not just the handful currently exposed on the ctrInit GUI).
- Develop way to bypass ctrInit by specifying the info from the command line without going through the GUI. Type the command below for more information:
- help ctrInitParamsFile
- File --> Load a ctrParams.txt file that was previously created/saved.
- Bob: fix dtiFindWhiteMatter
- What is the difference between the two absorption parameters? Is the value I put in the value of lambda, or ln(lambda)? How should this value be adjusted for different distances between ROIs -- increased or decreased as the ROIs get closer together? A: The Absorption Rate WM value is the same as ln(lambda) in the paper. The value provided in this parameter is added at every point the path passes through that intersects the white matter mask. The other absorption parameter is not used, because by default the program does not allow pathways outside of the white matter mask. A penalty above zero penalizes long pathways; below zero, short pathways. Note: The default score is now the geometric mean of the scores along the pathway. This version of the score therefore ignores lambda.
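One way to see why a geometric-mean score is insensitive to lambda (a sketch based on the description above; the compiled ConTrack code was not inspected, so treat this as reasoning, not a statement of the implementation):

```latex
% If per-node absorption multiplies each of the n segment scores by
% \lambda, then for a path with raw segment scores s_1,\dots,s_n:
\text{total score} = \lambda^{n}\prod_{i=1}^{n} s_i,
\qquad
\text{geometric mean}
  = \Big(\lambda^{n}\prod_{i=1}^{n} s_i\Big)^{1/n}
  = \lambda\,\Big(\prod_{i=1}^{n} s_i\Big)^{1/n}.
% \lambda becomes a constant factor common to every pathway, so it
% cannot change the relative ranking of pathways.
```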
- Check that I understand relationship between time and number of samples.
- Any way to make ctrInit automatically make the ctrScripts executable? (problem with permissions)
- Does the nohup command also require me to be running the script in the background, or will it work even if contrack_gen is running in the foreground? A: nohup means that no hangup signal will be sent, so you can run it in the foreground or background, and you can kill it with Ctrl-C if it is in the foreground. The purpose of nohup is to let you log out of the shell from which you initiated the command without Linux killing the application because it thinks it is a zombie. See man nohup for more information.
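A minimal nohup sketch, using sleep as a stand-in for a long contrack_gen run (the actual contrack_gen.glxa64 arguments are deliberately not shown here):

```shell
# Run a long job under nohup so it survives logging out of the shell.
# Redirecting stdout/stderr to a log file keeps all output after logout.
nohup sleep 1 > gen.log 2>&1 &
job_pid=$!
echo "started background job ${job_pid}"
wait "${job_pid}"        # in real use you would log out rather than wait
echo "job finished"
```

The `&` puts the job in the background; without it, nohup still protects a foreground job from the hangup signal, but your terminal is tied up until it finishes.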
- Why doesn't the ctrScript.sh file work as is? To make it work currently, I have to skip the nohups and 2> portions of the script.
- Why does the dtiFiberUI "import fibers" want Bfloat format but not pdb format?
Print status of bash shell script to screen
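Two standard ways to surface a bash script's progress on screen are explicit echo lines and the built-in `set -x` command trace. A minimal sketch (the variable and messages are illustrative):

```shell
# Explicit status messages plus command tracing.
echo "step 1: building ctrSampler.txt"
set -x                   # trace each command (printed to stderr) before it runs
samples=100000
set +x                   # stop tracing
echo "step 2: requesting ${samples} samples"
echo "done"
```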
According to Brian, life is better if you treat all important structs as objects. This means that you would get information from them with a "get" call, change field values with a "set" call, and create the struct fields with default values in a "create" call.
Example of this is:
ctrCreate: should consist of calls to ctrSet
ctrSet: the reverse of ctrGet
ctrGet: Will typically be longer, with more functions than ctrSet. This is because you should be able to ask a variety of questions of the object -- for example, What is your mean, max, min length? What is the length in various units? Inside here we can manipulate the values stored in a particular field so they are output appropriately.