Dear Professor Reuter,
Thank you for your reply and recommendation for computing the single FDR.
I was able to solve my problem by following your explanation.
Best Regards,
Han.
On Fri, Jan 1, 2016 at 12:59 PM, Martin Reuter
wrote:
> Hi Han,
>
> I think detvtx is the detected vertices (a binary vector
Hi Koushik,
yes, this paper compares cross and long processing:
http://reuter.mit.edu/papers/reuter-long12.pdf
http://www.sciencedirect.com/science/article/pii/S1053811912002765
improvements are region dependent. When comparing the methods, you can
keep the sample size and number of time points fixed.
Thanks for the clarification, Martin. Is there any literature on how much the
sensitivity and reliability improve? Does it depend on sample size and the
number of serial time points?
Happy new year to you too,
Koushik A. Govindarajan
Hi Han,
I think detvtx is the detected vertices (a binary vector with the vertices
where you can reject the null hypothesis after FDR). The 3rd argument (pth)
is the FDR threshold.
You can compute pcor = -log10(pth)
from it and use it as a threshold when plotting the p-value map to limit
the plot to only significant vertices.
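A minimal sketch of what such an FDR routine returns, assuming the standard Benjamini-Hochberg procedure (this is illustrative Python, not the actual FreeSurfer/MATLAB code; the names detvtx and pth follow the email above):

```python
# Toy Benjamini-Hochberg FDR: returns a binary vector of detected
# vertices (detvtx) and the p-value threshold (pth), then converts
# pth to the -log10 scale (pcor) for thresholding a p-value map.
import numpy as np

def fdr_detect(pvals, q=0.05):
    """Return (detected, pth): rejection mask and the BH p-value threshold."""
    p = np.asarray(pvals, dtype=float)
    n = p.size
    ranked = np.sort(p)
    # Largest k with p_(k) <= (k/n) * q  (Benjamini-Hochberg step-up rule)
    below = ranked <= (np.arange(1, n + 1) / n) * q
    if not below.any():
        return np.zeros(n, dtype=bool), 0.0
    pth = ranked[np.max(np.nonzero(below)[0])]
    return p <= pth, pth

pvals = np.array([0.001, 0.008, 0.039, 0.041, 0.3, 0.6])
detvtx, pth = fdr_detect(pvals, q=0.05)
pcor = -np.log10(pth)  # use as threshold when plotting the -log10(p) map
```

Vertices with -log10(p) above pcor are exactly those flagged in detvtx, so plotting with pcor as the display threshold limits the map to the FDR-significant vertices.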
Hello Freesurfer experts,
I need to upload some data to a public repository and am trying to limit
the number of files needed.
I am wondering which subject files are absolutely necessary to run an
analysis in qdec. This is for an LGI analysis. I tried using only the surf
directory, but qdec wo
Hello Freesurfer experts,
I need help finding the minimum number of vertices required to pass the
cluster-corrected threshold with Monte Carlo simulations.
Is there a way to figure this out?
--
Thank you,
Tara
___
Freesurfer mailing list
Hi Koushik,
the cross-sectional results come from processing the images
independently. Those results are noisier than necessary. By using
information from across time points within the same subject
(longitudinal stream), noise can be removed, so you will get more
reliable and more sensitive estimates.
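A toy illustration of why pooling across time points helps, under the simplifying assumption of i.i.d. measurement noise (the real longitudinal stream is more sophisticated than plain averaging; all numbers here are made up):

```python
# Averaging k repeated measures of the same subject shrinks the noise
# standard deviation by roughly 1/sqrt(k), which is the intuition for
# why the longitudinal stream gives less noisy estimates than
# processing each time point independently.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_timepoints, sigma = 2000, 4, 1.0
true_value = 2.5  # hypothetical cortical thickness (mm) at one vertex

# One noisy, independently processed measurement per time point
measurements = true_value + rng.normal(0.0, sigma, (n_subjects, n_timepoints))

cross_std = measurements[:, 0].std()        # noise of a single time point
long_std = measurements.mean(axis=1).std()  # noise after within-subject averaging
# Empirically, long_std is close to sigma / sqrt(n_timepoints)
```

With 4 time points the within-subject average is about twice as precise as any single cross-sectional measurement, which translates directly into higher sensitivity at a fixed sample size.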