RE: [NMusers] OMEGA matrix

2014-10-02 Thread Eleveld, DJ
Hi Jeroen,

I have also seen that adding correlations often gave an impressive improvement in 
objective function. However, very often when I test that model using 
cross-validation the predictive performance is *worse* than the model without 
the correlation. I would call that classic over-fitting. It's the same thing you 
would expect when adding an unnecessary theta to the model. It was a bit of 
an eye-opener for me to see these things as equivalent. Try it, you might be 
surprised.

It might depend on how you evaluate and use your models. I care most about how 
well a model predicts datasets it has never seen and less about whether it's the 
absolute best model for the current dataset. Your applications may be different.

Whether adding a correlation counts as an additional degree of freedom always 
confused me. Now I just let NONMEM decide; it tells you how many parameters it 
is optimizing.

warm regards,
Douglas Eleveld


From: Jeroen Elassaiss-Schaap [mailto:jer...@pd-value.com]
Sent: September 30, 2014 1:00 AM
To: ken.kowal...@a2pg.com; nmusers@globomaxnm.com; joseph.stand...@nhs.net; 
non...@optonline.net; Eleveld, DJ
Subject: Re: [NMusers] OMEGA matrix


Dear Pavel, others,

The underlying technical difference is that SAEM is at its core a sampling 
methodology. Off-diagonal elements (as explained by Bob Bauer) are available as 
sample correlations and do not have to be computed separately, in contrast to 
linearization approaches such as FOCE.

The more interesting question to me, as also alluded to by Ken, is what criteria 
to set up for inclusion of an off-diagonal element. I completely support his 
argument for the simulation performance of the model, as judged e.g. using a VPC. 
Whether to score it as an additional degree of freedom may be up for debate. An 
off-diagonal element in essence limits the freedom of the model, as the random 
space in which samples can be generated becomes smaller. From that perspective 
one could argue for retaining any off-diagonal element that deviates sufficiently 
from zero regardless of OFV changes, and for not applying the concept of 
over-parametrization (or at least not in comparison to other types of 
parameters). In practice, inclusion of an important off-diagonal element is 
usually accompanied by a sound improvement in OFV anyway.
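[Editor's sketch] To make the parameter bookkeeping concrete, the OMEGA records 
below show the difference (record fragments only, not a full control stream; the 
CL/V labels and values are illustrative). A diagonal specification estimates two 
parameters, while BLOCK(2) estimates three, the extra one being the covariance:

; Diagonal OMEGA: two estimated variances, correlation fixed at zero
$OMEGA 0.1        ; IIV on CL
$OMEGA 0.1        ; IIV on V

; Full block: three estimated parameters; the off-diagonal is cov(ETA_CL, ETA_V)
$OMEGA BLOCK(2)
 0.1              ; var(ETA_CL)
 0.05 0.1         ; cov(ETA_CL, ETA_V), var(ETA_V)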

More can be found in earlier discussions we had on this list; see e.g. 
https://www.mail-archive.com/nmusers@globomaxnm.com/msg02736.html for quite an 
extensive one from 2010. An R script to visualize the impact on parameter space 
can also be found there ;-).

In cases where a larger full or banded omega block is found, I would advise 
exploring its properties further using matrix decomposition approaches (PCA, 
etc.) to evaluate propagated correlations across the matrix, but also on the 
basis of physiology/pharmacology, as a data sample may not be informative enough 
to support robust interpretation of correlations. A discussion along those lines 
in reporting seems more fruitful to me.

Best regards,
Jeroen

http://pd-value.com
-- More value out of your data!

-Original Message-
From: owner-nmus...@globomaxnm.com 
[mailto:owner-nmus...@globomaxnm.com] On Behalf Of Standing Joseph (GREAT 
ORMOND STREET HOSPITAL FOR CHILDREN NHS FOUNDATION TRUST)
Sent: Friday, September 26, 2014 09:15
To: Kowalski, Ken; 'Eleveld, DJ'; 'Pavel Belo'; 
nmusers@globomaxnm.com
Subject: RE: [NMusers] OMEGA matrix

Dear Pavel,

To answer your question I suggest you go on Bob Bauer's NONMEM 7 course.  The 
understanding I gleaned from that course (which I think was enhanced by the 
excellent wine we had at lunch in Alicante) was that with appropriate MU 
parameterisation there is virtually no computational disadvantage to estimating 
the full block with the newer algorithms.  So you might as well do it, at least 
in early runs where you want an idea of which parameter correlations might be 
useful/reasonably estimated.
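[Editor's sketch] A minimal example of what such a run might look like, with MU 
referencing and a full OMEGA block estimated by SAEM. The one-compartment model, 
data file, and initial values are hypothetical, for illustration only:

$PROBLEM Full OMEGA block with MU referencing, estimated by SAEM (sketch)
$INPUT ID TIME AMT DV MDV
$DATA example.csv IGNORE=@
$SUBROUTINES ADVAN2 TRANS2
$PK
 ; MU referencing: THETAs are on the log scale, MU_i = THETA(i)
 MU_1 = THETA(1)
 MU_2 = THETA(2)
 MU_3 = THETA(3)
 CL = EXP(MU_1 + ETA(1))
 V  = EXP(MU_2 + ETA(2))
 KA = EXP(MU_3 + ETA(3))
 S2 = V
$ERROR
 Y = F*(1 + EPS(1)) + EPS(2)
$THETA 1.0   ; log(CL)
$THETA 3.0   ; log(V)
$THETA 0.1   ; log(KA)
$OMEGA BLOCK(3)   ; full block: 3 variances + 3 covariances
 0.1
 0.01 0.1
 0.01 0.01 0.1
$SIGMA 0.04 0.01
$ESTIMATION METHOD=SAEM INTERACTION NBURN=2000 NITER=1000 PRINT=100 SEED=12345
$ESTIMATION METHOD=IMP INTERACTION EONLY=1 NITER=5 ISAMPLE=1000 PRINT=1
$COVARIANCE UNCONDITIONAL

The parameter count NONMEM reports for such a run includes the six OMEGA 
elements; switching to a diagonal $OMEGA drops the three covariances, which is 
Douglas's point about letting NONMEM count the degrees of freedom.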

BW,

Joe


Joseph F Standing
MRC Fellow, UCL Institute of Child Health
Antimicrobial Pharmacist, Great Ormond Street Hospital
Tel: +44(0)207 905 2370
Mobile: +44(0)7970 572435


From: owner-nmus...@globomaxnm.com 
[owner-nmus...@globomaxnm.com] On Behalf 
Of Ken Kowalski [ken.kowal...@a2pg.com]
Sent: 25 September 2014 22:43
To: 'Eleveld, DJ'; 'Pavel Belo'; 
nmusers@globomaxnm.com
Subject: RE: [NMusers] OMEGA matrix

Hi Douglas,

My own thinking is that you should fit the largest omega structure that can
be supported by the data rather than j

Re: [NMusers] OMEGA matrix

2014-10-02 Thread Gastonguay, Marc
Douglas makes an important point in this discussion: the method used to judge 
the parsimony of a model must consider the performance of the model for its 
intended purpose.

Consider the parsimony principle: "all things being equal, choose the
simpler model". The key is in how to judge the first part of that
statement.

A model developed based on goodness-of-fit metrics such as AIC, BIC, or
repeated likelihood ratio tests may be the most parsimonious model for
predicting the current data set. This doesn't ensure that the model will be
"equal" in performance to more complex models for the purpose of predicting
the typical value in an external data set - external cross-validation might
be required for that conclusion. Further, if the purpose is to develop a
model that is a reliable stochastic simulation tool, a simulation-based
model checking method should be part of the assessment of "equal"
performance when arriving at a parsimonious model.

Since most of our modeling goals go far beyond prediction of the current
data set, it's necessary to move beyond metrics solely based on objective
function and degrees of freedom when selecting a model. In other words, it
may be perfectly fine (and even parsimonious) for a model to include more
parameters than the likelihood ratio test tells you to, if those parameters
improve performance for the intended purpose.

Best regards,
Marc


RE: [NMusers] OMEGA matrix

2014-10-02 Thread Ken Kowalski
Hi All,

 

I agree with everything that Marc and Douglas have pointed out.  I too do not 
advise building the omega structure based on repeated likelihood ratio tests.  
The approach I take is more akin to what Joe suggested earlier: using SAEM to 
fit the full block omega structure and then looking for patterns in the 
estimated omega matrix.  Even with FOCE estimation I will often fit a full 
block omega structure just to look for such patterns.  The full block omega 
structure may be over-parameterized and sometimes may not even converge.  
Nevertheless, as a diagnostic run it can be useful for uncovering patterns that 
may lead to reduced omega structures with more stable model fits (i.e., not 
over-parameterized).  I'm not necessarily driven to find a parsimonious omega 
structure, as I'll certainly err on the side of including additional elements in 
omega provided there is sufficient support to estimate these parameters (i.e., 
a stable model fit).  For example, I will select a full omega structure 
regardless of the magnitude of the correlations if the model is stable and not 
over-parameterized.  I have no issue with those who want to identify a 
parsimonious omega structure; however, I still maintain that a diagonal omega 
structure is often not the most parsimonious.

 

I also agree with Marc's comment that we must judge parsimony relative to the 
intended purpose of the model.  If we are only interested in using our model to 
predict central tendency, then a diagonal omega structure may be all that is 
needed.  I would contend, however, that we often want to use our models for 
more than just predicting central tendency.  If we perform VPCs, 
cross-validation, or external validations on independent datasets, but the 
statistics we summarize to assess predictive performance only involve central 
tendency, then we're not really going to get a robust assessment of the omega 
structure.  To evaluate the omega structure we need to use VPC statistics that 
describe variation and other percentiles besides the median.  My impression is 
that we aren't as rigorous in our assessments of whether our models can 
adequately describe the variation in our data.  As I stated earlier, I see many 
standard VPC plots where virtually 100% of the observed data are contained well 
within the 5th and 95th percentiles.  The presenter will often claim that these 
VPC plots support the adequacy of the predictions, but clearly the model is 
over-predicting the variation.  The over-prediction of the variation may or may 
not be related to the omega structure, as it could also be related to skewed or 
non-normal random effect distributions.  However, if a diagonal omega structure 
was used and I saw this over-prediction of the variation in a VPC plot, one of 
the first things I would do is re-evaluate the omega structure and see whether 
an alternative omega structure can improve the prediction of these percentiles.
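[Editor's sketch] A minimal simulation control stream of the kind that feeds 
such a VPC, with final estimates fixed (the model, data file, and values are 
hypothetical). The percentile comparison itself - e.g. 5th/50th/95th percentiles 
of observed versus simulated concentrations per time bin - is done afterwards in 
post-processing, for example with a tool such as PsN's vpc:

$PROBLEM VPC simulation from final estimates (sketch)
$INPUT ID TIME AMT DV MDV
$DATA example.csv IGNORE=@
$SUBROUTINES ADVAN2 TRANS2
$PK
 CL = THETA(1)*EXP(ETA(1))
 V  = THETA(2)*EXP(ETA(2))
 KA = THETA(3)*EXP(ETA(3))
 S2 = V
$ERROR
 Y = F*(1 + EPS(1)) + EPS(2)
$THETA 2.7 FIX 20 FIX 1.0 FIX        ; final estimates, fixed
$OMEGA BLOCK(3) FIX                  ; final full-block OMEGA, fixed
 0.09
 0.05 0.09
 0.01 0.01 0.16
$SIGMA 0.04 FIX 0.01 FIX
$SIMULATION (20141002) ONLYSIMULATION SUBPROBLEMS=500
$TABLE ID TIME DV NOPRINT NOAPPEND ONEHEADER FILE=vpc_sim.tab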

 

Best,

 

Ken

 



SV: [NMusers] OMEGA matrix

2014-10-02 Thread Mats Karlsson
Hi all,

I agree with what Ken and Marc have said. On the point of a full matrix as a 
diagnostic, which I think is good, an alternative is to run a nonparametric 
estimation ($NONP) after your normal estimation. Even if you did not use a full 
block in the original estimation, this step will give you one (and it will 
“never” have estimation problems). It is not entirely unproblematic to use as 
is, because sometimes a variance can be biased due to an imposed diagonal 
structure in the preceding parametric step, but it will often give informative 
results about how to formulate an appropriate correlation structure. If you are 
ambitious, you can use the extended grid option, which I think was recently 
implemented and addresses this problem.
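[Editor's sketch] For readers who have not used the record, a minimal sketch of 
this sequence: a parametric FOCE fit with a diagonal OMEGA, followed by a 
nonparametric step. The one-compartment model and data file are hypothetical, 
and the extended-grid option mentioned above is not shown:

$PROBLEM Nonparametric step after a parametric FOCE fit (sketch)
$INPUT ID TIME AMT DV MDV
$DATA example.csv IGNORE=@
$SUBROUTINES ADVAN2 TRANS2
$PK
 CL = THETA(1)*EXP(ETA(1))
 V  = THETA(2)*EXP(ETA(2))
 KA = THETA(3)*EXP(ETA(3))
 S2 = V
$ERROR
 Y = F*(1 + EPS(1)) + EPS(2)
$THETA (0, 2.7) (0, 20) (0, 1.0)
$OMEGA 0.1 0.1 0.1                   ; diagonal OMEGA in the parametric step
$SIGMA 0.04 0.01
$ESTIMATION METHOD=1 INTERACTION MAXEVAL=9999 PRINT=5
$NONPARAMETRIC UNCONDITIONAL         ; nonparametric joint ETA distribution

The nonparametric results include the variance-covariance matrix of the ETA 
distribution, so candidate correlations can be read off even though the 
parametric OMEGA above was diagonal.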

I haven’t had Douglas’s experience that adding off-diagonal elements makes the 
simulation properties of a model worse. The nonparametric option does allow a 
fuller description of the correlation than the linear one, though, so if that 
was the problem, $NONP may offer a solution.

Best regards,
Mats


Mats Karlsson, PhD
Professor of Pharmacometrics

Dept of Pharmaceutical Biosciences
Faculty of Pharmacy
Uppsala University
Box 591
75124 Uppsala

Phone: +46 18 4714105
Fax + 46 18 4714003
www.farmbio.uu.se/research/researchgroups/pharmacometrics/


Re: SV: [NMusers] OMEGA matrix

2014-10-02 Thread Jeroen Elassaiss-Schaap
Hi everybody,

Nice discussion! Good to hear that we seem to be in agreement on how to deal 
with off-diagonal elements. Thanks for all your feedback!

I would like to underscore Mats' comment about the extended grid option. In my 
experience too it seems to work very well, as an efficient approach to deriving 
an omega block.

Perhaps the loose end here is how to deal with the situation that Doug seems to 
encounter. Such cases (off-diagonal elements that worsen predictability) may be 
a signal of underlying processes or covariates that are not accounted for or 
not present in the dataset. Any better suggestions?

Best regards,
Jeroen

http://pd-value.com
-- More value out of your data!




RE: [NMusers] OMEGA matrix

2014-10-02 Thread Ken Kowalski
Hi All,

 

My own anecdotal experience is consistent with Mats' comment that a variance 
can be biased when a diagonal omega structure is imposed.  When fitting a 
diagonal omega structure I sometimes find that a particular variance component 
is estimated near zero.  However, as soon as you include a covariance 
parameter, this variance component can be reasonably estimated and the 
variance component for another random effect correlated with this one is 
smaller in magnitude.  Sometimes, when fitting a diagonal omega structure when 
there are correlations, NONMEM may have difficulty deciding how to partition 
the ‘total’ variability between the two correlated random effects, often 
inflating one and driving the other to zero.  I have not investigated this 
through any kind of simulation study, but this has been my experience.

 

Mats,

 

I’m curious to understand better what you mean when you say that $NONP may 
provide a fuller description of the correlation than a linear one and may offer 
a better solution.  Are you referring to random effects that may not have a 
normal distribution when assessing these correlations?  Do you use $NONP to guide 
distribution when assessing these correlations?  Do you use $NONP to guide 
selection of transformations of the random effects so that they are more normal 
before estimating the omega matrix?  $NONP is an estimation method but I don’t 
see how it is useful for simulation purposes (if we assume normal random 
effects in the simulations) unless you somehow use it as a diagnostic to alter 
your model development with a fully parametric estimation method.  I’m 
intrigued…

 

Ken

 


[NMusers] Negative DV values

2014-10-02 Thread siwei Dai
Dear NM users:

I have a dataset where some of the concentrations are reported as negative
values.  I believe that the concentrations were calculated using a standard
curve.

My instinct is to impute all the negative values to zero, but I worry that this
will introduce bias.

A second thought is to use the absolute value of the lowest (negative)
concentration as the LLOQ. All concentrations below the LLOQ would then be
treated as zero. By doing this, both some positive and some negative values
would be zeroed out, which may help to cancel some of the bias that the first
method introduces.

I believe that the second method is better, but wonder if there is any other,
better way to do it. Any comments will be greatly appreciated.

Thank you in advance.

Siwei


Re: [NMusers] Negative DV values

2014-10-02 Thread Ron Keizer
Hi Siwei,
You should include the BLOQ data as they are, i.e. negative. Any other
approach would decrease precision (e.g. M3 likelihood-based) and/or induce
bias (e.g. LLOQ/2 or LLOQ=0). I did some simulations on this a while ago
to show this (
http://page-meeting.org/pdf_assets/2413-PAGE_2010_poster_LLOQ_v1.pdf), but
it should be intuitive too.
best regards,
Ron

--
Ron Keizer, PharmD PhD
Dept. of Bioengineering & Therapeutic Sciences
University of California San Francisco (UCSF)
--



Re: [NMusers] Negative DV values

2014-10-02 Thread Nick Holford

Siwei,

I agree with Ron. Using the measurements you have is better than trying 
to use a work-around such as likelihood-based or imputation-based methods. 
Some negative measured values are exactly what you should expect, if the 
true concentration is zero (or 'close' to zero), when there is background 
measurement error noise.

As far as I know, all common methods of measurement (HPLC, MS) have 
background noise. You can account for this noise when you model your 
data by including an additive term in the residual error model. The 
additive error estimate will also include other sources of residual 
error that are independent of concentration, e.g. due to model 
misspecification.
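[Editor's sketch] A minimal sketch of that residual-error model. The 
one-compartment model and data file are hypothetical; the negative 
concentrations simply stay in DV and are not flagged as BLQ:

$PROBLEM Combined proportional + additive error, negative DVs retained (sketch)
$INPUT ID TIME AMT DV MDV
$DATA conc_with_negatives.csv IGNORE=@
$SUBROUTINES ADVAN2 TRANS2
$PK
 CL = THETA(1)*EXP(ETA(1))
 V  = THETA(2)*EXP(ETA(2))
 KA = THETA(3)*EXP(ETA(3))
 S2 = V
$ERROR
 IPRED = F
 ; proportional part scales with concentration; the additive part absorbs
 ; background assay noise and allows observations below zero when the
 ; true concentration is at or near zero
 Y = IPRED*(1 + EPS(1)) + EPS(2)
$THETA (0, 2.7) (0, 20) (0, 1.0)
$OMEGA 0.1 0.1 0.1
$SIGMA 0.04      ; proportional variance
$SIGMA 0.01      ; additive variance (concentration units squared)
$ESTIMATION METHOD=1 INTERACTION MAXEVAL=9999 PRINT=5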


Here is a reference to a publication that used measured concentrations 
including negative values. Note that a negative measured value does not mean 
the actual concentration was negative!


Patel K, Choy SS, Hicks KO, Melink TJ, Holford NH, Wilson WR. A combined 
pharmacokinetic model for the hypoxia-targeted prodrug PR-104A in 
humans, dogs, rats and mice predicts species differences in clearance 
and toxicity. Cancer Chemother Pharmacol. 2011;67(5):1145-55.


Best wishes,

Nick



--
Nick Holford, Professor Clinical Pharmacology
Dept Pharmacology & Clinical Pharmacology, Bldg 503 Room 302A
University of Auckland,85 Park Rd,Private Bag 92019,Auckland,New Zealand
office:+64(9)923-6730 mobile:NZ +64(21)46 23 53
email: n.holf...@auckland.ac.nz
http://holford.fmhs.auckland.ac.nz/
