One last thing, regarding

  "Or you live with the ill-conditionedness."

I plan to work with the correlation matrix (related to the inverse of
J^T J). If J has a condition number of about 1e13, do you think I can
work with J^T J without running into numerical problems?
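To make my concern concrete, here is a minimal standalone sketch (using
Eigen; the matrix J and its dimensions are made up, not taken from my
actual program) of the effect I am worried about: forming J^T J squares
the condition number, so 1e13 becomes roughly 1e26, beyond what double
precision (machine epsilon ~ 1e-16) can resolve.

  // Sketch: forming J^T J squares the condition number.
  #include <Eigen/Dense>
  #include <cmath>
  #include <iostream>

  int main()
  {
    const int m = 100, n = 8;

    // Build J = U S V^T with prescribed singular values spanning
    // 13 orders of magnitude, so that cond(J) ~ 1e13:
    Eigen::MatrixXd U = Eigen::HouseholderQR<Eigen::MatrixXd>(
                          Eigen::MatrixXd::Random(m, n))
                          .householderQ() *
                        Eigen::MatrixXd::Identity(m, n);
    Eigen::MatrixXd V = Eigen::HouseholderQR<Eigen::MatrixXd>(
                          Eigen::MatrixXd::Random(n, n))
                          .householderQ();
    Eigen::VectorXd sigma(n);
    for (int i = 0; i < n; ++i)
      sigma(i) = std::pow(10.0, -13.0 * i / (n - 1));
    const Eigen::MatrixXd J = U * sigma.asDiagonal() * V.transpose();

    // cond(J), taken from the SVD of J itself, is still computable:
    Eigen::JacobiSVD<Eigen::MatrixXd> svd(J);
    std::cout << "cond(J) ~ "
              << svd.singularValues()(0) / svd.singularValues()(n - 1)
              << '\n';

    // In exact arithmetic the eigenvalues of J^T J are sigma_i^2, so
    // cond(J^T J) = cond(J)^2 ~ 1e26. In double precision the smallest
    // eigenvalues are pure roundoff (they can even come out negative),
    // which is exactly the numerical problem:
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> eig(J.transpose() * J);
    std::cout << "eigenvalues of J^T J: smallest = " << eig.eigenvalues()(0)
              << ", largest = " << eig.eigenvalues()(n - 1) << '\n';
  }

My understanding is that this is the reason one usually factors J
directly (QR or SVD) instead of forming J^T J, but I wanted to
double-check.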
Best
Simon

On Mon, Oct 17, 2022 at 06:14 Wolfgang Bangerth <bange...@colostate.edu> wrote:

> On 10/15/22 03:15, Simon Wiesheier wrote:
> >
> > This makes sense.
> > So, given the scaled eigenvectors E_1,...,E_8, how can I find the
> > coefficients A^*,...,H^*?
> > Is it just a matrix multiplication
> >    P* = (E_1; ... ; E_8) \times p* ,
> > where P* = (A^*,...,H^*) are the new parameters and p* = (a^*,...,h^*)
> > are the old parameters?
>
> Something of the sort. It's the same point in 8-space, you're just
> expressing it with regard to different bases.
>
> > Assuming that my PDE solver still converges for the new parameters, the
> > overall procedure would be as follows:
> > 1. run the deal.II program to compute J with the old parameters p*
> > 2. compute the new basis (E_i) and the new parameters P*
> > 3. run the deal.II program to compute the new J with the new parameters P*
> > 4. compute p* = (E_1; ... ; E_8)^-1 \times P*
> > Repeat 1-4 for all iterations of the optimisation algorithm
> > (Levenberg-Marquardt).
> > Is that correct?
>
> Conceptually, this is correct. In practice, it may not be necessary to
> actually do it that way: all you're looking for is a well-conditioned
> basis. You don't need the exact vectors E_i. In some applications you can
> guess such vectors (like in the model case I drew up), in others you
> compute them once for one case and re-use them for other cases where maybe
> they are not exactly the eigenvectors of the matrix J^T J. Or you live
> with the ill-conditionedness.
>
> > At the end, the ensuing parameters have to be the same, no matter
> > whether I use the above scaling or not. The sole difference is that the
> > scaled version improves (among other things) the condition number of J
> > and may lead to better convergence of the optimisation algorithm, right?
>
> Yes.
>
> Best
>  W.
>
> --
> ------------------------------------------------------------------------
> Wolfgang Bangerth          email:            bange...@colostate.edu
>                            www: http://www.math.colostate.edu/~bangerth/
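PS: To check my own understanding of steps 2 and 4, here is a small
standalone sketch of the change of basis (again using Eigen; the
dimensions and the random data are placeholders, not my actual Jacobian
or parameters):

  // Sketch of steps 2 and 4: express the parameter vector in the
  // eigenvector basis of J^T J, then map back.
  #include <Eigen/Dense>
  #include <iostream>

  int main()
  {
    const int m = 100, n = 8; // placeholder sizes; n = 8 parameters

    // Placeholders for what the deal.II program would provide:
    const Eigen::MatrixXd J = Eigen::MatrixXd::Random(m, n); // Jacobian
    const Eigen::VectorXd p = Eigen::VectorXd::Random(n);    // p* = (a^*,...,h^*)

    // Step 2: the columns of E are the eigenvectors E_1,...,E_8 of J^T J.
    Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> eig(J.transpose() * J);
    const Eigen::MatrixXd E = eig.eigenvectors();

    // New parameters P* = (A^*,...,H^*): coordinates of the same point in
    // 8-space with respect to the new basis, P* = E^T p*. (The eigenvector
    // matrix is orthogonal, so E^{-1} = E^T; if the E_i were additionally
    // rescaled, one would solve a small linear system here instead.)
    const Eigen::VectorXd P = E.transpose() * p;

    // Step 4: map back to the old parameters, p* = E P*.
    const Eigen::VectorXd p_back = E * P;
    std::cout << "round-trip error |p* - E E^T p*| = "
              << (p_back - p).norm() << '\n';
  }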