Eigen: Is it possible to create a LeastSquareDiagonalPreconditioner-like preconditioner if I can only compute A*x and A^t*y products?
I want to solve a least-squares-like system A^t * A * x = -A^t * x (I'm implementing the Gauss-Newton method for a special problem).



I wrote special routines that allow me to compute the A * x and A^t * y products. With such routines it's easy to use matrix-free solvers, thanks to Eigen.



But my approach does not converge as well as Eigen::LeastSquaresConjugateGradient. I ran a small test, and it looks like LeastSquareDiagonalPreconditioner speeds up convergence a lot.



My question is: how can I use LeastSquareDiagonalPreconditioner, or implement my own preconditioner, if I can only compute matrix products?
I'm not very familiar with preconditioning/conjugate gradient stuff, by the way.



EDIT



For clarity - I want to use the matrix-free solvers from Eigen with my product routines.
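For reference, below is a minimal sketch of how such product routines can be plugged into Eigen's iterative solvers, closely following the MatrixReplacement example from Eigen's matrix-free solvers documentation. The class name NormalEquationOperator and the A_mul/At_mul callbacks are hypothetical placeholders for the poster's own routines; the wrapper exposes the normal-equation product x -> A^t * (A * x) so it can be handed to Eigen::ConjugateGradient with an IdentityPreconditioner.

    #include <functional>
    #include <utility>
    #include <Eigen/Core>
    #include <Eigen/SparseCore>
    #include <Eigen/IterativeLinearSolvers>

    class NormalEquationOperator;

    namespace Eigen {
    namespace internal {
    // Let the wrapper advertise the same traits as a sparse matrix of doubles,
    // as done in Eigen's matrix-free solver example.
    template <>
    struct traits<NormalEquationOperator>
        : public Eigen::internal::traits<Eigen::SparseMatrix<double> > {};
    }
    }

    // Matrix-free operator for N = A^t * A, built from two user-supplied product
    // callbacks (placeholders for the poster's A*x and A^t*y routines).
    class NormalEquationOperator : public Eigen::EigenBase<NormalEquationOperator> {
    public:
      typedef double Scalar;
      typedef double RealScalar;
      typedef int StorageIndex;
      enum {
        ColsAtCompileTime = Eigen::Dynamic,
        MaxColsAtCompileTime = Eigen::Dynamic,
        IsRowMajor = false
      };

      typedef std::function<Eigen::VectorXd(const Eigen::VectorXd&)> ProductFn;

      NormalEquationOperator(Index n, ProductFn A_mul, ProductFn At_mul)
          : n_(n), A_mul_(std::move(A_mul)), At_mul_(std::move(At_mul)) {}

      Index rows() const { return n_; }
      Index cols() const { return n_; }

      // y = A^t * (A * x), using only the two product routines.
      Eigen::VectorXd apply(const Eigen::VectorXd& x) const { return At_mul_(A_mul_(x)); }

      template <typename Rhs>
      Eigen::Product<NormalEquationOperator, Rhs, Eigen::AliasFreeProduct>
      operator*(const Eigen::MatrixBase<Rhs>& x) const {
        return Eigen::Product<NormalEquationOperator, Rhs, Eigen::AliasFreeProduct>(*this, x.derived());
      }

    private:
      Index n_;
      ProductFn A_mul_, At_mul_;
    };

    namespace Eigen {
    namespace internal {
    // Tell Eigen how to evaluate NormalEquationOperator * dense vector.
    template <typename Rhs>
    struct generic_product_impl<NormalEquationOperator, Rhs, SparseShape, DenseShape, GemvProduct>
        : generic_product_impl_base<NormalEquationOperator, Rhs,
                                    generic_product_impl<NormalEquationOperator, Rhs> > {
      typedef typename Product<NormalEquationOperator, Rhs>::Scalar Scalar;

      template <typename Dest>
      static void scaleAndAddTo(Dest& dst, const NormalEquationOperator& lhs,
                                const Rhs& rhs, const Scalar& alpha) {
        // Iterative solvers call this with alpha == 1; keep the general form anyway.
        dst.noalias() += alpha * lhs.apply(rhs);
      }
    };
    }
    }

    // Usage sketch, where b is the right-hand side of the normal equations:
    //   NormalEquationOperator N(n, A_mul, At_mul);
    //   Eigen::ConjugateGradient<NormalEquationOperator,
    //                            Eigen::Lower | Eigen::Upper,
    //                            Eigen::IdentityPreconditioner> cg;
    //   cg.compute(N);
    //   Eigen::VectorXd x = cg.solve(b);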



EDIT 2



The matrix-vector products are obtained by applying forward- and reverse-mode autodiff to some objective functions.







Tags: c++, linear-algebra, eigen, least-squares, eigen3
asked 2 days ago by Dark_Daiver, edited yesterday
  • In my understanding, the preconditioner simply consists in replacing an equation Ax=b by AtAx=Atb, to get an equation with a symmetric positive-definite matrix. This allows the use of iterative methods. You already have such a symmetric matrix. It is difficult to answer without more details about your implementation of the Gauss-Newton method.
    – Damien
    2 days ago










  • I don't understand if you want to use Eigen, or only a subset of Eigen functions, or no Eigen function at all
    – Damien
    2 days ago










  • @Damien thank you for the comment! I edited the post - I want to use Eigen solvers with my product routines.
    – Dark_Daiver
    2 days ago






  • Is the actual question how to compute the diagonal of A^t*A or, if you have that, how to use this as a preconditioner? The second problem is answered by @ggael; for the first problem you'd need to describe more about A (and maybe it will be more a math question than a programming question).
    – chtz
    2 days ago






  • I don't see any obvious solution except computing the squared norm of each A*Unit(i) -- maybe you can accelerate computing these, since you know that only a single element is non-zero each time.
    – chtz
    yesterday
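Following chtz's suggestion above, the diagonal of A^t*A (which is exactly what LeastSquareDiagonalPreconditioner inverts) can be recovered from the product routine alone by pushing unit vectors through A, at the cost of one product per column. A minimal sketch, where A_mul is a hypothetical stand-in for the poster's A*x routine:

    #include <functional>
    #include <Eigen/Core>

    // diag(A^t*A)(j) == ||A * e_j||^2, so one A*x product per column suffices.
    Eigen::VectorXd normal_equation_diagonal(
        Eigen::Index n,
        const std::function<Eigen::VectorXd(const Eigen::VectorXd&)>& A_mul) {
      Eigen::VectorXd diag(n);
      Eigen::VectorXd e = Eigen::VectorXd::Zero(n);
      for (Eigen::Index j = 0; j < n; ++j) {
        e(j) = 1.0;                        // j-th unit vector
        diag(j) = A_mul(e).squaredNorm();  // ||A * e_j||^2
        e(j) = 0.0;                        // reset for the next column
      }
      return diag;
    }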
1 Answer
The easiest might be to implement your own preconditioner class inheriting DiagonalPreconditioner and implementing something like LeastSquareDiagonalPreconditioner::factorize(), but adapted to your type. Basically, you need to compute:



 m_invdiag(j) = 1./mat.col(j).squaredNorm();


for every column j, using a strategy similar to what you already implemented for the product operators.
answered 2 days ago by ggael (accepted)
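To illustrate ggael's answer for the matrix-free case: below is a sketch of such a preconditioner, under the assumption that DiagonalPreconditioner keeps m_invdiag and m_isInitialized as protected members (LeastSquareDiagonalPreconditioner itself is built this way). The class name MatrixFreeLeastSquareDiagonalPreconditioner and the operator's cols()/apply() interface are hypothetical and match the wrapper sketched in the question above; the diagonal is recovered one column at a time via products with unit vectors, as suggested in chtz's comment.

    #include <Eigen/Core>
    #include <Eigen/IterativeLinearSolvers>

    // Hypothetical preconditioner for a matrix-free normal-equation operator.
    // It reuses DiagonalPreconditioner's storage and solve(), and only
    // re-implements the factorization step, as the answer suggests.
    template <typename Scalar_>
    class MatrixFreeLeastSquareDiagonalPreconditioner
        : public Eigen::DiagonalPreconditioner<Scalar_> {
      typedef Eigen::DiagonalPreconditioner<Scalar_> Base;
      typedef Eigen::Matrix<Scalar_, Eigen::Dynamic, 1> Vector;
      using Base::m_invdiag;        // assumed protected in DiagonalPreconditioner
      using Base::m_isInitialized;  // assumed protected as well

    public:
      MatrixFreeLeastSquareDiagonalPreconditioner() : Base() {}

      template <typename OpType>
      MatrixFreeLeastSquareDiagonalPreconditioner& analyzePattern(const OpType&) {
        return *this;
      }

      // diag(A^t*A)(j) is the j-th entry of (A^t*A)*e_j, so one operator
      // application per column is enough. If the wrapper also exposes the plain
      // A*x routine, ||A*e_j||^2 is an equivalent but cheaper formula.
      template <typename OpType>
      MatrixFreeLeastSquareDiagonalPreconditioner& factorize(const OpType& op) {
        const Eigen::Index n = op.cols();
        m_invdiag.resize(n);
        Vector e = Vector::Zero(n);
        for (Eigen::Index j = 0; j < n; ++j) {
          e(j) = Scalar_(1);
          const Scalar_ d = op.apply(e)(j);  // e_j^t * (A^t*A) * e_j
          m_invdiag(j) = d > Scalar_(0) ? Scalar_(1) / d : Scalar_(1);
          e(j) = Scalar_(0);
        }
        m_isInitialized = true;
        return *this;
      }

      template <typename OpType>
      MatrixFreeLeastSquareDiagonalPreconditioner& compute(const OpType& op) {
        return factorize(op);
      }
    };

    // The solver from the sketch in the question could then be declared as, e.g.:
    //   Eigen::ConjugateGradient<NormalEquationOperator,
    //                            Eigen::Lower | Eigen::Upper,
    //                            MatrixFreeLeastSquareDiagonalPreconditioner<double> > cg;
    // instead of using Eigen::IdentityPreconditioner.

Note that this setup pass costs n operator applications, which matches chtz's observation that there is no obvious shortcut beyond one product per unit vector.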