[Topic-models] LDA gamma update

mohammad bakhtiari educatemb at gmail.com
Thu Mar 3 09:55:56 EST 2016


Hi everyone

I first want to mention a point, and then ask two questions.

In the LDA paper, gamma is updated in variational inference as follows:

[scrubbed inline image showing the update from the paper: gamma_i = alpha_i + sum_{n=1}^{N} phi_{n,i}]
*Within one iteration, updating phi for one word should not affect the phi
updates for the other words* (consider the first iteration, for example).
However, in lda-c, gamma is updated immediately after phi for each word, so
it seems that one word's update does affect the phi updates for the words
that follow it. Please correct me if I am wrong.

*1-* I don't understand why, for each word, the old phi is subtracted from
the new phi. Could someone explain this?
*2-* Can I update gamma outside of the for loop over words, and then use it
to update phi?
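
To make question 2 concrete, below is a rough sketch of the update order I
have in mind, reusing the variable names from the lda-c snippet further down
and assuming a scalar Dirichlet prior model->alpha; it is only an
illustration of the question, not code from lda-c:

// sketch only, not from lda-c: update phi for all words first, then gamma
int n, k;
double phisum;

// first pass: update phi for every word using digamma(gamma) from the
// previous iteration, so no word's update can influence another word's
for (n = 0; n < doc->length; n++)
{
    phisum = 0;
    for (k = 0; k < model->num_topics; k++)
    {
        phi[n][k] = digamma_gam[k] + model->log_prob_w[k][doc->words[n]];
        if (k > 0)
            phisum = log_sum(phisum, phi[n][k]);
        else
            phisum = phi[n][k]; // phi is in log space
    }
    for (k = 0; k < model->num_topics; k++)
        phi[n][k] = exp(phi[n][k] - phisum); // normalize out of log space
}

// second pass: recompute gamma once, outside the loop over words
for (k = 0; k < model->num_topics; k++)
{
    var_gamma[k] = model->alpha;
    for (n = 0; n < doc->length; n++)
        var_gamma[k] += doc->counts[n] * phi[n][k];
    digamma_gam[k] = digamma(var_gamma[k]);
}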

Here is the code that updates gamma and phi in lda-c:
for (n = 0; n < doc->length; n++)
{
    phisum = 0;
    for (k = 0; k < model->num_topics; k++)
    {
        oldphi[k] = phi[n][k];
        phi[n][k] =
            digamma_gam[k] +
            model->log_prob_w[k][doc->words[n]];

        if (k > 0)
            phisum = log_sum(phisum, phi[n][k]);
        else
            phisum = phi[n][k]; // note, phi is in log space
    }

    for (k = 0; k < model->num_topics; k++)
    {
        phi[n][k] = exp(phi[n][k] - phisum);
        var_gamma[k] =
            var_gamma[k] + doc->counts[n]*(phi[n][k] - oldphi[k]);
        // !!! a lot of extra digamma's here because of how we're computing it
        // !!! but its more automatically updated too.
        digamma_gam[k] = digamma(var_gamma[k]);
    }
}

Thanks for your time and consideration; I look forward to your replies.
Mohammad