[Topic-models] Topic-models Digest, Vol 119, Issue 5

Ian ian.wood at anu.edu.au
Thu Jun 9 10:47:03 EDT 2016


Hi Michael,

Be careful with the idea behind that link (transposing the topic/word matrix to get topic probabilities given a word) - this assumes that all topics are equally probable, which generally isn't the case. The true posterior P(topic | word) also depends on the topic probabilities P(topic), so it's a bit more complex. In any case, the alternatives suggested probably do better than such a posterior anyway :)
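
To make the point concrete, here is a minimal sketch (mine, not from the thread) of the difference. It assumes a hypothetical topic-word matrix `phi` (K x V, rows summing to 1) and a topic prior `p_z` estimated however you like, e.g. from corpus-wide document-topic proportions:

    import numpy as np

    def topic_given_word(phi, p_z):
        """P(topic | word) via Bayes' rule: P(w | z) * P(z), normalized over topics."""
        joint = phi * p_z[:, None]       # K x V joint-ish weights
        return joint / joint.sum(axis=0)

    def topic_given_word_uniform(phi):
        """The transpose-and-normalize shortcut, which implicitly assumes uniform P(z)."""
        return phi / phi.sum(axis=0)

The two agree only when the topic prior really is uniform; with skewed topic proportions, the shortcut over-weights rare topics.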

Best
Ian

> On 8 Jun 2016, at 10:27 pm, topic-models-request at lists.cs.princeton.edu wrote:
> 
> From: Michael Klachko <michaelklachko at gmail.com>
> Subject: Re: [Topic-models] sparse word vectors and LDA
> Date: 8 June 2016 10:27:14 pm GMT+1
> To: Dat Quoc Nguyen <datquocnguyen at gmail.com>
> Cc: "topic-models at lists.cs.princeton.edu" <topic-models at lists.cs.princeton.edu>
> 
> 
> Thank you everyone for the answers. I believe what I was asking for originally is available in Gensim:
> http://comments.gmane.org/gmane.comp.ai.gensim/591


