Does anyone know if there is an implementation of Tomas Mikolov's RNNLM Toolkit in Python with Theano? Thanks for any clues!
Take a look at Graham Taylor's implementation: https://github.com/gwtaylor/theano-rnn

It's quite general, so you can build a language model by feeding it words in one-hot encoding, though for large dictionaries this gets expensive (a minimal sketch of the one-hot idea is appended after the comments below). If that's a problem, you could consider character-level n-grams instead, but then the model has to learn much longer dependencies. To me, Theano is quite complicated at first glance, so be patient while going through the code :). What task are you trying to solve with your LM? I'm implementing a char-based RNN right now; I first tried to do it with Theano but then found myself writing pure CUDA C code. Send me a message if you have any questions (my email can be found on my GitHub profile: https://github.com/lightcaster).

I took the liberty of editing your link, since it had bad formatting and was not directing to the right site. Thanks a lot for the great answer.
(Aug 08 '13 at 13:02)
Leon Palafox ♦
Thanks Leon. Due to Russia's brand-new censorship system, some of Cloudflare's IPs were blacklisted, so I have problems accessing metaoptimize.com/qa and its page formatting (I can see only bare HTML). Hope they unblock it soon.
(Aug 09 '13 at 03:19)
Konstantin
Thank you very much! It's exactly what I was looking for.
(Aug 09 '13 at 05:27)
Ira Korshunova
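For reference, here is a minimal sketch (my own, for illustration; not code from the toolkit or from theano-rnn) of the one-hot word encoding mentioned in the answer above, together with the standard trick of replacing the one-hot matrix product with a cheap row lookup. All names and sizes below are made up.

import numpy as np

vocab_size, hidden_size = 10000, 100   # illustrative sizes
W_in = 0.01 * np.random.randn(vocab_size, hidden_size)

word_id = 42   # index of some word in the dictionary

# One-hot encoding: a vocab_size-long vector with a single 1.
# For large dictionaries this vector, and the matrix product
# below, are what make word-level input expensive.
x = np.zeros(vocab_size)
x[word_id] = 1.0
h_in = x.dot(W_in)

# Equivalent but cheap: multiplying by a one-hot vector simply
# selects a row of W_in, so a lookup gives the same result.
assert np.allclose(h_in, W_in[word_id])

This is why word-level models usually index an embedding matrix by word id instead of materializing the one-hot vectors.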
I don't know of any implementations of that specific toolkit, but the documentation of the 'scan' function in Theano gives examples of recurrent neural networks. As far as I know, the original RNNs by Tomas Mikolov are standard RNNs with softmax outputs, trained with truncated backpropagation through time; a sketch along those lines follows below.
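To make that concrete, here is a minimal sketch of such a network in Theano: a plain RNN unrolled with scan, a softmax output layer, and scan's truncate_gradient argument standing in for truncated backprop through time. This is my own illustrative code, not Mikolov's toolkit; all sizes and parameter names are assumptions.

import numpy as np
import theano
import theano.tensor as T

vocab_size, hidden_size = 1000, 50   # illustrative sizes
floatX = theano.config.floatX

def init(*shape):
    return theano.shared((0.01 * np.random.randn(*shape)).astype(floatX))

W_in  = init(vocab_size, hidden_size)   # input (embedding) weights
W_rec = init(hidden_size, hidden_size)  # recurrent weights
W_out = init(hidden_size, vocab_size)   # output weights

x = T.ivector('x')   # word ids of a sentence (int32)
t = T.ivector('t')   # target ids, i.e. x shifted by one
h0 = T.zeros((hidden_size,), dtype=floatX)

def step(word_id, h_prev):
    # W_in[word_id] equals one_hot(word_id).dot(W_in), only cheaper
    return T.tanh(W_in[word_id] + T.dot(h_prev, W_rec))

# truncate_gradient limits how many steps backprop through time
# unrolls, in the spirit of the truncated BPTT used in RNNLM
h, _ = theano.scan(step, sequences=x, outputs_info=h0,
                   truncate_gradient=5)

y = T.nnet.softmax(T.dot(h, W_out))   # next-word distribution per step
loss = -T.mean(T.log(y[T.arange(t.shape[0]), t]))

lr = np.asarray(0.1, dtype=floatX)    # keep updates in floatX
params = [W_in, W_rec, W_out]
updates = [(p, p - lr * T.grad(loss, p)) for p in params]
train = theano.function([x, t], loss, updates=updates)

Given a sentence as an int32 array of word ids, train(ids[:-1], ids[1:]) does one gradient step. Mikolov's toolkit additionally uses a class-based output layer to speed up the softmax over large vocabularies; that part is omitted here.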