Hey,
I am the author of https://github.com/whilo/boltzmann. I have had a look at your nice codebase, and I would like to merge CD (contrastive divergence) training for RBMs, e.g. as pretraining for DBNs. In general, I think a maintained neural network library with a set of standard protocols building on core.matrix would be nice (I'll comment on the other issue about this); other libraries could then extend them as well.

A problem I see is that your approach assumes input/output relations for every layer, while for unsupervised models like RBMs this is not the case. Do you have any opinions or ideas on how best to integrate the two approaches? I am also not sure why you picked multimethods instead of protocols, for example.

In general, something like Theano or Torch would be nice, so maybe we should discuss the proper level of abstraction a little. The biggest technical problem there, for me, is that we still don't have a (good) GPU matrix backend.
Christian