
Looking for ideas: processing and predicting arbitrary sequences of inputs using libmind

Hello redditors!

I would like to present a small project I have been working on over the last few months. It aims to address a common problem in machine learning: processing sequences of inputs, rather than the fixed-size vectors that many classification and regression algorithms require, while providing a minimalist interface for learning and predicting sequences in almost any domain.

The basic idea is to use an Echo State Network to generate a single fixed-size vector in R^n from a sequence of vectors in R^k. This representation can then be used to perform classification or regression. A dual procedure can also be used to reconstruct a sequence of vectors in R^k from a fixed-size vector in R^n.
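To make the encoding step concrete, here is a minimal sketch of that idea: a random, fixed echo state reservoir whose final state after reading the sequence serves as the fixed-size code. This is not libmind's actual API; the class name ReservoirEncoder and all parameters are illustrative only.

    import numpy as np

    class ReservoirEncoder:
        """Echo state reservoir: maps a sequence of vectors in R^k to the
        reservoir's final state in R^n, used as the fixed-size code."""

        def __init__(self, k, n, spectral_radius=0.9, input_scale=1.0, seed=0):
            rng = np.random.default_rng(seed)
            # Random, fixed input and recurrent weights (never trained).
            self.W_in = input_scale * rng.uniform(-1.0, 1.0, size=(n, k))
            W = rng.uniform(-0.5, 0.5, size=(n, n))
            # Rescale recurrent weights to the desired spectral radius (< 1),
            # the standard heuristic for obtaining the echo state property.
            W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
            self.W = W
            self.n = n

        def encode(self, sequence):
            """sequence: iterable of length-k vectors -> final state in R^n."""
            x = np.zeros(self.n)
            for u in sequence:
                x = np.tanh(self.W_in @ np.asarray(u) + self.W @ x)
            return x

Any off-the-shelf classifier or regressor can then be trained on the encoded vectors; the dual (decoding) direction would train a readout that unrolls a sequence back out of such a state.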

This project is called libmind. It is also designed to be a simple way for a programmer to learn sequences of arbitrary elements, as long as they can be vectorized.
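As an illustration of that "vectorize, then learn" workflow (and not libmind's actual interface), the hypothetical snippet below one-hot encodes the letters of English words, reuses the ReservoirEncoder sketch above to get fixed-size encodings, and trains a standard classifier on them; the word list and labels are toy data.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    ALPHABET = "abcdefghijklmnopqrstuvwxyz"

    def word_to_sequence(word):
        """Represent a word as a sequence of one-hot letter vectors in R^26."""
        seq = []
        for ch in word.lower():
            v = np.zeros(len(ALPHABET))
            v[ALPHABET.index(ch)] = 1.0
            seq.append(v)
        return seq

    # Encode each word with the reservoir sketched above, then hand the
    # fixed-size encodings to any standard classifier.
    encoder = ReservoirEncoder(k=26, n=200)
    words, labels = ["run", "table", "quickly"], ["verb", "noun", "adverb"]
    X = np.array([encoder.encode(word_to_sequence(w)) for w in words])
    clf = LogisticRegression(max_iter=1000).fit(X, labels)
    print(clf.predict([encoder.encode(word_to_sequence("jump"))]))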

The libmind repository includes two examples of its use:

  • Identification of the part of speech (POS) of English words using only their letters.

  • Reduction of variable-free propositional logic formulas.

I would like to hear how it can be improved and/or extended. In the spirit of open research, I'm publishing all the code, data, and instructions so anyone can review and improve this small project.

Thanks!

submitted by galapag0
