How Much Computation and Distributedness is Needed in Sequence Learning Tasks?
Date
2016
Authors
Journal Title
Journal ISSN
Volume Title
Publisher
Springer International Publishing AG
Access Rights
info:eu-repo/semantics/closedAccess
Abstract
In this paper, we analyze how much computation and how much distributedness of representation are needed to solve sequence-learning tasks, which are essential for many artificial intelligence applications. We propose a novel minimal architecture based on cellular automata. The states of the cells are used as a reservoir of activities, as in Echo State Networks. Projecting the input onto this reservoir medium provides a systematic way of remembering previous inputs and combining that memory with a continuous stream of new inputs. The proposed framework is tested on classical synthetic pathological tasks that are widely used to evaluate recurrent algorithms. We show that the proposed algorithm achieves zero error on all tasks, matching the performance of Echo State Networks and even surpassing them in several respects. The comparative results of our experiments suggest that computing high-order attribute statistics and representing them in a distributed manner is essential, but that this can be done in a very simple network: a cellular automaton with identical binary units. This raises the question of whether real-valued neuron units are mandatory for solving complex problems that are distributed over time. Even very sparsely connected binary units with simple computational rules can provide the computation required for intelligent behavior.
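The reservoir idea described in the abstract can be illustrated with a minimal sketch: a binary input stream is projected onto the cells of an elementary cellular automaton, the automaton is iterated a few steps per input symbol, and all visited states are concatenated into a feature vector on which a linear readout could be trained. The rule number, reservoir width, iteration count, and random-position projection below are illustrative assumptions, not the authors' exact configuration.

```python
import random

RULE = 90          # elementary CA rule (assumed for illustration)
WIDTH = 64         # number of cells in the reservoir (illustrative)
ITERATIONS = 4     # CA steps recorded per input symbol

# Look-up table: output bit of RULE for each 3-cell neighborhood 0..7.
RULE_TABLE = [(RULE >> i) & 1 for i in range(8)]

def ca_step(state):
    """One synchronous update of the elementary CA with periodic bounds."""
    n = len(state)
    return [RULE_TABLE[(state[(i - 1) % n] << 2)
                       | (state[i] << 1)
                       | state[(i + 1) % n]]
            for i in range(n)]

def reservoir_features(bits, projection):
    """Feed a binary sequence into the CA reservoir.

    Each input bit is XOR-combined into the state at fixed random
    positions (the projection), mixing new input with the memory of
    previous inputs; the CA then runs ITERATIONS steps and every
    visited state is appended to the feature vector.
    """
    state = [0] * WIDTH
    features = []
    for b in bits:
        for pos in projection:
            state[pos] ^= b
        for _ in range(ITERATIONS):
            state = ca_step(state)
            features.extend(state)
    return features

random.seed(0)
proj = random.sample(range(WIDTH), 8)   # fixed random input positions
feats = reservoir_features([1, 0, 1, 1], proj)
print(len(feats))  # WIDTH * ITERATIONS * sequence length = 64 * 4 * 4 = 1024
```

Because the units are identical and binary and the update rule is local, the only trainable component in such a setup would be the linear readout over the collected states, in line with the paper's claim that very simple distributed computation suffices.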
Description
9th International Conference on Artificial General Intelligence (AGI), held as part of the Joint Multi-Conference on Human-Level Intelligence (HLAI), July 16-19, 2016, New York City, NY
Keywords
[No Keywords]
Source
Artificial General Intelligence (AGI 2016)
WoS Q Value
N/A
Scopus Q Value
Q3
Volume
9782