How Much Computation and Distributedness is Needed in Sequence Learning Tasks?

dc.contributor.author: Margem, Mrwan
dc.contributor.author: Yilmaz, Ozgur
dc.date.accessioned: 2025-10-24T18:08:47Z
dc.date.available: 2025-10-24T18:08:47Z
dc.date.issued: 2016
dc.department: Malatya Turgut Özal Üniversitesi
dc.description: 9th International Conference on Artificial General Intelligence (AGI), Held as Part of the Joint Multi-Conference on Human-Level Intelligence (HLAI), July 16-19, 2016, New York City, NY
dc.description.abstract: In this paper, we analyze how much computation and how distributed a representation are needed to solve sequence-learning tasks, which are essential for many artificial intelligence applications. We propose a novel minimal architecture based on cellular automata. The states of the cells are used as a reservoir of activities, as in Echo State Networks. Projecting the input onto this reservoir medium provides a systematic way of remembering previous inputs and combining that memory with a continuous stream of inputs. The proposed framework is tested on classical synthetic pathological tasks that are widely used to evaluate recurrent algorithms. We show that the proposed algorithm achieves zero error on all tasks, matching the performance of Echo State Networks and surpassing them in several respects. The comparative results of our experiments suggest that computing high-order attribute statistics and representing them in a distributed manner is essential, but that this can be done in a very simple network of cellular automata with identical binary units. This raises the question of whether real-valued neuron units are mandatory for solving complex problems that are distributed over time: even very sparsely connected binary units with simple computational rules can provide the computation required for intelligent behavior.
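The abstract describes a reservoir built from cellular automaton states, with the input projected onto the lattice and the evolving states collected for a readout. A minimal sketch of that idea follows; it is a hypothetical illustration, not the authors' implementation. The choice of elementary CA rule 90, the XOR input injection, and the random input-to-cell projection are all assumptions made for the example.

```python
import numpy as np

def eca_step(state, rule=90):
    """One update of an elementary cellular automaton (circular boundary)."""
    left = np.roll(state, 1)
    right = np.roll(state, -1)
    idx = 4 * left + 2 * state + right       # neighbourhood code 0..7
    table = (rule >> np.arange(8)) & 1       # the rule's 8-entry lookup table
    return table[idx]

def reservoir_features(inputs, width=64, steps=4, rule=90, seed=0):
    """Project a binary input sequence onto a CA reservoir and collect
    the evolving lattice states as a feature vector for a linear readout."""
    rng = np.random.default_rng(seed)
    # Random map from each input bit to a lattice cell (an assumption).
    proj = rng.integers(0, width, size=inputs.shape[1])
    state = np.zeros(width, dtype=np.int64)
    feats = []
    for x in inputs:                          # inputs: (T, d) binary array
        state[proj] ^= x.astype(np.int64)     # XOR the input bits into the lattice
        for _ in range(steps):                # let the CA evolve a few steps
            state = eca_step(state, rule)
            feats.append(state.copy())        # record each intermediate state
    return np.concatenate(feats)
```

A linear classifier or regressor trained on `reservoir_features` output would then play the role of the Echo State Network readout; all nonlinearity comes from the binary CA dynamics.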
dc.identifier.doi: 10.1007/978-3-319-41649-6_28
dc.identifier.endpage: 283
dc.identifier.isbn: 978-3-319-41649-6
dc.identifier.isbn: 978-3-319-41648-9
dc.identifier.issn: 0302-9743
dc.identifier.scopus: 2-s2.0-84977560780
dc.identifier.scopusquality: Q3
dc.identifier.startpage: 274
dc.identifier.uri: https://doi.org/10.1007/978-3-319-41649-6_28
dc.identifier.uri: https://hdl.handle.net/20.500.12899/3292
dc.identifier.volume: 9782
dc.identifier.wos: WOS:000386919900028
dc.identifier.wosquality: N/A
dc.indekslendigikaynak: Web of Science
dc.indekslendigikaynak: Scopus
dc.language.iso: en
dc.publisher: Springer Int Publishing Ag
dc.relation.ispartof: Artificial General Intelligence (AGI 2016)
dc.relation.publicationcategory: Conference Item - International - Institutional Faculty Member
dc.rights: info:eu-repo/semantics/closedAccess
dc.snmz: KA_20251023
dc.subject: [No Keywords]
dc.title: How Much Computation and Distributedness is Needed in Sequence Learning Tasks?
dc.type: Conference Object

Files