Abstract
One of the strengths of artificial neural networks lies in their ability to find patterns in noisy input data. The input data is usually presented as one big chunk and pre-processed so that its many characteristics can be associated with input neurons; the network then processes the data all at once. Quite often it is done this way even when the input data describes not a single point in time, like a photo or the contents of a database table at a given moment, but a stream of data gathered over a longer period. In such a case the program waits until a sufficiently large amount of data has been gathered and only then presents it to the network, with timestamps as additional information. This article discusses another possible approach to processing data streams with an artificial neural network: the network is specifically designed to process a given data stream, and each of its neurons performs a well-defined task. Each neuron is implemented as an independent, asynchronous entity, working in parallel with the other neurons and inherently using the passage of time as a source of information. This allows for a smaller number of neurons in the network, but each of them is a more complex entity. The computer program that is part of this project builds such a network and uses it as a data stream transformer. The input stream of symbols is decoded into an input vector pushed into the network. The network is able to generate a stream of more abstract symbols, using as additional information both its internal state (i.e. data it received before) and the time gaps between consecutive chunks of data. The remainder of the article discusses the features exhibited by this type of network and explores how to design it from smaller sets of neurons.
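To make the idea concrete, here is a minimal sketch, not taken from this project's code, of a single asynchronous neuron that consumes a stream of weighted symbols and treats the time gap between consecutive inputs as additional information. All names (AsyncNeuron, the threshold and decay parameters, the "fired" output symbol) are illustrative assumptions only.

```python
# Hypothetical sketch of one asynchronous neuron; not the project's actual implementation.
import asyncio
import time

class AsyncNeuron:
    def __init__(self, name, threshold=1.0, decay=0.5):
        self.name = name
        self.threshold = threshold     # activation level at which the neuron emits a symbol
        self.decay = decay             # how quickly activation fades per second of silence
        self.level = 0.0               # internal state carried between inputs
        self.last_seen = time.monotonic()
        self.outbox = asyncio.Queue()  # stream of more abstract output symbols

    async def feed(self, weight):
        """Receive one input symbol; the elapsed time since the previous one matters."""
        now = time.monotonic()
        gap = now - self.last_seen
        self.last_seen = now
        # The time gap is a source of information: long silence weakens the internal state.
        self.level = self.level * (self.decay ** gap) + weight
        if self.level >= self.threshold:
            await self.outbox.put(f"{self.name}:fired")
            self.level = 0.0

async def main():
    neuron = AsyncNeuron("n0")
    # Two closely spaced inputs accumulate and fire; a lone, late input does not.
    for delay, weight in [(0.0, 0.6), (0.1, 0.6), (2.0, 0.6)]:
        await asyncio.sleep(delay)
        await neuron.feed(weight)
    while not neuron.outbox.empty():
        print(await neuron.outbox.get())

asyncio.run(main())
```

In this sketch the neuron runs as its own coroutine-friendly entity, so many such neurons could be wired together and driven in parallel by an event loop, which mirrors the article's picture of fewer but more complex neurons processing a stream as it arrives.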