Tensor Product Representations (TPRs) Literature

Tensor Product Variable Binding is a technique for transforming a symbolic structure into a vector format [1] by means of the tensor product operation.

Definition 1. Filler – a particular instance of a given structural type.

Definition 2. Role – the function that a filler performs in a structure.
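
As an illustration, the sketch below shows the binding operation from [1] with toy, hand-picked filler and role vectors (the names, sentence, and dimensions are illustrative assumptions, not taken from the cited papers): each filler vector is bound to its role vector with an outer (tensor) product, and the bindings are summed into a single tensor that represents the whole structure.

```python
import numpy as np

# Hypothetical distributed representations of fillers (e.g. words)...
fillers = {
    "John": np.array([1.0, 0.0, 0.0]),
    "loves": np.array([0.0, 1.0, 0.0]),
    "Mary": np.array([0.0, 0.0, 1.0]),
}

# ...and of roles (e.g. positions in a sentence); orthonormal role vectors
# make exact unbinding possible later.
roles = {
    "pos1": np.array([1.0, 0.0, 0.0]),
    "pos2": np.array([0.0, 1.0, 0.0]),
    "pos3": np.array([0.0, 0.0, 1.0]),
}

# The whole structure "John loves Mary" as a single tensor:
# T = sum_i filler_i (outer product) role_i
structure = sum(
    np.outer(fillers[f], roles[r])
    for f, r in [("John", "pos1"), ("loves", "pos2"), ("Mary", "pos3")]
)
print(structure.shape)  # (3, 3): filler dimension x role dimension
```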

There are several recent advances in the field that allow building integrated symbolic and sub-symbolic solutions with TPRs. The integrated symbolic and sub-symbolic flow consists of the following steps:

• encoding the symbolic structure as a distributed representation with a neural network. In [2] a new encoder design was proposed for a simple structure with only one nesting level;
• flattening the distributed representation into a vector format with a neural network [3];
• performing domain-specific analysis of the structure on the neural level, for example, identifying whether a sentence is in active or passive voice;
• performing structural manipulations on the neural level, for example, joining two trees into one [3];
• decoding the new structure from the distributed representation back to the symbolic level (see the sketch after this list).
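
The following is a minimal, self-contained sketch of the flattening and decoding steps, assuming orthonormal role vectors (the toy vectors and names are illustrative assumptions, not the encoder/decoder networks from [2] or [3]). With orthonormal roles, contracting the structure tensor with a role vector recovers exactly the filler bound to that role; a nearest-neighbour lookup then maps the recovered vector back to a symbol.

```python
import numpy as np

# Hypothetical orthonormal filler and role vectors (same toy setup as above).
f = {"John": np.eye(3)[0], "loves": np.eye(3)[1], "Mary": np.eye(3)[2]}
r = {"pos1": np.eye(3)[0], "pos2": np.eye(3)[1], "pos3": np.eye(3)[2]}

# Bind and superpose: T = sum_i f_i (outer product) r_i
T = (np.outer(f["John"], r["pos1"])
     + np.outer(f["loves"], r["pos2"])
     + np.outer(f["Mary"], r["pos3"]))

# Flattening: the order-2 tensor becomes a plain vector that downstream
# neural layers can consume.
flat = T.reshape(-1)  # shape (9,)

# Decoding (unbinding): with orthonormal roles, T @ r_j = f_j, i.e. the
# contraction returns the filler bound to role "pos2".
recovered = T @ r["pos2"]
symbol = max(f, key=lambda name: f[name] @ recovered)
print(symbol)  # "loves"
```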

TPRs pipeline

Literature

[1] Smolensky, P. (1990). Tensor product variable binding and the representation of symbolic structures in connectionist systems. Artificial Intelligence, 46(1-2), 159-216.

[2] Demidovskij, A. (2019, September). Implementation Aspects of Tensor Product Variable Binding in Connectionist Systems. In Proceedings of SAI Intelligent Systems Conference (pp. 97-110). Springer, Cham.

[3] Demidovskij, A. V. (2019, October). Towards Automatic Manipulation of Arbitrary Structures in Connectivist Paradigm with Tensor Product Variable Binding. In International Conference on Neuroinformatics (pp. 375-383). Springer, Cham.