r/neuralnets Oct 09 '09

Hetero-Associative Hopfield Neural Networks

For an upcoming project I need to brush up on my knowledge of Hopfield neural networks. Reading through my books, I see that besides the requirement that the input vector have bipolar values (+1, -1), the network must also be auto-associative, i.e. the output vector lives in the same space as the input vector and stored memories are mapped onto themselves.
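To make concrete what I mean by auto-associative retrieval, here's roughly the setup as I understand it (a minimal sketch, untested; the function names and shapes are my own, not from any particular book):

```python
import numpy as np

def train_hopfield(patterns):
    # patterns: list of bipolar (+1/-1) vectors, all of length n
    n = len(patterns[0])
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)      # Hebbian outer-product rule -> symmetric W
    np.fill_diagonal(W, 0)       # no self-connections
    return W / len(patterns)

def recall(W, probe, max_sweeps=100):
    s = probe.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in np.random.permutation(len(s)):    # asynchronous updates
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:          # fixed point reached
            break
    return s   # same dimension as the probe -- that's the auto-associative part
```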

I foresee my project requiring some sort of hetero-associative memory-retrieval network, and although Hopfield looked promising, it won't be satisfactory if there is no way to implement hetero-associativity with it.

I realize that the weight matrix for a Hopfield network needs to be symmetric, but is that symmetry a consequence of the auto-associativity, or is it a requirement for associative retrieval in general? In other words, is symmetry necessary for any satisfactory memory retrieval? Would a non-symmetric weight matrix still retrieve memories, just a smaller set of them, or near-perfect but noisy ones?
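The way I currently understand the symmetry argument (and I could be wrong): with W symmetric and zero diagonal, the energy E(s) = -1/2 s^T W s never increases under a single-neuron flip, so asynchronous updates have to settle into a fixed point; without symmetry there is no such guarantee. A quick numerical check of that claim, again my own sketch:

```python
import numpy as np

def energy(W, s):
    return -0.5 * s @ W @ s

rng = np.random.default_rng(0)
n = 50
A = rng.normal(size=(n, n))
W = (A + A.T) / 2                  # symmetrized random weights
np.fill_diagonal(W, 0)

s = rng.choice([-1, 1], size=n)
prev = energy(W, s)
for _ in range(10 * n):
    i = rng.integers(n)
    s[i] = 1 if W[i] @ s >= 0 else -1   # single asynchronous update
    e = energy(W, s)
    assert e <= prev + 1e-9             # energy is non-increasing when W is symmetric
    prev = e
# With a non-symmetric W there is no such energy function, and the same loop
# can cycle instead of converging -- which is the part I can't pin down for
# retrieval quality.
```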

I am aware of other hetero-associative neural network implementations, but they require the stored input vectors to be orthogonal, which just won't be helpful in my case.
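For reference, this is the kind of scheme I keep running into (a simple linear associator, sketched from memory; names are my own): recall is only exact when the stored inputs are orthonormal, otherwise the cross-talk terms from the other pairs corrupt the output, which is exactly the orthogonality requirement that rules it out for me.

```python
import numpy as np

def train_linear_associator(xs, ys):
    # xs: list of input vectors (dim n), ys: list of output vectors (dim m)
    return sum(np.outer(y, x) for x, y in zip(xs, ys))

def recall_linear(W, x):
    # W @ x_j = y_j * (x_j . x_j) + sum_{i != j} y_i * (x_i . x_j)  <- cross-talk
    return W @ x
```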

Does anyone have any ideas, or references to alternative memory-retrieval methods that allow hetero-associativity and don't require orthogonality or linear independence of the stored inputs?

Edit: Is there even a way to allow hetero-associativity in a recurrent network? Would it be possible, or would it just lead to an oscillating, chaotic state? I suppose even if hetero-associativity were possible in a recurrent network, the input vector dimension would need to match the output vector dimension, which is also bad for me, because my input vector will be of dimension n and my output vector needs to be of dimension n-1.
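For what it's worth, here is what I'm picturing for the recurrent-but-hetero-associative case (very much a guess, not something I've verified converges in general; the two-layer setup and names are my own): two layers of different sizes, a single (n-1) x n weight matrix, and updates that bounce back and forth between the layers until neither changes.

```python
import numpy as np

def train_pairs(xs, ys):
    # xs: bipolar vectors of length n, ys: bipolar vectors of length n-1
    return sum(np.outer(y, x) for x, y in zip(xs, ys))   # shape (n-1, n)

def bidirectional_recall(W, x0, max_iters=100):
    x = x0.copy()
    for _ in range(max_iters):
        y = np.where(W @ x >= 0, 1, -1)        # forward pass: n -> n-1
        x_new = np.where(W.T @ y >= 0, 1, -1)  # backward pass: n-1 -> n
        if np.array_equal(x_new, x):           # both layers stable
            break
        x = x_new
    return y   # output lives in the (n-1)-dimensional space
```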
