Neural Hypertext Networks.
Expanding on the principle outlined in the previous chapter — that hypertext networks are associative networks much like neural networks — we constructed a hypertext network in which each connection between any pair of nodes A and B is associated with a unique, uni-directional measure of the strength of their associative relation.
The strength of this relation is dynamically derived from browsers' navigation patterns by a set of three Hebbian learning rules. (Hebb's original law of learning [Hebb, 1967] stated that if, within a learning network, two nodes were stimulated at the same or nearly the same time, learning should take place by increasing the strength of the connection between them. We have replaced the notion of excitation by navigation: a node is excited if it is being retrieved.)
The learning rules:
As the network is browsed, these learning rules change the values of the connections among nodes, based on local patterns of user navigation. They operate in parallel and strictly locally:
- 1. Frequency: if the connection from node A to node B has been used, it will be strengthened by a small reward Fb.
- 2. Symmetry: if the connection from node A to node B has been used, the reverse connection from node B to node A will be strengthened by a small reward Sb.
- 3. Transitivity: if the connection from node A to node B has been used, and subsequently the connection from node B to node C has been used, the connection from node A to node C will be strengthened by a small reward Tb.
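The three rules above can be sketched as updates to a dictionary of connection weights applied along a user's navigation path. This is a minimal illustration, not the authors' implementation; the reward values assigned to Fb, Sb, and Tb here are assumptions chosen for the example.

```python
# Assumed reward values; the text only states they are "small".
FB = 0.10  # frequency reward
SB = 0.05  # symmetry reward
TB = 0.05  # transitivity reward

def update_weights(weights, path):
    """Apply the frequency, symmetry, and transitivity rules
    along a navigation path (a sequence of visited node names).
    `weights` maps (source, target) pairs to connection strengths."""
    for i in range(len(path) - 1):
        a, b = path[i], path[i + 1]
        # 1. Frequency: strengthen the traversed connection A -> B.
        weights[(a, b)] = weights.get((a, b), 0.0) + FB
        # 2. Symmetry: strengthen the reverse connection B -> A.
        weights[(b, a)] = weights.get((b, a), 0.0) + SB
        # 3. Transitivity: if A -> B was followed by B -> C,
        #    strengthen (or initiate) the shortcut A -> C.
        if i + 2 < len(path):
            c = path[i + 2]
            weights[(a, c)] = weights.get((a, c), 0.0) + TB
    return weights
```

For example, after a user navigates A → B → C starting from an empty network, the frequency rule rewards (A, B) and (B, C), the symmetry rule initiates (B, A) and (C, B), and the transitivity rule initiates the previously non-existing connection (A, C).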
Together, these rules serve two functions:
- reinforce worthwhile, existing connections: frequency
- initiate new, previously non-existing connections among nodes: symmetry and transitivity