Monaghan, P., Ormerod, T., & Sio, U. N.
Lancaster University
Remote Associates Tasks (RATs) require participants to find a word that is related to three given words (e.g., cake, cottage, swiss; answer: cheese). Explanations for how these tasks are solved typically appeal to spreading activation, decay, and suppression of words within a semantic network, though no computational implementation has yet been constructed to simulate problem solving. We contend that understanding how participants solve RATs can provide insight into the nature of semantic memory.
We constructed a semantic associative network based on free association norms (Nelson et al., 1998), implementing spreading activation, decay, and response suppression. For each RAT, we generated the network of words associated to and from each stimulus and target word in the problem. Nodes in the network represented words, and connection weights between nodes represented the association strengths between those words. Node activation was a linear function of the activation of input nodes multiplied by the connection weights, and activation decayed at each time step. The activation of the RAT stimulus nodes was clamped at 1. At each subsequent time step, the most highly activated node was taken as a potential answer to the problem, and its activation was then reset to 0. The model halted when it selected the target answer to the problem.
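The procedure above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the association weights below are toy values standing in for the Nelson et al. (1998) norms, and the decay constant is an assumed parameter.

```python
stimuli = ["cake", "cottage", "swiss"]
target = "cheese"

# Toy directed association weights: weights[a][b] = strength of a -> b.
# These values are illustrative only, not the free association norms.
weights = {
    "cake":      {"cheese": 0.3, "birthday": 0.5, "chocolate": 0.4},
    "cottage":   {"cheese": 0.3, "house": 0.6},
    "swiss":     {"cheese": 0.4, "alps": 0.5, "army": 0.2},
    "birthday":  {"cake": 0.5},
    "chocolate": {"cake": 0.4},
    "house":     {"cottage": 0.2},
    "alps":      {"swiss": 0.3},
    "army":      {"swiss": 0.1},
    "cheese":    {"cake": 0.1, "cottage": 0.1, "swiss": 0.1},
}

DECAY = 0.5  # proportion of activation retained per step (assumed value)

def solve_rat(stimuli, target, weights, max_steps=50):
    nodes = set(weights) | {w for out in weights.values() for w in out}
    act = {n: 0.0 for n in nodes}
    for step in range(1, max_steps + 1):
        # Clamp the stimulus nodes' activation at 1.
        for s in stimuli:
            act[s] = 1.0
        # Spread: new activation is decayed old activation plus the
        # weighted sum of activation arriving from input nodes.
        act = {n: DECAY * act[n]
                  + sum(act[m] * weights.get(m, {}).get(n, 0.0)
                        for m in nodes)
               for n in nodes}
        for s in stimuli:
            act[s] = 1.0
        # Take the most active non-stimulus node as a candidate answer.
        candidate = max((n for n in nodes if n not in stimuli), key=act.get)
        if candidate == target:
            return step, candidate
        act[candidate] = 0.0  # response suppression: reset rejected candidate
    return None, None

steps, answer = solve_rat(stimuli, target, weights)
```

With these toy weights the target receives converging activation from all three stimuli, so it wins the selection competition quickly; harder problems would require more suppression cycles before the target surfaces.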
The model's performance correlated with human performance on a large set of RATs (Bowden & Jung-Beeman, 2003). Analysis of the model highlighted the importance of "small worlds" in semantic associations: associations form small, highly interconnected clusters of words, with sparse connections between clusters. Easy RATs have target answers that lie within the same small-world cluster as the stimuli, whereas hard RATs have target answers located in a different cluster. RATs thus provide a window into the structure of semantic memory.
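One standard index of the clustering side of small-world structure is the local clustering coefficient: the fraction of a node's neighbours that are themselves connected. The sketch below computes it on a toy undirected graph; the adjacency data is illustrative, not drawn from the association norms.

```python
# Toy undirected adjacency: graph[w] is the set of w's neighbours.
graph = {
    "cake":      {"cheese", "birthday", "chocolate"},
    "cheese":    {"cake", "cottage", "swiss"},
    "birthday":  {"cake", "chocolate"},
    "chocolate": {"cake", "birthday"},
    "cottage":   {"cheese"},
    "swiss":     {"cheese"},
}

def clustering(graph, node):
    """Fraction of the node's neighbour pairs that are linked to each other."""
    neighbours = graph[node]
    k = len(neighbours)
    if k < 2:
        return 0.0
    links = sum(1 for a in neighbours for b in neighbours
                if a < b and b in graph[a])
    return links / (k * (k - 1) / 2)

# "cake" has three neighbours; only the birthday-chocolate pair is linked,
# so its clustering coefficient is 1/3.
coeff = clustering(graph, "cake")
```

A small-world network combines high average clustering like this with short path lengths between clusters, which is why a target sitting outside the stimuli's cluster is reachable but slow to activate.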