Conceptual blending

From Wikipedia, the free encyclopedia

Conceptual Blending (also known as Conceptual Integration) is a general theory of cognition.[1] According to this theory, elements and vital relations from diverse scenarios are “blended” in a subconscious process known as Conceptual Blending, which is assumed to be ubiquitous in everyday thought and language. Insights obtained from these blends constitute the products of creative thinking, though conceptual blending theory is not itself a theory of creativity, inasmuch as it does not illuminate the issue of where the inputs to a blend actually come from. Blending theory does provide a rich terminology for describing the creative products of others, but has little to say about the inspiration that serves as the starting point for each blend.

Conceptual Blending Theory and the Social Sciences

Conceptual Blending Theory (CBT) provides a model of how meaning is constructed by selectively projecting – according to optimality and governing principles – materials from mental spaces (Fauconnier, 1985, 1997), small conceptual packets built as thought and discourse unfold. These materials – cultural frames, embodied schemata, relations, contextual knowledge – are integrated into new wholes or blends, which retain the links to their inputs, thus forming a network of mappings and projections. According to this theory, conceptual integration is a defining human capacity underlying all major products of meaning construction: metaphor, language, religion, art, etc.

A recent study in CBT regards metaphorical mappings as emergent from much more complex integration networks (Fauconnier & Turner, 2008a). Like Fauconnier’s analysis of anger metaphors (2009), the network for TIME-SPACE in that study has generic features and can be used for comparative study beyond individual examples. CBT should provide more complex generalizations of the conceptual patterns that generate emotion, in order to account for all the inferences that do not result from source-target projection. A closer look at affective connotations is also necessary. For example, in the classic example this surgeon is a butcher (Grady, Oakley & Coulson, 1999), we have not only INCOMPETENCE as an emergent meaning: there are also connotations of fear, resentment, contempt… (heavily dependent on viewpoint and context: Brandt & Brandt, 2002) that do not arise in the inputs, but only in the blended space, where someone pursues an objective with inadequate means or a careless attitude. Conceptual Metaphor Theory (CMT) cannot deal with these meanings, since the domains of surgery, butchery, or even professions in general are not loaded a priori with such connotations, and thus these inferences cannot be transferred to their targets. None of them is a domain of emotion. These complex mappings, and whatever systematicity there is in their interaction with context and diachrony, can only be generalized at the level of integration networks.
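To make the network structure described above concrete, the following is a minimal Python sketch, not a formalization from Fauconnier and Turner: two input spaces for the surgeon/butcher example are selectively projected into a blend that carries emergent structure (INCOMPETENCE) present in neither input. All class, attribute, and element names (MentalSpace, IntegrationNetwork, the sets of elements and relations) are invented here purely for illustration.

# Illustrative sketch only: a toy data structure for a conceptual integration
# network, loosely following the vocabulary above (input spaces, cross-space
# mapping, selective projection, blended space). Names and structure are
# invented for illustration, not taken from Fauconnier & Turner's formalism.
from dataclasses import dataclass, field


@dataclass
class MentalSpace:
    """A small conceptual packet: a named bundle of elements and relations."""
    name: str
    elements: set[str] = field(default_factory=set)
    relations: set[str] = field(default_factory=set)


@dataclass
class IntegrationNetwork:
    inputs: list[MentalSpace]
    # Cross-space mapping: counterpart pairs between elements of the inputs.
    counterparts: list[tuple[str, str]]

    def blend(self, projected: set[str], emergent_relations: set[str]) -> MentalSpace:
        """Selectively project some input material and add emergent structure."""
        blended = MentalSpace(name="blend")
        for space in self.inputs:
            blended.elements |= space.elements & projected
            blended.relations |= space.relations & projected
        # Emergent structure exists only in the blend, not in either input.
        blended.relations |= emergent_relations
        return blended


# The classic "this surgeon is a butcher" example (Grady, Oakley & Coulson, 1999).
surgery = MentalSpace("surgery", {"surgeon", "patient", "scalpel"}, {"goal: healing"})
butchery = MentalSpace("butchery", {"butcher", "carcass", "cleaver"}, {"means: severing flesh"})

network = IntegrationNetwork(
    inputs=[surgery, butchery],
    counterparts=[("surgeon", "butcher"), ("patient", "carcass"), ("scalpel", "cleaver")],
)

blend = network.blend(
    projected={"surgeon", "patient", "goal: healing", "means: severing flesh"},
    # INCOMPETENCE (and its affective connotations) arises only in the blend,
    # where the surgeon pursues healing with a butcher's means.
    emergent_relations={"incompetence"},
)
print(blend)

The point of the sketch is simply that the relation added to the blended space is not a member of either input space, mirroring the claim that such inferences cannot be obtained by source-target transfer alone.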

Emergence

In philosophy, systems theory, science, and art, emergence is the way complex systems and patterns arise out of a multiplicity of relatively simple interactions. Emergence is central to the theories of integrative levels and of complex systems.

[…]

Jeffrey Goldstein in the School of Business at Adelphi University provides a current definition of emergence in the journal Emergence (Goldstein 1999). Goldstein initially defined emergence as “the arising of novel and coherent structures, patterns and properties during the process of self-organization in complex systems”.

[…]

The usage of the notion “emergence” may generally be subdivided into two perspectives, that of “weak emergence” and “strong emergence”. Weak emergence describes new properties arising in systems as a result of interactions at an elemental level. Emergence, in this case, is merely part of the language, or model, that is needed to describe a system’s behaviour.
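A toy simulation can make weak emergence concrete: in an elementary cellular automaton every rule is defined purely at the level of a single cell and its two neighbors, yet a complex global pattern appears that is only describable at the level of the whole grid. The sketch below is a standard Rule 30 automaton, offered only as an illustration; the width and number of steps are arbitrary choices.

# A minimal illustration of weak emergence: a complex global pattern arising
# from interactions defined entirely at the elemental (cell) level.
# Rule 30 elementary cellular automaton; WIDTH and STEPS are arbitrary.

RULE = 30  # Wolfram rule number encoding the local update table
WIDTH, STEPS = 64, 32


def step(cells: list[int]) -> list[int]:
    """Update every cell from its own state and its two neighbors only."""
    out = []
    for i in range(len(cells)):
        left = cells[(i - 1) % len(cells)]
        center = cells[i]
        right = cells[(i + 1) % len(cells)]
        index = (left << 2) | (center << 1) | right
        out.append((RULE >> index) & 1)
    return out


row = [0] * WIDTH
row[WIDTH // 2] = 1  # single seed cell

for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    row = step(row)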

But if, on the other hand, systems can have qualities not directly traceable to the system’s components, but rather to how those components interact, and one is willing to accept that a system supervenes on its components, then it is difficult to account for an emergent property’s cause. These new qualities are irreducible to the system’s constituent parts (Laughlin 2005). The whole is greater than the sum of its parts. This view of emergence is called strong emergence. Some fields in which strong emergence is more widely used include etiology, epistemology and ontology.

Complexity

Warren Weaver perceived and addressed the problem of characterizing complexity, in at least a preliminary way, by drawing a distinction between “disorganized complexity” and “organized complexity”.

In Weaver’s view, disorganized complexity results from the particular system having a very large number of parts, say millions of parts, or many more. Though the interactions of the parts in a “disorganized complexity” situation can be seen as largely random, the properties of the system as a whole can be understood by using probability and statistical methods.

A prime example of disorganized complexity is a gas in a container, with the gas molecules as the parts. Some would suggest that a system of disorganized complexity may be compared, for example, with the (relative) simplicity of the planetary orbits; the latter can be predicted by applying Newton’s laws of motion, though this example involves highly correlated events.
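The gas example can be illustrated with a small, purely statistical sketch: each “particle” behaves randomly, yet an aggregate property of the whole collection is stable and predictable, which is the sense in which probability and statistics handle disorganized complexity. The particle count and the Gaussian speed distribution below are arbitrary modelling choices for illustration, not drawn from Weaver.

# Disorganized complexity, illustrated statistically: the individual parts are
# random, but a system-level property is well described by probability theory.
# The "gas" here is just a large collection of random particle speeds;
# the numbers and units are arbitrary.
import random

random.seed(0)  # reproducible across executions

N_PARTICLES = 100_000


def mean_kinetic_energy() -> float:
    """Mean of 0.5 * v^2 over many particles with independently random speeds."""
    speeds = [random.gauss(0.0, 1.0) for _ in range(N_PARTICLES)]
    return sum(0.5 * v * v for v in speeds) / N_PARTICLES


# Each particle's speed is unpredictable, but the aggregate is nearly constant
# from sample to sample (close to 0.5 * sigma^2 = 0.5), which is what
# probability theory predicts for the system as a whole.
for sample in range(3):
    print(f"sample {sample}: mean kinetic energy = {mean_kinetic_energy():.4f}")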

Organized complexity, in Weaver’s view, resides in nothing other than the non-random, or correlated, interaction between the parts. These correlated relationships create a differentiated structure that can, as a system, interact with other systems. The coordinated system manifests properties not carried or dictated by individual parts. The organized aspect of this form of complexity, vis-à-vis systems other than the subject system, can be said to “emerge,” without any “guiding hand”.

The number of parts does not have to be very large for a particular system to have emergent properties. A system of organized complexity may be understood in its properties (behavior being one of those properties) through modeling and simulation, particularly modeling and simulation with computers. An example of organized complexity is a city neighborhood as a living mechanism, with the neighborhood people among the system’s parts.[4]
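As a hedged illustration of studying organized complexity through simulation, the sketch below implements a Schelling-style neighborhood model, a standard toy model chosen here for illustration and not one discussed by Weaver: agents following simple, correlated local relocation rules produce neighborhood-level clustering that no individual part dictates. Grid size, density, tolerance, and step count are arbitrary.

# A minimal Schelling-style neighborhood model, a common toy example of
# organized complexity studied by simulation (illustrative only).
# Agents of two types relocate when too few neighbors are like them; the
# clustered neighborhoods that appear are a system-level property that no
# single agent carries or dictates.
import random

random.seed(1)

SIZE, EMPTY_RATIO, TOLERANCE, STEPS = 20, 0.2, 0.5, 30

# 0 = empty cell, 1 or 2 = agent type
grid = [[0 if random.random() < EMPTY_RATIO else random.choice([1, 2])
         for _ in range(SIZE)] for _ in range(SIZE)]


def unhappy(r: int, c: int) -> bool:
    """An agent is unhappy if fewer than TOLERANCE of its neighbors match it."""
    me = grid[r][c]
    same = total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            nbr = grid[(r + dr) % SIZE][(c + dc) % SIZE]
            if nbr:
                total += 1
                same += nbr == me
    return total > 0 and same / total < TOLERANCE


for _ in range(STEPS):
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE)
              if grid[r][c] and unhappy(r, c)]
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if not grid[r][c]]
    random.shuffle(movers)
    for r, c in movers:
        if not empties:
            break
        er, ec = empties.pop(random.randrange(len(empties)))
        grid[er][ec], grid[r][c] = grid[r][c], 0  # move agent to an empty cell
        empties.append((r, c))

# Print the final grid: clusters of A and B emerge from purely local rules.
for row in grid:
    print("".join(" AB"[cell] for cell in row))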
