Description
Natural products play an integral and ongoing role in promoting numerous aspects of scientific advancement, and many aspects of basic research programs are intimately related to natural products. The significance, therefore, of the Studies in Natural Product Chemistry series, edited by Professor Atta-ur-Rahman, cannot be overestimated. This volume, in accordance with previous volumes, presents us with cutting-edge contributions of great importance.

Iceland provides a unique stage on which to study the natural environment, both past and present, and it is the twin tasks of reconstructing the past and of observing and interpreting the present that form the focus of the contributions to this volume. The papers are all written by active researchers and incorporate both reviews and new data. Although it concentrates largely on the recent Quaternary timescale, the volume explores a wide range of topics, including subglacial volcanism; onshore and offshore evidence for the Last Glacial Maximum and subsequent deglaciation; current glacial characteristics, including jökulhlaups and glacial landsystems; soil development; Holocene ecosystem change; current oceanography; the impacts of volcanic sulphur loading; chemical weathering and the CO2 budget; and documentary evidence for historical climate. The key element of the volume is that, for the first time, it provides a wide overview of a range of topics for which Iceland serves as an almost unparalleled laboratory, emphasizing the importance of research on this small island for studies at a much broader, global scale. These reviews point the way to future research directions and are supplemented by extensive illustrations and a comprehensive bibliography.

This book attempts to marry truth-conditional semantics with cognitive linguistics in the church of computational neuroscience. To this end, it examines the truth-conditional meanings of coordinators, quantifiers, and collective predicates as neurophysiological phenomena that are amenable to a neurocomputational analysis. Drawing inspiration from work on visual processing, and especially the simple/complex cell distinction in early vision (V1), we claim that a similar two-layer architecture is sufficient to learn the truth-conditional meanings of the logical coordinators and logical quantifiers. As a prerequisite, much discussion is given over to what a neurologically plausible representation of the meanings of these items would look like. We eventually settle on a representation in terms of correlation, so that, for instance, the semantic input to the universal operators (e.g. and, all) is represented as maximally correlated, while the semantic input to the universal negative operators (e.g. nor, no) is represented as maximally anticorrelated. On the basis of this representation, we offer the hypothesis that the function of the logical operators is to extract an invariant feature from natural situations, namely the degree of correlation between parts of the situation. This result sets up an elegant formal analogy to recent models of visual processing, which argue that the function of early vision is to reduce the redundancy inherent in natural images. Computational simulations are designed in which the logical operators are learned by associating their phonological form with some degree of correlation in the inputs, so that the overall system functions as a simple kind of pattern recognizer.
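As an informal illustration of this correlation-based representation, the following MATLAB fragment (a hypothetical sketch under assumed variable names, not code taken from the book or its website) builds a two-part "situation" whose parts are maximally correlated, as the input to and/all would be, and maximally anticorrelated, as the input to nor/no would be, and reads off the correlation coefficient as the invariant feature.

```matlab
% Hypothetical sketch of the correlation-based representation (not the book's code).
n = 100;
partA = double(rand(1, n) > 0.5);   % one part of a situation, as binary features

allInput = partA;                   % universal (and, all): second part maximally correlated
noInput  = 1 - partA;               % universal negative (nor, no): maximally anticorrelated

rAll = corrcoef(partA, allInput);   % correlation coefficient = +1
rNo  = corrcoef(partA, noInput);    % correlation coefficient = -1
fprintf('and/all: %+0.2f   nor/no: %+0.2f\n', rAll(1, 2), rNo(1, 2));
```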
Several learning rules are assayed, especially those of the Hebbian sort, which are the ones with the most neurological support. Learning vector quantization (LVQ) is shown to be a perspicuous and efficient means of learning the patterns that are of interest. We draw a formal parallel between the initial, competitive layer of LVQ and the simple-cell layer in V1, and between the final, linear layer of LVQ and the complex-cell layer in V1, in that the initial layers are both selective, while the final layers both generalize. It is also shown how the representations argued for can be used to draw the traditionally recognized inferences arising from coordination and quantification, and why the inference of subalternacy breaks down for collective predicates.

Finally, the analogies between early vision and the logical operators allow us to advance the claim of cognitive linguistics that language is not processed by proprietary algorithms, but rather by algorithms that are general to the entire brain. Thus, in the debate between objectivist and experiential metaphysics, this book falls squarely into the camp of the latter. Yet it does so by means of a rigorous formal, mathematical, and neurological exposition, in contradiction of the experiential claim that formal analysis has no place in the understanding of cognition. To make our counter-claim as explicit as possible, we present a sketch of the LVQ structure in terms of mereotopology, in which the initial layer of the network performs topological operations, while the final layer performs mereological operations.

The book is meant to be self-contained, in the sense that it does not assume prior knowledge of any of the many areas that are touched upon. It therefore contains mini-summaries of biological visual processing, especially the retinocortical and ventral ("what") parvocellular pathways; computational models of neural signaling, in particular the reduction of the Hodgkin-Huxley equations to the connectionist and integrate-and-fire neurons; Hebbian learning rules and the elaboration of learning vector quantization; the linguistic pathway in the left hemisphere; memory and the hippocampus; truth-conditional vs. image-schematic semantics; objectivist vs. experiential metaphysics; and mereotopology. All of the simulations are implemented in MATLAB, and the code is available from the book's website.
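To make the two-layer picture concrete, here is a minimal, hand-rolled LVQ1 sketch in MATLAB (an assumption-laden illustration, not the code distributed from the book's website): the competitive layer is a pair of prototype values playing the role of the selective simple-cell layer, the prototype-to-class map plays the role of the generalizing linear layer, and the single input feature is the degree of correlation between the two parts of a randomly generated situation.

```matlab
% Hypothetical LVQ1 sketch mapping a correlation feature to an operator class
% (1 = all/and, 2 = no/nor).  All names and parameters here are illustrative.
rng(1);
nTrain = 500;
X = zeros(nTrain, 1); y = zeros(nTrain, 1);
for i = 1:nTrain
    y(i) = randi(2);                              % pick an operator class
    parts = double(rand(1, 50) > 0.5);            % first part of the situation
    if y(i) == 1, other = parts; else, other = 1 - parts; end
    other = xor(other, rand(1, 50) < 0.05);       % small amount of noise
    c = corrcoef(parts, double(other));
    X(i) = c(1, 2);                               % invariant feature: correlation
end

proto = [0.5; -0.5];      % competitive ("simple-cell") layer: codebook vectors
protoClass = [1; 2];      % linear ("complex-cell") layer: prototype -> class map
eta = 0.05;               % learning rate

for i = 1:nTrain
    [~, w] = min(abs(proto - X(i)));              % winner-take-all (selective)
    if protoClass(w) == y(i)
        proto(w) = proto(w) + eta * (X(i) - proto(w));   % attract toward input
    else
        proto(w) = proto(w) - eta * (X(i) - proto(w));   % repel from input
    end
end

fprintf('learned prototypes: %+0.2f (all/and), %+0.2f (no/nor)\n', proto(1), proto(2));
```

Under these assumptions the two prototypes drift toward +1 and -1, mirroring the claim that the universal and universal negative operators pick out the extremes of the correlation scale.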