The GAM algorithm was applied to the values of the live-born quality index for all neighborhoods of Rio de Janeiro. GAM found three clusters of high values for this index, located approximately in the Botafogo, Barra da Tijuca, and Ilha do Governador regions (Figure 3). The results were concentrated in what the algorithm perceives as "extreme" events of high values for the index, disregarding cases that are not "significant" enough. As a basis for comparison, a traditional choropleth map is shown in Figure 4, where the areal-based values are grouped by quintiles.
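The circle-based search that GAM performs can be illustrated with a minimal sketch. The function below is purely illustrative (hypothetical names and a simple z-score cut-off); Openshaw's original implementation uses Monte Carlo significance testing rather than the normal approximation shown here:

```python
import math

def gam_scan(points, values, radii, grid_step, threshold_sd=3.0):
    """Minimal GAM-style scan: slide circles of several radii over a regular
    grid and flag circles whose mean attribute value is unusually high.

    points  -- list of (x, y) coordinates
    values  -- attribute value observed at each point
    radii   -- circle radii to try at each grid node
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    if sd == 0:
        return []
    hits = []
    gx = min(xs)
    while gx <= max(xs):
        gy = min(ys)
        while gy <= max(ys):
            for r in radii:
                inside = [v for (x, y), v in zip(points, values)
                          if (x - gx) ** 2 + (y - gy) ** 2 <= r * r]
                if len(inside) >= 3:
                    local_mean = sum(inside) / len(inside)
                    # flag only "extreme" high-value circles, as GAM does;
                    # clusters of low values are deliberately ignored
                    z = (local_mean - mean) / (sd / math.sqrt(len(inside)))
                    if z > threshold_sd:
                        hits.append((gx, gy, r, local_mean))
            gy += grid_step
        gx += grid_step
    return hits
```

Each flagged tuple is a circle center, radius, and local mean; drawing the flagged circles on a map reproduces the kind of output shown in Figure 3.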
It should be noted that we have used the Rio de Janeiro birth patterns merely as an example to illustrate the computational behavior of the GAM technique. The algorithm was searching only for clusters of high values of the live-born quality index; clusters of low values are disregarded by GAM, since the algorithm was originally conceived to find clusters of high disease incidence. We hope to motivate health researchers to apply the GAM technique to problems closer to its original intended use, such as sets of epidemiological events.
Statistical data analysis is currently the most consistent and established set of tools for analyzing spatial data sets. Nevertheless, the application of statistical techniques to spatial data faces an important challenge, expressed in Tobler's First Law of Geography: everything is related to everything else, but near things are more related than distant things.
The quantitative expression of this principle is the effect of spatial dependence: the observed values will be spatially clustered, and the samples will not be independent. Most spatial data sets, especially those obtained from geo-demographic and health surveys, not only possess global spatial autocorrelation but also exhibit significant patterns of spatial instability, related to regional differentiation within the observational space.
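The global spatial autocorrelation referred to above is most commonly summarized by Moran's I. A minimal sketch with a row-standardized contiguity matrix follows (an illustrative helper, not code from the work under discussion):

```python
def morans_i(values, neighbors):
    """Global Moran's I with a row-standardized contiguity matrix W.

    values    -- one attribute value per area
    neighbors -- neighbors[i] is the list of areas contiguous to area i
    """
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]          # deviations from the mean
    num = 0.0
    for i in range(n):
        if neighbors[i]:
            w = 1.0 / len(neighbors[i])     # row standardization
            num += z[i] * sum(w * z[j] for j in neighbors[i])
    den = sum(zi * zi for zi in z)
    # with row-standardized weights the sum of weights equals n,
    # so the usual n/S0 factor is 1
    return num / den
```

Values near +1 indicate the clustering of similar values that Tobler's law predicts; values near -1 indicate a checkerboard-like alternation.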
As stated by Anselin, "the degree of non-stationarity in large spatial data sets is likely to be such that several regimes of spatial association would be present". As is well known, zone design is a major challenge for urban and regional planners, since it involves major decisions on how to distribute public resources. Given São Paulo's present size (over 13 million inhabitants) and its enormous socioeconomic inequalities, rational planning of the city requires a careful division of the urban space into administrative regions that are homogeneous by some objective criterion.
Taking the social exclusion index as a basis, the proposed task was to group the 96 districts into a set of administrative zones, each containing a significant number of districts and homogeneous with respect to social exclusion status.
We used two exploratory spatial analysis tools: the Moran scatterplot map (Figure 6, left) and the local Moran index significance map (Figure 6, right). The Moran scatterplot map is a tool for visualizing the relationship between the observed values Z and the local mean values WZ, where Z is the array of attribute values expressed as deviations from the mean and WZ is the array of local mean values, computed using the spatial weights matrix W. The association between Z and WZ can be explored to indicate the different spatial regimes associated with the data and displayed graphically, as shown in Figure 6 (left).
The Moran scatterplot map divides spatial variability into four quadrants: Q1 (high values surrounded by high values), Q2 (low surrounded by low), Q3 (high surrounded by low), and Q4 (low surrounded by high). The local Moran index I_i is computed by multiplying the local normalized value z_i by the local mean of its neighbors (Anselin): I_i = z_i Σ_j w_ij z_j. In order to establish a significance test for the local Moran index, Anselin proposes simulating a pseudo-distribution by permutation of the attribute values among the areas. The "significant" indexes are then mapped and posited as "hot spots" of local non-stationarity. The local Moran index significance map indicated three "hot spots": two related to low values of inclusion, located to the south and east of the city, and one related to high values of inclusion, located in the center of the city.
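Anselin's local index and its permutation-based pseudo-significance can be sketched as follows (illustrative code only; a production analysis would standardize the deviations and use many more areas and permutations):

```python
import random

def local_moran(values, neighbors, n_perm=999, seed=42):
    """Local Moran I_i = z_i * sum_j w_ij z_j with row-standardized weights,
    plus a pseudo p-value from conditional permutation: value i is held
    fixed while the remaining values are shuffled into its neighborhood.

    Assumes every area has at least one neighbor."""
    rng = random.Random(seed)
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]          # deviations from the mean
    results = []
    for i in range(n):
        nb = neighbors[i]
        w = 1.0 / len(nb)
        I_i = z[i] * (w * sum(z[j] for j in nb))
        others = [z[j] for j in range(n) if j != i]
        count = 0
        for _ in range(n_perm):
            sample = rng.sample(others, len(nb))
            if z[i] * (w * sum(sample)) >= I_i:
                count += 1
        p = (count + 1) / (n_perm + 1)      # pseudo-significance
        results.append((I_i, p))
    return results
```

Positive I_i with a small pseudo p-value marks the "hot spots" of local association that the significance map displays.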
These patterns correspond to the extreme regions of poverty and wealth in the city and were chosen as "seeds" in the zoning procedure. The remaining regions were defined interactively, taking into account the Moran scatterplot map, which clearly indicates a number of transition regions between the Q1 ("high-high") and Q2 ("low-low") areas, some of which are indicated by the ellipses.
These regions were grouped into separate zones. The work proceeded interactively until a final zoning proposal was produced, which can be compared with the current administrative regions (Figure 7). In order to assess the resulting map, a regression analysis was performed, examining the correlation between the percentage of houses with proper sewage facilities (as independent variable) and the percentage of people over 70 years of age (as dependent variable). The rationale behind this choice was that social deprivation is a serious impediment to healthy living, as measured by the proportion of elderly in the population.
Three OLS (ordinary least squares) regression analyses were performed: the first taking all districts of the city together; the second using the current administrative division as separate spatial regimes; and the third using the proposed new zoning as spatial regimes. The results are summarized in Table 1.
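The comparison between a single citywide fit and separate fits per spatial regime can be sketched as follows (hypothetical toy data; the actual analysis used the sewage and elderly-population variables described above):

```python
def ols(xs, ys):
    """Simple bivariate OLS, y = a + b*x; returns (a, b, r_squared)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

def regime_r2(xs, ys, zones):
    """Fit one OLS per spatial regime and pool the residuals into one R²,
    so the regime model can be compared against a single global fit."""
    my = sum(ys) / len(ys)
    ss_tot = sum((y - my) ** 2 for y in ys)
    ss_res = 0.0
    for zone in set(zones):
        zx = [x for x, z in zip(xs, zones) if z == zone]
        zy = [y for y, z in zip(ys, zones) if z == zone]
        a, b, _ = ols(zx, zy)
        ss_res += sum((y - (a + b * x)) ** 2 for x, y in zip(zx, zy))
    return 1.0 - ss_res / ss_tot
```

A better zoning shows up as a larger gain of the pooled regime R² over the global R², which is the kind of comparison Table 1 reports.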
These results are a positive indication of the possible use of local spatial statistics as a basis for zoning procedures, and they show how indicators such as Sposati's social exclusion index can be used to support urban planning.
A different geocomputational approach comes from artificial neural networks (ANNs). The key element in this paradigm is a processing system composed of a large number of highly interconnected elements (neurons) working in unison to solve specific problems.
An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process (Gopal). In principle, ANNs can represent any computable function. In practice, ANNs are especially useful for classification and for function approximation and mapping problems that tolerate some imprecision and have plenty of training data available.
Almost any mapping between vector spaces can be approximated to arbitrary precision by feedforward ANNs (the type most often used in practical applications) if there are enough data and enough computing resources. Given the capabilities of ANNs as exploratory tools in data-rich environments, there has been considerable interest in their use for spatial data analysis, especially in remote sensing image classification (Kannelopoulos; Leondes).

Neural networks for spatial data integration: an economical-ecological zoning application

To illustrate the potential of ANNs for spatial data analysis, we have selected one example: the use of neural networks for the integration of multiple sources of spatial information in an environmental zoning application (Medeiros). Although the chosen application does not involve health data, the integration procedure shown is relevant to health-assessment applications, which involve multiple data sets as possible sources of epidemiological risk.
One of the more important problems in geographical data analysis is the integration of separate data sets to produce new spatial information. For example, in health analysis, a researcher may be interested in assessing the risk associated with a disease such as malaria based on a combination of different conditions (land use and land cover, climatology, hydrological information, and distance to main roads and cities). These conditions can be expressed as maps and integrated into a common geographical database by means of GIS technology.
Once the data have been organized in a common geographical reference, the researcher needs to determine a procedure to combine these data sets. The main problem with such map inference procedures is their ad hoc, arbitrary nature: the researcher formulates hypotheses from previous knowledge and applies them to the data set. The process relies on inductive knowledge of reality.
Additionally, when the input maps have many different conditions, defining combinatory rules for deriving the output may be difficult. For example, if an input map has eight different conditions, the number of rule combinations with the conditions of the other maps quickly becomes large. There are two main alternative approaches to this problem. The first is fuzzy inference: all input data are transformed into fuzzy sets on a [0,1] scale, and a fuzzy inference procedure is used to combine them. Alternatively, neural network techniques aim at capturing the researcher's experience without requiring an explicit definition of the inference procedure.
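The fuzzy alternative combines [0,1] membership maps cell by cell with fuzzy operators. A minimal sketch follows (the gamma operator shown is one common choice in GIS map algebra, not necessarily the one used in any cited work):

```python
def fuzzy_and(memberships):
    """Fuzzy intersection: the minimum of the membership values."""
    return min(memberships)

def fuzzy_or(memberships):
    """Fuzzy union: the maximum of the membership values."""
    return max(memberships)

def fuzzy_gamma(memberships, gamma=0.7):
    """Gamma operator: a compromise between the fuzzy algebraic product
    (pessimistic) and the fuzzy algebraic sum (optimistic)."""
    prod = 1.0
    for m in memberships:
        prod *= m
    comp = 1.0
    for m in memberships:
        comp *= (1.0 - m)
    fuzzy_sum = 1.0 - comp
    return (fuzzy_sum ** gamma) * (prod ** (1.0 - gamma))
```

Applying one of these operators to the membership values of all input maps at each location yields the combined output map; the result always lies between the product and the algebraic sum of the inputs.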
The application of neural networks to map integration can be done in the following steps: select a set of training areas for which the desired outcome is well understood; for these areas, indicate the desired output response (such as health risk); train the network on these examples; and apply the trained network to the entire study area. This idea was applied by Medeiros in his study of the integration of natural-resource data as a basis for economical-ecological zoning in the Amazon region. Medeiros used five data sets as input: vegetation, geology, geomorphology, soils, and remote sensing images. He compared the result obtained by the neural network with a subjective operator interpretation and found very strong spatial coherence between the two maps, with the neural-produced map being more restrictive than the subjective one (Figure 8).
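The training workflow described above can be sketched with a tiny feedforward network. Everything here is an illustrative assumption (layer values scaled to [0,1], a single sigmoid "risk" output, plain stochastic backpropagation); Medeiros's actual network and data are not reproduced:

```python
import math
import random

def train_map_classifier(samples, labels, hidden=4, epochs=2000, lr=0.5, seed=1):
    """Train a one-hidden-layer sigmoid network to learn a map-combination
    rule from training areas; returns a predict() function for new cells.

    samples -- input layer values per training area, each scaled to [0, 1]
    labels  -- desired output response (e.g. risk) per training area
    """
    rng = random.Random(seed)
    n_in = len(samples[0])
    w1 = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    sig = lambda x: 1.0 / (1.0 + math.exp(-x))
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            h = [sig(sum(wi * xi for wi, xi in zip(row, x)) + b)
                 for row, b in zip(w1, b1)]
            o = sig(sum(w * hi for w, hi in zip(w2, h)) + b2)
            d_o = (o - t) * o * (1 - o)          # squared-error gradient
            for j in range(hidden):
                d_h = d_o * w2[j] * h[j] * (1 - h[j])
                w2[j] -= lr * d_o * h[j]
                for i in range(n_in):
                    w1[j][i] -= lr * d_h * x[i]
                b1[j] -= lr * d_h
            b2 -= lr * d_o
    def predict(x):
        h = [sig(sum(wi * xi for wi, xi in zip(row, x)) + b)
             for row, b in zip(w1, b1)]
        return sig(sum(w * hi for w, hi in zip(w2, h)) + b2)
    return predict
```

Once trained on the labeled areas, the returned predictor is applied to every cell of the study area, producing the output map that is then compared against the operator's interpretation.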
The computer representation of geographical space in current GIS technology is essentially static. Therefore, one important research focus in geocomputation aims to produce models that link the structural elements of space (geographical objects) to the processes that modify that space (human actions) as they operate in time.
Such models would free us from static views of space (to which centuries of map-making have conditioned us) and emphasize the dynamic components as an essential part of geographical space. This motivation has led to the use of cellular automata as a technique for simulating urban and regional growth.
Cellular automata (CA) are very simple dynamic spatial systems in which the state of each cell in an array depends on the previous states of the cells within a neighborhood of that cell, according to a set of transition rules. CA are computationally very efficient because they are discrete, iterative systems that involve interactions only within local regions rather than between all pairs of cells.
A conventional cellular automaton consists of: (a) a Euclidean space divided into an array of identical cells; (b) a cell neighborhood; (c) a set of discrete cell states; (d) a set of transition rules that determine the state of a cell as a function of the states of the cells in its neighborhood; and (e) discrete time steps, with all cell states updated simultaneously. The application of CA to geographical systems was first proposed by Tobler. More recently, a number of researchers have proposed modifications of the original CA idea to accommodate geographical constraints.
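Elements (a)-(e) above can be sketched in a few lines. The "urban growth" threshold rule here is purely illustrative, not a calibrated model of any real city:

```python
def ca_step(grid, threshold=3):
    """One synchronous CA step over a rectangular array of cells.

    Transition rule (illustrative): a cell becomes 'urban' (state 1) if at
    least `threshold` of its 8 neighbors are urban; urban cells stay urban.
    All cells are updated from the *previous* state, per definition (e)."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]           # next-state copy of the array
    for r in range(rows):
        for c in range(cols):
            n_urban = sum(grid[rr][cc]
                          for rr in range(max(0, r - 1), min(rows, r + 2))
                          for cc in range(max(0, c - 1), min(cols, c + 2))
                          if (rr, cc) != (r, c))
            if grid[r][c] == 0 and n_urban >= threshold:
                new[r][c] = 1
    return new
```

Iterating `ca_step` simulates growth over discrete time steps; the geographically constrained variants discussed below replace the homogeneous cells with cells carrying their own attributes and make the rule depend on them.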
The most important characteristic to be discarded is the homogeneous cell space, replaced by a space in which each cell has its own inherent set of attributes (as distinct from its single state) representing its relevant physical, environmental, social, economic, or institutional characteristics. At present, however, CA models developed in GIS remain simple, because GIS do not yet provide operators with sufficient flexibility to define complex CA transition rules; in addition, they lack the simulation engines needed to run complex models at practical speeds.
The more practical approach is to couple the GIS to special-purpose CA software modules, and possibly to other models as well (White et al.).
In conclusion: geocomputation as a set of effective procedures

This survey has examined some of the main branches of research in geocomputation, and we conclude the paper with an attempt to provide a unified perspective on this new research field. We propose that a unifying perspective for geocomputation is its emphasis on algorithmic techniques. The rationale for this approach is that the emergence of data-rich spatial databases motivated a new set of techniques for spatial data analysis, most of them originally proposed under the general term "artificial intelligence", such as neural networks, cellular automata, and heuristic search.
Since there are fundamental differences in the perspectives of the techniques used by geocomputation, the only unifying perspective is the computational one: such techniques can be thought of as a set of effective procedures that, when applied to geographical problems, are bound to produce results. Whatever results are obtained need to be interpreted in light of the basic assumptions of these techniques, and it may be extremely difficult to assign any traditional "statistical significance" criteria to them.
Therefore, the authors propose a tentative definition: "Geocomputation is the use of a set of effective computing procedures to perform spatial data analysis, whose results are dependent on the basic assumptions of each technique and therefore are not strictly comparable". According to this view, geocomputation emphasizes the fact that the structure and data dependency inherent in spatial data can be used as part of the knowledge-discovery approaches, and the choices involve theory as well as data.
This view does not deny the importance of the model-based approaches, such as the Bayesian techniques based on Monte Carlo simulation for the derivation of distribution parameters on spatial data. In fact, in this broader perspective, the use of Bayesian techniques that rely on computationally intense simulations can be considered a legitimate part of the geocomputational field of research.
In conclusion, what can public health researchers expect from geocomputation? When used with discretion, and always bearing in mind the conceptual basis of each approach, techniques such as GAM, local spatial statistics, neural nets, and cellular automata can be powerful aids for a spatial data analyst attempting to discover patterns in space and relations between its components. We hope this article serves as inspiration to health researchers and broadens their notions about what is possible in spatial data analysis. For readers interested in more information on geocomputation, we provide a set of references, organized by topic.
We suggest that prospective readers begin with Longley and then proceed to their specific area of interest.