Conserving habitat patches and the related environment benefits both focal species and human well-being. Many indices use the dispersal range to identify habitat patches with conservation priority. However, approaches are lacking for identifying the environmental variables with conservation priority (denoted as target variables) within those identified patches. Therefore, this paper proposes an approach that identifies target variables in habitat patches using the perception range and introduces the related assumption: agents are assumed to select habitats based on their prior preference and the information perceived within their perception ranges, which avoids assuming omniscient agents. On this basis, the proposed approach identifies target variables by approximating how animals identify their habitats. It highlights the use of the perception range and selects target variables by maximum information gain: the variables that contribute the largest reduction in uncertainty are regarded as the target variables in the habitat patches. Taking the Common Moorhen (Gallinula chloropus) in Tianjin, China as a case study, scenarios with 100 m, 250 m and 500 m perception ranges are designed to illustrate the feasibility of the proposed approach. The approach identifies the normalized difference vegetation index, rather than the distance to the water surface, as the target variable in 42.3%, 58.9% and 72.1% of habitat patches under the respective perception ranges. Adjustments are then made to areas within the given perception range of each patch. More grid cells with an increased suitability index are found in the 250 m perception range scenario, indicating that a larger conservation area is not always better. Future work is expected to improve both the approximation method and the hypothesis underlying the use of the perception range.
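The maximum-information-gain selection described above can be sketched as follows. This is a minimal illustration only, not the paper's implementation: the variable names, toy values, and single-threshold split are all assumptions, standing in for whatever suitability labels and environmental layers the actual model uses within a patch's perception range.

```python
# Hypothetical sketch: pick the "target variable" in a habitat patch as the
# environmental variable yielding the maximum information gain (largest
# reduction of uncertainty) about cell suitability.
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, values, threshold):
    """Entropy reduction after splitting cells by `values` at `threshold`."""
    left = [l for l, v in zip(labels, values) if v <= threshold]
    right = [l for l, v in zip(labels, values) if v > threshold]
    n = len(labels)
    remainder = sum(len(p) / n * entropy(p) for p in (left, right) if p)
    return entropy(labels) - remainder

# Toy data for grid cells within one patch's perception range
# (illustrative values, not from the study).
suitable = [1, 1, 1, 0, 0, 0]                # 1 = suitable, 0 = unsuitable
ndvi     = [0.8, 0.7, 0.6, 0.2, 0.1, 0.3]    # hypothetical NDVI values
dist_h2o = [50, 400, 120, 300, 80, 500]      # hypothetical distances (m)

gains = {
    "NDVI": information_gain(suitable, ndvi, 0.5),
    "distance_to_water": information_gain(suitable, dist_h2o, 200),
}
target = max(gains, key=gains.get)  # variable with the maximum gain
```

With these toy numbers the NDVI split separates suitable from unsuitable cells more cleanly, so it would be reported as the patch's target variable.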
Keywords: Common Moorhen (Gallinula chloropus); Conservation priorities; Information gain; MaxEnt; Perception range; Scenario analysis.
Copyright © 2021 Elsevier Ltd. All rights reserved.