Since the publication of the Brundtland report Our Common Future [1], a new paradigm has emerged in the scientific community with the rise of the field of sustainability science [2]. Beyond the different possible definitions of sustainability science1, the application of advanced analytical-descriptive quantitative tools is recognized as an essential element to guide decision-making toward the goal of meeting society's needs while remaining within a "safe operating space" (planetary boundaries) [3]. Hence the concept of quantitative sustainability assessment [4], the final aim of which is to support decision-making in a broad context encompassing three dimensions: Economic, environmental and social [2,5,6], in order to favor the process of sustainability transition. Sustainability transitions are described as "long-term, multi-dimensional, and fundamental transformation processes that bring socio-technical systems to shift to more sustainable modes of production and consumption" [7]. Sustainability transition is an emerging field of research, which is advancing at a fast pace and finds many applications in the study of socio-technical systems (STSs), such as energy, water or food supply, or transportation [8].
1 For a synthesis, see [2].
To foster such a sustainability transition, the "3rd industrial revolution strategy study for the Grand Duchy of Luxembourg" was presented in 2016 [9]. For the regional agricultural STS (embedded in the larger chapter on food), the document proposes a roadmap to 2050, highlighting opportunities for a sustainable food sector that also serves as a source of renewable energy, with the vision of 100% organic agricultural practices by 2050. Among the more specific strategic measures, the document advances the encouragement of new relations between food consumers and producers and the promotion of new consumption habits.
While such measures and related goals can be a good starting point, the underlying sustainability claims must be tested. In the farming sector, as in other STSs, appropriate modelling approaches are thus needed, capable of capturing both the specificities of the investigated STS and the multidimensionality of the envisioned sustainability transition [10]. Furthermore, a deeper understanding of sustainability issues should be achieved among stakeholders. One approach suggested to this end in the literature is the use of serious games featuring sustainable development practices and policies [11].
The main distinctive features of STSs are the presence of (networks of) actors (whether individuals, companies, other organizations or collective actors) and institutions (societal and technical norms, regulations, standards), as well as material artefacts and knowledge [8]. The interaction of the different elements of the system is fundamental to providing specific services for society. In the case of agriculture, the primary service associated with it is food provision, but other services can also derive from it, such as biomass production for energy purposes, or landscape and recreational values (in the case of farmhouses).
Given the important role of actors and their interactions, one of the most peculiar elements of the quantitative sustainability assessment of STSs is the consideration of human behavior as both the cause and the effect of collective actions, which result from the interaction of social actors. These collective actions drive the so-called "emerging features" of a system, which come into being with no central planning and cannot be observed by reducing the system to single or representative actors [12].
The above-mentioned components make computational social science, and more broadly a trans-disciplinary approach, an essential element of modern sustainability research [13], and inevitably invoke the concept of complex systems when dealing with human-environment interactions.
In the domain of complex simulations, agent-based (AB) modelling is a technique well suited to studying complex coupled human-natural systems [14,15]. Agent-based models (ABMs) have been widely applied in recent years, spanning a very wide range of application domains, including economics, techno-social systems, and the environment [16,17,18,19,20,21]. ABMs are used to assess system-level patterns that emerge from the actions and interactions of autonomous entities [20,22].
Agents can be defined as social autonomous entities that interact, when necessary, with other agents or with their environment to achieve their goals. They can represent a physical or a virtual entity [23], are embedded in a dynamic environment, and are capable of learning and adapting in response to changes in other agents and the environment [24,25].
In this paper, we pay special attention to the use of ABMs in a particular field of quantitative sustainability assessment, namely life cycle assessment (LCA). The aim of the paper is to give a perspective on the different "modelling blocks" one needs to take into account to build an ABM, starting from general issues that must be considered regardless of the domain of application (such as validation, sensitivity analysis, agent definition, and data provision), before focusing on examples that are more specific to ABMs applied to agricultural and land use modelling.
The "modelling blocks" are described through a state of the art of the literature, based only on scientific papers retrieved from www.sciencedirect.com and https://scholar.google.com. The terms used for the search were: (agent-based OR agent based) AND (additional search term). The additional search term was specific to each section of the paper: (LCA OR life cycle assessment); (agriculture OR land use); (validation); (visualization); (uncertainty); (sensitivity analysis).
In its classical implementation, LCA represents the world via static connections between technologies and linear relationships between production and supply (constant efficiency of production processes and an unconstrained market). This is true for the attributional LCA (A-LCA) setting, although its limitations are currently being addressed by many researchers, and more advanced LCA methods are being applied, which go in the direction of an integrated life cycle sustainability assessment (LCSA) framework [26,27,28]. A-LCA can in fact be useful in many contexts, but it reaches its limits when evaluating complex systems [29]. In particular, using ABMs in a consequential LCA (C-LCA) context is very pertinent whenever the impacts one ultimately wants to model are the effect of the interaction of a multitude of actors whose behavior in the system is difficult to schematize in a rational manner using average data and/or steady-state equations. In these cases, an ABM is potentially a valuable tool to derive consistent foreground data for the life cycle inventory (LCI) [29]. Baustert and Benetto [30] identify three types of coupling of ABM and LCA models, according to the direction of the information flow: ABM-enhanced LCA (when the ABM feeds the LCA model); LCA-enhanced ABM (when the information flows in the opposite direction); and ABM/LCA symbiosis (when a loop of information occurs between the two models).
ABMs represent one of the methods for scenario modelling that have been proposed for analyzing future systems and the evolution of existing systems in LCA [31,32]. Page et al. [33] give a comprehensive overview of ABMs applied to environmental management. According to Davis et al. [29], ABM complements LCA because it provides a means to model nonlinear dynamic systems and allows the consideration of social and economic aspects, while A-LCA is a tool for the linear modelling of static systems. Davis et al. [29] present a methodology that allows a type of LCA where the background2 system becomes, according to the authors, dynamic. The paper describes a methodology to combine LCA and ABM, with an application to a case on bio-electricity in the Netherlands. The authors apply the coupled model to the simulation of an energy infrastructure system, claiming that the use of ABM helps overcome the shortcomings of traditional LCA, namely by: (ⅰ) making calculations dynamic; (ⅱ) enabling an uncertainty analysis based on a supply chain that exists given the simulation conditions and therefore based on structural dynamics (such as policy regimes or economic circumstances); (ⅲ) heeding non-linear effects arising from the possible permutations of the supply chains and the interactions emerging from them. Although the efforts made by the authors are praiseworthy, the characteristics they bring to the analysis are not those of a fully dynamic LCA, since they do not model every aspect of the background system in a dynamic way, using for example dynamic production or delivery functions [28,35]. In fact, the AB simulator was linked to a database containing the results (for several goods) of an LCA that had already been performed in a previous step. The link was not achieved via a hard coupling, i.e., a fully operational and automatic link between the AB simulator and the LCA software.
The innovative element of the paper lies in the use of a dynamic (or potentially incremental) set of agents, in the sense that the agents (called Technology Agents) can invest in bioelectricity production and therefore enter or exit (if their investments turn out to cause losses) the simulation. The technology matrix3 can then expand or shrink as the simulation progresses. This is the only dynamic aspect of the LCI. A simplified LCA is calculated for each agent on the basis of the functional unit4 (which is the same for a given agent over time), and the results of the LCA evaluation can be fed back to the agents for use in decision-making at the next simulation steps.
2 Product systems can be divided into foreground processes, which denote the steps of the life cycle whose operations are directly modelled by the study, and background processes, which represent points of reliance on the global industrial system and are not under the control of the modeller [34].
3 For a technical definition of the technology matrix (sometimes also called the technosphere matrix), see [36].
4 In LCA jargon, the functional unit is the reference amount of the function delivered by the system addressed by the study.
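To fix ideas, the core LCI computation that such an ABM/LCA coupling repeats at every simulation step can be sketched with the standard matrix formulation: the scaling vector s solves As = f (technology matrix times scaling equals the functional-unit demand), and the inventory is g = Bs. The matrices, process descriptions and numbers below are invented for illustration and are not taken from [29].

```python
import numpy as np

# Illustrative technology matrix A: columns are processes, rows are product
# flows. Process 1 produces 1 unit of electricity and consumes 0.5 units of
# biomass; process 2 produces 1 unit of biomass.
A = np.array([[1.0, 0.0],
              [-0.5, 1.0]])

# Intervention (biosphere) matrix B: emissions per unit operation of each
# process (say, kg CO2-eq).
B = np.array([[0.8, 0.1]])

# Functional unit f: deliver 100 units of electricity to final demand.
f = np.array([100.0, 0.0])

s = np.linalg.solve(A, f)   # scaling vector: how much each process must run
g = B @ s                   # life cycle inventory (total emissions)

# When a new Technology Agent enters the simulation, A, B and f gain a
# column/row, and the same two lines are simply re-solved at the next step.
print(s)   # [100.  50.]
print(g)   # [85.]
```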
There are few examples of ABM/LCA coupling in the literature because, although the AB paradigm is increasingly gaining acceptance in the LCA community, its implementation is not easy for non-specialists. Miller et al. [37] applied ABM to an LCA of switchgrass cultivation by farmers in response to policies. All agents have the possibility to adopt switchgrass, and Bayesian probabilities are used to evaluate farmers' orientation towards switchgrass adoption, as opposed to resistance to change. This is an original way to estimate farmers' level of risk aversion, which is normally a difficult parameter to evaluate. The same problem is addressed by Bichraoui-Draper et al. [38], who use a decision tree based on variables such as familiarity with the new crop, risk aversion, economic profit and imitation of neighbors to implement agents' decisions to plant switchgrass. Very similarly, Heairet et al. [39] present a basic example of an ABM-enhanced LCA of switchgrass cultivation for biogas production in a theoretical study region.
An ABM was used to assess mobility policies, dealing in particular with the deployment of electric vehicles in Luxembourg and the neighboring French region of Lorraine [31]. Via the AB simulations, the authors showed that the objective of deploying 40,000 battery-powered electric vehicles (BEVs) in Luxembourg by 2020 would probably not be achieved if counting Luxembourg's passenger cars only. The environmental implications of such a deployment scenario are presented in [32]. In this latter paper, the authors use the results of AB simulations to derive inventory data for a C-LCA study, demonstrating the erroneous implications of upscaling the results from attributional LCAs of individual vehicles to answer policy-related questions at the national scale (acidification was presented as a showcase). The ABM proved able to deliver detailed consumption and emissions data for agents' daily travels, taking into account the large variability of car models and their utilization at the national scale without resorting to simplistic averages.
A coupling of ABM and LCA is introduced in Navarrete Gutiérrez et al. [40] and Marvuglia et al. [41] to simulate the evolution of Luxembourgish agriculture: The agents (farmers) are informed of the LCA-based environmental impacts deriving from their crops at the end of each simulation step, and use them to plan the next crop rotation step. This is not yet an ABM/LCA symbiosis, but rather two unidirectional couplings, since the LCA results are not updated during a simulation using the ABM information. According to Navarrete Gutiérrez et al. [40] and Marvuglia et al. [41], this detailed bottom-up modelling allows the simulation of aspects that cannot be captured with a purely economic-oriented model. The results are in fact compared to a top-down (economic model) approach presented in an earlier study [41].
The environment in which agents operate can be summarized in a few elements [42]: Spatial explicitness (each agent has a specific location at each step of the simulation); time (the model can be static or dynamic); and exogenous events (for example, extreme climatic events in agricultural models). When LCA is at stake, the environment also accounts for the natural resources consumed and the pollutants emitted that the studied system (e.g., the agricultural land of a country or a region) exchanges with the ecosphere in an LCI perspective. If the modeler wishes to link the model results with existing environmental regulations, or to implement specific environmentally-conscious scenarios, environmental constraints can be set as thresholds in the agents' set of rules (see section 3.2).
The setting of the ABM's environment shall be done in parallel with the definition of the system boundaries of the LCA study that will be fed by the ABM results. In this way, the modeler makes sure that the system studied by the ABM and the system studied in the LCA model are the same or at least largely overlap. For example, if the aim of the LCA is to study the effects of an agricultural policy, the ABM should be focused on the territory addressed by the policy (a nation, a region, or a larger scale); it should have the necessary time resolution to capture the effects of this policy over time and it should include exogenous elements (say, for example, abrupt crop price changes) that could help understand the unfolding and evolution of possible future scenarios in the studied domain.
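As a minimal illustration of these environment elements (spatial explicitness, a time dimension, exogenous events, and an environmental constraint embedded in the agents' rules), the following sketch uses invented names, thresholds and parameter values throughout:

```python
import random

random.seed(42)

# A grid of land parcels gives spatial explicitness; a weekly clock gives
# the time dimension; a random drought is the exogenous event; a nitrogen
# threshold (illustrative regulatory limit) enters the agents' rule set.
GRID = [(x, y) for x in range(10) for y in range(10)]
nitrogen = {cell: 100.0 for cell in GRID}   # environment state per parcel
N_THRESHOLD = 20.0                          # illustrative threshold

def step():
    drought = random.random() < 0.02        # exogenous climatic event
    for cell in GRID:
        uptake = 0.5 if drought else 1.5    # crops take up less N in drought
        nitrogen[cell] = max(0.0, nitrogen[cell] - uptake)

def may_plant(cell):
    # The environmental constraint appears as a threshold in the rules.
    return nitrogen[cell] >= N_THRESHOLD

for _ in range(52):                         # one simulated year, weekly steps
    step()

print(all(may_plant(c) for c in GRID))      # True
```

After one year the worst-case nitrogen level is 100 − 52 × 1.5 = 22, so the constraint is still satisfied everywhere regardless of the drought draws.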
The definition of agents' profiles and the specific features one wishes to model depend on the problem at stake. Data surveys can be carried out, and the analysis of the information obtained through these surveys can help the modeler define the most suitable characteristics the agents should have, as well as their rules of action and interaction [43]. Each agent in the model (which we can call an abstracted agent) may be characterized by the class of agents it belongs to (e.g., farmer agent, real estate agent, cooperative agent, etc.) and by a set of attributes which assume certain values. It is however worth noting that data collection via surveys is normally resource- and time-consuming. If these resources are not available to the modeler's team, a common approach is to use aggregated statistics (such as census data) [44]. The agents' attributes can then be assigned arbitrarily [38,45] or by importing the joint distribution of agents' attributes from the available empirical data source [46]. An interesting approach based on Bayesian probabilities is used in [37] to estimate the adoption of technology by farmers; in other words, Bayes' theorem is used to estimate the probability that a farmer would make the decision to cultivate switchgrass starting from the values assumed by its attributes.
Attributes: The attributes represent features of the abstracted agent which are used to characterize its behavioral component, mimicking the behavior of the real-world entities (i.e., the real agents) it simulates (e.g., the proneness of the agent to assume certain investment risks, or the level of confidence it has in other agents when activating imitation mechanisms) and that other agents may not have; or features that all the agents belonging to a class inherit from the class (e.g., if an agent belongs to the "farm" class, it will have certain attributes related to the size of the arable land, the content of nutrients in the soil at each simulation step, etc.). Some attributes may remain unchanged throughout the simulation (e.g., the size of the arable land of a farm), while others may evolve dynamically as the simulation progresses (e.g., the risk aversion of a "farmer" agent may be updated after interaction with other agents, if the modeler has implemented this kind of adaptation mechanism in the model).
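A hypothetical agent profile along these lines, with static and dynamic attributes and a simple adaptation mechanism, could be sketched as follows (all class names, attribute names and values are illustrative, not taken from the cited models):

```python
from dataclasses import dataclass

# Hypothetical "farmer" agent: arable area is a static attribute; risk
# aversion and soil nitrogen are dynamic, updated as the simulation runs.
# Class-level defaults play the role of attributes inherited from the class.
@dataclass
class FarmerAgent:
    agent_id: int
    arable_area_ha: float          # static over the whole simulation
    risk_aversion: float = 0.5     # dynamic, in [0, 1]
    soil_nitrogen: float = 100.0   # dynamic, updated at every step

    def interact(self, other: "FarmerAgent", weight: float = 0.1) -> None:
        # Illustrative adaptation mechanism: risk aversion drifts toward
        # the value of an interaction partner.
        self.risk_aversion += weight * (other.risk_aversion - self.risk_aversion)

a = FarmerAgent(1, arable_area_ha=35.0, risk_aversion=0.2)
b = FarmerAgent(2, arable_area_ha=80.0, risk_aversion=0.8)
a.interact(b)
print(round(a.risk_aversion, 2))   # 0.26
```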
Architectures: Concerning the definition of the agents' structure, two main architectures have been developed in the ABM literature [47]. The first is the reactive approach, in which the agents (which are often memory-less) adopt simple behaviors based on their reaction to stimuli from the surrounding environment. The second is the deliberative (or cognitive) approach. Cognitive agents are capable of acting ("behaving") based on an evaluation of their environment and their own presence therein. They respond to changes in the environment, including those caused by their own actions, by updating their evaluation of the environment. A well-known framework for the implementation of a cognitive agent architecture is belief-desire-intention (BDI) [48], which has its roots in "folk psychology", i.e., the way humans think they think [49]. This architecture has also been applied to land-use planning [50,51], although its implementation faces two main obstacles: Complexity for the field-expert modeler, and a high computational cost.
A complete and quite "general purpose" set of good practices to build an ABM is provided by Rand and Rust [52]. An operational framework to define agent profiles is provided by Smajgl et al. [43].
Behavior: Each agent also needs a set of behavioral rules, which determine the logic according to which the agents take actions (thus possibly updating their state, i.e., their set of properties). Normally, these rules are determined by a combination of deterministic and stochastic functions. Each agent will have a set of neighboring agents, with which it will interact using the mechanisms set for social interaction [45], which very often consist of the implementation of a so-called "small world" network [53].
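These ingredients can be sketched in a few lines: a hand-rolled small-world network in the spirit of Watts and Strogatz [53], plus a behavioral rule mixing a deterministic component (majority imitation among neighbors) with a stochastic one. All parameter values and crop names are illustrative.

```python
import random

random.seed(0)

def small_world(n, k, p):
    """Watts-Strogatz-style small-world graph: a ring lattice where each
    node links to its k nearest neighbours, with each edge rewired to a
    random node with probability p."""
    neighbours = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k // 2 + 1):
            neighbours[i].add((i + j) % n)
            neighbours[(i + j) % n].add(i)
    for i in range(n):
        for j in list(neighbours[i]):
            if random.random() < p:
                new = random.randrange(n)
                if new != i and new not in neighbours[i]:
                    neighbours[i].discard(j)
                    neighbours[j].discard(i)
                    neighbours[i].add(new)
                    neighbours[new].add(i)
    return neighbours

def next_choice(agent, choices, neighbours, q=0.7):
    # Behavioral rule: imitate the neighbours' majority choice with
    # probability q (stochastic part), otherwise keep the current choice.
    peers = [choices[j] for j in neighbours[agent]]
    majority = max(set(peers), key=peers.count) if peers else choices[agent]
    return majority if random.random() < q else choices[agent]

net = small_world(n=50, k=4, p=0.1)
choices = {i: random.choice(["wheat", "maize"]) for i in range(50)}
choices = {i: next_choice(i, choices, net) for i in range(50)}
print(len(net))   # 50
```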
Clearly, if the ABM is being built to produce inventory data for an LCA study, then the definition of the agents' structure should reflect the nature of the system one intends to assess with the LCA. For example, if the final aim of the research is to conduct an LCA of a policy fostering the production of biogas, the ABM could focus on the field operations phase of the value chain, of which the farmers are the main agents. Other "upper-level" agents, such as an agent representing "the market", can also be implemented to interact with the farmer agents.
In the field of AB modeling, visualization plays an important role in identifying, communicating and understanding the relevant dynamics of the modeled phenomenon.
Graphics of ABM platforms are generally rather simple. For example, in the NetLogo5 platform, the agents (called turtles) are represented as pixels or stylized icons, and standard graphs are produced as a result of the simulation runs. An example is provided in Figure 1. It was realized in the NetLogo platform by Marvuglia et al. [54]. The green rectangle contains a symbolic representation of farms (represented as pixels with different icons and colors according to the crop planted on that land parcel and to the level of nitrogen present in the soil in that parcel). The colors and the shapes of the icons evolve with the simulation (which simulates a four-year crop rotation scheme with time steps of one week). The graph within the plot box on the right-hand side of the figure shows the evolution of the number of farms planting each crop during each week of the simulation (the crop names appear in the legend on the right of the plot box). Initially, when the model is set up, a certain amount of nitrogen is randomly attributed to each patch, following a Gaussian distribution whose mean (μ) and standard deviation (σ) can be set by the user with a slider. The nitrogen content in each patch decreases according to a given law as the simulation progresses. The model also contains a basic social imitation component that can be switched on and off. If the social imitation component is switched on, when an agent (i.e., a farmer) needs to decide which crop to plant in the next planting cycle, it will choose the most frequent crop among those planted in the last cycle by 10 neighbors within a radius of 15 patches from its patch (with a probability set by a parameter called "imitation-influence", whose value can be set using another slider). Finally, a basic economic driver is also implemented for the planting choice.
It is implemented through a decision rule depending on the "min_margin" parameter (minimum profit margin), which states that the agent excludes a crop from the list of possible crops to plant if the net gain foreseen for that crop is less than min_margin times the cost borne to plant it. In this example, min_margin can vary between 1 and 4 (with integer steps of 1 unit).
5 https://ccl.northwestern.edu/netlogo/.
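The two planting drivers described above can be sketched as follows; the parameter names follow the text, while the crop costs and foreseen gains are invented for illustration:

```python
import random
from collections import Counter

random.seed(1)

MIN_MARGIN = 2                # integer in [1, 4], as in the text
IMITATION_INFLUENCE = 0.6     # probability of imitating neighbours

# Illustrative planting costs and foreseen net gains per crop.
cost = {"wheat": 100, "maize": 150, "barley": 80}
foreseen_gain = {"wheat": 250, "maize": 200, "barley": 300}

def candidate_crops():
    # Economic driver: keep a crop only if its foreseen net gain is at
    # least MIN_MARGIN times its planting cost.
    return [c for c in cost if foreseen_gain[c] >= MIN_MARGIN * cost[c]]

def choose_crop(neighbour_crops):
    allowed = candidate_crops()
    if neighbour_crops and random.random() < IMITATION_INFLUENCE:
        # Social driver: most frequent crop among the neighbours' last cycle.
        most_common = Counter(neighbour_crops).most_common(1)[0][0]
        if most_common in allowed:
            return most_common
    return random.choice(allowed)

print(sorted(candidate_crops()))            # ['barley', 'wheat']
crop = choose_crop(["wheat", "wheat", "barley"])
print(crop in {"wheat", "barley"})          # True
```

With MIN_MARGIN = 2, maize is excluded (200 < 2 × 150), so the agent's choice always falls on wheat or barley.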
These are the two ways commonly used to present the results of AB simulations: Live runs (the time sequence of "maps", of which the left-hand side of Figure 1 is a screenshot) and graphs showing the cumulative results of multiple runs (like the ones shown on the right-hand side of Figure 1).
Although the model presented in Marvuglia et al. [54] was quite a simple one (with no representation of the data on a georeferenced system), the graphical representation shown in the figure is a quite standard way of visualizing information in NetLogo, commonly used in most of the applications found in the literature [55], even though NetLogo has a GIS extension (which allows the importation of georeferenced vector and raster maps) and a 3D visualization extension (which allows the creation of 3D environments for the agents). Interestingly enough, AnyLogic [56] and GAMA [57] also allow for 3D visualization.
AgriPoliS [58] is a powerful AB agricultural model with high explanatory power in describing the development of farms in competitive markets and forecasting their evolution as a result of policies. In AgriPoliS, space is represented in a stylized way, as a set of equally sized cells organized in a chessboard-like pattern.
A generic ABM to model land-use decisions and the consequent energy consumption and pollution dynamics is presented in Zellner et al. [59]; the scenarios modelled are however highly stylized (no GIS data are used; zoning is represented as concentric circles).
ABM together with role-playing games is used by Astier et al. [60] to support participatory processes in the sustainability assessment of natural resource management by small farmers. The model the authors describe is characterized by a user-friendly interface with appealing pictorial data displays, which facilitates stakeholders' participation. However, a GIS-based representation of the ABM is once again missing.
Murray-Rust et al. [61] explore different aspects in a new ABM framework. This paper investigates: (ⅰ) how social factors, versus economic and environmental factors, are capable of steering on-farm crop rotation choices and the provision of ecosystem services; (ⅱ) the degree of subsidy acceptance among farm households and the subsidies' level of effectiveness; (ⅲ) the conditions under which marginal land enters or exits agricultural production. The landscape is modelled as a collection of parcels, which represent the smallest spatial units at which decisions are taken (e.g., a farmer's field). Although the paper does not explicitly present maps, the approach described represents a very advanced modelling framework, which combines the ABM of land use, a detailed multilevel consideration of temporal aspects, a varied socio-economic context, and ecosystem service modelling.
A spatially explicit ABM based on the use of GIS maps is presented in Wise and Crooks [62]. It consists of a number of modules describing the physical, economic, and social processes that have an impact on land-use patterns, and also includes a GUI where a number of parameters can be adjusted according to one's underlying assumptions. This is indeed an important feature for an ABM to have in order to be fully informative. More precisely, one task the ABM could help perform is finding a possible set of values of the decision variables (if it exists) with which the model would produce a certain desired result. The GUI should facilitate this operation, because through it the user should be able to choose certain scenarios (pre-set by the modelers) and tweak the decision variables to observe the results of the simulations. Kravari and Bassiliades [63] briefly present the different ABM platforms available and give some indications on the availability of their visualization functionalities. A comprehensive overview of visualization properties and principles in ABMs is outside the scope of this paper, but useful ABM visualization design guidelines are provided in [64].
Beyond its application in the context of ABMs, data visualization is a general issue that deserves appropriate attention. The first important question is always what data users need to see and why. The answers to these questions should lead to an understanding of how the visual representation of results can be optimized.
Unlike conventional validation, which assesses a model's results against measured data once the model has been completed, the validation of ABMs should start from the conceptual design stage. A number of different strategies exist to shape a model in the early stages of model building [65]. Nonetheless, non-statistical models like ABMs have often been criticized for lacking a consistent validation protocol [66].
As a matter of fact, the macro-scale behaviors generated by interacting agents are often difficult to predict. Behavior can be extremely sensitive to initial conditions and show path dependency because of feedback loops [67]. ABMs contain non-linear, recursive interactions and feedback, and they also tend to have multiple degrees of freedom, which generate several possible simulation outputs that need to be validated. In this context, validating the ABM coupled with an LCA model does not mean validating the LCA model the ABM is meant to feed. The ABM results are used only to feed the LCA study, and the assumptions behind the LCA model, as well as the quality of the LCI data, will influence the final results of the environmental assessment. In addition, one has to remember that the results of an LCA are expressed in terms of "potential" environmental consequences; they cannot be directly measured, nor compared either to theoretical values or to real measurements. Empirical validation of LCA results per se is therefore not possible.
Different classifications of validation methodologies for simulation models have been proposed. Sargent [68] considers that a valid model must have conceptual validity, specification verification, implementation verification and operational validation. In Zeigler et al. [69], validity is the degree to which a model faithfully represents its system counterpart. Bianchi et al. [70] identify three possible ways of validating computational models: Descriptive output validation, matching the output generated with the model against already available (past) actual data (called historical validity in [71]); Predictive output validation, matching computationally generated data against field data that have yet to be acquired (be it data about future events, therefore not yet existing, or data about past events that have not yet been gathered); and Input validation, ensuring that the structure of the model is able to reproduce the main aspects of the actual system (called rwDGP in [72]).
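Descriptive output validation, for instance, reduces in its simplest form to comparing a simulated time series with an already available historical one. The sketch below uses invented data and two common agreement measures (root-mean-square error and Pearson correlation); the choice of measures is ours, not prescribed by [70,71].

```python
import math

# Invented series: e.g., yearly number of adopters, observed vs. simulated.
historical = [10, 14, 19, 27, 36, 44, 49, 52]
simulated  = [ 9, 15, 21, 26, 34, 45, 50, 53]

def rmse(a, b):
    # Root-mean-square error between two equal-length series.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def pearson(a, b):
    # Pearson correlation coefficient, computed from scratch.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

print(round(rmse(historical, simulated), 2))   # 1.32
print(pearson(historical, simulated) > 0.99)   # True
```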
In the remainder of this section, we will follow the classification proposed in Knepell and Arangno [73], which lists the following six types of validation: conceptual, internal, external, cross-model, data, and security.
Conceptual (or theoretical) validity can be assessed by determining to what extent the chosen theories and underlying assumptions are appropriate for the purpose of the model, while the correctness of model construction is termed internal validity. Participatory modelling (the engagement, collaboration or participation of stakeholders in both model design and validation), also known as group model building, mediated modelling, companion modelling, or shared vision planning [75], can be an important component of conceptual ABM validation. In the participatory approach philosophy, repeated interactions between modelers, domain experts and other stakeholders contribute to ensuring the validity and reliability of the model structure, assumptions, processes and outcomes [76]. Moreover, stakeholder participation also contributes to enhancing the quality, transparency, credibility, and relevance of the model [75]. Stakeholders' involvement is not limited to the phases of model development, but may also include running simulations.
Internal validity is referred to as analytical adequacy in McKelvey [74], where the term means that "the model correctly produces effects predicted by the theory". In Bianchi et al. [77] the same concept is instead expressed using the term ex ante validation. External validity considers the adequacy and accuracy of the model in replicating real-world data. It is also referred to as operational validity [68,76] or empirical validation [78,79,80].
For example, Bichraoui-Draper [45] compares the evolution over time of the quantity of by-products exchanged by the companies within the industrial cluster of the Champagne-Ardenne region (France) with that obtained for the Ulsan industrial park (South Korea) using the model presented by Park et al. [81]. In Bichraoui-Draper et al. [38], the authors compare the yearly evolution of the area planted under switchgrass, as predicted by their model, with the data observed for genetically engineered soybeans in the U.S., using the latter as a proxy to detect farmers' orientation to changes. The two evolutions show a very similar trend, with a rapid increase of adopters and a tendency towards saturation after about 15 years. The conformity of the two trends is assumed as a way of validating the model empirically.
An empirical validation of the model is performed in Busch et al. [82], using a participatory/companion process engaging stakeholders. According to the definition given by Rotmans [83], in Busch et al. [82] the stakeholders are involved as "users", since they are already engaged in the validation of the conceptual model, prior to implementation.
Axtell [84] reports on a way of carrying out validation by running the model through a range of initial conditions and parameters multiple times in order to assess the robustness of the results by observing how much they change as an effect of changing the parameters. As suggested by Davis [85], this exploration of the parameters' space may also be achieved using sampling techniques such as Latin Hypercube Sampling [86].
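As a concrete illustration, Latin Hypercube Sampling can be sketched in a few lines of pure Python: each parameter range is cut into as many equal strata as there are samples, one point is drawn per stratum, and the strata are shuffled independently per dimension. The two parameter ranges below are hypothetical, merely standing in for whatever inputs the ABM under study exposes.

```python
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Latin hypercube sample: each parameter's range is split into
    n_samples equal strata and exactly one point is drawn per stratum,
    with strata shuffled independently per parameter."""
    rng = random.Random(seed)
    n_params = len(bounds)
    samples = [[0.0] * n_params for _ in range(n_samples)]
    for j, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # decouple strata across dimensions
        for i, s in enumerate(strata):
            u = (s + rng.random()) / n_samples  # point inside stratum s
            samples[i][j] = lo + u * (hi - lo)
    return samples

# Example: 10 parameter sets for two hypothetical ABM parameters
# (risk aversion in [0, 1], interaction radius in [1, 5]).
design = latin_hypercube(10, [(0.0, 1.0), (1.0, 5.0)])
```

Compared with plain random sampling, this guarantees that every stratum of every parameter is visited, which matters when each simulation run is expensive.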
The empirical validation of ABMs may involve not only the assessment of the discrepancy between the model's results and reality, but also the simultaneous calibration, tuning and further development of the model. The testing, tuning and further development of an existing model can be guided by observed patterns that the model seeks to explain [67]. Topping et al. [87] refers to this process as post-hoc POM [88]. POM is certainly becoming a widely used framework for the validation of ABMs as an alternative to numerical validation methods, especially in systems ecology [89,90], but also in land use and agricultural processes modelling. Topping et al. [87] describes an example of POM for reproducing observed patterns, as well as calibrating and further developing the model based on these patterns. The patterns are generally shown to people who are knowledgeable about the dynamics of the target system (subject-experts) who are asked if they can distinguish between system and model outputs. This sort of external validation is also called the Turing test [71] and is generally applicable to any simulation model. In principle, the idea behind the Turing test is that if the outputs of the simulation model are qualitatively or quantitatively indistinguishable from the observed target system, one can assume that a substantial level of validation has been obtained. A similar approach to validation is taken in Delmotte et al. [91], which describes the implementation of a novel method, termed IAAS, based on ABM with farmers and BEM with local stakeholders for scenario assessment. In this approach, the decision-making process is provided as an input by real farmers participating in ISS. During an ISS, each farmer used a laptop connected to a central computer running the ABM. The BEM is a multiple goal linear programming model solved with GAMS [92]. 
The validation was performed with stakeholders and farmers who had to comment on the plausibility of a number of socio-economic and environmental indicators calculated with the model.
Cross-model validity [93] is similar to external validity, but models are assessed independently at both the micro and macro levels [52].
Data validity refers to data accuracy and data adequacy, while security validity refers to the measures in place to protect data integrity. However, in practice, no model is built on complete information and a totally faithful picture of reality. In this respect, Laurent [94] argues that ABM modelers should not be worried about not being able to reproduce every aspect of reality.
An advanced decision-support tool based on an ABM is presented in [95]. The results presented in the paper were obtained via a tool called Mersim, which was developed during a participatory learning process [96]. The model aims to improve the understanding of unintended and undesired effects of policies, simulating the poverty and economic outcomes of scenarios selected and designed by decision-makers participating in the model design process. The tool was used to investigate the impacts of Mekong mainstream dams, payments for ecosystem services on changes in land use, large scale irrigation, and sea-level rise.
In Smajgl et al. [97], the simulation results for the policy scenarios are plotted as graphs and maps for each of the local study areas and are shown to subject-experts and stakeholders. The analysis focused on identifying relevant patterns (from simple causal patterns to spatial patterns) that emerged independently from the numerical differences between individual runs. This is a very interesting example of pattern-oriented validation because in this case the emerging patterns highlighted counter-intuitive results, which at first challenged stakeholders' and experts' existing beliefs. But once the simulation dynamics were explained to them and the results clarified, the stakeholders confirmed that the simulated results were consistent with their expectations. This represents a very good example of stakeholders' involvement in a participatory approach, in which the findings of the model are able to reverse initial beliefs because the stakeholders are guided (via participatory workshops) through a deeper understanding of reality.
Figure 2 shows a schematic representation of the steps described in this section of the paper, especially adapted for ABMs in the agriculture sector.
Table 1 summarizes the different methodologies and naming equivalences for the validity of ABM models.
| Reference | Terms used interchangeably for the different types of validation | | | |
|---|---|---|---|---|
| [73] | Conceptual validation | Internal validation | External validation | Cross-model validation |
| [65] | -- | -- | Cross-validation** | -- |
| [80] | -- | -- | Empirical validation | -- |
| [74] | -- | Analytical adequacy | Ontological adequacy | -- |
| [77] | Ex-ante validation | -- | Empirical validation | -- |
| [76] | Conceptual validation | Internal validation | Operational validation | -- |
| [52] | -- | -- | -- | Cross-model validation* |
| [79] | -- | -- | Empirical validation | -- |

*Between two different models. **Between the model results and macro-level statistical data and the micro-level qualitative data.
To conclude this section, it is useful to recall that, given the high level of intrinsic complexity of the system at stake (the human-environment nexus), any model, even after the most careful validation, will certainly fail in describing reality comprehensively. Thanks to George E. P. Box, we know that "all models are wrong, but some are useful" [98]. All the efforts described above must therefore be considered "useful" only in the context of building models to gain insights into reality and thereby inform decision-makers.
Uncertainty analysis aims to quantify the level of uncertainty over the results obtained with a model, while SA provides insights into the effect of the different input parameters and their variations on the results of the model. In the framework of policy support, SA can be useful to identify regions of the parameter space in which policy decisions can be particularly efficient or, on the contrary, particularly inefficient [99]. Filatova et al. [100] identified SA as one of the thematic modelling challenges in the field of ABMs applied to socio-ecological systems. Most of the applications of SA in ABMs are in the area of ecological systems or socio-sanitary contexts (e.g., spread of diseases), with some applications in transportation systems and very few applications in agricultural and land use modelling.
An exhaustive presentation of SA strategies in general (not in the specific case of ABMs) is given in Saltelli et al. [101], while a short but effective overview is given in Cariboni et al. [102]. While some of these methods have been adapted to ABMs, in most cases, they are in the domain of biological system simulation [103]. Lorscheid et al. [104] proposes a protocol for the experimental design of SA in ABMs. The assumption behind it is that confidence in ABM results increases if simulations are seen as a way of conducting experiments that, as such, have to be planned following a systematic DOE procedure.
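A minimal sketch of such a systematic DOE is a full-factorial design, built here with Python's standard library; the factor names and levels are purely illustrative and not taken from any cited model.

```python
from itertools import product

# Hypothetical factor levels for an ABM experiment (illustrative names).
factors = {
    "learning_rate": [0.1, 0.5, 0.9],
    "n_neighbours":  [4, 8],
    "subsidy":       [0.0, 100.0],
}

# Full-factorial design: one simulation run per combination of levels,
# so the main effects and interactions of all factors can be estimated.
names = list(factors)
runs = [dict(zip(names, levels)) for levels in product(*factors.values())]
# 3 * 2 * 2 = 12 planned runs; each run is a complete parameter set.
```

For stochastic ABMs, each of these runs would additionally be replicated with different random seeds, as the protocol of Lorscheid et al. recommends treating replications as part of the experimental design.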
SA techniques used for ABMs comprise visual analysis, local techniques (e.g., partial derivative), and techniques based on the assumption of linear or monotonic relationships between model parameters and model outputs. However, those techniques cannot provide sensitivity indices for the non-monotonic input-output relationships typically observed in ABMs and do not account for interactions between parameters. To this end, different techniques have been applied in the literature.
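A local, one-at-a-time technique of the kind mentioned above can be sketched as a finite-difference approximation of the partial derivatives around a baseline; the toy model below merely stands in for one (deterministic) simulation run.

```python
def toy_model(params):
    """Stand-in for a deterministic simulation run; 'yield_gain'
    dominates the output, while 'inert' has no effect at all."""
    return 3.0 * params["yield_gain"] + 0.1 * params["price"] + 0.0 * params["inert"]

def oat_sensitivity(model, baseline, rel_step=0.05):
    """One-at-a-time local sensitivity: perturb each parameter by a
    small relative step and approximate the partial derivative."""
    y0 = model(baseline)
    sens = {}
    for name, value in baseline.items():
        step = rel_step * (abs(value) if value != 0 else 1.0)
        perturbed = dict(baseline)
        perturbed[name] = value + step
        sens[name] = (model(perturbed) - y0) / step  # forward difference
    return sens

baseline = {"yield_gain": 1.0, "price": 2.0, "inert": 0.5}
sens = oat_sensitivity(toy_model, baseline)
```

The estimates are valid only near the chosen baseline, which is exactly the limitation discussed in the text: for the non-monotonic responses and parameter interactions typical of ABMs, global methods are needed instead.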
When linear or monotonic relationships cannot be assumed, the development of meta-models (also called surrogate models or emulators [105]) of ABMs (in other words, the approximation of the computer code with a surrogate statistical model) is oftentimes a necessary step to perform SA, especially when the model comprises a large set of parameters. In these cases, in fact, exploring the behavior of the system through the variation of all its parameters within their feasibility ranges would be extremely computationally intensive. Interested readers can refer to Ratto et al. [106] and the references therein.
Schouten et al. [99] proposes a regression-based SA. It is one of the few papers to apply SA to an ABM dealing with agriculture and land use management. They apply a mixed methodological approach, in which an OAT SA is compared with a traditional MC approach. The MC random sampling is used to select sets of input parameters used to run simulations and thus produce model outputs that are subsequently used to build a regression-based surrogate model, which serves as the basis for the decomposition of the variance of the output. The surrogate regression model is accepted if it manages to explain a percentage of the variance close to 100%, otherwise a higher order regression model is applied. It is quite interesting to note that, when comparing the two approaches, the MC approach with meta-modelling is not able to detect sensitivities in extreme cases (i.e., when some of the parameters attain extreme values).
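A minimal sketch of such a regression-based variance decomposition uses standardized regression coefficients (SRCs): for independent inputs, the SRC of an input reduces to its correlation with the output, and its square estimates the share of output variance that input explains. The toy model below is an assumption standing in for the expensive ABM runs.

```python
import random

def toy_model(x1, x2):
    # Stand-in for an expensive ABM run; x1 deliberately dominates.
    return 4.0 * x1 + 1.0 * x2

rng = random.Random(42)
n = 2000
x1 = [rng.uniform(0, 1) for _ in range(n)]  # Monte Carlo input samples
x2 = [rng.uniform(0, 1) for _ in range(n)]
y = [toy_model(a, b) for a, b in zip(x1, x2)]

def corr(u, v):
    """Pearson correlation coefficient."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (su * sv)

# For independent inputs, SRC_i = corr(x_i, y); SRC_i**2 estimates the
# share of output variance attributable to x_i.
src = {"x1": corr(x1, y), "x2": corr(x2, y)}
explained = src["x1"] ** 2 + src["x2"] ** 2
```

If `explained` falls well short of 1, the linear surrogate is inadequate and, as in the acceptance criterion used by Schouten et al., a higher-order regression model should be fitted instead.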
The meta-modelling approach, in combination with DOE, is also used in [107] to systematically analyze the relationship between policy change and model assumptions.
Fonoberova et al. [108] applies a global sensitivity approach. Their approach, which preserves the interactions between model parameters, is based on a meta-model (developed using a machine learning technique) corresponding to the given ABM. Unlike other applications [109], the authors do not select a specific output, but compute and observe all nine outputs of the model and present their respective sensitivities as heat maps (derivative sensitivity, derivative absolute value and variance sensitivity).
In Ligmann-Zielinska et al. [109], the variance decomposition of a selected output of the ABM is applied in a case study on farmland conservation resulting in land use change. Instead of relying on a surrogate model to reduce the computational burden of the SA, the authors use a quasi-random sampling (quasi-random Sobol experimental design) to generate a uniform set of parameters spanning the factors space. This set of parameters is used to generate different instances of the output variable, which can then be analyzed using descriptive statistics. The recognized limitations of this approach lie in the fact that the influence of the variables depends in some way on the variable chosen as the output of the model (assuming that the model can provide different outputs, which is most often the case). Furthermore, the type and features of the probability distributions used for each factor could influence not only the variability of the outputs, but also the relative contribution of the factors to this variability.
Another interesting application of GSA in the field of land use change modelling is described in Ligmann-Zielinska et al. [110], where the time paths of endogenous variables are also taken into account, resulting in a time-dependent GSA (time-GSA). This is a particularly interesting and novel investigation, since it explores the sensitivity of parameters throughout model execution and not only at the final step of the simulation. A number n of input samples is generated with the MC procedure, based on pre-defined probability density functions, using a quasi-random design. The model is executed n times for each time step t, and the first-order and total-effect sensitivity indices are then calculated and finally displayed as time series, which are able to reveal the onset of nonlinearities in the model. The use of time-GSA allows an in-depth exploration of model behavior that would not be possible using other approaches, but the calculation of sensitivity indices through MC-generated samples is computationally demanding and may become prohibitive for large samples of simulations.
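The first-order and total-effect indices can be estimated with a brute-force Monte Carlo "pick-and-freeze" scheme (Saltelli- and Jansen-style estimators). The sketch below uses plain random sampling and a toy additive model standing in for the ABM; the studies above would use quasi-random designs and far more expensive runs.

```python
import random

def toy_model(x):
    # Stand-in for one simulation run; x[0] dominates the output variance.
    return 4.0 * x[0] + 1.0 * x[1]

def sobol_indices(model, k, n, seed=0):
    """Monte Carlo estimates of first-order (S_i) and total-effect (ST_i)
    Sobol indices using two sample matrices A and B and the
    'pick-and-freeze' matrices AB_i."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(k)] for _ in range(n)]
    B = [[rng.random() for _ in range(k)] for _ in range(n)]
    yA = [model(r) for r in A]
    yB = [model(r) for r in B]
    mean = sum(yA + yB) / (2 * n)
    var = sum((v - mean) ** 2 for v in yA + yB) / (2 * n)
    S, ST = [], []
    for i in range(k):
        # AB_i: rows of A, except column i, which is taken from B.
        yABi = [model(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        # Saltelli estimator of the first-order index.
        S.append(sum(yb * (yi - ya) for yb, yi, ya in zip(yB, yABi, yA)) / n / var)
        # Jansen estimator of the total-effect index.
        ST.append(sum((ya - yi) ** 2 for ya, yi in zip(yA, yABi)) / (2 * n) / var)
    return S, ST

S, ST = sobol_indices(toy_model, k=2, n=4096)
```

For this additive model the indices coincide (S_i = ST_i); a gap between them would signal interactions between parameters, which is precisely what the variance-based methods discussed above are designed to detect.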
A temporal significance modelling of parameters is also performed by Alam et al. [111]. Their approach is based on a DOE strategy using an orthogonal array to ensure a good spread of the selected points in the parameter space, which would not be possible to obtain with a random design.
A very detailed ABM is implemented in Troost and Berger [112] to assess the effectiveness of certain farm management activities (essentially shifts in crop management time slots) in improving farmers' response to climate change. They describe, document and apply a very in-depth parameter uncertainty analysis pinpointing the parameters with which an SA must be performed. Interestingly, the method distinguishes between parameters that are relevant only in simulations with a short-term time horizon and parameters that are influential in long-term simulations (although some are influential in both cases). They perform model calibration using observed land use data and use a modified scheme of Latin hypercube sampling (unbiased permuted column sample) to cover the parameter space for the parameters that remained after calibration. Their model was applied in a number of case studies, with a number of farm agents spanning from a few hundred to almost 35,000 (see https://mp-mas.uni-hohenheim.de/).
A MC-type approach is also followed by Busch et al. [82], who realize multiple model runs of four scenarios designed via a participatory companion-modelling process with a group of stakeholders engaged in heat network projects. A MC approach is also followed by Bell et al. [113], where 26 variables are drawn from uniform distributions. To rank the variables in order of the importance of their effect on the outcomes of the MC exercise, they used a machine learning technique (namely, the Random Forest Classifier), which implicitly takes into account the interactions between variables.
Parry et al. [114] uses a Bayesian approach to identify sensitive variables. They build an emulator of the model using a Gaussian process. An efficient space-filling methodology is used to select training input points in a 10-dimensional input space, achieving a dramatic reduction in the computation time compared to MC sampling: the GSA is carried out using only 200 model runs, as opposed to the thousands of runs required by a MC analysis [115].
Ligmann-Zielinska and Jankowski [116] apply a variance-based GSA using Sobol's quasi-random design procedure for exploring the input parameter space and calculate the traditional first order sensitivity and total-effect sensitivity indices.
The issue of uncertainty analysis has been addressed very little in the framework of coupled LCA-ABM models. A thorough review of the topic can be found in Baustert and Benetto [30], where the uncertainty sources for such coupled models (independently of the direction of the information flow between the two components) have been classified into four types: parameter uncertainty; uncertainty due to choices; structural uncertainty; and systemic variability. For each of these uncertainty sources, Baustert and Benetto [30] gives concrete examples with reference to the model presented by Marvuglia et al. [41]. An important issue mentioned in Baustert and Benetto [30] is the importance of uncertainty propagation and the difficulty of performing it in an ABM/LCA coupled model. Uncertainty quantification in LCA studies, from the different sources separately (parameters, data, modeler's assumptions and the model itself), is a well-explored topic (a very comprehensive summary is given in Igos et al. [117]), but only a few studies treat the simultaneous propagation of uncertainty in LCA from all sources [118,119,120].
Agricultural systems are in essence complex systems, as they have multiple scales of interactions, are strongly influenced by human decision-making and include feedback with natural ecosystems [67].
Many ABMs rely on heuristic rules or single-objective optimization models to determine agents' actions. Consequently, agents are often assumed to act in an economically rational way [121,122,123]. Models of this kind, such as the agricultural model AgriPoliS [58], have substantial explanatory power, for instance, in describing the evolution of farms or entrepreneurs in competitive market settings.
However, human behavior is often not totally rational, due to limited knowledge and information about exogenous factors or because of personal preferences and beliefs. It is important to note that the very concept of rationality can be subject to different interpretations and described using different theories [124,125]. Therefore, in this context, we equate "rational behavior" or a "rational decision" with a decision geared towards maximizing utility. In a sense close to the homo economicus model of neoclassical economics, utility can be identified with economic profit, although in a broader sense it could also include other aspects, such as well-being. In this respect, humans can act as a result of "individual cognitive deliberation", putting into practice (with deliberate actions) behaviors that they consider worth pursuing, even though these might go against "rationality" (i.e., against the maximization of the recognized utility).
In this respect, culture and traditions, as well as peer influence, may play an important role in land-use decisions, and information biases may limit knowledge about market developments and technology trends.
Owing to this plethora of social and human factors, humans employ a number of strategies in land-use decision-making that go beyond the maximization of profits and opportunity cost and risk minimization [121,126]. This is the reason why optimization approaches based exclusively on microeconomic theory are not suitable for capturing the complexity, uncertainty, heterogeneity, and bounded rationality of human behavior [40], and approaches based on ABMs have gained increasing attention in the land-use change literature [127,128] and in agricultural modelling [123,129,130,131,132].
A review of multi-agent simulation models in agriculture, including optimization, is provided in [133]. The authors suggest that a two-part model (including both a multi-agent sub-model and a cellular automata sub-model) is the most appropriate for the agricultural sector. The variables can therefore be divided into those linked to specific locations, and thus assigned to land in the cellular automata sub-model, and those linked to decision-making, and thus assigned to agents in the multi-agent sub-model.
In this solution, the two parts still remain separated, therefore one cannot really think about "spatial agents". A special issue of the journal Environmental Modelling & Software is devoted to the topic of spatial ABMs for socio-ecological systems [100]. However, in none of the papers in this issue can one actually see a fully-fledged spatially differentiated ABM, with geographical agents. The highest level of spatial differentiation is attained in Marohn et al. [134], which assesses low-cost soil conservation strategies for highland agriculture, using an AB modelling approach to couple two software packages that simulate soil, water and plant dynamics, and farm decision-making. Through the coupling, the authors manage to obtain the same level of spatial differentiation as the digital elevation model of the studied territory.
A generic ABM to model land-use decisions and consequent energy consumption and pollution dynamics is presented in [59]. The authors justify their choice of AB modelling over other spatial modelling tools by asserting that AB modelling can better support the explicit representation of socio-economic, political and natural processes in space and time and their reciprocal feedback mechanisms. The scenarios modelled are however highly stylized (no GIS data used; zoning represented as concentric circles) "for the purposes of illustrating the applicability of this integrated framework to urban policy assessments" [59].
In Wise and Crooks [62], the spatial differentiation aspect is brought to a more advanced level and the model described by the authors is empirically grounded, reaching a very realistic model of a complex socio-physical system. The model is a spatially explicit ABM programmed in Java, and based on the use of GIS maps; it consists of several modules that describe the physical, economic, and social processes which have an impact on land-use patterns. The model, which comprises a GUI, includes a number of parameters which can be adjusted to suit the underlying assumptions of the researcher.
Astier et al. [60] uses AB modelling together with role-play games to support participatory processes. The model the authors describe is characterized by a very user-friendly interface with appealing pictorial data displays, which facilitates stakeholders' participation; however, the GIS-based representation of the ABM is missing.
Murray-Rust et al. [61] presents a new ABM framework that allows different factors to be explored, such as: how the relative influence of social factors versus economic and environmental factors steers the on-farm crop rotation selection and the provision of ecosystem services; the degree of subsidy adoption among farm households and its level of effectiveness; and the conditions under which marginal land is taken out of (or put into) agricultural production. The landscape is modelled as a collection of parcels, which represent the minimal spatial unit at which decisions are made (e.g., a farmer's field). On each parcel, three attributes are present: (i) land cover, (ii) management practices and (iii) regimes (which are multi-year management sequences applied to individual parcels). Although the paper does not explicitly present maps, the approach described represents a very advanced modelling framework, combining advances in ABM in land use, a detailed multilevel handling of the temporal dimension, a diverse socio-economic context, and ecosystem service modeling.
As shown in the paper, ABMs have proven to be a very powerful tool in several areas of sustainability assessment. They are particularly adapted for representing complex systems and detecting emerging properties arising from the interaction of their individual actors (the agents). The paper presents a literature review of the application of ABMs in the sustainability assessment field, and in particular, in the LCA field, with special attention to agricultural and land-use modelling.
The paper gives an overview of the different elements (the "modelling blocks") one needs to take into account to build an ABM, from general issues that hold regardless of the domain of application of the ABM model (such as validation, sensitivity analysis, agent definition, data provision) to issues that are more relevant for agricultural and land-use modelling (such as spatial data representation).
The final aim of the paper is to provide elements to justify the use and facilitate the implementation of ABMs to build a consistent LCI for a (consequential) LCA.
One of the advantages of using a bottom-up model (such as an ABM) in LCI building is the possibility to bring the modeling to a level of granularity that allows an assessment more adapted to the local context one is dealing with (i.e., which reproduces a local situation, instead of being based on generic LCI data, ideally covering all three spheres: environmental, economic and social). It is however worth noting that, as observed by Ligmann-Zielinska and Jankowski [116], if the aim of the modeler is to explore the effects of different policies on individual decision-making, then ABM is actually complementary to top-down land-use allocation modeling. In fact, the use of an ABM helps in designing policies that do not consider regional development objectives exclusively, but also take into account individual goals and behavioral drivers. The same concept is also advanced by Murray-Rust et al. [135], who observe that ABMs are generally limited to small spatial scales, while top-down land allocation models can be applied at large spatial scales, but fall short in accounting for human behavior and non-economic factors (e.g., the supply of ecosystem services). For this reason, they claim that models combining the advantages of the two approaches are necessary for the advancement of land-use science.
From our perspective, one of the major priorities one has to bear in mind when embarking on the implementation of an ABM of an agricultural system is the collection of farm-specific data. This issue holds true at two levels: at the level of the property and technical activities and at the level of farmers' personal thinking and behavior tendencies.
At the first level, the modeler needs data on the crops, meat and milk production, crop prices, land rental costs, time elapsed since the beginning of the rental lease contract, etc. If one models a real, dynamic market, not only for the crops, but for the land as well, then the duration of the land lease contract has an influence on the model because it determines the moment in time when the market can experience variations on the distributions of land among farmers.
At the level of farmers' behavior, the modeler needs data on the level of social interaction, risk aversion, and familiarity with a certain technology or trend (e.g., switchgrass cultivation). These data are difficult to obtain and, although surveys can always be conducted, the questions included in the surveys have to be carefully formulated in order to prevent at least two risks: (1) the risk of asking for redundant information, or information that does not allow a proper estimation of the "personal" farmer variable one wishes to estimate; (2) the risk of obtaining biased information because the farmer answers the questions in an inaccurate way that does not reflect his/her real thinking.
In order to limit these risks, the survey has to be designed together with experts in the domain (in this case, agriculture), who will orient the design of the survey towards the most relevant questions, and preferably also with experts on survey design (e.g., from the statistics area), who will insert some "double-check" questions at strategic points in the questionnaire, which aim to verify whether the responder is answering similar or closely related questions in a contradictory way. It is however worth noting that data collection via surveys is normally resource- and time-consuming. If these resources are not available to the modeler's team, a common approach is to use aggregated statistics (such as census data). The agents' attributes can then be assigned arbitrarily [38,45] or by importing the joint distribution of agents' attributes from the available empirical data source [46].
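Importing the joint distribution of agents' attributes from an empirical source can be as simple as bootstrap-resampling whole records, which, unlike drawing each attribute independently, preserves the correlations between attributes. The survey records below are invented purely for illustration.

```python
import random

# Hypothetical survey records (in practice these would come from census or
# farm-survey microdata). Sampling whole rows preserves the empirical joint
# distribution, i.e., the correlations between attributes.
survey = [
    {"farm_size_ha": 12, "risk_aversion": 0.8, "adopter": False},
    {"farm_size_ha": 85, "risk_aversion": 0.3, "adopter": True},
    {"farm_size_ha": 40, "risk_aversion": 0.5, "adopter": False},
]

def create_agents(n, records, seed=0):
    """Initialize n agents by bootstrap-resampling empirical records."""
    rng = random.Random(seed)
    return [dict(rng.choice(records)) for _ in range(n)]

agents = create_agents(100, survey)
```

The arbitrary-assignment alternative mentioned above would instead draw each attribute from a separate (assumed) marginal distribution, losing, for example, the link between farm size and risk aversion visible in the records.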
Another data-related issue concerns the knowledge of the exact spatial location of the farms. If this information is missing, the results of the analysis, however interesting, lose a considerable part of their informative power, because they cannot be used to implement location-specific corrective actions. Let us suppose, as an example, that the results of the AB simulations are used to run an LCA whose results highlight high impacts in the "terrestrial acidification" category. Since terrestrial acidification is a site-specific impact, correction actions cannot be properly implemented if the location where the crops with high acidification impacts are grown is not known. Knowing farm-specific data, together with farm locations, normally conflicts with data confidentiality issues and is therefore a very difficult problem to solve. In the absence of this information, a common approach is to assign areas of land (patches) randomly on a grid to represent farms [38].
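The random patch-assignment fallback can be sketched as follows; the grid size and number of farms are arbitrary, and the round-robin allocation over shuffled cells is just one simple way to partition the grid.

```python
import random

def assign_patches(width, height, n_farms, seed=0):
    """Randomly partition a width x height grid of land patches among
    n_farms, as a simple stand-in when real farm locations are unknown."""
    rng = random.Random(seed)
    cells = [(x, y) for x in range(width) for y in range(height)]
    rng.shuffle(cells)
    farms = {f: [] for f in range(n_farms)}
    for i, cell in enumerate(cells):
        farms[i % n_farms].append(cell)  # round-robin over shuffled cells
    return farms

farms = assign_patches(20, 20, n_farms=8)
```

Note that such synthetic layouts support the simulation mechanics but, as argued above, cannot support site-specific conclusions (e.g., on terrestrial acidification), since the patches bear no relation to real farm locations.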
One needs to take into account that the final aim of ABMs coupled with LCA is to produce scenarios to inform the LCI. The closer the ABM is to the rwDGP, the more realistic the information transferred to the LCA phase; however, if this is not complemented by a suitable collection of the environmental information (LCI data), the real advantage of this coupling is in principle cancelled out.
In addition to this, it is important to address once again the issue of model validation and uncertainty, referring this time to the LCA model. We can in fact observe that the validation of the LCA run on the results obtained from the ABM is a separate issue from the validation of the ABM. Validating the LCA model per se is actually not possible, in the sense that with the LCA, one computes "potential" environmental impacts (on humans and on ecosystems) that cannot be compared with the "actual" impacts, because these are simply not known. The validity of the LCA results rests upon the validity (based on scientific consensus) of characterization models applied in Life Cycle Impact Assessment (LCIA), which are very difficult to validate. Their validity "is typically based on their derivation from scientifically peer reviewed and accepted environmental models which are adopted to operate within the restrictions posed by the boundary conditions of LCA" [136].
A similar line of reasoning concerns the uncertainty by which the results of the coupled ABM-LCA model are affected. They certainly carry the uncertainty of the ABM data and assumptions (e.g., on risk aversion, crop prices, level of social interaction, etc.), but also the uncertainty of the LCI data. As for the ABM model, and also for the LCA model, SA can be used to study the robustness of results and their sensitivity to uncertainty factors.
This paper especially focused on placing the ABM approach in the context of its support to LCA. As discussed in the paper, we must however observe that, despite ABMs still being regarded with some skepticism compared with more established approaches (such as those grounded in economics), solutions based on complex system simulations are starting, to some extent, to be influential in policymaking. Participatory modeling processes can certainly boost the acceptance of ABMs among stakeholders and decision-makers, but practical, user-friendly tools that allow non-expert users to run scenario simulations are clearly still lacking.
The project MUSA (MUlti-agent Simulation for consequential Life Cycle Assessment of Agrosystems, C12/SR/4011535) was financed by Luxembourg's National Research Fund (FNR), which the authors gratefully acknowledge.
All authors declare no conflicts of interest in this paper.
Terms used interchangeably for the different types of validation:

| Reference | Conceptual validation | Internal validation | External validation | Cross-model validation |
|-----------|-----------------------|---------------------|---------------------|------------------------|
| [73] | Conceptual validation | Internal validation | External validation | Cross-model validation |
| [65] | -- | -- | Cross-validation** | -- |
| [80] | -- | -- | Empirical validation | -- |
| [74] | -- | Analytical adequacy | Ontological adequacy | -- |
| [77] | Ex-ante validation | -- | Empirical validation | -- |
| [76] | Conceptual validation | Internal validation | Operational validation | -- |
| [52] | -- | -- | -- | Cross-model validation* |
| [79] | -- | -- | Empirical validation | -- |

Note: *between two different models; **between the model results and macro-level statistical data and the micro-level qualitative data.