CoMSES Net maintains cyberinfrastructure to foster FAIR data principles for access to and (re)use of computational models. Model authors can publish their model code in the Computational Model Library with documentation, metadata, and data dependencies, supporting these FAIR principles as well as best practices for software citation. Model authors can also request that their model code be peer reviewed to receive a DOI. All users of models published in the library must cite the model authors when they use and benefit from their code.
CoMSES Net also maintains a curated database of over 7,500 publications of agent-based and individual-based models, with additional metadata on code availability and bibliometric information on the landscape of ABM/IBM publications, which we welcome you to explore.
The TERROIR agent-based model was built for multi-level analysis of biomass and nutrient flows within agro-sylvo-pastoral villages in West Africa. It explicitly takes into account both the human organization and the spatial extent of such flows.
This article presents an agent-based model of an Italian textile district in which thousands of small firms specialize in particular phases of fabric production. It reconstructs the web of communication between firms as they arrange production chains. In turn, production chains generate road traffic between the geographical areas across which the district extends. The reconstructed traffic exhibits a pattern that has been observed, but not foreseen, by policy makers.
Genetic algorithms try to solve a computational problem by following some principles of organic evolution. This model has educational purposes; it can give us an answer to the simple arithmetic problem of how to find the highest natural number composed of a given number of digits. We approach the task using a genetic algorithm, in which the possible answers to the problem are represented by agents, which in the NetLogo programming environment are usually known as “turtles”.
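The same approach can be sketched outside NetLogo. The following Python sketch is not the model's code — the population size, mutation rate, and truncation-selection scheme are illustrative assumptions — but it evolves digit-string "genomes" toward the largest number with a given count of digits in the way the description outlines:

```python
import random

def evolve(num_digits=5, pop_size=30, generations=200, seed=42):
    """Minimal genetic algorithm: each agent carries a list of digits;
    fitness is the number those digits encode (optimum: all nines).
    Parameters are illustrative, not taken from the NetLogo model."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 9) for _ in range(num_digits)]
           for _ in range(pop_size)]
    fitness = lambda g: int("".join(map(str, g)))
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(parents) + len(children) < pop_size:
            mom, dad = rng.sample(parents, 2)
            cut = rng.randrange(1, num_digits)  # one-point crossover
            child = mom[:cut] + dad[cut:]
            if rng.random() < 0.2:              # occasional point mutation
                child[rng.randrange(num_digits)] = rng.randint(0, 9)
            children.append(child)
        pop = parents + children
    return fitness(max(pop, key=fitness))

print(evolve())  # best value found after evolution
```

Because the top half of the population is carried over unchanged, the best fitness never decreases; mutation and crossover supply the variation that selection then accumulates.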
Dawkins’ Weasel is a NetLogo model that illustrates the principle of evolution by natural selection. It is inspired by a thought experiment presented by Richard Dawkins in his book The Blind Watchmaker (1996).
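The thought experiment itself is easy to reproduce. Here is a minimal Python sketch of the cumulative-selection ("weasel") algorithm — the offspring count and per-character mutation rate are illustrative choices, not values taken from the NetLogo model:

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def weasel(offspring=100, mutation_rate=0.05, seed=1):
    """Cumulative selection: each generation, the best-matching mutant
    of the current string becomes the parent of the next generation."""
    rng = random.Random(seed)
    score = lambda s: sum(a == b for a, b in zip(s, TARGET))
    parent = "".join(rng.choice(CHARS) for _ in TARGET)
    generations = 0
    while parent != TARGET:
        mutants = [
            "".join(c if rng.random() > mutation_rate else rng.choice(CHARS)
                    for c in parent)
            for _ in range(offspring)
        ]
        parent = max(mutants, key=score)
        generations += 1
    return generations

print(weasel())  # generations needed to reach the target
```

The point of the experiment is that selection acting on retained intermediate results reaches the target in tens of generations, whereas pure random typing would take astronomically long.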
This model simulates an economy with stable macro signals that works as a benchmark for studying the effects of agent activities, e.g. extortion, in support of the elaboration of public policies.
This model was built to estimate the impacts of exogenous fodder input and credit loan services on the livelihood, rangeland health, and profits of pastoral production in a smallholder pastoral household in the arid steppe rangeland of Inner Mongolia, China. The model simulates the long-term dynamics of herd size and structure, forage demand and supply, cash flow, and loan debt under three different stocking strategies: (1) no external fodder input, (2) fodder imported only when a natural disaster occurs, and (3) frequent import of external fodder, each with different amounts of available credit loans. A Monte Carlo method was used to address the influence of climate variability.
Demand planning requires processing of distributed information. In this process, individuals, their properties and interactions play a crucial role. This model is a computational testbed to investigate these aspects with respect to forecast accuracy.
B3GET Classic includes previous versions used in the classroom and for publication. Please check out the latest version of B3GET here, which has several user-friendly features such as directly importing and exporting genotype and population files.
The classic versions of B3GET include:
- version one was, and version three is currently, used in undergraduate labs at the University of Minnesota to demonstrate principles in primate behavioral ecology;
- version two first demonstrated proof of concept for creating virtual biological organisms using decision-vector technology;
- version four was presented at the 2017 annual meeting of the American Association of Physical Anthropologists;
- version five was presented in a 2019 publication in the Journal of Human Evolution (Crouse, Miller, and Wilson, 2019).
The current rate of production and consumption of meat poses a problem both to people's health and to the environment. This work develops a simulation of people's meat consumption behaviour in Britain using agent-based modelling. The agents represent individual consumers. The key variables that characterise agents include sex, age, monthly income, perception of the cost of living, and concerns about the impact of meat on the environment, health, and animal welfare. A process of peer influence is modelled with respect to the agents' concerns. Influence spreads across two eating networks (i.e. co-workers and household members) depending on the time of day, the day of the week, and an agent's employment status. Data from a representative sample of British consumers is used to empirically ground the model. Different experiments simulate interventions: social marketing campaigns and a rise in the price of meat. The main outcome is the average weekly consumption of meat per consumer; a secondary outcome is the likelihood of eating meat.
EffLab was built to support the study of the efficiency of agents in an evolving complex adaptive system. In particular:
- There is a definition of efficiency used in ecology, and an analogous definition widely used in business. In ecological studies it is called EROEI (energy returned on energy invested), or, more briefly, EROI (pronounced E-Roy). In business it is called ROI (dollars returned on dollars invested).
- In addition, there is the better-known definition of efficiency first described by Sadi Carnot and widely used by engineers. It is usually represented by the Greek letter η (eta). These two measures of efficiency bear a peculiar relationship to each other: EROI = 1 / (1 - ETA)
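As a quick numerical check of the stated relationship (the helper names below are illustrative, not part of EffLab):

```python
def eroi_from_eta(eta: float) -> float:
    """EROI implied by a Carnot-style efficiency, per EROI = 1 / (1 - ETA)."""
    if not 0.0 <= eta < 1.0:
        raise ValueError("eta must lie in [0, 1)")
    return 1.0 / (1.0 - eta)

def eta_from_eroi(eroi: float) -> float:
    """Inverse relation: ETA = 1 - 1 / EROI."""
    if eroi < 1.0:
        raise ValueError("EROI below 1 means a net energy loss")
    return 1.0 - 1.0 / eroi

# A few points along the curve: break-even, midpoint, and a high-gain case.
for eta in (0.0, 0.5, 0.9):
    print(eta, eroi_from_eta(eta))
```

Note how EROI diverges as ETA approaches 1: small gains in Carnot efficiency translate into large gains in energy return.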
In EffLab, blind seekers wander through a forest looking for energy-rich food. In this multi-generational world, they live and reproduce, or die, depending on whether they can find food more effectively than their contemporaries. Data is collected to measure their efficiency as they evolve more effective search patterns.
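EffLab's NetLogo implementation is not reproduced here, but the measurement idea can be sketched in a few lines of Python: a single blind seeker random-walks on a toroidal grid, and its EROI is the energy found divided by the energy spent moving. All parameters and names below are illustrative assumptions, not EffLab's:

```python
import random

def forage(steps=1000, food_density=0.05, food_energy=10.0,
           move_cost=1.0, grid=50, seed=7):
    """One 'blind seeker' wanders a toroidal grid scattered with food.
    Returns its EROI: energy gained / energy invested in movement.
    Parameters are illustrative, not taken from EffLab."""
    rng = random.Random(seed)
    food = {(x, y) for x in range(grid) for y in range(grid)
            if rng.random() < food_density}
    x = y = grid // 2
    gained = spent = 0.0
    for _ in range(steps):
        dx, dy = rng.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        x, y = (x + dx) % grid, (y + dy) % grid   # toroidal wrap
        spent += move_cost
        if (x, y) in food:                        # eat any food found here
            food.remove((x, y))
            gained += food_energy
    return gained / spent

print(forage())  # EROI of one random-walking seeker
```

In the full multi-generational setting, seekers whose search patterns yield a higher EROI than their contemporaries' would reproduce preferentially, which is the selection pressure the model description refers to.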