Our mission is to help computational modelers develop, document, and share their computational models in accordance with community standards and good open science and software engineering practices. Model authors can publish their model source code in the Computational Model Library with narrative documentation as well as metadata that supports open science and emerging norms that facilitate software citation, computational reproducibility / frictionless reuse, and interoperability. Model authors can also request private peer review of their computational models. Models that pass peer review receive a DOI once published.
All users of models published in the library must cite model authors when they use and benefit from their code.
Please check out our model publishing tutorial and feel free to contact us if you have any questions or concerns about publishing your model(s) in the Computational Model Library.
We also maintain a curated database of over 7500 publications of agent-based and individual-based models with detailed metadata on availability of code and bibliometric information on the landscape of ABM/IBM publications that we welcome you to explore.
Displaying 10 of 90 results for "Caroline Schill"
InformalCity, a spatially explicit agent-based model, simulates an artificial city and allows for testing configurations of urban upgrading schemes in informal settlements.
This is a NetLogo version of Buhl et al.’s (2005) model of self-organised digging activity in ant colonies. It was built for a master’s course on self-organisation, and its intended use is still educational. The ants’ behaviour can easily be changed by toggling switches on the interface; for more advanced students, R code is included that allows the model to be run and analysed through RNetLogo.
Micro-targeted vs stochastic political campaigning agent-based model simulation. Written by Toby D. Pilditch (University of Oxford, University College London), in collaboration with Jens K. Madsen (University of Oxford, London School of Economics)
The purpose of the model is to explore the various impacts on voting intention among a population sample when both stochastic (traditional) and micro-targeted campaigns (MTCs) are in play. There are several stages of the model: initialization (setup), campaigning (active running protocols), and vote-casting (end of simulation). The campaigning stage consists of update cycles in which “voters” are targeted and “persuaded”, updating their beliefs in the campaign candidate/policies.
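Read loosely, those three stages might be sketched as follows in Python; the Voter class, the targeting rule, and all parameter values here are hypothetical illustrations rather than the published model’s actual protocols:

```python
import random

# Minimal, hypothetical sketch of the three stages described above
# (setup, campaigning, vote-casting). Names and values are illustrative only.

class Voter:
    def __init__(self):
        # Belief in the campaign candidate/policies, in [0, 1].
        self.belief = random.uniform(0.0, 1.0)

    def persuade(self, strength):
        # Nudge belief towards 1 by a fraction of the remaining gap.
        self.belief += strength * (1.0 - self.belief)

def setup(n_voters=1000):
    return [Voter() for _ in range(n_voters)]

def campaign(voters, cycles=20, micro_targeted=True, reach=0.1, strength=0.05):
    n_targets = int(reach * len(voters))
    for _ in range(cycles):
        if micro_targeted:
            # MTC: target the voters assumed most open to persuasion (mid-range beliefs).
            targets = sorted(voters, key=lambda v: abs(v.belief - 0.5))[:n_targets]
        else:
            # Stochastic (traditional) campaign: reach a random sample of voters.
            targets = random.sample(voters, n_targets)
        for v in targets:
            v.persuade(strength)

def cast_votes(voters):
    # End of simulation: a voter votes for the candidate if belief exceeds 0.5.
    return sum(v.belief > 0.5 for v in voters)

voters = setup()
campaign(voters, micro_targeted=True)
print("Votes for candidate:", cast_votes(voters))
```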
This is the full repository for running the survival analysis (in R) and the population viability model and its analysis (NetLogo + R) of the Northern Bald Ibis (NBI) presented in the study
On the road to self-sustainability: Reintroduced migratory European Northern Bald Ibises (Geronticus eremita) still need management interventions for population viability
by Sinah Drenske, Viktoriia Radchuk, Cédric Scherer, Corinna Esterer, Ingo Kowarik, Johannes Fritz, Stephanie Kramer-Schadt
…
This model implements two types of network diffusion from an initial group of activated nodes. In complex contagion, a node is activated if the proportion of neighbour nodes that are already activated exceeds a given threshold. This is intended to represent the spread of health behaviours. In simple contagion, an activated node has a given probability of activating its inactive neighbours and re-tests each time step until all of the neighbours are activated. This is intended to represent information spread.
A range of networks derived from secondary school friendship networks are included with the model. The proportion of nodes initially activated and the method of selecting those nodes are controlled by the user.
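As a rough illustration only (not the published model’s code), the two activation rules can be sketched in a few lines of Python; the networkx graph generator, threshold, and probability values below are assumptions chosen for the example:

```python
import random
import networkx as nx  # assumed available; used only to build an example network

def complex_contagion(G, seeds, threshold=0.25, max_steps=50):
    """Activate a node once the fraction of its active neighbours reaches `threshold`."""
    active = set(seeds)
    for _ in range(max_steps):
        newly_active = set()
        for node in G.nodes():
            if node in active:
                continue
            neighbours = list(G.neighbors(node))
            if neighbours and sum(n in active for n in neighbours) / len(neighbours) >= threshold:
                newly_active.add(node)
        if not newly_active:
            break
        active |= newly_active
    return active

def simple_contagion(G, seeds, p=0.1, max_steps=50):
    """Each step, every active node tries to activate each inactive neighbour with probability p."""
    active = set(seeds)
    for _ in range(max_steps):
        for node in list(active):
            for neighbour in G.neighbors(node):
                if neighbour not in active and random.random() < p:
                    active.add(neighbour)
        if len(active) == G.number_of_nodes():
            break
    return active

# Illustrative run on a small-world graph standing in for a friendship network.
G = nx.watts_strogatz_graph(n=200, k=6, p=0.05, seed=1)
seeds = random.sample(list(G.nodes()), 10)
print("complex contagion reached:", len(complex_contagion(G, seeds)))
print("simple contagion reached:", len(simple_contagion(G, seeds)))
```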
Default initial skill; read the ODD for more information. The purpose of the model presented by Salau is to study the ‘player profit vs. club benefit’ dilemma present in professional soccer organizations.
The Labour Markets and Ethnic Segmentation (LaMESt) Model is a model of a simplified labour market, where only jobs of the lowest skill level are considered. Immigrants of two different ethnicities (“Latino”, “Asian”) compete with a majority (“White”) and minority (“Black”) native population for these jobs. The model’s purpose is to investigate the effect of ethnically homogeneous social networks on the emergence of ethnic segmentation in such a labour market. It is inspired by Waldinger & Lichter’s study of immigration and the social organisation of labour in 1990s Los Angeles.
Although beneficial to scientific development, data sharing is still uncommon in many research areas. Various organisations, including funding agencies that endorse open science, aim to increase its uptake. However, estimating the large-scale implications of different policy interventions on data sharing by funding agencies, especially in the context of intense competition among academics, is difficult empirically. Here, we built an agent-based model to simulate the effect of different funding schemes (i.e., highly competitive large grants vs. distributive small grants) and of varying intensities of incentives for data sharing on the uptake of data sharing by academic teams strategically adapting to the context.
The St Anthony flu model is an epidemiological model designed to test hypotheses related to the spread of the 1918 influenza pandemic among residents of a small fishing community in Newfoundland and Labrador. The 1921 census data from Newfoundland and Labrador are used to ensure a realistic model population; the community of St. Anthony, NL, located on the tip of the Northern Peninsula of the island of Newfoundland, is the specific population modeled. Model agents are placed on a map-like grid that consists of houses, two churches, a school, an orphanage, a hospital, and several boats. They engage in daily activities that reflect known ethnographic patterns of behavior in St. Anthony and other similar communities. A pathogen is introduced into the community and then spreads throughout the population as a consequence of individual agent movements and interactions.
The purpose of the simulation was to explore and better understand the process of bridging between an analysis of qualitative data and the specification of a simulation. This may be developed for more serious purposes later, but at the moment it is merely an illustration.
This exercise was done by Stephanie Dornschneider (School of Politics and International Relations, University College Dublin) and Bruce Edmonds to inform the discussion at the Lorentz workshop on “Integrating Qualitative and Quantitative Data using Social Simulation” at Leiden in April 2019. The qualitative data was collected and analysed by SD. The model specification was developed as the result of discussion by BE & SD. The model was programmed by BE. This is described in a paper submitted to Social Simulation 2019 and (to some extent) in the slides presented at the workshop.