# Gaining Participation
In comparison to many other scholarly fields, economics has been a desert for open-source software. During this session we want to present some of our ideas, and solicit yours, about how to ensure that our project becomes an oasis in that desert.
Preliminary ideas include:
- Sponsoring "prizes" for graduate (or even undergraduate) students who contribute new modules to the open-source toolkit;
- Exploiting the fact that journals increasingly are (a) requiring scholars to make code archives available for papers that they publish but (b) generally not offering any useful resources to help construct, format, or share such archives. At a minimum, one could imagine our project offering a permanent home to any archive that uses code derived from the website. More ambitiously, one could imagine collaboration with journals that might designate our web resource as a "preferred" location for such archives, following particular quality standards;
- Providing standardized "virtual machines" on which any code from the archive is guaranteed to run, and on which a standard suite of open computational tools has been installed.
We have a number of other ideas, but the purpose of this session is to solicit input from conference participants on how to make this work. Specifically, what would motivate *them* to contribute?
# E-Publication
The Journal of Statistical Software (JSS) has been extremely successful in harnessing the power of peer review and citation to drive open collaboration and scientific code development. Code packages are often cross-posted to the Comprehensive R Archive Network (CRAN), with descriptions and examples in "vignettes." Both the code and the papers are peer-reviewed and citable, ensuring that the regular engines of academic production drive development there just as they do in traditional publication.
We are not yet quite as ambitious. One simple, low-cost way of implementing a similar system is a "computational letters" page at the back of a leading quantitative methods journal, with perhaps 1-3 entries on a single page. Each entry would be simply a title, an abstract, and a link to a permanent GitHub academic archive page, which might be an IPython or IJulia notebook containing both math and code. If we achieve that, code contributions would be citable in the traditional way. A simple, standard unit-testing and code-documentation framework can ease the burden on reviewers.
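To make the proposed entry format concrete, here is a minimal sketch of the metadata a single letter might carry. The field names, title, and URL are purely illustrative, not a fixed schema:

```python
# Hypothetical metadata for one "Computational Letters" entry.
# All field names and values here are illustrative, not a standard.
letter_entry = {
    "title": "An Endogenous-Gridpoints Solver for a Buffer-Stock Model",
    "abstract": "A one-paragraph, peer-reviewed summary of the method.",
    # Link to the permanent notebook archive (gist or repository).
    "archive_url": "https://gist.github.com/<user>/<gist-id>",
}
```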
Using current off-the-shelf technology, we believe we could launch a "Computational Letters" page very quickly, possibly in as little as a weekend. The things we would need are:
- A permanent place to store IPython notebooks, which include code, mathematics, images, tables and references.
  - Fulfilled by GitHub's academic archives option and the ability to post IPython notebooks as gists. See the following for additional information:
    - Writing Scholarly Notebooks
    - Quickly sharing notebooks as gists
    - Gallery of IPython Notebooks
  - To Do: Confirm that one can create a scholarly, viewable IPython notebook in a GitHub repository or gist, which one can then archive as discussed above. Ideally this process, start to finish, would be simple enough to document in a short guide.
- A template for what a "scholarly notebook" should look like: math, citations, code examples. See the example noted above and the sketch after this list.
- A standard for unit testing and code documentation. This may be considered optional initially but should become standard at some point.
  - We intend for the code to be peer-reviewed as well as the content of the vignette. To ease the burden of peer review of code, which can be painful and time-intensive, we should require both unit tests and documentation to follow a style guide; a sketch of such a submission appears after the next paragraph.
  - Examples of documentation style guides can be found in a few prominent sources, such as Python's PEP 8, the Google Python Style Guide, or either used in combination with Sphinx.
  - Enforcing a style on the entire code layout may also be considered, but may be too far-reaching.
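As a proof of concept for the first item, here is a minimal sketch, assuming the `nbformat` and `requests` packages and a GitHub personal access token in the `GITHUB_TOKEN` environment variable, of building a small scholarly notebook programmatically and posting it as a public gist through GitHub's REST API. The notebook content and gist description are placeholders.

```python
import json
import os

import nbformat
import requests

# 1. Assemble a minimal "scholarly notebook": title, math, and code.
nb = nbformat.v4.new_notebook()
nb.cells = [
    nbformat.v4.new_markdown_cell(
        "# A Computational Letter\n\n"
        "**Abstract.** One-paragraph summary of the contribution.\n\n"
        "The consumption Euler equation is "
        "$u'(c_t) = \\beta R \\, u'(c_{t+1})$."
    ),
    nbformat.v4.new_code_cell(
        "beta, R = 0.96, 1.03\n"
        "print('discounted gross return:', beta * R)"
    ),
]
nbformat.write(nb, "letter.ipynb")

# 2. Post the notebook as a public gist via the GitHub REST API.
with open("letter.ipynb") as f:
    payload = {
        "description": "Example computational letter (illustrative)",
        "public": True,
        "files": {"letter.ipynb": {"content": f.read()}},
    }
resp = requests.post(
    "https://api.github.com/gists",
    headers={"Authorization": "token " + os.environ["GITHUB_TOKEN"]},
    data=json.dumps(payload),
)
resp.raise_for_status()
print("Archived at:", resp.json()["html_url"])
```

The returned URL can then be rendered through nbviewer, giving a stable, viewable page to cite.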
The first item above can almost certainly be achieved with off-the-shelf tools, right now. The second can likely be achieved quickly by adapting an already-existing style guide. The final item exists in a number of forms; for example, Sargent and Stachurski's excellent quant-econ project already employs an established documentation and unit-testing framework.
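As a concrete illustration of what such a standard might require, here is a minimal sketch of a submission-style function with a NumPy/Sphinx-style docstring and an accompanying unit test. The function, field layout, and tolerances are illustrative, not a required API:

```python
import unittest


def present_value(payment, rate, periods):
    """Present value of a level payment stream.

    Parameters
    ----------
    payment : float
        Payment received at the end of each period.
    rate : float
        Per-period interest rate; must be positive.
    periods : int
        Number of payments.

    Returns
    -------
    float
        Discounted value of the stream at time zero.
    """
    if rate <= 0:
        raise ValueError("rate must be positive")
    return payment * (1 - (1 + rate) ** -periods) / rate


class TestPresentValue(unittest.TestCase):
    def test_single_period(self):
        # One payment of 105 at 5% interest is worth 100 today.
        self.assertAlmostEqual(present_value(105, 0.05, 1), 100.0)

    def test_rejects_nonpositive_rate(self):
        with self.assertRaises(ValueError):
            present_value(100, 0.0, 10)


if __name__ == "__main__":
    unittest.main()
```

Requiring every submitted function to ship with a docstring and at least one passing test in this style would give reviewers a mechanical first gate before they read the code itself.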
The technical details can be completely addressed via off-the-shelf, open-source solutions.
For further examples from the JSS:
- Discussion of code "vignettes" can be found here.
- An academic's page with many examples of code contributed as vignettes and snippets can be seen here; see in particular his software page for links to this work.
- A nice discussion of the current (very mature) process for submitting code to the JSS can be found in an editor's post here.
To be expanded. The following topics are key ideas for establishing incentives for contribution:
- "Google Summer of Code," for specific, sponsored computational economics projects. Target is particular model replication and module expansion.
- Creation of specific "artifacts" to include in performance reviews, such as planning targets, vignettes, citation and review as discussed above. Having specific, defined product targets ease this task considerably.
- "Bounties" for particular well-defined contributions from a sponsoring organization.
- A system of well-defined documentation and testing for submitted modules, such that anything which is accepted into the toolkit can clear a threshold of review. Ideally this will dovetail with journals' requirements for posting code.