The CDI sent out ballots for voting on November 30, 2018, with the voting open until December 14, 2018.
Of the 973 ballots distributed, 140 (about 14 percent) were returned.
The top 20 ranked projects were invited back for full proposals. They are listed here, alphabetically by PI last name.
Title | Principal Investigator (PI) | PI Organization |
---|---|---|
Open-source and open-workflow Climate Scenarios Toolbox for adaptation planning | Aparna Bamzai | USGS North Central Climate Adaptation Science Center |
Extending ScienceBase for Disaster Risk Reduction | Joe Bard | USGS Cascades Volcano Observatory |
Transforming Biosurveillance by Standardizing and Serving 40 Years of Wildlife Disease Data | David Blehert | USGS National Wildlife Health Center |
Integrating short-term climate forecasts into a restoration management support tool | John Bradford | USGS Southwest Biological Science Center |
National Public Screening Tool for Invasive and Non-native Aquatic Species Data | Wesley M. Daniel | USGS Wetland and Aquatic Research Center |
High-Resolution, Interagency Biosurveillance of Threatened Surface Waters in the United States | Sara L Eldridge | USGS Wyoming-Montana Water Science Center |
Develop Cloud Computing Capability at Streamgages using Amazon Web Services GreenGrass IoT Framework for Camera Image Velocity Gaging | Frank L. Engel | USGS Texas Water Science Center |
Serving the U.S. Geological Survey’s geochronological data | Amy Gilmer | USGS Geology and Environmental Change Science Center |
Establishing standards and integrating environmental DNA (eDNA) data into the USGS Nonindigenous Aquatic Species database | Margaret Hunter | USGS Wetland and Aquatic Research Center |
Subsidence Susceptibility Map for the Conterminous U.S. | Jeanne Jones | USGS Western Geographic Science Center |
A generic web application to visualize and understand movements of tagged animals | Ben Letcher | USGS Leetown Science Center |
Building a Roadmap for Making Data FAIR in the U.S. Geological Survey | Fran Lightsom | USGS Woods Hole Coastal and Marine Science Center |
Developing an Analytical Tool to Compare Hazard-related Crowdsourced and Citizen Science Data to Official Sources | Sophia B Liu | USGS Science and Decisions Center |
Coupling Hydrologic Models with Data Services in an Interoperable Modeling Framework | Richard McDonald | USGS Water Mission Area - Model Support and Coordination Branch |
Leveraging deep learning through use of the dl_tools software package to enhance wetland mapping capabilities of the NWI | David Millar | USGS Fort Collins Science Center |
Implementing a Grassland Productivity Forecast for the U.S. Southwest | Sasha Reed | USGS Southwest Biological Science Center |
Building web-service based forecasting tools for wildlife disease managers | Katie Richgels | USGS National Wildlife Health Center |
Water Security in U.S. Megacities: Building Decision Frameworks Beyond Water Management | Sachin Shah | USGS Texas Water Science Center |
ExDetect: a cloud-based remote sensing and GIS tool to detect and monitor the spread of exotic annuals around energy development sites | Miguel Villarreal | USGS Western Geographic Science Center |
Image Analysis with Machine Learning: Tile-drain detection and delineation in agricultural landscapes | Tanja N Williamson | USGS Ohio-Kentucky-Indiana Water Science Center |
For details, see: CDI-FY19-SOI-Results.pdf
Voters were asked to rank up to 15 statements of interest. The following chart shows that about half of the voters ranked a full 15 statements. Smaller numbers of voters ranked only 1 to 14 statements, and there was no trend of particular statements being the only ones ranked on a ballot.
For details, see: CDI-FY19-SOI-Raw_Results.csv
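For anyone reproducing that chart from the raw file, a minimal sketch along these lines tallies how many statements each ballot ranked. The column names `voter_id` and `rank` are assumptions about the CSV layout, not its documented schema; check them against the file's actual header row.

```python
import csv
from collections import Counter

# Count how many statements each ballot ranked. The column names
# "voter_id" and "rank" are assumed, not taken from a documented schema.
per_voter = Counter()
with open("CDI-FY19-SOI-Raw_Results.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row.get("rank"):              # skip entries left unranked
            per_voter[row["voter_id"]] += 1

# Distribution: how many voters ranked 1, 2, ..., 15 statements.
distribution = Counter(per_voter.values())
for n in sorted(distribution):
    print(f"{distribution[n]} voter(s) ranked {n} statement(s)")
```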
The Borda Count method was used. With N = 33 total statements, N (33) points were assigned for rank 1, N-1 (32) for rank 2, and so on down to N-14 (19) for rank 15.
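To make the scoring concrete, here is a minimal sketch of that Borda Count calculation. The ballot structure and statement IDs are illustrative assumptions, not the actual FY19 voting data.

```python
from collections import defaultdict

N_STATEMENTS = 33   # total statements on the FY19 ballot
MAX_RANK = 15       # voters could rank up to 15 statements

def borda_scores(ballots):
    """Rank r earns N_STATEMENTS - (r - 1) points: rank 1 -> 33, rank 15 -> 19."""
    scores = defaultdict(int)
    for ballot in ballots:
        for statement, rank in ballot.items():
            if 1 <= rank <= MAX_RANK:
                scores[statement] += N_STATEMENTS - (rank - 1)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical ballots (statement IDs are made up for illustration).
ballots = [
    {"SOI-07": 1, "SOI-12": 2, "SOI-03": 3},
    {"SOI-12": 1, "SOI-07": 2},
]
for statement, score in borda_scores(ballots):
    print(statement, score)
```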
Voters also left comments, which are numbered below; our responses follow each one.

1) It would be great to have all the project summaries on one page in the future, as currently they are very difficult to access quickly.
Thanks for the suggestion for next time.
2) It is not clear how the themes of emphasis in the RFP are integrated into the selection/voting process.
That is something we can explain better: the themes help us recruit ideas that align with USGS priorities or that the CDI executive sponsors have a current interest in supporting. We want all of the statements to be evaluated against the evaluation criteria and guiding principles, whether they fall under a theme or not. When the voting and review-panel results are presented to our sponsors, they make the final selections based on the ranking by the community or review panel, in combination with their interest in supporting specific themes.
3) Perhaps subcategories should be considered to better represent the diversity of projects. Four major categories seem to be 1) end-user data portals/databases/decision tools, 2) software toolkits for algorithm/workflow development, 3) field technologies, and 4) actual synthesis projects.
Subcategories are an idea we can consider, though in the past (FY13) categories were part of the process, and the CDI decided to do away with them in favor of evaluating the ideas as one group, in order to see which ideas best met the principles of the CDI. Point taken that categories could help people rank and organize the ideas in their heads, though.
4) It is difficult to judge whether the various data-serving proposals are helping or hindering the advent of interoperable/redeployable software. More development dollars spent on unsustainable or ultimately incompatible approaches are a poor use of CDI funds, but this is impossible for the average voter to judge. SAS (Science Analytics and Synthesis, the group that CDI sits under) and other groups should weigh in on best practices for portal development (maybe they have?).
Another good point. What we have available is seed funding, which is not suited to long-term sustainability and maintenance. The ideal case is that a new, innovative idea is demonstrated so that it can be taken under the wing of an existing program; this is not always possible. The next best case is that the project is described and documented well enough that the community can learn lessons from its execution, whether or not a sustainable tool is developed. We will continue to search for ways to help projects with their sustainability.
Some things that may inform us as a community:
5) I was surprised to have to rank from 1 to 15, versus placing number-one votes or simply counting votes received. It was much harder to figure out the nuances needed to accomplish a ranking.
It was more challenging than I expected to rank my top 15. It was definitely easier in past years to be able to weight similar proposals the same. That being said, this voting tool was easy to use.
We moved away from custom-designed software this year, so we had limited options for how votes could be distributed, and we settled on ranking up to 15 as the best of the available options. We will continue searching for a way to do distribution of votes in the future, since it allows more flexibility. That being said, the distribution of votes across projects does not look too different, qualitatively, from past years.
6) I thought the proposals generally seemed pretty weak in covering experience and the timeline this year. Also, the hardest factor to judge was the idea of Best Practice. There were some proposals for which I probably would have proposed quite a different solution, but I am not an expert in their field. Maybe CDI should propose a few questions related to each of the guiding principles to prompt them…
We hope that the community will ask questions of the PIs or make suggestions if they have different solutions! I know this is difficult, as I am a little shy myself about commenting too much, but we would like to foster a place where honest questions and suggestions can be discussed. Maybe in future years we could place some questions related to the guiding principles on the forum (or use them as a format for the lightning talks?) to get the discussion started for all statements.
7) I don't approve of this method of selection. The problem is that some critical but domain-specific developments may receive very little attention compared to others that are more generic and therefore understandable to all CDI participants.
CDI is a very diverse community, so it is true that domain-specific developments will be hard to get supported unless they are framed in a way that benefits a wider audience. Traditionally we have focused on funding ideas with the potential for USGS-wide benefit, and thus have tended to support ideas that are collaborative across mission areas and, as stated, understandable to a larger audience. That being said, the voting process is a good opportunity to learn more about important topics.
Further comments are invited in the comments on this page (you must be logged in to the wiki to see that option) or may be sent to us at cdi@usgs.gov.