
The CDI sent out ballots for voting on October 25, with the voting open until November 8, 2019.

1296 ballots were distributed and 151 ballots were returned.


Results

Voting results are shown below. The final decisions about which projects are invited back to submit full proposals will be made after meetings with CDI's executive sponsors. The final decision involves a points threshold in combination with consideration of USGS priorities. PIs will be notified by email by early December. Yes, it takes a few weeks to put everything in place and get feedback from all parties involved; thanks for your patience.




For details, see: Results-CDI-RFP-FY20.pdf


Distribution of number of statements ranked

Voters were asked to rank up to 15 statements of interest. The following chart shows that a little less than half of the voters ranked the full 15 statements. The remaining voters ranked between 1 and 14 statements, and there was no trend of particular statements being ranked on their own.


For details, see: Raw_Results-CDI-RFP-FY20.csv


Algorithm for points

The Borda Count method was used. We had N = 24 total statements. N (24) points were assigned for rank 1, N-1 (23) for rank 2, and so on down to N-14 (10) for rank 15.
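For anyone who wants to reproduce the tally, here is a minimal sketch of the scoring described above. It assumes each ballot is a simple mapping of statement to rank (1 through 15); the statement names and the tally function are illustrative, not the actual system used to process the ballots.

# Borda Count scoring sketch: rank 1 earns N points, rank 2 earns N-1, and so on.
N_STATEMENTS = 24   # total statements on the ballot
MAX_RANKS = 15      # voters could rank up to 15 statements

def points_for_rank(rank, n_statements=N_STATEMENTS):
    """Return Borda points for a given rank (1 = top choice)."""
    return n_statements - (rank - 1)

def tally(ballots):
    """Sum Borda points across ballots.

    Each ballot maps statement ID -> rank (1..MAX_RANKS);
    unranked statements earn no points from that ballot.
    """
    totals = {}
    for ballot in ballots:
        for statement, rank in ballot.items():
            if 1 <= rank <= MAX_RANKS:
                totals[statement] = totals.get(statement, 0) + points_for_rank(rank)
    return totals

# Example with two hypothetical ballots:
ballots = [
    {"Statement A": 1, "Statement B": 2},  # A gets 24, B gets 23
    {"Statement B": 1, "Statement A": 3},  # B gets 24, A gets 22
]
print(tally(ballots))  # {'Statement A': 46, 'Statement B': 47}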


Comments that came in through the ballots


1) Maybe I missed it, which is entirely possible, but I am/was confused whether I am supposed to vote based on what is most personally interesting to me/who makes the most compelling case vs who matches the priorities of the RFP. Maybe put that in somewhere on the website? (I went kinda through the middle) Thank you for all you do- I am sure this is a lot to pull together in a short amount of time!

This is a good point. In the past, there were detailed instructions about what considerations to make when ranking statements of interest, which were identical to the ones used in the full proposal review panel. In recent years we have moved more toward measuring community interest in the ideas, with the thinking that the community will be learning from the project outputs and we want the greatest number of people to benefit. Next year we will work to provide better instructions about this issue.

2) "Allow for numerical input of ranking instead of drop-down options.
Record all lightning talks such that they could be linked to every project page for reference."

With the move to an off-the-shelf solution for voting, we sacrificed some of the things we were able to customize. Numerical input and other types of voting (high, medium, low support) are features that we have not found in our search of voting platforms. We are not able to support a customized voting system, so if anyone has suggestions for more flexible existing platforms, please let us know!

The suggestion to split up the lightning talks is a good one, and we will consider devoting more hours to doing that, or possibly adding links to specific places (the start of each lightning talk) in the single file, if that is possible. On the other hand, we do want voters to consider all statements, and having people watch the entire file is one way to work toward that goal.


3) "Conflict of interest:
a-Ranking of our own proposal
b-I also work closely with X and Y"

We have not dealt with conflicts of interest in Phase 1 Community Voting, and we view ranking one's own proposal the same as voting for oneself in an election. (Insert obligatory reference to the 1999 movie "Election" here.)

4) For the next survey, I would like to see more response from the authors as to how the statement of purpose / project ties into the theme.

This is a good suggestion. Statements are not required to tie to the theme, but it would be useful to see explicit statements for those that do.

5) The Lightning Presentations were excellent! Thank you to the CDI for this opportunity!


6) The order above should also be alphabetical by last name to help the voters stay organized.

In the ballot, the order is randomized to help prevent bias in voting (even though voters are encouraged to complete their rankings first). We are open to suggestions on how to stay organized while not introducing bias from a consistent ordering.


Further comments are invited in the comments to this page (you must be logged in to the wiki to see that option) or sent to us at cdi@usgs.gov.
