2016 October 19 - November 9: Voting Period
2016 November 9: Voting Closing Session
2016 November 9 - 28: Brief Executive Sponsors
2016 November 28: Invite Full Proposal Submissions
2017 January 20, 5pm ET: Full Proposals Due
2017 January - February: Full Proposal Review Panel
2017 March 8: Awarded Projects Announced
Phase I of the RFP process involves community voting. All CDI members can vote online to determine the SOIs recommended for the full proposal phase. The voting period will begin October 19th at 8:00 am ET. Voting will close on November 9th during the SOI Voting Closing Session at the CDI Monthly Meeting at 11:00 am ET. CDI members may change their votes during the first 10 minutes of the session, after which the system will be locked. During the SOI Voting Closing Session, the community will decide how many SOIs will be recommended to the CDI Executive Sponsors to move to the full proposal phase of the RFP.
To become a member or to ask any questions, email firstname.lastname@example.org.
CDI members must agree to review and consider all SOIs before voting!
All SOIs should be considered based on the Elements of the CDI Science Support Framework, the Evaluation Criteria of the RFP guidance, and the CDI guiding principles.
See U.S. Geological Survey Community for Data Integration (CDI) Science Support Framework (SSF) documentation for detailed descriptions of each element.
Evaluation will be based on whether the proposal adequately demonstrates the need for the effort/activity, how much the proposal contributes to the guiding principles and element(s) of the CDI Science Support Framework, and whether the effort has potential impact beyond a single Program, Center, Mission Area, or Region. CDI projects will also be evaluated on anticipated return on investment (e.g. cost savings, code utilization, publications, operational efficiencies, etc.).
Evaluation will be based on the reasonableness of the technical approach applied to the problem and whether the approach is innovative or employs a proven, reliable technique that is appropriate to the problem.
Evaluation will be based on whether the project team possesses the experience, special qualifications, and skills needed for successful completion of the proposed project. Evaluation will also consider whether inter-disciplinary or cross-Mission Area/Region collaboration and partnerships have been pursued where appropriate.
Evaluation will be based on how well the proposal describes the intended sustainability of the project deliverables (products, tools, services, metadata) for long-term access, reusability, and potential for integration. All products resulting from CDI projects must comply with the new Office of Science Quality and Integrity Instructional Memoranda on data management. These products must be freely shared and made available, without charge or restriction, to the CDI, the broader USGS community, and beyond as appropriate. Software products developed with CDI funding must have, at a minimum, a copy uploaded to a USGS Bitbucket Repository at the close of the funding period. Additional links to active repositories are encouraged.
Evaluation will be based on whether the budget is at or below $50,000 and meets the minimum 30% in-kind match. The budget should include travel to the CDI biennial meeting. Evaluation will also consider whether salaries and contractor costs, travel, and equipment/publication costs are justified and appropriate to project needs, and whether the proposed work hours are reasonable within the timeframe. Projects with contractor support must describe how the contract work will be managed and documented to ensure that products are USGS property.
Evaluation will be based on clear presentation of the project phases and milestones described in the technical approach and the feasibility of the proposed workload given the project duration.
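The budget criterion above can be expressed as a simple check. This is a hypothetical sketch, not part of the RFP guidance: the function name and the assumption that the 30% in-kind match is measured against the requested CDI funds are illustrative only; the RFP guidance is authoritative on the exact basis for the match.

```python
# Hypothetical sketch of the budget criterion: request capped at $50,000
# and a minimum 30% in-kind match. Assumes (for illustration only) that
# the match is computed against the requested amount.
BUDGET_CAP = 50_000
MIN_MATCH_RATE = 0.30

def meets_budget_criteria(requested: float, in_kind: float) -> bool:
    """Return True if the request is within the cap and the match is met."""
    return requested <= BUDGET_CAP and in_kind >= MIN_MATCH_RATE * requested

# Example: a $50,000 request with $15,000 in-kind support meets both checks.
print(meets_budget_criteria(50_000, 15_000))  # True
print(meets_budget_criteria(60_000, 20_000))  # False: request exceeds cap
```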
Go to http://my.usgs.gov/CDI_SOI to vote:
SOIs are listed in random order each time you enter the system. You can download a CSV file listing all SOIs and their metadata. Click on the title of an SOI to reach its voting page.
You will have 15 votes to use across all SOIs, and each SOI can receive one to three votes. Click the "Vote" button to assign one, two, or three votes to an SOI.
Comments are encouraged, and PIs can and should respond to the comments using the system.
Click on "Settings" to see all of your SOI votes and comments.
Consider giving each interesting SOI 1 vote initially, then returning later to review and adjust your votes. Also consider downloading the CSV file and using the spreadsheet to add comments and manage your votes.
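The voting rules above (15 votes total, one to three votes per SOI) can be sketched as a small validity check. This is a hypothetical illustration; the function name and the dict-based representation are assumptions, not part of the voting system.

```python
# Hypothetical sketch of the SOI voting rules: each member has 15 votes
# in total, and each SOI may receive between 1 and 3 of them.
def is_valid_allocation(votes: dict) -> bool:
    """votes maps SOI title -> number of votes given to that SOI."""
    total = sum(votes.values())
    per_soi_ok = all(1 <= v <= 3 for v in votes.values())
    return per_soi_ok and total <= 15

print(is_valid_allocation({"SOI A": 3, "SOI B": 2, "SOI C": 1}))  # True
print(is_valid_allocation({"SOI A": 4}))  # False: more than 3 votes on one SOI
```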
Full Proposals will be evaluated according to the Evaluation Criteria. Proposals will be reviewed by a panel consisting of a professional peer group that is knowledgeable in data management, information technology, and other relevant disciplines in the context of the CDI. Every effort will be made to include broad disciplinary representation and expertise. Reviewers may consult with subject experts outside of the Review Panel as needed. Reviewers will be asked to sign a Conflict of Interest Statement and Certification to ensure objectivity of the evaluation. Proposals will be scored by reviewers and ranked through a peer consensus process. Recommendations by the Review Panel will be presented to the CDI Executive Sponsors for final selection.