|February 6, 2018||SOI Lightning Presentation Session|
|February 6 - 15, 2018||Community Voting on SOIs|
|February 20 - 23, 2018||Brief Executive Sponsors|
|February 26, 2018||Invite Full Proposal Submissions|
|March 22, 2018||Full Proposals Due|
|April 3, 2018||Full Proposal Review Panel|
|April 16, 2018||Awarded Projects Announced|
|September 30, 2018||Awarded Funds must be spent|
This year, the CDI will be holding an online session for 1-minute lightning talks to help the CDI community learn about the proposed ideas.
Tuesday, February 6, 1:00-2:30pm Eastern Time
Each submitter will have one 16:9 slide (no animations; it will be converted to PDF) and one minute to share their idea with the community!
(Optional) Template (pptx)
Include somewhere on your slide:
Email your slide to email@example.com by COB on February 5. Slides will be compiled and converted to PDF, and the presentation order will be sent out first thing on February 6.
Phase I of the RFP process involves community voting. All CDI members can vote online to determine which SOIs are recommended for the full proposal phase. Voting will open Tuesday, February 6 at 10:00am ET and close February 15 at 5:00pm ET.
To become a member, or ask any questions, email firstname.lastname@example.org.
CDI members should review and consider all SOIs before voting.
All SOIs should be considered based on the Elements of the CDI Science Support Framework, the Evaluation Criteria of the RFP guidance, and the CDI guiding principles.
See U.S. Geological Survey Community for Data Integration (CDI) Science Support Framework (SSF) documentation for detailed descriptions of each element.
Evaluation will be based on whether the proposal adequately demonstrates the need for the effort/activity, how much the proposal contributes to the guiding principles and element(s) of the CDI Science Support Framework, and whether the effort has potential impact beyond a single Program, Center, Mission Area, or Region. CDI projects will also be evaluated on anticipated return on investment (e.g. cost savings, code utilization, publications, operational efficiencies, etc.).
Evaluation will be based on the reasonableness of the technical approach applied to the problem and whether the approach is innovative or employs a proven, reliable technique that is appropriate to the problem.
Evaluation will be based on the appropriateness of the experience, special qualifications, and skills possessed for successful completion of the proposed project. Evaluation will also consider whether the inclusion of inter-disciplinary or cross-Mission Area/Region collaboration and partnerships has been pursued where appropriate.
Evaluation will be based on how well the proposal describes the intended sustainability of the project deliverables (products, tools, services, metadata) for long-term access, reusability, and potential for integration, as well as the plan for communicating the value of the products during and after the project period. All products resulting from CDI projects must comply with the new Office of Science Quality and Integrity Instructional Memoranda on data management. These products must be freely shared and made available, without charge or restriction, to the CDI, the broader USGS community, and beyond as appropriate. Software products developed with CDI funding must be uploaded to an appropriate code repository at the close of the funding period.
Evaluation will be based on whether the budget is at or below $50,000 and meets the minimum 30% in-kind match. The budget should include travel to the CDI biennial meeting. Evaluation will also consider whether the justifications of salaries and contractor costs, travel, and equipment/publication costs are appropriate to project needs and whether the proposed work hours are reasonable within the timeframe. Projects with contractor support must describe how the contract work will be managed and documented to ensure that products are USGS property.
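As an illustration of the budget constraint above, a minimal check might look like the following sketch. It assumes the 30% in-kind match is measured against the requested CDI funds; that interpretation, and all names in the code, are hypothetical and not part of the RFP guidance.

```python
# Illustrative budget check for a proposal (hypothetical interpretation).
MAX_REQUEST = 50_000   # maximum CDI funds that may be requested, in USD
MIN_MATCH_RATE = 0.30  # minimum in-kind match, assumed here to be a
                       # fraction of the requested CDI funds

def budget_ok(requested: float, in_kind_match: float) -> bool:
    """Return True if the request is within the cap and the match is met."""
    return requested <= MAX_REQUEST and in_kind_match >= MIN_MATCH_RATE * requested

print(budget_ok(50_000, 15_000))  # True: at the cap with exactly a 30% match
print(budget_ok(50_000, 10_000))  # False: match is below 30% of the request
print(budget_ok(60_000, 20_000))  # False: request exceeds $50,000
```

Under this reading, a full $50,000 request would need at least $15,000 of in-kind contribution to qualify.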
Evaluation will be based on clear presentation of the project phases and milestones described in the technical approach and the feasibility of the proposed workload given the project duration.
Go to https://my.usgs.gov/CDI_RFP/event/voterAgreement/7765 to vote (the link will also be emailed).
SOIs are listed randomly each time you enter the system. You can download a CSV file, which will list all SOIs and their metadata. Click on the title of the SOI to get to the voting page.
You will have 15 votes to use across all SOIs, and each SOI can receive 1-3 votes. Click the "Vote" button to assign one, two, or three votes to an SOI.
Comments are encouraged, and PIs can and should respond to the comments using the system.
Click on "Settings" to see all of your SOI votes and comments.
Consider giving each interesting SOI one vote at first, then returning later to review and adjust your votes. Also consider downloading the CSV file and using the spreadsheet to add comments and manage your votes.
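If you do track your votes in the downloaded spreadsheet, a small script can confirm you stay within the 15-vote budget and the 1-3 votes-per-SOI limit before entering them in the system. This is only a sketch: the column names (`title`, `my_votes`) are hypothetical and will not match the actual CSV export.

```python
import csv
import io

MAX_TOTAL_VOTES = 15  # each member has 15 votes in total
MAX_PER_SOI = 3       # each SOI may receive at most 3 votes from a member

def votes_remaining(csv_text: str) -> int:
    """Tally votes recorded in a CSV and return how many of the 15
    votes are still unspent. Column names here are hypothetical."""
    total = 0
    for row in csv.DictReader(io.StringIO(csv_text)):
        v = int(row["my_votes"] or 0)
        if not 0 <= v <= MAX_PER_SOI:
            raise ValueError(f"{row['title']}: {v} votes (max {MAX_PER_SOI})")
        total += v
    if total > MAX_TOTAL_VOTES:
        raise ValueError(f"{total} votes used, only {MAX_TOTAL_VOTES} allowed")
    return MAX_TOTAL_VOTES - total

sample = "title,my_votes\nSOI A,3\nSOI B,2\nSOI C,1\n"
print(votes_remaining(sample))  # 9
```

Running the check before the February 15 deadline leaves time to redistribute any unspent votes.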
Full Proposals will be evaluated according to the Evaluation Criteria. Proposals will be reviewed by a panel consisting of a professional peer group that is knowledgeable in data management, information technology, and other relevant disciplines in the context of the CDI. Every effort will be made to include broad disciplinary representation and expertise. Reviewers may consult with subject experts outside of the Review Panel as needed. Reviewers will be asked to sign a Conflict of Interest Statement and Certification to ensure objectivity of the evaluation. Proposals will be scored by reviewers and ranked through a peer consensus process. Recommendations by the Review Panel will be presented to the CDI Executive Sponsors for final selection.