
Timeline for the 2017 RFP

2016 October 19 - November 9: Voting Period

2016 November 9: Voting Closing Session

2016 November 9 - 28: Brief Executive Sponsors

2016 November 28: Invite Full Proposal Submissions

2017 January 20, 5:00 pm ET: Full Proposals Due

2017 January - February: Full Proposal Review Panel

2017 March 8: Awarded Projects Announced

Statement of Interest Voting

Overview

Phase I of the RFP process involves community voting. All CDI members can vote online to determine the SOIs recommended for the full proposal phase. The voting period begins Wednesday, October 19, at 8:00 am ET. Voting closes on Wednesday, November 9, during the SOI Voting Closing Session at the CDI Monthly Meeting (11:00 am ET). CDI members may change their votes during the first 10 minutes of the session, after which the system will be locked. During the SOI Voting Closing Session, the community will decide how many SOIs will be recommended to the CDI Executive Sponsors to move to the full proposal phase of the RFP.

To become a member, or to ask any questions, email cdi@usgs.gov.

CDI FY2017 Statement of Interest Voting Closing Session Details
Wednesday, November 9th, 2016 from 11:00am - 11:30am ET
WebEx: Go to usgs.webex.com and Select (50) Community for Data Integration
Teleconference:  (703) 648-4848 or (855) 547-8255; Code: 47919#

Voter Responsibilities

CDI members must agree to review and consider all SOIs before voting! 

All SOIs should be considered based on the Elements of the CDI Science Support Framework, the Evaluation Criteria of the RFP guidance, and the CDI guiding principles.

Science Support Framework

See U.S. Geological Survey Community for Data Integration (CDI) Science Support Framework (SSF) documentation for detailed descriptions of each element.

Evaluation Criteria

Scope (25%)

Evaluation will be based on whether the proposal adequately demonstrates the need for the effort/activity, how much the proposal contributes to the guiding principles and element(s) of the CDI Science Support Framework, and whether the effort has potential impact beyond a single Program, Center, Mission Area, or Region. CDI projects will also be evaluated on anticipated return on investment (e.g. cost savings, code utilization, publications, operational efficiencies, etc.).

Technical Approach (25%)

Evaluation will be based on the reasonableness of the technical approach applied to the problem and whether the approach is innovative or employs a proven, reliable technique that is appropriate to the problem.

Project Experience and Collaboration (25%)

Evaluation will be based on the appropriateness of the experience, special qualifications, and skills possessed for successful completion of the proposed project. Evaluation will also consider whether interdisciplinary or cross-Mission Area/Region collaboration and partnerships have been pursued where appropriate.

Sustainability (15%)

Evaluation will be based on how well the proposal describes the intended sustainability of the project deliverables (products, tools, services, metadata) for long-term access, reusability, and potential for integration. All products resulting from CDI projects must comply with the new Office of Science Quality and Integrity Instructional Memoranda on data management. These products must be freely shared and made available, without charge or restriction, to the CDI, the broader USGS community, and beyond as appropriate. Software products developed with CDI funding must have, at a minimum, a copy uploaded to a USGS Bitbucket Repository at the close of the funding period. Additional links to active repositories are encouraged.

Budget Justification (5%)

Evaluation will be based on whether the budget is at or below $50,000 and meets the minimum 30% in-kind match. The budget should include travel to the CDI biennial meeting. Evaluation will also consider whether salaries, contractor costs, travel, and equipment/publication costs are justified and appropriate to project needs, and whether the proposed work hours are reasonable within the timeframe. Projects with contractor support must describe how the contract work will be managed and documented to ensure that products are USGS property.
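The two numeric requirements above (the $50,000 cap and the 30% minimum in-kind match) can be sketched as a quick self-check for proposers. This is an illustration only, not part of the official evaluation; the example dollar figures are hypothetical.

```python
# Illustrative check of the budget criterion described above.
# The $50,000 cap and 30% minimum in-kind match come from the RFP guidance;
# the example figures are hypothetical.

BUDGET_CAP = 50_000     # maximum CDI funds requested, in dollars
MIN_MATCH_RATE = 0.30   # minimum in-kind match as a fraction of the request

def budget_qualifies(requested: float, in_kind: float) -> bool:
    """Return True if the request is within the cap and the in-kind
    contribution meets the minimum match rate."""
    return requested <= BUDGET_CAP and in_kind >= MIN_MATCH_RATE * requested

# A $48,000 request needs at least $14,400 of in-kind support.
print(budget_qualifies(48_000, 15_000))  # True
print(budget_qualifies(48_000, 10_000))  # False: match below 30%
print(budget_qualifies(60_000, 20_000))  # False: over the cap
```

A proposer could run this against their own figures before submission; the reviewers will of course apply the full written criteria, not just the arithmetic.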

Timeline (5%)

Evaluation will be based on clear presentation of the project phases and milestones described in the technical approach and the feasibility of the proposed workload given the project duration.

Guiding Principles

  • Focus on targeted efforts that yield near-term benefits to Earth and biological science
  • Leverage existing capabilities and data
  • Implement and demonstrate innovative solutions (e.g. methodologies, tools, or integration concepts) that could be used or replicated by others at scales from project to enterprise
  • Preserve, expose, and improve access to Earth and biological science data, models, and other outputs
  • Develop, organize, and share knowledge and best practices in data integration 

SOI Online Voting System

Go to https://my.usgs.gov/CDI_RFP/event/voterAgreement/5248 to vote:

MyUSGS Voting Entry Page

SOI List Page

SOIs are listed in random order each time you enter the system. You can download a CSV file listing all SOIs and their metadata. Click the title of an SOI to go to its voting page.

UserVoice SOI Voting Page

You will have 15 votes to use across all SOIs, and each SOI can receive one to three votes. Click the "Vote" button to assign one, two, or three votes to an SOI.
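The allocation rule above (15 votes total, one to three votes per SOI) can be sketched as a simple validity check. This is an illustration of the rule only, not part of the voting system; the SOI identifiers are hypothetical.

```python
# Illustrative check of the SOI voting rules described above:
# 15 votes total per member, and 1-3 votes per SOI.
# The SOI identifiers below are hypothetical.

TOTAL_VOTES = 15
MIN_PER_SOI, MAX_PER_SOI = 1, 3

def allocation_is_valid(votes: dict) -> bool:
    """votes maps an SOI identifier to the number of votes given to it."""
    within_per_soi = all(MIN_PER_SOI <= v <= MAX_PER_SOI for v in votes.values())
    return within_per_soi and sum(votes.values()) <= TOTAL_VOTES

print(allocation_is_valid({"SOI-A": 3, "SOI-B": 2, "SOI-C": 1}))  # True
print(allocation_is_valid({"SOI-A": 4}))                          # False: more than 3 on one SOI
print(allocation_is_valid({f"SOI-{i}": 3 for i in range(6)}))     # False: 18 votes exceeds 15
```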

Comments are encouraged, and PIs can and should respond to the comments using the system.

Click on "Settings" to see all of your SOI votes and comments.

Voting Strategies

Consider giving each SOI of interest one vote initially, then returning later to review and adjust your votes. Also consider downloading the CSV file and using the spreadsheet to track comments and manage your votes.

Review Panel

Full Proposals will be evaluated according to the Evaluation Criteria. Proposals will be reviewed by a panel consisting of a professional peer group knowledgeable in data management, information technology, and other relevant disciplines in the context of the CDI. Every effort will be made to include broad disciplinary representation and expertise. Reviewers may consult subject experts outside of the Review Panel as needed. Reviewers will be asked to sign a Conflict of Interest Statement and Certification to ensure the objectivity of the evaluation. Proposals will be scored by reviewers and ranked through a peer consensus process. Recommendations by the Review Panel will be presented to the CDI Executive Sponsors for final selection.
