
Community for Data Integration: FY13 Proposal Review Process

Each year, the Community for Data Integration (CDI) has provided funding support for projects that promote data integration and:

  • Focus on short-term benefits to science
  • Leverage existing capabilities
  • Apply solutions/methodologies that can be replicated
  • Ensure sustainability
  • Seek substantial return on investment
  • Expose corporate data
  • Organize science models and outputs
  • Preserve and access project data

In FY13, CDI released the Science Support Framework (SSF), a conceptual architecture that illustrates how CDI contributes to Bureau-level data integration efforts.  As part of SSF implementation, a new process was established for soliciting and reviewing proposals.

A formal Request for Proposals (RFP) was announced on September 5, 2012, for projects focusing on four Categories that support the SSF:

  • Category 1 - Management, Policy, and Standards: Data Management includes data and metadata standards and policies and occurs in all phases of the Science Data Life Cycle (SDLC), from planning and data acquisition through scientific research to finished information products. Knowledge Management involves the creation, standardized documentation, and organization of artifacts describing or encapsulating knowledge, resulting in the creation of reusable knowledge bases. Activities may involve the development, application, and/or testing of data integration processes, protocols, and products that result in improved or more effective data integration management, policies, and standards.
  • Category 2 - Computational Tools and Services: Computational Tools and Services include SDLC processes, tools, and services that move data through the SDLC, related human and machine interactions, and interactions with data through technology. Activities are primarily technical in scope and aimed toward the development of applications, Web services, semantics, or combinations of technologies centered upon the advancement of data discovery and integration.
  • Category 3 - Data and Information Assets: Data and Information (assets) represent what SDLC Processes, Data Management, and Knowledge Management processes operate on. Activities may be aimed at creating or improving data and information products. They may involve or result in the development, application and/or testing of data and information assets that support, facilitate or improve data integration and the transition from data to information to knowledge.
  • Category 4 - Community Innovation: Community Innovation may involve or result in the development, application and/or testing of unique data and information assets, tools or services and demonstrate novel, innovative approaches and solutions for data integration; or the development of new unique tools and services for data management and integration.

Proposal Submissions

By the FY13 RFP cutoff of November 9, 2012, a total of 43 proposals were submitted under the four SSF Categories. Review Panels were recruited from across USGS Regions and Mission Areas.

Two of the Categories were assigned multiple Review Panels, based on (1) a review of several RFP processes that indicated reviewers should not evaluate more than 8 proposals and (2) an identified need to reduce the time commitment of the volunteer reviewers. 

In total, 7 Review Panels were established in the CDI FY13 RFP Review Process: 

  • Category 1: Management, Policy and Standards (1 Review Panel)
  • Category 2: Computational Tools and Services (3 Review Panels)
  • Category 3: Data and Information Assets (2 Review Panels)
  • Category 4: Community Innovation (1 Review Panel)

Review Panel Objectivity

The following actions were taken to minimize potential conflicts of interest by Review Panel members:

  • Assigned reviewers to proposals outside their program
  • Combined CDI members and non-CDI members on individual panels
  • Ensured that reviewers who submitted proposals did not evaluate proposals in that category
  • Required reviewers to disclose and sign a Conflict of Interest Form
  • Maintained anonymity of reviewers across panels to prevent discussion between Review Panels

Each Review Panel worked independently and read proposals only within their designated category. For categories with multiple Review Panels, each panel also worked separately to review proposals of that category. All panels consisted of three reviewers, so that each proposal within a category was reviewed by three individuals as well as collectively by the entire Review Panel. 


All panel reviewers were USGS Federal employees who volunteered their time. The 21 reviewers brought a wide variety of expertise and represented a broad range of Mission Areas, Regions, and Program areas. Nearly half of the reviewers were affiliated with CDI through Working Groups or the membership mailing list; the rest participated out of interest in the topic of data integration.

The following Regions and Mission Areas were represented on the Review Panels: 

Mission Area/Office

  • Administration and Enterprise Information
  • Climate and Land Use Change
  • Core Science Systems
  • Ecosystems
  • Energy and Minerals and Environmental Health
  • Water

Region

  • Midwest
  • Northwest
  • Pacific
  • Southwest

Review Panel Process

Reviewers evaluated proposals within their category both individually and as a group.

Individual Reviewer Evaluation

Reviewers were asked to document a summary of the strengths and weaknesses of each proposal based on guidance in the RFP Document. Reviewers also scored each proposal against a rubric outlined in the RFP Document.

The Score Sheet allocated points for required sections within the proposal as follows:

  • Scope (25 points)
  • Technical Approach (25 points)
  • Project Experience (25 points)
  • Commitment to Effort (15 points)
  • Budget (5 points)
  • Timeline (5 points)
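As a hypothetical illustration of the rubric above (the section names and point maxima come from the RFP; the scoring function itself is only a sketch, not part of the official process), a reviewer's total can be computed as a validated sum of section scores:

```python
# Section names and maximum points from the FY13 RFP score sheet above.
# The total across all sections is 100 points.
MAX_POINTS = {
    "Scope": 25,
    "Technical Approach": 25,
    "Project Experience": 25,
    "Commitment to Effort": 15,
    "Budget": 5,
    "Timeline": 5,
}

def total_score(section_scores: dict[str, float]) -> float:
    """Sum a reviewer's section scores, validating each against its maximum."""
    total = 0.0
    for section, maximum in MAX_POINTS.items():
        score = section_scores.get(section, 0.0)
        if not 0 <= score <= maximum:
            raise ValueError(f"{section} score must be between 0 and {maximum}")
        total += score
    return total

# A proposal scoring the maximum in every section earns 100 points.
assert sum(MAX_POINTS.values()) == 100
```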

Group Evaluation

Having read and evaluated the proposals individually, the reviewers then participated in a group Review Panel meeting to discuss and rank proposals within their category.

Reviewers discussed the strengths and weaknesses of each proposal and gave their recommended scores. Scores for each proposal were averaged to obtain an initial score for the proposal.

To encourage student involvement in CDI, two Review Panels included a student reviewer. Students were able to provide input during proposal discussions and gain valuable experience in the proposal review process. However, students' scores were not included in the calculation of proposal scores or in the panels' final recommendations.
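The averaging step described above can be sketched as follows. This is a hypothetical illustration, not the official tooling: each proposal's initial score is the mean of the panel reviewers' scores, and student reviewers' scores (where present) are accepted but excluded from the calculation.

```python
def initial_score(reviewer_scores: list[float],
                  student_scores: tuple[float, ...] = ()) -> float:
    """Average the panel reviewers' scores to get a proposal's initial score."""
    if not reviewer_scores:
        raise ValueError("at least one panel reviewer score is required")
    # student_scores are deliberately ignored, mirroring the process:
    # students contributed to discussion, not to the computed score.
    return sum(reviewer_scores) / len(reviewer_scores)

print(initial_score([88.0, 92.0, 84.0]))  # → 88.0
```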

Based on initial scoring and reviewer discussion, the panel collectively agreed on a final recommendation for each proposal.  Three types of recommendation could be made for each proposal:

  • Proposal has merit and aligns with CDI goals and should be funded by CDI
  • Proposal has merit but should be funded through a Program or Region
  • Proposal does not meet merit standards and should not be funded

Proposals recommended by the panel to be funded by CDI were then ranked in order of first choice, second choice, third choice, etc.


Review Panel results were submitted to CDI Executive Sponsor Kevin Gallagher for final recommendation. On February 13, 2013, Kevin Gallagher announced funding for the 7 proposals ranked #1. At the August 14, 2013 CDI Webinar Series, he announced that an additional 3 proposals ranked #2 would also be funded, for a total of 10 proposals funded in FY13.

