January's monthly meeting covered how to evaluate web applications and better understand how they are working for users, and explored well-established strategies for USGS crowdsourcing, citizen science, and prize competition projects. 

Application Evaluation: How to get to a Portfolio of Mission Effective Applications 

Nicole Herman-Mercer, a social scientist in the Decision Support Branch of the Water Resources Mission Area's Integrated Information Dissemination Division, presented on how to evaluate web applications based on use, value, impact, and reach, as defined below. 

Use  

Definition: Take, hold, view, and/or deploy the data/application as a means of accomplishing or achieving something. 

  • How many people use this application? 
  • How many are new users? 
  • How many are returning users? 
  • Are users finding what they need through this site/application? 

Herman-Mercer used Google Analytics to answer several of these questions. It provided information such as total daily visits, visits over time, which pages users visit and how they get there (links from another website, search, or direct visits), how often they visit, how many repeat visits occur, and how long they spend on individual pages. 
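
As a rough illustration of how such numbers can be pulled programmatically, the Python sketch below queries daily users, new users, and average time on page from the Google Analytics Reporting API (v4). The service-account key file, view ID, and choice of API version are assumptions for illustration; the talk did not describe how the Google Analytics data were retrieved.

    # Sketch: basic use metrics from the Google Analytics Reporting API v4.
    # 'service-account.json' and 'VIEW_ID' are placeholders, not values from the talk.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES)
    analytics = build("analyticsreporting", "v4", credentials=creds)

    # Daily users, new users, and average time on page, by page and traffic medium.
    response = analytics.reports().batchGet(body={
        "reportRequests": [{
            "viewId": "VIEW_ID",
            "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
            "metrics": [
                {"expression": "ga:users"},
                {"expression": "ga:newUsers"},
                {"expression": "ga:avgTimeOnPage"},
            ],
            "dimensions": [
                {"name": "ga:date"},
                {"name": "ga:pagePath"},
                {"name": "ga:medium"},
            ],
        }]
    }).execute()

    for row in response["reports"][0].get("data", {}).get("rows", []):
        date, page, medium = row["dimensions"]
        users, new_users, avg_time = row["metrics"][0]["values"]
        print(date, page, medium, users, new_users, avg_time)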

Value 

Definition: The importance, worth, and/or usefulness of the application to the user(s). 

  • How willing are users to pay for the application? 
  • How important is this application to the user's work and/or life? 
  • How large would the impact on the user be if this application were lost? 

To estimate the value of selected applications to users, an electronic survey was sent to internal water enterprise staff. Respondents indicated which applications they use for work and then answered a series of questions about those applications, aimed at pinpointing how important each application is to them and how much their work would be affected if it were decommissioned. 
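
A minimal sketch of how such responses could be summarized per application is shown below; the column names and the 1-to-5 rating scale are assumptions, since the survey instrument itself was not described in detail.

    # Sketch: summarizing value-survey responses per application.
    # Column names and the 1-5 rating scale are assumed for illustration.
    import pandas as pd

    # Each row is one respondent's rating of one application they use for work.
    responses = pd.DataFrame([
        {"application": "App A", "importance": 5, "loss_impact": 4},
        {"application": "App A", "importance": 4, "loss_impact": 5},
        {"application": "App B", "importance": 2, "loss_impact": 2},
    ])

    # Respondent counts plus average importance and decommissioning impact.
    summary = responses.groupby("application").agg(
        respondents=("importance", "size"),
        mean_importance=("importance", "mean"),
        mean_loss_impact=("loss_impact", "mean"),
    )
    print(summary)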

Impact 

Definition: The effect the application has on science, policy, or emergency management. 

  • How many scientific journal articles use this application? 
  • Is this application relevant for policy decisions? 
  • Do emergency managers use this application? 

The Publish or Perish citation analysis software was used to get at some of these data points. Publish or Perish searches a variety of sources (Google Scholar, Scopus, Web of Science, etc.) and returns the citations that applications are receiving. Searching for citations in policy documents has proven more difficult and was therefore not factored into this evaluation. 
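
If the Publish or Perish results are exported to CSV (one file per application searched), the citation counts can be tallied with a few lines of Python, as sketched below; the file names and the 'Title'/'Cites' column names are assumptions about the export format, not details from the talk.

    # Sketch: tallying citations from Publish or Perish result exports.
    # File names and the 'Cites' column are assumptions about the export format.
    import pandas as pd

    exports = {
        "App A": "pop_results_app_a.csv",
        "App B": "pop_results_app_b.csv",
    }

    for app, path in exports.items():
        results = pd.read_csv(path)
        print(app,
              "articles found:", len(results),
              "total citations:", results["Cites"].sum())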

Reach 

Definition: How broadly the application reaches across the country and into society. 

  • Where are users? (Geographically) 
  • Who are users? (Scientists? Academia? Government?) 

Google Analytics was again used to gather visits by state, which were then compared with each state's population to gauge per-capita use. These analytics can also identify which networks users are on, i.e., .usgs, .gov, or .edu. Finally, an expert survey was sent to those who developed each application or currently manage it, to get a sense of who the experts think the intended and actual audiences are. 
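
The population adjustment amounts to a simple per-capita rate; a sketch with placeholder numbers is shown below (the visit counts and populations are illustrative, not figures from the evaluation).

    # Sketch: sessions per 10,000 residents, by state.
    # Visit counts and populations below are placeholders, not evaluation data.
    import pandas as pd

    visits = pd.DataFrame({
        "state": ["Colorado", "Virginia", "Alaska"],
        "sessions": [12000, 9500, 800],
    })
    population = pd.DataFrame({
        "state": ["Colorado", "Virginia", "Alaska"],
        "population": [5_800_000, 8_600_000, 730_000],
    })

    reach = visits.merge(population, on="state")
    reach["sessions_per_10k"] = reach["sessions"] / reach["population"] * 10_000
    print(reach.sort_values("sessions_per_10k", ascending=False))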

Contact Nicole at nhmercer@usgs.gov for a detailed report on the full evaluation. 

Herman-Mercer's team was inspired by Landsat Imagery Use Case studies. 

USGS Open Innovation Strategy for Crowdsourcing, Citizen Science, and Competitions 

Sophia Liu, an Innovation Specialist at the USGS Science and Decisions Center in Reston, VA, as well as the USGS Crowdsourcing and Citizen Science Coordinator and Co-Chair of the Federal Community of Practice for Crowdsourcing and Citizen Science, presented an overview of well-established USGS crowdsourcing, citizen science, and prize competition projects. 

Citizen science, crowdsourcing, and competitions are all considered by Liu to be types of open innovation. Definitions of these terms are as follows: 

  • Citizen science: public participation in or collaboration with professional science, in which scientists request voluntary contributions to any part of the scientific research process to enhance science. 
  • Crowdsourcing: a way to quickly obtain services, ideas, or content from a large group of people, often through simple and repeatable micro tasks. 
  • Competitions: challenges that use prize incentives to spur a broad range of innovative ideas or solutions to a well-defined problem. 

A popular example of citizen science/crowdsourcing is citizen seismology, or public reports of earthquakes, such as Did You Feel It? 

Liu has documented about 44 USGS crowdsourcing and citizen science projects, and 19 USGS prize competitions. Some examples of open innovation projects and information sources are listed here: 

During the presentation, participants were asked to use a Mentimeter poll to answer short questions and provide feedback on the talk. 

Sophia is looking for representatives from across all USGS mission areas, regions, and science support offices interested in giving feedback on the guidance, catalog, toolkit, and policies she is developing for the USGS Open Innovation Strategy. Feedback can be provided by joining the USGS Open Innovation Strategy Teams Site or emailing her at sophialiu@usgs.gov. 

See the recording and slides at the meeting page.