January's monthly meeting covered how to evaluate web applications and better understand how well they are working for users, and explored well-established strategies for USGS crowdsourcing, citizen science, and prize competition projects.
Nicole Herman-Mercer, a social scientist in the Decision Support Branch of the Water Resources Mission Area's Integrated Information Dissemination Division, presented on how to evaluate web applications based on use, value, impact, and reach, as defined below.
Definition of use: To take, hold, view, and/or deploy the data or application as a means of accomplishing or achieving something.
Herman-Mercer used Google Analytics to measure use. Google Analytics provided information such as total daily visits, visits over time, which pages users visit and how they arrive there (links from another website, search, or direct visits), how often they visit, how many repeat visits occur, and how long users spend on individual pages.
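The kind of summary described above can be sketched with a few lines of Python. The rows below are hypothetical stand-ins for an analytics export (the column layout, page paths, and numbers are illustrative, not actual USGS data):

```python
from collections import Counter
from statistics import mean

# Hypothetical analytics rows: (date, page, traffic_source, seconds_on_page, is_return_visit)
visits = [
    ("2020-01-06", "/streamstats", "search",    95, False),
    ("2020-01-06", "/streamstats", "referral",  40, True),
    ("2020-01-07", "/nwis",        "direct",   210, False),
    ("2020-01-07", "/streamstats", "search",   130, True),
]

total_visits = len(visits)

# How are users getting to the application? (referral vs. search vs. direct)
visits_by_source = Counter(source for _, _, source, _, _ in visits)

# What share of visits are repeat visits?
repeat_share = sum(ret for *_, ret in visits) / total_visits

# How long do users spend on individual pages, on average?
avg_time_by_page = {
    page: mean(sec for _, p, _, sec, _ in visits if p == page)
    for page in {p for _, p, _, _, _ in visits}
}
```

In practice the same aggregations come straight out of the Google Analytics interface; the sketch just makes explicit which questions each metric answers.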
Definition of value: The importance, worth, and/or usefulness of the application to the user(s).
To estimate the value of selected applications to users, an electronic survey was sent to internal water enterprise staff. Respondents were asked to indicate which applications they used for work, and then to answer a series of questions about those applications. The questions attempted to pinpoint how important each application was to users, and how their work would be affected if the application were decommissioned.
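One simple way to turn such survey responses into a per-application value score is to average the ratings across respondents. The application names, rating scale, and scoring formula below are assumptions for illustration, not the team's actual methodology:

```python
from collections import defaultdict

# Hypothetical responses: (respondent, application, importance 1-5, impact_if_decommissioned 1-5)
responses = [
    ("r1", "StreamStats", 5, 5),
    ("r2", "StreamStats", 4, 3),
    ("r1", "WaterWatch",  2, 2),
    ("r3", "WaterWatch",  3, 4),
]

scores = defaultdict(list)
for _, app, importance, impact in responses:
    # Collapse the two ratings into one per-response score (an assumed weighting).
    scores[app].append((importance + impact) / 2)

# Mean score per application across all respondents who use it.
value_by_app = {app: sum(v) / len(v) for app, v in scores.items()}
```

A real analysis would likely also report response counts, since an average over two respondents carries less weight than one over two hundred.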
Definition of impact: The effect the application has on science, policy, or emergency management.
Publish or Perish, a text-mining tool, was used to gather these data points. Publish or Perish searches a variety of sources (Google Scholar, Scopus, Web of Science, etc.) and returns any citations the applications are receiving. Attempts to search for policy-document citations have proven more difficult, and policy citations were not factored into this evaluation as a result.
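Once citation results are exported, tallying them per application is straightforward. The CSV below is a made-up stand-in for an export (the column names, titles, and counts are all illustrative; in particular, tagging each row with the application it cites is an assumed post-processing step, not a Publish or Perish feature):

```python
import csv
import io
from collections import Counter

# Hypothetical export: one row per citing paper, tagged by the application searched for.
export = """application,title,year,cites
StreamStats,"Regional regression equations",2018,12
StreamStats,"Flood frequency update",2020,3
WaterWatch,"Drought monitoring methods",2019,7
"""

rows = list(csv.DictReader(io.StringIO(export)))

# Number of distinct papers citing each application...
papers_citing = Counter(r["application"] for r in rows)

# ...and the total citation count those papers have accumulated.
total_cites = {
    app: sum(int(r["cites"]) for r in rows if r["application"] == app)
    for app in papers_citing
}
```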
Definition of reach: How broadly the application reaches across the country and into society.
Google Analytics was again used to gather visits by state, which were then compared with state populations to get an idea of reach. These analytics could also identify which networks users are on, i.e., .usgs, .gov, or .edu. Finally, an expert survey was deployed to the people who developed the application or currently manage it, to learn who the experts think the intended and actual audiences are.
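Comparing raw visit counts with state population amounts to a per-capita normalization. A minimal sketch, assuming visit counts are in hand (the visit figures below are invented, and the population figures are approximate 2019 Census estimates used only for illustration):

```python
# Hypothetical Google Analytics visit counts by state.
visits_by_state = {"CO": 12_000, "VA": 9_000, "WY": 1_500}

# Approximate 2019 state population estimates (illustrative, not authoritative).
population = {"CO": 5_758_736, "VA": 8_535_519, "WY": 578_759}

# Visits per 100,000 residents: a population-adjusted picture of reach.
reach = {st: visits_by_state[st] / population[st] * 100_000 for st in visits_by_state}

# Which state shows the deepest per-capita reach?
top = max(reach, key=reach.get)
```

The normalization matters: in this toy example Wyoming has far fewer raw visits than Colorado or Virginia, yet the highest visits per capita.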
Contact Nicole at firstname.lastname@example.org for a detailed report on the full evaluation.
Herman-Mercer's team was inspired by Landsat Imagery Use Case studies.
Sophia Liu, an Innovation Specialist at the USGS Science and Decisions Center in Reston, VA, as well as the USGS Crowdsourcing and Citizen Science Coordinator and Co-Chair of the Federal Community of Practice for Crowdsourcing and Citizen Science, presented an overview of well-established USGS crowdsourcing, citizen science, and prize competition projects.
Citizen science, crowdsourcing, and competitions are all considered by Liu to be types of open innovation. Definitions of these terms are as follows:
A popular example of citizen science/crowdsourcing is citizen seismology or public reports of earthquakes, like Did You Feel It?
Liu has documented about 44 USGS crowdsourcing and citizen science projects, and 19 USGS prize competitions. Some examples of open innovation projects and information sources are listed here:
During the presentation, participants were asked to use a Mentimeter poll to answer short questions and provide feedback on the talk.
Sophia is looking for representatives from across all USGS mission areas, regions, and science support offices interested in giving feedback on the guidance, catalog, toolkit, and policies she is developing for the USGS Open Innovation Strategy. Feedback can be provided by joining the USGS Open Innovation Strategy Teams Site or emailing her at email@example.com.
See the recording and slides at the meeting page.