
The formatting of this page was lost in translation from my.usgs 3.0.

FY11 CDI Project Proposal - Web Services Publishing Best Practices


Rob Dollison, Marc Levine, Kevin Hope, Matt Tricomi, National Geospatial Program, The National Map

Suggested under the Data Management Track.

Suggest engaging Greg Gunther's WSWG team and the major USGS corporate services.

Introduction and Problem

Create a Web Services Best Practices document or wiki that leverages existing best-practice captures. For example, TNM has a nice start based on its new services capability and its experiences converting to REST, working with WMS, supporting projections, handling API flavors (REST, WMS, KML, etc.), organizing metadata, etc.

This document should organize the best practices into key topics (e.g., naming standards, metadata, projection, service types) and fold in recommended bullets from other groups.

Relevance and Benefits

This may not be the most appealing of projects, but it is a simple, valuable tool that would pay off if enforced for public services. For instance, if a user were to add a USGS service to the TNM Viewer or to Google Earth, they would get the same level of experience, usability, and brand feel across USGS services. Currently, without these best practices, services look as if they came from multiple authors, are complex to add because of slight nuances, or flat out can't be added due to projection or setup issues.


Objectives

1. Establish metadata, naming, and technical practices for publishing services

2. Update participating services to these standards

Total Level of Effort for Objectives: 0.5 FTE

Level of Effort estimates are unvalidated “swags,” based on general experience in these areas.


Objective 1: Establish metadata, naming, and technical practices for publishing services

Suggest reviewing the TNM best practices as a start, looking to establish a taxonomy of things to consider.

Translate the initial information into a wiki page.

Have an edit period for the team

Have a final review

Create a review checklist tool for developers to use to evaluate their own or others' services.
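To make the idea concrete, the checklist tool could be as simple as a script that scores a service against the agreed items. A minimal Python sketch, where the item names and the pass/fail scoring are purely illustrative placeholders, not the actual checklist:

```python
# Minimal sketch of a service review checklist scorer.
# The checklist items below are illustrative placeholders only.
CHECKLIST = [
    "Works in Google Earth, ESRI JS API, Bing, and Google Maps",
    "Service URL is fully qualified and reachable outside the network",
    "Supports WGS 84 Web Mercator and the standard CRS list",
    "Fields are aliased with user-friendly names",
    "Metadata author is the organization, not the developer",
]

def evaluate(results):
    """results maps a checklist item to True (pass) or False (fail)."""
    passed = [item for item in CHECKLIST if results.get(item, False)]
    failed = [item for item in CHECKLIST if not results.get(item, False)]
    return {"passed": passed, "failed": failed,
            "score": len(passed) / len(CHECKLIST)}

# A developer self-review where only two items pass:
report = evaluate({CHECKLIST[0]: True, CHECKLIST[2]: True})
print(f"{report['score']:.0%} passing; fix: {report['failed']}")
```

Posting the `failed` list alongside each service would give the "post results" step in Objective 2 a concrete artifact.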

Objective 2: Update participating services to these standards

Apply these rules to the participating services or key services

Apply checklist to selected services

Post results

Participating CDI members commit to updating their services by time X in FY11 to demonstrate the results.

Wrap up by demonstrating the resulting user experience of the services:

Show the “before” results with the checklist and what they mean.

Show the “after,” including the ease of integrating the services as well as the usability of the naming, the metadata clarity, etc.

Appendix A: TNM Best Practice Inputs

The following are deployment inputs from The National Map based on its deployment best practices for services.

1. Multiple platform testing –

a. Has it been tested to work as expected in Google Earth, the ESRI JS API, Bing, and Google Maps?

b. For instance, dynamic services do not work in Bing, dynamic MSD services do not work in Google Earth, but WMS should work in Google Maps and Google Earth and all should work in the ESRI JS API.

c. Does the WMS call, as well as the REST service, work OUTSIDE the USGS network? Many times when publishing, it's been found that a developer used a relative server name, not the fully qualified name. Are the server references referring to the external USGS server name, not the internal name, which should not be disclosed outside the USGS boundary?

d. When adding, are the services discoverable and added to the client quickly?

e. Do the layers display quickly?
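Part of check 1.c can be automated: reject service URLs that use a bare IP or an unqualified internal host name instead of the fully qualified external name. A sketch in Python; the host names in the examples are hypothetical, and the external-suffix rule is a simplifying assumption:

```python
import re
from urllib.parse import urlparse

IP_PATTERN = re.compile(r"^\d{1,3}(\.\d{1,3}){3}$")

def is_external_url(url, public_suffix=".usgs.gov"):
    """Heuristic: does the URL use a fully qualified external host name?"""
    host = urlparse(url).hostname or ""
    if IP_PATTERN.match(host):
        return False                      # bare IP address
    if "." not in host:
        return False                      # unqualified (internal) name
    return host.endswith(public_suffix)   # must be the external name

print(is_external_url("https://services.usgs.gov/arcgis/rest"))  # True
print(is_external_url("http://gisserver01/wms"))   # False (hypothetical internal name)
print(is_external_url("http://192.168.1.5/wms"))   # False
```

Running such a check against every published endpoint would catch the relative-name mistakes described above before release.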

2. Source Data –

a. Has it been validated with the Data Manager that the source being used is the approved source to meet the requirement requested by the Data/Project Sponsor?

b. Has the data been set up for optimal publishing performance? For example, if using ArcGIS, consider using a File GDB copy for fairly static data; with the exception of ImageServer services, real-time or very dynamic data, and non-ArcGIS services, all services should consider going against File GDBs.

c. When publishing the service (e.g., using an MXD), are the layers in the order best suited for publishing?

i. For example, status layers, usually shown with fills, should in general display below the features or with transparency. Simply re-ordering and grouping layers top to bottom in the MXD file, so that status layers are on the bottom or more transparent, helps when the user adds the service and all the layers are on by default. Otherwise, the features are hard to see.

d. Are the fields that are identifiable now aliased with pretty names? Are fields that shouldn't be identifiable, or that are simply not useful to the user, hidden? If there are hyperlinks in the identifiable fields, are they working links? Is Identify responding in a timely manner (read as “response time not annoying to the user”)?

3. Projection –

a. Do the cached services support the proper WGS 84 Web Mercator projection? Does the service work in the Google Maps tiling scheme?

b. Is the default projection for dynamic services WGS 84 Web Mercator, due to the source data being set as such?

c. Do the dynamic services support the proper multiple reprojections (e.g., WGS 84 Geographic)? Are they set up with the standard list?

<CRS>CRS:84</CRS>
<CRS>EPSG:4326</CRS>
<CRS>EPSG:4269</CRS>
<CRS>EPSG:4267</CRS>
<CRS>EPSG:54004</CRS>
<CRS>EPSG:54008</CRS>
<CRS>EPSG:3785</CRS>
<CRS>EPSG:102113</CRS>
<CRS>EPSG:3857</CRS>
<CRS>EPSG:102100</CRS>
<CRS>EPSG:900913</CRS>
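Whether a dynamic service actually advertises this standard list can be checked against its WMS GetCapabilities response. A sketch using only the Python standard library; the XML below is a trimmed, hypothetical fragment (real WMS 1.3.0 responses are namespaced and should be parsed with the namespace in mind):

```python
import xml.etree.ElementTree as ET

# A few entries from the standard list above; extend as needed.
REQUIRED_CRS = {"CRS:84", "EPSG:4326", "EPSG:3857"}

# Trimmed, hypothetical GetCapabilities fragment.
capabilities = """
<Layer>
  <CRS>CRS:84</CRS>
  <CRS>EPSG:4326</CRS>
  <CRS>EPSG:3857</CRS>
  <CRS>EPSG:102100</CRS>
</Layer>
"""

def advertised_crs(xml_text):
    """Collect every CRS identifier advertised in the capabilities XML."""
    root = ET.fromstring(xml_text)
    return {elem.text.strip() for elem in root.iter("CRS")}

missing = REQUIRED_CRS - advertised_crs(capabilities)
print("OK" if not missing else f"Missing CRS: {sorted(missing)}")
```

The same comparison, fed by a live GetCapabilities request, could feed the review checklist in Objective 1.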

4. Product Representation

a. Data Sponsor Lead should determine the Products presented at each scale. The Lead, in coordination with the cross-cutting guidance and evaluating sources available, should determine the requirements for what products to show at what scales.

i. Feature or product representation at each appropriate scale has been reviewed by the Product Lead, and compared against previously implemented standards, with recommendations provided back to the Product Lead.

1. (i.e., showing which feature or coverage product at which user view scale).

ii. Note: the implied need is that the technologist will make sure the min/max scale range is provided. The Product Lead will not explicitly provide this (or be aware of it):

1. Should be halfway between the largest scale and the larger service's smallest scale + .0001, and

2. Should be halfway between the largest scale and the smaller service's largest scale.

b. Have you tested the SOAP URL or URL for the Legend?
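The min/max note in 4.a.ii is terse; one plausible reading is that each service's visible scale range should hand off halfway between its own scale limit and the adjacent service's, with a tiny offset so the ranges do not overlap. A sketch of that interpretation, where scales are the denominators of representative fractions and the sample values are illustrative:

```python
def handoff_scale(scale_a, scale_b, offset=0.0):
    """Midpoint between two adjacent services' scale denominators,
    optionally nudged by a small offset to avoid range overlap."""
    return (scale_a + scale_b) / 2 + offset

# e.g., hand off between a 1:24,000 service and a 1:100,000 service,
# per one reading of the "+ .0001" note:
min_scale = handoff_scale(24000, 100000, offset=0.0001)
print(min_scale)  # 62000.0001
```

Whatever the intended formula, capturing it as a small shared function keeps every service's scale thresholds consistent.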

5. TRB and Architecture Considerations (Check which are applicable; Be Conservative)

a. If hardware capital is required, or software using new standards is recommended, has your design been approved by infrastructure? Be sure infrastructure load decisions are first dedicated to the primary users. For instance, static service concepts will be used for small scales to divert dynamic infrastructure needs during unplanned high-load situations (e.g., emergency response, post-outreach periods) for information that is highly static.

b. Is the service set up to best support the primary method of client delivery? For instance (examples only), the primary methods use projected WGS 84 Web Mercator. Does it support the download service as expected? If the service is for ArcMap users, has it been tested to be usable in ArcMap?

c. The service should present the latest data available to be visualized in a “what you see is what you get” brand; all other products should be offered for download only. For instance, the latest imagery should be set up in a way that simplifies presenting the core products. In this case, that is 1-m and 1-ft imagery, with the other products (3", 6", etc.), as deemed by the Lead and the Cross-Cutting P&S Chief, treated as secondary or archived products.

d. Has the Operational Concept to maintain the data flow been updated?

i. Is it documented how often – regularly or irregularly – data changes will be applied?

ii. Demonstrate how this process is supported by automation versus manual labor updates WHERE POSSIBLE (i.e. scripts run by crons or scripts started by user versus manual ETL).

e. If the change is sunsetting a service or even an old URL, has it been reviewed how the Catalog, Monitoring, Viewer, Download, Function, or other users are impacted?

6. These apply to most changes, but the following are especially important if URLs, layers, or fields changed:

a. Addressing just one of these points is not acceptable for a change unless the others are not applicable:

b. Analyze whether a name change is necessary – have you discovered aliases or other ways to avoid name changes?

c. Catalog Coordination – assure that ScienceBase, whose registration points to the service, is made aware of the changes and when they occur.

d. Standards – do new names follow the general best practices?

i. This “pretty naming” should include domain name, directory structure, URL name, Service Name, and field names. The Service URL should be pretty, descriptive, and memorable.

ii. Is the domain name pretty or user friendly? Is the URL an IP or pretty name? Domain name should be something easy to remember – not an IP or internal service name. This can be accomplished via your outbound domain aliasing capabilities via eSAS or your IT team. The original name would still work, but the pretty name would be what you publish.

iii. In the metadata review, is the author named as the organization, not the developer (often the author field is filled in by default from the PC the user is editing on)?

iv. Is there a good description? Are the layers also described well? If in an MXD, this is edited in the description field for the group or data layer.

e. If an MSD service, do you also have an MXD/WMS service? Does the MSD description include the standard text referencing the MXD/WMS version?

f. FAQs – Have they been updated?

g. Download – if the service is used by download, have you coordinated its use in the download service?

h. Staging Deployment and Acceptance Test – tested on staging to assure all works; demoed in staging to the Deployment Manager and Release Manager.

i. Service MXD File (NEW!) – this will be an MXD service palette file, pointing to all the services, available to the 2,000 ArcMap users at USGS. Assure the published MXD file has been prepared to reflect the new changes.

j. Use Analysis – analyze in the web statistics where the service is being used, and provide that analysis to the Deployment Manager. Provide suggestions for communicating the changes if use is high.

k. Monitoring Coordination – assure that IT monitoring pointing to the service is made aware of the changes and when they occur.

l. Communications – time the proper communications so that consumers – monitoring, ScienceBase, non-viewer users – are notified with enough lead time.
