
Primary Project Page:
TSWG Proposal - Expand TNM Save As & Open In to USGS Wide

Matt Tricomi originally mentioned that NGTOC is continuing to develop the things users can do with a saved map viewing session, including editing and re-visualizing/re-opening it in a viewer. If the particulars of a map viewing session are saved as a context encoded in JSON, it can be re-opened and referenced from remote servers that an API like the TNM Viewer API could call. If the JSON file can be edited, it is conceivable that malicious code (or at least references to malicious code) could be introduced, which raises obvious security issues when re-opening the file.

Project Goal - Solution Live By ESRI UC

Per Rob Dollison, and as a major goal of the NGP 5-year Delivery Blueprint efforts, the target is to be live well before the ESRI UC: we will be shutting down Seamless, the SDW folks at EROS will want to create instances using our TNM API for their clients, and we have had many requests to expose the API.

TSWG Help Request

NGTOC would like to have a discussion with others thinking about these issues. NGTOC's assessment (with which NGP EA agrees) is that if USGS does not already have experience with JSONP callbacks or direct JSON parsing in the browser, NGTOC being the first is likely not a reasonable route. So we want to find out: has anyone else at USGS done this? If not, we will simply parse server-side and treat the session as a general object to parse, lint, and validate on the server.

Several issues of particular interest include:

- a straight JSONP callback that can handle XSS issues, or [CORS|http://www.nczonline.net/blog/2010/05/25/cross-domain-ajax-with-cross-origin-resource-sharing/]
- asking users to use a Viewer API GUID that we generate, a la Google Maps
- bit parity in the saved session, so that if a user edits it they have to regenerate a new one; in doing so, it gets a lint check
- taking the file in, pre-processing it in a Java app for lint and XSS checks, then reading it in (see the sketch below)
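
As a rough illustration of that last option, the sketch below shows what a server-side pre-processing step might look like, assuming the Jackson library is available for strict parsing; the class name, allowed keys, and XSS screen are hypothetical and not part of any existing TNM code.

{code:java}
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.Set;

/**
 * Hypothetical pre-processor for an uploaded map-session JSON file.
 * Parses the text with a strict JSON parser (no eval-style execution),
 * rejects anything that is not well-formed, and flags suspicious values.
 */
public class SessionPreprocessor {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Whitelist of top-level keys we are willing to accept.
    private static final Set<String> ALLOWED_KEYS = Set.of("title", "extent", "layers");

    public static JsonNode validate(String uploadedJson) {
        final JsonNode root;
        try {
            root = MAPPER.readTree(uploadedJson);   // lint: fails on malformed JSON
        } catch (Exception e) {
            throw new IllegalArgumentException("Not well-formed JSON", e);
        }
        if (!root.isObject()) {
            throw new IllegalArgumentException("Session must be a JSON object");
        }
        root.fieldNames().forEachRemaining(name -> {
            if (!ALLOWED_KEYS.contains(name)) {
                throw new IllegalArgumentException("Unexpected key: " + name);
            }
        });
        // Crude XSS screen on one string value; a real check would be stricter.
        JsonNode title = root.path("title");
        if (title.isTextual() && title.asText().toLowerCase().contains("<script")) {
            throw new IllegalArgumentException("Suspicious markup in title");
        }
        return root;   // now safe to treat as plain data
    }
}
{code}

Whether a step like this lives in a standalone Java app or in the viewer's own back end is left open here; the point is only that the uploaded file is treated as data to be validated, never as code to be executed.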

Best Practice notes

A general whitepaper on AJAX/JSON mashups suggests the following:

Recommended Best Practices

5.1 Add an Input Value Check

5.2 Use Vulnerability Checking Tools

5.3 Don't Generate and Execute Code Dynamically

5.4 Don't Insert Untrusted HTML Content Without Sanitizing

5.5 Preventing CSRF Attacks

5.6 Secure the Use of JSON

5.7 Preventing JSON Hijacking Attacks

5.8 Use <iframe> When Integrating Distrusted Contents

Rather than discussing this in an email thread, which will be relatively closed, we can communicate by commenting on this post. You can set a "Watch" under the Tools button so that you'll be emailed about any additions to the conversation.

YUI 3 (which ESRG uses), XJS, and jQuery all do the above (or similar), but ultimately pass the text through an eval().
CIDA has used XJS and jQuery, but with post-cleaned JSON files using decode functionality, so not for JSON security.
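
For contrast, a strict parse treats anything embedded in the JSON purely as data. A minimal sketch of that behavior on the server side, assuming Jackson (the session fields here are invented):

{code:java}
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ParseDontExecute {
    public static void main(String[] args) throws Exception {
        // Hypothetical edited session file with a script injected into a value.
        String edited = "{\"title\": \"<script>alert('xss')</script>\", \"zoom\": 7}";

        JsonNode root = new ObjectMapper().readTree(edited);

        // The injected "script" is just a string value; nothing is evaluated or run.
        System.out.println(root.get("title").asText());
        System.out.println(root.get("zoom").asInt());   // 7
    }
}
{code}

The parsed string would of course still need sanitizing before being written back into a page, which is what the "don't insert untrusted HTML without sanitizing" practice above is about.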

Implementing a proxy approach (the Tucky proxy has been used at CIDA; they also have an in-house custom proxy) would eliminate the need for JSONP anyhow.
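
A minimal sketch of what such a same-origin proxy could look like, assuming a plain servlet container; the servlet class and the whitelist of upstream hosts are invented for illustration, not taken from CIDA's proxies:

{code:java}
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.util.Set;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/**
 * Hypothetical same-origin proxy: the browser asks this servlet for JSON,
 * the servlet fetches it from an approved remote host, and the response
 * comes back from our own domain, so no JSONP callback is needed.
 */
public class JsonProxyServlet extends HttpServlet {

    // Only fetch from hosts we explicitly trust (whitelist, not blacklist).
    private static final Set<String> ALLOWED_HOSTS = Set.of("services.example.usgs.gov");

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String raw = req.getParameter("url");
        if (raw == null) {
            resp.sendError(HttpServletResponse.SC_BAD_REQUEST, "Missing url parameter");
            return;
        }
        URL target = new URL(raw);
        if (!ALLOWED_HOSTS.contains(target.getHost())) {
            resp.sendError(HttpServletResponse.SC_FORBIDDEN, "Host not allowed");
            return;
        }
        resp.setContentType("application/json");
        try (InputStream in = target.openStream()) {
            in.transferTo(resp.getOutputStream());   // relay the remote JSON same-origin
        }
    }
}
{code}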

Input from TSWG:

Comment 1: I-Lin Kuo (CIDA Java Development Team)

If you are concerned about the possibility of injection of malicious scripts inside a JSON object, the easiest way to deal with this is to avoid the use of JavaScript's eval() and use a JSON parser instead. A JSON parser will treat the JSON string as just an object with values, ignoring any scripts embedded in it. See this page for more details: http://www.json.org/js.html

Comment 2: Phethala Thongsavanh (CIDA Java Development Team)

Kind of a generic answer, but the question is rather generic... just thoughts that come to mind right away.

1) Security and usability are always directly opposed to each other; you need to think in terms of that trade-off and restrict accordingly.

2) JSON security issues are no different from those of any other parse-able item; data cleaning and validation are the most obvious routes to address them, and they will only be as good as what the developers have thought of. It jumps out at me that server-side and browser-side validation are being treated as different and/or as alternatives to each other; you should be validating inputs at the server regardless. Same concern, different areas.

3) As far as validating and restricting, if you want to be really secure, start with a "white list" and not a "black list". In other words, EVERYTHING is invalid at first, and then you start adding things to your white list to open it up.
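
As a tiny illustration of that whitelist mindset (the layer names here are invented), nothing is accepted unless it has been explicitly added to the allowed set:

{code:java}
import java.util.Set;

/** Whitelist check: everything is invalid until it has been explicitly allowed. */
public class LayerWhitelist {

    // Start from "everything is invalid", then open it up one entry at a time.
    private static final Set<String> ALLOWED_LAYERS =
            Set.of("usgs-topo", "usgs-imagery", "usgs-hydrography");

    public static boolean isAllowed(String requestedLayer) {
        return requestedLayer != null && ALLOWED_LAYERS.contains(requestedLayer);
    }
}
{code}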

They also have a nice set of links at the bottom of their wiki. In practice, industry-wide, they'd be above the bar if they implemented everything suggested in those links, but they'd probably sacrifice some usability.

Comment 3: Sky Bristol (Core Science Systems)

I know we've had some interesting dynamics with JSONP between my.usgs.gov, www.sciencebase.gov, and other domain shenanigans, and you could have a conversation with Steve Tekell on that. We've done a little work with browser parsing JSON, but probably not yet to the extent you are looking at here. At any rate, the ScienceBase team will certainly be interested in tracking along with this as it progresses.

Comment 4: John Aguinaldo (Eastern Science Region, LCAT)

I think the security issue is similar to that of processing user-supplied data in a server-side script - sanitizing user-supplied data... don't dynamically execute user-supplied data... the best practice links below are totally appropriate. We do a lot of JSON, mostly from our own server, but sometimes proxied from a trusted source. And we've dabbled with JSONP, but haven't really thought through the security implications too deeply, so this is a good subject to discuss...

Comment 5: Jim McAndrew (ATA Contractor)

I would like to know if any of you have had experience using JSONP. We would like to know what needs to be done to ensure that everything works in a secure fashion.

From the TSWG 4/18 meeting: Jim McA. can contact Ivan @ CIDA or John A. @ ESRG directly with further questions about their JSON parsing solutions using third-party YUI 3 parsing or Tucky proxy parsing. They felt it can be done this way, avoiding JSONP or a custom Java server-side app. A WebEx recording of the conversation (which came about two-thirds of the way through the call and captures the TSWG input, mostly from John and Ivan) will be coming.

Comment 6: Jim McAndrew (ATA Contractor)

May 1, 2012

As per our "CDI 'Save Map As' Project Kick-Off Meeting" on April 17, 2012, we have come to a decision in collaboration with David Hughes on the uploader framework. We are going to use a Java application as a go-between for the uploaded JSON and the viewer. This will provide an added layer of security over the pure JavaScript solution we currently use for internal "config" data.
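
One possible shape for such a go-between, sketched under the same assumptions as the pre-processor above (a servlet container is available; the class names, paths, and in-memory storage are hypothetical rather than the actual uploader framework):

{code:java}
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.fasterxml.jackson.databind.JsonNode;

/**
 * Hypothetical go-between: the viewer never reads an uploaded file directly.
 * The upload is validated first, stored under a generated id, and the viewer
 * is handed only that id.
 */
public class SessionUploadServlet extends HttpServlet {

    private final ConcurrentMap<String, JsonNode> sessions = new ConcurrentHashMap<>();

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String body = new String(req.getInputStream().readAllBytes(), StandardCharsets.UTF_8);
        try {
            // Lint, whitelist, and XSS screen before anything else touches the upload.
            JsonNode clean = SessionPreprocessor.validate(body);
            String id = UUID.randomUUID().toString();
            sessions.put(id, clean);
            resp.setContentType("application/json");
            resp.getWriter().write("{\"sessionId\":\"" + id + "\"}");
        } catch (IllegalArgumentException bad) {
            resp.sendError(HttpServletResponse.SC_BAD_REQUEST, bad.getMessage());
        }
    }
}
{code}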