Application Inspection
So far we have looked at tools that examine the web server. In doing so, we miss vulnerabilities that may be present in the web application. This class of vulnerabilities arises from insecure programming and misconfiguration of the interaction between web servers and databases. We can’t explain the nature of web application insecurity and the methodology and techniques for finding those vulnerabilities within a single chapter. What we will show are the tools necessary for you to peek into a web application. Although a few of these programs have grown from the security community, they deserve a place in a web application programmer’s debugging tool kit as well.
Achilles
Aptly named, Achilles helps pick apart web applications by acting as a proxy with a pause button. A normal proxy sits between a web browser and a web server, transparently forwarding requests and responses between the two. Achilles works similarly, but it adds functionality that lets you modify contents on the fly. For example, Achilles lets you manipulate cookie values, POST requests, hidden Form fields, and every other aspect of an HTTP transaction—even over SSL!
Implementation
Because it’s a proxy, Achilles must first be set up to listen on a port and placed into “intercept” mode. Clicking the play button (the triangle) starts the proxy, and clicking the stop (square) button stops it—think of a tape recorder’s controls.
It’s a good idea to leave the Ignore .jpg/.gif option enabled. Modifying image files rarely bypasses a web application’s security stance, and the number of requests it generates from a single web page quickly becomes annoying.
Next, set your web browser's proxy to the IP address (127.0.0.1 if it's the same computer) and port (5000, by default) on which Achilles listens. Normally, it's easiest to run Achilles on your localhost. Any web browser that supports an HTTP proxy, from Lynx to Galeon, can use Achilles; the only platform restriction is that Achilles itself is a Win32 binary, so it must run on Windows.
In basic intercept mode, you can browse a web site or multiple web sites transparently. The Log To File option will save the session to a file. This is useful for surveying a web application. The logfile holds every link that was visited, including helper files such as JavaScript (*.js) and other include (*.inc) files that are not normally seen in the URL. The other advantage is that you now have a copy of the HTML source of the target web site. This source might reveal hidden Form fields, cookie values, session-management variables, and other information about the application. The techniques for picking apart a web application are well beyond the scope of this chapter, but having a tool like Achilles is an absolute requirement for performing such tests.
In active intercept mode, you can view the requests made by the browser (Intercept Client Data) or responses sent by the server (Intercept Server Data (text)). Intercepting client data enables you to manipulate GET and POST requests as well as cookie values. This capability is used to bypass authentication and authorization schemes and to impersonate other users. Achilles' text box basically functions as a text editor.
Using Achilles probably sounds abstract by now. This is definitely a tool in the “pictures are worth a thousand words” category. Launch Achilles, change your web browser’s proxy setting, make sure to choose Intercept Client Data, and browse your favorite web site. You’ll be surprised to see what goes on behind the scenes of ordering a book or checking your bank balance!
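For instance, with Intercept Client Data enabled, the text box might display a trapped request such as the following. This is a hypothetical example; the host, cookie names, and form fields are invented purely for illustration:

POST /account/transfer.asp HTTP/1.0
Host: www.victim.com
Cookie: sessionid=D27D35A0; role=user
Content-Type: application/x-www-form-urlencoded
Content-Length: 22

amount=100&toacct=0042

Every byte of this request, from the role value in the cookie to the fields in the POST body, can be edited before it is forwarded to the server, which is exactly how tampered-parameter and authorization tests are performed.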
Interception Problems
Achilles intercepts only text data. A site that uses ActiveX components, also known as COM (Component Object Model) objects or CAB (cabinet) files, is more resilient to interception because such files appear as binary data that Achilles always ignores. Achilles still correctly proxies the HTTP connection, but you will not be able to manipulate the data. Other binary objects, such as ZIP or PDF file downloads, are also proxied but are not shown in the text window.
Web sites that use SSL often generate errors in Achilles. A problematic site with 20 objects on its home page (such as pictures, style sheets, JavaScript files, and HTML) might generate 20 “Client failed SSL connection” errors. This is not really a big deal, but it does mean that you have to click 20 different OK buttons to close each error indication.
Some sites tend to cause Achilles to crash unexpectedly, and there is no good rule of thumb for predicting which sites will do so. One workaround is to log onto the site without the proxy, and then start the proxy and change your browser's settings once you reach the part of the application you wish to inspect. Unfortunately, this technique fails against sites that use strong session management. Finally, Achilles handles HTTP Basic Authentication, but any web application that uses NTLM authentication (supported by IIS) will not work through Achilles.
WebSleuth
WebSleuth puts proxy functionality right in the browser. It is a set of Visual Basic routines wrapped around Internet Explorer. Obviously, this ties you to the Win32 platform, but the tool proves useful enough to consider for web application testing. It allows you to step through a site while examining cookies and HTML source, taking notes along the way. It has also grown from an Internet Explorer shim into a full-featured application testing tool. The 1.36 version is free, but buggy. The 1.41 series fixes several bugs and adds new functionality, most noticeably a request interceptor.
Implementation
The green, red, and blue buttons located on the bottom right control site navigation: Go, Back, Stop, Forward, Reload. The Properties, Toolbox, Plugins, and Favorites menus are accessed by clicking the menu with either mouse button.
The Source tab enables you not only to view the HTML source of a web page but also to apply syntax highlighting (AutoColor option) and even to reformat muddled HTML into a more human-readable version (Cleanup option). This is a boon to anyone who has ever tried to slog through web applications whose HTML is littered with punctuation characters, tags, and too few spaces to separate it all.
The best addition to WebSleuth is the Intercept tab, whose configuration options cover almost any scenario you could wish for. The options enable you to trigger the intercept engine only for URLs with a particular file extension, or only when the URL contains a query string, which is one of the most common reasons for intercepting a request. It can also trigger on POST requests or when the URL contains a particular string. Another setting allows for a Gateway Proxy, which enables you to chain WebSleuth with another proxy, something that Achilles sorely lacks.
Another addition to the control tab selections is the Spider tab. Just as you would expect, this tab sets the options for WebSleuth's internal site-crawling engine. The crawler has difficulty with authentication-based applications but nevertheless performs fairly well. A nice feature, which isn't present in other local proxies, is the ability to add notes for each page. Highlight any of the pages in the left-hand pane of the window, and the right-hand pane displays an Add/Edit Notes button. You can note whether the page has been tested, whether any vulnerabilities were discovered, or whether the HTML contained sensitive information.
The Properties menu button displays information about the current page. It does not affect “properties” of the application, nor can it change properties of the current page. It merely reports information. It is useful for extracting focused types of information about the current page: Links, Forms, Cookies, Frames, Query Strings, Images, Scripts, Comments, and Meta Tags.
The Toolbox menu button has some of the best functions. The HTML Transformations function is especially cool. It removes scripts, which defeats many types of input validation routines, and it shows hidden fields, which reveal session, server, and client variables. Also, the Generate Report function creates an excellent list of the current page's cookies, links, query strings, Form information, script references, comments, and META tags.
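To see why removing scripts matters, consider a hypothetical form of the kind HTML Transformations is designed to neutralize; nothing here is taken from a real site:

<form method="POST" action="checkout.asp" onsubmit="return checkQty()">
  <!-- the application trusts this hidden price field -->
  <input type="hidden" name="price" value="129.99">
  <input type="text" name="qty">
  <script type="text/javascript">
    function checkQty() {
      // client-side only: rejects non-numeric quantities
      return /^[0-9]+$/.test(document.forms[0].qty.value);
    }
  </script>
</form>

Stripping the onsubmit handler disables the client-side check, and revealing the hidden price field makes it trivial to submit an arbitrary price, a reminder that validation enforced only in the browser is no validation at all.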
The Plugins menu serves as WebSleuth’s method of extending its core capabilities. It enables such activities as request editing (now over SSL as well), testing HTTP PUT and DELETE verbs, and cookie attribute manipulation.
Paros Proxy
Now that Achilles and WebSleuth have been mentioned, it is time to introduce the new heavyweight in the local proxy arena: Paros. While Achilles introduced the utility of local proxies, its development stalled prematurely, and WebSleuth is intractably tied to Internet Explorer. Paros is a Java-based proxy that not only builds on the concept of a local proxy but also adds significant enhancements to usability, testing techniques, and data presentation. In other words, you should download, install, and try Paros, because it's an excellent tool!
Implementation
Paros is pure Java. Hence, you can download and compile the source yourself or simply obtain the binary and begin testing. You will need the Java 1.4 environment, so be sure to update your system's Java installation if it does not meet this requirement. Once installed, launch Paros and set your browser's HTTP proxy setting to port 8080 and HTTPS proxy to port 8443. Now you are ready to begin examining a web application: navigate through the application as you normally would via the web browser. Paros silently records the directory and file structure of every request. The directory structure of an osCommerce application, for example, appears in the Web Site Hierarchy window in the upper-left corner of the interface.
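If you are unsure whether your environment qualifies, the standard Java commands below check the installed version and start the program. The jar file name is an assumption based on typical Paros distributions and may differ in your download:

$ java -version
$ java -jar paros.jar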
Although Paros observes every aspect of the request, whether the request uses HTTP or HTTPS, it will log only cookies and the site hierarchy by default. If you wish to record other aspects of the application, navigate to the Filters tab on the interface and set your desired options. Even though the GET and POST files have an .xls extension, they are tab-delimited plain-text files that you can view with a text editor or import into a spreadsheet application. The files are written to the directory from which Paros is executed.
Your next option is to instruct Paros to scan the items in the site hierarchy for common vulnerabilities. Navigate to the Scan tab and check the types of scans you wish to perform. Scans are not performed automatically; you must right-click an entry in the Web Site Hierarchy window. This opens a pop-up menu that enables you to select Scan Selected Node, Delete Selected Node, or Clear All. If you select Scan Selected Node, Paros begins its predefined tests.
The filters and scan options represent techniques not available in Achilles and only approximated in WebSleuth. Of course, the greatest benefit of a local proxy is the ability to intercept and rewrite web requests. Paros provides this capability in the Trap tab, which is split into two sections. The Header section shows the intercepted request when Trap Request is checked. This allows you to view and edit the entire URL and headers that will be sent to the server. Once you click Continue, the Header and Body sections are populated with, appropriately enough, the HTTP header and body data returned by the server. This process is shown in the next two figures. Notice that a single quote has been inserted into the forum='all URL parameter. The Header section, which used to contain the modified request, now contains the Date, Server, and other fields. More interesting is the Body section, which displays the error produced in the back-end MySQL database by the extraneous single quote inserted into the forum parameter.
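As a sketch of what that test looks like on the wire (the path below is hypothetical, and the response is typical MySQL error text rather than a verbatim capture from the figures):

GET /forum/view.php?forum='all HTTP/1.0
Host: www.victim.com

The body of the server's response then contains an error along the lines of:

You have an error in your SQL syntax; check the manual that corresponds
to your MySQL server version for the right syntax to use near ''all' at line 1

An error like this confirms that the parameter reaches the database unsanitized, which is the first step in identifying a SQL injection vulnerability.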
The ability to rewrite and insert arbitrary characters into HTTP GET and POST requests makes a tool like Paros indispensable for auditing the security of a web application. Paros is just a tool; the techniques and tricks of testing web application security are far too broad to cover in this chapter.
Finally, Paros has an additional function hidden under the Tools menu. You can have Paros spider any HTTP or HTTPS application and populate the site hierarchy window automatically. The spider function works with varying success that depends on what the application requires with regard to cookies, headers, and authentication. Nevertheless, it serves as a nice utility that will improve over time.
Wget
The final tool we present probably seems out of place compared to the previous tools. Wget is a command-line tool that basically copies a web site’s contents. It starts at the home page and follows every link until it has discovered every page of the web site. When someone performs a security audit of a web application, one of the first steps is to sift through every page of the application. For spammers, the goal would be to find e-mail addresses. For others, the goal would be to look for programmers’ notes that perhaps contain passwords, SQL statements, or other juicy tidbits. In the end, a local copy of the web application’s content enables the person to search large sites quickly for these types of information.
Wget has other uses from an administrator’s point of view, such as creating mirrors for highly trafficked web sites. The administrators for the mirrors of many web sites (such as http://www.samba.org and http://www.kernel.org) use wget or similar tools to reproduce the master server on alternative servers. They do this to reduce load and to spread web sites geographically.
Implementation
As wget's main purpose is to download the contents of a web site, its usage is simple. To spider a web site recursively, use the -r option:
$ wget -r www.victim.com
...(continues for entire site)...
The -r or --recursive option instructs wget to follow every link on the home page. This will create a www.victim.com directory and populate that directory with every HTML file and directory wget finds for the site. A major advantage of wget is that it follows every link possible. Thus, it will download the output for every argument that the application passes to a page. For example, the viewer.asp file for a site might be downloaded four times:
- viewer.asp@ID=555
- viewer.asp@ID=7
- viewer.asp@ID=42
- viewer.asp@ID=23
The @ symbol represents the ? delimiter in the original URL. The ID is the first argument (parameter) passed to the viewer.asp file. Some sites may require more advanced options such as support for proxies and HTTP Basic Authentication. Sites protected by Basic Authentication can be spidered in this way:
[root@meddle]# wget -r --http-user=dwayne --http-passwd=woodelf \
> https://www.victim.com/secure/
...continues for entire site...
Sites that rely on cookies for session state or authentication can also be spidered by wget. Create a cookie file that contains a set of valid cookies from a user’s session. The prerequisite, of course, is that you must be able to log in to the site to collect the cookie values. Then, use the --load-cookies option to instruct wget to impersonate that user based on the cookies:
$ wget --load-cookies=cookies.txt \
> -r https://www.victim.com/secure/menu.asp
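The --load-cookies option expects the Netscape cookies.txt format: one tab-delimited line per cookie giving the domain, a subdomain flag, the path, a secure flag, the expiration time as a Unix epoch value, the cookie name, and the value. A hypothetical file for the session above might look like this (fields separated by tabs):

# Netscape HTTP Cookie File
www.victim.com	FALSE	/secure	TRUE	1104537600	ASPSESSIONIDGQGGGNYC	KMFMAFBMPGCCDFBMHJHA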
Still other sites purposefully set cookies to defeat most spidering tools. Wget can handle session and saved cookies with the appropriately named --cookies option. It is a Boolean value, so you can either turn it off (the default) or on:
$ wget --load-cookies=cookies.txt --cookies=on \
> -r https://www.victim.com/secure/menu.asp
The --http-user and --http-passwd options enable wget to access web applications that employ HTTP Basic Authentication. Set the values on the command line and watch wget fly:
$ wget --http-user=guest --http-passwd=no1knows \
> -r https://www.victim.com/maillist/index.html
In the end, wget provides a quick method for downloading the HTML contents of a web application for off-line analysis. If you are frustrated by the spidering capabilities of Paros, use wget to perform these tasks.
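Once the mirror is on disk, ordinary command-line tools handle the sifting described earlier. The patterns and paths below are illustrative, not exhaustive:

$ grep -ri "password" www.victim.com/
$ grep -rhoE "[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+" www.victim.com/ | sort -u
$ grep -rn "<!--" www.victim.com/

The first line hunts for stray credentials, the second harvests e-mail addresses, and the third locates HTML comments that may contain programmers' notes.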