
Are you using safe HTTP Headers?

There are a variety of web applications out there that rely on HTTP headers for different purposes: automatic redirection, streaming a binary file to the client, controlling how content is cached on the client, adapting the site’s functionality and interface to the capabilities of the browser, and a lot more I’m sure you can think of.

If you’re upgrading an existing application that relies on HTTP headers to ASP.NET 2.0 (or higher), you might run into some problems, especially if you’re serving binary content (say, a PDF file) to your clients: a corrupted file, an unsupported type, or the inability to print the downloaded document are some of the symptoms you may see.

First, check whether your code has something like the following:

Response.ContentType = "application/pdf"; 
Response.AppendHeader("Content-Disposition", "attachment; filename=document.pdf"); 
Response.AddHeader("Content-Length", m.GetBuffer().Length.ToString()); 
ObjPdf writer = ObjPdf.getInstance(document, m); 
document.Open(); 

Try changing it to this:

Response.ContentType = "application/pdf"; 
Response.AppendHeader("Content-Disposition", "attachment; filename=document.pdf"); 
ObjPdf writer = ObjPdf.getInstance(document, m); 
document.Open(); 
Response.AddHeader("Content-Length", m.GetBuffer().Length.ToString()); 

If m.GetBuffer().Length is zero at that point you have a problem: nothing has been written to the stream yet, so it’s important to create the writer and open the document before adding the Content-Length header.
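
Putting it all together, here is a minimal sketch of the whole download path. It assumes m is a MemoryStream, document is your PDF document object, and it keeps ObjPdf as a stand-in for whatever PDF library you’re actually using, so treat those calls as placeholders:

MemoryStream m = new MemoryStream();

// Create the writer and open the document first, so the PDF actually
// gets written into the stream before we look at its length.
ObjPdf writer = ObjPdf.getInstance(document, m);
document.Open();
// ... add your content to the document here ...
document.Close(); // how the document gets closed/flushed depends on your PDF library

byte[] bytes = m.ToArray(); // only the bytes that were actually written

Response.Clear();
Response.ContentType = "application/pdf";
Response.AppendHeader("Content-Disposition", "attachment; filename=document.pdf");
// Only now compute Content-Length, when the buffer is no longer empty.
Response.AddHeader("Content-Length", bytes.Length.ToString());
Response.BinaryWrite(bytes);
Response.End();

Calling Response.End() also stops ASP.NET from appending the rest of the page output after the PDF bytes, which is another common cause of corrupted downloads.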

useUnsafeHeaderParsing

If that’s not enough, or you don’t want to change your code (maybe because too many pages are affected), you can change your web.config instead:

<configuration> 
    <system.net> 
        <settings> 
            <httpWebRequest useUnsafeHeaderParsing="true" /> 
        </settings> 
    </system.net> 
</configuration>

The useUnsafeHeaderParsing config option relaxes header parsing so that headers do not have to strictly follow the standard described in the HTTP RFC. This option was added for backwards compatibility, because header parsing was changed to be very strict. Unfortunately a fair number of servers do not correctly follow the RFC, so clients talking to those servers will probably break because of this change. However, not using strict header parsing is a security risk, because a malicious server could send the client malformed headers which the client would then handle incorrectly. If you use the config option to turn off strict parsing you probably won’t get the server protocol error anymore, but you open the client up to attack. The best solution is to try and get the server fixed.

When this property is set to false, the following validations are performed during HTTP parsing:

  • End-of-line markers must use CRLF; using CR or LF alone is not allowed
  • Header names must not contain spaces
  • If multiple status lines exist, all additional status lines are treated as malformed header name/value pairs
  • The status line must have a status description, in addition to a status code
  • Header names cannot contain non-ASCII characters. This validation is performed whether the property is set to true or false
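
To make those rules concrete, here’s a small sketch with made-up raw responses (none of these come from a real server); each string shows the kind of header block that trips one of the validations above:

// Made-up raw responses, used only to illustrate the validations above.
class MalformedHeaderExamples 
{ 
    // LF-only line endings instead of CRLF: 
    const string LfOnly = "HTTP/1.1 200 OK\nContent-Type: application/pdf\n\n"; 

    // A space inside a header name: 
    const string SpaceInName = "HTTP/1.1 200 OK\r\nContent Type: application/pdf\r\n\r\n"; 

    // A status code with no status description on the status line: 
    const string NoDescription = "HTTP/1.1 200\r\nContent-Type: application/pdf\r\n\r\n"; 

    // A non-ASCII character in a header name (rejected even when 
    // useUnsafeHeaderParsing is set to true): 
    const string NonAsciiName = "HTTP/1.1 200 OK\r\nX-Größe: 10\r\n\r\n"; 
} 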

When a protocol violation occurs, a WebException is thrown with its status set to ServerProtocolViolation. If the UseUnsafeHeaderParsing property is set to true, validation errors are ignored. Setting this property to true has security implications, so it should only be done if backwards compatibility with a server is required.
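
On the client side, this is roughly where the violation shows up; a minimal sketch (the URL is just a placeholder) of catching it around an HttpWebRequest call:

using System; 
using System.Net; 

class HeaderCheck 
{ 
    static void Main() 
    { 
        // Placeholder URL; point it at the server that returns the problematic headers. 
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/document.pdf"); 

        try 
        { 
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse()) 
            { 
                Console.WriteLine("Status: " + response.StatusCode); 
            } 
        } 
        catch (WebException ex) 
        { 
            if (ex.Status == WebExceptionStatus.ServerProtocolViolation) 
            { 
                // With strict parsing (the default), malformed headers end up here 
                // unless useUnsafeHeaderParsing is turned on in the config. 
                Console.WriteLine("The server sent headers that violate the HTTP RFC."); 
            } 
            else 
            { 
                throw; 
            } 
        } 
    } 
} 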


Carlo

Quote of the day:

I’m not sure I want popular opinion on my side — I’ve noticed those with the most opinions often have the fewest facts. – Bethania McKenstry
