
oss-sec mailing list archives

Re: Robust XML validation
From: Florian Weimer <fweimer () redhat com>
Date: Fri, 14 Dec 2012 10:06:43 +0100

On 12/13/2012 01:47 PM, Timo Warns wrote:

>> I wonder if we should care about this in the sense that we should
>> prepare fixes, or if it is sufficient to recommend to validate against
>> trusted schemas/DTDs only.  (I've found an implementation which gets
>> right the things I tested so far, so efficient implementations aren't

> Validating against trusted schemas/DTDs would not be sufficient in my
> opinion. For example, such validations are not effective against the
> billion laughs attack (http://en.wikipedia.org/wiki/Billion_laughs).

True, entity expansion is required for XML parsing, strictly speaking, not just for validation. Some XML implementations use heuristics to stop such attacks. And of course, there's the big hammer of disallowing all entity declarations.

> Moreover, some projects deliberately decide against schema validation.
> For example, when fixing CVE-2012-2665, LibreOffice developers have
> decided against validating the manifest.xml against a schema or DTD.
> If I understood correctly, the reason was that omitting validations
> allows to open documents in a future format on a best-effort basis (as
> an alternative to annoying the user with a "format not supported" message).

I'm not an expert on schema authoring. (Actually, I once tried to define an extensible XML schema and couldn't get it to work.) Looking at RELAX NG, there doesn't seem to be a way to say, "you can put any tag here, but if it's <myimportanttag>, it must have *this* structure", either. So it's probably feasible to validate only during generation, to check that your hand-crafted code produces the expected document structure. Which is a bit odd.

Florian Weimer / Red Hat Product Security Team
