It’s just a few days before the RSA Conference 2007, and some vendors are refusing to get lost in the glut of news from companies scrambling for everyone’s attention at the show.
Take SPI Dynamics, for example. Earlier this week, the Web application security provider unveiled the fruit of a three-year endeavor dubbed Phoenix, in which the company built a new scanning architecture.
Forming the architectural backbone of the latest release of the company’s WebInspect 7 scanning software, Phoenix aims to turn the traditional, more passive Web application security model on its ear.
Traditional Web application scanners can’t effectively handle the newfangled applications written with AJAX (Asynchronous JavaScript and XML).
Moreover, the highly distributed nature of many organizations’ applications
makes it difficult for current Web app scanners to search for
vulnerabilities through multiple servers. It’s like looking for several
needles in several haystacks.
SPI Dynamics officials say WebInspect 7 aims to end the suffering because
Phoenix was designed to work with newer applications to thwart more
sophisticated Web attacks.
Internetnews.com recently caught up with Caleb Sima, co-founder and
CTO of SPI Dynamics. Sima and his development team are responsible for the
conception of Project Phoenix and its incorporation as the intelligent
bedrock for WebInspect 7.
Q: Why did SPI Dynamics decide to create the Phoenix architecture, which you bill as the first Web application-scanning architecture for Web 2.0?
We realized the architecture of the Web scanner today is built for the Web
of 2000. It is obvious to everybody the Web has changed, especially in the
last three years. The Web has just skyrocketed into an entirely different way of being used. Because of that, the vulnerabilities have gotten to be a little bit different. They have become much more difficult to find.
In order to identify these vulnerabilities in Web applications, the way the Web scanner works today had to be drastically changed. The way that we were doing it was simply insufficient and just wasn’t going to work.
That’s why we started Phoenix. We needed to do the next generation of Web
scanning. We needed to make a product that acts like a human hacker, that
thinks like a human hacker and is flexible enough for us to add these things quickly.
Q: With Phoenix, what are some examples of challenges that WebInspect 7
can address that previous versions of WebInspect couldn’t?
There are a bunch of them. One is time. Because Web applications have gotten so complex and big, in the old days crawling even a small Web site would take an hour or so to complete. And then you would start auditing to find vulnerabilities.
That’s the way scanners work today. Now, in order to scan even a small Web site, with AJAX and all of the other technologies going on, it takes a lot longer. It’s much more dynamic. You’re missing things. By the time you finish it, two or three hours into a crawl, you start the auditing. In the meantime, the auditor, or the person using the product, sits on his butt and basically does nothing. What happens is there’s three hours of wasted time while you wait for the auditing to come up.
But with WebInspect 7, we’ve created a new simultaneous crawl and audit methodology. As soon as you start crawling pages, it immediately starts auditing for the types of vulnerabilities you’re looking for. So you can get pretty much instantaneous results as soon as you crawl one page.
That is extremely beneficial to an auditor, because as soon as a vulnerability pops up, they can start working on it. They can determine whether there are false positives. They can determine whether it’s something they need to put in a report in a different way. They can confirm by exploitation; they can go further with the vulnerability. That allows them to work with the product as it’s working, saving them a considerable amount of time.
That doesn’t necessarily sound sexy, but it is amazing how much difference it makes from my perspective as a tester.
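To make the idea concrete, here is a minimal Python sketch of the general simultaneous crawl-and-audit pattern Sima describes: a crawler hands each discovered page to an auditor thread the moment it is found, instead of waiting for the full crawl to finish. This illustrates the concept only, not SPI Dynamics’ implementation; the page paths and the placeholder check are hypothetical.

import queue
import threading

SENTINEL = object()  # signals the auditor that the crawl is finished

def crawl(pages, work_queue):
    # Simulated crawler: each "discovered" page is handed off immediately,
    # so auditing can begin long before the crawl is done.
    for page in pages:
        work_queue.put(page)
    work_queue.put(SENTINEL)

def audit(work_queue):
    # Audits pages as they arrive instead of waiting for a full site map.
    while True:
        page = work_queue.get()
        if page is SENTINEL:
            break
        # Placeholder for real vulnerability checks (XSS, SQL injection, etc.)
        print("auditing %s while the crawl continues" % page)

if __name__ == "__main__":
    q = queue.Queue()
    pages = ["/login", "/search", "/account/profile"]  # hypothetical site
    auditor = threading.Thread(target=audit, args=(q,))
    auditor.start()
    crawl(pages, q)
    auditor.join()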
Q: Do the Web 2.0 technologies also mean you’re seeing new security
vulnerabilities?
The types of vulnerabilities we find are also going to be different.
Previously with WebInspect, there were Web applications that we could not
crawl. For instance, things with CAPTCHA (Completely
Automated Public Turing test to tell Computers and Humans Apart), things
with multi-factor authentication. In order to do scanning for those, you’d have to hit something, look at your token, enter a number and then somehow get the automated product to do that. It was impossible.
With WebInspect 7, as soon as we see a CAPTCHA prompt, we will pause the scan and alert the user; he can get past the CAPTCHA, click OK, and boom, the product continues. Now it’s a very simple and easy method to do these kinds of things.
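The pause-and-resume flow Sima describes can be sketched in a few lines of Python. This is a rough illustration under stated assumptions, not the product’s logic: the detection heuristic is hypothetical, and canned response strings stand in for live HTTP traffic.

def looks_like_captcha(body):
    # Hypothetical detection heuristic; a real scanner uses richer signatures.
    return "captcha" in body.lower()

def scan(responses):
    for url, body in responses:
        if looks_like_captcha(body):
            # Pause the scan and hand control to the human operator.
            input("CAPTCHA at %s -- solve it, then press Enter to resume..." % url)
        print("scanned %s" % url)

if __name__ == "__main__":
    scan([
        ("/home", "<html>welcome</html>"),
        ("/signup", "<html>please complete the CAPTCHA</html>"),  # pauses here
        ("/done", "<html>thanks</html>"),
    ])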
The biggest problem, of course, is JavaScript and AJAX. In WebInspect 6, we
basically built a JavaScript engine and were able to plug it in in a
rudimentary form. But this JavaScript engine was really built for Phoenix,
so we are now able to implement it in WebInspect 7 to deal with JavaScript
and AJAX technologies.
WebInspect 7 will go through a Web application like a user and understand what is going on to its full potential. That’s really important, because honestly, if you don’t do that, you’re not going to get through the Web app and you’re going to miss a bunch of vulnerabilities. A lot of great features and enhancements went into being able to crawl and understand a Web application better.
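The difference is easy to see in a toy example. In this Python sketch (the page and endpoint are hypothetical), a static crawler that only follows href links finds nothing on an AJAX-driven page, while an engine that actually executes the script would discover the endpoint; a simple regex stands in here for a real JavaScript engine.

import re

# A hypothetical page whose only interesting endpoint is created by script.
AJAX_PAGE = """
<html><body>
<div id="results"></div>
<script>
  var xhr = new XMLHttpRequest();  // content arrives via AJAX at runtime
  xhr.open("GET", "/api/search?q=test");
  xhr.send();
</script>
</body></html>
"""

# An HTML-only crawler looks for anchor tags and finds nothing to follow.
static_links = re.findall(r'href="([^"]+)"', AJAX_PAGE)

# A JavaScript-aware engine executing the script would reach the endpoint.
script_endpoints = re.findall(r'xhr\.open\("GET",\s*"([^"]+)"\)', AJAX_PAGE)

print("static crawler sees:", static_links)               # []
print("JS-aware engine could reach:", script_endpoints)   # ['/api/search?q=test']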
Q: Who do you run into on the competitive landscape as you troll for
customers? Why is WebInspect 7 a more attractive alternative?
Our main competitor is Watchfire, and we have a smaller competitor called
Cenzic. Watchfire copies a lot. From our perspective, we were first with a
Windows product, the first with the technology that we have and the first
with our methodology of auditing Web applications.
Unfortunately, everyone has copied our technology, basing it on crawl and audit. We just knew three years ago that that wasn’t going to hold up. We fundamentally changed the way the engines worked and the way things got audited. They’re running off of what I consider old 2000-based architecture. They’re trying to keep up with Web 2.0 by adding features and enhancements to an old, fundamentally flawed architecture. It’s just not going to work. At some point, they’re going to have to turn around.
Q: What do you expect will be one of the hot topics at RSA this year?
I think browser security will be big. Your biggest interface to the Web is the browser. People are just starting to catch on to cross-site scripting vulnerabilities, which have absolutely huge potential. AJAX has added some really big security concerns.