Sunday, May 4, 2008

SHIELDING WEB SERVICES FROM ATTACK

Web services are almost irresistible. By their nature, they allow one system to find and interact with another with little or no human intervention. Yet the very virtues that make Web services compelling, their reliance on trusted protocols and their passage through multiple intermediaries, also make them a potential entry point for criminals. Ultimately, we need to recognize Web services' vulnerabilities as part of a growing awareness that security must be addressed in the code of applications, not just through firewalls and gateways.

A common root cause of these vulnerabilities is the mistaken belief that applications are exposed only to internal personnel rather than to the world at large. Web services also frequently pass messages through several intermediaries before they reach their final destination, undercutting technologies such as SSL, which secures only a single connection rather than the message across its whole path.
A high percentage of Web services interact with databases, and SOAP and XML make it easy to disguise malicious payloads, opening new avenues for buffer-overflow attacks targeting an enterprise's most vital systems. Other common Web service exploits include XML parser attacks, in which an effectively infinite string leads to a denial of service, and XML external entity attacks, in which a request points to an invalid file, resulting in an error that may cause the Web service to give out information it shouldn't disclose. A parser configuration that blocks both classes of attack is sketched below.
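
As a concrete illustration of defending against the XML-level attacks above, here is a minimal Java sketch of a hardened parser configuration. The class and method names are hypothetical; only the standard JAXP DocumentBuilderFactory features (the Xerces disallow-doctype-decl switch and the SAX external-entity switches) are assumed to be available in the runtime.

import javax.xml.XMLConstants;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import java.io.InputStream;

public class HardenedXmlParser {

    // Builds a parser configuration that rejects DTDs and external entities,
    // closing off XXE and entity-expansion (denial-of-service) payloads.
    public static DocumentBuilder newHardenedBuilder() throws Exception {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();

        // Disallow DOCTYPE declarations entirely; most XXE payloads need one.
        dbf.setFeature("http://apache.org/xml/features/disallow-doctype-decl", true);

        // Belt and suspenders: disable external general and parameter entities.
        dbf.setFeature("http://xml.org/sax/features/external-general-entities", false);
        dbf.setFeature("http://xml.org/sax/features/external-parameter-entities", false);

        // Do not expand entity references or process XIncludes, and turn on
        // the JAXP secure-processing limits against runaway expansion.
        dbf.setExpandEntityReferences(false);
        dbf.setXIncludeAware(false);
        dbf.setFeature(XMLConstants.FEATURE_SECURE_PROCESSING, true);

        return dbf.newDocumentBuilder();
    }

    // Parses an incoming SOAP/XML request body with the hardened configuration.
    public static Document parseRequest(InputStream requestBody) throws Exception {
        return newHardenedBuilder().parse(requestBody);
    }
}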

Defensive Measures: Although Web services raise risks, organizations need not fall victim to security breaches if they take proactive measures. The biggest defense comes from making sure the code itself handles input safely, preferably before it's ever exposed to the Net. Although plenty of coders use blacklists to keep well-known types of malicious input from being executed, the more prudent approach is to employ white lists: for example, a field that asks for a Social Security number will accept only a positive value that has exactly nine digits, as sketched below.
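
To make the white-list idea concrete, here is a minimal sketch of the Social Security number check described above. The class name and field are illustrative only; the point is that input is accepted solely when it matches an explicit pattern, and everything else is rejected outright rather than checked against a list of known-bad strings.

import java.util.regex.Pattern;

public class SsnValidator {

    // White list: exactly nine digits and nothing else. Anything that does not
    // match is rejected, instead of trying to enumerate known-bad inputs.
    private static final Pattern NINE_DIGITS = Pattern.compile("^\\d{9}$");

    public static boolean isValidSsn(String input) {
        return input != null && NINE_DIGITS.matcher(input.trim()).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidSsn("123456789"));            // true  - matches the white list
        System.out.println(isValidSsn("123-45-6789"));          // false - dashes are not on the list
        System.out.println(isValidSsn("<script>1</script>"));   // false - rejected outright
    }
}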

Security professionals should also take careful inventory of every service that's exposed to the Internet, preferably through an audit carried out by someone external to the IT department. That approach can be particularly effective in identifying services left behind by a previous generation of developers. Whether the services are already in place or not yet deployed, each one needs to be thoroughly tested using a variety of methods: 1) scan every port of every IP address and carefully query each service that responds, and 2) check whether UDDI servers, WSDL files, and other self-describing mechanisms are giving up information that could aid an attacker. A rough sketch of the port-scanning step appears below.
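
The following sketch shows only the first step of that inventory, finding which ports on a host accept connections. The host name and port range are hypothetical placeholders; a real audit would use a dedicated scanner, cover every address in scope, and then probe any WSDL or UDDI endpoints the scan turns up.

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class ServiceInventoryScan {

    // Attempts a TCP connection to each port in the range; any port that
    // accepts the connection is hosting a service that must be inventoried
    // and queried to see what information it gives up.
    public static void scan(String host, int fromPort, int toPort) {
        for (int port = fromPort; port <= toPort; port++) {
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(host, port), 200); // 200 ms timeout
                System.out.println("Open port " + port + " on " + host
                        + " -- query this service and review what it discloses");
            } catch (IOException ignored) {
                // Closed or filtered port; nothing listening here.
            }
        }
    }

    public static void main(String[] args) {
        // Hypothetical internal host; scan only systems you are authorized to audit.
        scan("apps.example.internal", 1, 1024);
    }
}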

This article is closely related to both Chapter 7 (Telecommunications, the Internet, and Wireless) and Chapter 8 (Securing Information Systems). Technology has brought tremendous positive changes through a wide variety of systems, but it has also raised major concerns over security. It is certain that the more we innovate, there will always be someone working just as hard to undermine it.

Dan Goodin, InfoWorld (San Mateo), Nov. 27, 2006, Vol. 28, Issue 48, p. 29 (3 pages).
