Managed Anti-Scraping Service in Auckland
Full, Professional Data Scraping Protection
If your company provides trading, dating, travel, property or betting services, then you’re at risk from a data scraping attack. A site scrape occurs when a competitor uses a scraping service to copy your data and use it – or sell it – to boost their own profits. They don’t care that they’re breaching copyright or breaking the carefully worded terms and conditions of your website. HDS are now able to provide you with the first-ever tool designed specifically to protect your data from the ever-growing threat of page scraping: Assassin.
What can we do?
You need a cutting-edge web traffic analysis solution, one that can analyse your traffic and alert you when a user is downloading more than the usual amount of information. Of course, the majority of scrapers will disguise themselves as normal customers, and sometimes everyday traffic, such as search engine spiders, can look like scrapers. So how can anti-scraping software tell the difference? Without human supervision, it can’t. And that’s where HDS come in – with our supervision, Assassin provides much more effective protection against scraping without affecting your genuine users.
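To illustrate the kind of threshold alerting described above, here is a minimal sketch in Python of flagging clients that download more than a usual amount in a short period. The window size, threshold and function names are illustrative assumptions, not Assassin’s actual parameters or implementation:

```python
from collections import defaultdict, deque
import time

# Illustrative thresholds only - real systems tune these per site.
WINDOW_SECONDS = 60
MAX_REQUESTS = 120  # roughly two requests per second, sustained

_requests = defaultdict(deque)  # client_id -> timestamps of recent requests

def record_request(client_id, now=None):
    """Record one request; return True if the client exceeds the threshold."""
    now = time.time() if now is None else now
    window = _requests[client_id]
    window.append(now)
    # Drop timestamps that have fallen outside the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > MAX_REQUESTS
```

A client making a handful of requests never trips the threshold, while a scraper hammering the site is flagged within seconds – though, as noted above, a threshold alone cannot distinguish a scraper from a busy legitimate user, which is why human review matters.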
Assassin performs a constant analysis of your website’s traffic and alerts our expert IT engineers whenever there’s an incident that looks like a scraping attack. Our operators evaluate the alert to see whether it’s a real attack or not, then act upon it according to your pre-arranged instructions. We guarantee that we’ll stop any scraping on your website within 45 minutes of detecting it.
Assassin logs all traffic on your website and forwards the details to its real-time database, where its core technology resides. It then analyses the data and checks it against known and dynamic behaviour patterns using pattern matching technology.
This pattern matching is fully controllable in real time, and it can be adjusted as new behaviours appear. Every request the database receives is stored for later, deeper analysis in Assassin’s long-term database. The long-term database also sends its information to a security management portal, which provides you with comprehensive reports and allows you to track scraping trends and communicate with us.
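The idea of checking traffic against known and dynamic behaviour patterns, with rules that can be altered in real time, can be sketched as a small rule table in Python. The rule names, fields and thresholds below are hypothetical examples, not Assassin’s actual patterns:

```python
# Hypothetical behaviour-pattern rules; each maps a name to a predicate
# over a summary of one client's recent traffic.
rules = {
    "burst_rate":  lambda r: r["requests_per_min"] > 100,
    "no_assets":   lambda r: r["pages"] > 20 and r["asset_requests"] == 0,
    "deep_paging": lambda r: r["max_page_index"] > 500,
}

def match_patterns(traffic_summary):
    """Return the names of all rules this traffic summary triggers."""
    return [name for name, rule in rules.items() if rule(traffic_summary)]

# Because the rule table is plain data, operators can add or change
# patterns at runtime as new scraping behaviours appear:
rules["suspect_ua"] = lambda r: r.get("user_agent", "").startswith("scrapy")
```

A match is only an alert, not a verdict – in the workflow described above, a human operator still decides whether the flagged traffic is a real attack.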
Why other methods fall short:
One of the methods companies use to try to prevent scraping is Captcha – the familiar system of asking users to type in the distorted words they see in a graphic. The idea is that, because the words are contained within an image file rather than plain text, scraping software cannot read them. However, there are a number of problems with this system, including the inconvenience for users, usability issues for disabled users, and the possibility that someone could simply pass the Captcha tests manually before carrying out a scraping attack.
Another method is IP address blacklisting. Of course, the most obvious problem with this method is that, as scraping looks so much like genuine usage, you can unintentionally block legitimate customers from using your site.
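The over-blocking problem with IP blacklisting can be seen in a trivial sketch. The addresses and function names here are purely illustrative; the point is that many legitimate users often share one address (a corporate NAT or a mobile carrier), so blocking a scraper’s IP blocks everyone behind it:

```python
# Naive IP blacklisting: one shared set of blocked addresses.
blacklist = set()

def block(ip):
    blacklist.add(ip)

def is_allowed(ip):
    return ip not in blacklist

# One scraper is detected behind a shared NAT address...
block("203.0.113.7")
# ...and now every legitimate customer on that address is blocked too.
```

This is why, as described above, HDS pair automated detection with human judgement before any blocking action is taken.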