Block scrapers and bad web bots
ScrapeSentry stops unwanted scrapers from benefiting from our clients’ intellectual property.
Differentiating good from bad scrapers, whether human or bot, provides the business intelligence to make real decisions that affect our clients’ bottom line.
Why choose ScrapeSentry?
- Passive, high-powered, multi-layered analysis tailored to specific traffic types, coupled with flexible enforcement of blocking policies
- 24/7 Security Operations Center expertise
- Portal access, scheduled reporting, and scraping analysis that together provide a complete defense against scrapers
Scraping Threat Report 2015
In the Scraping Threat Report 2015, industry experts from ScrapeSentry outline the increasing threat that data theft through web scraping poses to online businesses. Read more and download the report!
Prevent scraping and stop bad bots with ScrapeSentry
ScrapeSentry is a fully managed anti-scraping service built on a proprietary technology platform and backed by 24/7 services delivered from the Sentor Security Operations Centre (SOC). These services include monitoring, analysis, investigation, blocking policy development, enforcement, and support.
The ScrapeSentry service delivery platform is located on the client’s premises. It is either placed passively on a SPAN port, or a module is installed directly on the web servers that aggregates traffic to a passively placed appliance running the ScrapeSentry platform.
Blocking policies are enforced through interaction with existing infrastructure such as load balancers, web servers, or the client’s application.
When the system detects signs of unauthorized usage, it either blocks access automatically or alerts the Sentor SOC for further investigation and intervention within minutes.
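As a rough illustration of the block-or-alert decision described above, the sketch below classifies incoming requests by per-IP request rate. Everything here is hypothetical: the function name `classify_request`, the thresholds, and the sliding-window approach are invented for illustration and do not reflect ScrapeSentry’s proprietary detection logic.

```python
# Hypothetical sketch only: thresholds and logic are invented for illustration,
# not taken from the ScrapeSentry platform, which is proprietary.
from collections import defaultdict, deque
import time

AUTO_BLOCK_RPS = 50   # rate (req/sec) that triggers an immediate automatic block
ALERT_RPS = 10        # lower rate that is only flagged to the SOC for investigation
WINDOW_SECONDS = 10   # sliding window over which the rate is measured

_hits = defaultdict(deque)  # source IP -> timestamps of recent requests


def classify_request(ip, now=None):
    """Record one request from `ip` and return 'allow', 'alert', or 'block'."""
    now = time.time() if now is None else now
    window = _hits[ip]
    window.append(now)
    # Drop timestamps that have fallen out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    rate = len(window) / WINDOW_SECONDS
    if rate >= AUTO_BLOCK_RPS:
        return "block"   # enforce immediately, e.g. via load balancer or web server
    if rate >= ALERT_RPS:
        return "alert"   # escalate to the SOC for human investigation
    return "allow"
```

A real deployment would combine many more signals than raw request rate (headers, session behavior, traffic type), but the same two-tier outcome applies: clear-cut abuse is blocked automatically, while borderline traffic is routed to analysts.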