Block scrapers and bad web bots
ScrapeSentry stops unwanted scrapers from benefiting from our clients’ intellectual property.
Differentiating good scrapers from bad ones, whether human or bot, provides the business intelligence to make real decisions that affect our clients’ bottom line.
Why choose ScrapeSentry?
- Passive, high-powered, multi-layered analysis tailored to specific traffic types, coupled with flexible enforcement of blocks
- 24/7 Security Operations Centre expertise
- Portal access, scheduled reporting, and scraping analysis provide a total defense against scrapers
ScrapeSentry finds security breach
Researchers at ScrapeSentry, leading anti-scraping and IT security specialists, have uncovered a sinister side effect of a free app that over a million Google Chrome users have downloaded, and which potentially leaks all of their personal information.
Prevent scraping and stop bad bots with ScrapeSentry
ScrapeSentry is a fully managed anti-scraping service based on a proprietary technology platform and 24/7 services delivered from the Sentor Security Operations Centre (SOC). These services include monitoring, analysis, investigation, blocking policy development, enforcement, and support.
The ScrapeSentry service delivery platform is located on the client premises. It is either deployed passively on a SPAN port, or a module is installed directly on the web servers to aggregate traffic to a passively deployed appliance running the ScrapeSentry platform.
Blocking policies are enforced through interaction with existing infrastructure, such as load balancers, web servers, or the client’s application.
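As a rough illustration of how a blocking policy can be pushed into existing infrastructure, the sketch below renders a set of blocked IP addresses as `deny` directives that a reverse proxy or load balancer could include in its configuration. This is a minimal, hypothetical example; the function and config format here are illustrative assumptions, not ScrapeSentry’s actual interface.

```python
# Hypothetical sketch: exporting a blocking policy as 'deny' directives
# that existing infrastructure (e.g. a reverse proxy) can include.
# All names and formats are illustrative, not ScrapeSentry's real mechanism.

def render_deny_list(blocked_ips):
    """Render a set of IPs as one 'deny' directive per line."""
    return "\n".join(f"deny {ip};" for ip in sorted(blocked_ips))

blocked = {"203.0.113.7", "198.51.100.23"}
config = render_deny_list(blocked)
print(config)
```

In practice the rendered file would be written somewhere the proxy reads it and the proxy would be told to reload its configuration; the key point is that enforcement reuses infrastructure the client already operates.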
When the system detects signs of unauthorized usage, it either blocks access automatically or alerts the Sentor SOC for further investigation and intervention within minutes.
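The detect-then-respond flow above can be sketched as a simple request-rate heuristic: clients far above a threshold are blocked automatically, while borderline clients are escalated for analyst review. The thresholds, window size, and function names below are illustrative assumptions only; real multi-layered analysis would combine many more signals than request rate.

```python
# Hypothetical sketch of the detect-then-respond flow: auto-block obvious
# scrapers, escalate borderline traffic to analysts. Thresholds are
# illustrative, not ScrapeSentry's actual policy.
from collections import defaultdict, deque
import time

WINDOW = 60.0        # seconds of traffic considered per client
AUTO_BLOCK = 600     # requests/window that trigger an automatic block
ALERT = 120          # requests/window that trigger a SOC alert

hits = defaultdict(deque)  # per-IP timestamps of recent requests

def classify(ip, now=None):
    """Record one request from `ip` and decide how to respond."""
    now = time.time() if now is None else now
    q = hits[ip]
    q.append(now)
    # Drop timestamps that have fallen out of the sliding window.
    while q and now - q[0] > WINDOW:
        q.popleft()
    if len(q) > AUTO_BLOCK:
        return "block"   # enforce immediately
    if len(q) > ALERT:
        return "alert"   # hand off for investigation
    return "allow"
```

A client making a handful of requests stays in the `allow` state; one that bursts past the alert threshold is flagged, and only sustained high-rate traffic is blocked outright, mirroring the automatic-block-or-escalate behavior described above.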