CasperJS and PhantomJS trigger a "site down" message, but a regular browser does not

So, I am trying to scrape the site ( https://shop.advanceautoparts.com/ ) and I have been able to access it over the past few weeks via CasperJS. When I try to do this now (starting about 2 days ago), I get a strange message that the site is down:

(screenshot of the "site down" message)

When I use a regular browser or PhantomJS, the site loads normally. I've tried doing this from different computers, changing my IP, and changing the user agent, but nothing works.

EDIT

After trying the same thing with PhantomJS and running the code about 5 times, I got the same message there too. Is this something the site does to prevent scraping?





1 answer


I suspect the site detects that you are scraping based on your user agent, since you are hitting it repeatedly with the same one.

Maybe try randomizing your user agent and see what happens (see the list here).

// Create a Casper instance with a custom user agent
var casper = require('casper').create({
  pageSettings: {
    userAgent: "USE SOME OTHER USER AGENT HERE"
  }
});
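
A possible way to do that is to pick a user agent at random on each run. This is just a sketch; the user agent strings below are examples, so swap in whatever list you prefer:

// Example desktop user agent strings (placeholders -- use your own list)
var userAgents = [
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36",
  "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_5) AppleWebKit/603.2.4 (KHTML, like Gecko) Version/10.1.1 Safari/603.2.4",
  "Mozilla/5.0 (X11; Linux x86_64; rv:53.0) Gecko/20100101 Firefox/53.0"
];

// Pick one at random for this run
var randomUA = userAgents[Math.floor(Math.random() * userAgents.length)];

var casper = require('casper').create({
  pageSettings: {
    userAgent: randomUA
  }
});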


However, the site can also block you by IP address after several requests in quick succession. So try a) slowing down your script or b) spreading your requests across different pages.
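
For option a), one rough sketch is to insert a randomized pause before each page visit (the URL list and delay range below are placeholders, not something taken from the site):

var urls = [
  'https://shop.advanceautoparts.com/'
  // placeholder: add the other pages you actually want to visit here
];

casper.start();

casper.each(urls, function(self, url) {
  // wait a random 5-15 seconds before each request so it looks less like a bot
  self.wait(5000 + Math.floor(Math.random() * 10000));
  self.thenOpen(url, function() {
    this.echo(this.getTitle());
  });
});

casper.run();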

EDIT



I put together a test script and everything works for me. The important bit:

casper.waitUntilVisible("#header-top", function() {

(screenshot showing the page loading successfully in the test script)
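
For reference, a minimal version of such a test script could look like this (the #header-top selector comes from the snippet above; everything else is an assumption about how the test was set up):

var casper = require('casper').create({
  pageSettings: {
    // any ordinary desktop user agent
    userAgent: "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36"
  }
});

casper.start('https://shop.advanceautoparts.com/');

// The real page renders this header element; the "site down" page does not
casper.waitUntilVisible("#header-top", function() {
  this.echo("Site loaded: " + this.getTitle());
  this.capture("shop.png");  // save a screenshot as proof it rendered
});

casper.run();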

HTH









