Hi, @JoshSGman
I think so, yeah. You could parse the page, enqueue the newly found URLs into the scheduling queue, and define another function to deal with those new URLs.
I'm not sure I understand what you mean, though. Do you mean you want to crawl the new URLs found on one page, and then the new URLs found on those new pages?
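Something like this rough sketch is what I have in mind (plain Go, standard library only; `task`, `parseListing`, and `parseDetail` are just placeholder names for illustration, not the framework's real API):

```go
package main

import "fmt"

// task pairs a URL with the function that should handle it once fetched,
// which is the "define another function to deal with these new URLs" idea.
type task struct {
	url    string
	handle func(t task, queue *[]task)
}

// fetchLinks is a stub for the real download/parse step; it returns the
// links "found" on a page so the sketch stays self-contained.
func fetchLinks(url string) []string {
	if url == "https://example.com" {
		return []string{"https://example.com/a", "https://example.com/b"}
	}
	return nil
}

// parseListing handles the seed page: it enqueues every discovered URL
// with a different handler for the next level.
func parseListing(t task, queue *[]task) {
	for _, l := range fetchLinks(t.url) {
		*queue = append(*queue, task{url: l, handle: parseDetail})
	}
}

// parseDetail handles the second-level pages.
func parseDetail(t task, queue *[]task) {
	fmt.Println("detail page:", t.url)
}

func main() {
	// scheduling queue seeded with the start URL and its handler
	queue := []task{{url: "https://example.com", handle: parseListing}}
	for len(queue) > 0 {
		t := queue[0]
		queue = queue[1:]
		t.handle(t, &queue)
	}
}
```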
@gaocegege Hey! Thanks for the response - stepping through the methods, I wasn't quite sure what enqueues the following links. Essentially, I would like to keep crawling a site until it's 10 levels deep. Would I use the `request` method to enqueue the new URLs? Does `request` ultimately call `Parse` again?
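For context, here is roughly the depth-limited behaviour I'm after, sketched in plain Go without the framework (`maxDepth`, `item`, and `extractLinks` are just my own illustration):

```go
package main

import "fmt"

const maxDepth = 10

// item pairs a URL with the level at which it was discovered, so the
// crawl can stop enqueueing once it reaches maxDepth.
type item struct {
	url   string
	depth int
}

// extractLinks is a stub for whatever Parse would return; it fabricates
// one child link per page so the example terminates.
func extractLinks(url string) []string {
	return []string{url + "/next"}
}

func main() {
	queue := []item{{url: "https://example.com", depth: 0}}
	seen := map[string]bool{}

	for len(queue) > 0 {
		cur := queue[0]
		queue = queue[1:]
		if seen[cur.url] {
			continue
		}
		seen[cur.url] = true
		fmt.Println("crawling", cur.url, "at depth", cur.depth)

		// re-enqueue children only while they would stay within 10 levels
		if cur.depth < maxDepth {
			for _, l := range extractLinks(cur.url) {
				queue = append(queue, item{url: l, depth: cur.depth + 1})
			}
		}
	}
}
```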
Thanks in advance!