Ahmed Sakr

Reputation: 129

How to resume crawling from the last depth reached when I restart my crawler?

Hello everyone. I am building a web application that crawls many pages from a specific website. I started my crawler4j job with unlimited depth and page count, but it stopped unexpectedly due to an internet connection failure. Now I want to continue crawling that website without re-fetching the URLs I already visited, given that I know the depth of the last pages crawled.

Note: I would like a solution that does not compare my stored URLs against newly fetched ones, because I don't want to send too many requests to this site.

**Thanks** ☺

Upvotes: 1

Views: 203

Answers (1)

rzo1

Reputation: 5751

You can enable resumable crawling in crawler4j by setting

crawlConfig.setResumableCrawling(true);

in your `CrawlConfig`. See the crawler4j documentation for details.
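A minimal sketch of how this fits into a typical crawler4j setup is shown below. The storage folder path, seed URL, and `MyCrawler` class are placeholder assumptions; with resumable crawling enabled, crawler4j persists its frontier to the storage folder, so restarting the same program with the same folder continues from where the previous run stopped instead of re-fetching visited URLs.

```java
import edu.uci.ics.crawler4j.crawler.CrawlConfig;
import edu.uci.ics.crawler4j.crawler.CrawlController;
import edu.uci.ics.crawler4j.fetcher.PageFetcher;
import edu.uci.ics.crawler4j.robotstxt.RobotstxtConfig;
import edu.uci.ics.crawler4j.robotstxt.RobotstxtServer;

public class ResumableCrawl {
    public static void main(String[] args) throws Exception {
        CrawlConfig crawlConfig = new CrawlConfig();
        // Must point at the SAME folder across restarts, or there is
        // nothing to resume from. Placeholder path.
        crawlConfig.setCrawlStorageFolder("/data/crawl-storage");
        // Persist the crawl frontier so an interrupted crawl can continue.
        crawlConfig.setResumableCrawling(true);

        PageFetcher pageFetcher = new PageFetcher(crawlConfig);
        RobotstxtServer robotstxtServer =
                new RobotstxtServer(new RobotstxtConfig(), pageFetcher);
        CrawlController controller =
                new CrawlController(crawlConfig, pageFetcher, robotstxtServer);

        // On a resumed run, already-scheduled/visited URLs are not re-fetched.
        controller.addSeed("https://example.com/");  // placeholder seed
        controller.start(MyCrawler.class, /* numberOfCrawlers */ 4);
    }
}
```

Note that resumable crawling must already have been enabled when the original crawl started; if the first run did not persist its state, there is no saved frontier to resume from.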

Upvotes: 2
