Yes, it's a bug in the crawler: it throws an error about too many files. The problem seems to be caused by recursion, since there can be up to 2,700 concurrently active tasks. I need to find a way to optimize this.
You run them in parallel?
The crawler is fixed now and the site is updated. There's also another problem with the builds, which means the crawl results won't get updated automatically every day.
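In case it's useful, here is a minimal sketch of how the number of concurrently active tasks could be capped with a semaphore, assuming a Python asyncio-based crawler. All names, limits, and helpers here are hypothetical, not the actual crawler code; the point is only that the recursive crawl no longer fans out into thousands of simultaneous tasks and open file handles.

```python
import asyncio

MAX_CONCURRENT = 50  # assumed cap; tune to the host's file-descriptor limit
semaphore = asyncio.Semaphore(MAX_CONCURRENT)

async def fetch(url: str) -> list[str]:
    """Placeholder fetch: download `url` and return the links found on it."""
    async with semaphore:          # at most MAX_CONCURRENT fetches in flight
        await asyncio.sleep(0.1)   # stand-in for the real HTTP request
        return []                  # stand-in for extracted links

async def crawl(url: str, seen: set[str]) -> None:
    """Recursively crawl, with real concurrency bounded by the semaphore."""
    if url in seen:
        return
    seen.add(url)
    links = await fetch(url)
    # Recursion still spawns a task per link, but each task waits on the
    # semaphore before doing any I/O, so resource use stays bounded.
    await asyncio.gather(*(crawl(link, seen) for link in links))

if __name__ == "__main__":
    asyncio.run(crawl("https://example.com", set()))
```

With this pattern the recursion depth stays the same, but only MAX_CONCURRENT fetches (and their file handles) are ever open at once.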