Let's say that for each result I need to scrape 3 pages: the base page, /about, and /contact. How would you go about that? Would you nest crawlers inside each other, enqueue the links and keep track of intermediate results in KV storage, or something else?
Meaning: first enqueue the base page, then in the base handler enqueue /about and put the results from the base page in the userData, and so on, until in the /contact handler the whole merged result is pushed to the dataset.
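The chaining approach described above can be sketched framework-agnostically. This is a minimal simulation, not real crawler code: `fetchPage` is a hypothetical stand-in for an HTTP fetch plus parse, and the queue/handler loop mimics what a Crawlee-style crawler does with `request.userData` and labeled requests, with the final handler pushing the merged record to the dataset.

```typescript
// Sketch of "enqueue and carry partial results in userData":
// BASE -> /about -> /contact, merging into one dataset record at the end.

type Label = 'BASE' | 'ABOUT' | 'CONTACT';
interface Req { url: string; label: Label; userData: Record<string, unknown>; }

function crawl(startUrls: string[], fetchPage: (url: string) => string): object[] {
  const dataset: object[] = [];
  const queue: Req[] = startUrls.map((url) => ({ url, label: 'BASE', userData: {} }));

  while (queue.length > 0) {
    const req = queue.shift()!;
    const page = fetchPage(req.url); // stand-in for the real request + extraction

    if (req.label === 'BASE') {
      // Enqueue /about, carrying the base result forward in userData.
      queue.push({ url: `${req.url}/about`, label: 'ABOUT', userData: { base: page } });
    } else if (req.label === 'ABOUT') {
      // Enqueue /contact, merging the about result into the carried userData.
      const base = req.url.replace(/\/about$/, '');
      queue.push({
        url: `${base}/contact`,
        label: 'CONTACT',
        userData: { ...req.userData, about: page },
      });
    } else {
      // CONTACT: everything has been collected; push the full record.
      dataset.push({ ...req.userData, contact: page });
    }
  }
  return dataset;
}
```

In an actual crawler you would replace the manual queue with the framework's request queue (e.g. adding labeled requests from inside the handler) and the `dataset` array with its dataset push call; the key idea is the same either way, so no KV store or nested crawlers are needed, just one crawler with per-label handling and userData threading.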