fierDeToiMonGrandF

Optimization: replacing addRequests with sendRequest to save Queue Writes cost—blocking concerns?

Hey everyone, I’m trying to optimize my actor's costs. I noticed "Request Queue Writes" are eating up a huge chunk of my budget because I currently enqueue a separate request for a "Details/Enrichment" page for every single item I scrape.

I want to switch to fetching these details inline using `sendRequest` to skip the queue write costs. However, the target site is strict about blocking.

My question is: if I use `sendRequest` inside a handler, is manually passing `session` and `proxyInfo.url` enough to maintain the exact same fingerprint/IP as the main crawler?
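To verify this myself, I was planning to log the headers the main crawler sent and diff them against the headers the inline call sends. This helper is entirely hypothetical (not from the Crawlee API), just a way to spot mismatches; header names are lower-cased first since casing can differ between calls:

```javascript
// Hypothetical helper: diff two header objects to spot fingerprint mismatches
// between the parent request and the inline sendRequest call.
function diffHeaders(parentHeaders, inlineHeaders) {
    const norm = (h) => Object.fromEntries(
        Object.entries(h).map(([k, v]) => [k.toLowerCase(), v]),
    );
    const a = norm(parentHeaders);
    const b = norm(inlineHeaders);
    const keys = new Set([...Object.keys(a), ...Object.keys(b)]);
    const diffs = {};
    for (const key of keys) {
        // Record any header that differs (or exists on only one side)
        if (a[key] !== b[key]) diffs[key] = { parent: a[key], inline: b[key] };
    }
    return diffs;
}
```

An empty result would at least tell me the header fingerprints match, even if it says nothing about TLS-level fingerprinting.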


I want to avoid the "Details" request looking like a fresh bot hit. Here is the pattern I'm trying to implement:

// CURRENT EXPENSIVE APPROACH:
// Enqueues a whole new request, costing 1 Write + 1 Handled per item
/*
await crawler.addRequests([{
    url: detailUrl,
    label: 'EXTRA_DATA',
    userData: { ... }
}]);
*/

// PROPOSED COST-SAVING APPROACH:
// Fetches inline to avoid touching the Request Queue DB
router.addHandler('ITEM_PAGE', async ({ request, session, proxyInfo, sendRequest, pushData }) => {

    // 1. Scrape the main item data
    const mainData = extractMainData();
    const detailUrl = extractDetailUrl();

    // 2. Fetch extra details inline without queueing
    let extraData = {};
    if (detailUrl) {
        // My concern: does this perfectly mimic the parent request?
        const { body } = await sendRequest({
            url: detailUrl,
            session: session,          // Passing current session
            proxyUrl: proxyInfo?.url,  // Passing current proxy IP
        });

        extraData = extractExtraData(body);
    }

    // 3. Store the main and detail data as one dataset item
    await pushData({ ...mainData, ...extraData });
});
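On the blocking side, my fallback plan is to sanity-check the inline response and retire the session if it looks blocked, so the pool rotates to a fresh IP on retry. A rough sketch of the heuristic (the status codes and the "captcha" marker are my guesses for this site, not anything official):

```javascript
// Rough heuristic: treat classic anti-bot status codes, or a captcha marker
// in the response body, as a sign the inline request was blocked.
function looksBlocked(statusCode, body = '') {
    if ([403, 429, 503].includes(statusCode)) return true;
    return body.toLowerCase().includes('captcha');
}
```

Inside the handler I'd then do something like `if (looksBlocked(res.statusCode, res.body)) session?.retire();` before parsing, so the next attempt goes out on a different session.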


Thanks!