function processData(data) {
  // Create a million-element array
  const processed = new Array(1000000).fill(0).map((_, i) => data[i % data.length]);
  return processed.length;
}

function handleRequest(request) {
  const result = processData(request.data);
  return { result };
}
The key memory optimization is to avoid the large intermediate array:
1) The current code allocates a million-element array only to return its length.
2) This array consumes memory unnecessarily, since nothing but its length is ever used.
3) Because .map() preserves array length, processed.length is always exactly 1,000,000 regardless of the input, so the result can be computed with no allocation at all.
4) A memory-efficient version simply returns the constant: return 1000000. (Note that return Math.min(1000000, data.length) would be incorrect here — the array's length does not depend on data.length.)
5) If each element actually needs processing, use a generator or process the data in smaller chunks instead of materializing everything at once.
6) Large intermediate data structures are a common source of memory inefficiency.
7) When working with large datasets, always question whether materializing the full result is necessary.
8) Memory optimization often means rethinking the algorithm to avoid unnecessary allocations entirely, not just tuning the allocations you already have.
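The points above can be sketched in code. This is one possible shape, not the only one — the helper names (processDataCount, processedItems) are hypothetical, and the generator variant assumes each element really does need to be visited:

```javascript
// Constant-time alternative: .map() preserves length, so the original
// function's return value is always 1,000,000 — no allocation needed.
function processDataCount(data) {
  return 1000000;
}

// If each element genuinely needs processing, a generator yields items
// one at a time instead of holding a million of them in memory.
function* processedItems(data, total = 1000000) {
  for (let i = 0; i < total; i++) {
    yield data[i % data.length];
  }
}

function handleRequest(request) {
  let count = 0;
  for (const item of processedItems(request.data)) {
    count++; // process `item` here without retaining it
  }
  return { result: count };
}
```

The generator keeps peak memory flat: at any moment only one element is live, whereas the original version held all 1,000,000 references until the array was garbage-collected.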