Description
I am simulating 10 scenarios with 7 million prices each (so 10 columns and 7 million rows) on my Windows desktop with 32 GB of RAM.
The simulation itself runs through, but when I call pf.value() I get a "MemoryError: failed to allocate..." error.
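Roughly what I am running, with placeholder data and signal logic (my real strategy differs, but the shapes are the same):

```python
import numpy as np
import pandas as pd
import vectorbt as vbt

# Placeholder prices: 7 million rows x 10 columns of float64
close = pd.DataFrame(np.random.uniform(90, 110, size=(7_000_000, 10)))

# Placeholder entry/exit signals (my real logic differs)
entries = close > close.shift(1) * 1.01
exits = close < close.shift(1) * 0.99

pf = vbt.Portfolio.from_signals(close, entries, exits, init_cash=10_000)
pf.value()  # raises MemoryError here
```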
I thought vbt prides itself on being able to analyze very big datasets, and 7 million x 10 does not sound awfully large to me?
ChatGPT tells me:

"With 7 million rows and 10 columns of float64 data, your pf.value() operation alone requires approximately 560 MB just for the raw data (7,000,000 × 10 × 8 bytes). However, vectorbt's internal operations create multiple intermediate arrays and data structures that can multiply this memory requirement by 5-10x or more."
Is there a way to avoid these multiple intermediate arrays?
I can calculate the PnL myself from pf.trades.records; maybe the pf.value() call needs to be optimized for large datasets?
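Two workarounds I am considering, as a sketch (assuming the trade records expose "col" and "pnl" fields, which they do in my version):

```python
# Workaround A: total PnL per column straight from the trade records,
# without materializing the full 7M x 10 value array
total_pnl = pf.trades.records.groupby("col")["pnl"].sum()

# Workaround B: simulate and evaluate one column at a time, so
# intermediates are only ever 7M x 1
values = {}
for col in close.columns:
    pf_col = vbt.Portfolio.from_signals(
        close[col], entries[col], exits[col], init_cash=10_000
    )
    values[col] = pf_col.value()  # a 7M-long Series, ~56 MB
```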
Thx,
Sergej