Introduction
You've built a Power App that works perfectly on small datasets. Then a colleague opens it with 400 real product records - and the dreaded "App Unresponsive" warning appears. The app isn't broken. It's doing exactly what you asked - processing everything simultaneously, on a single thread, with no capacity left for anything else.
Power Apps applications frequently handle large datasets with complex, record-level business logic such as pricing rules, tax computation, and discounts. When this logic runs entirely on the client side in a single operation, performance problems like freezing and "App Unresponsive" warnings become inevitable.
In this post, we walk through exactly why this happens and demonstrate how chunking - processing records in smaller sequential batches - resolves the problem without changing your underlying business logic.
Technical Challenge
Consider a typical Power Apps scenario with the following characteristics:
- Each record may contain nested or related data.
- Complex per-record calculations apply, such as pricing rules, tax computation, or discounts.
- All processing is performed using client-side Power Fx formulas.
When a large dataset is processed in a single operation, the application may freeze and users may receive "Unresponsive" warnings - even though the app is still executing logic in the background.
Important: Performance degradation depends not only on the number of records, but also on the complexity of calculations per record and the user's system configuration - CPU, memory, browser, and device performance.
Why Power Apps Become Unresponsive
1. All Records Are Processed at Once
A single ForAll() loop executes calculations for every record in one uninterrupted pass. The impact grows with both collection size and per-record complexity: light calculations may handle 1,000+ records without issue, while heavy nested formulas can trigger an unresponsive state at just 200–300 records.
2. High Memory and CPU Consumption
Each record evaluation produces intermediate calculation results. When hundreds of records are processed in a single pass, three compounding failures occur at once:
- Memory Spike - All intermediate results are held in memory simultaneously, causing excessive consumption that overwhelms the client device.
- CPU Overload - Parallel processing of hundreds of complex calculations saturates the processor, leaving no capacity for UI rendering or user interaction.
- UI Freeze - The render thread is blocked entirely, preventing any screen updates until all processing completes.
3. UI Thread Blocking
Power Apps evaluates formulas on the same thread that renders the UI. While large calculations run, the UI cannot refresh - making the app appear frozen even if execution is still ongoing in the background.
Before Chunking: Processing All Records at Once
The following example processes all records in a single pass. While straightforward to write, this pattern blocks the UI thread for the entire duration and is the primary cause of "App Unresponsive" warnings in production.
ForAll(
    colProducts,
    Patch(
        colProducts,
        ThisRecord,
        {
            BaseRevenue: BasePrice * QuantitySold,
            DiscountAmount: BasePrice * QuantitySold * DiscountRate,
            TaxAmount: ((BasePrice * QuantitySold) -
                (BasePrice * QuantitySold * DiscountRate)) * TaxRate,
            FinalRevenue:
                (BasePrice * QuantitySold)
                - (BasePrice * QuantitySold * DiscountRate)
                - (((BasePrice * QuantitySold) -
                    (BasePrice * QuantitySold * DiscountRate)) * TaxRate)
        }
    )
);
Why this causes problems: The threshold at which this fails is not fixed - it scales directly with per-record logic complexity and with the user's device. Note also that the formula above recomputes BasePrice * QuantitySold (and the discount term) several times per record, which multiplies the per-record cost.
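Independent of chunking, that repeated work can be trimmed by computing each shared subexpression once per record with With(). This is a sketch, not the chunking fix this post is about - it produces identical results, just with fewer multiplications per record:

```powerfx
// Same business logic as above, but each shared value is computed
// once per record. Nested With() blocks are used because a With()
// record cannot reference its own sibling fields.
ForAll(
    colProducts As _rec,
    With(
        { _gross: _rec.BasePrice * _rec.QuantitySold },
        With(
            { _discount: _gross * _rec.DiscountRate },
            With(
                { _tax: (_gross - _discount) * _rec.TaxRate },
                Patch(
                    colProducts,
                    _rec,
                    {
                        BaseRevenue: _gross,
                        DiscountAmount: _discount,
                        TaxAmount: _tax,
                        FinalRevenue: _gross - _discount - _tax
                    }
                )
            )
        )
    )
);
```

Reducing per-record cost raises the record count you can process before hitting an unresponsive state, but it does not remove the ceiling - chunking is still needed for large collections.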
After Chunking: Processing Records in Batches
The chunked implementation below processes records incrementally. The business logic is identical - only the delivery mechanism changes. Start with a batch size of 100 and adjust based on the complexity of your formulas and the capabilities of your target devices.
// Adjust _chunkSize based on formula complexity and device capability
Set(_chunkSize, 100);
Set(_totalRecords, CountRows(colProducts));
ForAll(
    Sequence(RoundUp(_totalRecords / _chunkSize, 0)) As _batch,
    With(
        {
            _start: (_batch.Value - 1) * _chunkSize + 1,
            _end: Min(_batch.Value * _chunkSize, _totalRecords)
        },
        ForAll(
            Sequence(_end - _start + 1, _start, 1) As _row,
            With(
                { _p: Last(FirstN(colProducts, _row.Value)) },
                Patch(
                    colProducts,
                    _p,
                    {
                        BaseRevenue: _p.BasePrice * _p.QuantitySold,
                        DiscountAmount: _p.BasePrice * _p.QuantitySold * _p.DiscountRate,
                        TaxAmount: ((_p.BasePrice * _p.QuantitySold) -
                            (_p.BasePrice * _p.QuantitySold * _p.DiscountRate)) * _p.TaxRate,
                        FinalRevenue: (_p.BasePrice * _p.QuantitySold)
                            - (_p.BasePrice * _p.QuantitySold * _p.DiscountRate)
                            - (((_p.BasePrice * _p.QuantitySold) -
                                (_p.BasePrice * _p.QuantitySold * _p.DiscountRate)) * _p.TaxRate)
                    }
                )
            )
        )
    )
);
How it works: The outer ForAll(Sequence(...)) iterates over batch numbers. For each batch, a With() scope calculates the start and end record indices. The inner loop retrieves and processes each record individually using Last(FirstN()) - a standard Power Fx idiom for positional record access. The UI thread is free to refresh between batches, keeping the experience smooth throughout.
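In isolation, the positional-access idiom works like this. Power Fx collections have no direct index operator, so FirstN() truncates the collection to its first N records and Last() takes the final one of those:

```powerfx
// Retrieve the record at position 5 of colProducts:
// FirstN keeps records 1–5, Last returns record 5.
Set(_fifthProduct, Last(FirstN(colProducts, 5)));
```

This is O(N) per lookup rather than O(1), which is another reason to keep batches modest in size.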
Does chunking change the final output or results?
No. Chunking only changes the order in which records are processed, not the calculations applied to each one. The final state of your collection will be identical to what a single ForAll() would produce - it simply gets there without freezing the app.
When should I not use chunking?
For small collections - typically fewer than 100 records with simple arithmetic - a standard ForAll() loop remains perfectly appropriate and easier to maintain. Chunking adds structural complexity, so apply it where the performance benefit is real.
How do I know if my batch size is too large?
If users still report freezes or unresponsive warnings after applying chunking, reduce the batch size. Start at 100, test on your lowest-spec target device, and decrease by 25–50 until the experience is consistently smooth.
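One option is to derive the batch size from the collection size rather than hardcoding it. The thresholds below are illustrative starting points, not tested recommendations - tune them against your own formulas and devices:

```powerfx
// Illustrative heuristic: heavier total workloads get smaller batches.
Set(_chunkSize,
    If(
        CountRows(colProducts) > 1000, 50,
        CountRows(colProducts) > 500, 75,
        100
    )
);
```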
Can this pattern be used with SharePoint lists instead of local collections?
Yes, with some adaptation. When working directly against a SharePoint data source, the same batching logic applies - however, each Patch() call will be a network request, so consider the additional latency and delegate where possible to reduce client-side load.
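A minimal sketch of the same batching against a SharePoint list follows. Here 'Products' is a hypothetical list name, ID is its standard key column, and FinalRevenue is assumed to exist as a list column - every Patch() below is a separate network round trip:

```powerfx
Set(_chunkSize, 50); // smaller batches are often safer over the network
Set(_total, CountRows(colProducts));
ForAll(
    Sequence(RoundUp(_total / _chunkSize, 0)) As _batch,
    With(
        {
            _start: (_batch.Value - 1) * _chunkSize + 1,
            _end: Min(_batch.Value * _chunkSize, _total)
        },
        ForAll(
            // LastN(FirstN(...)) slices records _start.._end of the collection
            LastN(FirstN(colProducts, _end), _end - _start + 1) As _p,
            Patch(
                'Products',
                LookUp('Products', ID = _p.ID),
                { FinalRevenue: _p.FinalRevenue }
            )
        )
    )
);
```

Where the data source and formula allow it, a delegable bulk update or a Power Automate flow may be a better fit than per-record Patch() calls.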
Conclusion
A single ForAll() loop across a large collection blocks the UI thread entirely until every record is processed - causing the freezes and "App Unresponsive" errors that users experience as crashes. The underlying logic isn't wrong; the delivery mechanism is.
Chunking resolves this by processing records in sequential batches. Memory stays bounded, the UI thread has room to breathe, and the app remains interactive throughout the operation - regardless of how complex your per-record formulas are.
