-
500,000 rows is a lot of data to show in a web browser. Without knowing your business needs, the first question I would ask is whether this data can be shown in smaller pages that don't overwhelm memory. I would use a memory profiler to see what kind of data is being generated. Since you're working with a third-party grid control, I would consider the possibility that the control is generating a lot of its own bookkeeping data to track the data you supply to it. That control may not have been designed to handle large amounts of data.
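As a rough first check before reaching for a full profiler, you could measure managed-heap growth around the fetch on the server side, where the memory balloons. A minimal sketch; `portal` is an assumed injected `IDataPortal<T>` for your list type, and the names are illustrative:

```csharp
// Rough first check: measure managed-heap growth around the fetch call.
// "portal" is an assumed injected IDataPortal<T> for your list type. A real
// memory profiler will show *what* is on the heap; this only shows *how much*.
long before = GC.GetTotalMemory(forceFullCollection: true);
var list = await portal.FetchAsync();
long after = GC.GetTotalMemory(forceFullCollection: true);
Console.WriteLine($"Managed heap grew ~{(after - before) / (1024 * 1024)} MB");
```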
-
I wouldn't consider this a CSLA issue per se. As @hurcane mentions, 500k records is a lot to try to load into memory. The Radzen grid does appear to handle paging (fetching a small "page" of data at a time as the user moves through the grid): https://www.radzen.com/documentation/blazor/custom-datagrid-paging/

CSLA can do paging too, but it takes some manual coding. You'd add the page size and current position as parameters to your fetch and pass those through to the data portal. Then, in the data access layer, use SQL's OFFSET and FETCH NEXT syntax to retrieve the specified page of data and only that page (see the sketch below).

Having said all that, here is the question I have: is this UI design usable? If there are 500k records in a grid, who is going to scroll through them to find the records they want? From a design perspective, you may first want to ask the user what they are looking for in the table, and fetch and show only those records.
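To make that concrete, here is a minimal sketch of a paged CSLA fetch, assuming CSLA 6/7-style attributed data portal methods and SQL Server 2012+ for OFFSET/FETCH. The type names (PersonList, PersonInfo), table, columns, and connection handling are all illustrative, not from the original post:

```csharp
using Csla;
using Microsoft.Data.SqlClient;

[Serializable]
public class PersonInfo : ReadOnlyBase<PersonInfo>
{
    public static readonly PropertyInfo<int> IdProperty = RegisterProperty<int>(nameof(Id));
    public int Id => GetProperty(IdProperty);

    public static readonly PropertyInfo<string> NameProperty = RegisterProperty<string>(nameof(Name));
    public string Name => GetProperty(NameProperty);

    [FetchChild]
    private void Fetch(int id, string name)
    {
        LoadProperty(IdProperty, id);
        LoadProperty(NameProperty, name);
    }
}

[Serializable]
public class PersonList : ReadOnlyListBase<PersonList, PersonInfo>
{
    [Fetch]
    private void Fetch(int pageNumber, int pageSize,
        [Inject] IChildDataPortal<PersonInfo> childPortal)
    {
        using var connection = new SqlConnection("<your connection string>");
        connection.Open();
        using var command = connection.CreateCommand();
        // OFFSET/FETCH NEXT returns only the requested page, so only
        // pageSize rows are materialized and serialized by the data portal.
        command.CommandText =
            @"SELECT Id, Name
              FROM dbo.Person
              ORDER BY Id
              OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY";
        command.Parameters.AddWithValue("@Offset", pageNumber * pageSize);
        command.Parameters.AddWithValue("@PageSize", pageSize);
        using var reader = command.ExecuteReader();
        using (LoadListMode) // permits Add on a read-only list while loading
        {
            while (reader.Read())
                Add(childPortal.FetchChild(reader.GetInt32(0), reader.GetString(1)));
        }
    }
}
```

On the client, an injected `IDataPortal<PersonList>` would then fetch one page at a time, e.g. `await portal.FetchAsync(0, 50)`. The Radzen grid's paging callback reports how many rows to skip and take (per the linked doc), which maps directly onto these parameters.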
-
I have a Blazor WASM .NET 7 app that uses CSLA 7 to load datasets into Radzen grids. I am using virtualization with the grids, and ADO.NET with stored procedures for data access. The app is currently hosted on a test server with 2 cores and 8 GB of RAM.
I am encountering a problem only when accessing large datasets: the fetch appears to go into a memory-consuming loop on the server and eventually times out. As an example, I have a dataset with 514,417 records occupying 106 MB of storage. When I try to fetch that dataset, the IIS worker process consumes all available CPU and a steadily increasing amount of RAM, reaching 5 GB by the time the request times out.
Has anyone encountered something similar, or have any idea why this is happening? Any suggestions for debugging?