I am working with the grid extensively and am implementing multiple workarounds for one main problem: the grid cannot work with data it does not have.
Many of the recordsets I am returning contain more than 50k records. To satisfy bandwidth requirements, I have implemented a paging algorithm with a page size of 1,000 records.
The problem arises when using group-bys. Because the grid is bound to only one page of the dataset rather than the whole thing, grouping information is not accurate. Even if I override the grouping events and force the SQL to sort on whichever column I am grouping by, the information in the grid will only be accurate when the size of the group is <= the page size.
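To make the mismatch concrete, here is a minimal sketch of the problem and of one common workaround: let the database aggregate over the full recordset instead of grouping only the page the grid holds. The table and column names (`orders`, `region`) are purely illustrative, and sqlite3 stands in for whatever backend is actually in use:

```python
import sqlite3

# Illustrative dataset: 5,000 rows, two groups, paged 1,000 at a time.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT)")
conn.executemany(
    "INSERT INTO orders (region) VALUES (?)",
    [("North" if i % 3 else "South",) for i in range(5000)],
)

PAGE_SIZE = 1000

# Grouping only the rows in the current page undercounts every group
# whose members span more than one page -- the symptom described above.
page = conn.execute(
    "SELECT region FROM orders ORDER BY region LIMIT ?", (PAGE_SIZE,)
).fetchall()
page_counts = {}
for (region,) in page:
    page_counts[region] = page_counts.get(region, 0) + 1

# Accurate group totals come from letting the server aggregate over the
# entire recordset, independent of whichever page the grid is showing.
server_counts = dict(
    conn.execute("SELECT region, COUNT(*) FROM orders GROUP BY region")
)

print(page_counts)    # page-local counts: wrong for groups larger than a page
print(server_counts)  # true counts over all 5,000 rows
```

The grid would then display the server-computed group headers and totals while still fetching row data one page at a time.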
How would you suggest dealing with this problem?
Imported from legacy forums. Posted by jraddock (had 1850 views)
One way you could do this is by implementing your own data-binding objects and downloading the data on demand from a central server.
If you have detail grids, you could download their contents on the fly.
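The on-the-fly idea might be sketched as a lazy wrapper around each master row, where the detail rows are only downloaded the first time the row is expanded. The `fetch_details` callback here is a hypothetical stand-in for whatever server call the real application would make:

```python
class LazyDetailRow:
    """Holds a master row and fetches its detail rows only on first access."""

    def __init__(self, master_id, fetch_details):
        self.master_id = master_id
        self._fetch = fetch_details   # callback that hits the central server
        self._details = None          # nothing downloaded yet

    @property
    def details(self):
        if self._details is None:     # first expansion: download once
            self._details = self._fetch(self.master_id)
        return self._details


# Usage with a stand-in for the server call:
calls = []

def fetch_details(master_id):
    calls.append(master_id)           # record each simulated round-trip
    return [f"detail-{master_id}-{n}" for n in range(3)]

row = LazyDetailRow(42, fetch_details)
assert calls == []                    # no download until the row is expanded
first = row.details                   # triggers exactly one fetch
second = row.details                  # served from cache, no second fetch
assert calls == [42]
```

This keeps bandwidth proportional to what the user actually expands, rather than the full 50k-record master/detail set.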
Other than that, the grid will need all the information so that it can group the rows correctly.
Personally, and this is only my own view, I'd go to the users and ask whether they really need all that information. If you mean that a row in the table is 50k and you're returning 1,000 at a time, that's a shed load of data, which is a lot for a client's computer. Would they really need all of it, and how useful would it be? That's just my personal take.
It’s an idea anyway.
Imported from legacy forums. Posted by Chris (had 2926 views)