I'm trying to devise a way to load a large amount of data (upwards of 1000 rows) into a page without pagination. The first hurdle was querying the DB in parallel, bite-sized chunks, which I've done with the help of the solution at How to make sequentially Rest webservices calls with AngularJS?
However, I'm running into two problems with what I've implemented:
First, each request's results are pushed into an array, and that array of arrays is what gets returned as the array Angular binds to, i.e. [[{key:value, key:value, key:value}, {key:value, key:value, key:value}], [{key:value, key:value, key:value}, {key:value, key:value, key:value}]]. As such I can't use ng-repeat="item in data", because data is an array of arrays; "item in data[0]" does make item available. Concatenation seems to be the answer, but I haven't been able to work out a way to make it stick.
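Roughly, this is the kind of flattening I've been attempting (a sketch only; results is just a placeholder name for whatever the $q.all() promise resolves with):
// Sketch: flatten the array-of-arrays that $q.all() resolves with.
// `results` is a placeholder for that resolved value, e.g. [[{...}], [{...}]].
var flat = [].concat.apply([], results);
// `flat` would then be one flat array usable with ng-repeat="item in flat".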
Second, I'm making multiple requests to the database and each one returns correctly, but the page doesn't render until all of the requests have completed, which completely negates the point of making multiple requests in the first place.
So, looking over my code below, how can I rewrite it to solve these two issues, so that the data comes back as one flat array and the page renders each time a query completes?
app.factory('ScanService', function ($http, $q) {
  // Fetch a single chunk of rows for the given step.
  function fetchOne(stepCount) {
    return $http({
      method: 'GET',
      url: '/index.php/scans',
      params: { step: stepCount }
    })
    .then(function onSuccess(response) {
      return response.data;
    });
  }

  return {
    // Fire off all requests in parallel and resolve once every one has finished.
    fetchAll: function (steps) {
      var scans = [];
      for (var i = 1; i <= steps; i++) {
        scans.push(fetchOne(i));
      }
      return $q.all(scans);
    }
  };
});
app.controller('ScanCtrl', function ($scope, $q, ScanService) {
  $scope.scans = ScanService.fetchAll(10);
});
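To make the second issue concrete, this is roughly the behaviour I'm after (a hypothetical sketch, assuming fetchOne were also exposed on the service, which it currently isn't; not working code):
// Hypothetical sketch of the goal: each request appends its rows to one
// scope array as it resolves, so the view grows chunk by chunk instead of
// waiting on $q.all(). Assumes ScanService exposed fetchOne, which it doesn't yet.
app.controller('ScanCtrl', function ($scope, ScanService) {
  $scope.scans = [];
  for (var i = 1; i <= 10; i++) {
    ScanService.fetchOne(i).then(function (rows) {
      Array.prototype.push.apply($scope.scans, rows);
    });
  }
});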
Follow-up
I should add that I did manage to get this working, based on the solution below plus an angular.forEach(). That said, I can't suggest anyone working with "big data" go this route: at around 1000 rows the browser was overwhelmed and began slowing down considerably, and filtering with angular.filter also suffered a significant delay until the results were narrowed down. On the other hand, a few hundred rows worked respectably well and allowed native filtering, which was a key goal for my implementation.
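For anyone curious, the merge step ended up roughly along these lines (a loose reconstruction, not the exact code):
// Loose reconstruction of the merge step, not the exact code I used.
// `chunks` is the array of per-request result arrays from fetchAll().
ScanService.fetchAll(10).then(function (chunks) {
  var flat = [];
  angular.forEach(chunks, function (chunk) {
    flat = flat.concat(chunk);   // merge each chunk into one flat array
  });
  $scope.scans = flat;           // bind the single flat array for ng-repeat
});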