Pagination is essential when working with large datasets. ServiceSQL provides Laravel-style pagination that makes it easy to split results into manageable pages.
Working with large datasets in Google Sheets can lead to:

- Execution timeouts (Apps Script's 6-minute limit)
- Memory issues
- Slow response times
- A poor user experience
Loading all records from a large table with .get() can cause timeout errors. Always use pagination for tables with more than 1000 rows.
Call paginate() at the end of any query to fetch a specific page:

```javascript
const db = APPSQL.init({ spreadsheetId: "YOUR_ID" });

const result = db.table("Users")
  .where("status", "active")
  .orderBy("name")
  .paginate(15, 1); // 15 per page, page 1
```
The paginate() method returns an object with:
```javascript
{
  data: [...],       // Array of records for this page
  total: 150,        // Total number of records
  per_page: 15,      // Items per page
  current_page: 1,   // Current page number
  last_page: 10,     // Last page number
  from: 1,           // First record number on this page
  to: 15             // Last record number on this page
}
```
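The metadata fields are related by simple arithmetic. As a minimal sketch (plain JavaScript, independent of ServiceSQL; the helper name `buildPageMeta` and the page clamping are illustrative, not part of the library), the fields can be derived from the total count and page settings like this:

```javascript
// Derive Laravel-style pagination metadata from a total record count,
// a page size, and a requested page number.
function buildPageMeta(total, perPage, currentPage) {
  const lastPage = Math.max(1, Math.ceil(total / perPage));
  // Clamp the requested page into the valid range (an assumption here,
  // not necessarily what ServiceSQL does internally).
  const page = Math.min(Math.max(1, currentPage), lastPage);
  const from = total === 0 ? 0 : (page - 1) * perPage + 1;
  const to = Math.min(page * perPage, total);
  return { total, per_page: perPage, current_page: page, last_page: lastPage, from, to };
}

// 150 records at 15 per page: page 1 spans records 1-15 of 10 pages.
const meta = buildPageMeta(150, 15, 1);
```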
You can pass pagination parameters as an object:
```javascript
const result = db.table("Users")
  .paginate({
    perPage: 20,
    page: 2
  });
```
Both formats are supported:

- `paginate(perPage, page)`
- `paginate({ perPage, page })`
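For intuition, here is one way such a dual-signature method might normalize its arguments internally. This is an illustrative sketch only (the helper name and the assumed defaults of 15 per page, page 1 are not taken from the ServiceSQL source):

```javascript
// Accept either (perPage, page) or ({ perPage, page }) and normalize
// both call styles to a single { perPage, page } shape.
function normalizePaginateArgs(perPageOrOptions, page) {
  if (typeof perPageOrOptions === "object" && perPageOrOptions !== null) {
    return {
      perPage: perPageOrOptions.perPage || 15, // assumed default page size
      page: perPageOrOptions.page || 1
    };
  }
  return { perPage: perPageOrOptions || 15, page: page || 1 };
}
```

Both `normalizePaginateArgs(20, 2)` and `normalizePaginateArgs({ perPage: 20, page: 2 })` yield the same shape.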
Manual Pagination with Limit and Offset
For more control, use limit() and offset() directly:
```javascript
const page = 2;
const perPage = 15;

const users = db.table("Users")
  .offset((page - 1) * perPage) // Skip the records from earlier pages
  .limit(perPage)               // Take 15 records
  .get();
```
paginate() is recommended over manual limit()/offset() because it automatically calculates page metadata.
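The correspondence between the two styles is plain arithmetic: page n at perPage records per page starts at offset (n - 1) * perPage. A tiny sketch (the helper name `pageToOffsetLimit` is illustrative, not a library API):

```javascript
// Convert a (page, perPage) pair into the equivalent offset/limit pair.
function pageToOffsetLimit(page, perPage) {
  return { offset: (page - 1) * perPage, limit: perPage };
}

// paginate(15, 2) corresponds to offset(15).limit(15)
const slice = pageToOffsetLimit(2, 15);
```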
Implementation Examples
Building a Paginated API
```javascript
function getUsersPage(page = 1) {
  const db = APPSQL.init({ spreadsheetId: "YOUR_ID" });

  const result = db.table("Users")
    .where("status", "active")
    .orderBy("created_at", "desc")
    .paginate(20, page);

  return {
    users: result.data,
    pagination: {
      current: result.current_page,
      total: result.last_page,
      from: result.from,
      to: result.to,
      totalRecords: result.total
    }
  };
}

// Usage
const page1 = getUsersPage(1);
Logger.log(`Showing ${page1.pagination.from}-${page1.pagination.to} of ${page1.pagination.totalRecords}`);
```
Paginated Search with Filters

Build the base query

Start with your filtering conditions:

```javascript
let query = db.table("Products")
  .where("status", "available");
```

Add dynamic filters

Apply filters based on user input:

```javascript
if (filters.category) {
  query = query.where("category", filters.category);
}
if (filters.minPrice) {
  query = query.where("price", ">=", filters.minPrice);
}
```

Apply pagination

Paginate the filtered results:

```javascript
const result = query
  .orderBy("created_at", "desc")
  .paginate(filters.perPage || 20, filters.page || 1);
```
Complete example:
```javascript
function searchProducts(filters = {}) {
  const db = APPSQL.init({ spreadsheetId: "YOUR_ID" });

  let query = db.table("Products")
    .where("status", "available");

  // Apply filters
  if (filters.category) {
    query = query.where("category", filters.category);
  }
  if (filters.minPrice) {
    query = query.where("price", ">=", filters.minPrice);
  }
  if (filters.maxPrice) {
    query = query.where("price", "<=", filters.maxPrice);
  }
  if (filters.search) {
    query = query.where("name", "like", `%${filters.search}%`);
  }

  // Paginate
  const result = query
    .orderBy(filters.sortBy || "created_at", filters.sortDir || "desc")
    .paginate(filters.perPage || 20, filters.page || 1);

  return result;
}
```
Processing Large Datasets in Chunks
When you need to process all records but avoid timeouts:
```javascript
function processAllUsers() {
  const db = APPSQL.init({ spreadsheetId: "YOUR_ID" });
  const perPage = 100;
  let currentPage = 1;
  let hasMore = true;

  while (hasMore) {
    const result = db.table("Users")
      .where("status", "active")
      .paginate(perPage, currentPage);

    // Process this batch
    result.data.forEach(user => {
      processUser(user);
    });

    Logger.log(`Processed page ${currentPage} of ${result.last_page}`);

    // Check if there are more pages
    hasMore = currentPage < result.last_page;
    currentPage++;

    // Optional: pause between batches to avoid rate limits
    if (hasMore) {
      Utilities.sleep(100);
    }
  }

  Logger.log("All users processed");
}

function processUser(user) {
  // Your processing logic here
  Logger.log(`Processing ${user.name}`);
}
```
When processing large datasets, add Utilities.sleep(100) between pages to avoid hitting rate limits.
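Beyond sleeping between batches, you can also stop cleanly before the 6-minute execution limit is hit. A hedged sketch of the time-budget check (the helper name `budgetExceeded` and the 30-second safety reserve are illustrative; in Apps Script you would pass `Date.now()` each iteration and persist `currentPage` so a follow-up trigger can resume):

```javascript
// Return true when elapsed run time leaves less than `reserveMs` of the
// Apps Script execution budget (6 minutes = 360000 ms by default).
function budgetExceeded(startMs, nowMs, budgetMs = 360000, reserveMs = 30000) {
  return nowMs - startMs >= budgetMs - reserveMs;
}
```

Inside the `while (hasMore)` loop you would check `budgetExceeded(startMs, Date.now())` and break out early, saving `currentPage` (for example with PropertiesService) so the next run picks up where this one stopped.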
Pagination works seamlessly with Models:
```javascript
class User extends Model {
  static _table = "Users";
  static _timestamps = true;

  static scopeActive(query) {
    return query.where("status", "active");
  }
}

User.use(db);

// Paginate with scopes
const result = User.query()
  .scope("active")
  .with("posts") // Eager load relationships
  .paginate(20, 1);

// Access model instances
result.data.each(user => {
  Logger.log(`${user.name} has ${user.posts.count()} posts`);
});
```
Combine pagination with eager loading to efficiently load related data:
```javascript
const result = Post.query()
  .with("author", "comments.author", "tags")
  .where("published", true)
  .orderBy("created_at", "desc")
  .paginate(10, page);

// All relationships are loaded efficiently
result.data.each(post => {
  Logger.log(`${post.title} by ${post.author.name}`);
  Logger.log(`${post.comments.count()} comments`);
  Logger.log(`Tags: ${post.tags.pluck("name").all().join(", ")}`);
});
```
Generate Page Numbers
```javascript
function generatePageNumbers(currentPage, lastPage, delta = 2) {
  const pages = [];
  const left = currentPage - delta;
  const right = currentPage + delta + 1;

  for (let i = 1; i <= lastPage; i++) {
    if (i === 1 || i === lastPage || (i >= left && i < right)) {
      pages.push(i);
    }
  }

  // Add ellipsis markers where pages were skipped
  const pagesWithDots = [];
  let prev = 0;
  for (const page of pages) {
    if (page - prev === 2) {
      // A single-page gap: show the page itself instead of dots
      pagesWithDots.push(prev + 1);
    } else if (page - prev !== 1) {
      pagesWithDots.push("...");
    }
    pagesWithDots.push(page);
    prev = page;
  }

  return pagesWithDots;
}

// Usage
const pagination = result;
const pageNumbers = generatePageNumbers(
  pagination.current_page,
  pagination.last_page
);
Logger.log(pageNumbers); // e.g. page 6 of 20: [1, "...", 4, 5, 6, 7, 8, "...", 20]
```
Create Navigation Links
```javascript
function createPaginationLinks(result) {
  const links = {
    first: 1,
    last: result.last_page,
    prev: result.current_page > 1 ? result.current_page - 1 : null,
    next: result.current_page < result.last_page ? result.current_page + 1 : null,
    current: result.current_page
  };
  return links;
}

// Usage
const links = createPaginationLinks(result);

if (links.prev) {
  Logger.log(`Previous: Page ${links.prev}`);
}
Logger.log(`Current: Page ${links.current}`);
if (links.next) {
  Logger.log(`Next: Page ${links.next}`);
}
```
Select Only Required Fields
```javascript
// ❌ Fetches all columns
const result = db.table("Users").paginate(20, 1);

// ✅ Only fetch the columns you need
const result = db.table("Users")
  .select("id", "name", "email", "avatar_url")
  .paginate(20, 1);
```
Add Appropriate Indexes
Ensure your sort columns are indexed (if supported by your driver):
```javascript
// Sort by an indexed column
const result = db.table("Users")
  .orderBy("id", "desc") // IDs are typically indexed
  .paginate(20, 1);
```
Cache Results

```javascript
const cache = CacheService.getScriptCache();

function getCachedPage(page) {
  const cacheKey = `users_page_${page}`;
  const cached = cache.get(cacheKey);
  if (cached) {
    return JSON.parse(cached);
  }

  const db = APPSQL.init({ spreadsheetId: "YOUR_ID" });
  const result = db.table("Users")
    .where("status", "active")
    .paginate(20, page);

  // Cache for 5 minutes
  cache.put(cacheKey, JSON.stringify(result), 300);
  return result;
}
```
Cache pagination results when data doesn’t change frequently to improve response times.
Best Practices
Use Appropriate Page Sizes
- Small datasets (< 100 rows): `.paginate(20, page)` (20 per page)
- Medium datasets (100-1000 rows): `.paginate(50, page)` (50 per page)
- Large datasets (> 1000 rows): `.paginate(100, page)` (100 per page)
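These rule-of-thumb thresholds can be captured in a small helper; the function name `suggestPageSize` and the exact cut-offs are illustrative, taken from the guidance above rather than from any ServiceSQL API:

```javascript
// Pick a page size from a table's row count, following the
// small/medium/large thresholds above.
function suggestPageSize(rowCount) {
  if (rowCount < 100) return 20;   // small dataset
  if (rowCount <= 1000) return 50; // medium dataset
  return 100;                      // large dataset
}
```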
Always Order Results
```javascript
// ✅ Predictable pagination
const result = db.table("Users")
  .orderBy("id", "asc")
  .paginate(20, page);

// ❌ Unpredictable results across pages
const result = db.table("Users")
  .paginate(20, page);
```
Validate Page Numbers
```javascript
function getUsers(requestedPage) {
  const page = Math.max(1, parseInt(requestedPage, 10) || 1);

  const result = db.table("Users")
    .orderBy("created_at", "desc")
    .paginate(20, page);

  // Fall back to the last page if the requested page is too high
  if (page > result.last_page && result.last_page > 0) {
    return db.table("Users")
      .orderBy("created_at", "desc")
      .paginate(20, result.last_page);
  }

  return result;
}
```
Summary

ServiceSQL provides powerful pagination capabilities:

- Simple API: `paginate(perPage, page)` returns data and metadata
- Laravel-style: a familiar structure for developers coming from Laravel/PHP
- Flexible: works with filters, sorting, and eager loading
- Efficient: prevents timeouts and memory issues on large datasets
For related topics, see: