Product upload sends every product from your local SQL Server database to Digible in one bulk operation (processed internally in batches). This is useful for initial setup or a full data synchronization.

How it works

The sendBatchContent() function (helper.py:140) performs a complete upload of your entire product catalog:
  1. Validate authentication: the function first checks whether your access token has expired using is_token_expired() (helper.py:143-144).
  2. Fetch all products: shopMaster queries your local table and fetches all products with SELECT * FROM {local_table} (helper.py:160-165).
  3. Process in batches: products are sent in batches of 200 to avoid overloading the API (helper.py:141, 175-176).
  4. Transform data: each product is converted to Digible’s expected format with productName, productPrice, and productId fields (helper.py:177-183).
  5. Send to Digible: each batch is posted to https://api.digible.one/v1/business/stores/product/sync with your access token (helper.py:146, 186).
Product upload is different from sync. While sync only sends changes from the ChangeLog table, product upload sends your entire product catalog regardless of recent changes.
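The steps above can be sketched as a single function. This is a simplified illustration, not the actual helper.py code: the function name mirrors sendBatchContent(), but the parameters (including the injected `post` callable and `token_expired` flag) are assumptions made so the sketch is self-contained.

```python
def send_batch_content(products, access_token, post, token_expired=False,
                       batch_size=200):
    """Illustrative upload flow; `post` is an injected HTTP callable
    (e.g. a thin wrapper around requests.post) returning a status code."""
    url = "https://api.digible.one/v1/business/stores/product/sync"

    # Step 1: validate authentication before doing any work.
    if token_expired:
        return {"message": "User session expired, Please Reconfigure"}

    headers = {"x-access-token": access_token,
               "Content-Type": "application/json"}

    # Step 3: walk the catalog in slices of batch_size.
    for i in range(0, len(products), batch_size):
        batch = products[i:i + batch_size]

        # Step 4: map local columns to Digible's field names.
        payload = [{"productName": p["ProductName"],
                    "productPrice": str(p["SellPrice"]),
                    "productId": p["ProductID"]} for p in batch]

        # Step 5: post the batch; stop at the first failure.
        status = post(url, json=payload, headers=headers)
        if status != 200:
            return {"message": f"Batch starting at index {i} failed ({status})"}

    return {"message": "Upload complete"}
```

Injecting the HTTP call keeps the sketch testable without network access; the real implementation calls the API directly.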

Initiating product upload

To upload all products:
  1. Open shopMaster
  2. Navigate to the Settings page
  3. Scroll down and click the Upload Products button (main.py:243-244)
  4. Wait for the confirmation message

Batch processing

Products are uploaded in batches to ensure reliable transmission:
batch_size = 200

for i in range(0, len(products), batch_size):
    batch_products = products[i:i + batch_size]
    # Transform and send batch
The batch size of 200 products is optimized for Digible’s API. Processing happens sequentially to ensure data integrity.
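As a quick illustration of the slicing above, a catalog of 450 products (an arbitrary example count) yields two full batches and one partial final batch:

```python
products = list(range(450))  # stand-in for 450 product rows
batch_size = 200

# range(0, 450, 200) yields start indices 0, 200, 400.
batches = [products[i:i + batch_size]
           for i in range(0, len(products), batch_size)]

print([len(b) for b in batches])  # → [200, 200, 50]
```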

Data transformation

Each product from your local database is transformed before upload:
| Local Field | Digible Field | Transformation      |
| ----------- | ------------- | ------------------- |
| ProductName | productName   | Direct mapping      |
| SellPrice   | productPrice  | Converted to string |
| ProductID   | productId     | Direct mapping      |
The transformation happens in helper.py:177-183:
product_list = [
    {
        "productName": product['ProductName'],
        "productPrice": str(product['SellPrice']),
        "productId": product['ProductID']
    } for product in batch_products
]
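As an illustration, running that comprehension over one hypothetical row (the values below are invented for the example) shows the renaming and the price-to-string conversion:

```python
# A made-up local row using the column names from the table above.
batch_products = [{"ProductName": "Blue Mug", "SellPrice": 12.5, "ProductID": 101}]

product_list = [
    {
        "productName": product["ProductName"],
        "productPrice": str(product["SellPrice"]),  # numeric price becomes a string
        "productId": product["ProductID"],
    } for product in batch_products
]

print(product_list)
# → [{'productName': 'Blue Mug', 'productPrice': '12.5', 'productId': 101}]
```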

Response handling

After each batch is sent, shopMaster:
  1. Checks the HTTP status code (helper.py:189)
  2. Logs the response content (helper.py:190)
  3. Returns an error if status is not 200 (helper.py:191-193)
  4. Parses the JSON response for each batch (helper.py:195-200)
If any batch fails, the upload process returns immediately with an error message. You may need to retry the upload to ensure all products are synchronized.
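The per-batch response handling can be sketched as follows. The function name is hypothetical (the real logic is inline in helper.py:189-200), but it reproduces the status check, the error return, and the JSON-parse fallback described above:

```python
import json

def handle_batch_response(status_code, body):
    """Return an error dict on failure, else the parsed JSON payload."""
    # Non-200 status: stop the upload with an error (cf. helper.py:191-193).
    if status_code != 200:
        return {"message": f"Upload failed with status {status_code}"}
    try:
        # Parse the batch's JSON response (cf. helper.py:195-200).
        return json.loads(body)
    except ValueError:  # json.JSONDecodeError is a subclass of ValueError
        return {"message": "Invalid JSON response from the server"}
```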

When to use product upload

Use product upload in these scenarios:
  • Initial setup: When first connecting your database to Digible
  • Full resync: If you suspect data discrepancies between local and cloud
  • After major changes: When you’ve made bulk updates to your product catalog
  • Recovery: After resolving sync issues or reconfiguring your connection
At a glance, product upload:
  • Sends all products
  • Ignores the ChangeLog table
  • Processes in 200-product batches
  • Is a manual operation only
  • Is used for full synchronization

Error scenarios

Expired session

If is_token_expired() returns True, the function returns immediately (helper.py:143-144):
if is_token_expired():
    return True
You’ll see a message: “User session expired, Please Reconfigure”
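The expiry check itself is not shown on this page. A typical implementation (hypothetical; the real is_token_expired() in helper.py may work differently) compares a stored Unix expiry timestamp against the current time:

```python
import time

def is_token_expired(expires_at, now=None):
    """True if the stored expiry timestamp (Unix seconds) has passed."""
    now = time.time() if now is None else now
    return now >= expires_at
```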

Invalid JSON response

If the server returns non-JSON data (helper.py:198-200):
except ValueError:
    print("Response is not valid JSON")
    return {"message": "Invalid JSON response from the server"}

Connection errors

General exceptions are caught and returned as error messages (helper.py:204-206).
The upload function is called by the upload_data() wrapper in main.py:230, which displays the result in a message box.

API endpoint

Product uploads are sent to:
POST https://api.digible.one/v1/business/stores/product/sync
With headers:
  • x-access-token: Your Digible access token
  • Content-Type: application/json
Refer to helper.py:140 for the complete sendBatchContent() implementation.
