Overview
The Bulk & Block Deals API retrieves institutional trading data for the last 30 days. It works around NSE's 10-day query limit by splitting the request into three 10-day chunks, then merging and deduplicating the results.
Source File: fetch_bulk_block_deals.py
Endpoint Details
POST https://ow-static-scanx.dhan.co/staticscanx/deal
Request Headers:
{
"Content-Type": "application/json",
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
"Accept": "application/json, text/plain, */*"
}
Request Payload
- startdate: Start date in DD-MM-YYYY format
- enddate: End date in DD-MM-YYYY format (at most 10 days after startdate)
- defaultpage: Use "N" for custom pagination
- pageno: Page number (starts at 1)
- pagecount: Number of deals per page
Example Payload
{
"data": {
"startdate": "01-01-2024",
"enddate": "10-01-2024",
"defaultpage": "N",
"pageno": 1,
"pagecount": 50
}
}
Example Request
curl -X POST https://ow-static-scanx.dhan.co/staticscanx/deal \
-H "Content-Type: application/json" \
-H "User-Agent: Mozilla/5.0" \
-d '{
"data": {
"startdate": "01-01-2024",
"enddate": "10-01-2024",
"defaultpage": "N",
"pageno": 1,
"pagecount": 50
}
}'
Response Structure
- data: List of deal objects
- totalcount: Total number of deals matching criteria (used for pagination)
Deal Object Fields
- sym: Stock symbol
- date: Deal date in DD-MMM-YYYY format
- qty: Quantity of shares traded
- avgprice: Average deal price
- bs: Buy/Sell indicator ("B" or "S")
- cname: Client (institution) name
- dealtype: "Bulk" or "Block"
Example Response
{
"data": [
{
"sym": "RELIANCE",
"date": "15-Jan-2024",
"qty": 500000,
"avgprice": 2598.50,
"bs": "B",
"cname": "ABC Mutual Fund",
"dealtype": "Bulk"
},
{
"sym": "TCS",
"date": "15-Jan-2024",
"qty": 250000,
"avgprice": 3750.25,
"bs": "S",
"cname": "XYZ Insurance Company",
"dealtype": "Block"
}
],
"totalcount": 150
}
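The totalcount field drives pagination: with a pagecount of 50, the number of pages to fetch is a ceiling division. A minimal sketch (the helper name total_pages is illustrative, not part of the source script):

```python
import math

def total_pages(totalcount: int, pagecount: int = 50) -> int:
    """Number of pages needed to fetch all matching deals."""
    return math.ceil(totalcount / pagecount)

print(total_pages(150))  # the example response's 150 deals -> 3 pages
```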
Implementation Details
API Limitation
Constraint: Date range must not exceed 240 hours (10 days)
Solution: Split 30-day fetch into three 10-day chunks
Chunking Strategy
import requests
import json
import math
from datetime import datetime, timedelta

url = "https://ow-static-scanx.dhan.co/staticscanx/deal"
headers = {
    "Content-Type": "application/json",
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Accept": "application/json, text/plain, */*"
}

end_date_ref = datetime.now()
all_raw_deals = []

# Loop 3 times for 3 chunks: days 0-9, 10-19, and 20-29 ago
for i in range(3):
    days_offset_end = i * 10
    days_offset_start = days_offset_end + 9
    chunk_end_date = end_date_ref - timedelta(days=days_offset_end)
    chunk_start_date = end_date_ref - timedelta(days=days_offset_start)
    start_str = chunk_start_date.strftime("%d-%m-%Y")
    end_str = chunk_end_date.strftime("%d-%m-%Y")
    print(f"Fetching deals for chunk {i+1}/3: {start_str} to {end_str}...")

    # Paginate through the chunk until all pages are fetched
    page_no = 1
    max_pages = 1
    while page_no <= max_pages:
        payload = {
            "data": {
                "startdate": start_str,
                "enddate": end_str,
                "defaultpage": "N",
                "pageno": page_no,
                "pagecount": 50
            }
        }
        response = requests.post(url, json=payload, headers=headers, timeout=10)
        if response.status_code != 200:
            break
        data = response.json()
        deals = data.get('data', [])
        if not deals:
            break
        all_raw_deals.extend(deals)

        # Recompute total pages from the reported match count
        total_count = data.get('totalcount', 0)
        if total_count > 0:
            max_pages = math.ceil(total_count / 50)
        print(f"  Fetched page {page_no}/{max_pages} ({len(deals)} items)")
        page_no += 1
Deduplication Logic
# Deduplicate based on composite key
unique_deals_map = {}
for d in all_raw_deals:
    # Unique key: symbol + date + qty + price + bs + client
    key = f"{d.get('sym')}_{d.get('date')}_{d.get('qty')}_{d.get('avgprice')}_{d.get('bs')}_{d.get('cname')}"
    unique_deals_map[key] = d

# Sort by date descending (latest first); parse the DD-MMM-YYYY string
# so the sort is chronological rather than lexicographic
def deal_date(d):
    try:
        return datetime.strptime(d.get('date', ''), "%d-%b-%Y")
    except ValueError:
        return datetime.min

sorted_deals = sorted(unique_deals_map.values(), key=deal_date, reverse=True)

# Save
with open("bulk_block_deals.json", "w") as f:
    json.dump(sorted_deals, f, indent=4)
Output Structure
bulk_block_deals.json
[
{
"sym": "RELIANCE",
"date": "15-Jan-2024",
"qty": 500000,
"avgprice": 2598.50,
"bs": "B",
"cname": "ABC Mutual Fund",
"dealtype": "Bulk"
},
{
"sym": "TCS",
"date": "14-Jan-2024",
"qty": 250000,
"avgprice": 3750.25,
"bs": "S",
"cname": "XYZ Insurance Company",
"dealtype": "Block"
}
]
Deal Types
Bulk Deals
- Single transaction ≥0.5% of the company's listed equity shares
- Disclosed to the exchange with the client name
- Public: visible to all market participants
Block Deals
- Minimum 500,000 shares or ₹10 crore in value
- Executed in a special 35-minute window (9:15-9:50 AM or 2:05-2:40 PM)
- Price within ±1% of the previous close or current market price
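As a rough illustration of the size thresholds above, a helper like the following could flag deals that meet the block-deal criteria. The function name and constants are illustrative (₹10 crore = ₹100,000,000); the real classification comes from the dealtype field in the API response:

```python
BLOCK_MIN_QTY = 500_000              # minimum shares for a block deal
BLOCK_MIN_VALUE = 10 * 1_00_00_000   # ₹10 crore, in rupees

def meets_block_criteria(qty: int, avgprice: float) -> bool:
    """True if a deal's size meets either block-deal threshold."""
    return qty >= BLOCK_MIN_QTY or qty * avgprice >= BLOCK_MIN_VALUE

print(meets_block_criteria(500000, 100.0))  # True: quantity threshold met
print(meets_block_criteria(40000, 2600.0))  # True: 40,000 x 2,600 = ₹10.4 crore
print(meets_block_criteria(10000, 500.0))   # False: ₹50 lakh, small quantity
```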
- Date Range: Last 30 days
- Chunks: 3 (10 days each)
- Page Size: 50 deals per page
- Total Deals: 200-800 (varies by market activity)
- Fetch Time: 15-30 seconds (including pagination)
- Success Rate: >99%
Use Cases
- Institutional Tracking: Monitor FII/DII/MF activity
- Smart Money: Follow institutional buying/selling patterns
- Accumulation Detection: Identify consistent institutional buying
- Distribution Alerts: Detect institutional selling pressure
- Price Impact: Analyze deal prices vs market prices
- Volume Analysis: Correlate deals with volume spikes
Analysis Examples
Top Institutional Buyers
from collections import defaultdict

buyers = defaultdict(list)
for deal in all_deals:
    if deal["bs"] == "B":
        buyers[deal["cname"]].append(deal)

# Rank buyers by total traded value (qty × avgprice)
buyer_stats = {
    name: sum(d["qty"] * d["avgprice"] for d in deals)
    for name, deals in buyers.items()
}
top_buyers = sorted(buyer_stats.items(), key=lambda x: x[1], reverse=True)[:10]
Most Active Stocks
from collections import Counter
symbol_counts = Counter(deal["sym"] for deal in all_deals)
most_active = symbol_counts.most_common(10)
Buy vs Sell Ratio
buy_count = sum(1 for d in all_deals if d["bs"] == "B")
sell_count = sum(1 for d in all_deals if d["bs"] == "S")
buy_value = sum(d["qty"] * d["avgprice"] for d in all_deals if d["bs"] == "B")
sell_value = sum(d["qty"] * d["avgprice"] for d in all_deals if d["bs"] == "S")
print(f"Buy/Sell Count Ratio: {buy_count}/{sell_count}")
print(f"Buy/Sell Value Ratio: {buy_value/1e9:.2f}B / {sell_value/1e9:.2f}B")
Accumulation Detector
# Stocks with multiple institutional buys
from collections import defaultdict

stock_buys = defaultdict(list)
for deal in all_deals:
    if deal["bs"] == "B":
        stock_buys[deal["sym"]].append(deal)

# Stocks with 3+ institutional buys in the window
accumulation = {
    sym: deals for sym, deals in stock_buys.items()
    if len(deals) >= 3
}
Error Handling
try:
    response = requests.post(url, json=payload, headers=headers, timeout=10)
    if response.status_code == 200:
        data = response.json()
        deals = data.get('data', [])
        if not deals:
            print(f"  No deals found on page {page_no}.")
            break
        all_raw_deals.extend(deals)
    else:
        print(f"  Error fetching page {page_no}: Status {response.status_code}")
        break
except Exception as e:
    print(f"  Exception fetching page {page_no}: {e}")
    break
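The fragment above abandons a page on the first failure. A hedged extension, not part of the source script, would retry transient failures with exponential backoff before giving up:

```python
import time
import requests

def post_with_retries(url, payload, headers, retries=3, timeout=10):
    """POST with exponential backoff; returns the response or None."""
    for attempt in range(retries):
        try:
            resp = requests.post(url, json=payload, headers=headers, timeout=timeout)
            if resp.status_code == 200:
                return resp
        except requests.RequestException:
            pass  # network error or timeout: retry
        time.sleep(2 ** attempt)  # back off 1s, 2s, 4s between attempts
    return None
```

The caller can then treat a None return the same way the original code treats a non-200 status: log it and break out of the pagination loop.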
Date Range Limitation
# API error returned if date range > 10 days
{
    "error": "start date and end date difference is more than 240 hours"
}

# Solution: split into chunks
for i in range(3):  # 3 chunks cover 30 days
    # Each chunk spans a 10-day range
    pass
Client Name Patterns
Mutual Funds
- “ABC Mutual Fund”
- “XYZ Asset Management Company”
Insurance Companies
- “Life Insurance Corporation of India”
- “HDFC Life Insurance Company”
Foreign Institutional Investors (FII)
- “Morgan Stanley Asia”
- “Goldman Sachs (Singapore)”
Domestic Institutional Investors (DII)
- “SBI Mutual Fund”
- “ICICI Prudential Life Insurance”
Prop Desks
- “ABC Securities Ltd”
- “XYZ Capital”
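The patterns above suggest a simple keyword-based classifier. This is a heuristic sketch only; the keyword lists are illustrative, not exhaustive, and real client names vary in format (for example, life insurers here are grouped under "Insurance" even when the document lists them under DII):

```python
def classify_client(cname: str) -> str:
    """Rough institutional category inferred from keywords in the client name."""
    name = cname.lower()
    if "mutual fund" in name or "asset management" in name:
        return "Mutual Fund"
    if "insurance" in name:
        return "Insurance"
    if "securities" in name or "capital" in name:
        return "Prop Desk"
    return "Other/FII"

print(classify_client("SBI Mutual Fund"))              # Mutual Fund
print(classify_client("HDFC Life Insurance Company"))  # Insurance
print(classify_client("Morgan Stanley Asia"))          # Other/FII
```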
Notes
- Deals are reported T+1 (next trading day)
- Same entity can appear in multiple deals for same stock
- Block deal prices may differ significantly from market prices
- Bulk deals happen during regular market hours
- Block deals happen in special windows (morning/afternoon)
- Deduplication prevents double-counting from overlapping chunks
- Client names are as reported to exchange (may vary in format)
- Deal values calculated as: qty × avgprice
- Run daily for up-to-date institutional activity tracking