Overview
STGNN supports both static and dynamic functional connectivity (FC) graphs. Dynamic FC captures time-varying brain connectivity patterns within a single scan, providing richer temporal information than static FC, which averages connectivity across the entire scan.
Static vs Dynamic FC
Static FC (Conventional Approach)
- Computation: Single correlation matrix averaged over entire fMRI scan
- Result: One graph per subject visit
- Edge weights: Pearson correlation between ROI time series
- File format: `fc_matrix` variable in `.npz` files
- Training script: `main.py`
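The static FC computation above reduces to a single Pearson correlation matrix over the whole scan. A minimal numpy sketch (the function name is illustrative; the repo stores the result in the `fc_matrix` variable):

```python
import numpy as np

def static_fc(roi_timeseries: np.ndarray) -> np.ndarray:
    """Static FC: Pearson correlation between ROI time series.

    roi_timeseries: array of shape (n_rois, n_timepoints).
    Returns an (n_rois, n_rois) symmetric correlation matrix.
    """
    return np.corrcoef(roi_timeseries)

# Toy example: 4 ROIs, 120 timepoints of synthetic signal.
rng = np.random.default_rng(0)
ts = rng.standard_normal((4, 120))
fc = static_fc(ts)
print(fc.shape)  # (4, 4) -- one graph per subject visit
```

The diagonal is each ROI's self-correlation (1.0), and off-diagonal entries become edge weights in the subject's graph.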
Dynamic FC (Advanced Approach)
- Computation: Sliding window approach creating multiple connectivity snapshots
- Result: Sequence of graphs per subject visit (time-varying connectivity)
- Edge weights: Time-windowed correlations capturing connectivity dynamics
- File format: `dynamic_fc` variable in `.npz` files
- Training script: `dfc_main.py`
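The sliding-window approach replaces the single scan-wide correlation with one correlation matrix per window. A sketch of the idea (the window size and stride here are illustrative, not the repo's actual preprocessing parameters):

```python
import numpy as np

def sliding_window_fc(roi_timeseries, window_size=30, stride=10):
    """Dynamic FC: one correlation matrix per sliding window.

    roi_timeseries: (n_rois, n_timepoints).
    Returns (n_windows, n_rois, n_rois).
    """
    n_rois, n_t = roi_timeseries.shape
    windows = []
    for start in range(0, n_t - window_size + 1, stride):
        segment = roi_timeseries[:, start:start + window_size]
        windows.append(np.corrcoef(segment))  # FC for this window only
    return np.stack(windows)

rng = np.random.default_rng(0)
ts = rng.standard_normal((4, 120))
dfc = sliding_window_fc(ts, window_size=30, stride=10)
print(dfc.shape)  # (10, 4, 4): 10 windows, each a 4x4 correlation matrix
```

Each of the window matrices becomes one graph, so a single visit yields a sequence of graphs rather than one.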
Comparison
| Aspect | Static FC | Dynamic FC |
|---|---|---|
| Graphs per visit | 1 | Multiple (sliding windows) |
| Temporal resolution | Averaged | Window-level |
| Information captured | Average connectivity | Time-varying patterns |
| Data size | Smaller | Larger (multiple graphs/visit) |
| Computational cost | Lower | Higher |
| Model complexity | Simpler | More complex (needs temporal aggregation) |
| Biological realism | Less | More (captures dynamics) |
Dynamic FC Model Architecture
The `DynamicGraphNeuralNetwork` in `dfc_model.py` is specifically designed to handle DFC data.
Model Configuration
See `dfc_main.py:254-265` for initialization.
Architecture Components
1. Input Projection
2. GNN Layers
Supports three GNN architectures (GCN, GAT, GraphSAGE). Each layer includes:
- GraphNorm for stable training
- Activation function (ReLU/ELU/LeakyReLU/GELU)
- Dropout for regularization

See `dfc_model.py:49-63`.
3. TopK Pooling (Optional)
- Reduces computational cost
- Focuses on most relevant brain regions
- Applied after each GNN layer
See `dfc_model.py:68-75`.
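The idea behind TopK pooling (as in PyTorch Geometric's `TopKPooling`) is to score every node with a learned projection and keep only the top fraction. A numpy sketch of the mechanism, with a random scoring vector standing in for the learned one:

```python
import numpy as np

def topk_pool(x, score_weights, ratio=0.3):
    """Keep the top `ratio` fraction of nodes, ranked by a projection score.

    x: (n_nodes, n_features) node embeddings.
    score_weights: (n_features,) scoring vector (learned in the real model;
    random here for illustration).
    """
    scores = x @ score_weights                 # one scalar score per node
    k = max(1, int(round(ratio * x.shape[0])))
    keep = np.argsort(scores)[::-1][:k]        # indices of top-k nodes
    # Gate the kept features by a squashed score, similar to TopKPooling.
    return x[keep] * np.tanh(scores[keep])[:, None], keep

rng = np.random.default_rng(0)
x = rng.standard_normal((100, 16))             # e.g. 100 ROIs
pooled, kept = topk_pool(x, rng.standard_normal(16), ratio=0.3)
print(pooled.shape)  # (30, 16): only the 30 highest-scoring regions remain
```

With `ratio=0.3` only 30% of brain regions survive each pooling step, which is where the computational savings come from.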
4. Global Pooling
See `dfc_model.py:117-119`.
Temporal Aggregation Strategies
DFC creates multiple graphs per visit. The model aggregates them using one of three strategies:
1. Mean Aggregation (Default)
- Average embeddings across all DFC windows
- Most stable
- Good for capturing overall connectivity patterns
2. Max Aggregation
- Take maximum activation across windows
- Emphasizes strongest connectivity patterns
- More sensitive to outliers
3. GRU Aggregation
- Sequential processing of DFC windows
- Captures temporal ordering within visit
- Most expressive but also most complex
See `dfc_model.py:137-159`.
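The three strategies above can be sketched as follows; this is a simplified standalone version (function and variable names are illustrative, not the repo's API), operating on a `(n_windows, hidden_dim)` tensor of per-window graph embeddings:

```python
import torch
import torch.nn as nn

def aggregate_windows(window_embs, mode="mean", gru=None):
    """Combine per-window graph embeddings into one visit embedding.

    window_embs: (n_windows, hidden_dim) tensor, one row per DFC window.
    """
    if mode == "mean":
        return window_embs.mean(dim=0)          # average over windows
    if mode == "max":
        return window_embs.max(dim=0).values    # strongest activation wins
    if mode == "gru":
        # Process windows in temporal order; keep the final hidden state.
        _, h_n = gru(window_embs.unsqueeze(0))  # add batch dim
        return h_n.squeeze(0).squeeze(0)
    raise ValueError(f"unknown mode: {mode}")

hidden = 256
embs = torch.randn(8, hidden)                   # 8 DFC windows in one visit
gru = nn.GRU(hidden, hidden, batch_first=True)
for mode in ("mean", "max", "gru"):
    out = aggregate_windows(embs, mode, gru)
    print(mode, out.shape)                      # each yields torch.Size([256])
```

Mean and max are parameter-free; only the GRU variant adds trainable weights and sensitivity to window ordering, which is why it is the most expressive and the most complex.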
Configure via command line:
See `dfc_main.py:54-55` for the argument definition.
Training with Dynamic FC
Basic Training
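A minimal run with the documented defaults might look like the following; any data-path or label arguments your setup requires are not shown here and should be taken from `dfc_main.py:25-55`:

```shell
# Defaults: GraphSAGE layers, mean aggregation, hidden dim 256, 2 layers.
python dfc_main.py
```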
With Advanced Features
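A hypothetical invocation combining the flags documented below; verify the exact names against `dfc_main.py:25-55`:

```shell
python dfc_main.py \
  --temporal_aggregation gru \
  --layer_type GAT \
  --gnn_hidden_dim 256 \
  --gnn_num_layers 3 \
  --gnn_activation gelu \
  --use_topk_pooling True \
  --topk_ratio 0.3
```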
Key Arguments
- `--temporal_aggregation`: How to combine DFC windows: `mean`, `max`, or `gru` (default: `mean`)
- `--layer_type`: GNN architecture: `GCN`, `GAT`, or `GraphSAGE` (default: `GraphSAGE`)
- `--gnn_hidden_dim`: Hidden dimension size (default: 256)
- `--gnn_num_layers`: Number of GNN layers, 2-5 (default: 2)
- `--gnn_activation`: Activation function: `relu`, `leaky_relu`, `elu`, `gelu` (default: `elu`)
- `--use_topk_pooling`: Enable hierarchical pooling (default: True)
- `--topk_ratio`: Fraction of nodes to keep (default: 0.3)
See `dfc_main.py:25-55` for all arguments.
DFC Dataset Structure
The `DFC_ADNIDataset` expects `.npz` files where:
- Each window creates a separate graph
- Subject ID tracks all graphs from same visit
- Temporal aggregation combines windows
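A sketch of writing and reading such a file: `dynamic_fc` is the variable name used by this repo, while the `subject_id` and `label` keys are illustrative assumptions about the accompanying metadata:

```python
import numpy as np

# Toy DFC file: 10 windows x 4 ROIs x 4 ROIs for one visit.
rng = np.random.default_rng(0)
np.savez(
    "visit_0001.npz",
    dynamic_fc=rng.standard_normal((10, 4, 4)),  # one matrix per window
    subject_id="sub-0001",                       # illustrative key
    label=1,                                     # illustrative key
)

data = np.load("visit_0001.npz")
print(data["dynamic_fc"].shape)  # (10, 4, 4): each window becomes one graph
```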
Subject Graph Mapping
See `dfc_main.py:195-209`.
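The mapping boils down to grouping graph indices by (subject, visit) so temporal aggregation sees every DFC window from the same visit together. An illustrative sketch (the data structure here is an assumption, not the repo's exact one):

```python
from collections import defaultdict

# One (subject, visit) key per graph, in dataset order.
graph_meta = [
    ("sub-0001", 0), ("sub-0001", 0), ("sub-0001", 0),  # 3 windows, visit 0
    ("sub-0001", 1), ("sub-0001", 1),                    # 2 windows, visit 1
    ("sub-0002", 0),                                     # 1 window
]

visit_to_graphs = defaultdict(list)
for graph_idx, key in enumerate(graph_meta):
    visit_to_graphs[key].append(graph_idx)

print(visit_to_graphs[("sub-0001", 0)])  # [0, 1, 2]
print(visit_to_graphs[("sub-0001", 1)])  # [3, 4]
```

Aggregation then runs once per key, never mixing windows across visits or subjects.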
Forward Pass Methods
The `DynamicGraphNeuralNetwork` provides two forward methods:
1. Single Graph Forward
Used by `TemporalDataLoader` for feature extraction. See `dfc_model.py:89-121`.
2. Sequence Forward
See `dfc_model.py:123-163`.
Data Preprocessing
Handling Infinite Values
See `dfc_main.py:192`.
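Correlations of constant (zero-variance) time series produce NaN or infinite entries, which would break the graph construction. A common one-line fix, sketched here with numpy (the replacement value of 0.0 is a typical choice, not necessarily the repo's):

```python
import numpy as np

# A correlation matrix contaminated by a zero-variance ROI.
fc = np.array([[1.0, np.inf],
               [np.nan, 1.0]])

# Replace non-finite entries so every edge weight is a real number.
clean = np.nan_to_num(fc, nan=0.0, posinf=0.0, neginf=0.0)
print(clean)  # [[1. 0.]
              #  [0. 1.]]
```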
Visit Trimming
Limit the maximum number of visits per subject to prevent memory issues.
Training Configuration
Optimizer
See `dfc_main.py:384-387`.
Scheduler
See `dfc_main.py:388`.
Loss Function
See `dfc_main.py:389` and the arguments at `dfc_main.py:29-31`.
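The exact choices live in the `dfc_main.py` lines referenced above; as an assumption, a typical setup for this kind of task is Adam, a plateau-based scheduler, and a class-weighted cross-entropy loss (useful when converters are rare). A sketch:

```python
import torch
import torch.nn as nn

# Stand-in for the GNN: hidden dim 256 -> 3 diagnostic classes (assumed).
model = nn.Linear(256, 3)

# Typical configuration (assumption, not the repo's verified settings).
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode="max")
# Class weights counteract label imbalance between stable and converting cases.
criterion = nn.CrossEntropyLoss(weight=torch.tensor([1.0, 2.0, 2.0]))

logits = model(torch.randn(8, 256))            # batch of 8 visit embeddings
loss = criterion(logits, torch.randint(0, 3, (8,)))
loss.backward()                                # gradients flow to the model
```

With `mode="max"` the scheduler would be stepped on a metric to maximize (e.g. validation balanced accuracy) after each epoch.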
Evaluation
Evaluation is identical to static FC:
- Cross-validation at the subject level
- Metrics: accuracy, balanced accuracy, F1, AUC
- Optional horizon-based analysis with time features
- Conversion-specific accuracy tracking
See `dfc_main.py:399-460` for the evaluation function.
When to Use DFC vs Static FC
Use Static FC when:
- Limited computational resources
- Simpler, more interpretable models needed
- Baseline comparison required
- Data preprocessing is simpler
Use Dynamic FC when:
- Maximum predictive performance is critical
- Capturing brain dynamics is important
- Sufficient computational resources available
- Research question involves connectivity changes
Best Practices
- Start with static FC to establish baseline performance
- Use mean aggregation for DFC - it’s most stable
- Enable TopK pooling to reduce computational cost
- Use GraphSAGE as the GNN layer - works well for brain graphs
- Set gnn_num_layers=2 - deeper models may overfit
- Monitor GPU memory - DFC uses more memory than static FC
- Apply same temporal modeling (LSTM/GRU) as static FC for fair comparison
Implementation Files
- `dfc_main.py`: Main training script for dynamic FC
- `dfc_model.py`: `DynamicGraphNeuralNetwork` architecture
- `DFC_ADNIDataset.py`: Dataset loader for DFC data
- `TemporalDataLoader.py`: Batch creation (works with both static and DFC)
- `model.py`: `GraphNeuralNetwork` for static FC (for comparison)