Overview

Multi-Cloud Manager provides unified log management across Azure and GCP. Query logs using native query languages (KQL for Azure, LQL for GCP), export to CSV, and analyze log data to troubleshoot issues and gain insights.

Azure Log Queries (KQL)

Kusto Query Language (KQL)

Azure Log Analytics uses KQL to query log data. KQL is a powerful language optimized for log analytics.

VM Heartbeat Logs

Query VM heartbeat data to verify connectivity and health:
# From log_analytics.py:335-340
Heartbeat
| where Computer == 'vm-name'
| top 500 by TimeGenerated desc
| project TimeGenerated, Computer, Category, OSType, OSName, Version, ResourceId
Fields returned:
  • TimeGenerated: Timestamp of the heartbeat
  • Computer: VM hostname
  • Category: Log category
  • OSType: Operating system (Windows/Linux)
  • OSName: OS name and version
  • Version: Agent version
  • ResourceId: Full Azure resource ID
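The heartbeat query above only varies by VM name and row limit, so it can be composed programmatically before being sent to `query_workspace`. A minimal sketch (the `build_heartbeat_query` helper is illustrative, not part of the codebase):

```python
def build_heartbeat_query(vm_name: str, limit: int = 500) -> str:
    """Compose the heartbeat KQL shown above for a given VM name."""
    return (
        f"Heartbeat\n"
        f"| where Computer == '{vm_name}'\n"
        f"| top {limit} by TimeGenerated desc\n"
        f"| project TimeGenerated, Computer, Category, OSType, OSName, Version, ResourceId"
    )
```

The resulting string can be passed as the `query` argument to `LogsQueryClient.query_workspace`, as shown in the export section below.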

VM Performance Logs

Query performance counter data:
# From log_analytics.py:324-333
Perf
| where Computer == 'vm-name'
| where CounterName == '% Processor Time' or CounterName == 'Available MBytes Memory'
| where InstanceName == '_Total' or InstanceName == 'Memory' or InstanceName == 'total'
| summarize AverageValue = avg(CounterValue) by TimeGenerated, CounterName
| order by TimeGenerated desc
| top 500 by TimeGenerated
Key Counters:
  • % Processor Time: CPU utilization percentage
  • Available MBytes Memory: Available physical memory

Custom VM Queries

Execute custom KQL queries with validation:
# From log_analytics.py:384-401
POST /api/azure/vm/{vm_name}/logs/query
Content-Type: application/json

{
  "workspaceGuid": "12345678-1234-1234-1234-123456789abc",
  "kqlQuery": "Heartbeat | where Computer == 'vm-name' | top 10 by TimeGenerated desc"
}
Security validation:
# From log_analytics.py:396-401
dangerous_keywords = ['delete', 'update', 'modify', 'insert', 'drop']
if any(keyword in kql_query.lower() for keyword in dangerous_keywords):
    return jsonify({"error": "Query contains disallowed keywords"}), 400

if f"Computer == '{vm_id}'" not in kql_query:
    return jsonify({"error": f"Query must include the filter 'where Computer == \"{vm_id}\"'"}), 400
All queries must filter by the specific VM to prevent unauthorized data access.
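Clients can run the same checks locally before submitting a query, avoiding a round trip for requests the server would reject anyway. A sketch mirroring the server-side validation (the `validate_kql_query` name is illustrative; note that the keyword check is a plain substring match, so it is deliberately coarse):

```python
def validate_kql_query(kql_query: str, vm_id: str):
    """Mirror the server-side checks: reject mutating keywords and
    require a per-VM filter. Returns (ok, error_message)."""
    dangerous_keywords = ['delete', 'update', 'modify', 'insert', 'drop']
    if any(keyword in kql_query.lower() for keyword in dangerous_keywords):
        return False, "Query contains disallowed keywords"
    if f"Computer == '{vm_id}'" not in kql_query:
        return False, f"Query must include the filter Computer == '{vm_id}'"
    return True, None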

Container Logs (ACI)

Query Azure Container Instance logs:
# From containermonitor.py:196-201
ContainerInstanceLog_CL
| where ContainerGroup_s == 'container-group-name'
| top 500 by TimeGenerated desc
Validation ensures container-specific filtering:
# From containermonitor.py:252-253
if f"ContainerGroup_s == '{container_group_name}'" not in kql_query:
    return jsonify({"error": "Query must include a container-specific filter"}), 400

GCP Log Queries (LQL)

Log Query Language (LQL)

GCP Cloud Logging uses a filter-based query language with structured fields.

VM System Logs

Query VM logs with resource filtering:
# From vmmonitor.py:306-376
POST /api/gcp/vm/{project_id}/{instance_id}/logs/query
Content-Type: application/json

{
  "lqlQuery": "resource.type=\"gce_instance\" resource.labels.instance_id=\"1234567890\" severity>=ERROR"
}
Common LQL Filters:
# Filter by resource type and instance
resource.type="gce_instance"
resource.labels.instance_id="1234567890"

# Filter by severity
severity>=ERROR
severity=INFO

# Filter by timestamp
timestamp>="2024-01-15T00:00:00Z"

# Filter by log name
logName="projects/my-project/logs/syslog"
Required filters:
# From vmmonitor.py:324-326
instance_filter = f'resource.labels.instance_id="{instance_id}"'
if instance_filter not in lql_query:
    return jsonify({"error": f"LQL query must include the filter: {instance_filter}"}), 400
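Since the instance filter is mandatory, it is convenient to generate LQL queries from a template rather than writing them by hand. A minimal sketch (the `build_vm_lql` helper is illustrative):

```python
def build_vm_lql(instance_id: str, min_severity: str = "ERROR") -> str:
    """Compose an LQL filter that always satisfies the required
    instance_id check, with an optional severity floor."""
    return (
        'resource.type="gce_instance" '
        f'resource.labels.instance_id="{instance_id}" '
        f'severity>={min_severity}'
    )
```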

Log Entry Processing

# From vmmonitor.py:332-362
entries_iterator = client.list_entries(
    filter_=lql_query,
    order_by=logging_v2.DESCENDING,  # Newest first
    page_size=100
)

for entry in entries_iterator:
    row = {
        "timestamp": entry.timestamp.isoformat(),
        "severity": entry.severity,
        "logName": entry.log_name.split('/')[-1]
    }
    
    # Extract payload fields dynamically
    payload = entry.payload
    if isinstance(payload, dict):
        for key, value in payload.items():
            row[key] = str(value)
    elif isinstance(payload, str):
        row["payload"] = payload
Response format:
{
  "columns": ["timestamp", "severity", "logName", "message"],
  "rows": [
    ["2024-01-15T10:30:45.123Z", "ERROR", "syslog", "Connection timeout"],
    ["2024-01-15T10:30:40.456Z", "INFO", "syslog", "Service started"]
  ]
}
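Because payload fields are extracted dynamically, different entries may contribute different keys. A sketch of how the per-entry dicts built in the loop above can be flattened into the columns/rows response shape (the `entries_to_table` name is illustrative):

```python
def entries_to_table(rows: list[dict]) -> dict:
    """Flatten per-entry dicts into the columns/rows response shape,
    preserving first-seen column order and padding missing fields."""
    columns: list[str] = []
    for row in rows:
        for key in row:
            if key not in columns:
                columns.append(key)
    return {
        "columns": columns,
        "rows": [[row.get(col, "") for col in columns] for row in rows],
    }
```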

Cloud Run Logs

Query Cloud Run service logs:
# From containermonitor.py:192-255
POST /api/gcp/container/{project_id}/{service_name}/logs/query
Content-Type: application/json

{
  "lqlQuery": "resource.type=\"cloud_run_revision\" resource.labels.service_name=\"my-service\" severity>=WARNING"
}
Required filter:
# From containermonitor.py:208-210
service_filter = f'resource.labels.service_name="{container_name}"'
if service_filter not in lql_query:
    return jsonify({"error": "LQL query must include the service_name filter"}), 400

Log Export

Azure CSV Export

Export Azure logs to CSV format:
GET /api/azure/vm/{vm_name}/logs/export?workspaceGuid={guid}&type=perf&hours=1
Parameters:
  • workspaceGuid: Log Analytics workspace GUID (required)
  • type: Log type (heartbeat or perf)
  • hours: Time range in hours (default: 1)
Implementation:
# From log_analytics.py:303-370
response = client.query_workspace(
    workspace_id=workspace_guid,
    query=query,
    timespan=timedelta(hours=timespan_hours)
)

if response.status == LogsQueryStatus.SUCCESS and response.tables:
    table = response.tables[0]
    output = io.StringIO()
    writer = csv.writer(output, delimiter=';')
    
    # Write header
    writer.writerow(table.columns)
    
    # Write data rows
    for row in table.rows:
        row_data = [str(item) if isinstance(item, datetime) else item for item in row]
        writer.writerow(row_data)
    
    csv_content = output.getvalue()
    return Response(
        csv_content,
        mimetype="text/csv",
        headers={"Content-Disposition": f"attachment;filename={vm_id}_{log_type}_logs.csv"}
    )
CSV format:
  • Delimiter: ; (semicolon)
  • Header row: Column names from query
  • Data rows: All matching log entries
  • Datetime values: ISO 8601 format
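The serialization step can be isolated from the workspace query. A self-contained sketch of the CSV formatting rules above (the `rows_to_csv` helper is illustrative):

```python
import csv
import io
from datetime import datetime

def rows_to_csv(columns, rows) -> str:
    """Serialize query results using the semicolon-delimited format
    described above, with datetimes rendered as ISO 8601."""
    output = io.StringIO()
    writer = csv.writer(output, delimiter=';')
    writer.writerow(columns)  # header row: column names from the query
    for row in rows:
        writer.writerow(
            [item.isoformat() if isinstance(item, datetime) else item for item in row]
        )
    return output.getvalue()
```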

GCP CSV Export

Export container logs:
GET /api/gcp/container/{container_name}/logs/export?workspaceGuid={guid}&hours=1&type=container
# From containermonitor.py:177-235
query = f"""
ContainerInstanceLog_CL
| where ContainerGroup_s == '{container_group_name}'
| top 500 by TimeGenerated desc
"""

response = client.query_workspace(
    workspace_id=workspace_guid,
    query=query,
    timespan=timedelta(hours=timespan_hours)
)

Common Query Patterns

Azure KQL Examples

Syslog
| where Computer == 'vm-name'
| where SeverityLevel == "err" or SeverityLevel == "crit"
| project TimeGenerated, Facility, SeverityLevel, SyslogMessage
| order by TimeGenerated desc

GCP LQL Examples

resource.type="gce_instance"
resource.labels.instance_id="1234567890"
severity>=ERROR

Query Performance

Azure Optimization

Use where clauses early in the query to filter data before aggregations.
Best practices:
  • Filter by Computer in the first line
  • Use specific time ranges with TimeGenerated
  • Limit results with top or take
  • Use summarize for aggregations instead of returning raw rows
  • Avoid search * across all tables
Query timing:
# From log_analytics.py:421
timespan=timedelta(days=1)  # Limit to 24 hours by default

GCP Optimization

Best practices:
  • Always filter by resource.type
  • Include resource.labels.instance_id or service_name
  • Set reasonable page_size limits (100-1000)
  • Use order_by=DESCENDING for recent logs
  • Avoid very broad time ranges without filters
Default limits:
# From vmmonitor.py:335
page_size=100  # Limit results per page

Authentication

Azure Authentication

Uses service principal credentials:
# From log_analytics.py:312-316
credential = ClientSecretCredential(
    tenant_id=TENANT_ID,
    client_id=CLIENT_ID,
    client_secret=CLIENT_SECRET
)
client = LogsQueryClient(credential)

GCP Authentication

Uses session-based OAuth credentials:
# From vmmonitor.py:329
credentials = SessionCredentials(gcp_account)
client = logging_v2.Client(credentials=credentials, project=project_id)

Error Handling

Common Errors

Azure KQL: Check for missing pipes (|), invalid operators, or typos in table names.
GCP LQL: Ensure proper quote matching and valid filter syntax.

No results returned:
  • Verify the time range includes log data
  • Check resource filters match actual resource names
  • Ensure logs are being collected (check agent status)
  • Verify workspace/project permissions

Query timeouts:
  • Reduce time range
  • Add more specific filters
  • Use top or page_size to limit results
  • Check query complexity and aggregations

Permission errors:
  • Verify workspace GUID is correct (Azure)
  • Check service principal has Log Analytics Reader role
  • Ensure GCP account has Logs Viewer role
  • Validate authentication credentials

Next Steps

Log Analytics

Set up Azure Log Analytics workspaces

Metrics Collection

View performance metrics from your resources
