Diagnose ClickHouse SELECT query performance, analyze query patterns, identify slow queries, and find optimization opportunities. Use for query latency and timeout issues.
npx skill4agent add altinity/skills altinity-expert-clickhouse-reporting

```sql
-- Check if data skipping indices exist
select
    database,
    table,
    name as index_name,
    type,
    expr,
    granularity
from system.data_skipping_indices
where database = '{database}' and table = '{table}'
```

```sql
-- Check how efficiently a query read its data
select
    query_id,
    read_rows,
    selected_marks,
    selected_parts,
    formatReadableSize(read_bytes) as read_bytes,
    round(read_rows / nullIf(selected_marks, 0)) as rows_per_mark
from system.query_log
where query_id = '{query_id}'
  and type = 'QueryFinish'
```

```sql
-- Always time-bound
where event_date >= today() - 1
-- or
where event_time > now() - interval 1 hour

-- Always limit
limit 100
```
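Putting the time bound, completion filter, and limit together, a sketch of a slow-query report against `system.query_log` (the 24-hour window, 20-row limit, and `query_head` truncation length are arbitrary choices, not part of the skill):

```sql
-- Top 20 slowest completed SELECTs in the last 24 hours
select
    query_id,
    query_duration_ms,
    read_rows,
    formatReadableSize(memory_usage) as memory,
    normalized_query_hash,
    substring(query, 1, 120) as query_head
from system.query_log
where event_date >= today() - 1
  and type = 'QueryFinish'
  and query_kind = 'Select'
order by query_duration_ms desc
limit 20
```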
```sql
-- Filter by type
where type = 'QueryFinish'        -- completed
where type like 'Exception%'      -- failed

-- By user
where user = 'analytics_user'

-- By query pattern
where query ilike '%SELECT%FROM my_table%'

-- By duration threshold
where query_duration_ms > 10000   -- > 10 seconds

-- By normalized hash (for a specific query pattern)
where normalized_query_hash = 1234567890
```

| Finding | Load Module | Reason |
|---|---|---|
| High memory queries | | Memory limits/optimization |
| Reading too many parts | | Part consolidation |
| Poor index selectivity | | Index/ORDER BY design |
| Cache misses | | Cache sizing |
| MV slow | | MV optimization |
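Before routing a finding to a module, it often helps to see which query patterns dominate overall; a sketch that aggregates `system.query_log` by `normalized_query_hash` (the window, limit, and truncation length are illustrative):

```sql
-- Recurring query patterns ranked by total time spent
select
    normalized_query_hash,
    count() as executions,
    round(avg(query_duration_ms)) as avg_ms,
    max(query_duration_ms) as max_ms,
    sum(read_rows) as total_read_rows,
    any(substring(query, 1, 100)) as sample_query
from system.query_log
where event_date >= today() - 1
  and type = 'QueryFinish'
group by normalized_query_hash
order by sum(query_duration_ms) desc
limit 100
```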
| Setting | Scope | Notes |
|---|---|---|
| `max_execution_time` | Query | Query timeout |
| `max_rows_to_read` | Query | Limit rows scanned |
| `max_bytes_to_read` | Query | Limit bytes scanned |
| `max_threads` | Query | Parallelism |
| `use_query_cache` | Query | Enable query result caching |
| `log_queries` | Server | Enable query logging |
| `log_queries_min_query_duration_ms` | Server | Log threshold |
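Query-scope settings such as `max_execution_time` can be attached to a single statement with a `SETTINGS` clause; a sketch with illustrative values (`my_table` is a hypothetical table name):

```sql
select count()
from my_table                        -- hypothetical table
where event_date >= today() - 1
settings
    max_execution_time = 30,         -- fail the query after 30 seconds
    max_rows_to_read = 100000000,    -- abort if more rows would be scanned
    max_threads = 8                  -- cap parallelism
```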