# Altimate Data Skills

Claude Code skills for analytics and data engineers working with dbt and Snowflake.

Altimate Data Skills is a collection of Claude Code skills that encode the workflows and best practices of experienced analytics engineers. These skills transform Claude from a code generator into a capable data engineering assistant.
- 53% accuracy on ADE-bench (43 real-world dbt tasks)
- +25-point improvement on model creation tasks vs baseline (40% → 65%)
- 84% pass rate on Snowflake query optimization (62 TPC-H queries, 1TB dataset)
- 3.6x better performance gains vs baseline (16.8% avg improvement vs 4.7%)
- Skills that teach Claude how to work, not just what to write
## Installation

Add the marketplace:

```shell
/plugin marketplace add AltimateAI/data-engineering-skills
```

Install individual skill packs:

```shell
# Install dbt skills
/plugin install dbt-skills@data-engineering-skills

# Install Snowflake skills
/plugin install snowflake-skills@data-engineering-skills

# Install altimate-code delegation skill
/plugin install altimate-code@data-engineering-skills
```

## dbt Skills

| Skill | Purpose | Key Behaviors |
|---|---|---|
| creating-dbt-models | Model creation | Convention discovery → Write → Build → Verify output |
| debugging-dbt-errors | Error troubleshooting | Read full error → Check upstream → Apply fix → Rebuild |
| testing-dbt-models | Schema tests | Study existing test patterns → Match project style |
| documenting-dbt-models | Documentation | Analyze model → Generate descriptions |
| migrating-sql-to-dbt | Legacy SQL conversion | Parse SQL → Create proper dbt model |
| refactoring-dbt-models | Safe restructuring | Track dependencies → Apply changes → Verify downstream |
| developing-incremental-models | Incremental models | Strategy selection → unique_key design → Handle edge cases |
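As a sketch of the kind of model the developing-incremental-models skill steers toward, here is a minimal incremental model. This is illustrative only — the model, table, and column names are hypothetical, not from this repo's skills:

```sql
-- Hypothetical incremental model sketch; names are illustrative
{{ config(
    materialized='incremental',
    unique_key='event_id',
    incremental_strategy='merge'
) }}

select
    event_id,
    user_id,
    event_type,
    occurred_at
from {{ ref('stg_events') }}

{% if is_incremental() %}
  -- on incremental runs, only process rows newer than what is already loaded
  where occurred_at > (select max(occurred_at) from {{ this }})
{% endif %}
```

The skill's "strategy selection → unique_key design" flow decides, for example, between `merge` and `delete+insert` and which column(s) uniquely identify a row before any SQL is written.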
## Snowflake Skills

| Skill | Purpose | Key Behaviors |
|---|---|---|
| finding-expensive-queries | Cost analysis | Find and rank queries by cost/time/data scanned |
| optimizing-query-by-id | Performance tuning | Optimize using query ID from history |
| optimizing-query-text | Performance tuning | Profile query → Identify bottlenecks → Apply patterns |
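A typical starting point for finding-expensive-queries is a ranking over Snowflake's query history. This is a hedged sketch against the standard `SNOWFLAKE.ACCOUNT_USAGE` views; the skill's actual query may differ:

```sql
-- Rank last week's queries by elapsed time and data scanned
select
    query_id,
    total_elapsed_time / 1000 as elapsed_seconds,
    bytes_scanned,
    warehouse_name
from snowflake.account_usage.query_history
where start_time >= dateadd(day, -7, current_timestamp())
order by total_elapsed_time desc
limit 20;
```

The `query_id` values this returns are exactly what optimizing-query-by-id takes as input.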
## Delegation

| Skill | Purpose | Key Behaviors |
|---|---|---|
| altimate-code | Hand off data tasks to altimate-code | Verify install → Invoke `altimate-code run --yolo` non-interactively → Read output file → Summarize for user |
Use this skill when a task needs altimate-code's wired-up warehouse tools, column lineage, multi-step data exploration, or its 100+ specialized data tools.
Requires altimate-code: `npm install -g altimate-code` (Node 20+). Docs: docs.altimate.sh · Source: AltimateAI/altimate-code. The skill detects a missing install and surfaces the exact install command to the user.
## How Skills Work

Skills are markdown files that teach Claude how to approach tasks, not just what syntax to use. Each skill has two parts.

**Frontmatter — when should this skill activate?**

```markdown
---
name: creating-dbt-models
description: |
  Guide for creating dbt models. ALWAYS use this skill when:
  (1) Creating ANY new model (staging, intermediate, mart)
  (2) Task mentions "create", "build", "add" with model/table
  (3) Modifying model logic or columns
---
```

**Body — what steps should Claude follow?**

```markdown
# dbt Model Development

**Read before you write. Build after you write. Verify your output.**

## Critical Rules

1. ALWAYS run `dbt build` after creating models - compile is NOT enough
2. ALWAYS verify output after build using `dbt show`
3. If build fails 3+ times, stop and reassess your approach
...
```

## Skill Activation

Skills activate automatically based on your request:
| Your Request | Skill Activated |
|---|---|
| "Create a new orders model" | creating-dbt-models |
| "Fix this compilation error" | debugging-dbt-errors |
| "Add tests to the customers model" | testing-dbt-models |
| "Document the revenue metrics" | documenting-dbt-models |
| "Create an incremental model for events" | developing-incremental-models |
| "This query is slow, optimize it" | optimizing-query-text |
## MCP Integration

Skills become even more powerful when combined with Altimate's MCP server, which provides real-time access to your dbt project and data warehouse:

| MCP Tool | What It Provides |
|---|---|
| `dbt_project_info` | Project structure, model list, sources |
| `dbt_model_details` | Column types, dependencies, compiled SQL |
| `dbt_compile` | Compile models without the CLI |
| `snowflake_query_history` | Recent query executions and stats |
| `snowflake_table_stats` | Row counts, clustering info |
## Kits

Kits bundle skills, MCP servers, and instructions into a single activatable unit. Instead of installing skills one by one, activate a kit to get a complete development setup.

| Kit | Description | Skills | MCP |
|---|---|---|---|
| dbt-snowflake | Complete dbt + Snowflake setup | 9 skills | dbt MCP server |

```shell
# Install the kit
altimate-code kit install AltimateAI/data-engineering-skills

# Activate for your project
altimate-code kit activate dbt-snowflake

# Check what's active
altimate-code kit status
```

See kits/README.md for the full kit format reference and how to create your own.
## Benchmarks

### ADE-bench (dbt tasks)

Evaluated using ADE-bench, a framework for evaluating AI agents on analytics engineering tasks. All tests were run using Claude Sonnet 4.5.

Overall results:
| Configuration | Accuracy | Tasks Resolved |
|---|---|---|
| Baseline Claude (no skills) | 46.5% | 20/43 |
| Claude + Skills | 53.5% | 23/43 |
Results by category:

| Category | Baseline | With Skills | Improvement |
|---|---|---|---|
| Model Creation | 40% | 65% | +25 pts |
| Bug Fixing | 60% | 70% | +10 pts |
| Debugging | 35% | 50% | +15 pts |
| Refactoring | 30% | 35% | +5 pts |
| Analysis | 25% | 30% | +5 pts |
### Snowflake Query Optimization (TPC-H)

Benchmark on the TPC-H 1TB dataset (62 queries) testing the optimizing-query-text skill. All tests were run using Claude Sonnet 4.5.
| Configuration | Pass Rate | Avg Performance Improvement |
|---|---|---|
| Baseline Claude (no skills) | 77.4% (48/62) | 4.7% |
| Claude + Skills | 83.9% (52/62) | 16.8% (3.6x better) |
Skills provide structured optimization with query profiling, anti-pattern detection, and semantic preservation validation.
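To illustrate the anti-pattern detection and semantic preservation steps, here is a hedged sketch using TPC-H names — not the skill's actual implementation:

```sql
-- Anti-pattern: wrapping the filter column in a function blocks partition pruning
select o_orderkey, o_totalprice
from orders
where year(o_orderdate) = 1995;

-- Rewrite: a sargable date range lets Snowflake prune micro-partitions
select o_orderkey, o_totalprice
from orders
where o_orderdate >= '1995-01-01'
  and o_orderdate < '1996-01-01';

-- Semantic preservation check: run MINUS in both directions;
-- zero rows each way means the rewrite returns the same result set
select o_orderkey, o_totalprice from orders
where year(o_orderdate) = 1995
minus
select o_orderkey, o_totalprice from orders
where o_orderdate >= '1995-01-01' and o_orderdate < '1996-01-01';
```

The same range-predicate pattern applies to any function-on-column filter (`to_char`, `date_trunc`, casts) over a pruning-eligible column.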
Note: This benchmark uses our internal evaluation framework. We plan to open-source it soon with additional evals.
## Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.
Ideas for contributions:
- New skills for workflows we haven't covered
- Improvements to existing skills based on your team's patterns
- Benchmark results on different datasets
- Bug reports and feature requests
## Roadmap

We're actively developing:
- Airflow skills — DAG development, debugging, testing
- Cross-platform migration — dbt to/from SQL Server, Oracle
- Snowflake cost optimization — Warehouse sizing, query patterns
- Data quality workflows — Anomaly detection, freshness checks
## License

This project is licensed under the MIT License - see the LICENSE file for details.
Built by the team at Altimate AI — Making data engineering delightful.