Found 4 Skills
Expert-level guidance on the Databricks platform, covering Apache Spark, Delta Lake, MLflow, notebooks, and cluster management.
Apache Spark distributed computing; use for processing large datasets across a cluster.
Columnar file format patterns, including partitioning, predicate pushdown, and schema evolution.
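The partitioning pattern named above can be sketched in plain Python: with hive-style `key=value` directory layouts, a reader prunes entire partitions by directory name before opening any file. The partition column (`year`), file names, and filter below are illustrative assumptions.

```python
import os
import tempfile

# Build a tiny hive-style layout: one directory per partition value.
root = tempfile.mkdtemp()
for year in (2022, 2023, 2024):
    part_dir = os.path.join(root, f"year={year}")
    os.makedirs(part_dir)
    with open(os.path.join(part_dir, "data.csv"), "w") as f:
        f.write(f"id,year\n1,{year}\n")

def pruned_paths(root, wanted_year):
    """Partition pruning: skip directories whose year= value cannot
    satisfy the filter, without opening a single data file."""
    keep = []
    for entry in sorted(os.listdir(root)):
        key, _, value = entry.partition("=")
        if key == "year" and int(value) == wanted_year:
            keep.append(os.path.join(root, entry, "data.csv"))
    return keep

paths = pruned_paths(root, 2024)  # only the year=2024 partition is read
```

Columnar engines apply the same idea one level deeper: predicate pushdown uses per-column statistics inside each file to skip row groups as well as whole partitions.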
Using DuckDB with remote cloud storage via the HTTPFS extension, fsspec, and Delta Lake integration. Covers S3, GCS, Azure, and S3-compatible endpoints.