
Entrada is a global boutique Databricks Consulting Partner specializing in industry solutions and business results for companies of all sizes. They offer a comprehensive suite of services including Data Engineering, Data Strategy & Governance, Data Migrations, Concierge & Support, Artificial Intelligence, Training & Enablement, and Data Analytics & Insights. Entrada focuses on accelerating modern data initiatives with Databricks, helping clients achieve rapid value, minimize risk, and boost ROI. Their approach is driven by a deep understanding of business needs, technological excellence, and proprietary accelerators and solutions. The company boasts a team of experts, including Databricks MVPs, and has a global presence with delivery centers across the Americas, Latin America, India, and Europe.
Data Architect
Tech Stack: Databricks, Delta Lake, Unity Catalog, Terraform, Python/PySpark, Identity management platform, API integration, Data Governance frameworks, dbt
Technical Scope:
- Design enterprise data platform strategy spanning 3+ workspaces (dev/test/prod)
- Architect multi-layered medallion framework (bronze/silver/gold) supporting 20+ downstream applications with clear data contracts
- Define Databricks governance and security architecture (Unity Catalog hierarchy, RBAC, PII classification, compliance controls)
- Lead Terraform Infrastructure-as-Code standardization for repeatable, auditable workspace provisioning and cost control
- Design identity management and Databricks integration pattern (SSO, role-based provisioning, automated access governance)
- Establish data quality and reconciliation standards to validate legacy system business logic translation (90+ packages)
- Create FinOps strategy: cluster tagging, cost allocation, workspace chargeback models
- Design API integration architecture for enterprise data distribution (rate limiting, idempotency, retry logic)
- Define migration phasing strategy and rollback/validation approach for production cutover
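The API integration patterns named above (retry logic, idempotency) could be sketched roughly as follows. This is an illustrative Python sketch only, not Entrada's implementation; the `send` callable and `TransientError` type are hypothetical stand-ins for whatever HTTP client and error taxonomy the platform actually uses:

```python
import time
import uuid

class TransientError(Exception):
    """Retryable failure, e.g. HTTP 429 (rate limited) or 503."""

def call_with_retry(send, payload, max_attempts=4, base_delay=0.5):
    # One idempotency key per logical request: every retry reuses it,
    # so the downstream system can safely deduplicate.
    idempotency_key = str(uuid.uuid4())
    for attempt in range(max_attempts):
        try:
            return send(payload, idempotency_key)
        except TransientError:
            if attempt == max_attempts - 1:
                raise  # retries exhausted; surface the error
            time.sleep(base_delay * 2 ** attempt)  # backoff: 0.5s, 1s, 2s, ...
```

The key design point is that the idempotency key is generated once per logical request, outside the retry loop, so a server-side duplicate of a retried call can be detected and discarded.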
Strategic Responsibilities:
- Partner with CTO/Engineering Leadership on long-term data platform roadmap and scalability planning
- Mentor Senior Data Engineers and guide implementation teams on architectural decisions
- Define data asset ownership and SLA expectations across engineering and business teams
- Establish governance policies (PII handling, access controls, audit trails) aligned with enterprise compliance
- Manage dependencies: coordinate with Finance, IT (network/security), and external vendors
- Translate business requirements into technical architecture (e.g., "enterprise source system feeds 20+ downstream systems" → medallion schema design)
Possible Challenges:
- Designing Databricks workspace isolation and security that scales as organization expands analytics use cases
- Identity management connector may require custom API bridge if standard integration unavailable
- Balancing governance rigor (Unity Catalog row/column filters) with team autonomy and development velocity
- Predicting cost impact of multi-workspace topology and setting realistic chargeback/FinOps models early
- Ensuring data quality framework captures 90+ legacy business rules without becoming unmaintainable
- Managing multi-team execution: aligning integration teams on unified architecture
- Handling evolving requirements from downstream systems; resisting scope creep with clear Phase 1/2/3 roadmaps
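One common way to keep 90+ legacy business rules maintainable, offered here only as an illustrative sketch (the rule names and fields are hypothetical), is a declarative rule registry where each rule is a small, independently reviewable predicate:

```python
# Registry mapping rule name -> predicate. Each legacy business rule
# becomes one named function, so rules can be added, reviewed, and
# retired individually instead of growing a monolithic validator.
RULES = {}

def rule(name):
    def register(fn):
        RULES[name] = fn
        return fn
    return register

@rule("amount_non_negative")
def amount_non_negative(row):
    return row["amount"] >= 0

@rule("currency_supported")
def currency_supported(row):
    return row["currency"] in {"USD", "EUR", "GBP"}

def validate(row):
    """Return the names of all rules this row violates."""
    return [name for name, check in RULES.items() if not check(row)]
```

Because each rule is a one-line predicate with a stable name, the registry doubles as an audit artifact: reconciliation reports can reference rule names directly when comparing legacy and migrated outputs.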
Must-Have Skills:
- 5+ years data architecture/engineering experience (cloud data platforms, enterprise-scale)
- Deep Databricks expertise (workspaces, Unity Catalog, security models, cost optimization)
- Strong IaC experience (Terraform, version control, pipeline automation)
- SQL, PySpark, and dbt knowledge to design effective medallion architectures
- Identity and access management concepts (SAML/SSO, RBAC, principle of least privilege)
- Data governance frameworks (PII classification, lineage, audit controls)
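The least-privilege principle listed above can be illustrated with a minimal deny-by-default access check. This is a conceptual sketch, not Unity Catalog's actual permission model; the role names and securable identifiers are invented for the example:

```python
# Privileges are granted only through roles; anything not explicitly
# granted is denied (deny-by-default, principle of least privilege).
ROLE_PRIVILEGES = {
    "analyst": {"catalog.sales:SELECT"},
    "engineer": {"catalog.sales:SELECT", "catalog.sales:MODIFY"},
}

def is_allowed(user_roles, securable, privilege):
    wanted = f"{securable}:{privilege}"
    return any(wanted in ROLE_PRIVILEGES.get(r, set()) for r in user_roles)
```

An unknown role or an ungranted privilege simply resolves to a denial, which is the property auditors typically look for in an access-governance design.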
Nice-to-Have Skills:
- Identity management platform implementation experience
- iPaaS integration patterns
- REST API and webhook architecture
- Enterprise ERP domain knowledge
- dbt on Databricks implementation experience
- Legacy data warehouse migration patterns
Domain & Soft Skills (HIGH PRIORITY):
- Strategic mindset: Can articulate why architectural decisions matter to both technical and business stakeholders
- Communication: Fluent in translating between business, finance, security, and engineering teams
- Pushback capability: Willing to defend scope boundaries when requirements creep
- Hands-on credibility: Can code and validate designs (not just a PowerPoint architect)