
Tredence is a global data science solutions provider focused on bridging the gap between insights delivery and value realization, helping clients address the 'last mile' problem in AI. They achieve this by enabling the adoption of data science solutions, powered by industry-specific expertise and AI/ML accelerators through their Tredence Studio. The company emphasizes a vertical-first approach and an outcome-driven mindset to help clients accelerate value realization from their analytics investments. Tredence has been recognized as a 'Leader' in the Forrester Wave: Customer Analytics Services and is a Great Place to Work-Certified company. With over 3500 employees, Tredence serves major companies across retail, CPG, hi-tech, telecom, healthcare, travel, and industrials, recently expanding its capabilities in financial services through the acquisition of Further Advisory.

What they do: Data science and AI solutions provider focused on turning analytics/AI insights into business value ('last-mile' problem)
Headquarters: San Jose, California
Employees: ~3,337
Total funding: USD 205,000,000
Recent major investor: Advent International (Series B, Dec 2022)
Founded: 2013
Focus: Last-mile adoption of analytics and AI, turning insights into operational business value
Industry: Data science / AI services
Funding breakdown: USD 175,000,000 + USD 30,000,000
Investors: Private equity and venture investors (Advent International acquired a minority stake and joined the board; Chicago Pacific Founders is a meaningful investor)
As a GCP DBT Manager, you will work with the team to design, build, and maintain data pipelines and transformations using Google Cloud Platform (GCP) and dbt (Data Build Tool). This typically involves tools such as BigQuery, Cloud Composer, and Python, and requires strong SQL skills and a solid grasp of data warehousing concepts. The role also covers ensuring data quality, optimizing performance, and collaborating with cross-functional teams.

Role & responsibilities
Data Pipeline Development: Design, build, and maintain ETL/ELT pipelines using dbt and GCP services such as BigQuery and Cloud Composer.
Data Modeling: Create and manage data models and transformations in dbt to ensure efficient and accurate data consumption for analytics and reporting.
Data Quality: Develop and maintain a data quality framework, including automated testing and cross-dataset validation.
Performance Optimization: Write and optimize SQL queries for efficient data processing in BigQuery.
Collaboration: Work with data engineers, analysts, scientists, and business stakeholders to deliver data solutions.
Incident Resolution: Support day-to-day incident and ticket resolution related to data pipelines.
Documentation: Create and maintain comprehensive documentation for data pipelines, configurations, and procedures.
Cloud Platform Expertise: Use GCP services such as BigQuery, Cloud Composer, and Cloud Functions.
Scripting: Develop and maintain SQL/Python scripts for data ingestion, transformation, and automation tasks.

Preferred candidate profile
Experience: Typically 7-12 years of experience in data engineering or a related field.
GCP Proficiency: Strong hands-on experience with Google Cloud Platform (GCP) services, particularly BigQuery.
dbt Expertise: Proficiency in using dbt for data transformation, testing, and documentation.
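The dbt modeling and testing work described above can be sketched with a minimal staging model. All dataset, table, and column names here (raw_sales, orders, and so on) are hypothetical illustrations, not details from the posting:

```sql
-- Hypothetical dbt model: models/staging/stg_orders.sql
-- Standardizes a raw orders table before it feeds downstream marts.

with source as (

    -- source() lets dbt resolve the upstream table and build the DAG
    select * from {{ source('raw_sales', 'orders') }}

),

renamed as (

    select
        cast(order_id as string)      as order_id,
        cast(customer_id as string)   as customer_id,
        cast(order_ts as timestamp)   as ordered_at,
        cast(total_amount as numeric) as order_total,
        lower(trim(status))           as order_status
    from source
    -- basic filter here; stricter checks belong in dbt tests
    where order_id is not null

)

select * from renamed
```

Column-level checks such as `unique` and `not_null` would be declared in a schema.yml alongside the model; `dbt test` (or `dbt build`) then runs them automatically, which is one common way to implement the automated-testing part of a data quality framework.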
SQL Proficiency: Advanced SQL skills for data modeling, performance optimization, and querying large datasets.
Data Warehousing: Understanding of data warehousing concepts, dimensional modeling, and star schema design.
ETL/ELT: Experience with ETL/ELT tools and frameworks, including Apache Beam, Cloud Dataflow, Data Fusion, or Airflow/Cloud Composer.
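As an illustration of the star-schema and BigQuery optimization points above, a query against a hypothetical fact/dimension pair might look like the sketch below. It assumes fact_sales is partitioned on order_date and clustered on customer_key; every name here is invented for the example:

```sql
-- Hypothetical star-schema query in BigQuery Standard SQL:
-- a fact table joined to a conformed dimension, aggregated by month.

select
    c.customer_region,
    date_trunc(f.order_date, month) as order_month,
    sum(f.order_total)              as revenue
from analytics.fact_sales as f
join analytics.dim_customer as c
    on f.customer_key = c.customer_key
-- filtering on the partition column enables partition pruning,
-- which reduces bytes scanned (and therefore query cost)
where f.order_date >= date '2024-01-01'
group by customer_region, order_month
order by order_month, customer_region
```

Filtering and joining on partition and cluster columns is a typical first step when optimizing BigQuery costs, since billing is driven by bytes scanned rather than rows returned.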