
Arkham is a powerful Data & AI Platform designed to help mid-market and large enterprises unify fragmented systems and data, and solve complex operational challenges with AI models. The platform integrates tools for data unification, machine learning model training, and Generative AI application deployment, eliminating infrastructure complexity and tool fragmentation. Arkham focuses on creating an Ontology, a digital representation of business operations, as the foundation for AI solutions. Its 6-8 week implementation process delivers rapid ROI by addressing pressing challenges while building a foundation for AI transformation. Key clients include Circle K, Mexico Infrastructure Partners, and Editorial Televisa.

Arkham is a Data & AI Platform—a suite of powerful tools designed to help you unify your data and use the best Machine Learning and Generative AI models to solve your most complex operational challenges.
Today, industry leaders like Circle K, Mexico Infrastructure Partners, and Editorial Televisa rely on our platform to simplify access to data and insights, automate complex processes, and optimize operations. With our platform and implementation service, our customers save time, reduce costs, and build a strong foundation for lasting Data and AI transformation.
We are looking for a Senior Data Engineer to own our high-performance Data Platform based on the Lakehouse architecture. In this role, you will work with cutting-edge technologies such as Apache Spark, Trino, and Delta Lake, ensuring data governance and interoperability across platforms. You'll play a key role in shaping our data infrastructure, working across the entire data lifecycle—from ingestion to transformation and activation.
Key Responsibilities
End-to-End Data Lifecycle Management – Maintain high data quality and usability across integration, transformation, and activation stages.
Qualifications
Problem-Solving: Excellent analytical and debugging skills.