
    Product Manager – Enterprise Data Warehouse

    BettingJobs is excited to present an exceptional opportunity in Ras Al Khaimah, UAE, for ambitious individuals looking to take on a career-defining role.

    Set to open in early 2027, our client is developing a landmark 72-floor resort in Ras Al Khaimah featuring a casino, hotel, multiple restaurants, and a marina.

    In preparation for this, they are looking to add a Product Manager – Enterprise Data Warehouse to their corporate team.

    Responsibilities:

    • Define and drive the vision, strategy, and roadmap for the enterprise data platform.
    • Champion modern data architectures (lakehouse, streaming pipelines, real-time analytics).
    • Develop long-term strategies for data storage, processing, governance, and activation.
    • Oversee end-to-end data pipelines (batch ingestion, real-time streaming, storage, transformation, activation).
    • Ensure best practices in ETL/ELT, schema design, and data quality.
    • Lead integration of real-time streaming technologies (Kafka, Flink, Confluent).
    • Implement data governance (metadata management, lineage tracking, data cataloging).
    • Ensure security and compliance (GDPR, CCPA, HIPAA).
    • Standardize data quality and master data management (MDM).
    • Collaborate with engineering, AI/ML, BI, marketing, and CRM teams for seamless data access.
    • Support MLOps (model training, tracking, deployment) with platforms like Databricks and MLflow.
    • Enable data activation via CRM integrations and omnichannel platforms.
    • Optimize cloud compute resources (Azure) for cost and performance.
    • Drive innovation in data storage, query optimization, and real-time analytics.
    • Evaluate and implement new data technologies for scalability and efficiency.

    Requirements:

    • 7+ years of experience in product management, data engineering, or data platform architecture.
    • Deep knowledge of data warehousing (Snowflake) and data lake solutions (Azure Data Lake, Delta Lake, Iceberg).
    • Strong understanding of data engineering pipelines, ETL/ELT, and real-time streaming (Kafka, Flink, Spark, Databricks).
    • Hands-on experience with data governance tools, data catalogs, and lineage tracking (Coalesce, Atlan, Alation).
    • Familiarity with AI/ML platforms (Databricks, MLflow, Feature Stores, AutoML) and their data dependencies.
    • Strong experience in SQL, Python, and cloud-native data processing.
    • Ability to bridge technical and business requirements, working cross-functionally across multiple teams.

    Get in touch