Senior Manager - Data Platform Engineering

PepsiCo

India · Hyderabad, Telangana, India
Posted on Feb 10, 2026
Overview

The Data Platform & Infrastructure Engineering Associate role at PepsiCo actively contributes to the development, maintenance, and optimization of the company's data platform and infrastructure. The position plays a crucial role in ensuring the reliability, scalability, and performance of PepsiCo's data systems, supporting the organization in making informed business decisions.

Responsibilities
  • Collaborating with cross-functional teams to design, implement, and maintain data platforms that support various business needs. This involves working with cutting-edge technologies to ensure the platform meets performance, security, and scalability requirements
  • Managing and optimizing the infrastructure that supports data processing, storage, and retrieval. This includes implementing best practices for data storage, ensuring data integrity, and monitoring system performance to identify and address potential issues
  • Enforcing data governance policies and security measures to safeguard sensitive information. This involves implementing access controls, encryption, and other security protocols to protect the integrity and confidentiality of PepsiCo's data assets
  • Collaborating with data scientists, analysts, and other stakeholders to understand their data requirements and providing technical expertise to support their initiatives. This role may involve working closely with cross-functional teams to integrate data solutions into various business processes
  • Staying abreast of industry trends and emerging technologies to propose and implement innovative solutions that enhance the efficiency and effectiveness of PepsiCo's data platform. Continuously improving existing infrastructure and processes to meet evolving business needs
  • Troubleshooting and resolving issues related to data platform performance, data quality, and infrastructure. Proactively identifying potential challenges and implementing solutions to ensure a seamless and reliable data environment
Qualifications
  • Hands-on expertise with data warehouse tools like Snowflake, Databricks, and Redshift, including architecture design, warehouse sizing, RBAC/security implementation, SQL optimization, and performance tuning.
  • Strong Kubernetes (AKS/EKS) engineering skills covering cluster architecture, networking, namespaces, pod security, autoscaling, ingress, operators, and workload troubleshooting.
  • Advanced experience with Helm for templating, packaging, release automation, and deployment lifecycle management.
  • Expert-level Infrastructure as Code development using Terraform, including module design, state management, automated provisioning, variable strategies, and CI integration.
  • Strong Python programming skills for automation, orchestration, data transformations, API scripting, and platform tooling (an illustrative sketch follows this list).
  • Expert-level SQL capabilities including query optimization, stored procedure development, analytical function usage, and schema design.
  • Strong CI/CD engineering experience using GitHub Actions, including workflow design, automated testing, deployment pipelines, and artifact management.
  • Deep hands-on experience with Azure and/or AWS cloud services including identity/security management, compute, networking, storage, monitoring, and cost management.
  • Proficiency in implementing platform observability and SRE practices (metrics, logging, alerting, tracing, SLO/SLI development, capacity planning, and incident response); a brief error-budget sketch follows this list.
  • Strong foundation in data engineering concepts including ELT/ETL pipelines, data modeling, data quality frameworks, and metadata management.
  • Experience implementing enterprise-grade security controls including encryption, IAM/RBAC governance, secrets management, network segmentation, and audit/compliance.
  • Ability to troubleshoot end-to-end platform performance issues across compute, storage, orchestration, network, SQL, Snowflake, and cloud environments.
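
As a rough illustration of the Python automation and API scripting called for above, here is a minimal sketch of a platform health-check tool. It is not part of the posting: the endpoint URL, the JSON fields ("status"/"services"), and the exit codes are hypothetical assumptions chosen for the example.

    """Minimal sketch: poll a (hypothetical) platform health endpoint and report degraded services."""
    import json
    import sys
    from urllib.error import URLError
    from urllib.request import urlopen

    # Hypothetical endpoint; a real platform would expose its own URL and schema.
    HEALTH_URL = "https://dataplatform.example.com/api/health"

    def check_platform_health(url: str = HEALTH_URL, timeout: float = 5.0) -> int:
        """Fetch the health endpoint and return a shell-style exit code."""
        try:
            with urlopen(url, timeout=timeout) as resp:
                payload = json.load(resp)
        except (URLError, ValueError) as exc:
            print(f"health check failed: {exc}", file=sys.stderr)
            return 2
        # Assumed response shape: {"services": {"name": "healthy" | "degraded", ...}}
        degraded = [name for name, state in payload.get("services", {}).items()
                    if state != "healthy"]
        if degraded:
            print("degraded services: " + ", ".join(sorted(degraded)))
            return 1
        print("all services healthy")
        return 0

    if __name__ == "__main__":
        sys.exit(check_platform_health())

Run as a script (for example from a scheduled job or a GitHub Actions step), the non-zero exit code makes the check easy to wire into alerting.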
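
Similarly, as a rough illustration of the SLO/SLI and error-budget work referenced in the observability qualification, here is a minimal sketch of error-budget arithmetic. The 99.9% target and the request counts are illustrative assumptions, not figures from the posting.

    """Minimal sketch: how much of an availability error budget remains for a reporting window."""

    def error_budget_remaining(slo_target: float, total_requests: int, failed_requests: int) -> float:
        """Return the unspent fraction of the error budget (negative means the budget is blown)."""
        if total_requests == 0:
            return 1.0  # no traffic, nothing spent
        # Budget expressed in requests: the failures the SLO allows over this window.
        allowed_failures = (1.0 - slo_target) * total_requests
        if allowed_failures == 0:
            return 1.0 if failed_requests == 0 else -1.0  # a 100% SLO leaves no budget
        return 1.0 - (failed_requests / allowed_failures)

    if __name__ == "__main__":
        # Example: a 99.9% availability SLO over 1,000,000 requests with 250 failures
        # allows 1,000 failures, so 75% of the budget is still unspent.
        remaining = error_budget_remaining(0.999, 1_000_000, 250)
        print(f"error budget remaining: {remaining:.1%}")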