Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Snowflake and other cloud-based tools.
- Implement data ingestion, transformation, and integration processes from various sources (e.g., APIs, flat files, databases).
- Optimize Snowflake performance through clustering, partitioning, and query tuning.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and security across all data pipelines and storage.
- Develop and maintain documentation related to data architecture, processes, and best practices.
- Monitor and troubleshoot data pipeline issues and ensure timely resolution.
Requirements:
- Working experience with tools such as Matillion, dbt models, and SNP Glu is highly recommended.
- Data modelling experience and exposure to the Medallion architecture.
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 4-5 years of experience in data engineering or a similar role.
- Strong hands-on experience with Snowflake (data modeling, performance tuning, SnowSQL, etc.).
- Proficiency in SQL and experience with scripting languages like Python or Shell.
- Experience with ETL/ELT tools such as dbt, Apache Airflow, Informatica, or Talend.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and services like S3, Lambda, or Data Factory.
- Understanding of data warehousing concepts and best practices.
- Excellent communication skills, a willingness to reskill and adapt, and the ability to build strong stakeholder relationships.
- An active team member, willing to go the extra mile and bring innovation to work.
Benefits:
- Attractive compensation package
- Company pension plan
- Remote working
- Flexible working models that adapt to individual life phases
- Health offers
- Individual development opportunities through our own Learning Academy as well as free access to LinkedIn Learning
- + two individual benefits
At Daimler Truck we are united in our purpose: We work for all who keep the world moving. Our impact as a global transportation company depends entirely on the contribution of each individual.
We believe that our individual strengths lead to the best team performance. Together, we create an inclusive culture where a deep sense of belonging fuels innovation and growth.
We welcome applications from everyone, people of all cultures and genders, different generations and phases of life, and people with disabilities. You can be your true self at Daimler Truck.
Job number: 5405
Publication period: 11/13/2025 - 12/31/2030
Location: Bangalore
Organization: Daimler Truck Innovation Center India Private Limited
Job Category: Finance/Controlling
Working hours: Full time
Benefits
- Barrier-free workplace
- Inhouse Doctor
- Good public transport
- Canteen-Cafeteria
- Parking
