Title: Data Engineer

The Data Engineer is responsible for designing scalable big data solutions, collaborating with stakeholders to meet business needs, and optimizing data platforms such as Snowflake and Databricks. Key responsibilities include:

  • Designing, developing, and maintaining data pipelines and infrastructure.

  • Assisting in the implementation and optimization of data warehouses and data lakes.

  • Collaborating with cross-functional teams to understand business requirements.

  • Applying best practices in data pipeline architecture and ETL processes.

  • Contributing to the adoption and integration of modern data engineering tools and technologies.

  • Supporting data governance, quality, and compliance initiatives.

Job Qualifications:

Education:

  • Bachelor’s degree in Computer Science, Computer Engineering, Information Technology, or a related field.

Experience:

  • 3-5 years of experience in data engineering.

  • Proficiency in Python, Java, or Scala.

  • Experience with Snowflake, Databricks, Hadoop, Spark, and Kafka.

  • Strong problem-solving and communication skills.

  • Experience with cloud platforms (AWS, Azure, Google Cloud) and machine learning/AI technologies is a plus.

Essential Traits:

  • Hands-on experience with data warehouse and big data technologies.

  • Basic experience with real-time data streaming and event-driven architectures.

  • Strong software engineering skills.

  • Understanding of data security and compliance.

  • Effective collaboration with business stakeholders and cross-functional teams.

  • Problem-solving and troubleshooting abilities.

  • Basic knowledge of Docker and Kubernetes.

  • Understanding of data governance frameworks.

  • Experience in Agile development environments.

Work Conditions:

  • Flexibility to collaborate with global teams; occasional travel may be required.

Location:

  • Ortigas, Pasig (Hybrid Work Setup)