Title: Technology Architect

The Technology Architect is a critical role responsible for architecting scalable big data solutions, collaborating with stakeholders to meet business needs, and optimizing data platforms such as Snowflake and Databricks.

In this role, you will lead the development of data models, flows, and pipelines, ensuring data security, governance, and compliance. You will also evaluate the future roadmap of existing tools, introduce new platforms or solutions, and enhance current systems. Drawing on extensive experience in data architecture, strong software engineering skills, and expertise in big data technologies such as Hadoop, Spark, and Kafka, you will provide technical leadership, drive innovation, and foster a culture of continuous improvement.

Responsibilities:

  • Design, develop, and maintain scalable data platforms and infrastructure, ensuring high performance, reliability, and security.
  • Oversee the implementation and optimization of data warehouses, particularly Snowflake and Databricks.
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Implement best practices in data platform architecture and ETL processes to support data analytics and reporting.
  • Drive the adoption of modern data engineering & analytics tools and technologies, staying up-to-date with industry trends.
  • Ensure data governance, quality, and compliance across all data platforms.


Qualifications:

  • Bachelor's degree in a relevant field, such as Computer Science, Computer Engineering, or Information Technology.
  • Tool and technology certifications are optional.
  • Extensive experience in data platform architecture and solution design, with a minimum of 8-10 years of relevant experience.
  • Strong software engineering skills, with proficiency in programming languages such as Python, Java, or Scala.
  • Proven expertise in data warehouse technologies, including Snowflake and Databricks.
  • Experience with big data technologies like Hadoop, Spark, and Hive.
  • Familiarity with real-time data streaming and processing using Apache Kafka.
  • Solid understanding of data modeling, ETL processes, and data pipeline architecture.
  • Strong problem-solving skills and the ability to troubleshoot complex data issues.
  • Excellent communication and collaboration skills, with the ability to work effectively in a team-oriented environment.
  • Demonstrated ability to lead and mentor a team of engineers, fostering a culture of innovation and continuous improvement.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Experience with machine learning and AI technologies is a plus.
  • Must be willing to work in Ortigas, Pasig (hybrid work setup).