AWS Data Engineer

  • Sector: Monroe Information Technology
  • Contact: Justine Danielle Bituin
  • Client: Monroe Consulting Group
  • Location: Pasig
  • Salary: PHP 70,000 - PHP 80,000 per month
  • Expiry Date: 16 December 2021
  • Job Ref: BBBH226870_1637055527
  • Contact Email: justine.bituin@monroeconsulting.com.ph

Executive recruitment company Monroe Consulting Group Philippines is recruiting on behalf of a global IT-BPM company. Our respected client offers services ranging from application development to back-office support for its valued clients, and it is looking for an Information Technology (IT) professional to fill the role of AWS Data Engineer. The job will be based in Ortigas, Pasig City, Metro Manila, Philippines.

Job Summary:

The AWS Data Engineer will work with our ML Engineers, Data Scientists, and various business units to define solutions for operationalizing data-driven decision-making in a cost-effective and scalable manner.

Key job responsibilities include:

  • Design, build, and maintain efficient, reusable, and reliable architecture and code
  • Build reliable and robust data ingestion pipelines (within AWS, on-premises to AWS, etc.)
  • Ensure the best possible performance and quality of high-scale data engineering projects
  • Participate in architecture and system design discussions
  • Independently perform hands-on development and unit testing of applications
  • Collaborate with the development team and build individual components into complex enterprise web systems
  • Work in a team environment with product, production operations, QE/QA, and cross-functional teams to deliver projects throughout the whole software development cycle
  • Identify and resolve any performance issues
  • Keep up to date with new technology developments and implementations
  • Participate in code reviews to ensure standards and best practices are met

Key job requirements include:

  • Bachelor's degree in Computer Science, Software Engineering, MIS or equivalent combination of education and experience
  • Experience implementing and supporting data lakes, data warehouses, and data applications on AWS for large enterprises
  • Programming experience with Python, shell scripting, and SQL
  • Solid experience with AWS services such as CloudFormation, S3, Athena, Glue, EMR/Spark, RDS, Redshift, DynamoDB, Lambda, Step Functions, IAM, KMS, SM, etc.
  • Solid experience implementing solutions on AWS-based data lakes
  • Experience in AWS data lakes, data warehouses, and business analytics
  • Experience in system analysis, design, development, and implementation of data ingestion pipelines in AWS
  • Knowledge of ETL/ELT
  • Experience delivering end-to-end data solutions (ingestion, storage, integration, processing, access) on AWS
  • Ability to architect and implement a CI/CD strategy for EDP
  • Ability to implement high-velocity streaming solutions using Amazon Kinesis, SQS, and Kafka (preferred)
  • Experience migrating data from traditional relational database systems, file systems, and NAS shares to AWS relational databases such as Amazon RDS, Aurora, and Redshift
  • Experience migrating data from APIs to the AWS data lake (S3) and relational databases such as Amazon RDS, Aurora, and Redshift
  • Ability to implement POCs for new technologies or tools to be adopted on EDP and onboard them for real use cases
  • AWS Solutions Architect or AWS Developer Certification preferred
  • 5+ years of experience as a Data Engineer
  • Experience developing business applications using SQL databases
  • Experience working in the cloud (AWS preferred)
  • Good experience with AWS services: S3, Athena, Glue, Lambda, Step Functions, SQS, and Redshift
  • Knowledge of Snowflake is a plus