SENIOR DATA OPS ENGINEER


Experience

3 to 5+ years of data engineering experience.

3+ years of experience with data warehouse technical architectures, ETL/ELT, and reporting/analytics tools. Experience with any of the following combinations will be beneficial: (1) SSIS and SSRS, (2) SAS ETL Framework, (3) SAP ETL Framework, (4) MongoDB ETL deployments, (5) Apache Spark and Apache Hive deployments.

Experience working with large volumes of structured and unstructured data, and leveraging them to build artificial intelligence (AI), machine learning (ML), and predictive modelling (PM) solutions through end-to-end automated data pipelines.

Deep and extensive AWS knowledge and skills: EMR, Lambda, EC2, VPC, Transit Gateway, S3, IAM, Infrastructure as Code (e.g. CloudFormation), CodeCommit, CodePipeline, and CodeBuild. AWS experience with DynamoDB, RDS (MS SQL, Oracle), Landing Zone, Control Tower, and AMS would be advantageous.

DBA ability and knowledge across at least two platforms (for example T-SQL, SAS, PSQL, IBM VSAM, DB2) will also be beneficial.

Experience with the Python programming language.

Experience with designing and implementing Cloud (AWS) solutions, including the use of available APIs.

Experience with DevOps architecture, implementation, and operation would be advantageous.

Some experience in applying SAFe/Scrum/Kanban methodologies would be advantageous.

Knowledge of engineering and operational excellence using standard methodologies: best practices in software engineering, data management, data storage, data computing, and distributed systems to solve business problems with data.

Strong AWS cost management, billing resolution, and cost optimization skills.

Strong problem-solving skills: the ability to exercise judgment in solving technical, operational, and organizational challenges; to identify issues proactively; and to present solutions and options leading to resolution.

Qualifications
