Data Engineer, Remote (Mexico Only) for Adecco - Hireline México


Data Engineer at Adecco

$60,000 to $70,000 MXN (gross)

Remote: Mexico

Full-time employee

English level: Advanced

The Data Engineer will be responsible for expanding and optimizing our data-processing pipelines and for supporting our data analysts and data scientists on data-driven initiatives, ensuring an optimal data-delivery architecture that is consistent across ongoing projects. The hire must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing, or even redesigning, our company's data architecture to support our next generation of products and data initiatives.

Job Requirements:


Key Responsibilities:


     Create and maintain optimal data pipeline architecture.

     Implement solutions for Big Data challenges using Python, Apache Spark, and Apache Airflow.

     Implement processes and systems to monitor data quality, ensuring production data is always accurate and available to the key stakeholders and business processes that depend on it.

     Assemble large, complex data sets that meet functional and non-functional business requirements.

     Implement internal process improvements: automating manual processes, optimizing data delivery, etc.

     Work with business stakeholders to resolve data-related technical issues and support their data infrastructure needs.

     Create data tools that help analyst and data scientist team members build and optimize our product into an innovative industry leader.


Technical Knowledge Requirements:

     2+ years of experience developing and maintaining data pipelines.

     Extensive programming experience in Python.

     Experience with Big Data processing and scalable pipelines.

     Experience with SQL database systems.

     Experience with data cleansing, cataloging, and validation.

     Process-oriented, with strong documentation skills.

     Experience with Apache Airflow is a plus.

     Experience with the AWS platform, with emphasis on Amazon S3, Glue, Athena, EMR, RDS, and Lake Formation.


Non-Technical Skills:

     Excellent problem solving, analysis, and documentation skills.


Education/Training:

     Ability to speak, read, and write in English.

     BS or MS degree in Computer Science or a related technical field.

     2+ years of experience with Python, SQL, and NoSQL.

     2+ years of experience with AWS cloud services: EMR, RDS, Glue, Athena, and S3.

     1+ years of experience with schema design and dimensional data modeling.