Data Engineer (Python) - 100% Remote job - Remote - Mexico only - Vacancy 103887 - MX

Posted more than 30 days ago.

Data Engineer (Python) - 100% Remote at Gta Telecom De México

Salary not disclosed

Remote: Mexico

Full-time employee

English: advanced level

Position Overview:

We are seeking a highly skilled Senior Data Engineer to join our team and collaborate with our US-based client on a dynamic project. As a Senior Data Engineer, you will play a pivotal role in designing, developing, and maintaining robust data pipelines and infrastructure to support the client's data-driven initiatives. The ideal candidate will possess strong expertise in Python and SQL, and a solid understanding of key data engineering technologies such as AWS Glue, PySpark, dbt, and Airflow.


Responsibilities:

- Collaborate with our US-based client to understand their data requirements and translate them into scalable, efficient data solutions.

- Design, develop, and deploy data pipelines using Python, SQL, AWS Glue, PySpark, DBT, and Airflow to ingest, transform, and analyze large volumes of structured and unstructured data.

- Optimize and maintain existing data pipelines to ensure high performance, reliability, and scalability.

- Implement data quality checks, monitoring, and alerting mechanisms to proactively identify and address data issues.

- Work closely with cross-functional teams including data scientists, analysts, and stakeholders to gather requirements and deliver solutions that meet business objectives.

- Provide technical guidance and mentorship to junior team members, promoting best practices and continuous learning within the team.

- Stay current with emerging technologies and industry trends in data engineering and recommend innovative solutions to enhance data processing capabilities.
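To illustrate the data-quality-check responsibility above, here is a minimal sketch in plain Python; the field names (`user_id`, `amount`) and validation rules are hypothetical examples, not part of the actual project:

```python
# Minimal sketch of row-level data quality checks in a pipeline.
# Field names and rules are hypothetical examples.

def check_rows(rows):
    """Split ingested rows into valid rows and flagged issues."""
    valid, issues = [], []
    for i, row in enumerate(rows):
        if row.get("user_id") is None:
            issues.append((i, "missing user_id"))
        elif not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            issues.append((i, "invalid amount"))
        else:
            valid.append(row)
    return valid, issues

rows = [
    {"user_id": 1, "amount": 10.5},
    {"user_id": None, "amount": 3.0},
    {"user_id": 2, "amount": -1},
]
valid, issues = check_rows(rows)
```

In a real pipeline, the `issues` list would typically feed a monitoring or alerting mechanism rather than being discarded.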


Requirements:

- Bachelor's degree or higher in Computer Science, Engineering, or a related field.

- Proven experience as a Data Engineer or similar role, with a focus on designing and building data pipelines.

- Strong proficiency in Python and SQL for data manipulation and analysis.

- Hands-on experience with AWS services such as Glue, S3, Lambda, and EMR.

- Solid understanding of distributed computing frameworks such as PySpark.

- Familiarity with dbt (data build tool) for data transformation and modeling.

- Experience with workflow orchestration tools such as Airflow.

- Excellent communication skills and the ability to work effectively in a remote and collaborative environment.

- Strong problem-solving skills and attention to detail.

- Ability to thrive in a fast-paced and dynamic environment, managing multiple priorities with a customer-focused mindset.
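As an example of the Python/SQL proficiency described above, a self-contained sketch using Python's standard-library sqlite3 module; the `orders` table and its columns are made up for illustration:

```python
import sqlite3

# In-memory database standing in for a warehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 100.0), ("acme", 50.0), ("globex", 25.0)],
)

# A typical transformation step: aggregate raw rows into a summary.
totals = dict(
    conn.execute(
        "SELECT customer, SUM(amount) FROM orders GROUP BY customer"
    ).fetchall()
)
conn.close()
```

The same pattern of pushing aggregation into SQL rather than Python loops carries over to warehouse engines and PySpark SQL.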

Working conditions:

Mon-Fri, 9 am - 5 pm EST (at least 4-5 hours of overlap with the team)

100% remote

6-month contract (possible extension to 12 months).