Talent.com
Lead Data Engineer – (Remote – Latin America)

Bertoni Solutions, Costa Rica
10 days ago
Job Description

We are seeking a highly skilled Lead Data Engineer with strong expertise in PySpark, SQL, and Python, as well as Azure Data Factory, Synapse, Databricks, and Fabric, and a solid understanding of end-to-end ETL and data warehousing principles. The ideal candidate will have a proven track record of designing, building, and maintaining scalable data pipelines in a collaborative, fast-paced environment.

Key Responsibilities:

  • Design and develop scalable data pipelines using PySpark to support analytics and reporting needs.
  • Write efficient SQL and Python code to transform, cleanse, and optimize large datasets.
  • Collaborate with machine learning engineers, product managers, and developers to understand data requirements and deliver solutions.
  • Implement and maintain robust ETL processes to integrate structured and semi-structured data from various sources.
  • Ensure data quality, integrity, and reliability across pipelines and systems.
  • Participate in code reviews, troubleshooting, and performance tuning.
  • Work independently and proactively to identify and resolve data-related issues.
  • Contribute to Azure-based data solutions, including ADF, Synapse, ADLS, and other services.
  • Support cloud migration initiatives and DevOps practices.
  • Provide guidance on best practices and mentor junior team members when needed.

Qualifications

  • 8+ years of overall experience working with cross-functional teams (machine learning engineers, developers, product managers, analytics teams).
  • 3+ years of hands-on experience developing and managing data pipelines using PySpark.
  • 3 to 5 years of experience with Azure-native services, including Azure Data Lake Storage (ADLS), Azure Data Factory (ADF), Databricks, Azure Synapse Analytics / Azure SQL DB / Fabric.
  • Strong programming skills in Python and SQL.
  • Solid experience with ETL processes and end-to-end data modeling / data warehousing solutions.
  • Self-driven, resourceful, and comfortable working in dynamic, fast-paced environments.
  • Advanced written and spoken English is a must for this position (B2, C1, or C2 only).
  • Strong communication skills are a must.
Nice to have:

  • Databricks certification.
  • Knowledge of DevOps, CI/CD pipelines, and cloud migration best practices.
  • Familiarity with Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB.
  • Basic understanding of SAP HANA.
  • Intermediate-level experience with Power BI.
Additional Information

Please note that we will not be moving forward with any applicants who do not meet the following mandatory requirements:

  • 3+ years of experience with PySpark / Python, ETL and data warehousing processes, Azure Data Factory, Synapse, Databricks, Azure Data Lake Storage, Fabric, Azure SQL DB, etc.
  • Proven leadership experience in a current or previous project / work experience.
  • Advanced written and spoken English fluency is a MUST (B2 level to C1/C2).
  • MUST be located in Central or South America, as this is a nearshore position (please note that we are not able to consider candidates requiring relocation or those located offshore).
More Details:

  • Contract type: Independent contractor (this contract does not include PTO, tax deductions, or insurance; it covers only the monthly payment based on hours worked).
  • Location: The client is based in the United States; however, the position is 100% remote for nearshore candidates located in Central or South America.
  • Contract / project duration: Initially 6 months, with the possibility of extension based on performance.
  • Time zone and working hours: Full-time, Monday to Friday (8 hours per day, 40 hours per week), from 8:00 AM to 5:00 PM PST (U.S. time zone).
  • Equipment: Contractors are required to use their own laptop / PC.
  • Expected start date: As soon as possible.
  • Payment methods: International bank transfer, PayPal, Wise, Payoneer, etc.
Bertoni Process Steps:

  • Requirements verification video interview.
Partner / Client Process Steps:

  • CV review.
  • One technical video interview with our partner.
  • One or two video interviews with the end client.
Why Join Us?

  • Be part of an innovative team shaping the future of technology.
  • Work in a collaborative and inclusive environment.
  • Opportunities for professional development and career growth.