
Lead Data Engineer, Remote

GXO Logistics, Inc.
  • United States

About

Logistics at full potential.
At GXO, we're constantly looking for talented individuals at all levels who can deliver the caliber of service our company requires. You know that a positive work environment creates happy employees, which boosts productivity and dedication. On our team, you'll have the support to excel at work and the resources to build a career you can be proud of.

As the Lead Data Engineer, you will lead the design and development of robust, scalable data pipelines on a modern data platform. You will work closely with cross-functional teams to understand data requirements, implement efficient solutions, and ensure the reliability and performance of our data infrastructure. The ideal candidate has a strong background in data engineering, expertise with cloud platforms such as Google Cloud Platform (GCP), and experience with modern data warehousing solutions like Snowflake. The Lead Data Engineer will mentor less experienced developers and collaborate with stakeholders and vendors to achieve project milestones and deliver a robust, scalable data lake solution.

Pay, Benefits and More
We are eager to attract the best, so we offer competitive compensation and a generous benefits package, including full health insurance (medical, dental and vision), 401(k), life insurance, disability and the opportunity to participate in a company incentive plan.

What you'll do on a typical day:

  • Data Pipeline Design and Development: Lead the design, implementation, and maintenance of data pipelines to support data ingestion, transformation, and storage on GCP and Snowflake.
  • Collaboration: Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver data solutions that meet business objectives.
  • Platform Optimization: Optimize and enhance the performance, scalability, and reliability of existing data pipelines, ensuring efficient data processing and storage.
  • Technology Stack: Stay abreast of industry trends and emerging technologies in data engineering and incorporate them into the data architecture where applicable. Experience with the following modern data stack highly preferred: GCP, Snowflake, Fivetran, and dbt.
  • Quality Assurance: Implement best practices for data quality, validation, and testing to ensure the accuracy and integrity of data throughout the pipeline.
  • Documentation: Create and maintain comprehensive documentation for data pipelines, ensuring knowledge transfer within the team.

Languages

  • English
Notice for Users

This job comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their site.