
Senior Data Engineer (GCP)

cyberThink
  • US
    United States

About

Job Description: As a Senior Data Engineer (GCP), you will design, build, and maintain scalable data pipelines and data engineering solutions on Google Cloud Platform to support fraud analytics and reporting. You will collaborate with cross-functional teams, ensure data quality and reliability, optimize BigQuery data models, and proactively resolve data issues across a growing cloud-native data environment. This role requires deep hands-on experience in Python, SQL, GCP services, and modern data engineering best practices.
Key Responsibilities:
  • Design, build, and maintain scalable data pipelines using BigQuery, Cloud Storage, Dataflow, Cloud Composer, and Pub/Sub.
  • Write high-performance Python and SQL scripts to support ETL and data transformation processes.
  • Implement advanced data models in BigQuery, including partitioning, clustering, and materialized views.
  • Collaborate with business stakeholders and SMEs to gather requirements and deliver cloud-based data solutions.
  • Implement data quality, governance, and security best practices across the cloud data environment.
  • Monitor pipeline performance, troubleshoot failures, and ensure high availability of data workflows.
  • Participate in data architecture decisions and recommend improvements for pipeline scalability.
  • Stay current on data engineering trends, cloud technologies, and cybersecurity considerations.
  • Communicate effectively with technical and non-technical audiences and contribute to Agile team activities.
  • Lead investigation and resolution of data issues with a sense of ownership and urgency.
  • Create documentation for data processes, data flows, and operational procedures.

Required Skills, Experience, Education, and Competencies:
  • Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
  • Minimum 8 years of experience in data engineering, with hands-on experience consolidating data from multiple sources into centralized repositories.
  • Strong expertise with Google BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, and related GCP services.
  • Strong proficiency in Python and SQL for data engineering and automation tasks.
  • Experience designing and supporting ETL processes and scalable cloud data pipelines.
  • Strong understanding of modern data modeling practices and performance optimization.
  • Excellent problem-solving skills with strong analytical attention to detail.
  • Strong communication, collaboration, and Agile delivery experience.

Preferred Skills:
  • Expertise in real-time streaming technologies, including Kafka or Pub/Sub.
  • Experience with Power BI or other data visualization tools.
  • Experience with Snowflake, Databricks, or modern data stack technologies.
  • Knowledge of DevOps practices and tools, including Terraform.
  • Experience with visualization tools such as Tableau, Grafana, or Looker.
  • Google Professional Data Engineer certification.
  • Experience in fraud analytics or financial crime domains.
The hourly range for roles of this nature is $50.00 to $80.00/hr. Rates depend heavily on skills, experience, location, and industry.
cyberThink is an Equal Opportunity Employer.

Language Skills

  • English
Note for Users

This job listing comes from a partner platform of TieTalent. Click "Apply Now" to submit your application directly on their website.