

Senior Data Engineer

HCA Healthcare
  • United States

About

Benefits

HCA Healthcare offers a total rewards package that supports the health, life, career and retirement of our colleagues. The available plans and programs include:
Comprehensive medical coverage that covers many common services at no cost or for a low copay. Plans include prescription drug and behavioral health coverage as well as free telemedicine services and free AirMed medical transportation.
Additional options for dental and vision benefits, life and disability coverage, flexible spending accounts, supplemental health protection plans (accident, critical illness, hospital indemnity), auto and home insurance, identity theft protection, legal counseling, long‑term care coverage, moving assistance, pet insurance and more.
Free counseling services and resources for emotional, physical and financial wellbeing.
401(k) Plan with a 100% match on 3% to 9% of pay (based on years of service).
Employee Stock Purchase Plan with 10% off HCA Healthcare stock.
Family support through fertility and family building benefits with Progyny and adoption assistance.
Referral services for child, elder and pet care, home and auto repair, event planning and more.
Consumer discounts through Abenity and Consumer Discounts.
Retirement readiness, rollover assistance services and preferred banking partnerships.
Education assistance (tuition, student loan, certification support, dependent scholarships).
Colleague recognition program.
Time Away From Work Program (paid time off, paid family leave, long‑ and short‑term disability coverage and leaves of absence).
Employee Health Assistance Fund that offers free employee‑only coverage to full‑time and part‑time colleagues based on income.
Note: Eligibility for benefits may vary by location.
Job Summary and Qualifications

The Senior Data Engineer serves as a primary development resource for the design, build, implementation, and support of ITG Data Management enterprise application initiatives. The role requires working closely with data teams, frequently in a matrixed environment as part of a broader project team. As a senior‑level position, the role requires self‑starters who are proficient in problem solving and capable of bringing clarity to complex situations. The culture of the organization places an emphasis on teamwork, so social and interpersonal skills are as important as technical capability. Because Big Data/GCP technology and practice are emerging and fast‑evolving, the position requires staying well‑informed of technological advancements and being proficient at putting new innovations into effective practice. In addition, the candidate will have a history of increasing responsibility on a small, multi‑role team. This position requires a candidate who can analyze business requirements and design, construct, test, and implement solutions with minimal supervision. The applicant must also be willing to mentor other developers to prepare them for assuming these responsibilities.
What You Will Do
Responsible for building and supporting a GCP‑based ecosystem designed for enterprise‑wide analysis of structured, semi‑structured, and unstructured data.
Build streaming/batch data pipelines, and systems to access and process data.
Implement and test data pipelines.
Build analytics on raw data.
Troubleshoot data issues.
Communicate with other teams to identify and solve problems.
Set coding standards and perform code reviews.
Mentor junior developers.
Experience with microservices and modern software patterns in containerized environments (e.g., Kubernetes) or serverless compute services.
Ensure service levels are maintained and any interruption is resolved in a timely fashion.
Closely collaborate with team members to successfully execute development initiatives using Agile practices and principles.
Collaborate with business analysts, project lead, management, and customers on requirements.
Participate in the deployment, change, configuration, management, administration, and maintenance of deployment processes and systems.
Effectively prioritize workload to meet deadlines and work objectives.
Gather requirements and design, construct, and deliver solutions with minimal supervision.
Work in an environment with rapidly changing business requirements and priorities.
Bring new data sources into GCP, then transform and load them into BigQuery and databases.
Work collaboratively with Data Scientists and business and IT leaders throughout the company to understand data needs and use cases.
Core Competencies
Communication and interpersonal skills.
Problem‑solving and critical thinking skills.
Understanding of strategic imperatives.
Technology & business knowledge.
Qualifications
Bachelor’s degree in computer science, related technical field, or equivalent experience.
Master’s degree in computer science or related field preferred.
3+ years of experience in Data Engineering required.
1+ year(s) of experience in Healthcare preferred.
5+ years of experience in Information Technology required.
Good understanding of best practices and standards for GCP Data process design and implementation.
Two years of hands‑on experience with the GCP platform (AWS or Azure are acceptable, too) and experience with many of the following components: Cloud Run, GKE, Cloud Functions, Spark Streaming, Kafka, Pub/Sub, Bigtable, Firestore, Cloud SQL, Cloud Spanner, JSON, Avro, Parquet, Python, Java, Terraform, BigQuery, Dataflow, Data Fusion, Cloud Composer, Dataproc, CI/CD, Cloud Logging, Vertex AI, NLP, GitHub.
Ability to multitask and to balance competing priorities.
Ability to define and utilize best practice techniques and to impose order in a fast‑changing environment.
Strong problem‑solving skills.
Strong verbal, written, and interpersonal skills, including a desire to work within a highly matrixed, team‑oriented environment.
Experience in Healthcare Domain preferred.
Hardware/Operating Systems: GCP; distributed, highly scalable processing environments; Linux; UNIX.
Demonstrates an empathetic and growth mindset with a willingness to learn new skills, technologies, and methodologies - Required.
Growing knowledge of public cloud best practices and design patterns used in creating, automating, and supporting data pipelines - Required.
Ability to assemble large, complex sets of data that meet functional and non‑functional product requirements - Required.
Helps create and use analytical tools to monitor data pipeline metrics and provide actionable intelligence to increase operational efficiency and valuable data outcomes - Required.
Ability to use source control management tools such as Git/GitHub - Required.
Ability to use CI/CD automation tools - Required.
Understanding of SQL and analytical data warehouses - Required.
Understanding of Agile methodologies and how to apply Agile within the team - Required.
Proven ability to complete work, make sound decisions, and plan and accomplish goals with direction/guidance from leadership - Required.
Builds and nurtures healthy relationships with all colleagues - Required.
Stays abreast of public cloud technologies, capabilities, and industry use of public cloud to help guide HCA’s strategy and adoption - Required.
Physical Demands and Working Conditions
Prolonged sitting or standing at computer workstation including use of mouse, keyboard, and monitor.
Requires ability to provide after‑hours support.
Work Location and Schedule
Nashville, TN area (near Centennial Park).
Hybrid: 2 days a week onsite.
Travel Requirements
Less than 25%.
Equal Opportunity Employer

We are an equal opportunity employer. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.