
This job offer is no longer available


Data Engineer

Trinity Technology Solutions LLC
  • United States

About

The Data Engineer develops and maintains ETL solutions that drive clinical operations, advanced analytics, and machine learning models. This position works collaboratively with the technology team to support our clinical, operational, and finance teams, integrating data from diverse sources for consumption by internal and external stakeholders.

Responsibilities:
  • Develop, test, maintain, and optimize cloud-native data and ETL solutions.
  • Work on small to mid-sized, cross-functional IT and business intelligence solutions.
  • Participate in the workstream planning process, including inception, requirements gathering, technical design, development, testing, and delivery of ETL solutions.
  • Collaborate with Analytics & Reporting, Data Science, Machine Learning, Analytics Engineering, IT Infrastructure, and other Technology teams on solution design, development, and deployment.
  • Follow business guidelines to protect PHI and ensure secure communication channels for transferring such data.
  • Exercise best-practice Agile communication and documentation through channels like JIRA and Confluence.
  • Follow DevOps/DataOps best practices throughout the software development lifecycle (SDLC).
Qualifications:
  • Bachelor's degree in Computer Science, Information Technology, Engineering, Mathematics, or equivalent.
  • 2-5 years of professional experience in a Data Engineering (or similar) role.
  • Experience designing and implementing data applications and data architectures.
  • Experience with open-source data frameworks like Spark and/or with cloud data platforms is preferred.
  • Healthcare experience in a Payer or Provider/Hospital organization is preferred.
  • Experience with Azure data technologies is a bonus (Azure Data Factory, Synapse, Cosmos DB, Azure SQL).
  • Experience with DevOps/DataOps practices is a bonus.
  • Experience or familiarity with Agile or a similar process.
  • Experience implementing ELT and ETL solutions.

Knowledge, Skills, and Abilities:
  • Experience with at least one database/data warehouse solution (e.g., MySQL, MSSQL, Synapse, Snowflake, Redshift).
  • Strong coding proficiency in at least one programming language (preferably Python).
  • Experience using industry-standard Python libraries for data exploration, analysis, and transformation (e.g., Pandas, NumPy).
  • Experience using REST APIs.
  • Proficiency in writing SQL queries, views, stored procedures, etc.
  • Experience working with data in file formats including TXT, CSV, JSON, YAML, Parquet, and XLSX.
  • Problem-solving aptitude and critical thinking skills.
  • Excellent communication and presentation skills.

Languages

  • English
Notice for Users

This job was posted by one of our partners, and the original job source is available from their site.