- North Carolina, United States
About
Description
Primary Role: This position reports to the DHTS-Data Partnerships, Director of Data and Analytics Platforms. This individual will be primarily responsible for developing data integration and delivery pipelines and for expanding the FHIR-based content stored within the client's data lake. These solutions will capitalize on technologies that improve the value of analytical data, improve the effectiveness of information stewardship, and streamline the flow of data through the organization. Solutions will focus on state-of-the-art data and analytics tools, including traditional and near-real-time integrations, big data, and delta lake architecture, using both extract, load, transform (ELT) toolsets and REST APIs and FHIR. The ideal candidate will also be comfortable with data science platforms and have proven experience leveraging DevOps and automation/orchestration tools.
Essential Tasks/Responsibilities
*Create and maintain optimal data pipeline architecture
*Develop a data lake on Microsoft Azure using the medallion architecture, leveraging the Delta Lake format for the silver layer (see the illustrative sketch after this list)
*Assemble large, complex data sets that meet functional / non-functional business requirements
*Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
*Recommend designs for analytics solutions that improve data integration, data quality, and data delivery, with an eye toward reusable components
*Articulate differences, advantages, and disadvantages between architectural solution methods
*Work with Agile team members to document and execute test plans and data validation scripts. Support the code promotion process through development and production as required by using standard CI/CD processes
*Develop monitoring, logging, and error notification processes to ensure data is updated as expected and processing metrics reported
*Participate in the creation and maintenance of standards for coding, documentation, error handling, error notification, logging, etc.
*Accountable for conforming to established architectural, developmental, and operational standards and practices
*Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
*Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs
*Evaluate and recommend development tools
*Assist in application and data operations performance tuning
*Participate in system architecture design
*Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
*Work with data and analytics experts to strive for greater functionality in our data systems
*Share troubleshooting and maintenance duties
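As a rough illustration of the bronze-to-silver step mentioned above, the sketch below shows one common way to conform raw records into a Delta-format silver table with PySpark. It is a minimal example only: the Spark/Delta Lake setup, the storage account, the "patients" data set, and the column names are all assumptions for illustration, not details from this posting.

    # Minimal bronze-to-silver sketch (assumes a Spark session configured with the
    # Delta Lake extension and hypothetical ADLS Gen2 paths; names are illustrative).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

    # Read raw records previously landed as-is in the bronze layer.
    bronze = spark.read.format("delta").load(
        "abfss://lake@account.dfs.core.windows.net/bronze/patients")

    # Light conformance for the silver layer: deduplicate and stamp ingestion time.
    silver = (bronze
              .dropDuplicates(["patient_id"])
              .withColumn("ingested_at", F.current_timestamp()))

    # Persist the conformed data set in Delta format as the silver-layer table.
    (silver.write
           .format("delta")
           .mode("overwrite")
           .save("abfss://lake@account.dfs.core.windows.net/silver/patients"))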
Education: Bachelor's degree in a related field, or four years of equivalent technical experience required
Required Experience: We are looking for a candidate with 5+ years of experience in a Data Engineer role:
*Experience implementing data lakes on Microsoft Azure
*Experience with relational SQL and NoSQL databases
*Experience with data pipeline and workflow management tools such as Azure Data Factory and Synapse Analytics pipelines
*Experience with object-oriented/object function scripting languages such as Python or Java
*Experience with Cloud-based analytics platforms such as Azure Synapse Analytics
Required Skills:
*Advanced SQL skills and experience working with a variety of relational database management systems
*Intermediate to advanced skills in Python development
*Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
*Strong analytical skills related to working with unstructured datasets
*Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management
*A successful history of manipulating, processing and extracting value from large disconnected datasets
*Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores
*Experience supporting and working with cross-functional teams in a dynamic environment
Desired Skills:
*Experience with Microsoft Fabric
*Working experience with the FHIR specification, including implementation
*Knowledge of APIs, API Integration, and API Management
*Working knowledge of DevOps & Automation/Orchestration
*Knowledge of open-source software solutions and open source as a business model
*Technical breadth across application development, enterprise architecture, or application integration
*Understanding of Agile methodology
Prior experience in health care related field is a plus
The information above describes the general nature and level of work assigned to this position. It is not intended to be an exhaustive list of all duties and responsibilities required of position incumbents.
Skills
Azure, engineering, data
Additional Skills & Qualifications
Ability to communicate effectively with internal and external stakeholders. This is a technical role, but the person in it will be part of an organization that works closely with a third-party vendor. This role is 100% remote. Meetings start at 9 a.m. (ET) every day.
Experience Level
Entry Level
Pay and Benefits
The pay range for this position is $50.00 - $75.00/hr.
Eligibility requirements apply to some benefits and may depend on your job classification and length of employment. Benefits are subject to change and may be subject to specific elections, plan, or program terms. If eligible, the benefits available for this temporary role may include the following:
- Medical, dental & vision
- Critical Illness, Accident, and Hospital
- 401(k) Retirement Plan - Pre-tax and Roth post-tax contributions available
- Life Insurance (Voluntary Life & AD&D for the employee and dependents)
- Short and long-term disability
- Health Spending Account (HSA)
- Transportation benefits
- Employee Assistance Program
- Time Off/Leave (PTO, Vacation or Sick Leave)
Workplace Type
This is a fully remote position.
Application Deadline
This position is anticipated to close on Feb 21, 2025.
About TEKsystems:
We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information, or any characteristic protected by law.
Nice-to-have skills
- Azure
- SQL
- NoSQL
- Python
- Java
- Azure Data Factory
- DevOps
- Automation
- Data Science
Work experience
- Data Engineer
- Data Infrastructure
Languages
- English