About
The Database Developer works across the organization to provide reliable, analytics-ready data that informs effective decision-making. This is a full-time, exempt, salaried position reporting to the Director of Data Technologies. Candidates must be local to Kansas City, MO, and after a successful training period, there are opportunities to work remotely.
Requirements
Data Pipelines & Integrations
Design, build, and maintain scalable batch and near-real-time data pipelines. Develop integrations using tools such as Azure Data Factory, Fabric Data Factory, SSIS, or equivalent technologies. Maintain clear, well-documented data processes that ensure secure, reliable data delivery.
ETL / ELT Development
Develop structured ETL and ELT processes that support data warehouse models and downstream analytics. Partner with staff to ensure data structures align with reporting and semantic model needs.
Scheduling & Automation
Manage orchestration, scheduling, and dependencies across data workflows. Implement automation to improve reliability, monitoring, and recovery from failures.
AI Platform Readiness
Partner with leadership to evaluate, govern, and leverage AI-enabled capabilities within the data platform ecosystem. Leverage AI-assisted tooling to improve platform reliability and operational efficiency.
Data Reliability & Monitoring
Monitor pipeline execution and proactively identify failure patterns. Implement improvements to increase resiliency, observability, and operational predictability.
Source System Integration
Work with application owners (e.g., financial systems, CRM platforms) to understand data structures, APIs, and integration requirements. Maintain clear documentation for data flows, lineage, integration logic, and operational processes. Support ingestion of files, APIs, JSON/XML payloads, and database sources.
Education & Experience
3-6 years of related experience and a bachelor's degree; an equivalent combination of education and experience will be considered.
Required Technical Background
Hands-on experience developing and supporting ETL/ELT pipelines in Azure, SQL Server, or comparable data environments. Strong SQL skills with the ability to troubleshoot data quality, transformation, and performance issues. Experience integrating data from APIs, SaaS platforms, files, and relational databases (including JSON and XML payloads). Experience managing job orchestration, scheduling, documentation, retries, and failure recovery for data workflows. Familiarity with version control (Git) and deployment practices for data pipelines and integration code. Ability to monitor pipeline execution and proactively identify and remediate reliability issues.
Preferred Background
Experience with Azure Data Factory, Fabric Data Factory, SSIS, or similar tools. Exposure to cloud data warehouses or analytics platforms. Experience working in regulated, audit-conscious, or highly governed environments.
Physical Requirements
Office & Computer Work: Ability to work regularly at a computer terminal in a fast-paced environment with frequent interruptions.
Noise & Communication: Able to work in an office with moderate noise levels. Ability to communicate and interpret detailed information effectively.
This job description serves as a summary of the employment-at-will relationship and is not a contract. Responsibilities may evolve, and other duties may be assigned as needed.
Language Skills
- English
Note for Users
This job posting comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their website.