About
Medical, dental, vision, 401k, flexible spending account, paid sick leave, paid time off, parental leave, quarterly performance bonus, training, career growth and education reimbursement programs.

Ziply Fiber Overview
Ziply Fiber is a local internet service provider dedicated to elevating the connected lives of the communities we serve. We offer the fastest home internet in the nation, a refreshingly great customer experience, and affordable plans that put customers in charge.

Our Company Values
Genuinely Caring: We treat customers and colleagues like neighbors, with empathy and full attention.

Empowering You: We help customers choose what is best for them, and we support employees in implementing new ideas and solutions.

Innovation and Improvement: We constantly seek ways to improve how we serve customers and each other.

Earning Your Trust: We build trust through clear, honest, human communication.

Job Summary
The Senior Data Engineer will be responsible for designing, building, and maintaining scalable data pipelines, data models, and infrastructure that support business intelligence, analytics, and operational data needs.

Essential Duties and Responsibilities
The essential duties and responsibilities listed below represent a range of duties performed by the employee and are not intended to reflect all duties performed.

- Design, develop, and maintain scalable data pipelines for the ingestion, transformation, and storage of large datasets.
- Troubleshoot and resolve data pipeline and ETL failures, implementing robust monitoring and alerting systems.
- Automate data workflows to increase efficiency and reduce manual intervention.

Data Infrastructure, Modeling & Governance
- Optimize data models for analytics and business intelligence reporting.
- Build and maintain data infrastructure, ensuring performance, reliability, and scalability.
- Implement best practices for data governance, security, and compliance.
- Work with structured and unstructured data, integrating data from various sources including databases, APIs, and streaming platforms.

Cross‑Functional Collaboration, Leadership & Documentation
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and design appropriate solutions.
- Mentor and train junior engineers, fostering a culture of learning and innovation.
- Develop and maintain documentation for data engineering processes and workflows.

Other Duties
Perform other duties as required to support the business and the evolving organization.

Required Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Minimum of eight (8) years of experience in data engineering, ETL development, or related fields.
- Strong proficiency in SQL and database technologies (PostgreSQL, MySQL, Oracle, SQL Server, etc.).
- Familiarity with Linux/Unix and scripting technologies.
- Proficiency in Python for data engineering tasks.
- Hands‑on experience with Microsoft Azure and its data services, such as Azure Data Factory and Azure Synapse Analytics.
- Experience working with data warehouses such as Snowflake or Azure SQL Data Warehouse.
- Familiarity with workflow automation tools such as Autosys.
- Knowledge of data modeling, schema design, and data architecture best practices.
- Strong understanding of data governance, security, and compliance standards.
- Ability to work independently in a remote environment across different time zones.
- Exposure to GraphQL and RESTful APIs for data retrieval and integration.
- Familiarity with NoSQL databases such as MongoDB.
- Experience with version control software such as GitLab.

Preferred Qualifications
- Proven aptitude for independently managing complex procedures, even when they are encountered infrequently.
- Proactive approach to learning and optimizing operational workflows.
- Familiarity with DevOps practices and CI/CD pipelines for data engineering, including Azure DevOps.
- Proficiency in designing, writing, and maintaining complex stored procedures and ETL workflows for robust data processing.

Knowledge, Skills, and Abilities
- Strong problem‑solving and analytical skills.
- Ability to manage multiple priorities and work in a fast‑paced environment.
- Excellent verbal and written communication skills.
- Ability to translate business requirements into scalable technical solutions.
- Strong attention to detail and a commitment to data quality.
- Ability to work with Agile methodologies and tools such as Jira, Confluence, and Azure DevOps.
- Strong collaboration skills with cross‑functional teams, including product managers, software engineers, and business analysts.

Work Authorization
Applicants must be currently authorized to work in the US for any employer. Sponsorship is not available for this position.

Physical Requirements
The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

Background Check
Ziply Fiber requires a pre‑employment background check as a condition of employment. Ziply Fiber may also require a pre‑employment drug screening. Ziply Fiber must be your primary employer. Unless otherwise prohibited by law, employees may not hold outside employment or be self‑employed without obtaining written approval from Ziply Fiber. Employees who hold outside employment or are self‑employed should ensure that this work does not conflict with their responsibilities to Ziply Fiber or its business interests.
Language Skills
- English
Notice to Users
This listing comes from a TieTalent partner platform. Click "Apply now" to submit your application directly on their site.