- United States
About
Key Responsibilities
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout Microsoft Fabric environments.
- Collaborate with stakeholders, analysts, and architects to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Microsoft Fabric implementations for enhanced performance.
- Test, monitor, and ensure compliance with best practices in data analytics and quality management.
About you
- Experience translating business requirements into technical requirements.
- Proficiency in programming languages commonly used in data engineering (e.g., Python, Java, Scala).
- Strong knowledge of database systems and data modeling techniques, with SQL proficiency.
- Proficiency with ETL tools commonly used in data engineering (e.g., SSIS, Databricks, Azure Data Factory).
- Strong working knowledge of and experience with Microsoft Azure services and tools, including Microsoft Fabric, Azure Data Factory, Azure Synapse, Azure SQL Database, and Azure Databricks.
- Experience using a work management tool such as Azure DevOps.
Preferred/Desired Qualifications
- Microsoft Certified: Fabric Data Engineer Associate.
- Microsoft Certified: Fabric Analytics Engineer Associate.
Education and Experience
- Bachelor's degree in computer science, engineering, information systems, or a related field; master's degree preferred.
- A minimum of five years of experience in information technology, with at least two years of the following:
  - Experience in business analytics, data science, software development, data modeling, or data engineering, or equivalent experience.
  - Experience manipulating and transforming data in Spark SQL, PySpark, or Spark Scala, or a minimum of two years of experience manipulating and transforming data in T-SQL.
GENESYS Consulting Services, Inc. is proud to be an equal opportunity employer.
Ideal skills
- ETL
- Python
- Java
- Scala
- SQL
- SSIS
- Azure Data Factory
- PySpark
- T-SQL
Professional experience
- Data Engineer
- Data Infrastructure
- Data Analyst
Language skills
- English