
This job offer is no longer available

Data Engineer

Argyle Infotech
  • United States

About

Contract Based Data Engineer
Contract-Based Data Engineer
Location: Grand Rapids, MI / Hybrid
Contract Duration: 7+ months

Responsibilities:
  • Design, code, test, and implement data movement, dashboarding, and analytical assets.
  • Develop system documentation according to SAFe Agile principles and industry standards.
  • Evaluate architectural options and define the architecture of the enterprise Data Lake and Data Warehouse.
  • Provide subject matter expertise and technical consulting support on Azure Data Factory, Log Analytics, Databricks, Synapse, Power BI, and ADLS Gen2.
  • Define functional and non-functional requirements, including performance monitoring, alerting, and code management.
  • Partner with all areas of the business to gather Data and Analytics requirements and design solutions.
  • Mentor and coach other members of the agile and/or Run team.
  • Drive engagement with ITS Security and Infrastructure teams to ensure secure development and deployment of solutions.
  • Interface with the Product Manager and IT partners at the Program level and within other Release Trains to define and estimate features for agile teams.
  • Conduct industry research, facilitate new product and vendor evaluations, and assist in vendor selection.

Qualifications:
  • 6+ years of industry experience in business application design, development, implementation, and/or solution architecture.
  • Understanding of architecture practices for large projects/programs.
  • Experience with Azure: Data Factory, Log Analytics, Databricks, Synapse, Power BI, ADLS Gen2. Databricks experience is required.
  • Experience designing data pipelines (ingestion, storage, prep-train, model, and serve) using Azure technologies.
  • Bachelor's degree in Computer Science, Computer Information Systems, Business Information Systems, Engineering, or equivalent.
  • Excellent written and oral communication skills.
  • Previous experience in Power BI, data modeling, data classification and zones, data movement, data architecture, and reporting.
  • In-depth understanding of computer, storage, and network components, including backup, monitoring, and DR environment requirements.
  • Preferred: knowledge and experience with Python and API architecture in Azure.
  • Any SAFe certification or training.
  • Experience with multiple, diverse technical configurations, technologies, and processing environments.
  • Exceptional interpersonal skills, including teamwork, facilitation, and negotiation.
Languages

  • English
Notice for Users

This job was posted by one of our partners. The original listing is available at the job source.