ABC Supply Co.

Data Engineer (944)

Wisconsin, United States

About

Must be able to work onsite at ABC Supply's HQ in Beloit, WI in a hybrid arrangement.

ABC Supply is North America's largest wholesale distributor of exterior and interior building products.

ABC Supply is proud to be an employee-first company. In fact, we have won the Gallup Great Workplace Award every year since its inception in 2007, and Glassdoor has named us one of the best places to work in the country. Be part of a company that recognizes your talents, rewards your efforts, and helps you reach your full potential. At ABC Supply, we have YOUR future covered.

ABC Supply is currently seeking a Data Engineer to deliver analytics solutions to the enterprise. As a member of the Enterprise Analytics team at ABC, you will deliver high-impact solutions driven by our executive leadership.

The Data Engineer is responsible for developing batch integrations to ABC standards. The Data Engineer is expected to have deep knowledge of the enterprise data warehouse (EDW), data modeling, and integration patterns (ETL, ELT, etc.), and may work with one or a range of tools depending on project deliverables and team resourcing. The Data Engineer will also be expected to understand traditional relational database systems and be able to assist in administering them.

Candidates must be interested in working in a collaborative environment, possess great communication skills, have experience working directly with all levels of a business, and be able to work both in a team environment and individually. Responsibilities include batch application/client integration, aggregating data from multiple sources into a data warehouse, automating integration solution generation using reusable patterns and scripting, prototyping integration solutions, and security.

Primary Accountabilities:

  • Design, develop, and deliver large-scale data systems, data processing, and data transformation pipelines.
  • Collaborate with stakeholders to understand business needs and work closely with the business to deliver enterprise-grade datasets that are reliable, flexible, scalable, and provide a low cost of ownership.
  • Develop optimized and scalable batch integration solutions for ABC Supply, including traditional DW workloads and scheduled nightly large extracts.
  • Extract data from various sources, transform it into a usable format, and load it into the Azure data lake using ADF pipelines, Databricks, PySpark, and SQL (see the sketch after this list).
  • Design and build data models - star schema, snowflake. Understand common analytical data models such as Kimball. Build physical data models that align with best practices and requirements.
  • Mentor and coach junior team members with your technical expertise to enhance their skills and abilities.
  • Document all solutions as needed using ABC standard documentation.
  • Plan, review, and perform the implementation of database changes for integration/DW work.
  • Work with the BI team and product owner to build required tables and transform data to load into Snowflake/Dremio.
  • Partner with functional support and help desk teams to ensure communication, collaboration, and compliance with support process standards at ABC.
  • Perform data management tasks as needed.
  • Maintain relevant technical competencies and help foster an environment of continued growth and learning among colleagues on existing and emerging technologies.
  • Recommend ways to improve data reliability, efficiency, and quality.
  • Additional responsibilities as assigned.
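
To make the extract-transform-load bullet above concrete, here is a minimal PySpark sketch of a nightly batch step that reads a raw extract, standardizes it, and writes it to a curated data lake path. All paths, column names, and table names are hypothetical examples, not ABC Supply specifics.

```python
# Minimal sketch of a nightly batch ETL step in PySpark.
# Paths, columns, and names below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("nightly_orders_load").getOrCreate()

# Extract: read the raw nightly extract from a landing zone.
raw = spark.read.option("header", "true").csv("/landing/orders/2024-01-01/")

# Transform: cast types, normalize dates, stamp the load date, dedupe.
orders = (
    raw.withColumn("order_amount", F.col("order_amount").cast("decimal(18,2)"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("load_dt", F.current_date())
       .dropDuplicates(["order_id"])
)

# Load: append to the curated zone, partitioned by load date so
# downstream DW jobs can pick up only the latest nightly slice.
(orders.write.mode("append")
       .partitionBy("load_dt")
       .parquet("/curated/sales/orders/"))
```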

Qualifications:

  • A bachelor's degree in computer science or a related field is required; a high school diploma and/or an equivalent combination of education and work experience may be substituted.
  • A minimum of 7 years of development experience in data warehousing and various ELT or ETL tools.
  • A minimum of 5 years' experience with cloud-native modern data engineering projects and practices: designing, building, and deploying scalable data pipelines.
  • Preferred experience in Databricks, Azure, and ADF (Azure Data Factory).
  • Hands-on experience with Python, PySpark, and Spark SQL, with the ability to fluently write complex SQL for DDL and DML operations (illustrated in the sketch after this list).
  • Deep understanding of data architecture as it relates to business goals and objectives, and a strong understanding of enterprise integration patterns (EIP), data warehouse modeling, ETL processes, and database management.
  • Experience with development and data warehouse requirements gathering, analysis, and design.
  • Understanding of common analytical data models such as Kimball.
  • Strong business acumen and consistently demonstrated forward thinking.
  • Eagerly and proactively investigates new technologies.
  • Able to work effectively with ambiguous or incomplete information.
  • Strong working knowledge of technical infrastructure, protocols, and networks.
  • Able to routinely work with little or no supervision.
  • Able to effectively and efficiently handle multiple and shifting priorities.
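
As a rough illustration of the SQL and Kimball-style modeling expectations above, the sketch below runs star-schema DDL and a surrogate-key fact load (DML) through Spark SQL from PySpark, assuming a Databricks/Delta Lake environment. The dw and staging schemas and all table and column names are invented for illustration.

```python
# Illustrative Kimball-style star schema in Spark SQL, run from PySpark.
# Assumes a Databricks/Delta Lake environment; all schema, table, and
# column names are hypothetical examples.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star_schema_example").getOrCreate()

# DDL - dimension table: one row per customer, with a surrogate key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dw.dim_customer (
        customer_key  BIGINT,
        customer_id   STRING,
        customer_name STRING,
        region        STRING
    ) USING DELTA
""")

# DDL - fact table: one row per order, keyed to the dimension.
spark.sql("""
    CREATE TABLE IF NOT EXISTS dw.fact_orders (
        order_id     STRING,
        customer_key BIGINT,
        order_date   DATE,
        order_amount DECIMAL(18,2)
    ) USING DELTA
""")

# DML - load the fact table, resolving the surrogate key from the dimension.
spark.sql("""
    INSERT INTO dw.fact_orders
    SELECT s.order_id, d.customer_key, s.order_date, s.order_amount
    FROM staging.orders s
    JOIN dw.dim_customer d ON s.customer_id = d.customer_id
""")
```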

Benefits may include:

  • Health, dental, and vision coverage - eligible after 60 days, low out-of-pocket costs
  • 401(k) with generous company match - eligible after 60 days, immediately vested
  • Employer-paid employee assistance program
  • Employer-paid short-term and long-term disability
  • Employer-paid life insurance
  • Flex spending
  • Paid vacation
  • Paid sick days
  • Paid holidays

Equal Opportunity Employer / Drug Free Workplace

ABC Supply values diversity and we actively encourage women, minorities, and veterans to apply.

Nice-to-have skills

  • ETL
  • PySpark
  • SQL
  • Kimball
  • Python
  • Database Management

Work experience

  • Data Engineer
  • Data Infrastructure

Languages

  • English