Saxon Global

Data Engineer II

Arizona, United States

About

Data Engineer II

KEY RESPONSIBILITIES:

Process Structured and Unstructured Data

Gathers and processes raw, structured, semi-structured, and unstructured data using batch and real-time data processing frameworks. Understands and enforces appropriate master data management techniques.

Data Validation and Quality

Ensures data quality and implements tools and frameworks to automate the identification of data quality issues. Works with internal and external data providers on data validation, providing feedback and making customized changes to data feeds and data mappings.

Agile Planning

Understands the challenges the analytics organization faces in its day-to-day work and partners with it to design viable data solutions. Recommends improvements to processes, technology, and interfaces that increase the team's effectiveness and reduce technical debt.

Product Development

Implements and optimizes data solutions in enterprise data warehouses and big data repositories. Installs, maintains, monitors, and supports business intelligence, distributed computation, and big data analytics tools.

Application Support

Provides ongoing support, monitoring, and maintenance of deployed products. Manages application enhancements to improve business performance.

Advice and Guidance

Actively works with less experienced data engineers, providing technical guidance and oversight. Takes part in reviews of own work and leads reviews of colleagues' work. Derives an overall data management strategy, within an established information architecture (including both structured and unstructured data), that supports the development and secure operation of existing and new information and digital services. Plans effective data storage, security, sharing, and publishing within the organization.

Emerging Technology Monitoring

Actively participates in the engineering community, staying up to date on new data technologies and best practices, and shares insights with others in the organization.

QUALIFICATIONS:

  • Bachelor's degree in Computer Science or related field and 8-10 years of working experience
  • Working experience with batch and real-time data processing frameworks
  • Working experience with data modelling, data access, schemas, and data storage techniques within Snowflake
  • Working experience in the design, development, and implementation of highly scalable, high-volume software systems and components, client-facing web applications, and major Internet-oriented applications and systems
  • Working experience with relational databases such as SQL, MySQL, Postgres/PostgreSQL
  • Working experience with business intelligence tools and platforms
  • Working experience with data quality tools
  • Working experience with application lifecycle methodologies (e.g. waterfall, agile, iterative)
  • Experience with ETL processes and tools
  • Experience working with Git

PREFERRED QUALIFICATIONS:

  • Scrum Developer Certification or equivalent
  • Working experience with Sisense, Tableau, or Microsoft Power BI platforms
  • Working experience with SQL Server Integration Services
  • Working experience with the Snowflake ecosystem
  • Working experience with AWS data tools (Database Migration Service)
  • Working experience with Python
  • Experience working with Jira, Rally, or similar tools

Required Skills:

  • Bachelor's degree in Computer Science or related field and 2-5 years working experience
  • Working experience with batch and real-time data processing frameworks
  • Working experience with data modelling, data access, schemas, and data storage techniques within Snowflake
  • Working experience in the design, development, and implementation of highly scalable, high-volume software systems and components, client-facing web applications, and major Internet-oriented applications and systems
  • Working experience with relational databases such as SQL, MySQL, Postgres/PostgreSQL
  • Working experience with business intelligence tools and platforms
  • Working experience with data quality tools
  • Working experience with application lifecycle methodologies (e.g. waterfall, agile, iterative)
  • Experience with ETL processes and tools
  • Experience working with Git

Basic Qualification:
Additional Skills:
Background Check: Yes
Drug Screen: Yes
Notes:
Selling points for candidate:
Project Verification Info:
Candidate must be your W2 Employee: No
Exclusive to Apex: Yes
Face to face interview required: Yes
Candidate must be local: No
Candidate must be authorized to work without sponsorship: Yes
Interview times set: Yes
Type of project: Development/Engineering
Master Job Title: Engineer: Other (Non-IT/Non-Telecom)
Branch Code: Phoenix

Desired Skills

  • SQL
  • MySQL
  • PostgreSQL
  • ETL
  • Git
  • Tableau
  • SQL Server Integration Services
  • AWS
  • Python
  • JIRA
  • Rally
  • Scrum

Work Experience

  • Data Engineer
  • Data Infrastructure
  • Data Analyst

Language Skills

  • English