- Arizona, United States
About
Position Summary
The Data Engineer will be responsible for building and maintaining enterprise data management infrastructure both on-premises and in the cloud. The Data Engineer will orchestrate pipelines using modern data tools and architectures, as well as design and engineer existing transactional and RDBMS processing systems.
Work Schedule:
This position currently follows a hybrid work model. Employees are required to work from the office at least four days per week (Monday - Thursday), with Friday available for remote work, offering a blend of in-person collaboration and flexibility.
Essential Duties & Responsibilities (Other duties may be assigned)
Develop and maintain data pipelines that extract, transform, and load data into an information and analytics environment
Develop and maintain datasets in a conventional data warehouse (operational data store, dimensional models)
Develop and maintain datasets in a modern, cloud-based data warehouse (Redshift, Snowflake, Azure)
Implement and configure datasets on column-oriented (column-store) database management systems
Assist application development teams during application design and development for highly complex and critical data projects.
Create and enhance analytic and data platforms using tools that enable state of the art, next generation capabilities and applications.
Utilize programming languages such as SQL, Java, Spark, and Python, with an emphasis on tuning, optimization, and best practices for application developers.
Function as team member in an Agile development environment
Leverage DevOps techniques and practices like Continuous Integration, Continuous Deployment, Test Automation, Build Automation and Test-Driven Development to enable the rapid delivery of end user capabilities
Develop data solutions on cloud deployments such as AWS and Azure.
Understand concepts of data ingestion from event driven architectures
Stay current with new technologies in the Data Engineering, Big Data and Analytics, and Cloud spaces.
Minimum Qualifications (These are the requirements that all applicants MUST HAVE to be considered for this position)
Bachelor’s degree from an accredited institution or equivalent experience.
4+ years' experience connecting to various data sources and structures: APIs, NoSQL, RDBMS, blob storage, data lakes, etc.
Knowledge of cloud platforms such as AWS, Azure, and Snowflake
Database systems (SQL and NoSQL)
ETL tools including SSIS, Glue, and Kinesis Firehose
Data APIs
Python, PowerShell
Experience with multiplatform integration and distributed systems (Kafka)
Preferred Qualifications
Experience with Dev Ops tools such as Azure DevOps and Visual Studio
Master Data Management (MDM)
Understanding of data-warehousing and data-modeling techniques
Advanced-level SQL for data transformations, queries, and data modeling
Data Quality tools and processes
Graph Database Systems including Neo4J
Cypher Query Language
Data warehousing solutions
T-SQL, MDX, and Spark
Prior knowledge of modern Cloud based ETL tools such as Matillion or Fivetran
Company Overview
Established in 2002, Isagenix International has created simple, proven products that optimize what your body is capable of—helping you protect your greatest asset, your health. For more than twenty years, Isagenix has made holistic science an art with transparency and integrity—creating products and systems that address nutrition, stress, fitness, energy, natural beauty, focus, and financial wellbeing. The global wellbeing company, based in Gilbert, Arizona, markets its products through a network of independent distributors in 22 key markets: the United States, Canada, Puerto Rico, Australia, New Zealand, Mexico, the United Kingdom, Ireland, the Netherlands, Belgium, Spain, Austria, Denmark, Finland, France, Germany, Italy, Norway, Poland, Portugal, Sweden, and Switzerland. For more information, visit Isagenix.com.
EEO
Isagenix International, LLC is an equal opportunity employer and affords equal opportunity to all applicants for all positions without regard to race, color, religion, sex, national origin, age, disability, veteran status or any other status protected by law.
Desirable Skills
- SQL
- Java
- Spark
- Python
- Redshift
- Azure
- AWS
- Kafka
- Neo4J
- T-SQL
- ETL
- SSIS
- PowerShell
- NoSQL
- RDBMS
- DevOps
- Test Automation
Work Experience
- Data Engineer
- Data Infrastructure
Language Skills
- English