
Principal Data Architect

Blue Cross of Idaho
Meridian, Idaho, United States

About

Blue Cross of Idaho is seeking a Principal Data Architect to join our Data Strategy & Engineering team's Cloud Enterprise Data Platform (EDP) journey. As we modernize and optimize our Data & Analytics program by consolidating legacy data platforms onto a re-designed AWS & Snowflake foundation, we will enable a suite of leading capabilities to support the full spectrum of business needs with key cloud and big data technologies, including AI/ML, streaming data, data lake and data warehouse, and self-service and delivered reporting. This important role will help lead our EDP architecture, roadmap, and integration strategy, and play a key role in the build of these capabilities.

We'll look to you to demonstrate a robust track record of leading data engineering and platform enablement projects and enhancement efforts on Snowflake. As a subject matter expert on Snowflake capabilities, services, tools, and best practices for data engineering, data warehousing, and data delivery, you will help drive our key initiatives and high-value projects, including the migration of our existing on-prem data warehouse platforms (Data Vault/WhereScape) onto Snowflake. You'll also integrate with operational systems hosted in AWS, in Azure, as SaaS, and on-prem, using the best-fit integration technologies and methodologies. Working with an internal data platform engineering/enablement team augmented by third-party onshore/offshore project resources, you'll partner closely with Data Governance, Product, Security, IT engineering, and other business teams to build secure and sustainable solutions.

Location: strong preference for a flexible hybrid arrangement (onsite at our Meridian, Idaho campus plus local work-from-home); there may be opportunity for fully remote work from a mutually acceptable location. #LI-Remote

Responsibilities:

  • Define multi-tenant enterprise data architecture for the platforms
  • Lead data architecture practice and represent Data Engineering at the enterprise level
  • Serve as technical subject matter expert in the data ecosystem, including AWS/Snowflake, providing input into architecture, platforms, and development strategies and methodologies; this includes mentorship of engineering and platform teams via design reviews, code reviews, etc.
  • Define data engineering and data platform integration frameworks and standards. Mentor, coach, and build a learning program to onboard new team members quickly and efficiently and to ensure standards are understood and followed.
  • Facilitate cross-team collaboration for defining and building enterprise data management architecture from principles to tools, oversee cross-functional adoption of the new architecture, and enable a new level of engineering efficiency when working with data.
  • Direct the strategy and implementation for migration from the existing data warehouse platforms onto the new Enterprise Data Platform (EDP) on AWS/Snowflake.
  • Lead technical direction of the team, driving the necessary changes and recommending appropriate technology choices in collaboration with Architecture, Platform, DevOps, Security, and Project teams; influence technical direction with expert input into project decisions.
  • Lead the shift towards DevSecOps processes for the Data & Analytics delivery functions, emphasizing continuous integration, release management, and automated testing to maximize development agility and improve time to market.
  • As the data platform product manager, drive the data platform technical roadmap to prioritize and build new features that serve the business teams, and standardize templates for technical work.
  • Interface with key business functions (e.g., Compliance, Actuarial, Marketing, Operations, Clinical) and IT Analysis teams to assess business functional requirements and translate them into data and integration requirements.
  • Guide onsite/onshore/offshore technical resources from consulting partners, communicate architecture standards and best practices, establish a high-impact development process, and drive excellence in all deliverables.
  • Establish a daily cadence with the project team to prioritize and execute work items through adoption of Agile principles and processes.
  • Manage relationships with external vendors to determine technical competence and identify integration opportunities.
  • Be a hands-on practitioner for end-to-end delivery of AWS/Snowflake platforms, capabilities, and content.

Success Factors:

  • Ability to understand, drive, and deliver technology solutions in AWS and Snowflake.
  • Demonstrated understanding of data architecture and its components, especially for data warehousing: Snowflake, dbt data transformations, Airflow, data cataloging tools like Alation, integration platforms like Boomi/Fivetran, BI tools like Tableau/Power BI/Sigma, and AI/ML tools like Dataiku/Cortex/Snowflake Intelligence.
  • Demonstrated understanding of and experience with modern Data Warehouse concepts, including Data Lake/Data Warehouse/Data Mart implementations, SQL-based transformations, and ELT methodology.
  • Understanding of Data Mesh principles, treating data as a product and emphasizing domain-specific data ownership and governance. Ability to foster and work within decentralized data ownership.
  • Solid understanding of DevSecOps and Agile methodology; comfortable with Jira, AWS DevOps, and GitLab.
  • Ability to lead project and technical teams and deliver solutions.
  • Ability to partner with Data Solution Architects in building APIs and other required integrations.
  • Ability to mentor Snowflake administrators in role/policy administration, replication, zero-copy cloning, etc. (a brief sketch follows this list), and to provide technical leadership to Snowflake-related project and support teams.
  • Demonstrated planning, coordination, and execution skills.
  • Ability to conduct POCs and produce demonstrable work products when adopting new features.
  • Ability to successfully collaborate with business partners.
  • Knowledgeable and skilled in Snowflake-related technologies, including platforms and tools for meeting business use cases for all forms of analytics and AI/ML functionality.
  • Ability to prioritize well, communicate clearly, and drive a high level of focus and excellence in deliverables; communication that is diplomatic, accurate, and concise across internal and external organizations.
  • Ability to collaborate, propose solutions, own the data platform items related to architecture reviews, and advise on the best course of action.
  • Conscientious, reliable, and inquisitive, with a keen desire to learn not just the knowledge necessary for the job but also the underlying reasons and drivers.
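
For illustration of the zero-copy cloning and role administration mentioned above, here is a minimal sketch using the Snowflake Python connector; the account, database, and role names (PROD_DB, DEV_CLONE, ANALYST_ROLE) are hypothetical placeholders, not details from this posting:

    # Minimal sketch: zero-copy clone of a production database for dev/test,
    # plus a role grant. All names below are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",             # hypothetical account identifier
        user="my_user",                   # hypothetical user
        authenticator="externalbrowser",  # SSO login; key-pair auth also common
    )
    cur = conn.cursor()
    try:
        # Zero-copy clone: metadata-only, shares micro-partitions with the
        # source, so it completes quickly regardless of data volume.
        cur.execute("CREATE DATABASE IF NOT EXISTS DEV_CLONE CLONE PROD_DB")
        # Role-based access control: grant read access on the clone to a role.
        cur.execute("GRANT USAGE ON DATABASE DEV_CLONE TO ROLE ANALYST_ROLE")
    finally:
        cur.close()
        conn.close()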

Required Education: Bachelor's degree in Computer Science, Information Systems, or equivalent work experience (two years of relevant work experience is equivalent to one year of college). Certifications such as Snowflake Pro and/or AWS-related certifications are highly preferred.

Required Experience: 10+ years of experience in the Data & Analytics field, including:

  • 2+ years working with hybrid (onshore + offshore) data teams.
  • 5+ years designing and building data storage capabilities, such as a data warehouse or data lake, including 3+ years in an AWS/Snowflake-specific architecture, development, or support capacity.

Overall experience should also span:

  • Understanding of large-scale computing solutions, including software design and development, and database architectures.
  • Implementation involving batch, streaming, event-driven and API integrations on the data platform.
  • Knowledge of AWS/Snowflake cloud security, orchestration, management, data management (in particular metadata management and data quality checks), role-based access controls, and policy-based access controls.
  • Knowledge of FinOps, preferably involving Snowflake.
  • Ability to build data pipelines as part of integration design (a minimal pipeline sketch follows this list).
  • Strong knowledge of ELT, ETL, CDC, API, messaging, streaming, and all forms of data ingestion techniques applicable to cloud-based Data & Analytics.
  • Strong skills in SQL.
  • DevSecOps and Agile projects.
  • NoSQL, relational, and non-relational platforms; multiple file formats, including Parquet, Avro, JSON, XML, and CSV.
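
As a minimal sketch of the pipeline and file-format skills above, the following loads a Parquet extract into a Snowflake staging table via the Python connector's write_pandas helper, then transforms it with SQL in the warehouse; the file path, connection parameters, columns, and table/schema names are hypothetical placeholders:

    # Minimal ELT sketch: land a Parquet extract in a staging table, then
    # transform with SQL inside Snowflake (the "T" happens after the "EL").
    # File path, connection parameters, and object names are hypothetical.
    import pandas as pd
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    df = pd.read_parquet("claims_extract.parquet")  # hypothetical extract

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="...",
        database="STAGING_DB", schema="RAW", warehouse="LOAD_WH",
    )

    # Bulk-load the DataFrame through an internal stage.
    write_pandas(conn, df, table_name="CLAIMS_RAW", auto_create_table=True)

    # Transform in-warehouse with SQL (hypothetical columns and schemas).
    conn.cursor().execute("""
        CREATE OR REPLACE TABLE CURATED.CLAIMS AS
        SELECT claim_id, member_id, service_date, paid_amount
        FROM RAW.CLAIMS_RAW
        WHERE paid_amount IS NOT NULL
    """)
    conn.close()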

Preferred Experience:

  • dbt
  • Boomi, Mule or other integration platforms
  • Python, Snowpark, SageMaker, Snowflake Intelligence (see the Snowpark sketch after this list)
  • Tableau, Power BI or Sigma
  • Data science platforms, such as Dataiku
  • Designing and building frameworks and API interfaces for efficient data extraction
  • Working in environments where data privacy and protection are critical (healthcare, HIPAA, etc.)
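
Since Snowpark appears in this list, here is a brief sketch of a Snowpark DataFrame transformation that is pushed down and executed as SQL inside Snowflake; connection parameters and table names are hypothetical placeholders:

    # Minimal Snowpark sketch: build a DataFrame lazily; nothing executes
    # until an action such as show() or collect() runs the generated SQL.
    # Connection parameters and table names are hypothetical placeholders.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, sum as sum_

    session = Session.builder.configs({
        "account": "my_account", "user": "my_user", "password": "...",
        "database": "ANALYTICS_DB", "schema": "CURATED", "warehouse": "XFORM_WH",
    }).create()

    total_paid_by_member = (
        session.table("CLAIMS")
        .filter(col("PAID_AMOUNT") > 0)
        .group_by(col("MEMBER_ID"))
        .agg(sum_(col("PAID_AMOUNT")).alias("TOTAL_PAID"))
    )

    total_paid_by_member.show()  # executes the generated SQL in the warehouse
    session.close()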

As of the date of this posting, a good faith estimate of the current pay range is $147,983 - $192,969. The position is eligible for an annual incentive bonus (variable depending on company and employee performance). The pay range for this position takes into account a wide range of factors.


Languages

  • English