About
Job Description:
Responsibilities:
- Data Warehouse Management: Support and enhance the Enterprise Data Warehouse, ensuring data availability, accuracy, and efficient ETL operations.
- Data Platform Implementation: Architect, design, and implement a scalable, on-premises or cloud-based enterprise data platform, integrating diverse data sources beyond the Guidewire Insurance Suite.
- Data Integration & Engineering: Develop and oversee ETL/ELT pipelines to ingest, transform, and store data efficiently, leveraging modern tools.
- Data Modeling & Architecture: Design and implement optimized data models for structured and unstructured data, supporting reporting, analytics, and AI/ML initiatives.
- Data Governance & Security: Establish best practices for data governance, data quality, metadata management, and security compliance across all data assets.
- Advanced Analytics Support: Enable self-service analytics, real-time data processing, and AI/ML-driven insights by integrating modern data technologies such as data lakes, streaming data, and graph and NoSQL databases.
- Collaboration & Leadership: Act as a strategic partner to IT, business units, and analytics teams, aligning data initiatives with organizational goals. Mentor junior team members and foster a culture of data-driven decision-making.
- Monitor the task queue, and take and update tickets as directed by your supervisor.
- Engage successfully in multiple initiatives simultaneously.
- Contribute to the development of project plans; may assign and monitor tasks.
- Assist in developing and generating new reports for senior management across functional departments.
- Perform other duties as required.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field.
- 5+ years of experience in data architecture, engineering, or related roles, preferably within the insurance industry.
- Ability to analyze and learn new, complex data source models and integrate them into the data platform’s pipelines.
- Experience implementing cloud-based data platforms in Azure and familiarity with data lakehouse architectures.
- Proficiency in modern ETL/ELT tools (e.g., MS SSIS and Azure Data Factory) and database technologies (SQL, Databricks, etc.).
- Hands-on experience with big data processing, streaming technologies (Kafka, Spark, Flink), and API-driven data integration.
- Strong understanding of data security, compliance, and governance best practices (GDPR, CCPA, SOC 2, etc.).
- Familiarity with BI/reporting tools such as Power BI, Tableau, and Looker.
- Strong knowledge of, and experience implementing, Data Mesh architecture is a plus.
- Knowledge of machine learning frameworks and MLOps is a plus.
- Familiarity with ticketing systems such as Atlassian Jira, used to assign and track work among multiple team members.
- Must be resourceful, industrious, and willing to take on new tasks and proactively learn new technologies to keep pace with business needs.
- Must be able to work efficiently and to a high standard under tight deadlines.
- Must possess strong organizational skills with demonstrated attention to detail.
- Must be flexible and able to adapt in a changing business environment.
- Must possess a positive attitude and strong work ethic.
- Excellent verbal and written communication skills and the ability to interact professionally with a diverse group (executives, managers, and subject matter experts).
- Must be proficient in Microsoft Office (Excel, Word, PowerPoint).
Languages
- English