
This job listing is no longer available

Senior Data Engineer

PTMA Financial Solutions
  • Denver, Colorado, United States

About

PTMA Financial Solutions provides treasury management, liquidity management, and other financial products and services to the public sector. In addition to serving more than 12,000 local governments, school districts, and other public entities, we partner with over 1,000 financial institutions to help strengthen communities from coast to coast. Our family of financial services companies offers local government investment pool administration, investment advisory services, term investments, cash flow analysis, bond proceeds management, and public finance services for public entities, plus stable deposit funding solutions for financial institutions. Our financial expertise, comprehensive products, and advanced technology help clients achieve more today for a better tomorrow. The firm's primary operational hubs are in Denver, Colorado, and Naperville, Illinois, with other offices throughout the United States.

Job Summary:

The Senior Data Engineer & Platform Lead at PTMA Financial Solutions is a full-time leadership role responsible for owning and evolving the company's data platform and strategy. This role involves building and optimizing a cutting-edge Microsoft Fabric-based data platform, guiding data architecture decisions, and integrating advanced analytics (including external LLM/RAG solutions) to drive business value.

Key Responsibilities:
  • Cross-Functional Collaboration: Act as the primary point-of-contact for data initiatives across teams (Investments, Finance, Client Services, Operations). Engage with stakeholders to understand their data needs and pain points, translating them into technical solutions and phased implementation plans.
  • Platform Governance & Prioritization: Guide platform-related decisions (tools, vendors, integrations) and govern data project priorities.
  • Microsoft Fabric Ownership: Own and manage the end-to-end Microsoft Fabric architecture for PTMA. This includes designing and organizing the OneLake unified data lake (the single, logical data repository for all analytics data), configuring capacities, and overseeing Fabric's components (data pipelines, Dataflows Gen2, Data Warehouse/Lakehouse, etc.) and their integration.
  • Data Pipeline Design: Lead the design and implementation of enterprise data pipelines and ETL/ELT processes to ingest, transform, and load data from a variety of sources. You will integrate data from core business systems into the Fabric platform. Leverage Azure Data Factory (in Fabric) or equivalent orchestration tools to build pipelines that handle batch and streaming data, ensuring timeliness and accuracy of data delivery (a minimal pipeline sketch appears after this list).
  • LLM Integration & APIs: Design secure APIs and data access layers to enable external Large Language Model (LLM) usage (outside of Microsoft's built-in Copilot) with company data (see the API sketch after this list).
  • Performance & Capacity Management: Own the monitoring and optimization of the data platform's performance. Plan for capacity and scalability of the Fabric environment, including compute and storage management, to accommodate growing data volumes and user concurrency.
  • Hands-on Data Engineering: Develop and maintain data workflows. Build and maintain pipelines that aggregate and transform data from diverse systems into a centralized warehouse and lakehouse, ensuring data is consistent, clean, and ready for analysis. This includes handling data mapping, data transformations, and validation rules for financial datasets (positions, transactions, client data, etc.); a validation sketch appears after this list.
  • Standards & Best Practices: Establish and enforce data engineering best practices including coding standards, naming conventions for datasets and fields, data contract definitions between systems, and robust testing/validation of pipeline outputs. Implement processes for data accuracy and reconciliation so that data quality issues are proactively identified and resolved.
  • Tooling & Innovation: Continuously evaluate and incorporate new tools or features that improve data engineering productivity and data platform capabilities. This may include adopting new Microsoft Fabric features, Delta Lake optimization techniques, CI/CD automation for data pipelines, or leveraging Python for custom data processing tasks. You will also ensure version control and development workflows are in place for data pipeline code (e.g., using Git and DevOps practices); a unit-test sketch appears after this list.
  • Regulatory Compliance & Data Governance: Establish data governance policies and practices that ensure compliance with financial industry regulations (FINRA, SEC) and standards such as SOC 2. This includes defining data retention and deletion policies, access controls, and audit trails in line with regulatory requirements.
  • Microsoft Purview Implementation: Deploy and manage Microsoft Purview (or similar data governance tools) to achieve enterprise-grade data cataloging, classification, and lineage. Utilize Purview's capabilities to automatically discover and classify sensitive information. Set up end-to-end data lineage tracking for transparency in data flows and to support audit readiness. Define and enforce data retention and access policies through Purview's governance portal, ensuring consistent rules for data sharing, usage, and privacy across the organization.
  • Data Quality & Documentation: Implement robust data quality management processes, including data profiling, validation checks, and monitoring for anomalies in critical datasets. When issues arise, lead root-cause analysis and remediation efforts. Maintain comprehensive documentation for the data platform – from data dictionaries and schema descriptions to pipeline flow diagrams and runbooks. This documentation, along with clearly defined data ownership roles and stewardship, will support long-term data health and onboarding of new data users.
  • Governance Committee Leadership: If applicable, lead a cross-departmental data governance committee or working group. Champion data governance initiatives such as establishing data access request processes, approving definitions for KPIs, and reviewing data compliance adherence. By instituting a governance framework, ensure that PTMA's data is trustworthy and well-governed, balancing data democratization with necessary controls and oversight.
  • Team Management & Mentorship: Manage a small data team of two individual contributors (the Analytics Engineer & BI Lead, and the Business Data Engineer & Reporting Lead). Provide mentorship, technical guidance, and support to help your team members grow in their roles. Set clear objectives, conduct regular one-on-ones and performance reviews, and foster a collaborative team culture that emphasizes excellence, innovation, and continuous improvement.
  • Strategic Planning & Execution: Balance strategic planning with day-to-day execution. Prioritize the data team's workload and projects in alignment with company goals, creating project plans and timelines. Proactively identify resource needs or skill gaps on the team and make recommendations.
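
To make the pipeline-design responsibility concrete, here is a minimal batch ETL sketch in Python. The file name, column names, and lake path are illustrative assumptions rather than PTMA's actual schema, and in practice this logic would run inside a Fabric pipeline or notebook rather than as a standalone script.

    # batch_positions_etl.py - minimal batch ETL sketch (illustrative only).
    # Source file, columns, and lake path are assumptions, not PTMA's schema.
    from pathlib import Path

    import pandas as pd

    SOURCE = Path("extracts/positions.csv")   # hypothetical daily extract
    LAKE_DIR = Path("lakehouse/positions")    # stand-in for a OneLake folder

    def run() -> Path:
        # Extract: read the raw positions file.
        df = pd.read_csv(SOURCE, parse_dates=["as_of_date"])

        # Transform: normalize identifiers and derive market value.
        df["account_id"] = df["account_id"].str.strip().str.upper()
        df["market_value"] = df["quantity"] * df["price"]

        # Load: land the cleaned data as Parquet for downstream analytics.
        LAKE_DIR.mkdir(parents=True, exist_ok=True)
        out = LAKE_DIR / "positions.parquet"
        df.to_parquet(out, index=False)
        return out

    if __name__ == "__main__":
        print(f"Wrote {run()}")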
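
The "secure APIs for external LLM usage" item could take many shapes; one minimal sketch is a read-only HTTP endpoint that authenticates callers and masks sensitive fields before any data reaches an external model. FastAPI is used here only as a familiar example, and the route, API-key scheme, and masking rule are all assumptions.

    # llm_data_api.py - hedged sketch of a read-only data access layer for LLM/RAG use.
    # Run with: uvicorn llm_data_api:app
    import hmac

    from fastapi import FastAPI, Header, HTTPException

    app = FastAPI()
    API_KEY = "replace-me"  # in practice, load from a secret store such as Azure Key Vault

    def mask_account(account_id: str) -> str:
        # Expose only the last four characters of an account identifier.
        return "****" + account_id[-4:]

    @app.get("/v1/positions/{account_id}")
    def get_positions(account_id: str, x_api_key: str = Header(...)):
        # Constant-time comparison avoids leaking key material via timing.
        if not hmac.compare_digest(x_api_key, API_KEY):
            raise HTTPException(status_code=401, detail="invalid API key")
        # A real implementation would query the warehouse; static data keeps the sketch self-contained.
        rows = [{"account_id": account_id, "symbol": "T-BILL", "market_value": 1_000_000.0}]
        return [{**row, "account_id": mask_account(row["account_id"])} for row in rows]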
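
For the validation and reconciliation work described under "Hands-on Data Engineering" and "Data Quality & Documentation", a rule-based check over a positions dataset might look like the sketch below; the specific rules, column names, and one-cent tolerance are assumptions.

    # validate_positions.py - sketch of rule-based data quality checks (assumed rules).
    import pandas as pd

    def validate_positions(df: pd.DataFrame) -> list[str]:
        """Return human-readable data quality failures (an empty list means clean)."""
        failures = []
        # Completeness: key fields must be populated.
        for col in ("account_id", "symbol", "quantity", "price"):
            if df[col].isna().any():
                failures.append(f"nulls found in required column '{col}'")
        # Validity: prices must be positive.
        if (df["price"] <= 0).any():
            failures.append("non-positive prices detected")
        # Reconciliation: derived market value must match the reported figure.
        diff = (df["quantity"] * df["price"] - df["market_value"]).abs()
        if (diff > 0.01).any():  # one-cent tolerance is an assumed threshold
            failures.append("market_value does not reconcile with quantity * price")
        return failures

    if __name__ == "__main__":
        sample = pd.DataFrame({
            "account_id": ["A1"], "symbol": ["T-BILL"],
            "quantity": [10.0], "price": [99.5], "market_value": [995.0],
        })
        print(validate_positions(sample) or "all checks passed")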
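
On the CI/CD side mentioned under "Tooling & Innovation", transforms and checks like the ones above can be covered by ordinary unit tests that run on every commit. This pytest sketch assumes the hypothetical validate_positions helper from the previous example.

    # test_validate_positions.py - pytest sketch intended to run in CI on each commit.
    # Assumes the hypothetical validate_positions helper shown above.
    import pandas as pd

    from validate_positions import validate_positions

    def test_clean_frame_passes():
        df = pd.DataFrame({
            "account_id": ["A1"], "symbol": ["T-BILL"],
            "quantity": [10.0], "price": [99.5], "market_value": [995.0],
        })
        assert validate_positions(df) == []

    def test_bad_reconciliation_is_flagged():
        df = pd.DataFrame({
            "account_id": ["A1"], "symbol": ["T-BILL"],
            "quantity": [10.0], "price": [99.5], "market_value": [1.0],
        })
        assert any("reconcile" in failure for failure in validate_positions(df))
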
Required Qualifications:
  • Education & Experience: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field. 8+ years of experience in data engineering or data architecture.
  • Technical Proficiency: Excellent SQL skills for data querying and performance tuning, and strong programming ability in Python (or similar language) for developing data pipelines and automation scripts. Deep understanding of relational databases, data modeling, and building data integration workflows.
  • Cloud & Azure Expertise: Hands-on experience with Microsoft Azure data services and architecture. This should include familiarity with tools such as Azure Data Factory or Synapse Pipelines, Azure Data Lake Storage (ADLS Gen2), Azure Synapse Analytics or SQL pools, and related Azure infrastructure.
  • Data Architecture & Warehousing: Strong knowledge of data warehousing concepts, star-schema design, and lakehouse architecture. Ability to design data models and pipelines that efficiently handle large volumes of data and support analytics use cases. Familiarity with Spark or distributed processing for big data is beneficial.
  • Governance & Security: Solid understanding of data governance and security best practices. Comfortable implementing data access controls, encryption, masking of sensitive data, and user permission models. Knowledge of compliance standards (FINRA, SEC rules, GDPR, etc.) and frameworks like SOC 2 for managing data securely is required. You should be able to design systems and policies that protect sensitive data and ensure privacy in line with industry regulations.

Preferred Qualifications:
  • Industry Experience: Prior experience in the financial services sector or another highly regulated industry (such as banking, fintech, insurance, healthcare, etc.) is highly desirable. Familiarity with the compliance, security, and data privacy obligations in these environments will help you hit the ground running. Experience working with investment or portfolio data, trading systems, or client reporting in a wealth management context would be a plus.
  • Business Intelligence & Analytics Tools: Experience with data visualization or BI tools, especially Microsoft Power BI. Understanding of how to create and manage Power BI datasets, reports, and dashboards, and how they connect to the underlying data models.

The pay range for this role is:
160,000 USD per year (Colorado)

  • Denver, Colorado, United States

Language Skills

  • English

Note for Users

This job listing was published by one of our partners. You can view the original listing here.