Data Engineer - Finance Data Reporting & Analytics
Castleton Commodities International
- United States
About
CCI is hiring a Data Engineer for our US development team to help advance our Reporting & Data Analytics platform by building and improving finance-focused data pipelines and broader data management capabilities.

Responsibilities
- Architect, develop, and maintain robust, scalable data pipelines, ensuring the integrity and availability of data for analysis and reporting.
- Analyze source data to identify patterns, anomalies, and opportunities for data enrichment, building a comprehensive understanding for effective data modeling.
- Design and implement data models that support complex financial analysis, integrating data quality measures to ensure the accuracy and reliability of data products.
- Perform ongoing data management tasks, including data mapping, standardization, and normalization, to facilitate consistent data consumption.
- Research the latest trends and best practices in data management, big data, AI, and vendor solutions.
- Lead the migration of existing datasets, databases, and codebases to modern, scalable technology stacks.
- Monitor data loads, troubleshoot errors, establish efficient data workflows, and conduct data quality analyses to identify and resolve systemic issues.
- Proactively manage vendor relationships and anticipate changes in data inputs to keep data products relevant.
- Collaborate with global teams to understand and document data flows, architecture, and functional requirements, ensuring alignment with business objectives.

Qualifications
- Bachelor's degree in computer science, engineering, mathematics, physics, or a related field of study.
- Five or more years of experience in data engineering, with experience in energy commodities or financial services.
- Strong analytical skills with demonstrated attention to detail.
- Superb communication skills and the ability to interact with all levels of management.
- Ability to perform under pressure with high motivation and energy, managing multiple tasks to established deadlines.
- Demonstrated success in managing projects.
- Demonstrable programming expertise with a strong command of SQL and Python.
- Proficiency with orchestration tools such as Autosys and Airflow, including the ability to design and manage automated data workflows.
- Hands-on experience with modern data transformation tools, including dbt (Data Build Tool), AWS Redshift or another cloud-based data warehouse solution, and Airflow, showcasing skills in efficiently transforming and processing large datasets.
- Solid repository management experience, with a deep understanding of CI/CD pipeline processes, version control systems, and best practices in code deployment and integration.
- Knowledge of implementing automated testing and data quality (DQ) frameworks, ensuring the reliability and integrity of data solutions.
- Experience in a cloud-based data warehouse environment, with an emphasis on managing and optimizing data for reporting and analysis.
- Adept at troubleshooting, optimizing data processes, and resolving complex data engineering challenges.
- Familiarity with business intelligence tools such as Power BI and Tableau.
- Ability to work effectively in a fast-paced, dynamic, high-intensity environment (including an open-floor plan where applicable to the position), with timely responsiveness and the willingness to work beyond normal business hours when required.
Languages
- English
Notice for Users
This job comes from a TieTalent partner platform.