
Software Engineer III - Data Engineer

Charlotte Staffing
  • United States

About

Software Engineer III, Data Engineering
The Software Engineer III, specializing in Data Engineering, plays a pivotal role in designing, developing, and maintaining scalable data pipelines, ETL (Extract, Transform, Load) processes, and analytics solutions to support enterprise-wide, data-driven decision-making. This position requires advanced expertise in data integration, analytics, and software development, and involves close collaboration with cross-functional teams, including data scientists, business analysts, and stakeholders, to deliver impactful insights. The role emphasizes innovation using platforms such as Informatica BDM, Ab Initio, and Snowflake, as well as big data ecosystems like Hadoop, while maintaining high standards for data quality, security, and compliance. The engineer advocates for agile methodologies, CI/CD pipelines, and automated testing to accelerate delivery and minimize risk.

Responsibilities include leading and participating in the development, testing, implementation, maintenance, and support of complex solutions, ensuring robust unit testing and support for release cycles. The engineer also builds monitoring capabilities, provides escalated production support, and maintains security controls in line with company standards. Typically, this role leads moderately complex projects and contributes to larger initiatives, solving complex technical and operational challenges and serving as a resource for less experienced teammates.

Essential Duties And Responsibilities

Following is a summary of the essential functions for this job. Other duties may be performed, both major and minor, which are not mentioned below, and specific activities may change from time to time.

  • Architect and implement robust ETL workflows using tools like Informatica PowerCenter and Ab Initio, covering data mapping, transformation logic, error handling, and performance optimization for high-volume data processing.
  • Design and develop data pipelines in Snowflake for efficient data warehousing, querying, and analytics, leveraging features such as Snowpark for custom processing and zero-copy cloning for cost-effective data sharing (see the sketch after this list).
  • Build and maintain distributed data processing systems on Hadoop ecosystems (e.g., Hive, Spark, HDFS), ensuring scalability, fault tolerance, and seamless integration with upstream and downstream systems.
  • Develop advanced SQL queries, stored procedures, and optimizations for both relational and NoSQL databases to support complex data extraction, aggregation, and reporting needs.
  • Create interactive dashboards, visualizations, and reports in Power BI, integrating multiple data sources to enable self-service analytics and real-time business intelligence.
  • Perform data analytics tasks, including exploratory data analysis, statistical modeling, and trend identification, to derive actionable insights and support predictive analytics initiatives.
  • Collaborate on full-stack development using the .NET framework (C#) for backend services and JavaScript (including frameworks like React or Node.js) for frontend data visualization tools and user interfaces.
  • Lead code reviews, mentor junior engineers, and contribute to technical design documents, ensuring adherence to coding standards, design patterns, and security best practices (e.g., OWASP for web applications).
  • Troubleshoot and resolve production issues in data pipelines and applications, implementing monitoring, alerting, and logging using tools such as Splunk or Azure Monitor.
  • Drive continuous improvement by adopting industry best practices, including DevOps automation, containerization (Docker/Kubernetes), and machine learning operations (MLOps) for data workflows.
  • Participate in agile ceremonies, sprint planning, and stakeholder meetings to align technical solutions with business objectives and deliver value iteratively.
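
To give a concrete flavor of the Snowflake pipeline work described above, here is a minimal, illustrative Snowpark for Python sketch. The connection parameters, table names (RAW.ORDERS, ANALYTICS.DAILY_REVENUE), and columns are hypothetical placeholders, not details from this posting.

```python
# Minimal Snowpark for Python sketch: read a raw table, aggregate, and
# persist the result back to Snowflake. All names below are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "<account>",      # placeholder credentials
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# Aggregate completed orders into a daily revenue table.
orders = session.table("RAW.ORDERS")
daily_revenue = (
    orders.filter(col("STATUS") == "COMPLETE")
    .group_by("ORDER_DATE")
    .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
)
daily_revenue.write.mode("overwrite").save_as_table("ANALYTICS.DAILY_REVENUE")

# Zero-copy clone: an inexpensive dev/test copy that shares storage with
# the source table until either side is modified (illustrative DDL).
session.sql(
    "CREATE TABLE ANALYTICS.DAILY_REVENUE_DEV CLONE ANALYTICS.DAILY_REVENUE"
).collect()
```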

Qualifications

The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

Required Qualifications

  1. Bachelor's degree and six to ten years of experience, or equivalent education and software engineering training or experience
  2. In-depth knowledge of information systems and the ability to identify, apply, and implement best practices
  3. Understanding of key business processes and competitive strategies related to the IT function
  4. Ability to plan and manage projects and solve complex problems by applying best practices
  5. Ability to provide direction and mentor less experienced teammates
  6. Ability to interpret and convey complex, difficult, or sensitive information

Preferred Qualifications

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
  • 5+ years of progressive experience in software engineering, with at least 3 years focused on data engineering, ETL development, and data analytics.
  • Proven track record of delivering production-ready data solutions in fast-paced environments, preferably in financial services, healthcare, or other regulated industries.
  • Strong problem-solving skills, with the ability to handle ambiguous requirements and scale solutions to terabyte-scale datasets.

Core Data Engineering & ETL Skills

  • Advanced proficiency in Informatica PowerCenter for ETL design, scheduling, and workflow management.
  • Expertise in Snowflake for cloud data warehousing, including SQL scripting, data sharing, and performance tuning.
  • Hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Spark) for big data processing and distributed computing (see the sketch after this list).
  • Expert-level SQL skills across multiple databases (e.g., Oracle, SQL Server, PostgreSQL), including query optimization, indexing, and data modeling.
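
As a rough illustration of the Hadoop/Spark processing referenced in the list above, the following PySpark sketch reads Parquet data from HDFS and writes an aggregate back. The paths, application name, and column names are invented for the example.

```python
# Illustrative PySpark job: count events per type from HDFS-resident
# Parquet data. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("event_counts").getOrCreate()

# Read raw events, aggregate, and write the result back to HDFS.
events = spark.read.parquet("hdfs:///data/raw/events")
counts = events.groupBy("event_type").agg(F.count("*").alias("event_count"))
counts.write.mode("overwrite").parquet("hdfs:///data/curated/event_counts")

spark.stop()
```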

Analytics & Visualization

  • Strong experience with Power BI for dashboard development, DAX scripting, data modeling, and integration with APIs/ODBC sources.
  • Proficiency in data analytics techniques using Python, R, or SQL for data cleaning, statistical analysis, and visualization (see the sketch after this list).
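
A small, hypothetical example of the Python-based analytics mentioned above: cleaning a dataset and summarizing a monthly trend with pandas. The file name and columns are placeholders, not data from this posting.

```python
# Toy exploratory analysis with pandas: load, clean, and summarize a
# monthly revenue trend. File and column names are placeholders.
import pandas as pd

df = pd.read_csv("transactions.csv", parse_dates=["txn_date"])
df = df.dropna(subset=["amount"])   # drop rows with missing amounts
df = df[df["amount"] > 0]           # exclude refunds/erroneous records

# Resample to month-start frequency and inspect the trend.
monthly = df.set_index("txn_date")["amount"].resample("MS").sum()
print(monthly.describe())           # distribution of monthly totals
print(monthly.pct_change().tail())  # recent month-over-month change
```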

Software Development

  • Solid experience with .NET (C#, ASP.NET) for building scalable backend services and APIs.
  • Proficiency in JavaScript/TypeScript, including modern frameworks (e.g., React, Angular, Vue.js), for interactive web applications and data-driven UIs.

Additional Technical Skills

  • Familiarity with cloud platforms (AWS, Azure, GCP) for data services such as S3, Azure Data Factory, or BigQuery.
  • Knowledge of version control (Git), CI/CD tools (Jenkins, GitHub Actions), and container orchestration.

Other Job Requirements / Working Conditions

  • Sitting: Constantly (more than 50% of the time)
  • Standing: Frequently (25%-50% of the time)
  • Walking: Frequently (25%-50% of the time)
  • Visual / Audio / Speaking: Able to access and interpret client information received from the computer, and able to hear and speak with individuals in person and on the phone.
  • Manual Dexterity / Keyboarding: Able to work standard office equipment, including PC keyboard and mouse, copy/fax machines, and printers.
  • Availability: Able to work all hours scheduled, including overtime as directed by manager/supervisor and required by business need.
  • Travel: Minimal, up to 10%

Language Skills

  • English
Notice to Users

This job offer comes from a TieTalent partner platform. Click "Apply now" to submit your application directly on their site.