About
Typical task breakdown:
- Develop and maintain data pipelines using Python, SQL, and Snowflake.
- Automate data processing tasks and reporting workflows using Snowflake scripts and notebooks.
- Build and maintain Streamlit apps for internal data applications, visualization, and interaction.
- Schedule and monitor data jobs to ensure timely delivery of reports, with increased activity during month-end.
- Test programs and databases, correct errors, and make necessary modifications.
- Modify existing databases and database management systems.
Interaction with team:
They support the data flows used for reporting and visualizations.
Team Structure
A 5-person team covering Reporting, Visualization, ETL, and Data Operations.
Education & Experience Required:
- Years of experience: 5+ years of programming experience
- Degree requirement: Bachelor's degree preferred in Computer Science, Engineering, or a related technical field. Open to all degrees if the candidate has actual programming experience.
- Do you accept internships as job experience: No
- Are there past or additional job titles or roles that would provide comparable background to this role: Data Engineer, Data Operations, ETL Developer
Top 3 Skills: Python, SQL, Snowflake
Additional Technical Skills: Streamlit, VS Code, GitHub, DevOps, PowerAutomate, Snowflake notebooks
(Desired)
PowerBI, Tableau
Soft Skills (Required)
- Collaborate with project teams to coordinate database development and determine scope and limitations.
- Review project requests describing user needs to estimate the time and cost required to complete a project.
- Strong verbal and written communication skills.
- Excellent problem-solving and interpersonal skills.
- Ability to work independently and manage time effectively.
(Desired)
Basic mentoring skills to support and provide constructive feedback.
Languages
- English