BASF

DevOps Engineer (m/f/d)

  • Germany

About

JOIN THE TEAM

As part of BASF's Data & AI Powerhouse, the Data Transparency team provides an enterprise-wide Data Catalog as a single point of truth for sharing knowledge about data and for finding and accessing relevant, documented data in an efficient, convenient, and compliant way. Our goal is to accelerate business decisions and processes by enabling data citizens to connect data with insights.

As part of an agile team, you will independently drive the development of the Data Catalog to bring transparency into the BASF data landscape and enable the transfer of metadata across systems.

RESPONSIBILITIES

  • You develop new integration processes to populate the Data Catalog with metadata, using Azure Data Factory, Azure Data Flows, and Databricks.
  • You are responsible for both the design and the end-to-end implementation of individual microservices and RESTful APIs, as well as for their smooth operation.
  • You implement BPMN workflows within our Data Catalog to meet the needs of our internal stakeholders.
  • You contribute to building a stable and secure infrastructure and develop CI/CD pipelines and test frameworks.
  • You resolve incidents and escalate technical problems to the third-party vendor of our Data Catalog.
  • You ensure code quality by defining coding conventions and development guidelines and by maintaining high test coverage.
  • Within the team, you use your experience to support a continuous exchange of knowledge and to ensure effective collaboration with our internal stakeholders.

QUALIFICATIONS

  • Bachelor's degree/diploma in Computer Science, Software Engineering, or a comparable technical education.
  • Minimum of 3 years of working experience in backend software development.
  • Hands-on experience in designing, developing, and consuming web services (RESTful APIs), ideally using frameworks like Django.
  • Experience in implementing and operating containerized microservices (using Docker) in cloud environments.
  • Experience with data transformation (ETL) pipelines, ideally using Azure Data Factory, Data Flows, ADLS, and Databricks.
  • Experience in implementing BPMN workflows is a plus.
  • High interest in technical innovation and in the continuous improvement of existing systems.
  • Ability to communicate clearly and effectively, translating sophisticated concepts into easy-to-understand language.
  • Agile mindset, proactive approach, good self-organization, and teamwork in a remote environment, complemented by a strong customer focus.

BENEFITS

  • Responsibility from day one in a challenging work environment and on-the-job training as part of a committed team.
  • Adequate compensation according to your qualifications and experience.
  • A secure work environment, because your health, safety, and wellbeing are always our top priority.
  • Flexible work schedule and home-office options, so that you can balance your working life and private life.
  • Learning and development opportunities.
  • 23 holidays per year, plus another 5 days (readjustment days) and 2 days (cultural days).
  • A collaborative, trustful, and innovative work environment.
  • Being part of an international team and working on global projects.
  • Relocation assistance to Madrid provided.

Nice-to-have skills

  • Azure Data Factory
  • Django
  • Docker
  • ETL

Work experience

  • Backend
  • Data Engineer
  • Data Infrastructure

Languages

  • English