About
- Design and operate telemetry ingestion pipelines for data collection from various security sources
- Normalize and enrich telemetry into structured datasets for consistent correlation
- Build and maintain data models that connect users, devices, identities, and activities in the security ecosystem
Required Qualifications
- Bachelor's degree in Computer Science, Engineering, Cybersecurity, Data Engineering, or a related field, or equivalent experience
- 5+ years of experience designing and operating large-scale data pipelines in a security or enterprise environment
- Strong understanding of security telemetry from various sources, including endpoint and cloud
- Experience with modern data platforms and ingestion technologies such as Databricks, Snowflake, or Kafka
- Hands-on experience with data normalization frameworks such as OCSF or ECS
Languages
- English