About
The goal is to create a realistic, enterprise-grade big data pipeline architecture that demonstrates how we design and implement scalable data infrastructure.
Scope of Work:
• Design a complete end-to-end big data architecture (source to target)
• Include data ingestion, processing, orchestration, storage, and analytics layers
• Incorporate modern tools such as Kafka, Spark, Snowflake, Airflow, and cloud infrastructure (AWS preferred); a brief sketch of how these tools fit together follows this list
• Provide a clear, professional architecture diagram
• Deliver a short technical explanation of each layer and tool selection
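By way of illustration, here is a minimal sketch of how the orchestration layer of such a stack might look in Airflow 2.x, assuming the Spark and Snowflake provider packages are installed. The DAG name, connection IDs, script path, stage, and table are all hypothetical placeholders, not part of the requested deliverables.

```python
# Illustrative Airflow DAG for the stack described above: a Spark batch job
# transforms raw events that the Kafka ingestion layer has landed in object
# storage, then the curated output is loaded into Snowflake for analytics.
# All names, connection IDs, and SQL below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="events_batch_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",               # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    # Processing layer: Spark job transforming raw events landed by Kafka.
    transform = SparkSubmitOperator(
        task_id="transform_events",
        application="jobs/transform_events.py",  # hypothetical Spark script
        conn_id="spark_default",
    )

    # Storage/analytics layer: load curated output into Snowflake.
    load = SnowflakeOperator(
        task_id="load_to_snowflake",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO analytics.events FROM @curated_stage/events/;",  # hypothetical stage and table
    )

    # Orchestration: run the load only after the transform succeeds.
    transform >> load
```

The dependency `transform >> load` reflects the typical batch flow in this stack: Kafka handles continuous ingestion, Spark carries the heavy transformation, and Snowflake serves the analytics layer, with Airflow sequencing and scheduling the whole pipeline.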
Deliverables:
• High-level architecture diagram
• Detailed architecture diagram (clean and structured)
• Brief documentation explaining components and data flow
• Editable diagram file (e.g., Lucidchart, Visio, or similar)
This architecture will be used for marketing and client-facing materials, so it must reflect real-world enterprise standards and best practices.
Ideal candidate:
• Proven experience designing big data solutions
• Strong knowledge of distributed systems and cloud architecture
• Experience with Kafka, Spark, Snowflake, Airflow, and AWS
• Ability to design clean, professional architecture diagrams
Please share examples of previous architecture diagrams or similar projects.
Contract duration: less than 1 month.
Mandatory skills: Big Data, ETL Pipeline, Apache Spark, Apache Hadoop, Apache Kafka, Snowflake, Apache Airflow, Data Architecture
Language skills
- English