
Data Engineer

Cargomatic
  • United States

About

Senior Data Architect, Data Engineering

Location: San Francisco, CA
Reports To: VP of Engineering
FLSA Status: Exempt
Employment Type: Full-Time
Compensation: $140,000–$160,000 annually (based on experience)

About Cargomatic

Cargomatic is transforming the local trucking industry with cutting-edge technology that connects shippers and carriers in real time. Every product that humans build, grow, or sell has spent time on a truck. Local trucking is the lifeblood of every regional economy, yet this $82 billion industry still relies heavily on outdated systems. Cargomatic is bringing transparency, efficiency, and intelligence to local freight through modern technology and data-driven solutions.

We are solving complex, real-world logistics problems every day. If you thrive in a fast-paced environment, enjoy building scalable systems, and want to help shape the future of AI-powered logistics, we'd love to meet you.

Position Summary
Cargomatic is seeking a Senior Data Architect, Data Engineering to design and build scalable, cloud-native data infrastructure that powers analytics, machine learning, and AI-driven applications. This role combines deep data architecture expertise with hands-on experience in modern data platforms and LLM-enabled application development. You will lead the design of enterprise-grade data models, architect RAG systems, implement agentic workflows, and integrate secure, production-ready LLM capabilities into our ecosystem. This is a high-impact role with significant ownership, visibility, and the opportunity to shape the future of intelligent logistics technology.

Key Responsibilities
Data Architecture & Engineering
  • Design and build scalable, cloud-native data pipelines (batch and streaming) supporting analytics, ML, and AI-powered applications
  • Architect enterprise-grade data models across data lakes, warehouses, and real-time systems (Snowflake, Databricks, Kafka, dbt)
  • Define standards for data governance, reliability, performance, and cost optimization
  • Optimize storage formats and distributed data systems (Parquet, Delta Lake, Iceberg)

AI & LLM-Enabled Systems
  • Develop Retrieval-Augmented Generation (RAG) systems integrating structured and unstructured enterprise data
  • Design and implement agentic workflows using frameworks such as LangChain, LangGraph, LlamaIndex, n8n, or similar
  • Integrate LLM APIs (OpenAI, Anthropic, or similar) into secure, production-ready applications
  • Implement guardrails, validation layers, monitoring, and evaluation frameworks to mitigate hallucination, prompt injection, and data security risks

Backend & API Development
  • Build secure backend APIs (Python/FastAPI) to expose AI-powered capabilities
  • Ensure observability, monitoring, and cost controls across AI and data services
  • Contribute to microservices architecture and distributed system design

Collaboration & Leadership
  • Partner cross-functionally with Product, Engineering, and Operations to translate business requirements into scalable technical solutions
  • Mentor junior engineers and contribute to architectural standards and best practices
  • Drive innovation in data engineering and AI-powered logistics systems

Qualifications
  • Bachelor's degree in Computer Science or equivalent practical experience
  • 8+ years of software or data engineering experience in production environments
  • Strong expertise in data modeling, distributed systems, and scalable cloud architectures
  • Hands-on experience with ETL/ELT frameworks and streaming technologies (Kafka, Spark, Hevo, Snowflake, dbt, etc.)
  • Advanced SQL skills and a deep understanding of modern storage formats
  • Proficiency in Python and RESTful API development
  • Experience integrating LLM APIs into production applications
  • Strong understanding of system reliability, observability, and cost management in cloud environments

Desired Experience
  • Experience building RAG pipelines, including embeddings, vector search, chunking strategies, and hybrid retrieval
  • Experience designing multi-agent or agentic AI workflows with orchestration frameworks
  • Knowledge of LLM evaluation, monitoring, and tracing tools (LangSmith or similar)
  • Experience with microservices architecture and distributed system design
  • Exposure to transportation, logistics, or supply chain domains
  • Active GitHub contributions or a demonstrated passion for emerging AI and data technologies

Why Join Cargomatic?
We offer competitive compensation and a comprehensive benefits package, including:

  • Medical, Dental, and Vision insurance
  • 401(k) with company match
  • Flexible Spending Accounts (FSA)
  • Company-paid Life and Disability insurance
  • Flexible Paid Time Off (PTO) and company holidays
  • Paid Parental Leave
  • Employee Assistance Program (EAP)
  • Opportunity to build cutting-edge AI solutions in a high-growth logistics technology company
  • Collaborative, high-impact team environment

Cargomatic is proud to be an Equal Opportunity Employer. We are committed to creating a diverse and inclusive workplace where all employees feel valued and empowered to succeed.

Languages

  • English
Note for Users

This job posting comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their website.