Data Engineer (AI & Data Platforms) - Q126
R2 Technologies, United States

This job posting is no longer available.


About

Job Title: Data Engineer (AI & Data Platforms)
Company: R2 Technologies
Location: Alpharetta, GA (Hybrid / Remote Options Available)
Employment Type: Full-Time / Contract
About R2 Technologies:
R2 Technologies is a Certified Minority Business Enterprise (MBE) headquartered in Alpharetta, GA. With over two decades of experience across global markets, we have built a reputation as a trusted partner for IT staffing excellence and cutting-edge digital product innovation. We are driven by innovation and operate on a simple philosophy: "We deliver what we promise, and we promise only what we can deliver." Beyond providing top-tier IT talent, R2 builds proprietary solutions such as SmartEnt, an Enterprise AI & IoT Intelligence Platform built on advanced NLP and AI technologies. By partnering closely with our clients, we deliver technology-driven outcomes that are realistic, measurable, and impactful.
Job Summary:
The role of the Data Engineer has fundamentally shifted in 2026. R2 Technologies is seeking an innovative Data Engineer who goes beyond traditional ETL to become a builder of "AI-ready" data foundations. You will be crucial in powering our SmartEnt platform and client AI initiatives by engineering context-rich data pipelines. Utilizing AI coding assistants (GitHub Copilot, Cursor) to automate boilerplate transformations, you will design systems that feed directly into Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) architectures.
Key Responsibilities:
* Build and maintain scalable, low-latency data pipelines utilizing modern data stack tools (e.g., dbt, Snowflake, Databricks, Apache Iceberg) to process both structured and unstructured data.
* Integrate and manage Vector Databases (such as Pinecone, Milvus, or Qdrant) to support enterprise semantic search and RAG workflows.
* Leverage AI-assisted coding tools to rapidly generate, test, and optimize SQL/Python data transformation code, freeing up time for complex architectural decisions.
* Orchestrate data workflows using Apache Airflow, Dagster, or n8n to ensure seamless, event-driven data delivery to AI agents and downstream analytics.
* Implement data reliability engineering practices (data contracts, observability, lineage) to ensure LLMs are grounded in highly accurate, governed data.
* Collaborate closely with Full Stack and AI/ML engineers to support multi-modal data ingestion from IoT devices, APIs, and enterprise systems.

Qualifications:
* Up to 3 years of hands-on experience in Data Engineering, Data Analytics, or ML Engineering.
* Strong proficiency in Python and advanced SQL.
* Hands-on experience with cloud data platforms (Snowflake, Databricks, BigQuery) and orchestration tools.
* Familiarity with creating embedding pipelines and working with Vector Databases.
* Proven experience or strong familiarity working alongside AI coding assistants to enhance productivity.
* Understanding of modern lakehouse architectures and open table formats (Apache Iceberg, Delta Lake).
* Excellent communication skills and the ability to adapt to the fast-paced convergence of data and AI.
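The embedding-pipeline and vector-database work described above follows a common shape: embed each document, upsert it into a store, then retrieve the nearest matches for a semantic-search or RAG query. As a rough illustration only (not R2's actual stack), the sketch below uses a toy deterministic bag-of-words embedding and an in-memory class standing in for a managed vector database such as Pinecone, Milvus, or Qdrant; all names here are hypothetical:

```python
import math
from collections import Counter

def embed(text: str, dim: int = 64) -> list[float]:
    # Toy deterministic embedding: bag-of-words counts hashed into buckets.
    # A real pipeline would call an embedding model here instead.
    vec = [0.0] * dim
    for token, count in Counter(text.lower().split()).items():
        vec[sum(ord(c) for c in token) % dim] += count
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]  # unit-normalized, so dot product = cosine

class InMemoryVectorStore:
    # Minimal stand-in for a vector database: upsert documents,
    # then search by cosine similarity.
    def __init__(self) -> None:
        self.rows: list[tuple[str, list[float]]] = []

    def upsert(self, doc_id: str, text: str) -> None:
        self.rows.append((doc_id, embed(text)))

    def search(self, query: str, k: int = 3) -> list[tuple[str, float]]:
        q = embed(query)
        scored = [(doc_id, sum(a * b for a, b in zip(q, v)))
                  for doc_id, v in self.rows]
        return sorted(scored, key=lambda pair: -pair[1])[:k]

store = InMemoryVectorStore()
store.upsert("doc1", "snowflake warehouse billing report")
store.upsert("doc2", "iot sensor temperature readings")
top = store.search("warehouse billing", k=1)
print(top)  # doc1 should rank first
```

A production pipeline would swap the toy `embed()` for a real embedding model and the in-memory class for a vector database client, but the upsert-then-search interface keeps the same shape.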
Skills:
Snowflake, Apache

Language skills

  • English
Note for users

This job posting was published by one of our partners. You can view the original posting here.