This job offer is no longer available
About
ATTENTION ALL SUPPLIERS!!!
READ BEFORE SUBMITTING
• An UPDATED CONTACT NUMBER and EMAIL ID are a MANDATORY requirement from our client for all submissions.
• Limited to 1 submission per supplier. Please submit your best candidate.
• We prioritize endorsing profiles with complete and accurate information.
• Avoid submitting duplicate profiles; duplicates will be rejected/disqualified immediately.
• Make sure the candidate's interview schedule is up to date, and ask the candidate to keep their lines open.
• Please submit profiles within the maximum proposed rate.
• Please make sure to TAG the profiles correctly if the candidate has WORKED FOR INFOSYS as a SUBCON or FTE.
MANDATORY: Please include in the resume the candidate's complete and updated contact information (phone number, email address, and Skype ID), as well as a set of 5 interview timeslots over the 72-hour period after the profile is submitted, during which the hiring managers could reach the candidate. PROFILES WITHOUT THE REQUIRED DETAILS AND TIME SLOTS WILL BE REJECTED.
Job Title: Technology Lead | Analytics - Packages | Python - Big Data -- AWS Data and Analytics Lead
Work Location & Reporting Address: Austin, TX 78717 (Onsite-Hybrid. Will consider candidates willing to relocate to the client's location)
Contract duration: 6
MAX VENDOR RATE: per hour max, depending on performance and experience
Target Start Date: 21 Jan 2026
Does this position require independent candidates only? Yes
Must Have Skills:
• AWS Glue
• Kafka
• Step Functions
• Lambda
• Terraform
Nice to Have Skills:
• .NET
Detailed Job Description:
Key Responsibilities:
• Design and implement Generative AI models for text, image, or multimodal applications.
• Develop prompt engineering strategies and embedding-based retrieval systems.
• Integrate Gen AI capabilities into web applications and enterprise workflows.
• Build agentic AI applications with context engineering and ClientP tools.
Required Skills & Qualifications:
• Architect and implement data pipelines using AWS Glue for ETL processes.
• Design event-driven workflows using AWS Lambda and Step Functions.
• Ensure high availability, scalability, and performance of data solutions.
• Develop and maintain infrastructure using Terraform for AWS resources.
• Implement CI/CD pipelines for automated deployments.
• Optimize data ingestion, transformation, and storage strategies.
• Ensure data quality, governance, and compliance with security standards.
• Lead and mentor a team of data engineers.
• Collaborate with cross-functional teams including Data Scientists, Architects, and DevOps.
• Implement monitoring and alerting for data workflows.
• Continuously optimize cost and performance of AWS services.
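For context on the "event-driven workflows using AWS Lambda and Step Functions" requirement above, a minimal sketch of such a trigger is shown below. The event shape follows the standard S3-to-Lambda notification format; the state machine ARN is an illustrative placeholder, and the Step Functions client is injected (it would be `boto3.client("stepfunctions")` in AWS) so the sketch stays self-contained. This is not part of the posting, only an illustration of the pattern it names.

```python
import json

def build_execution_input(s3_event):
    """Extract bucket/key pairs from an S3 put event (shape assumed
    from the standard S3 -> Lambda notification format)."""
    records = s3_event.get("Records", [])
    return [
        {
            "bucket": r["s3"]["bucket"]["name"],
            "key": r["s3"]["object"]["key"],
        }
        for r in records
    ]

def handler(event, sfn_client, state_machine_arn):
    """Lambda-style entry point: start one Step Functions execution
    per incoming S3 event, passing the new objects as input.
    `sfn_client` is injected here (instead of creating a boto3 client
    inside the handler) so the sketch can be exercised offline."""
    payload = build_execution_input(event)
    response = sfn_client.start_execution(
        stateMachineArn=state_machine_arn,
        input=json.dumps({"objects": payload}),
    )
    return {"executionArn": response["executionArn"], "objects": len(payload)}
```

In a deployed setup, Terraform would provision the bucket notification, the Lambda function, and the state machine, and the handler would receive the real boto3 client and ARN from its environment.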
Minimum Years of Experience: 10
Certifications Needed: No
Top 3 responsibilities you would expect the Subcon to shoulder and execute:
• Optimize data ingestion, transformation, and storage strategies.
• Ensure data quality, governance, and compliance with security standards.
• Lead and mentor a team of data engineers.
Interview Process (Is face-to-face required?): No
Any additional information you would like to share about the project specs/nature of work:
Language skills
- English
Note for users
This job offer was published by one of our partners. You can view the original offer here.