Essential Responsibilities
Design, develop, and maintain enterprise‑class data pipelines capable of processing billions of records daily with high throughput and reliability.
Write and optimize complex SQL queries for data transformation and analysis across petabyte‑scale datasets.
Build and optimize both batch and real‑time/streaming data systems that solve low‑latency business problems.
Develop scalable data architectures hands‑on in GCP using BigQuery, Bigtable, Dataproc, and Pub/Sub.
Perform deep data analysis to identify trends, anomalies, and insights that drive business decisions.
Ramp up quickly on Agentic AI capabilities and apply outside‑the‑box thinking to conceptualize innovative data solutions.
Troubleshoot and resolve complex data pipeline issues, performing root cause analysis and implementing fixes.
Optimize query performance, data models, and pipeline efficiency to meet strict SLAs.
Actively engage in design and code reviews, maintaining high standards of code quality and engineering excellence.
Take ownership of end‑to‑end data pipeline development from design through production deployment.
Implement data quality checks, monitoring, and alerting for production pipelines.
Collaborate with Product, Analytics, and Engineering teams to understand data requirements and deliver solutions.
Participate in on‑call rotations and respond to production incidents with urgency.
Document technical designs, data flows, and operational runbooks.
Contribute to continuous improvement of data engineering practices and standards.
Qualifications
Demonstrated deep hands‑on experience building enterprise‑class data platforms and pipelines at scale.
Expert‑level SQL skills – ability to write and optimize complex queries across large datasets (billions of records).
Strong experience with batch processing (Spark, Dataproc) and real‑time/streaming systems (Kafka, Pub/Sub, Dataflow).
Proven ability to build low‑latency data systems that solve time‑sensitive business problems.
Hands‑on proficiency with Cloud data services – preferably BigQuery, Bigtable, Dataproc, Cloud Storage, Pub/Sub.
Strong Python programming skills for data pipeline development and automation.
Experience with workflow orchestration tools (Airflow, UC4, or similar).
Solid understanding of data modeling, schema design, and data warehouse concepts.
Experience with CI/CD pipelines and DevOps practices for data systems.
Strong debugging and troubleshooting skills for distributed data systems.
Ability to work in a fast‑paced environment and ramp up quickly on new technologies and domains.
5+ years of relevant experience and a Bachelor’s degree, OR any equivalent combination of education and experience.
Experience with Agentic AI – ability to conceptualize and design data solutions leveraging Agentic AI with outside‑the‑box thinking.
Experience in Marketing Technology (MarTech) or Advertising Technology (AdTech) domain.
Experience building Customer Data Platforms (CDP).
Experience with Customer Segmentation and Activation across owned and paid channels.
Experience with real‑time integration with third‑party AdTech platforms.
Experience with Attribution and Analytics systems.
Familiarity with AI/ML pipelines or AI‑enhanced data systems.
Employment Information
Position: Data Engineering – MarTech Platform
Travel: 0%
Location: San Jose, California (range $129,500 – $191,950 annually); Austin, Texas (range $117,500 – $174,350 annually).
The pay range is dependent on experience and expertise.
Additional compensation may include annual performance bonus, equity, or other incentive compensation.
Benefits include paid time off, healthcare coverage for you and your family, and resources for financial and mental health support.
Equal Employment Opportunity
PayPal provides equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, disability, race, religion, gender, sexual orientation, or any other protected characteristic. PayPal will provide reasonable accommodations for qualified individuals with disabilities.
Languages
- English
Notice for Users
This job comes from a TieTalent partner platform; use "Apply Now" to submit your application directly on their site.