About
The actual salary paid to an individual will be based on multiple factors, including, but not limited to, particular skills, education, licenses or certifications, experience, market value, geographic location, collective bargaining agreements, and internal equity. Although we estimate the successful candidate hired into this role will be placed towards the middle or entry point of the range, the decision will be made on a case-by-case basis related to these factors. This position is also eligible to participate in PG&E's discretionary incentive compensation programs.
A reasonable salary range is:
• Bay Area Minimum: $122,000
• Bay Area Mid-point: $158,000
• Bay Area Maximum: $194,000

• This position follows a hybrid work model, requiring employees to report to their assigned office location at least one to two days per week. The remaining days may be worked remotely, depending on business needs. The headquarters is located in Oakland, CA.
• The first-round interview for this role will be conducted in person at our Oakland headquarters.

Job Responsibilities
• Conceptualizes and generates infrastructure that allows big data to be accessed and analyzed.
• Collaborates with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver high-quality, analytics-ready datasets.
• Supports the deployment of machine learning models by enabling feature pipelines, model input/output data flows, and integration with platforms such as SageMaker or Foundry.
• Resolves application programming analysis problems of moderate to complex scope within procedural guidelines. May seek assistance from the supervisor or more skilled programmers/analysts on unusual or especially complex issues that cross multiple functional/technology areas.
• Works on complex data and analytics-centric problems with moderate impact that require in-depth analysis and judgment to obtain results or solutions.
• Plans work to meet assigned general objectives; progress is reviewed upon completion, and solutions may provide an opportunity for creative/non-standard approaches.
• Communicates recommendations orally and in writing.
• Mentors and guides less experienced colleagues.

Qualifications

Minimum:
• BA/BS in Computer Science, Management Information Systems, a related field of study, or equivalent experience.
• 5 years of experience with the data engineering/ETL ecosystem, such as Palantir Foundry, Spark, Informatica, SAP BODS, or OBIEE.

Desired:
• Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
• Familiarity with business intelligence tools such as Power BI, Tableau, or Foundry for data visualization and reporting.
• Knowledge of software engineering principles such as unit testing, CI/CD, and source control.
• Experience working with geospatial data and tools such as ArcGIS.
• Experience building data migration pipelines in Informatica.
• Experience with machine learning algorithm deployment.
Language Skills
- English