Data Engineer, Expert
Requisition ID # 171415
Job Category: Information Technology
Job Level: Individual Contributor
Business Unit: Strategy & Growth
Work Type: Hybrid
Job Location: Oakland
Department Overview
The Strategy and Growth team is dedicated to multi-year strategic and infrastructure planning. With electricity demand expected to double over the next 15 years, we're building an energy system of the future for the world's fourth-largest economy. The team includes Energy Policy and Procurement, Innovation and Strategy, and Strategic Commercial Solutions.
Position Summary
Designs, develops, modifies, configures, debugs, and evaluates jobs for extracting data from various sources (including cloud data warehouses), implements transformation logic, and stores data in various formats fit for use by stakeholders. Supports scalable forecasting and analytics workflows by enabling reliable, governed data pipelines and production-ready machine learning inputs. Collects metadata about jobs, including data lineage and transformation logic. Works with teams, clients, data owners, and leadership throughout the development cycle, practicing continuous improvement.
This position is hybrid, working from your remote office and your assigned location based on business need. The number of days on-site is 1-4 days per month and may vary if there is a business need to come to the office for work-related meetings, trainings, etc.
PG&E is providing the salary range that the company in good faith believes it might pay for this position at the time of the job posting. This compensation range is specific to the locality of the job. The actual salary paid to an individual will be based on multiple factors, including, but not limited to, specific skills, education, licenses or certifications, experience, market value, geographic location, and internal equity. Although we estimate the successful candidate hired into this role will be placed towards the middle or entry point of the range, the decision will be made on a case-by-case basis related to these factors.
Bay Minimum: $140,000
Bay Maximum: $238,000
and/or
CA Minimum: $133,000
CA Maximum: $226,000
This job is also eligible to participate in PG&E’s discretionary incentive compensation programs.
Job Responsibilities
- Leads a team on moderately complex to complex data- and analytics-centric problems with broad impact that require in-depth analysis and judgment to reach results or solutions.
- May contribute to the resolution of uniquely complex data- and analytics-centric problems having significant impact.
- Identifies, designs and implements internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Resolves application programming analysis problems of broad scope within procedural guidelines.
- Provides assistance to other programmers/analysts on unusual or especially complex problems that cross multiple functional/technology areas.
- Conceptualizes and builds infrastructure that allows big data to be accessed and analyzed with verified data quality, and ensures metadata is appropriately captured and catalogued.
- Collaborates with peers to develop departmental standards, norms, and new goals/objectives.
- Plans work to meet assigned general objectives and reviews progress regularly; solutions may allow for creative/non-standard approaches.
- Assesses data pipeline performance and suggests/implements changes as required.
- Designs and supports data pipelines that enable time series forecasting, scenario analysis, and machine learning-driven analytics.
- Partners with analytics, forecasting, and engineering teams to translate modeling requirements into reliable, production data assets.
- Supports deployment, monitoring, and lifecycle management of machine learning models used in forecasting and planning applications.
- Ensures data governance, reproducibility, and auditability for analytical and forecasting use cases.
- Communicates recommendations, both orally and in writing.
- Mentors/provides guidance to less experienced colleagues.
Qualifications
Minimum:
- BA/BS in Computer Science, Management Information Systems or related field of study, or equivalent experience
- 7 years of experience with data engineering/ETL ecosystems such as Palantir Foundry, Spark, Informatica, SAP BODS, OBIEE.
- Experience with multiple data engineering/ETL ecosystems.
- Experience with machine learning algorithm deployment.
Desired:
- Master’s degree in Computer Science, Management Information Systems or related field, or equivalent experience
- Previous Snowflake experience (cloud data warehouse, performance, governance)
- Production ML deployment
- Time‑series / forecasting analytics exposure
- Strong data governance, lineage, and metadata practices
- Software engineering discipline (CI/CD, testing, version control)
- Leadership experience with development teams
- Business Intelligence and data access tool expertise.
- Knowledge of software engineering principles such as unit testing, CI/CD, and source control.