
This job offer is no longer available


DBT Analytics Developer

USM
  • United States

About

Skills: 2+ years recent experience writing dbt code
Visa Types: Green Card, US Citizen
TEK Formers/Currents will be given preference!
Client: Coupa Software
Location: Foster City, CA (100% remote)
Title: DBT Analytics Developer
Duration: 6 months
PLEASE READ:
We are NOT looking for a typical data engineer. There is NO building of pipelines involved (a skill like Python is not needed). The source is already loaded into Snowflake by the data engineering team, a separate group from this analytics team. The Analytics Engineer will take source system data that has already been extracted and loaded into the Snowflake data warehouse, and curate, model or aggregate this data so that dashboards can be created and additional analytics can be performed. He or she will build SQL transformations using dbt.
The job profile is someone who has experience building SQL transformations using dbt, preferably someone who has worked with Salesforce, Marketo, and NetSuite data in previous roles.
Description
Our customer is beginning multiple projects and needs to add capacity to their 8-person analytics engineering team. They need an Analytics Engineer to build production-level data models in SQL/dbt, using dbt Cloud. The consultant must have experience building SQL transformations using dbt. It's highly preferred that he or she have knowledge of Salesforce data (i.e. connecting objects, API names, etc.). Experience with Marketo, NetSuite, and other B2B SaaS data would be helpful.
Please note for this role: We are NOT looking for a typical data engineer. There is NO building of pipelines involved (a skill like Python is not needed). The source is already loaded into Snowflake by the data engineering team, a separate group from this analytics team.
The Analytics Engineer will take source system data that has already been extracted and loaded into the Snowflake data warehouse, and curate, model or aggregate this data so that dashboards can be created and additional analytics can be performed. He or she will build SQL transformations using dbt. There is also some calculating of KPIs through definitions; then getting the data sets into a schema (aka certified data sets). This is all done in Snowflake using dbt (data build tool) code in dbt Cloud.
Sources of data include Salesforce CRM (account, contact, opportunity, etc.), NetSuite ERP, Marketo, etc. There is also some first-party, client-specific data that might be product related. All data comes into Snowflake through pipelines, mostly standard objects. Data is manipulated, curated, and then data sets are created.
The analytics engineer will take source data and model it, curate it, aggregate it, and take the requirements from the business when necessary and translate them into resulting data sets. Again, the data already exists in the data warehouse, but it needs to be prepared for the analytics that are needed (i.e. dashboards, ML models, ad hoc analysis).
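As a sketch of the kind of dbt transformation described above (the source name and column list here are hypothetical, assuming raw Salesforce opportunity data already loaded into Snowflake by the data engineering team):

```sql
-- models/staging/stg_salesforce__opportunity.sql (hypothetical example)
-- Light curation of raw Salesforce data: rename API-style columns
-- into analytics-friendly names for downstream models.
with source as (

    select * from {{ source('salesforce', 'opportunity') }}

),

renamed as (

    select
        id        as opportunity_id,
        accountid as account_id,
        name      as opportunity_name,
        stagename as stage_name,
        amount    as amount_usd,
        closedate as close_date,
        iswon     as is_won,
        isclosed  as is_closed
    from source

)

select * from renamed
```

Staging models like this one give every downstream mart a consistent, renamed view of each source object; the actual source definitions would live in a dbt `sources` YAML file in the project.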
Required skills for Analytics Engineer include:
- MUST have at least 2+ years of solid, recent experience writing dbt (data build tool) code in dbt Cloud - this is the #1 skill needed
- Excellent overall SQL skills, including experience building SQL transformations using dbt
- Experience modeling, curating, and transforming data that's housed in a Snowflake DW for analytics and business consumption
- Knowledge of Salesforce sales data (i.e. connecting objects, API names, etc.) is highly preferred
- Experience utilizing GitHub for version control and collaboration
- Understanding of CI/CD principles and how to collaborate with teams on a shared code base
- Experience with NetSuite, Marketo, and other B2B SaaS data is desirable
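On the GitHub/CI-CD point: in a dbt project, data tests are committed alongside the models, so a CI job (e.g. a dbt Cloud job triggered on pull requests) can catch regressions before a merge. A hypothetical schema file, with model and column names assumed for illustration:

```yaml
# models/staging/stg_salesforce__opportunity.yml (hypothetical example)
version: 2

models:
  - name: stg_salesforce__opportunity
    description: "One row per Salesforce opportunity."
    columns:
      - name: opportunity_id
        tests:
          - unique
          - not_null
      - name: account_id
        tests:
          - relationships:
              to: ref('stg_salesforce__account')
              field: account_id
```

Running `dbt test` in CI executes these checks against the warehouse, which is how a team collaborating through GitHub keeps the shared code base trustworthy.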
Sales KPIs are currently being established related to annual revenue, bookings, go-to-market, etc.
Example use case: the client's sales team wants to build a new dashboard to measure certain KPIs related to sales pipeline quality. They may want to segment it by region and calculate win rate, or define a sales opportunity in a certain way. There will be business requirements around what it means to turn a lead into an opportunity, or a sales incentive opportunity. The engineer might have to gather some requirements and then surface the data set for dashboarding.
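The use case above could be sketched as a dbt mart model. Everything here is hypothetical: it assumes staging models for opportunity and account exist, and that the account model exposes a `region` column.

```sql
-- models/marts/sales/fct_win_rate_by_region.sql (hypothetical example)
-- Win rate per region over closed opportunities, ready for a dashboard.
with opportunities as (

    select * from {{ ref('stg_salesforce__opportunity') }}

),

accounts as (

    select * from {{ ref('stg_salesforce__account') }}

),

closed as (

    select
        accounts.region,
        opportunities.is_won
    from opportunities
    inner join accounts
        on opportunities.account_id = accounts.account_id
    where opportunities.is_closed

)

select
    region,
    count(*)                                as closed_opportunities,
    sum(case when is_won then 1 else 0 end) as won_opportunities,
    sum(case when is_won then 1 else 0 end)
        / nullif(count(*), 0)               as win_rate
from closed
group by region
```

The business definition of "won" and "closed" (here just the Salesforce `IsWon`/`IsClosed` flags) is exactly the kind of requirement the engineer would gather from the sales team before certifying the data set.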
Top Skills Details
1) 2+ years recent experience writing dbt code (data build tool) using dbt Cloud
2) Outstanding SQL skills, experience building SQL transformations using dbt
3) Experience modeling, curating, and transforming data that's housed in a Snowflake DW for analytics and business consumption
4) Knowledge of Salesforce data (i.e. connecting objects, API names, etc.) is highly preferred
Worksite Address
950 Tower Ln Ste 2000, Foster City, California, United States, 94404-4255
Workplace Type
100% Remote
EVP
Great opportunity to work with modern analytics technology.
Potential long-term assignment.
Somewhat flexible work hours.
Work Environment
100% remote work.
Hours somewhat flexible but must be able to attend meetings during regular business hours.
Additional Skills & Qualifications
Excellent communication skills are required - there will be a lot of interaction with the business.
Interview Information
30 minute video interview with VP of Analytics. She can interview on Friday, Jan 30.
Many of the interview questions will be scenario-based, asking how the candidate would solve a specific business problem. Essentially, the candidate should demonstrate that he or she understands the business problem and how to solve it.
Business Challenge
The customer needs to increase revenue; by applying advanced analytics, it can monetize its existing sales data to accomplish this.

Languages

  • English
Notice for Users

This job was posted by one of our partners.