The ETL DevOps engineer is responsible for the set-up and maintenance of the Apache Airflow-based solutions that facilitate the flow of data into, out of, and within the agency. Using solutions developed with the data-ops team, the ETL DevOps engineer will address requests from the business in Jira to bring in new datasets from clients, media partners, website tagging, and other external vendors. They will be responsible for validating data accuracy against requirements and for working with end users to confirm it. They will monitor daily ETL jobs and troubleshoot problems. The ETL DevOps engineer will also work within the Broadbeam DMP to prepare tables and views in response to new requests.
Responsibilities
Implement & support integration with various data sources: evaluate data sources and implement data ingestion and export with Apache Airflow via various means (API, SFTP, email, etc.).
Handle ETL tasks and scripts: implement data extract, transform, and load using Python and Airflow tools.
Day-to-day operations: support uninterrupted data processing and flow implemented in the BBM DMP infrastructure.
Data strategy & prep: write data-manipulation code (SQL) to combine or summarize datasets prior to end-user access, per requirements.
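To illustrate the kind of day-to-day work the responsibilities above describe, here is a minimal, hypothetical sketch of the extract/transform/load steps in plain Python; in practice each function would be wired into an Airflow DAG as a PythonOperator callable. All names, fields, and data are illustrative, not taken from the actual BBM DMP pipelines.

```python
# Hypothetical ETL sketch: each function stands in for one Airflow task.
# The data source, schema, and summarization rule are assumptions for
# illustration only.

def extract():
    # Stand-in for pulling vendor data via API, SFTP, or email attachment.
    return [
        {"campaign": "spring_tv", "impressions": 1200},
        {"campaign": "spring_tv", "impressions": 800},
    ]

def transform(rows):
    # Summarize the dataset prior to end-user access, per requirements.
    return {
        "campaign": rows[0]["campaign"],
        "impressions": sum(r["impressions"] for r in rows),
    }

def load(summary):
    # Stand-in for writing the summary to a DMP table or view.
    return (
        f"loaded {summary['impressions']} impressions "
        f"for {summary['campaign']}"
    )

if __name__ == "__main__":
    print(load(transform(extract())))
```

In an Airflow deployment, these callables would be chained (extract >> transform >> load) so that failures surface per-task in the daily monitoring described above.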
Supervisory Responsibilities
None
Key Competencies Required
Analytical - Synthesizes complex or diverse information; Collects and researches data; Uses intuition and experience to complement data; Completes statistical analyses; Queries complex data
Problem Solving - Identifies and resolves problems in a timely manner; Gathers and analyzes information skillfully; Develops alternative solutions; Works well in group problem solving situations
Technical Skills - Strong data querying, report building, statistical skills; Pursues training and development opportunities; Strives to continuously build knowledge and skills; Shares expertise with others.
Quality Management - Looks for ways to improve and promote quality; Demonstrates accuracy and thoroughness.
Planning/Organizing - Prioritizes and plans work activities; Uses time efficiently; Meets deadlines; Plans for additional resources; Sets goals and objectives; Develops realistic action plans.
Quality - Demonstrates accuracy and thoroughness; Looks for ways to improve and promote quality; Monitors own work to ensure quality.
Dependability - Follows instructions, responds to management direction; Takes responsibility for own actions; Keeps commitments; Commits to long hours of work when necessary to reach goals; Completes tasks on time or notifies the appropriate person with an alternate plan.
Innovation - Displays original thinking and creativity; Meets challenges with resourcefulness; Generates suggestions for improving work; Develops innovative approaches and ideas; Presents ideas and information in a manner that gets others' attention.
Qualifications
5+ years in data operations at an agency or a company that purchases media. Proven track record of supporting internal and external teams with data storage and ETL solutions. Experience working with Airflow and designing and evolving overall data lake, DMP, and ETL architecture is preferred.
5+ years' experience driving the overall approach to data governance, including aligning usability with end-user needs and cost, data security, and data validation and integrity.
Good communication and client relationship skills with a proven history of building and maintaining relationships.
Experience with day-to-day data operations, including handling a stream of requests, clarifying requirements, and handling tasks in line with ongoing projects and overall company priorities.
The ability to translate results into clear, accessible data sets, written reports, and light data visualizations.
Knowledge of multiple media types is preferred, including linear TV, OTT/CTV, SEM, SEO, paid social, earned/owned social, display, and radio.
Travel
This position requires infrequent travel once current public health conditions improve.