Data Engineer

Job description

At ZoomInfo we encourage creativity, value innovation, demand teamwork, expect accountability and cherish results. If you are collaborative, take initiative, and get stuff done we want to talk to you! We have high aspirations for the company and are looking for the right people to help fulfill the dream. We strive to continually improve every aspect of the company and use cutting-edge technologies and processes to delight our customers and rapidly increase revenues.

As a Data Engineer on the Corporate Engineering - Business Intelligence Team, you will help us build a scalable and robust data environment for our BI team. In this role, you’ll be a source of technical knowledge and experience, facilitating, building (hands-on), and fostering sound technical decisions. You’ll work on several multi-faceted projects at once, solving complex problems that require a varied, multi-disciplinary skillset. You’ll need to understand the full picture and build systems end-to-end, with close attention to user and business requirements.

The responsibilities of this opportunity include:

  • Build highly scalable data pipelines for diversified and complex data flows
  • Work with our data engineers / BI Developers by setting technical directions and providing standards, architectural governance, design patterns, and practices
  • Influence our BI solution roadmap strategy and coordinate it with the architecture vision
  • Track and identify relevant new technologies in the market and push their implementation into our pipelines through research and POC activities
  • Proactively identify gaps in data consumption and define processes to complete them
  • Work closely with tech teams on the design and implementation of data solutions

A successful candidate will have the following qualifications:

  • At least 3 years of relevant experience as a Data Engineer
  • Experience with cloud platforms (GCP/AWS/Azure)
  • Experience with cloud data lake storage and data formats (Parquet/ORC, etc.)
  • Experience with cloud data warehouse solutions such as BigQuery/Snowflake/Redshift
  • Experience with high-scale, high-volume relational databases, and SQL language
  • Significant knowledge of big data languages and familiarity with a variety of big data technologies
  • DevOps skills such as Docker, Kubernetes, CloudFormation, etc. - an advantage
  • BSc Computer Science/Data Management or equivalent
  • Experience with workflow and data processing pipeline tools such as Airflow - a big advantage
  • In-depth understanding of database management systems and ETL (extract, transform, load) frameworks
  • Experience in the online industry
  • A quick learner and team player; an independent, motivated individual
  • Able to multitask, prioritize, and manage time efficiently

Intelligent sales and marketing technology.