Job Details

ID #49572481
State District of Columbia
City Washington
Job type Contract
Salary USD TBD
Source Tential
Listed 2023-03-28
Date 2023-03-27
Deadline 2023-05-26
Category Other
Data Engineer

District of Columbia, Washington, 20001, USA

Vacancy expired!

The Data Engineer will join the Data Science and Engineering Technologies team, filling in for a data engineer, and must be able to analyze, design, develop, integrate, run, and support Google Cloud Platform ETL and data-related jobs, from ETL to data warehousing, across multiple technologies and architectures, including application servers, databases, logs, APIs for external data sources, and operating systems.

This position will be required to:

  • Work with business owners and the Data and Business Intelligence teams (e.g., Looker, Tableau) to identify requirements for new and existing data sources, and implement ETL logic for the various interfaces we extract from: APIs, web services, and external and on-premises databases and warehouses.
  • Work with business users and technical designers to assist in efficient data model designs that meet business-unit requirements and integrate ACS technical data from various systems and platforms.
  • Work with management, project managers, and other lead developers to design and develop pipelines and ensure data accuracy.
  • Lead and participate in troubleshooting and fixing major problems in core data systems and supplemental data pipelines.
  • Understand the relationships between Google Cloud Platform products, primarily Data Fusion, BigQuery, and Looker, and demonstrate experience with Data Fusion (or a comparable ETL tool) and Google BigQuery (or a comparable data warehouse).
  • Provide strong leadership and mentoring for junior personnel in the areas of design, implementation, and professional development.
  • Where required, effectively delegate tasks to Software Engineering development teams, providing guidance and proper knowledge transfer to ensure that the work is completed successfully.
  • Be flexible about working outside business hours.
The ideal candidate will:

  • Have experience with Data Fusion or an equivalent, BigQuery or an equivalent, SQL Server, and scripting in Java/Python that works well with Google Cloud Platform products and their respective practices.
  • Python experience is a plus.
  • Develop ETL data pipelines that meet both functional and non-functional requirements, including performance, scalability, availability, reliability, and security.
  • Have experience writing Java code to work on data extracts that require cleanup.
  • Have a working knowledge of XML, JSON, and other data-streaming formats and related technologies in a Java/Python environment.
  • Have strong written and verbal communication skills.
  • Be able to multitask across various streams of the entire data process.
Education, Experience, and Technical Requirements:

Bachelor's degree or equivalent experience. 5+ years with proven results in system development, implementation, and operations are required. Strong understanding of design patterns, with a focus on tiered, large-scale data systems.


