Job Details

ID #53428124
State Pennsylvania
City Philadelphia
Job type Full-time
Salary USD TBD
Source City of Philadelphia
Posted 2025-02-09
Deadline 2025-04-10
Category Miscellaneous

Data Engineer / Integration Specialist

Philadelphia, Pennsylvania 19113, USA

Data Engineering at the City is a unique opportunity for meaningful, exciting work and professional development using state-of-the-art technologies and software development best practices. This is a position on a diverse team of engineers, analysts, and GIS specialists that develops, maintains, and optimizes our data pipelines and integration strategies, helping departments and agencies innovate and enhancing the City’s service to its residents. The data engineer/integration specialist will develop an intimate understanding of the City’s diverse data and contribute to improving the City’s data engineering infrastructure, pipelines, models, and integrations. The team uses a blend of open-source, custom-developed, and off-the-shelf tools, including Python, Bash, SQL, DBT, GIS, Docker, Terraform, Apache Airflow, Jenkins, Postgres, PostGIS, AWS, GitHub, MuleSoft as an iPaaS, and SaaS API providers such as ArcGIS Online and CARTO.
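To illustrate the kind of work described above (this is a hypothetical sketch, not the City’s actual code), a departmental integration boils down to a small extract-transform-load sequence; in production, steps like these would typically run as Apache Airflow tasks. All record fields and function names here are made up for the example.

```python
# Hypothetical sketch of a departmental extract-transform-load step.
# In production, functions like these would typically be orchestrated as
# tasks in an Apache Airflow DAG; names and fields are illustrative only.

def extract(rows):
    """Pull raw records from a (mock) departmental system of record."""
    return list(rows)

def transform(records):
    """Standardize field values and drop records missing a key field."""
    cleaned = []
    for rec in records:
        if not rec.get("address"):
            continue  # quality assurance: skip incomplete records
        cleaned.append({
            "address": rec["address"].strip().upper(),  # standardization
            "dept": rec.get("department", "UNKNOWN"),   # enrichment default
        })
    return cleaned

def load(records, warehouse):
    """Append cleaned records to a (mock) warehouse table."""
    warehouse.extend(records)
    return len(records)

warehouse = []
raw = [
    {"address": " 1401 jfk blvd ", "department": "OIT"},
    {"address": "", "department": "Streets"},  # dropped by the QA check
]
loaded = load(transform(extract(raw)), warehouse)
print(loaded)                   # 1
print(warehouse[0]["address"])  # 1401 JFK BLVD
```

The same shape (extract, standardize/enrich, load) recurs across the integrations this role supports; an orchestrator adds scheduling, retries, and dependency management on top.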

Some prominent services the team is responsible for include DataBridge, OIT’s enterprise data warehouse; AIS, the City’s custom address information system, which supports public address search and internal geocoding; and Databridge-Airflow, a custom workflow generator and orchestrator for departmental integrations (https://metadata.phila.gov/) and open data publishing (https://opendataphilly.org/). Some examples of OIT’s commitment to serving the public through the effective and innovative integration of City open data include:

https://Atlas.phila.gov

https://property.phila.gov

https://Openmaps.phila.gov

https://streetsmartphl.phila.gov/

www.phila.gov/solarmap

www.phila.gov/stressmap
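Address search and geocoding services like AIS depend on consistent address handling. As a hedged illustration (this is not AIS’s actual logic, and the abbreviation table is hypothetical), a first normalization pass might look like:

```python
# Illustrative address-normalization helper, loosely in the spirit of what
# an address information system must do before matching addresses.
# The suffix abbreviation table and rules here are hypothetical.

SUFFIXES = {"STREET": "ST", "AVENUE": "AVE", "BOULEVARD": "BLVD"}

def normalize_address(raw):
    """Uppercase, collapse whitespace, and abbreviate common street suffixes."""
    tokens = raw.upper().split()
    return " ".join(SUFFIXES.get(tok, tok) for tok in tokens)

print(normalize_address("1234  Market street"))  # 1234 MARKET ST
```

Real address matching involves far more (unit parsing, ranges, aliases), but canonicalization like this is the usual first step before comparing an input against a reference address set.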

We are looking for a candidate who is not only technically skilled but also passionate about leveraging data to improve City operations and community life. This team is fundamental to innovating the City services that directly affect residents. If you are eager to contribute to a team that values innovation, efficiency, service-oriented work, and data-driven decision-making, we encourage you to apply.

Essential Functions

    Work with business partners of varying technical ability to understand how their data is produced, stored, and updated, and design enhancements to optimize enterprise integrations and public data accessibility.

    Design and develop new data pipelines, and expand and refine existing ones, using Apache Airflow, custom Python-based software, and other tools to connect diverse systems of record, centralized databases, and SaaS environments, improving data platform components, data governance, master data management, data standardization, enrichment, and quality assurance.

    Play a key role in enterprise integration and open data initiatives, developing and supporting critical systems like the City’s DataBridge, Databridge-Airflow and AIS.

    Write clear, descriptive technical documentation for systems and applications, partner with data stewards of varying technical levels, and communicate technical details to non-technical stakeholders.

Competencies, Knowledge, Skills and Abilities

Competencies:

    Clear and concise writing and communication skills.

    Creative problem-solving and critical thinking.

    Ability to work with data from diverse domains.

    Efficient time management and the ability to manage multiple workflows simultaneously.

    Ability to seek innovative opportunities and continuous improvement.

Knowledge and Skills:

    Proficiency in Python.

    Strong SQL skills and experience with databases (Postgres preferred).

    Familiarity with Docker, Bash, and basic Linux server administration.

    Experience with cloud services (AWS preferred).

    Understanding of Git/GitHub for version control and CI/CD pipelines.

    Experience with, or the ability to learn, Airflow, DBT, Terraform, Kubernetes, and MuleSoft.

    Experience working with spatial datasets is a plus.

    Experience using command line interfaces.

Abilities:

    To reason about, model and manipulate complex datasets.

    To maintain clean and secure data environments.

    To work effectively in a hybrid (on-premises and cloud) environment.

    To communicate complex technical concepts in understandable terms.
