Vacancy expired!
The Data Consumption team within Data Security & Infrastructure (DSI) is seeking a highly motivated Data Engineer to start an IT career in GEICO's Data Operations division. Teaming up with architects, scrum masters, leads, managers, and directors, you will work in an Agile environment to make the data on our Enterprise Data Platform accessible across the organization. You will work alongside Data Architects and Analysts to build our next-generation data platform in Azure, trailblazing the application of software development techniques such as automated testing and CI/CD to building data products. You should be intellectually curious, have a solutions-oriented attitude, and enjoy learning new tools and techniques.

Basic Qualifications:
- 5+ years' experience with relational database systems and SQL, including designing and developing ETLs
- 3+ years' experience delivering software in an Azure environment
- Experience with Azure Data Products such as ADF, ADLS and Event Hub
- 3+ years' experience in at least one scripting language (Python, JavaScript, Shell)
- 3+ years' experience with Agile engineering practices and end-to-end automation of software delivery
- Experience with Data Modeling, source to target mapping, automated testing frameworks
- Developing new and enhancing existing data processing components (data ingest, data transformation, data store, data management, data quality)
- Strong working knowledge of SQL and the ability to write, debug, and optimize SQL queries
- Bachelor's degree in a computer related field or equivalent professional experience required
- Data engineering experience focused on batch and real-time data pipeline development using Spark, Python, or Java; data processing/transformation using ETL tools, ideally on the Azure Databricks platform (preferred)
- Experience with cloud data warehouse solutions (Snowflake, Azure DW, or Redshift)
- Exposure to Cloud and Distributed Data Storage (HDFS, S3, ADLS, Cassandra or other NoSQL storage systems)
- Experience with data integration technologies: Kafka, eventing/streaming, NiFi, Azure Data Factory
- Complete software development lifecycle experience including design, documentation, implementation, testing, and deployment
- Familiarity with Data Vault, Databricks, the dbt tool (Fishtown Analytics), and graph databases
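To give candidates a feel for the day-to-day work, the SQL, Python scripting, and automated-testing skills above come together in small extract-transform-load steps. The sketch below is illustrative only: the table name, columns, and cleansing rules are invented for this example, not taken from the posting, and it uses Python's built-in sqlite3 module as a stand-in for a real warehouse.

```python
import sqlite3

def transform(rows):
    """Normalize raw records: drop rows without an id, trim whitespace,
    upper-case state codes, and cast premiums to float (hypothetical rules)."""
    cleaned = []
    for r in rows:
        if not r.get("policy_id", "").strip():
            continue  # data-quality rule: a policy id is required
        cleaned.append({
            "policy_id": r["policy_id"].strip(),
            "state": r.get("state", "").strip().upper(),
            "premium": float(r.get("premium", 0)),
        })
    return cleaned

def load(conn, rows):
    """Load cleaned rows into a target table, idempotently (upsert by key)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS policies "
        "(policy_id TEXT PRIMARY KEY, state TEXT, premium REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO policies VALUES (:policy_id, :state, :premium)",
        rows,
    )

# Tiny extract stage: in practice this would read from ADLS, Event Hub, etc.
raw = [
    {"policy_id": " P-100 ", "state": "va", "premium": "120.50"},
    {"policy_id": "", "state": "md", "premium": "99.00"},  # dropped: no id
]
conn = sqlite3.connect(":memory:")
load(conn, transform(raw))
```

In a CI/CD pipeline, the `transform` function would be covered by automated tests asserting the cleansing rules before any deployment, which is the kind of end-to-end delivery automation the role calls for.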