Vacancy expired!
Responsibilities:
- Design, maintain, and own the data infrastructure
- Work with modelers to understand the business and its requirements; help determine the optimal data sets and structures to deliver on those requirements
- Act as a domain expert on the products over time
- Understand the data; set up and monitor QC/surveillance processes
- Implement and maintain a standard data/technology deployment workflow to ensure that all deliverables and enhancements are delivered in a disciplined and robust manner
Qualifications:
- Bachelor's or Master's degree in Computer Science or a related field
- 7-10 years of programming experience, with a minimum of 5 years of relevant experience
- Experience with Scala
- Experience with big data technologies such as Hadoop, Pig, Cassandra, and Spark
- Aptitude for designing and building tools for data due diligence and data extraction pipelines
- Knowledge of ETL, data curation, and analytical jobs using distributed computing frameworks such as Spark and Hadoop
- Experience working with large enterprise-wide data warehouses
- Java/Python knowledge is a plus
- DevOps and Cloud experience is a plus
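Given the emphasis on Scala and ETL work, the following is a minimal sketch of the extract/transform pattern such a role involves, in plain Scala with no Spark dependency. The `Trade` record, the CSV-like input format, and the aggregation are hypothetical illustrations, not part of the actual role description:

```scala
// Hypothetical typed record for the sketch.
case class Trade(symbol: String, qty: Int, price: Double)

object EtlSketch {
  // Extract: parse raw CSV-like lines into typed records,
  // silently skipping malformed rows (a common curation step).
  def extract(lines: Seq[String]): Seq[Trade] =
    lines.flatMap { line =>
      line.split(",").map(_.trim) match {
        case Array(s, q, p) =>
          (q.toIntOption, p.toDoubleOption) match {
            case (Some(qi), Some(pd)) => Some(Trade(s, qi, pd))
            case _                    => None
          }
        case _ => None
      }
    }

  // Transform: aggregate notional value (qty * price) per symbol.
  def transform(trades: Seq[Trade]): Map[String, Double] =
    trades.groupMapReduce(_.symbol)(t => t.qty * t.price)(_ + _)

  def main(args: Array[String]): Unit = {
    val raw = Seq("AAPL,10,150.0", "AAPL,5,152.0", "bad row", "MSFT,3,300.0")
    println(transform(extract(raw)))
  }
}
```

In a Spark deployment the same extract/transform shape would typically map onto `Dataset[Trade]` operations, with the malformed-row handling and aggregation distributed across the cluster.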