Vacancy expired!
- Five-plus years of experience with ETL and related data processing and transformation technologies for building large-scale data analytics solutions.
- Experience working independently with users to define concepts, information needs, functional requirements, specifications, and database models.
- Strong experience designing and developing data warehouses and ETL pipelines.
- Strong expertise in SQL/advanced SQL and relational database systems (MySQL).
- Strong expertise in Linux/UNIX and bash shell scripting.
- Strong experience processing large volumes of structured and unstructured data, including integrating data from multiple sources.
- Strong experience with SQL performance tuning.
- Proficiency in one or more programming languages (C, Python, Perl) a plus.
- Experience with data serialization and markup formats (JSON, XML, YAML).
- Experience with NoSQL and time-series databases (e.g., OneTick) a plus.
- Good understanding of cloud solutions (e.g., AWS) a plus.
- Familiarity with big data frameworks and Hadoop-based technologies (Spark, HDFS, Kafka, YARN, etc.) a plus.