Vacancy expired!
- 5+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT and reporting/analytic tools.
- 5+ years of work experience with very large data warehousing environments
- 3+ years of experience with data modeling concepts
- 3+ years of Python development experience
- 2+ years of experience in Big Data stack environments (EMR, Hadoop, Glue, Hive)
- Apache Airflow experience is a huge PLUS!
- Experience with Kafka, Flume, and the AWS tool stack such as Glue ETL, Redshift, and Kinesis is preferred.
- Experience in writing Spark ETL jobs
- Experience using software version control and CI tools (Git, Apache Subversion, Jenkins)
- Demonstrated strength in architecting data warehouse solutions and integrating technical components
- Good analytical skills with excellent knowledge of SQL.
- Excellent communication skills, both written and verbal
- Experience working with CDC solutions such as Oracle GoldenGate, Syncsort, and Attunity
- Java development experience is preferred
- Experience in gathering requirements and formulating business metrics for reporting.
- Experience building on AWS using S3, EC2, Redshift, DynamoDB, Lambda, QuickSight, etc.
- AWS certifications or other related professional technical certifications
- Experience with cloud or on-premise middleware and other enterprise integration technologies
- Bachelor's Degree
- Relevant Experience or Degree in: Computer Science, Management Information Systems, Business or related field
- Typically a minimum of 4+ years of relevant experience
- Four-year college degree with 4 or more years of professional experience, or a high school diploma with 6 or more years of professional experience, in full life cycle design and development, including IT architecture, banking industry experience, and understanding client requirements