Vacancy expired!
- Assist with writing analytics code, jobs, services and components in Java, Apache Spark, Python, Hive and related Big Data technologies
- Responsible for the design, development and operation of systems that store and manage large amounts of data in a data lake and Snowflake
- Responsible for building microservice components using REST and Spring/Spring Boot technologies
- Guide technical analysis and resolve technical issues during project delivery
- Work proactively, independently and with multi-functional teams to address project requirements, and raise issues/challenges with enough lead time to mitigate project delivery risks
- Gather and understand requirements; analyze and convert functional requirements into concrete technical tasks.
- Bachelor's degree required; Master's degree and/or equivalent experience preferred
- 10+ years of software development experience designing and implementing large, sophisticated, distributed, highly scalable and secure applications.
- 9+ years' experience in application development and system analysis; detailed understanding of middleware and message protocols
- 2+ years technical team leadership experience
- Experience with Apache Spark, Python, Hive and Hadoop
- Hands-on experience with Java.
- Demonstrable experience with Tomcat, EMR and AWS cloud services.
- Knowledge of Docker and OpenShift Container Platform is an advantage
- Hands-on DevOps experience using tools like Git, Bitbucket, CloudBees/Jenkins, Maven
- Solid experience in developing, refactoring and re-engineering applications using REST, Spring/Spring Boot, Hibernate and Angular
- Experience using PostgreSQL and Snowflake databases; SQL tuning experience is a huge plus
- Solid grasp of Linux/Unix operating systems and Shell/Perl scripting
- Experience with Groovy and Terraform is a plus
- Agile/Scrum methodology experience is required.