Vacancy expired!
- 5+ years of experience in software development.
- 3+ years of related industry experience in an enterprise environment.
- 5+ years of data engineering experience.
- Scala / Python and Spark / PySpark experience, including the AWS SDK for Python (Boto, Boto3, etc.).
- Delta Lake, Delta tables, and lakehouse architecture.
- Data warehouse experience (Snowflake).
- Experience with AWS Lambda, EMR, SQS, DynamoDB, Glue, Step Functions, etc.
- Linux and shell scripting.
- Kubernetes.
- Formal design patterns and industry best-practices.
- 2+ years of experience with requirements, design, implementation, integration, and testing for data and analytics integration.
- 2+ years of experience across a variety of technologies such as databases, directory services, application servers, network infrastructures, and Linux operating systems, with an understanding of fundamental security and data flows within these components.
- Excellent verbal and written communication skills.
- Self-motivated, driven, and creative individual.
- Scaling systems and microservices.
- Familiarity with CI/CD processes.
- Code coverage analysis / static analysis tools.
- Agile programming processes and methodologies such as Scrum.
- Scheduling tools such as AutoSys and Control-M.
- Informatica IICS, Talend.