Vacancy expired!
- Design, develop, and deploy infrastructure on which data is collected, aggregated, moved, cleansed, transformed, and analyzed
- Operationalize data insights from research into fault-tolerant, production-scale deployments
- Apply perseverance and imagination, both in writing new software and in deploying existing tools such as Airflow, Hadoop, Spark, Jenkins, Docker, Kubernetes, and Mesos
- Partner with data scientists and the product team to understand their needs and build a platform that empowers them
- Work closely with Product and Project Managers to understand features, perform technical assessments, and design, code, test, and deliver
- 5+ years of prior experience for the Senior Software Engineer role
- Coding experience in shipping complex software to production
- Command of Scala or Java, or interest in learning Scala
- Experience with Kafka, Spark, Hive, HDFS, and Airflow
- Command of data structures, algorithms, performance and scalability
- Understanding of fault-tolerant systems, network programming, multithreaded programming and security
- Experience with distributed systems and application design in a SOA environment
- Knowledge of high-scale performance and optimization tools and techniques
- Experience with AWS (configuring, deploying, managing) services and distributed applications
- BS in Computer Science, Engineering, or a related technical field, or equivalent experience
- Please note that hiring for this position will only be considered in the following states: AZ, CA, DC, FL, IL, MD, MN, NC, NJ, NV, NY, OR, PA, TN, TX, VA, WA