Job Details

ID #11390055
State Alabama
City Dothan
Job type Permanent
Salary USD Based on Experience
Source Jobot
Listed 2021-03-25
Date 2021-03-24
Deadline 2021-05-23
Category Architect/Engineer/CAD

Remote Data Engineer - Python, ETL, AWS, Graph

Dothan, Alabama 36301, USA

Vacancy expired!

Remote Data Engineer (Python / ETL / AWS) needed for a growing cloud based R&D Biotech company!

This Jobot Job is hosted by: Stanton Sikorski. Are you a fit? Easy Apply now by clicking the "Apply Now" button and sending us your resume.

A bit about us:

Founded over a decade ago and based on the East Coast, we are a team of life scientists and software engineers who believe the brightest minds in science should have access to the best tools for driving innovation. Our flagship product is a groundbreaking cloud-based R&D platform built specifically for how life scientists work. By providing researchers with the best digital tools and networking them with colleagues, we're empowering R&D teams to generate novel insights, reduce the time and money required to achieve key R&D milestones, and produce knowledge that can be monetized to drive the business forward.

Why join us?
  • Competitive base salary based on experience
  • Full benefits: Medical, Dental, Vision
  • 401(k) with generous company match
  • Generous vacation, sick, PTO, and holidays
  • Fully remote work
  • Bonus

Job Details
  • Build & operate automated ETL pipelines that process terabytes of text data nightly
  • Develop service frontends around our various backend datastores (AWS Aurora MySQL, Elasticsearch, S3)
  • Perform technical analyses and requirements specification with our product team on data service integrations
  • Help customers bring their data to the platform
You should know the following:

  • Python 3 or Java programming; experience with both is preferred
  • Day-to-day experience using AWS technologies such as Lambda, ECS Fargate, SQS, & SNS
  • Experience building and operating cloud-native data pipelines
  • Experience extracting, processing, storing, and querying petabyte-scale datasets
  • Familiarity with building and using containers
  • Familiarity with event-based microservices
Pluses:

  • Prior experience with Elasticsearch (custom development and/or administration)
  • Prior work with text and natural-language processing
  • Knowledge of Graph databases and Graph Theory

Interested in hearing more? Easy Apply now by clicking the "Apply Now" button.

