job summary:
Qualifications:
- Bachelor's degree or equivalent work experience
- 4+ years' experience in enterprise data management, Hadoop, Big Data, DevOps, and cloud-based systems (Azure, AWS)
- Proficiency in SQL
- 4+ years' experience implementing DevOps best practices (CI/CD, telemetry, test automation, infrastructure automation, etc.); see the illustrative sketch after this list
- A view of "everything as code"
- 4+ years' experience with Linux operating systems
- 4+ years of Java/JavaScript, Python, C#, or complementary languages
- 4+ years' experience with ETL tools; Pentaho experience a plus
- Healthcare data experience preferred, along with knowledge of HIPAA/HITECH compliance
- HITRUST Common Security Framework knowledge preferred
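
The CI/CD, test-automation, and SQL bullets above all describe routine checks expressed as code. Purely as an illustration (not part of the original posting), the sketch below shows a minimal automated SQL data-quality test of the kind a CI pipeline might run; the claims table, the member_id column, and the in-memory SQLite database are hypothetical placeholders.

```python
# Illustrative only: a tiny automated data-quality check of the kind a CI
# pipeline might run. The claims table, member_id column, and the in-memory
# SQLite database are hypothetical placeholders.
import sqlite3


def check_no_null_member_ids(conn: sqlite3.Connection) -> None:
    """Fail the build if any claims row is missing a member_id."""
    (null_count,) = conn.execute(
        "SELECT COUNT(*) FROM claims WHERE member_id IS NULL"
    ).fetchone()
    assert null_count == 0, f"{null_count} claims rows have a NULL member_id"


if __name__ == "__main__":
    # An in-memory database stands in for a real warehouse connection.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE claims (claim_id INTEGER, member_id TEXT)")
    conn.executemany("INSERT INTO claims VALUES (?, ?)", [(1, "M001"), (2, "M002")])
    check_no_null_member_ids(conn)
    print("data-quality check passed")
```
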
Responsibilities:
- Build high-performing and scalable data systems, applications, and pipelines to process very large amounts of data from multiple sources (see the sketch after this list)
- Collaborate on Big Data systems and features within an Agile environment
- Collaborate with cross-functional teams of developers, senior architects, product managers, DevOps, and project managers
- Drive continuous delivery initiatives to production for all systems
- Agile delivery: actively engage as part of the scrum team
- Deliver solutions free of significant security vulnerabilities
- Onboard clients in a progressively more efficient manner, delivering significant business impact to revenue
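
The pipeline responsibility above reduces to extract, transform, and load steps expressed as code. The sketch below is a rough illustration only, assuming invented source names, field names, and an in-memory stand-in for a warehouse; none of these come from the posting.

```python
# Illustrative only: a minimal extract-transform-load sketch combining records
# from two hypothetical sources. Source and field names are invented for the
# example and do not come from the posting.
from typing import Dict, Iterable, List

Record = Dict[str, str]


def extract(sources: Iterable[List[Record]]) -> List[Record]:
    """Flatten records pulled from multiple upstream sources."""
    return [record for source in sources for record in source]


def transform(records: List[Record]) -> List[Record]:
    """Normalize the shared join key so downstream processing is consistent."""
    return [{**r, "member_id": r["member_id"].strip().upper()} for r in records]


def load(records: List[Record], target: List[Record]) -> None:
    """Append transformed records to an in-memory stand-in for a warehouse table."""
    target.extend(records)


if __name__ == "__main__":
    clinical = [{"member_id": " m001 ", "source": "clinical"}]
    claims = [{"member_id": "M002", "source": "claims"}]
    warehouse: List[Record] = []
    load(transform(extract([clinical, claims])), warehouse)
    print(warehouse)  # two normalized records from two sources
```

In practice the load step would target a real warehouse or Big Data store rather than a Python list.
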
Requirements:
- Experience level: Experienced
- Minimum 5 years of experience
- Education: No Degree Required
Skills:
- SQL
- DevOps
- HIPAA
- Big Data
- Java/JavaScript
- Python