Job Duties:
1. Develop and Implement Advanced Analytics Systems by creating robust analytics solutions to simplify complex problems through straightforward, scalable frameworks that address both immediate and long-term business needs.
2. Conduct in-depth analyses on large datasets, uncovering growth trends and patterns to inform data-driven strategic decisions for key business initiatives.
3. Define Data Mapping and Model Specifications by providing comprehensive documentation for data transformations, ensuring consistency and accuracy through detailed source-to-target mappings and model specifications.
4. Generate Actionable Reports by producing high-quality reports, leveraging best practices in data mining, analysis, and visualization to support decision-making across departments.
5. Collaborate with Stakeholders for Requirement Gathering by building strong working relationships with managers and cross-functional teams, gathering project requirements, and aligning solutions to meet business objectives.
6. Team up with Project Managers on analytics needs to define and track KPIs, ensuring analytics outputs align with project goals and provide actionable insights.
7. Develop dynamic, user-friendly visualizations to clearly communicate data insights to stakeholders, enhancing their ability to make informed decisions.
8. Conduct proactive Data Analysis for Business Impact by independently analyzing datasets to address critical business questions, pinpoint operational gaps, and recommend process improvements.
9. Maintain databases and manage data from multiple sources, optimizing database scripts and configurations to enhance flexibility and scalability in data evaluation.
10. Compose Comprehensive Reports by compiling detailed reports summarizing the analysis process, outcomes, and key takeaways, with recommendations tailored to business objectives and stakeholders’ needs.
11. Work in partnership with various teams to understand their data needs, provide analytical solutions, and enable data-informed decisions across different departments.
12. Leverage analytical tools such as Excel, SQL, and Python, alongside visualization platforms like Tableau and Power BI, to perform complex analyses and automate repetitive tasks.
13. Adhere to data governance protocols and industry standards to ensure data handling meets legal and ethical requirements, protecting data integrity and privacy.
14. Design and Manage ETL Workflows by overseeing Extract, Transform, Load (ETL) processes that provide seamless data integration, scheduling, and error-checking, ensuring data flows efficiently across systems.
15. Utilize SQL optimization, indexing, and query tuning to maximize the speed and efficiency of data operations, facilitating quicker access to insights.
16. Leverage extensive expertise in AWS services, including S3, Lambda, and Glue, to develop and optimize cloud-based data pipelines for seamless data storage, processing, and analytics.
17. Apply advanced skills in scalable, cloud-native solutions to improve the efficiency and scalability of data processing and integration.
18. Apply high-level skills in SQL and Python to implement automated data quality checks, identify anomalies, and ensure data integrity.
19. Develop custom scripts for data validation and regular file verification.
20. Enhance reporting accuracy, ensure data reliability, and minimize manual intervention in critical business processes.
21. Utilize Python's advanced libraries, including Pandas and NumPy, to perform statistical modeling and predictive analytics on clinical trial data.
Travel Requirements:
Travel: The only travel required is relocation to within commuting distance of unanticipated client locations anywhere in the U.S. for long-term assignments.
Relocation: Must be able to relocate to unanticipated client locations anywhere in the U.S. for long-term assignments.
Education:
Bachelor’s degree (or its foreign equivalent) in Data Science, Information Technology, or a closely related field.
Experience:
36 months of experience in the job offered or a closely related occupation. Experience must include 24 months of experience in the following:
1. Utilizing AWS-based solutions (S3, Lambda, Glue) for data storage, processing, and analytics, including performing ETL processes to ensure seamless data transformation and integration across various sources, with S3 for scalable storage of raw and processed data.
2. Applying statistical modeling and advanced analytical methodologies in Python (using libraries such as Pandas and NumPy) to derive actionable insights and interpret complex data.
3. Migrating on-premise data to AWS with minimal downtime and developing serverless functions to automate data processing tasks, enabling real-time analytics and improving operational efficiency.
4. Demonstrating proficiency in Python (with libraries like Pandas and NumPy), AWS (S3, Lambda, Glue, RDS), SQL, ETL tools, Git, and cloud-based services for deploying analytics solutions and managing scalable data pipelines.
Work Schedule:
Number of hours per week: 40
Daily Work Schedule: Monday to Friday | 9:00 A.M. to 6:00 P.M.
Work Location: 263 N Jog Road, West Palm Beach, FL 33413
Mail / Send resume to:
Attn: Jesus M. Velarde - Director of Operations
263 N Jog Road,
West Palm Beach, FL 33413