Job Requirements for Cloud Data Developer/Engineer:
• Employment Type: Contractor
• Location: Woodcliff Lake, NJ (Onsite)
Cloud Data Developer/Engineer
Description:
The R&D group at Client is seeking a Cloud Data Developer / Engineer to work on a team building data lake(s), warehouse(s), pipelines, and machine learning models which will be utilized to drive drug development through predictive modeling of disease and drug response. The person in this role will also collaborate closely with biostatisticians in statistics methodology and machine learning to support projects in various stages of development in multiple business groups within Client.
The Cloud Data Developer / Engineer will help deliver actionable insights while building mission-critical data science projects for our business.
Technical Skills Required:
• Experience with AWS and the services needed to deliver big data and machine learning solutions
• Should be familiar with big data platforms like AWS EMR, AWS Glue, and Databricks
• Must be familiar with cataloging data in big data scenarios such as data lakes, using tools like Hive or the Glue Data Catalog
• Must be familiar with creating data pipelines for big data ingestion
• Experience with Python (with PySpark), Scala, R, or SQL (Scala is a nice-to-have but not required)
• Ability to work with imaging files like DICOMs, extract metadata from those images, and catalog the data (see the sketch after this list)
• Experience with data cleansing and processing SAS datasets
• Experience with sophisticated deep learning data science tools like Keras, PyTorch, and TensorFlow
• Advanced data analysis
• Data storytelling
• Visual analytics with tools like Tableau, Spotfire, or QuickSight
• Must have deep expertise in ETL, including writing output in columnar formats like ORC and Parquet
• SAS experience preferred, but not required
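
For illustration only, a minimal sketch (not part of the posting) of how several of the items above could fit together: extracting DICOM header metadata with pydicom, cataloging it with PySpark, and landing it as Parquet. The bucket, folder, database, and table names are placeholders, and it assumes Spark is configured to use the Glue Data Catalog (or another Hive-compatible metastore).

# Hypothetical sketch: pull a few DICOM header fields, catalog them, and write Parquet.
from pathlib import Path

import pydicom
from pyspark.sql import Row, SparkSession

spark = (
    SparkSession.builder
    .appName("dicom-metadata-catalog")
    .enableHiveSupport()  # resolves to the Glue Data Catalog when EMR/Glue is the metastore
    .getOrCreate()
)

def dicom_metadata(path: str) -> Row:
    """Read one DICOM file's header (no pixel data) and keep a few common fields."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    return Row(
        source_path=path,
        patient_id=str(ds.get("PatientID", "")),
        modality=str(ds.get("Modality", "")),
        study_date=str(ds.get("StudyDate", "")),
        series_description=str(ds.get("SeriesDescription", "")),
    )

# Placeholder staging folder of .dcm files.
rows = [dicom_metadata(str(p)) for p in Path("/data/staging/dicom").glob("*.dcm")]
df = spark.createDataFrame(rows)

# Columnar output plus a catalog entry so downstream jobs can find the data by name.
(
    df.write
    .mode("overwrite")
    .format("parquet")
    .option("path", "s3://example-bucket/lake/dicom_metadata/")  # placeholder bucket
    .saveAsTable("imaging_lake.dicom_metadata")  # placeholder database.table
)

In a real pipeline the file listing would typically come from S3 rather than a local path; the same saveAsTable call is what makes the dataset discoverable by name in the catalog.
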
Qualifications Required:
• Proven track record of developing, deploying, and supporting data analytic tools
• Experience developing front-end interfaces to statistical models with tools like R/Shiny or others (see the sketch at the end of this posting)
• Experience managing and coordinating with IT teams to maintain secure and compliant tools and applications
• Experience developing and deploying cloud-based tools or distributed computing environments using Spark
• Excellent communication and presentation skills required
• Experience in managing different workstreams and coordinating tasks with internal teams and outside consultants
• Years of experience: 2-10 years (2 years at minimum)
• Bachelor’s Degree required with Master’s Degree preferred or equivalent industry experience
***Can be remote
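
For illustration only, a minimal sketch of the kind of front-end interface to a statistical model mentioned in the qualifications. The posting names R/Shiny; this stand-in uses Python and Flask so the examples stay in one language, and the model file, route, and payload shape are placeholders.

# Hypothetical sketch: expose a fitted model's predict() behind a small web endpoint.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder: any pickled estimator exposing .predict(), e.g. scikit-learn or statsmodels.
with open("model.pkl", "rb") as fh:
    model = pickle.load(fh)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json()  # e.g. {"features": [[1.2, 3.4, 5.6]]}
    preds = model.predict(payload["features"])
    return jsonify({"predictions": [float(p) for p in preds]})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
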