Computational Pathology Scientist
Careers Integrated Resources Inc
South San Francisco, CA (Onsite)
Contractor
Job Title: Computational Pathology Scientist
Location: South San Francisco, CA
Duration: 12 months, W2 contract (with high possibility of extension)
Job Description:
- The Translational Safety Pathology team provides preclinical pathology risk assessments. Within this group, the Digital Pathology team focuses on revolutionizing the analysis of digital histopathology slides by leveraging computational methods to enhance pathological evaluations traditionally performed solely by humans.
- Our objective is to integrate cutting-edge digital and computational techniques into pathology workflows and develop computational tools to support pathologist-driven identification and interpretation of findings.
- We are seeking a talented image data scientist for a contract position within our Digital Pathology team.
- This role involves contributing to the development and application of image-processing methods and pipelines using both conventional techniques and advanced methods such as machine learning and deep learning.
- The successful candidate should be proficient with commercially available image analysis software and able to perform basic statistical analyses and data visualizations.
- Ideally, the candidate will also contribute to the development and implementation of new AI-powered image analysis algorithms and should have programming expertise, particularly in Python.
- The role requires close collaboration with pathologists to design and execute image analysis workflows tailored to biological questions, as well as working with computational and data scientists across various departments.
- Strong interpersonal and communication skills, as well as a passion for interdisciplinary collaboration, are essential.
- Strong Programming Foundation: Demonstrated proficiency in Python and its scientific computing ecosystem, including libraries like NumPy, Pandas, Scikit-learn, and OpenCV.
- Version Control: Proficiency with version control systems, particularly Git, and experience with collaborative platforms like GitHub or GitLab.
- Computer Vision & Image Analysis: Solid experience in both classical and modern image analysis techniques. This includes traditional image processing and applying machine learning for tasks like image classification and semantic segmentation.
- Whole-Slide Image (WSI) Handling: Hands-on experience processing and analyzing gigapixel whole-slide images, using libraries such as OpenSlide or similar tools.
- Collaborative Mindset: A strong aptitude for iterative design and a proactive approach to receiving and incorporating frequent feedback from cross-disciplinary teams.
- Communication Skills: Excellent interpersonal and communication skills, with a proven ability to explain complex computational concepts to pathologists and biologists.
- Advanced Deep Learning: Deep expertise in developing and implementing advanced deep learning models for digital pathology, including for tasks like instance segmentation. High proficiency with at least one major framework such as PyTorch (experience with object detection libraries like Detectron2 is a plus), TensorFlow, or Keras.
- High-Performance Computing (HPC): Experience using HPC environments and familiarity with job schedulers, specifically SLURM, for training models on large datasets.
- Commercial Pathology Software: Practical experience with commercial digital pathology platforms (e.g., HALO, Visiopharm, or QuPath).
- Workflow Orchestration: Experience building and managing data pipelines with workflow orchestration tools such as Dagster or Airflow.
- Application Development: Experience building simple graphical user interfaces (GUIs) for research tools using Python frameworks like Tkinter or PyQt.
- Cloud Computing: Familiarity with cloud computing services for model training and deployment, particularly Amazon Web Services (AWS EC2).
- MS- or PhD-level scientist.
- Minimum years of experience: 2