Junior Geospatial Data Scientist (Pipeline & Algorithm Focus)

Matter Intelligence

Data Science
San Francisco, CA, USA
Posted on Feb 12, 2026

Location: San Francisco
Employment Type: Full time
Location Type: On-site
Department: Software Engineering

About Matter Intelligence

Welcome to Matter, where we are building the future of vision AI: pairing a world-first sensor that sees molecular chemistry, temperature, and 3D shape with a Large World Model that will be the most powerful intelligence engine for our physical world. This system doesn't just see what something looks like; it understands everything from a single pixel. We call this Superintelligent Vision.

You'll join a team that has delivered technologies to Mars for NASA/JPL, co-founded and led infrastructure for OpenAI, designed cutting-edge sensors for U.S. Defense, and invented core algorithms for spectral and 3D imaging. We've come together to build the next infrastructure for vision and intelligence in the physical world.

About the Role

We are looking for a Junior Geospatial Data Scientist to build and maintain the processing pipelines and algorithms that turn our satellite's raw hyperspectral imagery into science-grade data products. This is not a pure research role—you will write production code that runs at scale, and you'll need the math and physics intuition to know why each step in the pipeline works, not just how.

You'll work alongside senior remote sensing scientists and software engineers to implement radiometric corrections, atmospheric compensation, spectral analysis algorithms, and geospatial data transformations. If you're the kind of person who can derive a reflectance equation on a whiteboard and then implement it as a clean, tested Python module—this role is for you.
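
To make that concrete, here is a minimal, illustrative sketch of the kind of module this describes: converting at-sensor radiance to top-of-atmosphere reflectance using the standard relation rho = pi * L * d^2 / (E_sun * cos(theta_s)). The function names, units, and test values are hypothetical, not Matter's actual pipeline code.

```python
import numpy as np

def toa_reflectance(radiance, solar_irradiance, earth_sun_distance_au, solar_zenith_deg):
    """Convert at-sensor spectral radiance to top-of-atmosphere reflectance.

    rho = pi * L * d^2 / (E_sun * cos(theta_s)), with L in W m^-2 sr^-1 um^-1,
    E_sun the exo-atmospheric solar irradiance for the band (W m^-2 um^-1),
    d the Earth-Sun distance in AU, and theta_s the solar zenith angle.
    """
    radiance = np.asarray(radiance, dtype=np.float64)
    cos_theta = np.cos(np.deg2rad(solar_zenith_deg))
    return (np.pi * radiance * earth_sun_distance_au ** 2) / (solar_irradiance * cos_theta)

def test_toa_reflectance_single_pixel():
    # Sanity check against a hand computation: pi * 100 / 2000 ~= 0.157
    rho = toa_reflectance(radiance=100.0, solar_irradiance=2000.0,
                          earth_sun_distance_au=1.0, solar_zenith_deg=0.0)
    assert np.isclose(rho, np.pi * 100.0 / 2000.0)
```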

Key Responsibilities

Pipeline Development

  • Build, test, and maintain scalable data processing pipelines for satellite imagery—from raw sensor data through calibrated, orthorectified, analysis-ready products.

  • Write modular, well-documented Python code that runs reliably in cloud environments (AWS), not one-off notebooks.

  • Implement radiometric calibration, atmospheric correction, geometric orthorectification, and spectral resampling stages within automated workflows (a minimal stage-based sketch appears below).
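
As a purely illustrative sketch of the modular, stage-based structure described above (stage names, signatures, and data shapes are hypothetical, not Matter's codebase):

```python
from typing import Callable

import numpy as np

# A pipeline stage is any function mapping one image cube (bands, rows, cols)
# to another; a real stage would also carry metadata and provenance.
Stage = Callable[[np.ndarray], np.ndarray]

def run_pipeline(cube: np.ndarray, stages: list[Stage]) -> np.ndarray:
    """Apply each stage in order, keeping stages small, testable, and reorderable."""
    for stage in stages:
        cube = stage(cube)
    return cube

def to_float32(cube: np.ndarray) -> np.ndarray:
    return cube.astype(np.float32)

def dark_offset_correction(cube: np.ndarray) -> np.ndarray:
    # Hypothetical radiometric step: subtract a per-band dark level.
    return cube - cube.min(axis=(1, 2), keepdims=True)

if __name__ == "__main__":
    raw = np.random.randint(0, 4096, size=(10, 64, 64))  # fake 10-band frame
    product = run_pipeline(raw, [to_float32, dark_offset_correction])
    print(product.shape, product.dtype)
```

Each stage stays independently unit-testable, which is part of what separates a production pipeline from a one-off notebook.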

Algorithm Implementation

  • Implement and optimize spectral analysis algorithms including classification, unmixing, regression, and retrieval methods for hyperspectral data.

  • Translate mathematical and physical models (e.g., radiative transfer, Beer-Lambert law, spectral mixture analysis) into performant, validated code (one such model, linear spectral unmixing, is sketched after this list).

  • Benchmark algorithm accuracy against ground truth and reference datasets.
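
For instance, the spectral mixture analysis mentioned above is often posed as a constrained least-squares problem. Below is a minimal sketch using non-negative least squares with made-up endmember spectra; this is one textbook formulation, not necessarily the one used in our pipeline.

```python
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(pixel_spectrum, endmembers):
    """Estimate non-negative abundances a such that endmembers @ a ~= pixel_spectrum.

    endmembers: (n_bands, n_endmembers) matrix of pure-material spectra.
    Abundances are renormalized to sum to one; fully constrained variants
    enforce the sum-to-one condition inside the solver instead.
    """
    abundances, _residual = nnls(endmembers, pixel_spectrum)
    total = abundances.sum()
    return abundances / total if total > 0 else abundances

# Toy check: a pixel mixed from 70% "soil" and 30% "vegetation".
soil = np.array([0.30, 0.35, 0.40, 0.45])
vegetation = np.array([0.05, 0.08, 0.45, 0.50])
endmembers = np.column_stack([soil, vegetation])
pixel = 0.7 * soil + 0.3 * vegetation
print(unmix_pixel(pixel, endmembers))  # approximately [0.7, 0.3]
```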

Data Engineering

  • Manipulate complex raster data at scale using GDAL, Rasterio, Xarray, and related geospatial libraries.

  • Work with high-dimensional spectral cubes—understanding data structures, coordinate systems, and metadata conventions (e.g., ENVI, GeoTIFF, NetCDF/HDF).

  • Optimize data I/O, memory management, and compute for large imagery datasets (see the windowed-read sketch below).
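
As one illustration of the I/O and memory concerns above, block-windowed reads with Rasterio keep memory bounded when a full cube will not fit in RAM (the file path and the statistic computed are placeholders):

```python
import numpy as np
import rasterio

def per_band_means(path):
    """Compute per-band means of a large raster without loading it whole,
    by iterating over the file's internal block windows."""
    with rasterio.open(path) as src:
        sums = np.zeros(src.count, dtype=np.float64)
        pixels = 0
        for _, window in src.block_windows(1):  # block grid of band 1
            block = src.read(window=window).astype(np.float64)  # (bands, rows, cols)
            sums += block.sum(axis=(1, 2))
            pixels += block.shape[1] * block.shape[2]
    return sums / pixels

# means = per_band_means("scene.tif")  # hypothetical file
```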

Scientific Rigor

  • Apply your understanding of remote sensing physics (reflectance, radiance, atmospheric effects, sensor response functions) to ensure every pipeline stage is scientifically sound.

  • Participate in calibration/validation efforts, comparing algorithm outputs against known references (a simple spectral-angle check is sketched after this list).

  • Flag and investigate anomalies—understanding when results look wrong and diagnosing whether the issue is data, code, or physics.
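
One common check in this kind of cal/val work is the spectral angle between a retrieved spectrum and a reference spectrum. Here is a small, generic sketch; the threshold and spectra are placeholders, not a Matter spec.

```python
import numpy as np

def spectral_angle(retrieved, reference):
    """Spectral angle in radians between two spectra: 0 means identical shape,
    and the metric is insensitive to overall brightness scaling."""
    retrieved = np.asarray(retrieved, dtype=np.float64)
    reference = np.asarray(reference, dtype=np.float64)
    cos_theta = np.dot(retrieved, reference) / (
        np.linalg.norm(retrieved) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Flag a pixel whose retrieved spectrum drifts too far from a field reference.
if spectral_angle([0.10, 0.21, 0.40], [0.11, 0.20, 0.41]) > 0.05:
    print("investigate: retrieved spectrum diverges from reference")
```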

Qualifications

Required

  • B.S. or M.S. in Applied Mathematics, Statistics, Physics, or a closely related quantitative field.

  • Hands-on experience with hyperspectral or multispectral remote sensing data (spectral cubes, high-dimensional imagery).

  • Strong Python proficiency with an engineering mindset—you write modular, testable, version-controlled code, not just scripts.

  • Understanding of the math and physics behind remote sensing: reflectance, atmospheric correction, radiative transfer basics, spectral analysis.

  • Experience with geospatial data tools: GDAL, Rasterio, Xarray, NumPy, SciPy.

  • Comfort working with raster data formats and coordinate reference systems.

Preferred

  • Experience building data pipelines in cloud environments (AWS S3, EC2, Lambda, Batch).

  • Familiarity with atmospheric correction models (MODTRAN, 6S, FLAASH) or radiative transfer concepts.

  • Exposure to ML frameworks (PyTorch, TensorFlow, scikit-learn) applied to geospatial data.

  • Experience with software engineering practices: CI/CD, unit testing, code review, Git workflows.

  • Coursework or research experience in remote sensing, imaging spectroscopy, or Earth observation.

What Success Looks Like

  • Pipeline stages you build run reliably in production with clear documentation and test coverage.

  • You can explain the physical basis for each algorithm you implement—not just call a library function.

  • You identify and fix data quality issues before they propagate downstream.

  • Senior scientists trust your implementations because the code is clean and the math is right.

Location

This role is based in San Francisco, CA, with onsite presence required and the ability to travel to the San Francisco Bay Area or El Segundo offices as needed.

ITAR Requirements

To comply with U.S. export regulations, applicants must be one of the following:

  • A U.S. citizen or national

  • A lawful permanent resident (green card holder)

  • Eligible to obtain required authorizations from the U.S. Department of State

Employee Offerings & Benefits

At Matter, we believe in rewarding high performance and providing the support you need to thrive:

  • Compensation: Competitive total package based on experience.

  • Equity: Early-stage equity package so you share directly in Matter's growth and success.

  • Health & Wellness: 100% employer-paid health, dental, and vision coverage.

  • Growth: Opportunities to develop deep technical expertise and grow into senior scientist or engineering roles as we scale.