Forager

Senior Full Stack Engineer (Python/React)

Location: Remote, MX
Workplace: Remote
Employment: Full time
Language: English
Posted: May 11, 2026
Last verified: May 12, 2026

JobGrid listing details

JobGrid.eu keeps the employer description in its original language and adds clear listing facts, freshness, and source context so candidates can evaluate the role before applying.

Key details: 1 location, Remote, Full time
Current openings: 6 active jobs
Original language: English
Source and freshness: Collected from public career pages and reviewed through JobGrid.eu source availability checks. Last verified: May 12, 2026.
Apply path: JobGrid.eu sends candidates to the original application page and adds non-personal referral parameters.

At Forager.ai, we deliver premier workforce data encompassing people, contacts, organizations, jobs, and intent signals. Renowned for providing the most up-to-date and accurate lead generation data on the market, our solutions empower cutting-edge recruiting and sales platforms, AI-driven models, custom audience creation, and much more. With seamless delivery through APIs, data feeds, and CRM integrations, Forager.ai ensures our customers access the data they need, when and how they need it.

Why Join Us Now

There’s never been a better time to join Forager.ai. We’re experiencing rapid growth, driven by the increasing demand for our high-quality data solutions. To keep pace, we’re enhancing the way we deliver value to our customers by developing new features, integrations, and scalable infrastructure. We’re seeking a Senior Full Stack Engineer to help drive the evolution of our platform—building robust web applications, APIs, and data systems that power our products. Be part of an exciting phase of innovation and help launch groundbreaking data products into their next chapter of success.

This role is a global opportunity that requires a 4-hour overlap with US Mountain Time.

What You'll Build

You will build and operate the systems that deliver Forager.ai's people and organization data to platform customers at scale — customer-facing apps, large-scale data pipelines, and search infrastructure. These are the surfaces we win competitive bakeoffs on and retain customers through, and the foundation of our "Data Quality Championship Belt."

  • Real-time enrichment APIs — person/org lookup, contact data, reverse search for waterfall platforms. Match rate, latency, and freshness drive renewals.
  • Bulk data feed delivery — maintain the Snowflake service delivering billions of data points daily to Data Feed customers.
  • Elasticsearch search infrastructure — indexing, query design, relevance tuning, and cluster scaling for person/company search and filtering APIs.
  • ETL pipelines — workers, task queues, and transformations moving data into APIs and feed exports.
  • Customer-facing web app and developer experience — React/TypeScript app, docs, onboarding flows, and self-serve surfaces.
  • Compliance and observability — data-sourcing proofs, GDPR/PII handling, and scoreboard metrics for each Data Quality dimension.

Core Responsibilities

Product & Application Development

  • Build and maintain Forager's customer-facing web app (React, TypeScript, Django/Python).
  • Implement and maintain RESTful APIs for integrations, feeds, and platform customer workflows.
  • Develop scalable backend services — workers, task queues, data pipelines — that keep refresh cycles predictable and fill rates high.
  • Participate actively in product planning; help shape which features have the highest customer impact.

Search, Data Layer & ETL

  • Build and operate Elasticsearch indices for people/company search — schema, ingestion, relevance, scaling.
  • Design and operate ETL applications moving data into searchable stores, feeds, and warehouses (Snowflake, S3).
  • Optimize PostgreSQL — query performance, indexing, cache utilization.
  • Drive measurable improvements in latency, uptime, error rate, and scalability.

DevOps & Infrastructure

  • Own day-to-day AWS infrastructure (ECS, S3, etc.) alongside DevOps.
  • Operate CI/CD, observability (Grafana, CloudWatch, Sentry), and on-call response for the surfaces you build.
  • Share crawler infrastructure maintenance with the team.

Collaboration & Quality

  • Code review with high standards for readability, security, and performance.
  • Write unit, integration, and E2E tests — test reliability is a quality contributor, not overhead.
  • Document features, architecture, and API contracts; great developer docs are how our customers succeed.
