Software Engineer - Data Platform (India - Remote)
Join Granica's core engineering team to design and scale the systems powering data workflows, automation, and analytics. This is a deep engineering role, not a feature-delivery role.
WHAT YOU'LL DO
- Build backend APIs and scalable data pipelines (Python, PySpark).
- Work with modern data lakehouse/warehouse tech (Iceberg, Delta Lake, Snowflake, Databricks).
- Orchestrate workflows (Airflow) and optimize big data frameworks.
- Manage infra as code (Terraform) and ensure reliability with monitoring/logging.
- Collaborate across teams and with customers to solve complex data challenges and design seamless integration solutions.
- Drive best practices in scalability, reliability, and cost efficiency.
WHAT WE'RE LOOKING FOR
- 5+ years in software/data engineering or infrastructure roles.
- Strong Python skills (backend APIs a plus).
- Proven ability to build scalable data pipelines from scratch.
- Hands-on experience with Apache Iceberg/Delta Lake plus Snowflake/Databricks.
- Workflow orchestration expertise (Airflow, Luigi, etc.).
- Experience with big data frameworks (Spark, Hadoop).
- Familiarity with monitoring/analytics tools (Prometheus, Grafana, ELK, Datadog).
- Skill in designing scalable, reliable, cost-efficient systems.
- Experience with large-scale distributed data architectures.
- Ability to thrive in fast-paced startup environments.
- Excellent problem-solving, communication, and customer-facing skills.
NICE-TO-HAVES:
- Hands-on experience with Terraform or other infrastructure-as-code tools.
- Familiarity with security and privacy best practices in data processing pipelines.
- Exposure to cloud platforms (AWS, GCP, Azure) and containerisation (Docker, Kubernetes).
COMPENSATION & BENEFITS
- Competitive salary, meaningful equity, and a substantial bonus for top performers.
- Flexible time off plus comprehensive health coverage for you and your family.
- Support for research, publication, and deep technical exploration.
At Granica, you will shape the funda...