At Enable, we are transforming the supply chain with our cutting-edge rebate management software. We see rebates as a strategic advantage, strengthening partnerships, driving smarter decisions, and unlocking significant value across the entire supply chain – from manufacturers to consumers.
After securing $276M in Series A-D funding, we are positioned for continued, significant growth. Since the launch of our flagship product in 2016, we have rapidly scaled our client base, expanded our product offerings, and built a team of top-tier talent committed to reshaping the industry.
Want a glimpse into life at Enable? Visit our Life at Enable page to learn how you can be part of our journey.
Job Summary
You’ll work as a senior voice within the data platform team to build and evolve the core tools, infrastructure, and processes that empower domain teams in our data mesh ecosystem to develop and maintain data products. You’ll ensure our data solutions (including Kubernetes-based deployments, Snowflake application development, and event-driven architectures) are reusable, standardized, and support self-service for domain teams. You will contribute to the technical design, implementation, testing, deployment, and ongoing support and maintenance of our data platform on Snowflake and Azure. We go beyond simply implementing new features: we focus on customer experience and on building high-quality, secure, and highly scalable software. You’ll use your full range of skills and further develop them, and those of your colleagues, including:
* Problem-solving: the ability and confidence to tackle complex data and platform challenges.
* Peer code reviews to maintain quality, reliability, and security.
* Modern big data architecture design, encompassing data orchestration and choreography.
* Ability to prioritize and meet deadlines in a dynamic environment.
* Attention to detail and solid written and verbal English communication skills.
* Willingness and enthusiasm to work within existing processes and methodologies, while driving improvements where needed.
We want all our people to be whoever they want to be and are committed to creating a truly inclusive culture at Enable. We believe that bringing your full, authentic self to work helps us build the best quality software, and by creating a truly diverse workforce we bring innovation into everything we do.
This is a senior technical role focused on the development of our SaaS products—suited to a highly focused, ownership-driven engineer. You will regularly leverage Python, Kubernetes, and Snowflake for both data and application development. Development is a part of the role, but you’ll also be expected to contribute to all areas of our engineering work, including product and feature design, leading and mentoring peers, and helping us to continually improve.
You’ll have focused professional experience as a data engineer, preferably in cloud-based SaaS products. Ideally, you’ll have at least five years of experience, but we focus on skill and ability, not tenure.
Duties and Responsibilities - Architecture Design
- Plan, design, and evolve data platform solutions within a Data Mesh architecture, ensuring decentralized data ownership and scalable, domain-oriented data pipelines.
- Apply Domain-Driven Design (DDD) principles to model data, services, and pipelines around business domains, promoting clear boundaries and alignment with domain-specific requirements.
- Collaborate with stakeholders to translate business needs into robust, sustainable data architecture patterns.
Duties and Responsibilities - Software Development & DevOps
- Develop and maintain production-level applications primarily using Python (pandas, PySpark, Snowpark), with the option to leverage other languages (e.g., C#) as needed.
- Implement and optimize DevOps workflows, including Git/GitHub, CI/CD pipelines, and infrastructure-as-code (Terraform), to streamline development and delivery processes.
- Containerize and deploy data and application workloads on Kubernetes, leveraging KEDA for event-driven autoscaling to ensure reliability, efficiency, and high availability.
Duties and Responsibilities - Big Data Processing
- Handle enterprise-scale data pipelines and transformations, with a strong focus on Snowflake or comparable technologies such as Databricks or BigQuery.
- Optimize data ingestion, storage, and processing performance to ensure high-throughput and fault-tolerant systems.
Duties and Responsibilities - Data Stores
- Manage and optimize SQL/NoSQL databases, Blob storage, Delta Lake, and other large-scale data store solutions.
- Evaluate, recommend, and implement the most appropriate storage technologies based on performance, cost, and scalability requirements.
Duties and Responsibilities - Data Orchestration & Event-Driven Architecture
- Build and orchestrate data pipelines across multiple technologies (e.g., dbt, Spark), employing tools like Airflow, Prefect, or Azure Data Factory for macro-level scheduling and dependency management.
- Design and integrate event-driven architectures (e.g., Kafka, RabbitMQ) to enable real-time and asynchronous data processing across the enterprise.
- Leverage Kubernetes & KEDA to orchestrate containerized jobs in response to events, ensuring scalable, automated operations for data processing tasks.
Duties and Responsibilities - Scrum Methodologies
- Participate fully in Scrum ceremonies, leveraging tools like JIRA and Confluence to track progress and collaborate with the team.
- Provide input on sprint planning, refinement, and retrospectives to continuously improve team efficiency and product quality.
Duties and Responsibilities - Cloud
- Deploy and monitor data solutions in Azure, leveraging its native services for data and analytics.
Duties and Responsibilities - Collaboration & Communication
- Foster a team-oriented environment by mentoring peers, offering constructive code reviews, and sharing knowledge across the organization.
- Communicate proactively with technical and non-technical stakeholders, ensuring transparency around progress, risks, and opportunities.
- Take ownership of deliverables, driving tasks to completion and proactively suggesting improvements to existing processes.
Duties and Responsibilities - Problem Solving
- Analyze complex data challenges, propose innovative solutions, and drive them through implementation.
- Maintain high-quality standards in coding, documentation, and testing to minimize defects and maintain reliability.
- Exhibit resilience under pressure by troubleshooting critical issues and delivering results within tight deadlines.
Required Education and Experience
- Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field (or equivalent professional experience).
- Proven experience with Snowflake (native Snowflake application development is essential).
- Proficiency in Python for data engineering tasks and application development.
- Experience deploying and managing containerized applications using Kubernetes (preferably on Azure Kubernetes Service).
- Understanding of event-driven architectures and hands-on experience with event buses (e.g., Kafka, RabbitMQ).
- Familiarity with data orchestration and choreography concepts, including the use of scheduling/orchestration tools (e.g., Airflow, Prefect) and eventual consistency and distributed systems patterns that avoid centralized orchestration at the platform level.
- Hands-on experience with cloud platforms (Azure preferred) for building and operating data pipelines.
- Solid knowledge of SQL and database fundamentals.
- Strong ability to work in a collaborative environment, including cross-functional teams in DevOps, software engineering, and analytics.
Preferred Education and Experience
- Master’s degree in a relevant technical field.
- Certifications in Azure, Snowflake, or Databricks (e.g., Microsoft Certified: Azure Data Engineer, SnowPro, Databricks Certified Data Engineer).
- Experience implementing CI/CD pipelines for data-related projects.
- Working knowledge of infrastructure-as-code tools (e.g., Terraform, ARM templates).
- Exposure to real-time data processing frameworks (e.g., Spark Streaming, Flink).
- Familiarity with data governance and security best practices (e.g., RBAC, data masking, encryption).
- Demonstrated leadership in data engineering best practices or architecture-level design.
Supervisory Responsibilities
- This position may lead project-based teams or mentor junior data engineers, but typically does not include direct, ongoing management of staff.
- Collaboration with stakeholders (Data Architects, DevOps engineers, Data Product Managers) to set technical direction and ensure high-quality deliverables.
Total Rewards
At Enable, we’re committed to helping all Enablees grow. During the interview process, we assess your level based on experience, expertise, and role scope, aligning it with our compensation bands. Starting pay is determined by factors like location, skills, experience, market conditions, and internal parity.
Salary/TCC is just one component of Enable’s total rewards package. Enable is committed to investing in the holistic health and wellbeing of all Enablees and their families. Our benefits and perks include, but are not limited to:
Paid Time Off: Take the time you need to relax and recharge
Wellness Benefit: Quarterly incentive dedicated to improving your health and well-being
Comprehensive Insurance: Health and life coverage for you and your family
Retirement Plan: Build your future with our retirement savings plan
Lucrative Bonus Plan: Enjoy a rewarding bonus structure subject to company or individual performance
Equity Program: Benefit from our equity program with additional options tied to tenure and performance
Career Growth: Explore new opportunities with our internal mobility program
Additional Perks:
According to LinkedIn's Gender Insights Report, women apply for 20% fewer jobs than men, despite similar job search behaviors. At Enable, we’re committed to closing this gap by encouraging women and underrepresented groups to apply, even if they don’t meet all qualifications.
Enable is an equal opportunity employer, fostering an inclusive, accessible workplace that values diversity. We provide fair, discrimination-free employment, ensuring a harassment-free environment with equitable treatment.
We welcome applications from all backgrounds. If you need reasonable adjustments during recruitment or in the role, please let us know.