Kroll

Data Engineer I

Posted Yesterday
Remote
Hiring Remotely in Canada
Mid level

Kroll’s Private Capital Markets (PCM) platform is transforming private asset valuation and portfolio workflows for alternative asset managers. We’re seeking a Data Engineer to design and implement secure, scalable data solutions across the PCM platform on Azure.

You will collaborate closely with Product and Implementation teams to deliver client-ready analytics, robust APIs, and high-performance data pipelines that power financial workflows spanning private equity, fixed income, derivatives, and structured products.

You’ll also help establish engineering standards and communities of practice across a global team of data professionals and developers.

This is a hybrid role, requiring 2–3 days of on-site presence each week.

Day-to-Day Responsibilities

  • Data Pipeline Construction: Design, build, and maintain reliable data pipelines to move, transform, and integrate data from diverse sources into data warehouses or lakes.
  • ETL and Data Integration: Develop and optimize ETL/ELT processes using tools such as Azure Data Factory, Databricks, Synapse, DBT, Airflow, or Informatica.
  • Data Warehousing: Model and manage data warehouses to ensure efficient querying, high performance, and data quality using platforms like Azure Synapse, Snowflake, Redshift, or BigQuery.
  • Data Quality & Monitoring: Implement validation, cleaning, and monitoring processes to ensure data accuracy, consistency, and reliability.
  • Data Security: Apply robust data governance practices, manage access permissions, and ensure compliance with privacy regulations.
  • Performance & Scalability: Optimize systems to handle growing data volumes and support evolving business needs.
  • Leadership & Mentorship: Lead and mentor cross-functional teams, driving adoption of modern data technologies and best practices.
  • Innovation: Spearhead greenfield initiatives that align with strategic business objectives, including innovation to support revenue growth and market expansion.
  • Platform Ownership: Own key functional areas of the PCM platform to ensure operational efficiency, reliability, and peak performance.
  • Collaboration: Promote collaboration and excellence by participating in architectural reviews, defining technical standards, and contributing to a culture of continuous improvement.

Essential Traits

  • Technical Expertise
    • Proven experience building ETL/ELT pipelines using Azure, AWS, or Databricks platforms.
    • Strong proficiency in SQL (T-SQL, PL/pgSQL, Spark-SQL) for data transformation and optimization.
    • Skilled in Python, C#/.NET, or Java for data engineering and backend services.
    • Hands-on experience with REST API development, Python SDKs, and containerization tools such as Docker and Kubernetes.
    • Working knowledge of CI/CD pipelines, Git, and Azure DevOps.
  • Data Systems & Architecture
    • Experience with Microsoft SQL Server, PostgreSQL, and cloud-native databases.
    • Understanding of data warehousing, dimensional modeling, and data lake architectures.
    • Hands-on experience with data pipeline orchestration tools like Airflow, Ascend, or Azure Synapse.
    • Exposure to data quality frameworks and monitoring best practices.
  • Collaboration & Delivery
    • Partner effectively with Product Owners and end users in an agile environment.
    • Participate in code reviews, technical design sessions, and architecture discussions.
    • Demonstrated ability to manage multiple priorities, solve complex problems, and deliver scalable solutions.
  • Master’s degree in Computer Science, Data Science, Mathematics, Statistics, or a related field.
  • Minimum 3 years of hands-on data engineering experience, ideally within financial services.
  • Relevant Cloud (Azure/AWS) or Data Engineering certifications preferred.
  • Ability to handle confidential and sensitive information with discretion.

About Kroll

Join the global leader in risk and financial advisory solutions—Kroll. With a nearly century-long legacy, we blend trusted expertise with cutting-edge technology to navigate and redefine industry complexities. As a part of One Team, One Kroll, you'll contribute to a collaborative and empowering environment, propelling your career to new heights. Ready to build, protect, restore and maximize our clients’ value? Your journey begins with Kroll.

We are proud to be an equal opportunity employer and will consider all qualified applicants regardless of gender, gender identity, race, religion, color, nationality, ethnic origin, sexual orientation, marital status, veteran status, age or disability.

In order to be considered for a position, you must formally apply via careers.kroll.com.


Top Skills

Airflow
Azure
Azure Data Factory
Azure DevOps
C#/.NET
Databricks
dbt
Docker
Git
Informatica
Java
Kubernetes
Microsoft SQL Server
PostgreSQL
Python
REST API
SQL
Synapse


