Egor Makhov
Principal Data Engineer
Transforming data challenges into business advantages. Specializing in scalable architectures and high-performance data solutions.
Expert Data Engineering Solutions
Delivering transformative data capabilities that drive business value and innovation
Data Architecture Design
Design scalable, future-proof data architectures that align with business objectives and support analytics, ML, and operational workloads.
Engineering Leadership
Build and lead high-performing data teams with effective hiring, mentoring, and agile delivery practices to meet critical business objectives.
Performance Optimization
Transform underperforming data systems into high-efficiency platforms with reduced costs and improved reliability through expert tuning and modernization.
Professional Track Record
Proven expertise across industries and technologies
Principal Data Engineer
Raiffeisen Bank Russia
2021 – Present
- Built low-latency data marts across all CIB banking products, enhancing data accessibility and performance.
- Created a fast, scalable, and user-friendly computing platform for data analytics and data operations.
- Developed a universal data quality framework for big data processing, adopted across the entire bank.
Data Engineering Consultant
Oliver James Associates
2023
- Migrated data pipelines of a major European stock exchange to a new market protocol, ensuring seamless transitions and compliance with updated standards.
- Optimized existing architecture to improve performance and efficiency, resulting in faster data processing and reduced operational costs.
- Introduced data quality checks to ensure completeness and consistency of market events.
Machine Learning Engineer
Agile Lab
2019 – 2020
- Managed dataset collection and training for a personal protective equipment (PPE) detection machine learning model.
- Developed a FLANN-based pattern detection module for access control and personnel identification.
- Led a team of three developers during the review and refactoring of a feature extraction application for a corporate banking platform.
Big Data Engineer
Agile Lab
2017 – 2019
- Developed and tested multiple ETL applications at UniCredit Services as part of a 4-member team.
- Designed, developed, and integrated the data quality core module of an open-source big data analytical framework.
- Built multiple pipelines during the IoT platform development for Vodafone as part of a 20-member team.
Featured Work
Impactful solutions that deliver measurable results
Checkita Data Quality Framework
http://www.checkita.org/
An open-source data quality framework bundled with end-to-end no-code applications for batch and streaming data. I created Checkita while at Agile Lab and integrated it into client projects. After joining Raiffeisen Bank, I continued to develop and extend the framework.
Let's Discuss Your Data Challenges
Ready to transform your data infrastructure?