Thank you!
Your application was successfully submitted, and we will review it shortly.
We will provide you with feedback within 3 business days.
If you haven’t heard from us after this time, please don’t hesitate to follow up.
About the company
Quantum is a global technology partner delivering high-end software products that address real-world problems.
We apply emerging technologies to build outside-the-box solutions, focusing on Machine Learning, Computer Vision, Deep Learning, GIS, MLOps, Blockchain, and more.
Here at Quantum, we are dedicated to creating state-of-the-art solutions that effectively address the pressing issues faced by businesses and the world. To date, our team of exceptional people has helped many organizations worldwide attain technological leadership.
We constantly discover new ways to solve evolving business challenges by adopting new technologies, even when there isn’t yet a best practice. If you share our passion for problem-solving and making an impact, join us and benefit from our wealth of experience!
About the position
Quantum is expanding its team and has a brilliant opportunity for a Data Engineer. The client is a technological research company that uses proprietary AI-based analysis and language models to provide comprehensive insights into global stocks in any language. Their mission is to bridge the knowledge gap in the investment world and empower investors of all types to become “super-investors.”
Through generative AI technology integrated into brokerage platforms and other financial institutions’ infrastructure, they offer instant fundamental analyses of global stocks alongside bespoke investment strategies, enabling informed investment decisions for millions of investors worldwide.
Must have skills:
- 5+ years in a Data Engineer role or equivalent relevant experience
- Expertise in Apache Spark / PySpark, including debugging, profiling (Spark UI), and Delta Lake optimization (OPTIMIZE, VACUUM, MERGE)
- Strong Python skills (the codebase is in Python)
- Experience with data warehouses, lakehouses, Medallion architecture, star schema, and wide tables
- Knowledge of OLAP vs. OLTP systems
- SQL & NoSQL proficiency
- Hands-on experience with AWS Athena
- TDD, unit & integration testing
- At least an Upper-Intermediate level of English (spoken and written)
Would be a plus:
- Streaming & Real-Time Processing: Kafka, Kinesis, Pulsar, Flink, Storm, Beam
- Data Warehousing & Query Engines: Snowflake, Dremio, Redshift, Presto/Trino
- Workflow Orchestration: Airflow, Prefect, Airbyte
- AWS Services: AWS Glue, Amazon EMR
- Data Quality & Lineage, Lambda/Kappa architectures
- Experience with FastAPI
Your tasks will include:
- Architect, build, and optimize data pipelines
- Manage databases, ensuring data consistency, integrity, and security
- Improve performance at architectural, design, and code levels
- Develop robust tests and ensure high data quality
- Provide technical solutions, project plans, and estimates
- Collaborate with teams to integrate data solutions
We offer:
- High-end software projects that address real-world problems
- A team of experts ready to help you grow professionally
- A professional growth plan and team leader support
- Ownership of R&D and socially significant projects
- Participation in worldwide tech conferences and competitions
- Regular educational activities
- A multicultural company with a fun and lighthearted atmosphere
- Work from anywhere with flexible working hours
- Paid vacation and sick leave days
Join our team. Make a difference.
Take a step towards your data-driven future with Quantum