Study format
We divided the course into four parts:
1. Lectures
2. Practical lab tasks
3. Individual work
4. Final test
Course program
- Overview of the data processing pipeline
- Collection and storage of raw data
- Processing, cleaning, and storage of processed data
- Scaling tools for data processing systems
- Visualization and monitoring of data
Practical usefulness
This course is not another documentation read-through, nor yet another rundown of the top 10 popular tools.
- We will explain the basic principles of most data processing tasks.
- We will help you understand the mechanics of modern tools.
- We will show rules and practices that are independent of any particular technology stack.
- We will add practical tasks that are useful for novices and interesting for those already working in this speciality.
- We will provide you with useful links to blogs and forums.
Must-have skills for students
Competence in the basics of programming and good knowledge of at least one programming language: Python / Java / R.
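As a rough illustration of this baseline (a hypothetical sketch, not an actual course assignment), a student should be comfortable reading and writing code along these lines — parsing raw records, skipping malformed ones, and computing a simple aggregate:

```python
import csv
import io

# Hypothetical sample of raw input data; one row is deliberately malformed.
raw = """name,score
alice,10
bob,not_a_number
carol,20
"""

# Parse the CSV, keeping only rows whose score is a valid integer.
rows = []
for row in csv.DictReader(io.StringIO(raw)):
    try:
        rows.append({"name": row["name"], "score": int(row["score"])})
    except ValueError:
        continue  # skip rows with non-numeric scores

# Compute the average score over the clean rows.
average = sum(r["score"] for r in rows) / len(rows)
print(average)  # 15.0
```

If this snippet reads naturally to you, you meet the must-have bar for the course.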
Nice-to-have skills for students
1. Theoretical knowledge of Docker
2. Experience with Jupyter notebooks
3. Linux terminal skills
4. Theoretical knowledge of microservice architecture
Our lecturer
Our lecturer is Yura Braiko, a Quantum engineer with years of experience. Yura is open to live communication, eager to share the knowledge he has acquired, and helps students not only with the course assignments but with any engineering task they encounter.