Production ML systems
Machine Learning Engineer / AI Research Engineer / Ph.D. Researcher
I build transformer-based systems, LLM applications, and data-heavy ML platforms, then spend a healthy amount of time making them faster, cheaper, and less dramatic in production.
Most of my work lives somewhere between applied research, backend engineering, cloud infrastructure, and asking PyTorch to please behave on multi-GPU workloads. I am currently focused on production ML at Rankacy AI while continuing Ph.D. research in bioinformatics and computational biology.
What I do
I like work that survives both curiosity and production traffic.
Transformer models, LLM assistants, fine-tuned workflows, backend services, and the unglamorous optimization work that makes models usable outside demos.
TB-scale pipelines built with Python, Spark, Delta Lake, and RabbitMQ, deployed cloud-natively on AWS and Kubernetes with proper observability and CI/CD (a minimal pipeline sketch follows this list).
Behavioral modeling, explainability, hyperbolic ML, biomedical AI, and model inspection tooling for answering why the model did something suspicious.
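The pipeline sketch mentioned above: a minimal ingest-and-append pattern, assuming a Delta-enabled Spark session. The bucket paths, the events schema, and the partition column are hypothetical stand-ins, not details from a real deployment.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Spark session with Delta Lake extensions enabled
spark = (
    SparkSession.builder
    .appName("events-ingest")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read raw JSON events (hypothetical path and schema), derive a partition
# column, and append to a Delta table
raw = spark.read.json("s3://example-bucket/raw/events/")
(
    raw.withColumn("event_date", F.to_date("timestamp"))
    .write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .save("s3://example-bucket/delta/events/")
)

Partitioning by date keeps downstream reads cheap and makes reprocessing a single day trivial instead of terrifying.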
Selected Work
Text-first on purpose. The projects are newer than the old screenshots anyway.
Designed and deployed an in-house transformer for micro and macro analysis of CS2 gameplay, covering the path from research experiments to production-facing systems.
Prototyped automated CS2 commentary with gameplay events, LLMs, and TTS, while also building task-specific assistants and internal tools around model usage.
Architected large-scale data and platform systems with Python, RabbitMQ, Spark, Delta Lake, AWS Glue, Terraform, ArgoCD, GitHub Actions, Karpenter, and KEDA.
Built embedded AI workflows for NXP S32K microcontrollers, from dataset preparation and augmentation to quantization, benchmarking, and C/C++ deployment.
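A sketch of the quantization step in that embedded flow, using standard TensorFlow Lite post-training integer quantization. The model path, input shape, and calibration loader are hypothetical stand-ins; the actual S32K deployment runs through vendor tooling on top of an artifact like this.

import numpy as np
import tensorflow as tf

def representative_data():
    # Hypothetical calibration samples; in practice these come from the
    # prepared and augmented dataset so activation ranges are realistic.
    for _ in range(100):
        yield [np.random.rand(1, 96, 96, 1).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model/")  # hypothetical path
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("model_int8.tflite", "wb") as f:
    f.write(converter.convert())

Full-integer models like this are what you benchmark on target before committing to the C/C++ integration.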
Foundations
The older projects still matter because they taught me how to build, debug, and actually finish things.
Desktop and systems work in C/C++, C#, and Python, including SDL experiments, a space shooter, maze generation and solving, and the usual university algorithm detours.
Earlier work across ASP.NET, Django, HTML/CSS/JS, Bootstrap, and OnsenUI, including a tire service information system and smaller utility apps.
TaskIE still exists on the internet, somehow.
Comfortable with SQL, PL/SQL, T-SQL, CUDA, assembly, Git, Linux, and the kind of debugging that politely reminds you abstractions are optional.
Experience
Owning work across transformer modeling, LLM features, cloud infrastructure, data pipelines, backend refactors, and production ML systems.
Developed embedded AI systems for constrained hardware, including model benchmarking, optimization, and deployment in C/C++ environments.
Worked on embedded computer vision tasks, ONNX/TensorFlow workflows, and low-level debugging around storage and deployment constraints (a short ONNX inference sketch follows these entries).
Researching hyperbolic machine learning and biomedical applications, while teaching AI/ML and Python courses and supervising bachelor theses.
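The ONNX sketch mentioned above, a minimal smoke-test of an exported model with ONNX Runtime. The model file and input shape are illustrative, not from a real project; the input name is read from the graph itself.

import numpy as np
import onnxruntime as ort

# Load a (hypothetical) exported model and run a single inference
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name      # real input name comes from the graph
dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)  # shape is illustrative
outputs = session.run(None, {input_name: dummy})
print(outputs[0].shape)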
Dean's Award thesis: X-Ray Image Analysis and Processing.
Dean's Award thesis: Transfer Learning for Text Data Analysis.
Research
Human behavior, explainability, representation learning, and practical systems for running those ideas in the real world.
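One formula behind the hyperbolic side of that list, for the curious: in the Poincaré ball model with curvature -1, the distance between points u and v is

d(u, v) = arcosh( 1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)) )

which grows without bound near the boundary of the ball, giving hierarchical, tree-like data exponentially more room than Euclidean space does.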
About
I still enjoy algorithms, longboarding, black holes, psychology, neuroscience, and music with probably more emotional commitment than is strictly necessary.
Research depth when needed, production ownership when it counts, and enough range to move between model code, infrastructure, data work, and debugging without getting precious about it.
If you want to talk about applied ML, research, weird product ideas, or something interesting you are building, the links below are the fastest way to find me.
Languages: Czech (native), English (C1), German (A2), Polish (A2)