Research Group RationAI

The RationAI research group develops cutting-edge AI methods for biomedicine. We aim to create an environment that maximally supports cooperation between domain experts and computer science specialists, with a focus on explaining the behavior of these AI methods (explainable AI, XAI). We consider traceable development, training, and validation of AI methods, supported by automated generation of provenance information and robust visualization systems, an indispensable part of this effort. Furthermore, for specific domains, we also develop a trusted environment for validating AI methods with evaluation metrics designed in close cooperation with domain experts.

  • Industry: Healthcare, Pharma & Biotech, Information & Communication Technologies, Science & Research
  • Institution: Masaryk University
  • Faculty / Institute: Faculty of Informatics
  • Research type: Applied, Basic
  • Research area: AI for Science, Explainable & Trustworthy AI, Human-AI Interaction, Machine Learning, Computer Vision & Video Analytics, Trustworthy & Responsible AI
Group head
  • Tomáš Brázdil


Keywords: self-explanatory AI methods, advanced visualization, explainability, AI methods in biomedicine, machine learning, explainable AI, digital pathology, biomedical image processing, data provenance
