# AnalyzeThat'26, co-located with ETAPS!
## Motivation
Static program analysis has matured to the point where many of our research tools can tackle substantial real-world code. What we lack are opportunities to exercise these tools on real software together, compare approaches, and learn from one another. Real-world audits require careful modeling and domain understanding, which can be time-consuming and scientifically unglamorous. AnalyzeThat creates a focused venue to make this work visible, collaborative and fun!
AnalyzeThat complements existing initiatives aimed at creating more robust and interchangeable verification tools. The [Competition on Software Verification (SV-COMP)](https://sv-comp.sosy-lab.org/) provides rigorous head-to-head evaluation on benchmarks with known outcomes established by human oracles. [VerifyThis](https://verifythis.github.io/) offers a two-day challenge to prove the correctness of intricate algorithms using deductive verification tools.
## Goals
The aim of AnalyzeThat is to gather static analysis developers around a **challenge**: spending limited time auditing an open-source project. This event is built around two goals:
- **Open-ended exploration**. There is no "ground truth" of bugs for the chosen project; the goal will be to prove some components of the project (un)safe. Each team will be free to choose which components they inspect and how they model interactions between them.
- **Collaborative research**. The event will foster the sharing of practices for analyzing new projects, strengthen the community, and offer opportunities for collaboration between researchers and tool teams. By the end of the workshop, we will have identified common struggles across participating tools, from which we can derive further research directions (e.g., new challenging application areas for static analysis, or the need for novel abstractions).
What kind of open-source project? To gather a significant number of interested parties, this first event will target the analysis of real-world, non-concurrent C programs and libraries. You can find example projects that could be considered [here](https://gitlab.com/analyzethat/2026).
## Calendar
- 20 February 2026: Deadline to [submit software to be considered for analysis](https://gitlab.com/analyzethat/2026), alongside a note describing why it is real-world software that would be interesting to analyze.
- 02 March 2026: Announcement of the chosen software, to be analyzed until the workshop day
- 10 March 2026: ETAPS early registration deadline
- 12 April 2026: Workshop day @ ETAPS in Torino, dedicated to presenting results and discussing common issues
**Join us on [Zulip](https://absint.zulipchat.com/#narrow/channel/535188-AnalyzeThat) to participate!**
## Participating tools/teams (WIP)
NB: this list will be regularly updated. Contact us on [Zulip](https://absint.zulipchat.com/#narrow/channel/535188-AnalyzeThat) to register by 02 March 2026!
- [Frama-C Eva](https://www.frama-c.com/): [André Maroneze](https://fr.linkedin.com/in/andre-maroneze-a512677/fr), TBD
- [Goblint](https://goblint.in.tum.de/): [Simmo Saan](https://sim642.eu), TBD
- [Mopsa](https://mopsa.lip6.fr/): [Raphaël Monat](https://rmonat.fr), TBD
## Organizers
- [Raphaël Monat](https://rmonat.fr)
- [Helmut Seidl](https://www.professoren.tum.de/en/seidl-helmut)
- [Vesal Vojdani](https://kodu.ut.ee/~vesal/)