SoBigData Articles

A Journey to the Capital of XAI

The rapid development and adoption of AI technologies are transforming a wide range of fields, and with that comes an ever-growing demand for responsible AI systems. Explainability in AI is therefore more crucial than ever. When I came across the opportunity to visit Pisa through the SoBigData program, I immediately applied. Pisa, for me, is the "capital of XAI" thanks to the groundbreaking work being done by researchers in this beautiful city. The chance to collaborate with some of the world’s leading experts in AI explainability was incredibly exciting. During my three-week stay, I immersed myself in an inspiring series of meetings and discussions focused on the explainability of AI models.


A Triumvirate of Institutions Leading the XAI Frontier


One of the factors that sets Pisa apart in the world of XAI is the close collaboration between
three key academic institutions: the University of Pisa, the National Research Council of Italy
(CNR), and the Scuola Normale Superiore. Each institution brings its own unique expertise to
the development of explainable AI. The University of Pisa focuses on the theoretical and
ethical aspects of transparency in AI, while CNR contributes its technical expertise in AI
systems. The Scuola Normale Superiore, known for its interdisciplinary approach, integrates
knowledge from fields like philosophy, cognitive science, and data science. This close-knit
collaboration creates a rich, interdisciplinary environment where the frontiers of XAI are
continuously pushed forward.


Presenting My Work on Generative AI and XAI

During my stay, I had the opportunity to present my research on integrating generative AI with explainable AI (XAI). At one of the seminars, we discussed how generative models can be used to provide a clearer understanding of the decisions made by AI systems. We delved into the application of these techniques across three main types of data: images, text, and time series, which sparked significant interest among local researchers. The discussions that followed were immensely valuable, and the feedback I received will play an important role in refining my research going forward.

For image data, we explored how generative AI can create visual explanations that help users understand how AI models detect and emphasize key features in images. When it came to text data, we discussed the potential of generating natural language explanations, making AI decisions more accessible and transparent for users. Regarding time series data, we examined how generative models can predict future trends and explain the most influential factors behind those predictions. These techniques allow XAI to become more flexible and adaptable across different types of data, significantly enhancing its usefulness and impact.
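To make the time-series discussion a little more concrete, here is a minimal, self-contained sketch of the general counterfactual idea, not the specific methods presented or discussed in Pisa: a toy classifier is trained on synthetic series, and an alternative input is generated by nudging the original until the prediction flips; the timesteps that change the most then serve as a simple indication of which parts of the series drive the decision. Everything in the sketch (the dataset, the model, and the helper function) is hypothetical and for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy "time series": 20 timesteps per sample; class 1 has higher values mid-series.
n_samples, n_steps = 200, 20
X = rng.normal(size=(n_samples, n_steps))
y = (X[:, 8:12].mean(axis=1) > 0).astype(int)

clf = LogisticRegression(max_iter=1000).fit(X, y)

def generate_counterfactual(x, model, step=0.05, max_iter=500):
    """Nudge x along the model's weight vector until the predicted label flips."""
    x_cf = x.copy()
    target = 1 - model.predict(x.reshape(1, -1))[0]
    direction = model.coef_[0] if target == 1 else -model.coef_[0]
    direction = direction / np.linalg.norm(direction)
    for _ in range(max_iter):
        if model.predict(x_cf.reshape(1, -1))[0] == target:
            break
        x_cf = x_cf + step * direction
    return x_cf

x = X[0]
x_cf = generate_counterfactual(x, clf)
delta = np.abs(x_cf - x)

print("original prediction:       ", clf.predict(x.reshape(1, -1))[0])
print("counterfactual prediction: ", clf.predict(x_cf.reshape(1, -1))[0])
print("most influential timesteps:", np.argsort(-delta)[:4])
```

On this synthetic data, the most-changed timesteps should line up with the mid-series window that defines the labels, which is exactly the kind of sanity check one would also want when evaluating real generative explainers.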


Life in Pisa: More Than Just Research

Outside of the fascinating scientific conversations, Italy itself provided an incredible backdrop to this journey. Pisa, with its rich history and striking architecture, offered more than just academic inspiration. The narrow streets, the vibrant local culture, and of course, the delicious Italian cuisine made every day a delight. The local pizza, in particular, lived up to its reputation as some of the best in the world.

However, what truly made my time in Pisa special was the people. Not only were they brilliant researchers, but they were also incredibly warm and welcoming—always willing to share ideas or simply enjoy a friendly chat. Between research sessions, I had the chance to explore cultural landmarks, embrace the relaxed Italian lifestyle, and take in the breathtaking beauty of Tuscany. All of this made my stay in Pisa not only professionally fulfilling but also personally enriching, leaving me with wonderful memories that extended beyond the academic world.