## Title: Dissipative Quantum Neural Network
## Presenter: Beatrix Hiesmayr
## Date: 13.10.2025
## Participants:
Aurelien Coussat (AC)
Marcin Stolarski (MS)
Aleksander Ogonowski (AO)
Wojciech Krzemień (WK)
Konrad Klimaszewski (KK)
Roman Shopa (RS)
Lech Raczynski (LR)
Michał Mazurek (MM)
Magnus Simon (TUOMHS)
Michał Obara (MO)
Wojciech Wiślicki (WW)
Beatrix Hiesmayr (BH)
Andrzej (A)
Gagandeep Singh (GS)
Kamil Dulski (KD)
## Questions/Remarks:
MS: Why does the network not improve during the 300 iterations and then suddenly improve?
BH: It is not clear yet.
WK: Could you provide an example of a practical use case where this particular approach could have an application?
BH: Not yet :-)
WK: How does the selection of the input data affect the choice of the cost function? In classical ML, it is sometimes more convenient to choose a given cost function depending on the characteristics of the input data sample. In the QML case, can one find a general criterion to decide which cost function is better?
BH: It is similar to the classical case. There is an interplay between the cost function and the geometry.
WK: What are the parameters/weights? Are they complex numbers?
BH: No, they are real numbers parameterizing unitary operations.
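To make the answer concrete, below is a minimal NumPy sketch of one common way real weights can parameterize a unitary: the real numbers fill a Hermitian generator H, and the unitary is U = exp(-iH). The function name and the specific parameterization are illustrative assumptions, not the construction used in the presented network:

```python
import numpy as np

def unitary_from_real_params(params, dim=2):
    """Build a dim x dim unitary U = exp(-iH) from dim**2 real parameters.

    The real parameters fill a Hermitian generator H: the first `dim`
    values are the (real) diagonal, and the rest are (real, imaginary)
    pairs for the upper-triangular entries, mirrored to keep H Hermitian.
    This is only one illustrative parameterization.
    """
    params = np.asarray(params, dtype=float)
    assert params.size == dim * dim, "need dim**2 real parameters"

    H = np.zeros((dim, dim), dtype=complex)
    H[np.diag_indices(dim)] = params[:dim]  # real diagonal

    rest = params[dim:]
    idx = 0
    for i in range(dim):
        for j in range(i + 1, dim):
            H[i, j] = rest[idx] + 1j * rest[idx + 1]
            H[j, i] = rest[idx] - 1j * rest[idx + 1]  # Hermitian conjugate
            idx += 2

    # Matrix exponential via eigendecomposition of the Hermitian H:
    # U = V diag(exp(-i * lambda)) V^dagger is guaranteed unitary.
    eigvals, eigvecs = np.linalg.eigh(H)
    return eigvecs @ np.diag(np.exp(-1j * eigvals)) @ eigvecs.conj().T
```

For a single qubit (dim=2) this takes four real weights; the resulting matrix satisfies U U^dagger = I by construction, which is why optimizing real parameters of the generator keeps the operation unitary throughout training.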
WK: Can those studies be realised on existing quantum machines?
BH: In principle, it is feasible, but we haven't done that yet.