# From imaging algorithms to quantum methods – Seminar

Timezone: Europe/Warsaw

Zoom: https://cern.zoom.us/j/66151941204?pwd=n7upvvZYibexBhbtyn5kvTpy36L0Wo.1

Conveners: Konrad Klimaszewski (NCBJ), Wojciech Krzemień (NCBJ)

## Title: Dissipative Quantum Neural Network
## Presenter: Beatrix Hiesmayr
## Date: 13.10.2025

## Participants:

Aurelien Coussat (AC)
Marcin Stolarski (MS)
Aleksander Ogonowski (AO)
Wojciech Krzemień (WK)
Konrad Klimaszewski (KK)
Roman Shopa (RS)
Lech Raczynski (LR)
Michał Mazurek (MM)
Magnus Simon (TUOMHS)
Michał Obara (MO)
Wojciech Wiślicki (WW)
Beatrix Hiesmayr (BH)
Andrzej (A)
Gagandeep Singh (GS)
Kamil Dulski (KD)

## Questions/Remarks:
MS: Why does the network not improve during the 300 iterations and then suddenly improve?
BH: It is not clear yet. 

WK: Could you provide an example of a practical use case where this particular approach could have an application?
BH: Not yet :-)

WK: How does the selection of the input data affect the choice of the cost function? In classical ML, it is sometimes more convenient to choose a given cost function depending on the characteristics of the input data sample.  In the QML case, can one find a general criterion to decide which cost function is better?
BH: It is similar to the classical case. There is an interplay between the cost function and the geometry.
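
Note (editorial sketch, not from the talk): one well-known example of this interplay in QML is the difference between a "global" fidelity cost and a "local", per-qubit cost. For a simple product ansatz of single-qubit Ry rotations with target state |0...0⟩, the gradient of the global cost shrinks rapidly with the number of qubits, while the local cost keeps a much larger gradient. A minimal NumPy sketch under these assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def costs(thetas):
    # per-qubit overlap with |0>: |<0| Ry(theta) |0>|^2 = cos^2(theta / 2)
    overlaps = np.cos(thetas / 2) ** 2
    global_cost = 1.0 - np.prod(overlaps)   # 1 - fidelity of the whole register
    local_cost = 1.0 - np.mean(overlaps)    # average of single-qubit fidelities
    return global_cost, local_cost

for n in (2, 8, 32):
    thetas = rng.uniform(-np.pi, np.pi, size=n)
    eps = 1e-6
    shifted = thetas.copy()
    shifted[0] += eps
    g0, l0 = costs(thetas)
    g1, l1 = costs(shifted)
    # finite-difference gradient with respect to the first angle
    print(f"n={n:2d}  |dC_global|={abs(g1 - g0) / eps:.2e}  |dC_local|={abs(l1 - l0) / eps:.2e}")
```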

WK: What are the parameters/weights? Are they complex numbers?
BH: No, they are real numbers corresponding to unitary operations.
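
Note (editorial sketch, assumptions mine): a common way real weights enter a quantum network is through the exponential of a Hermitian generator, U(θ) = exp(−i Σ_k θ_k G_k); the weights stay real even though the resulting unitary is complex. A minimal single-qubit example with Pauli generators:

```python
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def unitary(weights):
    """Single-qubit unitary built from three real weights."""
    generator = weights[0] * X + weights[1] * Y + weights[2] * Z  # Hermitian
    return expm(-1j * generator)                                  # unitary by construction

U = unitary(np.array([0.3, -1.2, 0.7]))        # real-valued "weights"
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True: U is unitary
```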

WK: Can those studies be realised on existing quantum machines?
BH: In principle, it is feasible, but we haven't done that yet.
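
Note (editorial sketch, assumptions mine, not the speaker's code): in the dissipative-network picture a layer attaches fresh output qubits in |0⟩, applies a "perceptron" unitary to the input and output qubits together, and traces out the inputs, so each layer acts as a completely positive map on density matrices. A toy one-input-qubit, one-output-qubit layer:

```python
import numpy as np
from scipy.linalg import expm

def random_two_qubit_unitary(rng):
    # real parameters -> Hermitian generator -> unitary, as in the sketch above
    a = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    h = (a + a.conj().T) / 2
    return expm(-1j * h)

def layer(rho_in, U):
    zero = np.array([[1, 0], [0, 0]], dtype=complex)  # fresh output qubit |0><0|
    rho = np.kron(rho_in, zero)                        # input (x) output register
    rho = U @ rho @ U.conj().T                         # perceptron unitary
    # partial trace over the input qubit (first tensor factor)
    rho = rho.reshape(2, 2, 2, 2)
    return np.einsum('ijik->jk', rho)

rng = np.random.default_rng(1)
U = random_two_qubit_unitary(rng)
rho_in = np.array([[1, 0], [0, 0]], dtype=complex)     # input state |0><0|
rho_out = layer(rho_in, U)
print(np.trace(rho_out).real)                          # 1.0: output is a valid state
```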

 

    • 10:00–11:00
      Dissipative Quantum Neural Network (1h)

      Artificial neural networks have been shown to achieve unexpected image-recognition abilities. In this talk I will give an introduction to quantum machine learning, focusing on ideas to quantize artificial neural networks by replacing neurons with qubits or qudits. We will discuss advantages and disadvantages of such a quantization of an artificial neural network. Focusing on the impact of different cost functions on the optimization process, we show significant training differences among the cost functions considered. Our findings facilitate both the theoretical understanding and the experimental implementability of quantum neural networks.

      Speaker: Beatrix Hiesmayr
    • 11:00–11:30
      Discussion (30m)