| Speaker | Affiliation |
|---|---|
| Eddy Keming Chen | University of California, San Diego |
| Irene D’Amico | University of York |
| Matteo Lostaglio | PsiQuantum |
| Xiao Mi | Google |
| Stefan Nimmrichter | University of Siegen |
| Marcos Rigol | Penn State University |
| Giulia Rubino | University of Bristol |

| Speaker | Affiliation |
|---|---|
| Stefan Aimet | Freie Universität Berlin |
| Lindsay Bassman Oftelie | CNR-Pisa |
| Felix Binder | Trinity College Dublin |
| Tanmoy Biswas | Los Alamos National Laboratory, Los Alamos |
| Federico Cerisola | University of Exeter |
| Guilherme de Sousa | University of Maryland |
| Paul Erker | Atominstitut (TU Wien) & IQOQI Vienna (ÖAW) |
| David Gelbwaser-Klimovsky | Technion |
| Santiago Hernandez Gomez | Istituto Nazionale di Ottica del Consiglio Nazionale delle Ricerche (CNR-INO) |
| Michal Horodecki | International Centre for Theory of Quantum Information |
| Karen Hovhannisyan | University of Potsdam |
| Kenji Maeda | University of Massachusetts Boston |
| Lata Kharkwal Joshi | SISSA Trieste, Italy |
| Philip Kurian | Quantum Biology Laboratory, Howard University |
| Patryk Lipka-Bartosik | University of Geneva |
| Florian Meier | Atominstitut, TU Wien, Austria |
| Harry Miller | University of Manchester |
| Anthony Munson | University of Maryland, College Park |
| Greeshma Oruganti | University of Maryland, College Park |
| Laetitia Paula Bettmann | Trinity College Dublin |
| Dario Poletti | Singapore University of Technology and Design |
| Haitao Quan | Peking University |
| Alberto Rolandi | University of Geneva |
| Huang Ruo Cheng | Nanyang Technological University |
| Valerio Scarani | CQT, National University of Singapore |
| Finn Schmolke | Institut für theoretische Physik I, Universität Stuttgart |
| Oles Shtanko | IBM Quantum |
| Janine Splettstoesser | Chalmers University of Technology |
| Amit Vikram | JQI and Dept. of Physics, University of Maryland, College Park |
| Jake Xuereb | Vienna University of Technology |
| Amanda Younes | University of California, Los Angeles |
| Maciej Zgirski | Institute of Physics, PAN, Warsaw |

*University of California, San Diego*

Two of the most difficult problems in the foundations of physics are (1) what gives rise to the arrow of time and (2) what the ontology of quantum mechanics is. They are difficult because the fundamental dynamical laws of physics do not privilege an arrow of time, and the quantum-mechanical wave function describes a high-dimensional reality that is radically different from our ordinary experiences. In this talk, I characterize and elaborate on the ‘‘Wentaculus’’ theory, a new approach to time’s arrow in a quantum universe that offers a unified solution to both problems and is inspired by recent works in quantum thermodynamics. Central to the Wentaculus are (i) Density Matrix Realism, the idea that the quantum state of the universe is objective but can be impure, and (ii) the Initial Projection Hypothesis, a new law of nature that selects a unique initial quantum state. On the Wentaculus, the quantum state of the universe is sufficiently simple to be a law, and the arrow of time can be traced back to an exact boundary condition. The Wentaculus arguably yields a new strategy for understanding and proving the Second Law of Thermodynamics, as recent results in quantum thermodynamics imply that typical pure states starting in a low-entropy macrostate will be extremely close to the density matrix specified by the Wentaculus. The case study of the Wentaculus is further evidence that research in quantum foundations has much to benefit from engagement with quantum thermodynamics. (Related papers: https://arxiv.org/abs/2211.03973 and https://arxiv.org/abs/2405.01025)

*University of York*

Density functional theory (DFT) is one of the most powerful methods to study the properties of interacting many-body systems [1]. It focuses on the local particle density (instead of the system’s state) as the key variable from which to derive the properties of interest. While its zero-temperature formalism and applications are well established, the development of the finite-temperature formalism [2] and of the corresponding functionals is comparatively in its infancy. Interest in this direction has been spurred by the recent advent of quantum technologies and quantum thermodynamics, in turn fuelled by the increasing ability to prepare and control quantum systems on a microscopic scale. This capability allows the experimental verification of fundamental properties such as fluctuation theorems, and the potential to develop new quantum technologies based on increasingly complex many-body quantum systems. Understanding the thermodynamic properties of these systems is crucial, as they could limit applications, but also help the fabrication and running of efficient quantum devices. The effects of many-body interactions on thermal machines have started to be considered only recently [3]; however, from a theoretical point of view, the study of interacting many-body quantum systems at finite temperature and out of equilibrium demands significant effort, usually requiring approximations to tackle the complexity of problems beyond a handful of particles.

In this context, we discuss the possibility of using DFT as a way to study the out-of-equilibrium thermodynamics of interacting many-body systems. We first consider finite-temperature approximations to the average work built from static, zero-temperature DFT concepts [4]. We apply this approach to out-of-equilibrium finite-time finite-temperature dynamics and discuss its advantages and limitations. We then propose an approach based on thermal DFT to extrapolate information about the statistics of work and the irreversibility of a thermal quench [5]. In particular, we demonstrate that, in this case, both the characteristic function of work and that of irreversible entropy production can be expressed as functionals of the finite-temperature equilibrium densities of the pre- and post-quench Hamiltonians. This positions these densities as fundamental variables to derive information on the thermodynamics, as requested by thermal density functional theory. We provide relevant functional approximations and illustrate the method with numerical examples.
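As a point of reference for the quantities discussed above, the two-point-measurement characteristic function of work for a sudden quench from $H_0$ to $H_1$ takes the standard form below; the recasting of $\chi$ as a functional of the pre- and post-quench equilibrium densities is the content of [5] and is not reproduced here.

```latex
% Two-point-measurement characteristic function of work for a sudden
% quench H_0 -> H_1, starting from the thermal state
% \rho_0 = e^{-\beta H_0}/Z_0:
\chi(u) = \mathrm{Tr}\!\left[\, e^{\,i u H_1}\, e^{-i u H_0}\, \rho_0 \right],
\qquad
\chi(i\beta) = \frac{Z_1}{Z_0} = e^{-\beta\,\Delta F}.
```

Evaluating $\chi$ at $u = i\beta$ recovers the Jarzynski equality, which is why functional approximations to $\chi$ also give access to irreversibility measures.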

[1] R. O. Jones, “Density functional theory: Its origins, rise to prominence, and future”, Rev. Mod. Phys. 87, 897 (2015).

[2] A. Pribram-Jones et al., “Thermal Density Functional Theory in Context”, in Frontiers and Challenges in Warm Dense Matter, Lecture Notes in Computational Science and Engineering, vol. 96, Springer, Cham (2014).

[3] V. Mukherjee and U. Divakaran, “Many-body quantum thermal machines”, J. Phys.: Condens. Matter 33, 454001 (2021).

[4] K. Zawadzki et al., “Approximating quantum thermodynamic properties using DFT”, J. Phys.: Condens. Matter 34, 274002 (2022).

[5] A. Palamara et al., “Thermal density functional theory approach to quantum thermodynamics”, preprint (2024).

*PsiQuantum*

Thermodynamic phenomena such as catalysis are characterized by complex interactions between physical and chemical processes that require challenging multiscale modeling. Transport and equilibration are typically described at a mesoscopic scale via computational fluid-dynamics (CFD) simulations (Navier-Stokes, lattice-Boltzmann…), reaction-diffusion equations and other models. To what extent can quantum theory, with its underlying linear, unitary dynamics, be used to efficiently simulate the nonlinear, non-unitary differential equations of classical thermodynamics? I will give an overview of recent remarkable progress as well as outstanding challenges of this nascent research direction.

*Google*

Understanding how interacting particles approach thermal equilibrium is a major goal of quantum simulation. Unlocking the full potential of such systems toward this goal requires flexible initial state preparation, precise time evolution, and extensive probes for final state characterization. We present a quantum simulator comprising 69 superconducting qubits which supports both universal quantum gates and high-fidelity analog evolution, with performance beyond the reach of classical simulation in cross-entropy benchmarking experiments. Emulating a two-dimensional (2D) XY quantum magnet, we leverage a wide range of measurement techniques to study quantum states after ramps from an antiferromagnetic initial state. We observe signatures of the classical Kosterlitz-Thouless phase transition, as well as strong deviations from Kibble-Zurek scaling predictions attributed to the interplay between quantum and classical coarsening of the correlated domains. This interpretation is corroborated by injecting variable energy density into the initial state, which enables studying the effects of the eigenstate thermalization hypothesis (ETH) in targeted parts of the eigenspectrum. Finally, we digitally prepare the system in pairwise-entangled dimer states and image the transport of energy and vorticity during thermalization. These results establish the efficacy of superconducting analog-digital quantum processors for preparing states across many-body spectra and unveiling their thermalization dynamics.

*University of Siegen*

When it comes to concrete physical applications of quantum thermodynamics, one question has been investigated since the very onset of the field: What happens if the thermal devices from our everyday world – engines, refrigerators, batteries – were operated with a quantum working medium? Will their performance simply be ruined by strong quantum fluctuations, or can we harness non-classical phenomena such as energy quantization, coherence, and entanglement to boost it instead? Many theoretical case studies and various proof-of-principle experiments have led to positive and negative answers. Since genuine quantum advantages are hard to come by and rarely feasible in the lab, it is vital to continue exploring quantum optical and optomechanical toy models for thermal machines. I will share our recent ideas and observations concerning somewhat exotic quantum working fluids and their advantages and disadvantages for the paradigmatic Otto cycle and the three-level maser.

*Penn State University*

Quantum-chaotic systems are known to exhibit eigenstate thermalization and to generically thermalize under unitary dynamics. In contrast, quantum-integrable systems exhibit a generalized form of eigenstate thermalization and need to be described using generalized Gibbs ensembles after equilibration. I will discuss evidence that the entanglement properties of highly excited eigenstates of quantum-chaotic and quantum-integrable systems are fundamentally different. They both exhibit a typical bipartite entanglement entropy whose leading term scales with the volume of the subsystem. However, while the coefficient is constant and maximal in quantum-chaotic models, in integrable models it depends on the fraction of the system that is traced out. The latter is typical in random Gaussian pure states. I will also discuss the nature of the subleading corrections that emerge as a consequence of the presence of abelian and nonabelian symmetries in such models.
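The contrast described above can be summarized schematically through the leading volume-law term of the typical eigenstate entanglement entropy of a subsystem $A$ with volume $V_A = f\,V$ (this is a sketch; prefactors and the precise form of $s(f)$ depend on the model):

```latex
% Leading term of the typical eigenstate entanglement entropy of a
% subsystem of volume V_A = f V (schematic; subleading corrections
% from abelian and nonabelian symmetries are discussed in the talk):
S_A \simeq s(f)\, V_A, \qquad
s(f) =
\begin{cases}
s_{\max} \ \text{(constant, maximal)}, & \text{quantum-chaotic models},\\[2pt]
\text{a nontrivial function of } f, & \text{integrable models}.
\end{cases}
```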

*University of Bristol*

Work is a process-based quantity, and its measurement typically requires interaction with a measuring device multiple times. While classical systems allow for non-invasive and accurate measurements, quantum systems present unique challenges due to the influence of the measuring device on the final value of work. As recent studies have shown, among these challenges is the impossibility of formulating a universal definition of work that respects energy conservation for coherent quantum systems and is compatible with the Jarzynski equality. In this talk, I will show how this challenge can be overcome by introducing a genuinely quantum, positive correction to the Jarzynski equality stemming from imposing energy conservation. When sufficiently large, this correction forces quantum work to violate the second law more often. Moreover, I will present a modified two-point measurement (TPM) scheme for work that ensures energy conservation for coherent quantum states and aligns with this quantum-corrected fluctuation relation. I will further underscore the practicality and effectiveness of this scheme by providing a detailed circuit implementation for it.
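For orientation, the classical Jarzynski equality and the schematic shape of the quantum-corrected relation described in the talk are shown below; the exact expression for the correction is derived in the talk, and $Q \ge 0$ here is only a placeholder.

```latex
% Classical Jarzynski equality:
\left\langle e^{-\beta W} \right\rangle = e^{-\beta\,\Delta F}.
% Quantum-corrected form (schematic): a genuinely quantum, positive
% correction Q >= 0 arises from imposing energy conservation for
% coherent states:
\left\langle e^{-\beta W} \right\rangle = e^{-\beta\,\Delta F} + Q .
```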

*Freie Universität Berlin*

Landauer’s principle establishes a bridge between information theory and thermodynamics, by fundamentally relating the erasure of a single bit of information to a minimum amount of heat dissipation. While extensively explored in the context of small quantum systems, extending this principle to complex quantum many-body systems is essential for understanding equilibration, with thermodynamics emerging as an effective coarse-grained description. This talk aims to present the first experimental measurement of Landauer’s principle in a quantum field simulator consisting of two coupled one-dimensional ultra-cold Bose gases. We characterized entropy production along a global mass quench, decomposing it into its information-theoretic contributions using mutual information and relative entropy. Additionally, we may briefly discuss theoretical work on the quantum thermodynamics of local quantum quenches in the many-body domain.
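The bound whose many-body version is probed here is Landauer's original one: erasing one bit of information into a bath at temperature $T$ dissipates at least

```latex
% Landauer bound: minimum heat dissipated upon erasing one bit
% into a bath at temperature T
\Delta Q \;\ge\; k_B T \ln 2,
% and, more generally, for a reduction \Delta S of the system entropy:
\Delta Q \;\ge\; k_B T\, \Delta S .
```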

*CNR-Pisa*

A key hurdle for the success of quantum computers is the ability to prepare pure (i.e., cold) qubits. Dynamic cooling, whereby a target qubit is cooled at the expense of heating up $N-1$ further identical qubits by means of a global unitary operation, was proposed as a means of effective cooling over two decades ago. A standard back-of-the-envelope high temperature estimate establishes that the target qubit temperature can only be dynamically cooled by at most a factor of $1/\sqrt{N}$. Here, we provide the exact expression for the smallest temperature to which the target qubit can be cooled and reveal that there is a crossover from the high initial temperature regime where the scaling is in fact $1/\sqrt{N}$ to a low initial temperature regime where a much faster scaling of $1/N$ occurs. This slow $1/\sqrt{N}$ scaling, relevant for early high-temperature NMR quantum computers, is the reason dynamic cooling was initially dismissed as ineffectual; the fact that current low-temperature quantum computers fall in the fast $1/N$ scaling regime reinstates the appeal of dynamic cooling. We further show that the associated work cost of cooling is exponentially more advantageous in the low temperature regime. Finally, we explore the complexity, in terms of quantum circuits, of dynamic cooling and discuss schemes for implementation on near-term quantum computers. We examine the effect of hardware noise on cooling and conclude that it is a promising technique for near-future quantum computers with reduced levels of noise.
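The crossover described above can be summarized schematically, with $T_0$ the common initial temperature of the $N$ qubits; prefactors and the precise crossover temperature are given in the talk.

```latex
% Scaling of the minimal target-qubit temperature achievable by a
% global unitary on N identical qubits at initial temperature T_0
% (schematic):
T_{\min} \;\sim\;
\begin{cases}
T_0/\sqrt{N}, & \text{high } T_0 \ \text{(e.g.\ early NMR platforms)},\\[2pt]
T_0/N, & \text{low } T_0 \ \text{(current low-temperature platforms)}.
\end{cases}
```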

*Trinity College Dublin*

Efficiently harvesting thermodynamic resources requires a precise understanding of their structure. This becomes explicit through the lens of information engines—thermodynamic engines that use information as fuel. Maximizing the work harvested using available information is a form of physically-instantiated machine learning that drives information engines to develop complex predictive memory to store an environment’s temporal correlations. We show that an information engine’s complex predictive memory poses both energetic benefits and risks. While increasing memory facilitates detection of hidden patterns in an environment, it also opens the possibility of thermodynamic overfitting, where the engine dissipates additional energy in testing. To address overfitting, we introduce thermodynamic regularizers that incur a cost to engine complexity in training due to the physical constraints on the information engine. We demonstrate that regularized thermodynamic machine learning generalizes effectively. In particular, the physical constraints from which regularizers are derived improve the performance of learned predictive models. This suggests that the laws of physics jointly create the conditions for emergent complexity and predictive intelligence.

*Los Alamos National Laboratory, Los Alamos*

We consider a model of a heat engine operating in the microscopic regime: the two-stroke engine. It produces work and exchanges heat in two discrete strokes that are separated in time. The engine consists of two d-level systems initialized in thermal states at two distinct temperatures. Additionally, an auxiliary non-equilibrium system called a catalyst may be incorporated into the engine, provided that its state remains unchanged after the completion of a thermodynamic cycle. This ensures that the work produced arises solely from the temperature difference. Upon establishing a rigorous thermodynamic framework, we characterize the two-fold improvement stemming from the inclusion of a catalyst. First, we show that the presence of a catalyst allows one to surpass the optimal efficiency of two-stroke heat engines that are not assisted by a catalyst. In particular, we prove that the optimal efficiency for a two-stroke heat engine consisting of two-level systems is given

*University of Exeter*

State-of-the-art nanoscale electromechanical devices provide exciting platforms to explore and test thermodynamics at small scales, where fluctuations dominate and quantum effects become relevant. In particular, suspended carbon nanotubes (CNTs) are a very rich and promising platform featuring a wide range of phenomena, from quantised electron transport through quantum dots, to spin qubits and mechanical motion. In this talk, I will discuss recent experiments in which we have measured ultra-strong coupling between the charge transport along the CNT and its mechanical motion. Furthermore, through precise measurement of the quantum dot’s population and the CNT displacement, we estimate the energy and entropy of the different degrees of freedom. In this way, we explore the entropy-to-energy conversion between the charge state of the dot and the mechanical displacement. A thorough understanding of these exchanges naturally leads to a generalised formulation of the Landauer bound that takes into account the non-equilibrium nature of these devices and the strong coupling to the reservoirs. Moreover, the coupling between mechanics and electron transport can lead to non-equilibrium steady states that exhibit long-lived self-sustained oscillations. Finally, I will discuss the first observation of coupling between a spin qubit and the mechanical motion of the CNT. We have further developed a detailed theoretical model to understand the origin and nature of this coupling and to estimate its strength. The observation of the spin-mechanics coupling opens the door to a new range of experimental tests of thermodynamics involving energetic coherences.

*University of Maryland*

To manipulate quantum systems efficiently, one needs to measure quantum observables, collect the measurement outcomes, and act accordingly, providing feedback. The typical procedure for quantum measurement is understood in terms of projective measurements, which destroy all quantum coherence and leave the state in a particular eigenstate of the measured operator. This process is invasive, as it destroys the quantumness of the state and can lead to a quantum Zeno effect in the limit of quick and repeated measurements.

*Technion*

Systems that break time-reversal symmetry can break detailed balance at equilibrium (DBE) when the interaction strength with the thermal bath goes beyond the weak-coupling limit, as we showed in recent work [PRL 131, 040401 (2023)]. In this presentation, we show that the lack of DBE influences how the system reaches thermal equilibrium. Instead of the standard exponential decay, thermalization can oscillate when the right temperature conditions are met. Moreover, we demonstrate that the transition from exponential to oscillatory dynamics occurs at a critical temperature where the dynamical matrix has an exceptional point. This point is characterized by diverging time scales.

*Istituto Nazionale di Ottica del Consiglio Nazionale delle Ricerche (CNR-INO)*

Kirkwood-Dirac quasiprobabilities are directly connected to the quantum correlation function of two observables measured at two distinct times, and are therefore relevant for fundamental physics and quantum technologies. However, their experimental reconstruction may be challenging when expectation values of incompatible observables are involved. Strategies to reconstruct them using weak values, or by combining strong measurements, have been used in the past. Here we take a more direct approach: using an interferometric scheme with the help of an auxiliary system, we fully reconstruct the characteristic function of the Kirkwood-Dirac quasiprobability distribution. In our experiment, the interferometric scheme is realized on a nitrogen-vacancy center in diamond, using the electronic spin as the quantum system and the nuclear spin of the nitrogen as the auxiliary qubit. We then infer the quasiprobability distribution of the work exerted by the system from the measured characteristic function, and we show the behavior of the first and second moments of work. We are also able to study the uncertainty of measuring the Hamiltonian of the system at two times, via the Robertson-Schrödinger uncertainty relation, for different initial states.
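For concreteness, the Kirkwood-Dirac quasiprobability for the outcomes of two observables measured at times $0$ and $t$ takes the standard form below (notation assumed here: projectors $\Pi_i$, $\Pi_f$, unitary evolution $U$, initial state $\rho$):

```latex
% Kirkwood-Dirac quasiprobability for the pair of outcomes (i, f)
% at times 0 and t, with Heisenberg-evolved projector U^\dagger \Pi_f U:
q_{i,f} \;=\; \mathrm{Tr}\!\left[\, U^{\dagger}\,\Pi_f\, U\; \Pi_i\; \rho \,\right].
% q_{i,f} is generally complex; nonclassicality appears as negative
% or nonreal values when the observables involved are incompatible.
```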

*International Centre for Theory of Quantum Information*

Let the initial and final states be equilibrium states with respect to the initial and final Hamiltonians, respectively. Consider a stochastic process composed of interlaced level transformations and thermalizations. The total work obtained in such a process is a sum of random variables, namely the works obtained in the individual level transformations. If the stochastic process is quasistatic, i.e., composed of many small steps, the total work is highly concentrated around the change of free energy. We pose the question of what happens if one is allowed to run a completely arbitrary process. In particular, we allow arbitrary energy changes during level transformations and an unbounded number of steps. Such a process is very hard to analyse with standard tools, as the available concentration inequalities hold for random variables that are somehow controlled, e.g. jointly bounded, with bounded variances, or with controlled total variance. In this contribution we prove that, with finite probability, the obtained work is close to the free energy change. The proof involves various versions of the Cantelli inequality, the Berry-Esseen theorem, and other techniques that are not standard tools of probability theory. We apply the result to verify whether one can perform thermal operations without memory.

*University of Potsdam*

When two initially thermal many-body systems start to interact strongly, their transient states quickly become non-Gibbsian, even if the systems eventually equilibrate. To see beyond this apparent lack of structure during the transient regime, we use a refined notion of thermality, which we call g-local. A system is g-locally thermal if the states of all its small subsystems are marginals of global thermal states. We numerically demonstrate for two harmonic lattices that whenever the total system equilibrates in the long run, each lattice remains g-locally thermal at all times, including the transient regime. This is true even when the lattices have long-range interactions within them. In all cases, we find that the equilibrium is described by the generalized Gibbs ensemble, with three-dimensional lattices requiring special treatment due to their extended set of conserved charges. We compare our findings with the well-known two-temperature model. While its standard form is not valid beyond weak coupling, we show that at strong coupling it can be partially salvaged by adopting the concept of a g-local temperature.

*University of Massachusetts Boston*

Quantum channel classification is an important task for exploring the dynamics of quantum systems and designing protocols for robust quantum information processing. Quantum processes can be categorized into three distinct classes: unitary, non-unitary but unital, and non-unital evolution. In this study, we illustrate how the quantum fluctuation theorem, applied within the framework of open quantum systems using the one-time measurement scheme, facilitates the differentiation of these processes. Specifically, we demonstrate that the heat conditioned upon the initial energy measurement of the system, termed the conditional heat, serves as the metric for quantum channel classification. We establish three conditions that the conditional heat must satisfy for the system to undergo each of these evolutions. Moreover, we give the conditional heat an operational significance by associating it with quantum hypothesis testing.

*SISSA Trieste, Italy*

The non-equilibrium physics of many-body quantum systems harbors various unconventional phenomena. In recent studies of thermalization following quantum quenches of specific states, an interesting and puzzling phenomenon has emerged: the quantum Mpemba effect. In this effect, following a quantum quench, an initially tilted ferromagnet restores its symmetry more rapidly when it is farther from a symmetric state than when it is closer. In this talk I will present the first experimental evidence of this effect in a trapped-ion quantum simulator. The symmetry breaking and restoration are monitored through the entanglement asymmetry, probed via randomized measurements and postprocessed using the classical-shadows technique. Furthermore, a direct comparison between the late-time thermal symmetric theoretical state and the experimental state offers direct evidence of subsystem thermalization.

*Quantum Biology Laboratory, Howard University*

Superradiance is the phenomenon of many identical quantum systems absorbing and/or emitting photons collectively at a higher rate than any one system can individually. This phenomenon has been studied analytically in idealized distributions of electronic two-level systems (TLSs), each with a ground and excited state, as well as numerically in realistic photosynthetic nanotubes and cytoskeletal architectures. Superradiant effects are studied here in realistic biological mega-networks of tryptophan (Trp) molecules, which are strongly fluorescent amino acids found in many proteins. Each Trp molecule acts as a chromophore absorbing in the ultraviolet spectrum and can be treated approximately as a TLS, with its $1L_a$ excited singlet state; thus, organized Trp networks can exhibit superradiance. Such networks are found, for example, in microtubules, actin filaments, and amyloid fibrils. Microtubules and actin filaments are spiral-cylindrical protein polymers that play significant biological roles as primary constituents of the eukaryotic cytoskeleton, while amyloid fibrils have been targeted in a variety of neurodegenerative diseases. We treat these proteinaceous Trp networks as open quantum systems, using a non-Hermitian Hamiltonian to describe interactions of the chromophore network with the electromagnetic field. We numerically diagonalize the Hamiltonian to obtain its complex eigenvalues, where the real part is the energy and the imaginary part is its associated enhancement rate. We find that all three of these structures exhibit highly superradiant states near the low-energy portion of the spectrum, which enhances the magnitude and robustness of the quantum yield to static disorder and thermal noise. 
The high quantum yield and stable superradiant states in these biological architectures may play a photoprotective role in vivo, downconverting highly energetic ultraviolet photons emitted from reactive free radical species and thereby mitigating biochemical stress and photophysical damage. Contrary to conventional assumptions that quantum effects cannot survive in large biosystems at high temperatures, our results suggest that macropolymeric collectives of TLSs in microtubules, actin filaments, and amyloid fibrils exhibit increasingly observable and robust effects with increasing length, up to the micron scale, due to quantum coherent interactions in the single-photon limit. Superradiant enhancement and high quantum yield exhibited in neuroprotein polymers could thus play a crucial role in information processing in the brain, the development of neurodegenerative diseases such as Alzheimer’s and related dementias, and a wide array of other pathologies characterized by anomalous protein aggregates.

*University of Geneva*

We develop a physics-based model for classical computation based on autonomous quantum thermal machines. These machines consist of a few interacting quantum bits (qubits) connected to several environments at different temperatures. Heat flows through the machine are exploited for computing. The process starts by setting the temperatures of the environments according to the logical input. The machine evolves, eventually reaching a non-equilibrium steady state, from which the output of the computation can be determined via the temperature of an auxiliary finite-size reservoir. Such a machine, which we term a “thermodynamic neuron”, can implement any linearly separable function, and we discuss explicitly the cases of NOT, 3-majority and NOR gates. In turn, we show that a network of thermodynamic neurons can perform any desired function. We discuss the close connection between our model and artificial neurons (perceptrons), and argue that our model provides an alternative physics-based analogue implementation of neural networks, and more generally a platform for thermodynamic computing.

*Atominstitut, TU Wien, Austria*

Computation is an input-output process, where a program encoding a problem to be solved is inserted into a machine that outputs a solution. Whilst a formalism for quantum Turing machines which lifts this input-output feature into the quantum domain has been developed, this is not how quantum computation is physically conceived. Usually, such a quantum computation is enacted by the manipulation of macroscopic control interactions according to a program executed by a classical system. To understand the fundamental limits of computation, especially in relation to the resources required, it is pivotal to work with a fully self-contained description of a quantum computation where computational and thermodynamic resources are not obscured by the classical control.

In this talk, we investigate the question of whether we can build a fully autonomous physical model for quantum computation, by envisaging a quantum machine where both the program to be executed and the control are quantum. We do so by developing a framework that we dub the autonomous Quantum Processing Unit (aQPU). This machine, consisting of a timekeeping mechanism, an instruction register and a computational system, allows an agent to input their problem and receive the solution as an output, autonomously. Using the theory of open quantum systems and results from the field of quantum clocks, we are able to model the thermodynamic cost of computation, including the contributions coming from the control. We find that inputting only finite thermodynamic resources leads to imperfect computation, and that perfect precision for the computation requires divergent thermodynamic resources.

Finally, we discuss how the aQPU formalism applies beyond computation and address the ongoing discourse surrounding the work cost of controlling quantum thermal machines. Serving as a formalism for rendering such machines autonomous, the aQPU allows for the assessment of control costs within this domain, offering a potential resolution to such questions.

A preprint of the work to be presented can be found at https://arxiv.org/abs/2402.00111.

*University of Manchester*

When a small system is driven out of equilibrium, the dissipated work is a stochastic variable, and it is important to consider its higher-order fluctuations alongside its average behaviour. A fundamental question is to understand the role of quantum mechanics in such processes. In this talk I will discuss the utility of Kubo’s linear response theory (LRT) for exploring stochastic thermodynamics in quantum regimes. The power of LRT lies in the fact that it can be used to study work statistics from a phenomenological point of view, without the need for an exact dynamical model. I will demonstrate how it can be used to derive refined constraints on the form of the work distributions that can arise, and show how LRT predicts a distinctly quantum signature in the work distribution through increased dispersion at low temperatures. I will link this signature to the breakdown of the energy equipartition theorem usually encountered in equilibrium statistical mechanics.

*University of Maryland, College Park*

According to Landauer, irreversible computation requires work. In principle, one can often evade work costs by implementing reversible transformations. In practice, complexity—the difficulty of realizing a quantum process—poses an obstacle: a realistic agent can perform only a limited number of gates and so not every reversible transformation. Hence an agent, if unable to complete a task unitarily, may expend work on an irreversible process, such as erasure, to finish the job. We pinpoint a work–complexity trade-off, quantifying how protocols that involve higher-complexity unitaries require less work and vice versa. We illustrate with the task of resetting qubits to the all-zero state using a limited number of gates and work-costing erasure. To quantify the resetting’s optimal efficiency, we introduce the complexity entropy, which quantifies a state’s apparent randomness to an agent who can implement only limited-complexity measurement effects. The complexity entropy emerges as a general tool for quantifying the optimal efficiencies with which complexity-restricted agents can perform tasks in quantum thermodynamics and information processing.
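The baseline for the erasure step invoked above is Landauer's bound, which prices an irreversible reset at no less than kT times the state's von Neumann entropy. A minimal numerical sketch of that bound for a maximally mixed qubit (an editorial illustration in units kT = 1; the complexity-entropy framework of the abstract is not reproduced here):

```python
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr[rho ln rho], in nats; small eigenvalues are dropped
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

kT = 1.0                      # units where k_B * T = 1
rho_mixed = np.eye(2) / 2     # maximally mixed qubit
# Landauer: resetting rho to |0> costs at least kT * S(rho) = kT * ln 2
w_min = kT * von_neumann_entropy(rho_mixed)
```

For n maximally mixed qubits the bound scales as n kT ln 2, which is the work budget the complexity-restricted agent trades off against gate count.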

*University of Maryland, College Park*

Gauge theories underpin our best models of nature’s fundamental interactions, yet their open-system quantum thermodynamics remains largely unexplored. Lattice Hamiltonian formulations of these theories are most naturally suited to quantum simulations. However, calculations with such formulations must be restricted to a certain subspace of the total Hilbert space. This restriction can engender significant interactions between the two constituents of a bipartite system: a system and a reservoir. We address these interactions using strong-coupling quantum thermodynamics. To model lattice gauge theories undergoing sudden quenches, we present a framework for defining thermodynamic quantities for a system strongly coupled to a reservoir. We have culled three definitions of the system’s internal energy from the literature, and these lead to three distinct definitions of work and heat for sudden-quench processes. We test the validity of the second law of thermodynamics under the different work and heat definitions. Only two of the work (and so heat) definitions satisfy the second law, we find. Using a thermodynamically consistent framework, we calculate the work and heat exchanged during a quench of a one-dimensional lattice gauge theory. Work and heat, we discover, signal a phase transition in this gauge theory. After bridging lattice gauge theories and open-system quantum thermodynamics, we tie in a concept from quantum information to provide a potential path toward experimental calculations of thermodynamic quantities.

*Trinity College Dublin*

Accurately modelling macroscopic reservoirs in open-systems theory is crucial for resolving thermodynamic effects that arise at finite temperature, beyond linear response and outside the weak-coupling regime. In the mesoscopic-leads formulation, macroscopic reservoirs are modeled by a finite collection of modes that are continuously damped toward thermal equilibrium by an appropriate Gorini-Kossakowski-Sudarshan-Lindblad (GKSL) master equation. The system together with the finite number of lead modes is referred to as the extended system. To access the time-resolved full distribution of integrated thermodynamic currents, such as heat and entropy production, for systems with a quadratic Hamiltonian, we employ a trajectory-unraveling technique on the GKSL master equation governing the dynamics of the covariance matrix of the extended system. We show that the integral fluctuation theorems for the total entropy production, as well as the martingale and uncertainty entropy production, hold. Furthermore, we investigate the fluctuations of the dissipated heat in finite-time Landauer erasure.

*Singapore University of Technology and Design*

A deeper understanding of the emergence of out-of-equilibrium statistical physics, together with an enhanced ability to control quantum many-body systems with single-site precision, can lead to a new era of quantum transport exploration. To this end, we focus on a nonequilibrium scenario in which two nonintegrable systems, prepared in different states, are locally and non-extensively coupled to each other. We show the emergence of steady-like currents which are well defined and typical, and identify their origin in a prethermalization mechanism. We then discuss the experimental realization of these findings in state-of-the-art experimental setups.

*Peking University*

Fluctuation theorems establish connections between fluctuations and irreversibility by considering stochastic thermodynamic quantities. In this study, we derive special relativistic covariant fluctuation theorems by defining covariant work, heat, and entropy. We focus on a driven scalar field in contact with a Markovian heat bath. For inertial observers moving relative to the heat bath, both the energy components and the momentum components of work and heat must be included to formulate the corresponding fluctuation theorems, and the four-velocity of the heat bath plays an important role. It turns out that the irreversibility is characterized by the conventional thermodynamic quantities in the rest frame of the heat bath, regardless of the reference frame of the observer. Even in the nonrelativistic case, this identification is nontrivial. We study the work statistics for a Klein-Gordon field in a driving process, as measured by a moving inertial observer, to explicitly verify the covariant version of the Jarzynski equality.
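In the rest frame of the bath, the Jarzynski equality verified in the abstract reduces for a sudden quench to ⟨e^{-βW}⟩ = e^{-βΔF}. A toy numerical check for a two-level system (illustrative energies chosen by the editor; this does not reproduce the field-theoretic setup of the talk):

```python
import numpy as np

# Sudden quench of a two-level system: level energies E0 -> E1.
# Work for microstate i is W_i = E1[i] - E0[i]; averaging exp(-beta W)
# over the initial Gibbs distribution gives Z1/Z0 = exp(-beta * dF).
beta = 0.7
E0 = np.array([0.0, 1.0])   # pre-quench energies (illustrative)
E1 = np.array([0.3, 2.0])   # post-quench energies (illustrative)

p0 = np.exp(-beta * E0)
Z0 = p0.sum()
p0 /= Z0
Z1 = np.exp(-beta * E1).sum()

W = E1 - E0
lhs = float(np.sum(p0 * np.exp(-beta * W)))  # <exp(-beta W)>
rhs = float(Z1 / Z0)                         # exp(-beta dF)
```

The equality holds exactly here because the quench is instantaneous, so each microstate's work is just its energy shift.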

*University of Geneva*

A central task in finite-time thermodynamics is to minimize the excess or dissipated work Wdiss when manipulating the state of a system immersed in a thermal bath. We consider this task for an N-body system whose constituents are identical and uncorrelated at the beginning and end of the process. In the regime of slow but finite-time processes, we show that Wdiss can be dramatically reduced by considering collective protocols in which interactions are suitably created along the protocol. This can even lead to a sublinear growth of Wdiss with N, Wdiss ∝ N^x with x < 1, to be contrasted with the Wdiss ∝ N scaling of any noninteracting protocol. We derive the fundamental limits to such collective advantages and show that x = 0 is in principle possible; however, it requires long-range interactions. We explore collective processes with spin models featuring two-body interactions and achieve noticeable gains under realistic levels of control in simple interaction architectures. As an application of these results, we focus on the erasure of information in finite time and prove a faster convergence to Landauer’s bound.

*Nanyang Technological University*

Quantum information-processing techniques enable work extraction from a system’s inherently quantum features, in addition to the classical free energy it contains. Meanwhile, the science of computational mechanics affords tools for the predictive modelling of non-Markovian classical and quantum stochastic processes. We combine tools from these two sciences to develop a theoretical prototype for a predictive quantum engine: a machine that charges a battery by feeding on a multipartite quantum system whose parts are temporally correlated via a classical stochastic process. In other words, the engine’s fuel is a classical stochastic process with quantum outputs. We test the engine on simple models, benchmarking its performance against various alternatives, including one without coherent quantum information processing and one without predictive functionality; our predictive quantum engine is shown to outperform these alternatives in terms of work output. Finally, we evaluate the engine’s performance on fuel processes with different degrees of temporal correlation and find the work yield to increase with such correlations. Additionally, our results suggest that there exists a phase boundary in parameter space beyond which memory of past observations can enhance the work extraction. Our work opens the prospect of machines that harness environmental free energy in an essentially quantum, essentially time-varying form.

*CQT, National University of Singapore*

A quantum translation of Crooks’ fluctuation theorem (1998) was presented by Tasaki as early as 2000; but exporting fluctuation theorems to more general quantum processes has proved challenging for years, with issues related to the definition of work, the role of coherences, etc. In 2021, some of us put forward an approach to fluctuation theorems that avoids all those problems and also opens new perspectives for classical stochastic thermodynamics [1,2]. The core of the idea is to define reverse processes as Bayesian retrodiction of the forward process. In the quantum case, the reverse map is given by the Petz map (as previously proposed by Kwon and Kim [PRX 9, 031029 (2019)] with a different motivation). This talk presents this approach, arguing for its universality [2,3]. The rederivation of known fluctuation theorems will also be sketched, and novel research questions will be overviewed [3-6].
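The defining property of the Petz reverse map used in this retrodictive approach is that it perfectly recovers the reference state: R_{σ,E}(E(σ)) = σ. An editorial numerical sketch for a qubit amplitude-damping channel (illustrative parameters, not taken from the cited works):

```python
import numpy as np

# Petz map: R(rho) = s^{1/2} E_dag( E(s)^{-1/2} rho E(s)^{-1/2} ) s^{1/2},
# built here for a qubit amplitude-damping channel with damping g.
g = 0.3
K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1 - g)]])
K1 = np.array([[0.0, np.sqrt(g)], [0.0, 0.0]])
kraus = [K0, K1]

def channel(rho):    # E(rho) = sum_k K_k rho K_k^dag
    return sum(K @ rho @ K.conj().T for K in kraus)

def adjoint(rho):    # E^dag(rho) = sum_k K_k^dag rho K_k (unital since E is TP)
    return sum(K.conj().T @ rho @ K for K in kraus)

def mat_sqrt(A):     # square root of a Hermitian positive-semidefinite matrix
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.conj().T

def petz(rho, sigma):
    s_half = mat_sqrt(sigma)
    Es_inv_half = np.linalg.inv(mat_sqrt(channel(sigma)))
    return s_half @ adjoint(Es_inv_half @ rho @ Es_inv_half) @ s_half

sigma = np.diag([0.7, 0.3])              # reference (e.g. thermal) state
recovered = petz(channel(sigma), sigma)  # recovers sigma exactly
```

The recovery is exact for any trace-preserving E, since E(σ)^{-1/2} E(σ) E(σ)^{-1/2} = I and the adjoint of a trace-preserving map is unital.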

*Institut für theoretische Physik I, Universität Stuttgart*

Observing a quantum object may dramatically affect its dynamics in a non-classical manner. We show that a continuously monitored quantum many-body system can undergo a spontaneous transition from stochastic dynamics to noise-free stable synchronization at the level of individual quantum trajectories, when subject to standard homodyne detection. This effect can occur in generic many-body quantum systems based on the existence of decoherence-free subspaces. Such a synchronization transition is always associated with the localization of the state in Hilbert space. On the trajectory level, ergodicity is thus typically broken and synchronization may appear along individual realizations while being absent at the ensemble level. These findings can be extended and generalized to predict the asymptotic fate of quantum trajectories. Generically, if the corresponding Lindblad equation admits multiple steady states, localization transitions and ergodicity breaking still occur but the conditions may depend on the measurement scheme.

*IBM Quantum*

Thermal (Gibbs) states are important for the study of thermal equilibrium and quantum thermodynamics. Although the simulation of Gibbs states of quantum Hamiltonians is challenging for classical computers, the advent of quantum computing offers new possibilities for their preparation using quantum circuits. While many methods for Gibbs state sampling have been proposed recently, they often suffer from noise sensitivity and typically require fault tolerance. I will explore a few algorithms that may be more practical. The first algorithm exploits the ergodicity of the target Hamiltonian, as expressed by the Eigenstate Thermalization Hypothesis, and shows promise for efficiency on quantum computers for some Hamiltonians. The second algorithm provides a universal approach, but requires exponential running time. Both algorithms incorporate an error mitigation strategy that may prove amenable to today’s noisy quantum hardware.
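For small systems, the target of such algorithms can be computed classically by exact diagonalization, ρ_β = e^{-βH}/Z, which is the baseline quantum Gibbs samplers aim to beat at scale. An editorial sketch (two-site transverse-field Ising Hamiltonian chosen purely for illustration):

```python
import numpy as np

# Gibbs state rho = exp(-beta H) / Z via exact diagonalization.
beta = 1.0
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
I = np.eye(2)
# Two-site transverse-field Ising model (illustrative couplings)
H = -np.kron(Z, Z) - 0.5 * (np.kron(X, I) + np.kron(I, X))

w, V = np.linalg.eigh(H)          # spectral decomposition H = V diag(w) V^dag
boltz = np.exp(-beta * w)
Zpart = boltz.sum()               # partition function
rho = V @ np.diag(boltz / Zpart) @ V.conj().T
energy = float(np.trace(rho @ H).real)   # thermal expectation <H>
```

The exponential size of H in the number of sites is exactly what makes this classical route intractable and motivates the quantum algorithms of the talk.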

*Chalmers University of Technology*

Standard thermodynamic machines transform heat into work or vice versa. Macroscopic quantities, like temperatures, quantify the efficiency of their operation. This is different in nanoscale systems—often smaller than the length scales on which thermalisation takes place, and where fluctuations can be of the same magnitude as average quantities. In this talk, I will show some properties of nanoelectronic devices, operating as engines, that are unique to the small scales on which they are realized. I will start by introducing novel relations between charge currents flowing in nanoelectronic conductors and their noise. In contrast to standard fluctuation-dissipation relations, valid in equilibrium, these relations for generic nonequilibrium situations take the form of inequalities, setting bounds on the currents that can be obtained given a certain noise level of the signal. This has direct implications for the performance of nanoelectronic engines, complementing recently introduced thermodynamic uncertainty relations. However, noise is not only a nuisance: noisy resources can be beneficial for the operation of an engine! In the second part of my talk, I will introduce engines working without absorbing heat or work from the resource on average, seemingly violating the second law of thermodynamics. This is possible when the resource has nonthermal properties (namely, it cannot be characterized by a temperature or potential) and requires a minimum amount of fluctuations in the input.

*JQI and Dept. of Physics, University of Maryland, College Park*

In this talk, we’ll provide an overview of some recent results that uncover universal connections between quantum dynamics and the energy spectrum, which can be intuitively regarded as refinements of the energy-time uncertainty principle. At the level of the fine structure of the spectrum, “quantum chaos” has often been associated with rigidity in spectral fluctuations, as exemplified by the eigenvalues of random matrices. We will show that this rigidity is directly indicative of a dynamical form of quantum ergodicity, which measures the ability to use an orthonormal basis for “quantum time-keeping” in the Hilbert space. Moreover, this quantum dynamical phenomenon captures the classical intuition behind ergodicity as exploring all regions of a phase space, addressing a long-standing question of the latter’s counterpart in an isolated quantum system. Zooming out to the overall profile of the energy spectrum, we will show that the spectral form factor sets a quantum speed limit that is nontrivial for asymptotically long times, and tighter than comparable versions of known speed limits based on the energy-time uncertainty principle. Applying this speed limit to the scrambling of information in a many-body system, we will obtain bounds on the fast scrambling of initial states under arbitrary interactions with a bath at any temperature. This sets an exact limit on the fastest allowed scrambling or thermalization time in an arbitrary quantum mechanical system.

*Vienna University of Technology*

Quantum direct coding, or Schumacher compression, generalised the ideas of Shannon theory, gave an operational meaning to the von Neumann entropy, and established the term qubit. But remembering that information processing is carried out by physical processes prompts one to wonder what thermodynamic resources are required to compress quantum information, and how they constrain one’s ability to perform this task. That is, if Alice and Bob only have access to thermal quantum states and clocks with finite accuracy, how well can they measure, encode and decode pure quantum state messages? In this work we examine these questions by modelling Alice’s typical measurement as a unitary involving a measurement probe, investigating the impact of imperfect timekeeping on encoding and decoding, and considering the role of temperature in Bob’s appended qubits. In doing so, we derive fidelity bounds for this protocol involving the correlations Alice can form with the measurement probe, the variance of the clock’s ticks, and the temperature of Bob’s qubits. Finally, we give an insight into the entropy produced by these two agents throughout the compression protocol by relating the resources they use to a quantum thermodynamic cooling protocol.

*University of California, Los Angeles*

Laser cooling typically uses carefully tuned lasers to couple atoms to a near-vacuum optical mode of the electromagnetic field, allowing them to be cooled by spontaneous emission into that mode. When this coupling is controlled by the occupation of another mode, cooling can sometimes be performed by heating that mode, a phenomenon known as ‘cooling by heating’. An experimentally accessible implementation is resolved-sideband cooling, a commonly used technique for cooling trapped ions to their motional ground state (corresponding to microkelvin temperatures). While this scheme typically uses a coherent, narrowband laser to drive the cooling, we propose using unfiltered sunlight instead as a demonstration of cooling by heating. We measure the spectrum of sunlight coupled into a single-mode optical fiber for delivery to the ion, compare it to the spectrum of a blackbody in quasi-one dimension, and provide estimates of the achievable cooling rate with typical experimental parameters. Finally, we discuss and demonstrate a related scheme for cooling the internal states of the ion using sunlight, an example of internal cooling by heating.
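What drives such a scheme is the thermal occupation of the optical mode at the relevant transition frequency. A back-of-envelope editorial sketch of the Bose-Einstein occupation for radiation at the solar surface temperature and an optical wavelength (numbers illustrative, not taken from the talk):

```python
import numpy as np

hbar = 1.054571817e-34   # reduced Planck constant, J s
kB = 1.380649e-23        # Boltzmann constant, J/K
c = 299792458.0          # speed of light, m/s

def mean_occupation(wavelength_m, T):
    # Bose-Einstein mean photon number of a single mode at temperature T
    omega = 2.0 * np.pi * c / wavelength_m
    return 1.0 / np.expm1(hbar * omega / (kB * T))

# Illustrative: a visible transition near 422 nm, solar surface ~5772 K
n_sun = mean_occupation(422e-9, 5772.0)
```

The occupation comes out well below one photon per mode, which is why achievable cooling rates with broadband sunlight are modest compared to a resonant laser drive.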

*Institute of Physics, PAN, Warsaw*

We introduce the Single Vortex Box (SVB), a nanodevice that allows one to treat a single superconducting vortex as a macroscopic, albeit quantized, “particle” which can be created and annihilated with pulses of electrical current [1,2]. Using the method of fast, nanosecond-resolved switching thermometry [3], we measure the temperature rise and the subsequent thermal relaxation resulting from the expulsion of just a single magnetic-field vortex from the SVB. Our experiment provides a calorimetric estimate of the dissipation in a superconductor due to a single moving vortex, a feat of fundamental importance that had not previously been accomplished for lack of appropriate tools. Our pioneering demonstration is also a pivotal step towards the development of vortex electronics, i.e. memory cells, superconducting diodes, and logic elements.

“### " & {Full Name} & “\n*” & Affiliation & “*\n\n#### " & Title & “\n” & Abstract & “\n\n—\n”