Entanglement and the second law of thermodynamics


on 13-12-2013

Marco Pezzuto, University of Trieste

13/12/2013, 16:15.
Room P3.10, Mathematics Building, IST.

Under certain assumptions, it is possible to define for an open quantum system many key thermodynamic quantities, such as the internal energy, entropy, exchanged heat and work. By means of these quantities, the zeroth, first and second laws of thermodynamics can also be given a consistent formulation. A brief introduction to the dynamics of open quantum systems will be given, together with a review of the concepts of positivity and complete positivity in relation to entanglement. Afterwards, it will be shown how to formulate the laws of thermodynamics, and specifically the second law in terms of the positivity of the internal entropy production, and how these formulations connect with the complete positivity of the dynamics. These techniques have been applied to a concrete case, namely a model for a quantum pumping process in a noisy environment. The master equation originally proposed for this model turns out to generate a non-completely positive dynamics, and it was found that, under certain conditions, this fact can have consequences from a thermodynamical point of view, such as violations of the second law. Complete positivity, besides guaranteeing a physically consistent description when entanglement is taken into account, thus seems to gain an important role in relation to thermodynamics.
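The distinction between positivity and complete positivity invoked in the abstract can be tested numerically via the Choi–Jamiołkowski correspondence: a linear map is completely positive exactly when its Choi matrix is positive semidefinite. A minimal sketch (the function names are illustrative, not taken from the talk; the classic example is matrix transposition, which is positive but not completely positive, contrasted with a depolarizing channel, which is):

```python
import numpy as np

def choi_matrix(channel, d=2):
    """Build the Choi matrix C = sum_{ij} |i><j| (x) channel(|i><j|)."""
    C = np.zeros((d * d, d * d), dtype=complex)
    for i in range(d):
        for j in range(d):
            E = np.zeros((d, d), dtype=complex)
            E[i, j] = 1.0
            C += np.kron(E, channel(E))
    return C

def transpose_map(X):
    """Matrix transposition: positive, but not completely positive."""
    return X.T

def depolarizing(X, p=0.5):
    """Depolarizing channel: completely positive for 0 <= p <= 1."""
    d = X.shape[0]
    return (1 - p) * X + p * np.trace(X) * np.eye(d) / d

def is_completely_positive(channel, d=2, tol=1e-9):
    """The map is CP iff all eigenvalues of its Choi matrix are >= 0."""
    eigs = np.linalg.eigvalsh(choi_matrix(channel, d))
    return bool(np.all(eigs >= -tol))

print(is_completely_positive(transpose_map))  # False: Choi matrix is SWAP, eigenvalue -1
print(is_completely_positive(depolarizing))   # True
```

The transpose map is the standard witness here: it sends states to states, yet applying it to one half of an entangled pair can produce a non-positive operator, which is precisely what the negative Choi eigenvalue detects.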

Joint session with the Physics of Information Seminar.

Computational Complexity of the GPAC


on 29-11-2013

Amaury Pouly, École Polytechnique

29/11/2013, 16:15.

Abstract: In 1941, Claude Shannon introduced a model of the Differential Analyzer, called the General Purpose Analog Computer (GPAC). Originally it was presented as a circuit-based model in which several units performing basic operations (e.g. sums, integration) are interconnected. However, Shannon himself realized that the functions computed by a GPAC are nothing more than the solutions of a special class of differential equations of the form y′ = p(y), where p is a vector of polynomials. Analog computers have since been replaced by their digital counterparts. Nevertheless, one can wonder whether the GPAC could in theory have super-Turing computational power. A few years ago, it was shown that the Turing-based paradigm and the GPAC have the same computational power. So, switching to analog computers would not enable us to solve the Halting problem or any other uncomputable exotic thing, but we can nonetheless compute everything a Turing machine does (given enough resources), and a return to analog technology would at least not mean a loss of computational power. However, this result did not shed any light on what happens at the level of computational complexity: can an analog computer (GPAC) solve a problem faster (modulo polynomial reductions) than a digital computer (Turing machine)? I will present some elements showing that certain reasonable restrictions of the GPAC are actually equivalent to P (polynomial time) and NP (nondeterministic polynomial time), and that, at the complexity level, there are strong links with Computable Analysis, which studies the complexity of real functions.
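The class of systems y′ = p(y) can be illustrated concretely: the polynomial system y1′ = y2, y2′ = −y1 with y1(0) = 0, y2(0) = 1 is a tiny GPAC whose solution is (sin t, cos t). A minimal sketch, integrating it numerically with a generic Runge–Kutta step (the integrator and names are illustrative, not from the talk):

```python
import math

def rk4(f, y0, t_end, h=1e-3):
    """Integrate the autonomous system y' = f(y) with classical Runge-Kutta 4."""
    y = list(y0)
    t = 0.0
    while t < t_end - 1e-12:
        step = min(h, t_end - t)
        k1 = f(y)
        k2 = f([yi + 0.5 * step * ki for yi, ki in zip(y, k1)])
        k3 = f([yi + 0.5 * step * ki for yi, ki in zip(y, k2)])
        k4 = f([yi + step * ki for yi, ki in zip(y, k3)])
        y = [yi + step / 6.0 * (a + 2 * b + 2 * c + d)
             for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]
        t += step
    return y

# GPAC-style polynomial right-hand side: y1' = y2, y2' = -y1
def p(y):
    return [y[1], -y[0]]

y = rk4(p, [0.0, 1.0], t_end=1.0)
print(y[0], math.sin(1.0))  # y1(1) approximates sin(1)
print(y[1], math.cos(1.0))  # y2(1) approximates cos(1)
```

In Shannon's sense, the two components correspond to two interconnected integrator units; exponentials, and more generally all GPAC-generable functions, arise from the same polynomial-ODE template with a different p.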

Room P3.10, Mathematics Building, IST.

Support: SQIG/Instituto de Telecomunicações, with support from FCT and FEDER, namely through the FCT project PEst-OE/EEI/LA0008/2013.