The fundamental thermodynamic bounds on finite models


DOI: 10.1063/5.0044741
zbMATH Open: 1469.80002
arXiv: 1912.03217
OpenAlex: W3173374096
MaRDI QID: Q5000875
FDO: Q5000875

Andrew J. P. Garner

Publication date: 15 July 2021

Published in: Chaos: An Interdisciplinary Journal of Nonlinear Science

Abstract: The minimum heat cost of computation is subject to bounds arising from Landauer's principle. Here, I derive bounds on finite modelling -- the production or anticipation of patterns (time-series data) -- by devices that model the pattern in a piecewise manner and are equipped with a finite amount of memory. When producing a pattern, I show that the minimum dissipation is proportional to the information in the model's memory about the pattern's history that never manifests in the device's future behaviour and must be expunged from memory. I provide a general construction of models that allows this dissipation to be reduced to zero. By also considering devices that consume, or effect arbitrary changes on, a pattern, I discuss how these finite models can form an information reservoir framework consistent with the second law of thermodynamics.
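As a rough illustration of the bound stated in the abstract (the notation below is illustrative, not necessarily the paper's own): Landauer's principle prices erasure at no less than k_B T ln 2 of heat per bit, so if the memory carries an amount I_waste (in bits) of information about the pattern's history that never manifests in the device's future behaviour and must be expunged, the dissipated heat per modelled step obeys

\[
  Q_{\mathrm{diss}} \;\ge\; k_{B} T \ln 2 \cdot I_{\mathrm{waste}} .
\]

On this reading, the general construction mentioned in the abstract corresponds to models whose memory retains no such surplus information, so that I_waste = 0 and the bound permits dissipation-free operation.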


Full work available at URL: https://arxiv.org/abs/1912.03217









