<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://rs-485.com/index.php?action=history&amp;feed=atom&amp;title=Shannon_%28unit%29</id>
	<title>Shannon (unit) - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://rs-485.com/index.php?action=history&amp;feed=atom&amp;title=Shannon_%28unit%29"/>
	<link rel="alternate" type="text/html" href="https://rs-485.com/index.php?title=Shannon_(unit)&amp;action=history"/>
	<updated>2026-05-04T10:32:24Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.42.3</generator>
	<entry>
		<id>https://rs-485.com/index.php?title=Shannon_(unit)&amp;diff=2252&amp;oldid=prev</id>
		<title>Admin: Imported missing template from Wikipedia</title>
		<link rel="alternate" type="text/html" href="https://rs-485.com/index.php?title=Shannon_(unit)&amp;diff=2252&amp;oldid=prev"/>
		<updated>2026-05-03T16:51:53Z</updated>

		<summary type="html">&lt;p&gt;Imported missing template from Wikipedia&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;{{short description|Unit of information}}&lt;br /&gt;
{{fundamental info units}}&lt;br /&gt;
&lt;br /&gt;
The &amp;#039;&amp;#039;&amp;#039;shannon&amp;#039;&amp;#039;&amp;#039; (symbol: Sh) is a [[unit of information]] named after [[Claude Shannon]], the founder of [[information theory]]. [[IEC 80000-13]] defines the shannon as the [[information content]] associated with an event when the probability of the event occurring is {{sfrac|1|2}}. It is understood as such within the realm of [[information theory]], and is conceptually distinct from the [[bit]], a term used in [[data processing]] and storage to denote a single instance of a binary [[Digital signal|signal]]. A sequence of &amp;#039;&amp;#039;n&amp;#039;&amp;#039; binary symbols (such as contained in computer memory or a binary data transmission) is properly described as consisting of &amp;#039;&amp;#039;n&amp;#039;&amp;#039; bits, but the information content of those &amp;#039;&amp;#039;n&amp;#039;&amp;#039; symbols may be more or less than &amp;#039;&amp;#039;n&amp;#039;&amp;#039; shannons depending on the &amp;#039;&amp;#039;a priori&amp;#039;&amp;#039; probability of the actual sequence of symbols.{{efn|Since the information associated with an event outcome that has &amp;#039;&amp;#039;a priori&amp;#039;&amp;#039; probability &amp;#039;&amp;#039;p&amp;#039;&amp;#039;, e.g. that a single given data bit takes the value 0, is given by {{nowrap|1=&amp;#039;&amp;#039;H&amp;#039;&amp;#039; = −log &amp;#039;&amp;#039;p&amp;#039;&amp;#039;}}, and &amp;#039;&amp;#039;p&amp;#039;&amp;#039; can lie anywhere in the range {{nowrap|0 &amp;lt; &amp;#039;&amp;#039;p&amp;#039;&amp;#039; ≤ 1}}, the information content can lie anywhere in the range {{nowrap|0 ≤ &amp;#039;&amp;#039;H&amp;#039;&amp;#039; &amp;lt; ∞}}.}}&lt;br /&gt;
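&lt;br /&gt;
As an illustrative sketch (Python; not part of the IEC definition, and added here only to make the formula concrete), the information content in shannons follows directly from {{nowrap|1=&amp;#039;&amp;#039;H&amp;#039;&amp;#039; = −log&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt; &amp;#039;&amp;#039;p&amp;#039;&amp;#039;}}:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;python&amp;quot;&amp;gt;&lt;br /&gt;
import math&lt;br /&gt;
&lt;br /&gt;
def information_content_sh(p):&lt;br /&gt;
    # Information content in shannons: H = -log2(p),&lt;br /&gt;
    # defined for an a priori probability 0 &amp;lt; p &amp;lt;= 1.&lt;br /&gt;
    return -math.log2(p)&lt;br /&gt;
&lt;br /&gt;
print(information_content_sh(0.5))    # 1.0 Sh: an event of probability 1/2&lt;br /&gt;
print(information_content_sh(0.125))  # 3.0 Sh: a rarer event carries more&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;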
&lt;br /&gt;
The shannon also serves as a unit of the [[information entropy]] of an event, which is defined as the [[expected value]] of the information content of the event (i.e., the probability-weighted average of the information content of all potential events). Unlike information content, the entropy has an upper bound for a given number of possible outcomes, which is reached when those outcomes are equiprobable. The maximum entropy of &amp;#039;&amp;#039;n&amp;#039;&amp;#039; bits is &amp;#039;&amp;#039;n&amp;#039;&amp;#039;&amp;amp;nbsp;Sh. A further quantity measured in shannons is [[channel capacity]]: generally, the maximum expected information content that can be transferred over a channel with negligible probability of error, typically expressed as an information rate.&lt;br /&gt;
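&lt;br /&gt;
A short Python sketch (illustrative only, not drawn from a cited source) computes the entropy of a discrete distribution as the probability-weighted average of the information content; it attains its upper bound of log&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;(&amp;#039;&amp;#039;n&amp;#039;&amp;#039;) Sh only when all &amp;#039;&amp;#039;n&amp;#039;&amp;#039; outcomes are equiprobable:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;python&amp;quot;&amp;gt;&lt;br /&gt;
import math&lt;br /&gt;
&lt;br /&gt;
def entropy_sh(probs):&lt;br /&gt;
    # Entropy in shannons: the expected value of the&lt;br /&gt;
    # information content -log2(p) over all outcomes.&lt;br /&gt;
    return sum(-p * math.log2(p) for p in probs if p &amp;gt; 0)&lt;br /&gt;
&lt;br /&gt;
print(entropy_sh([0.5, 0.5]))  # 1.0 Sh: the upper bound for two outcomes&lt;br /&gt;
print(entropy_sh([0.9, 0.1]))  # about 0.47 Sh: a biased source carries less&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;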
&lt;br /&gt;
Nevertheless, the term &amp;#039;&amp;#039;bits of information&amp;#039;&amp;#039; or simply &amp;#039;&amp;#039;bits&amp;#039;&amp;#039; is heard more often than &amp;#039;&amp;#039;shannons&amp;#039;&amp;#039;, even in the fields of information and [[A Mathematical Theory of Communication#Contents|communication theory]]; just saying &amp;#039;&amp;#039;bits&amp;#039;&amp;#039; can therefore be ambiguous. Using the unit &amp;#039;&amp;#039;shannon&amp;#039;&amp;#039; is an explicit reference to a quantity of information content, information entropy or channel capacity, and is not restricted to binary data,{{refn|{{cite journal |author=Olivier Rioul |year=2018 |title=This is IT: A primer on Shannon&amp;#039;s entropy and Information |journal=L&amp;#039;Information, Séminaire Poincaré |volume=XXIII |pages=43–77 |url=https://perso.telecom-paristech.fr/rioul/publis/201811rioul.pdf |access-date=2021-05-23 |quote=The &amp;#039;&amp;#039;Système International d&amp;#039;unités&amp;#039;&amp;#039; recommends the use of the &amp;#039;&amp;#039;shannon&amp;#039;&amp;#039; (Sh) as the information unit in place of the &amp;#039;&amp;#039;bit&amp;#039;&amp;#039; to distinguish the amount of information from the quantity of data that may be used to represent this information. Thus, according to the SI standard, &amp;#039;&amp;#039;H&amp;#039;&amp;#039;(&amp;#039;&amp;#039;X&amp;#039;&amp;#039;) should be expressed using the shannon as the unit. The entropy of one bit lies between 0 and 1&amp;amp;nbsp;Sh. }}}} whereas &amp;#039;&amp;#039;bits&amp;#039;&amp;#039; can equally refer to the number of binary symbols involved, as it is used in fields such as data processing.&lt;br /&gt;
&lt;br /&gt;
== Similar units ==&lt;br /&gt;
&lt;br /&gt;
The shannon is connected through constants of proportionality to two other units of information:{{refn|{{cite web |title=IEC 80000-13:2008 |url=http://www.iso.org/iso/catalogue_detail?csnumber=31898 |publisher=[[International Organization for Standardization]] |accessdate=21 July 2013 }}}}&lt;br /&gt;
{{block indent|1=1&amp;amp;nbsp;Sh ≈ 0.693&amp;amp;nbsp;[[Nat (unit)|nat]] ≈ 0.301&amp;amp;nbsp;[[hartley (unit)|Hart]].}}&lt;br /&gt;
&lt;br /&gt;
The &amp;#039;&amp;#039;[[hartley (unit)|hartley]]&amp;#039;&amp;#039;, a seldom-used unit, is named after [[Ralph Hartley]], an electronics engineer interested in the capacity of communications channels. His early work, though more limited in scope and preceding Shannon&amp;#039;s, also makes him recognized as a pioneer of information theory. Just as the shannon describes the maximum possible information capacity of a binary symbol, the hartley describes the information that can be contained in a 10-ary symbol, that is, a digit value in the range 0 to 9 when the &amp;#039;&amp;#039;a priori&amp;#039;&amp;#039; probability of each value is {{sfrac|1|10}}. The conversion factor quoted above is given by log&amp;lt;sub&amp;gt;10&amp;lt;/sub&amp;gt;(2).&lt;br /&gt;
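&lt;br /&gt;
These conversion factors can be checked numerically; the Python lines below (illustrative only) reproduce the approximate values quoted above:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;python&amp;quot;&amp;gt;&lt;br /&gt;
import math&lt;br /&gt;
&lt;br /&gt;
# 1 Sh expressed in the other two units; the factors are&lt;br /&gt;
# the natural and common logarithms of 2.&lt;br /&gt;
print(math.log(2))    # approximately 0.693 nat&lt;br /&gt;
print(math.log10(2))  # approximately 0.301 Hart&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;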
&lt;br /&gt;
In mathematical expressions, the [[Nat (unit)|nat]] is a more natural unit of information, but 1&amp;amp;nbsp;nat does not correspond to a case in which all possibilities are equiprobable, unlike with the shannon and hartley. In each case, formulae for the quantification of information capacity or [[Entropy (information theory)|entropy]] involve taking the [[logarithm]] of an expression involving probabilities. If base-2 logarithms are employed, the result is expressed in shannons; if base-10 ([[common logarithm]]s), in hartleys; and if [[natural logarithm]]s (base [[e (mathematical constant)|e]]), in nats. For instance, the information capacity of a 16-bit sequence (achieved when all 65536 possible sequences are equally probable) is given by log(65536), thus {{nowrap|1=log&amp;lt;sub&amp;gt;10&amp;lt;/sub&amp;gt;(65536) Hart ≈ 4.82 Hart}}, {{nowrap|1=log&amp;lt;sub&amp;gt;e&amp;lt;/sub&amp;gt;(65536) nat ≈ 11.09 nat}}, or {{nowrap|1=log&amp;lt;sub&amp;gt;2&amp;lt;/sub&amp;gt;(65536) Sh = 16 Sh}}.&lt;br /&gt;
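&lt;br /&gt;
The 16-bit example can be reproduced directly (again an illustrative Python sketch):&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;python&amp;quot;&amp;gt;&lt;br /&gt;
import math&lt;br /&gt;
&lt;br /&gt;
outcomes = 2 ** 16  # 65536 equally probable 16-bit sequences&lt;br /&gt;
&lt;br /&gt;
print(math.log2(outcomes))   # 16.0 Sh&lt;br /&gt;
print(math.log(outcomes))    # about 11.09 nat&lt;br /&gt;
print(math.log10(outcomes))  # about 4.82 Hart&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;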
&lt;br /&gt;
== Information measures ==&lt;br /&gt;
{{main|Quantities of information}}&lt;br /&gt;
&lt;br /&gt;
In [[information theory]] and derivative fields such as [[coding theory]], one cannot quantify the &amp;#039;information&amp;#039; in a single message (sequence of symbols) out of context; rather, one refers to a model of the channel (characterized, for example, by its [[bit error rate]]) or to the underlying statistics of an information source. There are thus [[Quantities of information|various measures of or related to information]], all of which may use the shannon as a unit.{{cn|date=December 2022}}&lt;br /&gt;
&lt;br /&gt;
For instance, in the above example, a 16-bit channel could be said to have a [[channel capacity]] of 16&amp;amp;nbsp;Sh, but when connected to a particular information source that sends only one of 8 possible messages, one would compute the [[Entropy (information theory)|entropy]] of its output as no more than 3&amp;amp;nbsp;Sh. And if one had already been informed through a side channel which set of 4 possible messages contains the message, then one could calculate the [[mutual information]] of the new message (having 8 possible states) as no more than 2&amp;amp;nbsp;Sh. Although a [[real number]] chosen between 0 and 1 has infinitely many possible values, the so-called [[differential entropy]] can be used to quantify the information content of an analog signal, for instance in relation to the enhancement of [[signal-to-noise ratio]] or the confidence of a [[hypothesis test]].{{cn|date=December 2022}}&lt;br /&gt;
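&lt;br /&gt;
Each figure in this example is the logarithm of the number of equally likely alternatives remaining at that stage; a brief Python sketch (illustrative, and assuming equiprobable messages throughout) makes the arithmetic explicit:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;python&amp;quot;&amp;gt;&lt;br /&gt;
import math&lt;br /&gt;
&lt;br /&gt;
# Upper bounds for the example above, assuming that the&lt;br /&gt;
# alternatives at each stage are equally likely.&lt;br /&gt;
capacity_sh       = math.log2(2 ** 16)  # 16 Sh for a 16-bit channel&lt;br /&gt;
source_entropy_sh = math.log2(8)        # at most 3 Sh for 8 possible messages&lt;br /&gt;
remaining_sh      = math.log2(4)        # at most 2 Sh once the set of 4 is known&lt;br /&gt;
&lt;br /&gt;
print(capacity_sh, source_entropy_sh, remaining_sh)  # 16.0 3.0 2.0&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;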
&lt;br /&gt;
== Notes ==&lt;br /&gt;
{{notelist}}&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
{{reflist}}&lt;br /&gt;
&lt;br /&gt;
[[Category:Units of information]]&lt;br /&gt;
[[Category:Claude Shannon|unit]]&lt;br /&gt;
[[Category:2 (number)]]&lt;/div&gt;</summary>
		<author><name>Admin</name></author>
	</entry>
</feed>