Entropy: what it is and its types

  • Aug 10, 2021

There are various sciences that, although they differ in purpose, converge in studying the origin, the patterns, and the probability of change of their objects of study. These sciences include physics, chemistry, computing, thermodynamics and economics.

The probability of change is related to a certain disorder found within a system; to explain this disorder, the concept of entropy is used, a term that originated in physics.



Although it is a somewhat complicated theory, its applications in information science, physics and chemistry have made it popular. It is therefore useful to know what entropy is and what types exist.



What is entropy?

Entropy can be defined as a physical quantity, that is, a measurable magnitude within a physical system; this quantity is used to measure a system's tendency toward disorder in a given situation.

This measure of disorder also makes it possible to determine how much energy cannot be harnessed and is therefore not useful for work; in addition, it allows us to distinguish useful energy, that is, energy that can be completely converted into work.


Entropy theory establishes that, over time, systems tend toward greater disorder; in other words, their entropy levels increase. Entropy can therefore be described as a measure of the disorder within a system or of its progressive deterioration.
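This tendency is the content of the second law of thermodynamics. In standard notation (not given in the original article), the entropy change of an isolated system never decreases, and for a reversible heat exchange the change is defined from the heat transferred and the temperature:

```latex
\Delta S_{\text{isolated}} \geq 0, \qquad dS = \frac{\delta Q_{\text{rev}}}{T}
```

Here $S$ is entropy, $\delta Q_{\text{rev}}$ is the heat exchanged reversibly, and $T$ is the absolute temperature; equality holds only for reversible processes.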

In mathematics, entropy is represented by the letter S and is based on probabilities; this probabilistic basis is what allows the theory to be used in sciences such as computer science and economics.


How is this possible? Within these sciences, entropy refers to the probability of receiving or not receiving random information in an information system. In 1948, the engineer Claude E. Shannon proposed information entropy as a way to measure randomness.
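Shannon's measure can be computed directly from symbol frequencies. The following is a minimal sketch (the function name and the example strings are illustrative, not from the article): a message made of one repeated symbol carries no surprise, while a balanced mix of symbols maximizes uncertainty.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# One repeated symbol: no uncertainty at all.
print(shannon_entropy("aaaa"))  # 0.0
# Two symbols, equally likely: one bit of uncertainty per symbol.
print(shannon_entropy("abab"))  # 1.0
```

The result is measured in bits because the logarithm has base 2; using the natural logarithm instead would give entropy in nats.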

Types of entropy

Depending on the degree of disorder within a system, it can be said that there are two types of entropy. These are:


Negative entropy

Negative entropy, also called negentropy or syntropy, works in the opposite direction to entropy. It can be defined as a force or process that brings balance to a system by reducing or eliminating disorder.

This term was first proposed by Erwin Schrödinger, an Austrian physicist, who explained that various living beings function as balancing agents for certain systems. Among these living beings are humans, whose survival depends in part on order.

Positive entropy

Positive entropy, often known simply as entropy, refers to a system whose degree of molecular disorder is high, that is, whose entropy level is elevated.

What are the applications of entropy

Since entropy theory can be used in various sciences, it can be applied to various areas of daily living. Some examples are:

  • Entropy in computer security systems: in this area it is used to generate the randomness needed for encryption keys that protect a system. It also helps detect and, in some cases, stop cyber attacks.
  • Entropy in the automotive industry: the entropy formula can be applied to any kind of energy model; it can therefore be used to analyze the combustion chambers of vehicle engines.
  • Entropy in psychology: here it can be applied to the social systems in which human beings develop. All individuals have similarities and differences simultaneously, and the stability of those systems depends on this balance.
  • Entropy in linguistics: in this field it refers to the way information is organized and distributed in discourse, which allows that information to be analyzed within the communication process.
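The security application in the list above can be made concrete. For a key or password whose characters are chosen uniformly at random, the entropy in bits is simply the length multiplied by the base-2 logarithm of the alphabet size. This is a sketch under that uniform-randomness assumption; the function name is illustrative:

```python
from math import log2

def key_entropy_bits(alphabet_size: int, length: int) -> float:
    """Entropy in bits of a key drawn uniformly at random:
    H = length * log2(alphabet_size)."""
    return length * log2(alphabet_size)

# An 8-character password over 62 symbols (a-z, A-Z, 0-9):
print(round(key_entropy_bits(62, 8), 1))  # about 47.6 bits
# A 128-bit key written as 16 uniformly random bytes:
print(key_entropy_bits(256, 16))  # 128.0 bits
```

Human-chosen passwords have far less entropy than this formula suggests, because their characters are not uniformly random; that is precisely why security systems rely on entropy sources rather than memorable strings.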

Examples of entropy in everyday life

  • In a kitchen, a glass utensil can be seen as a system in complete order and balance. If it falls and shatters into many pieces, it undergoes an entropic event: a change that produces disorder. The reverse event, the pieces spontaneously reassembling into a glass, is never observed.
  • Heat death, the name contemporary physics gives to what it considers a possible end of the universe, is also an example of entropy. This theory holds that at some point the universe will reach equilibrium, a point of maximum entropy, and will therefore stop changing.
