Entropy can be defined in several ways and therefore appears in different contexts: thermodynamics, statistical mechanics, information theory, and even cosmology and economics. The concept of entropy captures the universe’s tendency towards disorder, a tendency visible in many everyday phenomena. Entropy is not merely an abstract idea; it is a measurable physical property associated with uncertainty and randomness.
Entropy
The classical thermodynamic definition considers the equilibrium state of a system. The statistical definition, developed later, expresses the same thermodynamic properties in terms of the statistics of the molecular motions of the system. In this view, entropy is a measure of molecular disorder.
Some important points are as follows:
- In information theory, entropy measures the uncertainty in a signal: the capacity of a system to transmit information, or the information lost in a transmitted signal.
- In the study of dynamical systems, entropy quantifies the increasing complexity of the system and the average flow of information per unit of time.
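The information-theoretic notion above can be illustrated with a short Python sketch of Shannon entropy (the function name and the example probabilities are illustrative, not from the original text):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    # Terms with zero probability contribute nothing (0 * log 0 -> 0 by convention).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

Higher entropy means a receiver learns more from each symbol; a perfectly predictable source has zero entropy.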
Properties of Entropy:
- It is a thermodynamic function.
- It is a state function: it depends only on the state of the system, not on the path followed to reach that state.
- It is denoted by S; under standard conditions it is denoted by S°.
- Its SI unit is J/(K·mol).
- Its CGS unit is cal/(K·mol).
- Entropy is an extensive property, meaning that it scales with the size or extent of the system.
- Disorder increases in an isolated system, so its entropy also increases. In a chemical reaction, entropy increases when the reactants form a larger number of product molecules. A system at a higher temperature has more randomness than one at a lower temperature. In each of these examples, entropy increases as regularity decreases.
Entropy Change and Calculations
The entropy change of a process is the amount of heat released or absorbed, transferred reversibly and isothermally, divided by the absolute temperature. The entropy formula is given as follows:
∆S = qrev,iso/T
If we add the same amount of heat at a high and at a low temperature, the increase in randomness is greater at the low temperature. Hence, the entropy change is inversely proportional to the temperature.
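The inverse dependence on temperature can be checked numerically with a minimal sketch; the heat and temperature values below are illustrative:

```python
def entropy_change(q_rev, T):
    """Entropy change (J/K) for heat q_rev (J) transferred reversibly
    and isothermally at absolute temperature T (K)."""
    return q_rev / T

q = 1000.0  # J, illustrative value
# The same heat produces a larger entropy change at the lower temperature.
print(entropy_change(q, 300.0))  # ΔS at 300 K
print(entropy_change(q, 600.0))  # ΔS at 600 K is half as large
```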
The total entropy change is the sum of the entropy changes of the system and the surroundings:

∆S(total) = ∆S(system) + ∆S(surroundings)
If the system loses an amount of heat q at temperature T1, and that heat is received by the surroundings at temperature T2, then ∆S(total) can be calculated as:

∆S(system) = −q/T1

∆S(surroundings) = q/T2

∆S(total) = −q/T1 + q/T2
● If ∆S total is positive, the process is spontaneous.
● If ∆S total is negative, the process is non-spontaneous.
● If ∆S total is zero, the process is at equilibrium.
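The spontaneity criterion above can be sketched in a few lines of Python; the function names and the example temperatures are illustrative:

```python
def total_entropy_change(q, T1, T2):
    """ΔS_total when the system loses heat q (J) at T1 (K)
    and the surroundings receive it at T2 (K)."""
    return -q / T1 + q / T2

def classify(ds_total, tol=1e-9):
    """Classify a process by the sign of its total entropy change."""
    if ds_total > tol:
        return "spontaneous"
    if ds_total < -tol:
        return "non-spontaneous"
    return "equilibrium"

# Heat flowing from a hotter system (400 K) to cooler surroundings (300 K)
# gives a positive ΔS_total, so the process is spontaneous.
ds = total_entropy_change(500.0, 400.0, 300.0)
print(classify(ds))  # spontaneous
```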
Entropy changes during the isothermal reversible expansion of an ideal gas
∆S = qrev,iso/T
According to the 1st law of thermodynamics,
∆U=q+w
For the isothermal expansion of an ideal gas, ∆U = 0, so
qrev = -wrev = nRTln(V2/V1)
i.e., ∆S = nRln(V2/V1)
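The result ∆S = nR ln(V2/V1) is easy to evaluate; the sketch below uses the SI gas constant and an illustrative volume doubling:

```python
import math

R = 8.314  # gas constant, J/(K·mol)

def entropy_isothermal_expansion(n, V1, V2):
    """ΔS = nR ln(V2/V1) for the reversible isothermal expansion
    of n moles of an ideal gas from volume V1 to V2."""
    return n * R * math.log(V2 / V1)

# Doubling the volume of 1 mol of ideal gas:
print(entropy_isothermal_expansion(1.0, 1.0, 2.0))  # ≈ 5.76 J/K
```

Note that ∆S is positive for an expansion (V2 > V1) and negative for a compression, consistent with expansion increasing molecular disorder.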
Conclusion:
The entropy at the boundary of a black hole and the universe must be effectively infinite, because we cannot know all of its possible statistical configurations. But what is being referred to here may be informational rather than thermodynamic entropy.