Saturday, May 2, 2020

History repeats itself. I came across this poem, written in 1869 and reprinted during the 1919 pandemic.


This is Timeless....
It was written in 1869 by Kathleen O’Meara:
And people stayed at home
And read books
And listened
And they rested
And did exercises
And made art and played
And learned new ways of being
And stopped and listened
More deeply
Someone meditated, someone prayed
Someone met their shadow
And people began to think differently
And people healed.
And in the absence of people who
Lived in ignorant ways
Dangerous, meaningless and heartless,
The earth also began to heal
And when the danger ended and
People found themselves
They grieved for the dead
And made new choices
And dreamed of new visions
And created new ways of living
And completely healed the earth
Just as they were healed.
Reprinted during the Spanish flu pandemic, 1919.
[Photo taken during the Spanish flu pandemic]

Friday, May 1, 2020

Claude Shannon: Father of Information Theory

Information Theory is one of the few scientific fields fortunate enough to have an identifiable beginning: Claude Shannon's 1948 paper, "A Mathematical Theory of Communication." The story of how it progressed from a single theoretical paper to a broad field that has redefined our world is a fascinating one. It provides the opportunity to study the social, political, and technological interactions that have guided the field's development and defined its trajectory, and it gives us insight into how a new field evolves.

We often hear Claude Shannon called the father of the Digital Age. At the beginning of his paper, Shannon acknowledges the work done before him by such pioneers as Harry Nyquist and R. V. L. Hartley at Bell Labs in the 1920s. Though their influence was profound, the work of those early pioneers was limited and focused on their own particular applications. It was Shannon's unifying vision that revolutionized communication and spawned the multitude of communication research that we now define as the field of Information Theory.
One of Shannon's key contributions was his definition of the limit on channel capacity. Like Moore's Law, the Shannon limit can be considered a self-fulfilling prophecy: it is a benchmark that tells people what can be done and what remains to be done, compelling them to achieve it.


"What made possible, what induced the development of coding as a theory, and the development of very complicated codes, was Shannon's Theorem: he told you that it could be done, so people tried to do it. [Interview with Fano, R. 2001]

Quantum information science is a young field, its underpinnings still being laid by a large number of researchers [see "Rules for a Complex Quantum World," by Michael A. Nielsen; Scientific American, November 2002]. Classical information science, by contrast, sprang forth about 50 years ago, from the work of one remarkable man: Claude E. Shannon. In a landmark paper written at Bell Labs in 1948, Shannon defined in mathematical terms what information is and how it can be transmitted in the face of noise. What had been viewed as quite distinct modes of communication--the telegraph, telephone, radio and television--were unified in a single framework.
Shannon was born in 1916 in Petoskey, Michigan, the son of a judge and a teacher. Among other inventive endeavors, as a youth he built a telegraph from his house to a friend's out of fencing wire. He graduated from the University of Michigan with degrees in electrical engineering and mathematics in 1936 and went to M.I.T., where he worked under computer pioneer Vannevar Bush on an analog computer called the differential analyzer.
Shannon's M.I.T. master's thesis in electrical engineering has been called the most important of the 20th century: in it the 22-year-old Shannon showed how the logical algebra of 19th-century mathematician George Boole could be implemented using electronic circuits of relays and switches. This most fundamental feature of digital computers' design--the representation of "true" and "false" and "0" and "1" as open or closed switches, and the use of electronic logic gates to make decisions and to carry out arithmetic--can be traced back to the insights in Shannon's thesis.
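Shannon's insight maps directly onto code. Below is a minimal Python sketch (my own illustration, not Shannon's notation): switches wired in series behave as Boolean AND, switches wired in parallel as OR, and from those primitives one can build arithmetic such as a one-bit adder.

```python
# A closed switch is True (1), an open switch is False (0).

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed: AND."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed: OR."""
    return a or b

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """One-bit addition built from the switch primitives: returns (sum, carry)."""
    sum_bit = parallel(series(a, not b), series(not a, b))  # XOR from AND/OR/NOT
    carry = series(a, b)
    return sum_bit, carry

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"{int(a)} + {int(b)} -> sum={int(s)}, carry={int(c)}")
```

Chain enough of these gates together and you have the arithmetic core of a digital computer, which is precisely why the thesis is so celebrated.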


In 1941, with a Ph.D. in mathematics under his belt, Shannon went to Bell Labs, where he worked on war-related matters, including cryptography. Unknown to those around him, he was also working on the theory behind information and communications. In 1948 this work emerged in a celebrated paper published in two parts in Bell Labs's research journal.
Quantifying Information
Shannon defined the quantity of information produced by a source--for example, the quantity in a message--by a formula similar to the equation that defines thermodynamic entropy in physics. In its most basic terms, Shannon's informational entropy is the number of binary digits required to encode a message. Today that sounds like a simple, even obvious way to define how much information is in a message. In 1948, at the very dawn of the information age, this digitizing of information of any sort was a revolutionary step. His paper may have been the first to use the word "bit," short for binary digit.
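In symbols, Shannon's entropy of a source with symbol probabilities p_i is H = -sum(p_i * log2(p_i)), measured in bits per symbol. A small Python sketch of the calculation (the function name and example strings are my own):

```python
import math
from collections import Counter

def entropy_bits(message: str) -> float:
    """Shannon entropy H = -sum(p * log2(p)) over observed symbol frequencies:
    the average number of bits per symbol an ideal encoder would need."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(entropy_bits("aaaa"))         # 0.0 -- a perfectly predictable source carries no information
print(entropy_bits("abab"))         # 1.0 -- one bit per symbol, like a fair coin
print(entropy_bits("hello world"))  # ~2.85 bits per symbol
```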
As well as defining information, Shannon analyzed the ability to send information through a communications channel. He found that a channel had a certain maximum transmission rate that could not be exceeded; today we call that the capacity of the channel. Shannon demonstrated mathematically that even in a noisy channel with low capacity, essentially perfect, error-free communication could be achieved by keeping the transmission rate within the channel's capacity and by using error-correcting schemes: the transmission of additional bits that enable the data to be extracted from the noise-ridden signal.
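For the common case of a channel constrained by bandwidth and Gaussian noise, the limit takes the concrete form of the Shannon-Hartley theorem, C = B * log2(1 + S/N). A quick Python sketch (the telephone-line figures below are rough illustrative assumptions, not from the article):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Roughly a voice telephone line: ~3 kHz of bandwidth at ~30 dB SNR (S/N = 1000).
print(f"{shannon_capacity_bps(3000, 1000):,.0f} bits/s")  # ~29,901 bits/s
```

No amount of clever engineering can push a reliable data rate above C; dial-up modems spent decades closing in on exactly this kind of ceiling.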
Today everything from modems to music CDs relies on error correction to function. A major accomplishment of quantum-information scientists has been the development of techniques to correct errors introduced into quantum information and to determine just how much can be done with a noisy quantum communications channel or with entangled quantum bits (qubits) whose entanglement has been partially degraded by noise.
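The simplest error-correcting scheme of the kind described above is a repetition code: transmit each bit several times and take a majority vote at the receiver. A toy Python sketch (real systems use far more efficient codes, such as Hamming or Reed-Solomon codes):

```python
def encode_repetition(bits: str, n: int = 3) -> str:
    """Repeat each bit n times before transmission."""
    return "".join(b * n for b in bits)

def decode_repetition(coded: str, n: int = 3) -> str:
    """Majority vote over each block of n copies; recovers a bit as long as
    fewer than half of its copies were flipped by noise."""
    blocks = (coded[i:i + n] for i in range(0, len(coded), n))
    return "".join("1" if block.count("1") > n // 2 else "0" for block in blocks)

sent = encode_repetition("1011")   # '111000111111'
noisy = "101000111110"             # noise flips one bit in two different blocks
print(decode_repetition(noisy))    # '1011' -- both errors corrected
```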


The Unbreakable Code
A year after he launched information theory, Shannon published a paper proving that unbreakable cryptography was possible. (He did this work in 1945, but at the time it was classified.) The scheme is called the one-time pad or the Vernam cipher, after Gilbert Vernam, who had invented it near the end of World War I. The idea is to encode the message with a random series of digits--the key--so that the encoded message is itself completely random. The catch is that one needs a random key that is as long as the message to be encoded, and no key may ever be used twice. Shannon's contribution was to prove rigorously that this code was unbreakable. To this day, no other encryption scheme is known to be unbreakable.
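A minimal Python sketch of the one-time pad, using XOR as the combining operation (the standard modern rendering; Vernam's original machine combined key and message tapes bit by bit in the same modulo-2 fashion):

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Combine each message byte with the corresponding key byte via XOR."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))  # random key exactly as long as the message

ciphertext = xor_bytes(message, key)    # without the key, indistinguishable from noise
recovered = xor_bytes(ciphertext, key)  # XOR with the same key inverts the encryption
print(recovered)                        # b'ATTACK AT DAWN'
# Security rests on the two rules above: the key is as long as the message,
# and it is never used twice.
```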
The problem with the one-time pad (so-called because an agent would carry around his copy of a key on a pad and destroy each page of digits after they were used) is that the two parties to the communication must each have a copy of the key, and the key must be kept secret from spies or eavesdroppers. Quantum cryptography solves that problem. More properly called quantum key distribution, the technique uses quantum mechanics and entanglement to generate a random key that is identical at each end of the quantum communications channel. The quantum physics ensures that no one can eavesdrop and learn anything about the key: any surreptitious measurements would disturb subtle correlations that can be checked, similar to error-correction checks of data transmitted on a noisy communications line.


Encryption based on the Vernam cipher and quantum key distribution is perfectly secure: quantum physics guarantees the security of the key, and Shannon's theorem proves that the encryption method is unbreakable.
A Unique, Unicycling Genius


Shannon fit the stereotype of the eccentric genius to a T. At Bell Labs (and later M.I.T., where he returned in 1958 until his retirement in 1978) he was known for riding in the halls on a unicycle, sometimes juggling as well [see "Profile: Claude E. Shannon," by John Horgan; Scientific American, January 1990]. At other times he hopped along the hallways on a pogo stick. He was always a lover of gadgets and among other things built a robotic mouse that solved mazes and a computer called the Throbac ("THrifty ROman-numeral BAckward-looking Computer") that computed in roman numerals. In 1950 he wrote an article for Scientific American on the principles of programming computers to play chess [see "A Chess-Playing Machine," by Claude E. Shannon; Scientific American, February 1950].
In the 1990s, in one of life's tragic ironies, Shannon came down with Alzheimer's disease, which could be described as the insidious loss of information in the brain. The communications channel to one's memories--one's past and one's very personality--is progressively degraded until every effort at error correction is overwhelmed and no meaningful signal can pass through. The bandwidth falls to zero. The extraordinary pattern of information processing that was Claude Shannon finally succumbed to the depredations of thermodynamic entropy in February 2001. But some of the signal generated by Shannon lives on, expressed in the information technology in which our own lives are now immersed.
Source: https://www.scientificamerican.com

Why Are Some Countries Rich And Others Poor?

Think of an economy as reflecting three fundamental features: capital, labor and what I will call the “efficiency factor.” A country’s stock of capital consists of machinery, buildings, land, etc. Labor consists of the country’s human resources that are used in production. The efficiency factor determines how well the country turns capital and labor into output.
Now let’s jump to the bottom line: which of these three factors is most responsible for differences in GDP per person in countries around the world? The answer: it’s the efficiency factor.


by Scott A. Wolla
"Open markets offer the only realistic hope of pulling billions of people in developing countries out of abject poverty, while sustaining prosperity in the industrialized world."1
—Kofi Annan, former United Nations Secretary-General

Many people mark the birth of economics as the publication of Adam Smith's The Wealth of Nations in 1776. Actually, this classic's full title is An Inquiry into the Nature and Causes of the Wealth of Nations, and Smith does indeed attempt to explain why some nations achieve wealth and others fail to do so. Yet, in the 241 years since the book's publication, the gap between rich countries and poor countries has grown even larger. Economists are still refining their answer to the original question: Why are some countries rich and others poor, and what can be done about it?
"Rich" and "Poor"
In common language, the terms "rich" and "poor" are often used in a relative sense: A "poor" person has less income, wealth, goods, or services than a "rich" person. When considering nations, economists often use gross domestic product (GDP) per capita as an indicator of average economic well-being within a country. GDP is the total market value, expressed in dollars, of all final goods and services produced in an economy in a given year. In a sense, a country's GDP is like its yearly income. So, dividing a particular country's GDP by its population is an estimate of how much income, on average, the economy produces per person (per capita) per year. In other words, GDP per capita is a measure of a nation's standard of living. For example, in 2016, GDP per capita was $57,467 in the United States, $42,158 in Canada, $27,539 in South Korea, $8,123 in China, $1,513 in Ghana, and $455 in Liberia (Figure 1).2


[Figure 1: GDP per capita in 2016 for selected countries]
NOTE: Liberia's GDP per capita of $455 is included but not visible due to the scale. The Republic of Korea is the official name of South Korea.
SOURCE: World Bank, retrieved from FRED®, Federal Reserve Bank of St. Louis; https://fred.stlouisfed.org/graph/?g=eMGq, accessed July 26, 2017.
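The arithmetic behind those figures is simple division: GDP per capita = GDP / population. A Python sketch using approximate 2016 values (the raw GDP and population numbers below are my rough assumptions, chosen only to reproduce roughly the per capita figures quoted above):

```python
# Approximate 2016 GDP (US dollars) and population -- illustrative values only.
economies = {
    "United States": (18.6e12, 323e6),
    "China":         (11.2e12, 1.38e9),
    "Liberia":       (2.1e9,   4.6e6),
}

for country, (gdp, population) in economies.items():
    print(f"{country}: ${gdp / population:,.0f} per person")
# United States: ~$57,585; China: ~$8,116; Liberia: ~$457
```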

Because GDP per capita is simply GDP divided by the population, it is a measure of income as if it were divided equally among the population. In reality, there can be large differences in the incomes of people within a country. So, even in a country with relatively low GDP, some people will be better off than others. And, there are poor people in very wealthy countries. In 2013 (the most recent year comprehensive data on global poverty are available), 767 million people, or 10.7 percent of the world population, were estimated to be living below the international poverty line of $1.90 per person per day.3 Whether for people or nations, the key to escaping poverty lies in rising levels of income. For nations specifically, which measure wealth in terms of GDP, escaping poverty requires increasing the amount of output (per person) that their economy produces. In short, economic growth enables countries to escape poverty.
How Do Economies Grow?
Economic growth is a sustained rise over time in a nation's production of goods and services. How can a country increase its production? Well, an economy's production is a function of its inputs, or factors of production (natural resources, labor resources, and capital resources), and the productivity of those factors (specifically the productivity of labor and capital resources), which is called total factor productivity (TFP). Consider a shoe factory. Total shoe production is a function of the inputs (raw materials such as leather, labor supplied by workers, and capital resources, which are the tools and equipment in the factory), but it also depends on how skilled the workers are and how useful the equipment is. Now, imagine two factories with the same number of workers. In the first factory, workers with basic skills move goods around with push carts, assemble goods with hand tools, and work at benches. In the second factory, highly trained workers use motorized forklifts to move pallets of goods and power tools to assemble goods that move along a conveyor belt. Because the second factory has higher TFP, it will have higher output, earn greater income, and provide higher wages for its workers. Similarly, for a country, higher TFP will result in a higher rate of economic growth. A higher rate of economic growth means more goods are produced per person, which creates higher incomes and enables more people to escape poverty at a faster rate. But, how can nations increase TFP to escape poverty? While there are many factors to consider, two stand out.
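Economists often formalize this with a production function such as the Cobb-Douglas form, Y = A * K^alpha * L^(1-alpha), where A is TFP, K is capital, and L is labor. A hedged Python sketch of the two-factory comparison (all parameter values are illustrative, not from the article):

```python
def output(tfp: float, capital: float, labor: float, alpha: float = 0.3) -> float:
    """Cobb-Douglas production: Y = A * K**alpha * L**(1 - alpha)."""
    return tfp * capital**alpha * labor**(1 - alpha)

# Two factories with identical capital and labor but different TFP,
# like the push-cart factory and the forklift factory in the text:
low  = output(tfp=1.0, capital=100, labor=50)
high = output(tfp=2.0, capital=100, labor=50)
print(f"High-TFP factory produces {high / low:.1f}x the output")  # 2.0x
```

With identical inputs, output scales one-for-one with A, which is why differences in TFP, rather than in raw inputs, do most of the work in explaining income gaps.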
Institutions
First, institutions matter. For an economist, institutions are the "rules of the game" that create the incentives for people and businesses. For example, when people are able to earn a profit from their work or business, they have an incentive not only to produce but also to continually improve their method of production. The "rules of the game" help determine the economic incentive to produce. On the flip side, if people are not monetarily rewarded for their work or business, or if the benefits of their production are likely to be taken away or lost, the incentive to produce will diminish. For this reason, many economists suggest that institutions such as property rights, free and open markets, and the rule of law (see the boxed insert) provide the best incentives and opportunities for individuals to produce goods and services.




North and South Korea often serve as an example of the importance of institutions. In a sense they are a natural experiment. These two nations share a common history, culture, and ethnicity. In 1953 these nations were formally divided and governed by very different governments. North Korea is a dictatorial communist nation where property rights and free and open markets are largely absent and the rule of law is repressed. In South Korea, institutions provide strong incentives for innovation and productivity. The results? North Korea is among the poorest nations in the world, while South Korea is among the richest.4



[Figure 2: GDP per capita since 1970 for South Korea, China, Ghana, and Liberia]
NOTE: While the Republic of Korea (the official name of South Korea), China, Ghana, and Liberia had similar standards of living in 1970, they have developed differently since then.
SOURCE: World Bank, retrieved from FRED®, Federal Reserve Bank of St. Louis; https://fred.stlouisfed.org/graph/?g=eMGt, accessed July 26, 2017.

While this seems like a simple relationship—if government provides strong property rights, free markets, and the rule of law, markets will thrive and the economy will grow—research suggests that the "institution story" alone does not provide a complete picture. In some cases, government support is important to the development of a nation's economy. Closer inspection shows that the economic transformation in South Korea, which started in the 1960s, occurred under the dictatorial rule of Park Chung-hee (who redirected the nation's economic focus toward export-driven industry), not under conditions of strong property rights, free markets, and the rule of law (which came later).5 South Korea's move toward industrialization was an important first step in its economic development (see South Korea's growth in Figure 2). China is another example of an economy that has grown dramatically. In a single generation it has been transformed from a backward agrarian nation into a manufacturing powerhouse. China tried market reforms during the Qing dynasty (whose modernization reforms started in 1860 and lasted until its overthrow in 1911) and the Republic Era (1912-1949), but they were not effective. China's economic transformation began in 1978 under Deng Xiaoping, who imposed a government-led initiative to support industrialization and the development of markets, both internally and for export of Chinese goods.6 These early government-supported changes helped develop the markets necessary for the current, dramatic increase in economic growth (see Figure 2).
Trade
Second, international trade is an important part of the economic growth story for most countries. Think about two kids in the school cafeteria trading a granola bar for a chocolate chip cookie. They are willing to trade because it offers them both an opportunity to benefit. Nations trade for the same reason. When poorer nations use trade to access capital goods (such as advanced technology and equipment), they can increase their TFP, resulting in a higher rate of economic growth.7 Also, trade provides a broader market for a country to sell the goods and services it produces. Many nations, however, have trade barriers that restrict their access to trade. Recent research suggests that the removal of trade barriers could close the income gap between rich and poor countries by 50 percent.8
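The cafeteria story is the textbook logic of comparative advantage: each trader specializes in what costs them the least to give up. A small worked Python sketch with made-up numbers:

```python
# Output per worker per day -- invented illustrative figures.
production = {
    "CountryA": {"machinery": 4, "grain": 8},
    "CountryB": {"machinery": 1, "grain": 5},
}

for country, out in production.items():
    # Opportunity cost of one machine, measured in grain forgone:
    cost = out["grain"] / out["machinery"]
    print(f"{country} gives up {cost} units of grain per machine")
# CountryA gives up 2.0, CountryB gives up 5.0, so A should specialize in
# machinery and B in grain; any trade price between 2 and 5 grain per
# machine leaves both countries better off than self-sufficiency.
```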
Conclusion
Economic growth of less-developed economies is key to closing the gap between rich and poor countries. Differences in the economic growth rate of nations often come down to differences in inputs (factors of production) and differences in TFP—the productivity of labor and capital resources. Higher productivity promotes faster economic growth, and faster growth allows a nation to escape poverty. Factors that can increase productivity (and growth) include institutions that provide incentives for innovation and production. In some cases, government can play an important part in the development of a nation's economy. Finally, increasing access to international trade can provide markets for the goods produced by less-developed countries and also increase productivity by increasing the access to capital resources.

Notes
1 Globalist. "Kofi Annan on Global Futures." February 6, 2011; https://www.theglobalist.com/kofi-annan-on-global-futures/.
2 Data from the World Bank retrieved from FRED®; https://fred.stlouisfed.org/graph/?g=erxy, accessed July 26, 2017.
3 World Bank. "Poverty and Shared Prosperity 2016: Taking on Inequality." 2016, p. 4; http://www.worldbank.org/en/publication/poverty-and-shared-prosperity.
4 Olson, Mancur. "Big Bills Left on the Sidewalk: Why Some Nations are Rich, and Others Poor." Journal of Economic Perspectives, Spring 1996, 10(2), pp. 3-24.
5 Wen, Yi and Wolla, Scott. "China's Rapid Economic Rise: A New Application of an Old Recipe." Social Education, March/April 2017, 81(2), pp. 93-97.
6 Wen, Yi and Fortier, George E. "The Visible Hand: The Role of Government in China's Long-Awaited Industrial Revolution." Federal Reserve Bank of St. Louis Review, Third Quarter 2016, 98(3), pp. 189-226; https://dx.doi.org/10.20955/r.2016.189-226.
7 Santacreu, Ana Maria. "Convergence in Productivity, R&D Intensity, and Technology Adoption." Federal Reserve Bank of St. Louis Economic Synopses, No. 11, 2017; https://doi.org/10.20955/es.2017.11.
8 Mutreja, Piyusha; Ravikumar, B. and Sposi, Michael J. "Capital Goods Trade and Economic Development." Working Paper No. 2014-012, Federal Reserve Bank of St. Louis, 2014; https://research.stlouisfed.org/wp/2014/2014-012.pdf.

© 2017, Federal Reserve Bank of St. Louis. The views expressed are those of the author(s) and do not necessarily reflect official positions of the Federal Reserve Bank of St. Louis or the Federal Reserve System.



Glossary
Factors of production: The natural resources, human resources, and capital resources that are available to make goods and services. Also known as productive resources.
Capital resources: Goods that have been produced and are used to produce other goods and services. They are used over and over again in the production process. Also called capital goods and physical capital.
Standard of living: A measure of the goods and services available to each person in a country; a measure of economic well-being. Also known as per capita real GDP (gross domestic product).
Trade barrier: A government-imposed restriction on the international trade of goods or services.
Thanks to https://research.stlouisfed.org/publications/page1-econ/2017/09/01/why-are-some-countries-rich-and-others-poor/