Sunday 9 July 2017

Blogs/Websites that I follow

I share here some of the blogs that I follow. Most of the articles in these blogs are about research, education, PhD, and computer science.
I hope these blogs will be of help to you.

Sunday 12 February 2017

Maps

The map lists various areas of study that come under the aegis of mathematics. It is a Herculean task to enumerate all the disciplines in mathematics. Dominic has done an amazing job here.
  • Foundations - Fundamental Rules, Mathematical Logic, Set Theory, Category Theory, Theory of Computation, Complexity Theory 
  • Pure Mathematics
    • Number Systems - Natural Numbers, Integers, Rational Numbers, Real Numbers, Complex Numbers
    • Structures - Number Theory, Combinatorics, Algebra, Linear Algebra, Group Theory, Order Theory
    • Spaces - Geometry, Trigonometry, Fractal Geometry, Differential Geometry
    • Changes - Calculus, Vector Calculus, Chaos Theory, Dynamical Systems, Complex Analysis
  • Applied Mathematics - Numerical Analysis, Game Theory, Economics, Engineering, Computer Science, Machine Learning, Probability, Statistics, Cryptography, Optimization, Biomathematics, Mathematical Physics, Mathematical Chemistry

Tuesday 3 January 2017

Vijay Amritraj @ IITM

I had the opportunity to listen to Vijay Amritraj as part of Shaastra '17.

He is a great orator. He has served as a Messenger of Peace for the United Nations, reporting directly to the then UN Secretary-General, Kofi Annan. The Vijay Amritraj Foundation takes up various social causes.

He is confident about the improvements taking place in the sports scene of the country. When he was playing tennis in the 70s, cricket was the only sport in India. Now, people in India pursue and follow various other sports such as badminton, chess, shooting, kabaddi, and wrestling.

He commented that people in our country like to play it safe, which is why not many still pursue sports as a career. He is confident that there is talent in the country.

He remarked that technological advances, such as computer rankings and Hawk-Eye prediction, have changed tennis in many ways. The average height of players has increased. Rackets have become stronger. The surfaces have become slower. The ball has become heavier. The quality of the game has improved a lot.

Thursday 22 December 2016

To Kill A Mockingbird, Harper Lee

Awesome Read! 

In To Kill A Mockingbird, Lee discusses a lot of issues - gender disparity, class distinction, racial division, capital punishment, rape, the Great Depression, and ethics and morality - as viewed from the perspective of the young girl, Scout. TKAM is one of the best novels I have ever read.

To spice things up, I gave my own titles to each chapter in the book.

Part I
Chapter 1 - Radley Opening
Chapter 2 - Morning Sickness
Chapter 3 - Afternoon Show
Chapter 4 - Vacation Drama
Chapter 5 - Tweet Radley
Chapter 6 - Peep Talk
Chapter 7 - Thank Cement
Chapter 8 - Tundra Blaze
Chapter 9 - Landing Tussle
Chapter 10 - Dead Shot
Chapter 11 - Dubose Deadly

Part II
Chapter 12 - Cal Church
Chapter 13 - Finch Pride
Chapter 14 - Dill Flee
Chapter 15 - Cunningham Encounter
Chapter 16 - The Courthouse
Chapter 17 - Tate-Bob Witness
Chapter 18 - Mayella Witness
Chapter 19 - Tom Witness
Chapter 20 - Closing Remarks
Chapter 21 - The Verdict
Chapter 22 - Tears of Injustice
Chapter 23 - Gender-Class-Race Divide
Chapter 24 - Missionary Tea Hypocrisy
Chapter 25 - Maycomb Tribune
Chapter 26 - Grace Double Standards
Chapter 27 - Back to Normal
Chapter 28 - Bob Attack
Chapter 29 - Boo Save
Chapter 30 - Alternate Story
Chapter 31 - All is Well

Tuesday 6 December 2016

English Grammar Punctuation

Notes from Eats, Shoots & Leaves (Lynne Truss)
 
Traditionally, punctuation made it easier to read text aloud by signalling pauses, which was especially useful for actors on stage. In modern usage, punctuation serves additional functions such as indicating emphasis, marking syntax, and avoiding ambiguity.

Different publishing houses follow different style guides for punctuation. Additionally, British usage differs from American usage (e.g. the placement of punctuation inside quotation marks).

Apostrophe: possessive marker (e.g. Jack's, boy's, boys'), to indicate omission (e.g. summer of '69), to indicate time or quantity (e.g. two months' notice), plurals of letters and words (e.g. f's, do's and don't's); no need to use for plurals of abbreviations (e.g. MPs and MLAs) or decades (e.g. 1980s)

Comma: for lists (e.g. Tom, Dick and Harry), for joining complete sentences with a conjunction, and as bracketing commas (instead of em-dashes or parentheses)

Semicolon and Colon: to indicate pause and emphasis

Exclamation mark, italics, quotation marks (single and double), brackets (round, square, curly, angle)

How to choose between single and double quotation marks?
How to choose among round brackets, em-dashes and commas?

Hyphen: to combine words (e.g. pre-train), when a noun phrase acts as an adjective (e.g. state-of-the-art model), to split unfinished words at the end of a line, to avoid ambiguity (e.g. re-formed vs. reformed)
 
Punctuation Marks
  • Full stop 
    Alice met Bob.
  • Comma
    Alice gave Bob a pen, paper, and a pencil. 
    Alice, a student, met Bob.
  • Semicolon
    Alice gave Bob a paper; Bob took it reluctantly.
  • Colon
    Alice gave Bob a few items: a pen, a paper, and a pencil.
  • Question mark
    Did Alice meet Bob?
  • Exclamation mark
    Hurray, we won! Yippee!
  • Quotes
    “Come,” Alice told Bob.
  • Apostrophe denotes contraction and possession.
    it’s, Alice's, p’s, 7’s, 1990s, MPs
  • Hyphen
    Does your organization have a by-law?
  • Dash denotes a comment.
    Alice will not come - I hope so.
  • Parentheses denote supplementary information.
    Alice (a student) met Bob.
 
Character   Code Point   Name           Purpose
‐           U+2010       Hyphen         To represent compound terms
—           U+2014       Em dash        In place of commas or parentheses (use the em dash sparingly; prefer the alternatives)
–           U+2013       En dash        To denote ranges
−           U+2212       Minus          To represent subtraction
-           U+002D       Hyphen-minus   The ASCII hyphen
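
A minimal Python snippet (just an illustration) to print each of these characters from its code point and confirm its Unicode name:

import unicodedata

# Print each dash-like character with its code point and official Unicode name.
for codepoint in (0x2010, 0x2014, 0x2013, 0x2212, 0x002D):
    char = chr(codepoint)
    print(f"U+{codepoint:04X}  {char}  {unicodedata.name(char)}")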

Friday 16 September 2016

Information Theory, Khan Academy

Recently, I have been going through some interesting videos on information theory. I am posting the link here: Information Theory, Khan Academy, for the benefit of those who want to learn something new, fundamental and interesting. I will try to summarize what I understood from the lectures below.

1. Ancient Information Theory
It is interesting to note how humans started communicating with each other. Ancient humans used pictographs and ideograms engraved on rocks and in caves to share information. Later on, symbols and alphabets were devised for ease of communication.

With the passage of time, advanced communication technologies were developed. For instance, the Greeks and Romans used torches, especially in battles, for quick long-distance communication. In the 17th century, shutter telegraphs were the norm; they could cover all the letters of the English alphabet, and with the help of a telescope it was possible to send information across great distances. Still, these techniques were not sufficient for effective communication because of their low expressive power and low speed.

With the discovery of electricity, the information age began. Visual telegraphs were soon replaced by electrostatic telegraphs, which made it possible to send large amounts of information over long distances in a short time.

2. Modern Information Theory
How fast can two parties communicate with each other? The limiting rate at which messages can be sent depends on the symbol rate (baud) and the difference. The symbol rate is the number of symbols that can be transferred per second. There is a fundamental limit on how closely two pulses can be spaced: due to noise, a pulse may not be perfect, and if two pulses are very close to each other there is a high chance of inter-pulse interference, which makes it difficult for the receiver to decode the signal. The difference is the number of signaling events per symbol. The message space is the number of possible messages.

How is it possible to quantify information? A possible way to do this is to consider a scenario where the receiver asks the sender a number of yes/no questions in order to receive the information. Based on the minimum number of questions required to receive the complete message, it is possible to quantify information. This unit is called a binary digit, or bit for short. Mathematically, it is the logarithm (to the base 2) of the size of the message space. For example, to pass the name of a book from the Bible, 6-7 bits are required; to pass the names of 2 books from the Bible, 12-14 bits are required.
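
As a quick sanity check of the book example (assuming the 66 books of the Bible as the message space), here is a tiny Python snippet:

import math

# Number of yes/no questions needed to pin down one message out of N equally likely messages.
def bits_needed(message_space_size):
    return math.log2(message_space_size)

print(bits_needed(66))       # ~6.04, so 6-7 questions to name one book
print(bits_needed(66 ** 2))  # ~12.09, roughly double for a pair of books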

Human communication is a mix of randomness and statistical dependencies. Claude Shannon uses a Markov model as the basis of how we can think about communication. Given a message, a machine can be designed that generates similar-looking text. As we progress from the zeroth-order approximation to first-order and second-order approximations, the text generated by the machine looks more and more like the original message.
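
Here is a toy sketch in Python of the first-order approximation (my own illustration, not Shannon's machine): the next word is drawn only from the words that followed the current word in the source text.

import random
from collections import defaultdict

def markov_generate(text, length=15, seed=1):
    # First-order approximation: the next word depends only on the current word.
    words = text.split()
    table = defaultdict(list)
    for current, following in zip(words, words[1:]):
        table[current].append(following)
    random.seed(seed)
    word = random.choice(words)
    output = [word]
    for _ in range(length - 1):
        followers = table.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

print(markov_generate("the dog chased the cat and the cat chased the dog"))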

Entropy - Shannon gives a mathematical method to quantify information. He calls this quantity entropy: H = -Σ p_i log2(p_i), where p_i is the probability of the i-th outcome. If all the outcomes are equally likely, entropy is maximum. If there is some predictability in the outcomes, entropy comes down. For example, a text with random words and letters has higher Shannon information than a "normal" text. This seems counter-intuitive at first, because information theory does not deal with the semantic content of a message but with the number of symbols required to communicate it. The only way to reproduce the random text is to copy it as is, whereas the "normal" text can be compressed using rules, thanks to its predictable nature.
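
A minimal Python sketch of the entropy formula, comparing a fair coin with a heavily biased one:

import math

def shannon_entropy(probabilities):
    # H = -sum(p * log2(p)) over the outcomes with non-zero probability.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit  (unpredictable)
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits (predictable)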

Coding theory - If the entropy is not maximum, then it is possible to compress a message. But what are the ways in which we can compress a message? David Huffman came up with Huffman coding, a strategy that compresses a message by encoding symbols into bits with the help of a binary tree. However, the limit of (lossless) compression is the entropy of the message source; compressing a message beyond this limit loses information.
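
A compact Python sketch of the idea (a bare-bones Huffman coder of my own, not an optimized implementation): frequent symbols end up with short codes and rare symbols with long ones.

import heapq
from collections import Counter

def huffman_codes(text):
    # Start with one leaf per symbol, keyed by frequency (the integer is only a tie-breaker).
    freq = Counter(text)
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    # Repeatedly merge the two least frequent subtrees into one node.
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (left, right)))
        counter += 1
    # Walk the tree: a left edge adds "0", a right edge adds "1".
    codes = {}
    def walk(node, prefix=""):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"
    walk(heap[0][2])
    return codes

print(huffman_codes("abracadabra"))  # 'a' gets the shortest code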

Error detection & correction - During communication, noise in the channel corrupts the message, making it difficult for the receiver to understand it. How is it possible to deal with noise? Richard Hamming came up with the idea of parity bits, built upon the concept of repetition. Error correction is thus done by using more symbols to encode the same message, which increases the size of the message.
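
To make the parity-bit idea concrete, here is a minimal Python sketch of the classic Hamming(7,4) code (my own illustration): 4 data bits are protected by 3 parity bits, and any single flipped bit can be located and corrected.

def hamming74_encode(d1, d2, d3, d4):
    # Three parity bits, each covering a different subset of the data bits.
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]    # 7-bit codeword

def hamming74_correct(c):
    # Recompute the parity checks; the syndrome points at the flipped position (0 = no error).
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    position = s1 + 2 * s2 + 4 * s3
    if position:
        c[position - 1] ^= 1               # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]        # recovered data bits

word = hamming74_encode(1, 0, 1, 1)
word[4] ^= 1                               # noise flips one bit in transit
print(hamming74_correct(word))             # [1, 0, 1, 1] - the original data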

Reference
Shannon, Claude Elwood. "A mathematical theory of communication." ACM SIGMOBILE Mobile Computing and Communications Review 5.1 (2001): 3-55. 

Friday 15 July 2016

Animal Farm, A Fairy Story - George Orwell

Animal Farm is a nice political satire on the Soviet Union written by George Orwell in 1943-44. The story portrays the evil effects of Socialism in an intelligent manner.

An interesting feature of Animal Farm is that the book spans only 90 pages, which helps it remain a favorite among the busy working class even today.

I gave my own title to each chapter in the book. Spoiler Alert!

Chapter I The Dream Proposition
Chapter II The Rebellion Theorem
Chapter III The Heydays Axiom
Chapter IV The Recapture Claim
Chapter V The Napoleon Prime
Chapter VI The Windmill Lemma
Chapter VII The Traitors Corollary
Chapter VIII The Frederick-Pilkington Conundrum
Chapter IX The Boxer-Glue Conjecture
Chapter X The Pig-Man Paradox

Tuesday 24 May 2016

A Brief History of Time, Stephen Hawking

The Uncertainty Principle
Laplace argued that the universe was deterministic, i.e. we can predict the changes in the state of the universe provided we know its current state. However, Heisenberg's uncertainty principle showed that the more accurately we measure the position of a particle, the less accurately we can know its velocity, and vice versa. As a result, it is difficult to measure the exact state of the universe at any given point of time.

Planck's quantum hypothesis and Heisenberg's uncertainty principle led to the theory of quantum mechanics, where the position of an object is defined in terms of probabilities, i.e. an object would be at position A at time B with some probability C.

The dual nature of light is an implication of quantum mechanics. The quantum hypothesis said that light energy, which was thought to be composed of waves, is emitted in discrete packets called quanta. The uncertainty principle implies that particles may appear to occupy multiple positions depending on the measurement. Interference of waves as well as of particles (the double-slit experiment) has been observed.

The interference of particles helped physicists understand the nature of electron orbits in an atom. There are only a finite number of valid orbits in an atom because of the constructive interference of electron waves around the nucleus; destructive interference makes certain orbits around the nucleus unavailable.

Einstein's general theory of relativity (a classical theory) does not take quantum mechanics into consideration. It is necessary to combine both theories in order to have a general, unified, consistent theory.

Roger Schank Blog

In the latest article on his blog, Roger Schank argues that AI is much more than keyword matching and that an AI program should be able to exchange thoughts, hypotheses and solutions with other programs and with humans.
 
He was commenting on the current state of AI research in the context of the massive media attention given to a Georgia Tech professor announcing at the end of his course that one of his TAs was actually an "AI". In fact, the "AI" he was referring to was nothing better than programs such as MARGIE, ELIZA and PARRY, which were written in the 1960s and 1970s and performed simple keyword matching.

He predicts an impending AI winter 2.0 due to the skewed perception the media, and in turn the public, have about the real potential of AI. He says there are important questions to consider in the field of AI, such as:
  • Can we build machines that think, wonder, remember, feel and understand?
  • Is it enough if we continue to build such machines which do not perform any of the above actions but are still useful in one way or the other?

Tuesday 29 March 2016

Plato and Platypus

This article is a summary of the book "Plato and a Platypus Walk into a Bar" by Thomas Cathcart and Daniel Klein. The authors argue that philosophy and jokes are two sides of the same coin: both amuse us, tickle our minds and make us think. The book is an attempt to explain philosophy through jokes. The authors call this approach philogagging.

Metaphysics - Does the universe have a purpose? What are the characteristics that define an object? Do human beings have free will?
  1. Teleology - What is the purpose of our lives?
  2. Essentialism - What are the essential qualities that define an object?
  3. Rationalism - We acquire knowledge through reasoning.
  4. Infinity and Eternity - Is the universe going to last till eternity?
  5. Determinism versus Free Will - Do we have free will or are all our actions predetermined?
  6. Process Philosophy - Evolving God!
  7. The Principle of Parsimony
Logic - How to perform reasoning?
  1. Aristotle's Law of Non-contradiction - A statement cannot be true and not true simultaneously.
  2. Illogical Reasoning - Wrong reasoning!
  3. Inductive Logic - Reasoning is performed based on observation and generalization.
  4. Falsifiability - For a statement to be meaningful, there should be some possible circumstance under which it could be proved false.
  5. Deductive Logic - Reasoning is performed based on rules.
  6. Argument from Analogy- Conclusions are drawn from analogous entities or situations.
  7. Post Hoc Ergo Propter Hoc Fallacy - An event X succeeded by another event Y does not necessarily mean X is the cause of Y.
  8. Monte Carlo Fallacy - The mistaken belief that past outcomes influence future ones, even though each trial is independent of the others.
  9. Circular Argument - One of the premises of an argument is the conclusion itself.
  10. Respect for Authority Fallacy - Relying on the word of a higher authority instead of verifying the accuracy of a statement.
  11. Zeno's Paradox - Achilles and the Tortoise
  12. Logical and Semantic Paradoxes - The following statement is false. The preceding statement is true.
Epistemology - What does it mean to say I know something? How do we gain knowledge?
  1. Reason versus Revelation - Is knowledge attained through human reasoning or through revelation from God?
  2. Empiricism - Knowledge is attained through sense data
  3. German Idealism (Immanuel Kant)
  4. Philosophy of Mathematics - Analytic versus Synthetic statement, A priori versus A posteriori statement
  5. Pragmatism - Truth of a statement lies in its practical consequences.
  6. Phenomenology
Ethics - What is good? What is right? Is there any absolute standard to distinguish between right and wrong?
  1. Divine Revelation - Ethics is decided by God.
  2. Platonic Virtue - Ethics is decided by human wisdom.
  3. Utilitarianism - The end justifies the means.
  4. Golden Rule - Do unto others what you want others to do unto you (as long as it can be considered a universal law).
  5. Will to Power
  6. Emotivism - Ethics is decided by our emotional state.
  7. Applied Ethics - Professional ethics
  8. Psychoanalysis - Ethics is decided by our unconscious being.
  9. Situation Ethics - Ethics is decided by the situation under consideration.
Philosophy of Religion - Is there a God?
  1. Deism and Historical Religions - "The Force" versus Creator/Clockmaker
  2. Belief in God - Theism (God exists), Atheism (God does not exist), Agnosticism (current evidence cannot prove or disprove the existence of God)
  3. Theological Distinctions - Jewish; Catholic, Protestant, Baptist, Methodist, Jehovah's Witness; Buddhism, Zen
  4. Airhead Philosophy

Philosophy of Language - How do we communicate?
  1. Ordinary Language Philosophy - "I promise" versus "I paint", "nothing" seems to be a thing, "I love you" and "I love ya" occur in different contexts, "I believe in God" can have different connotations
  2. The Linguistic Nature of Proper Names - Bertrand Russell (short descriptions), Saul Kripke (rigid designators)
  3. The Philosophy of Vagueness

Social and Political Philosophy - Do we need laws? Is it possible to have an ideal state?
  1. The State of Nature
  2. Might equals Right
  3. Feminism
  4. Economic Philosophies
  5. Philosophy of Law
Relativity - What are the things that are relative and things that are absolute?
  1. Relativity of Truth
  2. Relativity of Time
  3. Relativity of World-views
  4. Relativity of Views
  5. Absolute Relativity
Existentialism