
Archive for the ‘Science’ Category

The other day I came across a news item: Stephen Hawking is working on a new Theory of Everything. Fine. Einstein before him wrestled with a similar one even when he was brought to the nursing home during his final illness. Think of Einstein working on one branch of science and Stephen Hawking on another branch of the same giant tree. The tree is one, and our brain has the capacity to store information about it; but how much?

According to the late Carl Sagan, the human brain has some 10¹¹ neurons, with circuits and controlling switches for its electro-chemical activity; and every neuron has tiny filaments called dendrites connecting it with others. If we assign each of these connections one bit of information, the brain can hold a number of bits equal to only about 1% of the number of atoms in a crystal of salt! It is not really mind-boggling once we understand the human drive to create order from the information the brain gathers.
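
Sagan's comparison is only an order-of-magnitude count, and it is easy to redo in a few lines. The figures below (about 10¹¹ neurons, roughly a thousand connections per neuron, about 10¹⁶ atoms in a small grain of salt) are the usual ballpark numbers quoted for this comparison, not precise measurements:

```python
# Back-of-envelope arithmetic behind Sagan's comparison (illustrative
# order-of-magnitude figures, not measurements).

neurons = 10**11                  # rough count of neurons in the human brain
connections_per_neuron = 10**3    # order-of-magnitude dendritic links per neuron
brain_bits = neurons * connections_per_neuron   # one bit per connection -> ~10^14

atoms_in_salt_grain = 10**16      # rough atom count in a small grain of salt

ratio = brain_bits / atoms_in_salt_grain
print(f"Brain capacity: ~{brain_bits:.0e} bits")
print(f"Atoms in a grain of salt: ~{atoms_in_salt_grain:.0e}")
print(f"Ratio: {ratio:.0%}")      # about 1 %
```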

If we go from a mere speck of salt to the universe, we would need a brain as big as the universe itself! Our commonsense experience and evolutionary history have prepared us to understand our workaday world. But if we were to seek out exoplanets and other universes, these are not reliable guides. As we approach the speed of light our mass increases prodigiously while our length shrinks toward zero in the direction of motion. Time ceases to mean anything. Without a tangible reference, the context of time and space, would the brain unravel the pulses of information to know anything?
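
These relativistic effects all come from the standard Lorentz factor; for a body moving at speed v relative to an observer,

\[
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad m = \gamma\, m_0, \qquad L = \frac{L_0}{\gamma},
\]

so as v approaches c the factor γ grows without bound: measured lengths along the direction of motion shrink toward zero, and the energy needed to accelerate further diverges, which is why nothing with mass ever actually reaches the speed of light.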

In my view the Theory of Everything is a misnomer. It should be qualified thus: the theory of what human reason alone can digest.

Take a core sample of a tree with an auger. It carries everything essential of the whole tree, and also a record of climate changes, the passage of time and so on.

What you get of an object is not its material content alone but its context as well. This context is rooted in our intuition and commonsense experience, and beyond. Reason in our workaday world can analyze the object, but it is restricted by our own finite nature.

The Theory must go beyond this, farther than reason can reach. As finite beings we abridge Truth into a workable truth of limited use. So reason alone cannot lay hands on the ultimate nature of our universe.

benny

Read Full Post »

Was the evolution of the brain a necessity? Man, being a social animal, has the necessary space in his skull to accommodate the wiring he would need for honing his communicative skills. Other species have comparatively little space for what man is equipped with. Nevertheless a bee can make sense of its world and get the best out of its brief life span. So in terms of utility a life form is never for a moment at a loss to make a go of it. If a group of lizards in Madagascar, thrown onto islands with fewer possibilities of sustenance, evolves into a miniature version of its cousins elsewhere, a biologist would say it is owing to insular dwarfism. So it is. The group has learnt to adapt itself to its environment. The dynamics of jiggering with its body size and habits are as unconscious as their consequences are obvious. So the brain is not merely size and circuits but being conscious of its undeniable connection to the world outside.

Having said this, let me say a word about bilingualism.

If man were not conscious of other groups different from his own, speaking different languages, the topic of bilingualism would be a non-starter.

Bilingualism affects the structure of the brain including both major types of brain tissue – the grey matter and the white matter. The neurons in our brain have two distinct anatomical features: their cell bodies, where all the processing of information, thinking and planning happens, and their axons, which are the main avenues that connect brain areas and transfer information between them. The cell bodies are organised around the surface of the brain – the grey matter – and all the axons converge and interconnect underneath this into the white matter.

We call it white matter because the axons are wrapped in a fatty layer, the myelin, which ensures better neuronal communication – the way information is transferred around the brain. The myelin functions as an “insulation” that prevents information “leaking” from the axon during transfer.

The brain is in fact a jumble of parts, from jellyfish to lizards, all put together. We are using the nerve net from jellyfish, while other design features of the brain are derived from lizards. Jellyfish do not have a brain, and their communication system, developed 600 million years ago, cannot be what is best for us. The brain evolved out of necessity, and dealing with different groups speaking different languages does affect its structure. How is it, then, that some groups like Western societies pride themselves on their ability to explore new ideas, while others resist any idea associated with something held sacrosanct, genuine or imagined? If a non-Muslim takes the name of Allah it is tantamount to blasphemy; if an image of the Prophet Mohammed is depicted it is an insult. What is a drawing but of the same category as letters drawn with a brush or pen, or by pressing keys? If a mullah in the Prophet's name urges his audience to hate and kill, is that not an insult to the position he holds, or to his Maker?

Recalling the recent outrage of terrorists attacking the Charlie Hebdo office in Paris makes me wonder what the brain has to do with murderers who can shoot people in broad daylight in the name of their prophet. Where does their intolerance come from? Are they brainwashed to forget the consequences? Or have their brains been put into sleep mode by their fear of group disapproval? If such fear can skew their thinking, brainwashing is unnecessary. Man has himself necessitated this stagnation. Religion is not detrimental to one's onward development; it is supposed to soothe and heal divisions, since the moral value of any life is worth preserving.

JE SUIS CHARLIE

Read Full Post »

Early in 2014 Jeremy England proposed a theory, based in thermodynamics, showing that the emergence of life was not accidental but necessary: under certain conditions, matter inexorably acquires the key physical attribute associated with life.

From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: The former tend to be much better at capturing energy from their environment and dissipating that energy as heat. Professor England, a 31-year-old assistant professor at MIT, has derived a mathematical formula that he believes explains this capacity. The formula, based on established physics, indicates that when a group of atoms is driven by an external source of energy (like the sun or chemical fuel) and surrounded by a heat bath (like the ocean or atmosphere), it will often gradually restructure itself in order to dissipate ever more energy. This could mean that under certain conditions, matter inexorably acquires the key physical attribute associated with life.
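
England's own derivation is beyond a blog post, but the established physics it builds on can be stated in one line. For a system exchanging heat with a bath at temperature T, the second law requires

\[
\Delta S_{\text{system}} + \frac{Q_{\text{dissipated}}}{T} \ge 0,
\]

and England's result, as I understand it, extends this kind of bookkeeping to transitions between whole macrostates: the more irreversible the restructuring a driven clump of matter undergoes, the more heat it must dump into its surroundings along the way. The exact form of his inequality is not reproduced here; the sketch above is only the familiar second-law statement his argument generalizes.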

Giordano Bruno, who was burnt at the stake for heresy in 1600, was perhaps the first to take Copernicanism to its logical extension, speculating that stars were other suns, circled by other worlds, populated by beings like ourselves. His extreme minority view in his own time now looks better than ever, thanks to England.

ii

Matter has to bring out the abstract aspects underlying it in a material form; that is what gives matter its rise up the ladder (of evolution), and it is just as well that it correspondingly acquires a coherence in its abstract nature.

Light, as a photon with a velocity and a distance to cover, brings in an uncertainty principle of its own. If Creationists and Evolutionists cannot agree on the role of God, let us not get drawn into useless polemics. Doubt is necessary in the realm of the material universe.

How did man bring out the underlying abstract nature of matter to propound a theory of anything? Man, we say, is capable of abstract thinking. So doubt, as well as God, is born of his imagination. What is it worth if it does not touch anything nor lead anywhere? It is useless. If a man, because of doubt, takes the position of a sceptic or of a theist, it has value only for his species. Science does not prove anything else.

Benny

Read Full Post »

Common sense cannot be relied upon at all times. When one puts a shell to the ear and listens, the sound one hears is not of the sea.

Common sense made you connect the seashell with the sound of the sea. It is not the sea; it requires instead a scientific explanation. Spectral resonance, it is called.
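
A rough way to picture it is to treat the shell as a cavity, something like a tube closed at one end, that amplifies whatever ambient noise is already around your ear at its natural frequencies. The tube model and the dimensions below are only an illustration under that assumption, not an acoustic model of any particular shell:

```python
# Rough quarter-wave estimate of the frequencies a shell-like cavity favours.
# Model (assumed): a tube closed at one end resonates at f_n = (2n - 1) * c / (4 * L).
SPEED_OF_SOUND = 343.0   # m/s in air at ~20 °C

def closed_tube_resonances(length_m, n_modes=3):
    """Return the first few resonant frequencies (Hz) of a closed tube."""
    return [(2 * n - 1) * SPEED_OF_SOUND / (4 * length_m)
            for n in range(1, n_modes + 1)]

# A hand-sized shell, say ~10 cm of effective path length (illustrative only).
for f in closed_tube_resonances(0.10):
    print(f"{f:.0f} Hz")
# Ambient noise near these frequencies is reinforced -- the "sea" you hear.
```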

Common sense makes you wonder how trees can grow in the tundra. You would think being covered in snow would make the trees colder; in fact, the snow acts as insulation, helping them stay warmer.

Two parallel lines meet at infinity. Common sense makes you balk at the idea, does it not? Think of a miracle as divine common sense.

benny

Read Full Post »

It has been one of my childhood fancies to imagine I could hear God’s thoughts. If he could hear my prayers, it naturally means I could hear him as well. Over the years such fancies evaporated, since reason became the proof of one’s coming to man’s estate. Recently I came across this news item: an international team of researchers has successfully achieved brain-to-brain transmission of information between humans.

Humans just got a step closer to being able to think a message into someone else’s brain on the other side of the world.

“One such pathway is, of course, the internet, so our question became, ‘Could we develop an experiment that would bypass the talking or typing part of internet and establish direct brain-to-brain communication between subjects located far away from each other, in India and France?'”

Suppose the brain could be stimulated from outside, using a robot, with a series of images? Using electromagnetic induction, images could be beamed from person to person. Transcranial Magnetic Stimulation, or TMS, is the device that does it. The team used a set-up similar to that commonly used in brain-computer interface studies. A human subject had electrodes attached to their scalp, which recorded electrical currents in the brain as the subject had a specific thought. Usually this is interpreted by a computer and translated into a control output, such as a robotic arm or a drone.

In this case, though, the output target was another human.

The study had four participants, aged between 28 and 50. One participant was assigned to the brain-computer interface to transmit the thought, while the other three were assigned to the computer-brain interface to receive the thought.

At the BCI end, the words “Ciao” and “Hola” were translated into binary. This was then shown to the emitter subject, who was instructed to envision actions for each piece of information: moving their hands for a 1 or their feet for a 0. An EEG then captured the electrical information in the sender’s brain as they thought of these actions, which resulted in a sort of neural code for the binary symbols — which in turn was code for the words.

This information was then sent to the three recipient subjects via TMS headsets, stimulating the visual cortex so that the recipient, with ears and eyes covered, saw the binary string as a series of bright lights in their peripheral vision: if the light appeared in one location, it was a 1, and the second location denoted a 0. This information was received successfully and decoded as the transmitted words.
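
The encode/decode step described above is easy to sketch. The study's exact bit encoding and error handling are not detailed here, so the snippet below simply uses plain 8-bit ASCII as an illustration of turning a short word into a string of 0s and 1s at the sender's end and rebuilding it at the receiver's end:

```python
# Minimal sketch of the encode/decode step (illustrative; the paper's actual
# coding scheme may differ).

def word_to_bits(word: str) -> str:
    """Encode each character as 8 bits (the 'emitter' side)."""
    return "".join(format(ord(ch), "08b") for ch in word)

def bits_to_word(bits: str) -> str:
    """Decode 8-bit chunks back into characters (the 'receiver' side)."""
    chunks = [bits[i:i + 8] for i in range(0, len(bits), 8)]
    return "".join(chr(int(chunk, 2)) for chunk in chunks)

message = "hola"
encoded = word_to_bits(message)   # '01101000011011110110110001100001'
print(encoded)
# In the experiment, each 1 was signalled by imagined hand movement and each 0
# by imagined foot movement, then rendered for the receiver as a flash of light
# in one of two locations in their peripheral vision.
print(bits_to_word(encoded))      # 'hola'
```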

This experiment, the researchers said, represents an important first step in exploring the feasibility of complementing or bypassing traditional means of communication. Potential applications perhaps include communicating with stroke patients.

“We anticipate that computers in the not-so-distant future will interact directly with the human brain in a fluent manner, supporting both computer- and brain-to-brain communication routinely,” the team concluded. “The widespread use of human brain-to-brain technologically mediated communication will create novel possibilities for human interrelation with broad social implications that will require new ethical and legislative responses.” (Michelle Starr-C/net)

Read Full Post »

It would have been a proper gesture, as well as belated recognition of Aristotle’s role, to award him a Nobel Prize for science.

Charles Darwin had this to say of Aristotle: “Linnaeus and Cuvier have been my two gods, though in very different ways, but they were mere school-boys to old Aristotle.”

Like Herodotus, who was acknowledged as the Father of History without much controversy, Aristotle ought long ago to have been given the mantle of the Father of Science.

Herodotus lived at a time when much of the history of the nations that loomed large for scholars was accepted as myth, in which gods played a crucial role. Hellenic thought accepted them as necessary. In China the Will of Heaven was held up by the emperor, whose right to rule was a mandate from above. If a dynasty came unravelled, the significance was clear: it had forfeited that right by the Will of the Heavenly Emperor. In the Greek ethos a similar conclusion was accepted as correct.

How is it then that Aristotle the tutor of Alexander the Great failed to gain due recognition from scholars who had received so much from his inquisitive mind?

One may cite so many areas where Aristotle got it wrong. Think of the following ideas proposed by him.

* Too much sex causes sunken eyes because semen drains matter from the human brain.

* The right-hand side of the body is more honorable and therefore hotter than the left. (In India this idea has its variant: it is the left hand one uses to wipe the butt after going to the toilet.)

* The human heart processes and integrates sensations from the external world.

* The brain, beyond storing the matter that becomes semen, was just a cooling device for when the heart’s fires blazed too hot.

Mingled with all the bizarre zoology, however, are many impressively accurate and detailed descriptions. His accounts of the hyena’s genitals, the parental behavior of male catfish, and the limited sensory capacities of sea sponges are just a few of the many things about which he was essentially correct.

A fascinating new book by the evolutionary biologist and science writer Armand Marie Leroi claims that Aristotle fully deserves Darwin’s high praise. In The Lagoon: How Aristotle Invented Science, Leroi argues that Aristotle developed many of the empirical and analytical methods that still define scientific inquiry.

He was more than an encyclopedist. He collected such comprehensive data in order to analyze and interpret it. His theories and interpretations are often astonishingly insightful. One 20th-century Nobel laureate suggested that Aristotle deserved to receive the prize posthumously for his realization that the information that dictates and replicates an organism’s structure is stored in its semen. In some sense he was anticipating the discovery of DNA. His theory of inheritance can also account for recessive traits that skip generations, the contributions of both parents to the features of a child, and unexpected variations in traits that do not derive from either parent.

Many of his observations are readily recognizable to a reader of Darwin. He notes that an elephant’s size confers protection from predators and that fish with high rates of infant mortality produce a larger number of offspring to compensate for the likelihood that most of the progeny will perish. He showed a nuanced understanding of how the forms and features of animals are adapted to their environments. Darwin even mentions Aristotle as a forerunner who anticipates the theory of natural selection in the preface to the third edition of On the Origin of Species.

Aristotle perceived some of the universal associations between longevity, period of gestation, adult body size, and degree of embryonic development that biologists still study today. He noticed the correlations among these features, but he was sensitive to the distinction between correlation and causation and sought to eliminate confounding variables. Then he integrated his findings into broader theories with deep explanatory power.

(ack: the Daily Beast)

Read Full Post »

In an earlier post I discussed how quantum computing is of a different league than digital computing. Today’s computers, like a Turing machine, work by manipulating bits that exist in one of two states: a 0 or a 1. Quantum computers aren’t limited to two states; they encode information as quantum bits, or qubits, which can exist in superposition. Having qubits gives them an inherent parallelism. It is this property that allows a quantum computer to work on a million computations at once, while your desktop PC works on one. Let me quote Scientific American: physicists have now shown how to encode three quantum bits, the kind of data that might be used in the computers of tomorrow, using just two photons. (Those who think science is not of interest may skip over to the second section.)

Let me refresh what I said about computer memory in my previous post. Atoms, ions, photons or electrons and their respective control devices work together to act as computer memory and processor. It is vital to compress data lest it clog up the hard drive and slow Internet traffic down. In classical computing a series of any number of identical bits encodes essentially the same information as just one bit. For quantum objects, however, this is not the case, because the same measurement made on distinct but identically prepared qubits will yield a range of values. As such, accurately recording the quantum state of just one qubit involves taking measurements of multiple identical copies and averaging the results.

Now a group of physicists in Canada has shown for the first time that it is possible to compress this kind of data. For example, if three qubits can each be in a superposition of 0 and 1, measuring them would yield eight possible outcomes: 000, 001, 010, 011, 100, 101, 110 or 111. But for the averaged measurements there are just four options: 0, 1/3, 2/3 or 1. For instance, 001 yields (0+0+1)/3 = 1/3, as do 010 and 100 (the same digits in a different order); 110 yields (1+1+0)/3 = 2/3, just as 101 and 011 do. Because the qubits are identical, the extra information in the ordering can simply be discarded, say the researchers. To make the point, Steinberg draws a classical-physics analogy. “Keeping all of the information,” he says, “is like storing the complete works of Shakespeare just to find out the average rates at which letters are used in the English language.” The results are due to appear in Physical Review Letters.
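
The counting argument is easy to verify for yourself. The sketch below (ordinary Python, nothing quantum about it) simply enumerates the eight three-bit readout strings and groups them by their average, showing why only four values survive once the ordering is thrown away:

```python
# The arithmetic above, spelled out: eight possible three-bit readouts,
# but only four possible averages -- the ordering information is redundant.
from itertools import product
from collections import defaultdict

averages = defaultdict(list)
for bits in product((0, 1), repeat=3):       # 000, 001, ..., 111
    avg = sum(bits) / 3                      # e.g. (0+0+1)/3 = 1/3
    averages[round(avg, 4)].append("".join(map(str, bits)))

for avg, strings in sorted(averages.items()):
    print(f"average {avg:.4f} <- {', '.join(strings)}")
# average 0.0000 <- 000
# average 0.3333 <- 001, 010, 100
# average 0.6667 <- 011, 101, 110
# average 1.0000 <- 111
```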

ii

When St. Paul writes to the church at Corinth, he writes about Moses leading the children of Israel through the wilderness. He gives the Hebraic account an altogether new twist, explaining the significance of the manna and the Rock. In the books of Moses we read that he did as he was commanded of God: he smote the Rock that supplied water to satisfy their thirst. Paul adds, “for they drank of that spiritual Rock that followed them: and that Rock was Christ” (1 Cor. 10:4). Here is an example of superposition, in which the coming of Jesus and his ministry was foretold. If it was divine will in the manner the children of Israel were fed, the same will must be present in the miraculous ways in which the gentile nations are provided for. God’s will cannot play ducks and drakes with his creation. After all, God promised Abraham thus: “in thee shall all families of the earth be blessed” (Gen. 12:3).

benny

Read Full Post »

