Thursday, July 5, 2007

Discovery......



[Translated from Marathi:]

Such is this twilight hour... again and again that moment comes back to me... the moment that parted our ways... A single string of your memories is plucked, and countless ripples of feeling rise in the still pool of my mind... and of their own accord these ripples become waves, running wild in every direction, wherever they find a way... I try to hold them back, but by then they are already brimming over, spilling past the dam of my eyelashes... This flood leaves my world in ruins... and in the deep darkness of the night, even amid the wreckage, I keep searching for some trace of your presence...

Pratima........

Computer brain???


The brain is a highly parallel electrochemical computing and storage device. Because of this, it can be (and is) quite "slow" compared with a linear device and not well suited for exact sequential calculations of any significant complexity, but it is extremely well-suited for large-scale pattern storage and recognition. The massive parallelism also makes for a very fault-tolerant device, which is required for an organic construction where individual storage nodes can (and do) become inoperable over time, and the storage within individual nodes is not guaranteed to be permanent due to its electrochemical nature. Many computer scientists and engineers have been attempting to simulate the massive parallel computational ability of an organic brain for some time now. Some topics for further research: neuroscience, neural networks, parallel distributed computing, cognitive science, organic microcircuitry, and artificial intelligence.
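To make the fault-tolerance point concrete, here is a toy sketch (my own example, not something from the post) using a minimal Hopfield network — one of the classic neural-network abstractions mentioned above. It stores a single 64-bit pattern across many weighted connections, then recalls it correctly from a noisy probe, and still mostly recalls it after several "storage nodes" are killed:

```python
# Toy illustration of massively parallel, fault-tolerant pattern storage:
# a minimal Hopfield network storing one 64-bit (+1/-1) pattern.
import random

def train(patterns):
    """Hebbian learning: weight w[i][j] accumulates p[i]*p[j] per pattern."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, probe, steps=5):
    """Repeated threshold updates settle toward the stored pattern."""
    s = list(probe)
    n = len(s)
    for _ in range(steps):
        s = [1 if sum(w[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s

random.seed(0)
stored = [random.choice([-1, 1]) for _ in range(64)]
w = train([stored])

noisy = list(stored)
for i in range(8):
    noisy[i] *= -1                      # corrupt 8 of 64 input bits

assert recall(w, noisy) == stored       # intact network: perfect recall

for i in range(20, 26):                 # now "kill" six storage nodes
    for j in range(64):
        w[i][j] = w[j][i] = 0.0

ok = sum(a == b for a, b in zip(recall(w, noisy), stored))
print(f"{ok}/64 bits recovered despite noise and dead nodes")
```

Because the pattern is spread redundantly across all the connections rather than held in one place, losing individual nodes degrades recall only slightly — the same property the paragraph attributes to organic brains.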

Comparison of the brain and a computer

Much interest has been focused on comparing the brain with computers. A variety of obvious analogies exist: for example, individual neurons can be compared to transistors on a microchip, and the specialised parts of the brain can be compared with graphics cards and other system components. However, such comparisons are fraught with difficulties. Perhaps the most fundamental difference between brains and computers is that today's computers operate by executing (often sequential) instructions from an input program, while no clear analogy of a program appears in the human brain. The closest equivalent would be the idea of a logical process, but the nature and existence of such entities are subjects of philosophical debate. Given Turing's model of computation, the Turing machine (which shows that any computation performed by a parallel computer can also be done by a sequential computer), this may be a functional rather than a fundamental distinction. However, Maass and Markram have recently argued that "in contrast to Turing machines, generic computations by neural circuits are not digital, and are not carried out on static inputs, but rather on functions of time" (the Turing machine computes recursive functions). Ultimately, computers were not designed to be models of the brain, though fields like neural networks attempt to abstract the behavior of the brain in a way that can be simulated computationally.

In addition to these technical differences, other key differences exist. The brain is massively parallel and interwoven, whereas programming of this kind is extremely difficult for computer software writers (most parallel systems run semi-independently, for example with each part working on a small separate 'chunk' of a problem).
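Turing's point — that a sequential machine can reproduce anything a parallel one computes — can be sketched in a few lines (my example, not the post's). The trick is to update every unit from a frozen snapshot of the previous state, so the order of the sequential loop cannot affect the result:

```python
# A sequential loop emulating one fully parallel update step.
# Every unit reads from a frozen snapshot, so update order is irrelevant --
# the result is exactly what simultaneous hardware would compute.

def parallel_step(state):
    old = list(state)                 # the snapshot is the key trick
    n = len(old)
    return [old[(i - 1) % n] ^ old[(i + 1) % n] for i in range(n)]

# Two generations of a toy cellular automaton (rule 90 on a ring):
state = [0, 0, 0, 1, 0, 0, 0, 0]
for _ in range(2):
    state = parallel_step(state)
print(state)   # [0, 1, 0, 0, 0, 1, 0, 0]
```

The sequential version is of course slower — it does the "simultaneous" work one unit at a time — which is why the distinction is functional (speed) rather than fundamental (computability).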
The human brain is also mediated by chemicals and analog processes, many of which are understood only at a basic level and some of which may not yet have been discovered, so a full scientific description is not yet available. Finally, and perhaps most significantly, the human brain appears to be hard-wired with certain abilities, such as learning language and experiencing emotions that are felt rather than chosen, and it usually develops within a culture.

Nevertheless, there have been numerous attempts to quantify the difference in capability between the human brain and computers. According to Hans Moravec, extrapolating from the known image-processing capability of the retina, the brain has a processing capacity of about 100 trillion instructions per second and is likely to be surpassed by computers by 2030.
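Moravec's extrapolation is simple enough to redo as back-of-envelope arithmetic. The two input figures below are my recollection of his published estimates (roughly 1000 MIPS of equivalent computation for the retina's image processing, and a brain about 75,000 times the retina's tissue mass), so treat them as illustrative assumptions rather than sourced data:

```python
# Back-of-envelope version of Moravec's retina-to-brain extrapolation.
# Both figures are assumed estimates, not sourced from the post.
retina_mips = 1_000              # retina's image processing ~ 1000 MIPS
brain_to_retina_ratio = 75_000   # brain tissue mass vs. retina, roughly
brain_ips = retina_mips * 1_000_000 * brain_to_retina_ratio
print(f"~{brain_ips:.1e} instructions per second")
```

The product comes out on the order of 10^14 instructions per second — consistent with the "100 trillion" figure quoted above.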