Friday, May 16, 2014

The Praxis

By Herb Wiggins, M.D.; Clinical Neurosciences; Discoverer/Creator of the Comparison Process/COMP Theory/Model; 14 Mar. 2014

The Praxis: The Use of Cortical Evoked Responses (CER), functional MRI (fMRI), Magnetoencephalography (MEG), and Magnetic Stimulation of the Brain (MagStim) to investigate recognition, creativity and other aspects of the Comparison Process

The word "praxis" is Greek and means how a model/theory is made/shown to work. Essentially, this means the Comparison Process (COMP) can be detected and studied neurophysiologically by the means described below. Anything which can be studied exists. Because the higher-level cortical function of recognition is fundamentally based upon the COMP, the study of recognition is the study of the COMP in one of its most basic, higher-level forms.

How do we recognize something, be it a word, an image, a face, a voice, a tune, a sensation? We compare it to our Long Term Memory (LTM), and if it matches well, then we have recognized it. This is modeled to some extent by Bayesian statistics. Now what happens in our cortex during recognition? It's complicated, and no one really understands, nor can understand, all of the details of how those tens of thousands of cortical cell column neurons interact, especially with their hundreds to thousands of synapses with other neurons, nor how they interact with the hundreds of thousands of other cortical cell columns. That is a problem far, far too complicated and detailed for any human mind to figure out in a finite time. So we can approach the problem another way, cutting through the complexity and simplifying the understanding by using a high-level tool, the Comparison Process.

Recognition at its deepest level is essentially signal detection. From out of ambient noise the brain detects a meaningful signal, and then compares it to LTM for recognition. If it maps reasonably well, then we positively recognize it. The relationship of this to signal detection in psychology is at once obvious. From that comes recognition, or knowing. We know what the signal means. We recognize it by comparing it to our LTM of similar/same events. It matches, another COMP word. For instance, if we hear a sequence of notes which sounds very much like the opening of a popular tune by the Eagles, say "Desperado", we at once recognize the intervals as that tune, can name it, and can often hum/sing along with it. That is recognition of the auditory/musical kind. If we hear our name yelled out, we often turn towards the source of it and nod or signal back. That means we recognized our name. So by the evidence, recognition is one of the major actions mediated by the COMP. Any event in existence, or idea/word/image which can be studied and recognized neurophysiologically, relating to higher cortical functioning, is part of the Comparison Process.

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1569488/

The above URL is essentially consistent with the underlying neurophysiology of the Comparison Process. When we recognize something we get a P300, that is, a Cortical Evoked Response (CER) at about 300 ms latency. It's similar to photic stimulation (i.e., a sequence of bright lights flashed during EEG recording) and the related auditory evoked responses, which are simple, basic detection responses. The high-level P300 is associated with general recognition, which may be of sensory stimuli, or even recognition of faces, ideas, words, etc. This is the basis of understanding the foundation of the COMP.
As it's cortical in origin, and so are the higher-level cortical processes, it fits. To "match", that is, "compare", the internal memory/model of sensations to external stimuli is essentially what the Comparison Processors do.

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2727362/

And this shows that recognition can be detected and understood as well, including in most specific tasks:

http://brainlab.psych.qc.cuny.edu/pdfs/Johnson%20et%20al.%20Psychophysiology%201985.pdf

Again, these responses were very small in amplitude, but could be detected by repeated stimulations and built up from background. Higher-level cortical recognitions could be detected and recorded with a much higher significance than chance. The articles above show that recognition can be reliably tested in many cases. Essentially they show, electrophysiologically, the recognition process going on, that is, the Comparison Process.

Now let's take a model previously written about: dictionary indexing of words to where they belong in a series of words. Using the same method of collecting evoked potentials described above, it can very likely be shown that the recognition of where a word fits in a series of words can be specifically studied. The recognition of what a word means can also be studied, as can the placing of such a word in a series of alphabetically listed words. In each case, if the experiment is done well enough, the exact point where the reader/indexer finds/inserts the word into its proper place can be determined by the cortical evoked response (CER), the P300. This will confirm that recognition, as previously discussed under the COMP, is being measured by the CER. Each time this is seen and confirmed in subjects, it will confirm that the COMP is indeed working, and also that it is, by the Rule of Commonality, similarly used by most all persons. The same is true for every star on the Hertzsprung-Russell diagram, the IUPAC listing of 32 million known compounds, the massive taxonomies of all known species of life, and all the phone books, directories, and so forth. Most every time a word is seen and recognized, until habituation, a reliable CER P300 can be seen which shows that recognition has happened.

Far, far more complicated recognitions can also be studied in this way: images, complex tunes, word phrases. Take "The beans were in the can" and show that it makes sense, whereas "The can was the beans" makes no sense. There will be no difference between the latter and noise, whereas the former will show the characteristic P300 of highly likely recognition, and the individual will acknowledge that recognition, too. This is the COMP at work in the cortex. The P300 can be used extensively in these cases to show that in most cases of recognition of a sensible word/word cluster, there is a P300. Now, linguistically, we conjugate verbs by the series "I love, you love, he loves, we love, you love, they love"; amo, amas, amat, etc. What the COMP model states is that the paired words will be recognized with a P300, while the isolated words will not. No word is an island. No word stands alone. The recognition shown by the P300 will reveal, according to the COMP model, what makes sense to the brain's speech centers and what does not; that is, what is language and communication and what is not.
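A technical aside on how those tiny responses are "built up from background": evoked-response work ordinarily averages many stimulus-locked trials, so that uncorrelated EEG noise cancels while the time-locked response survives. Here is a minimal sketch in Python with purely synthetic data; the sampling rate, amplitudes and trial count are illustrative assumptions, not measured values:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                        # sampling rate in Hz (assumed)
t = np.arange(0, 0.8, 1 / fs)   # 800 ms epoch after each stimulus

# Hypothetical P300-like waveform: a small positive bump peaking near 300 ms.
p300 = 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))   # microvolts

# Each single trial buries that 2 uV response in ~20 uV of background noise.
n_trials = 400
trials = p300 + 20.0 * rng.standard_normal((n_trials, t.size))

# Averaging stimulus-locked trials shrinks the noise by ~sqrt(N),
# so the hidden response emerges from the background.
average = trials.mean(axis=0)
peak_time = t[np.argmax(average)]
print(f"recovered peak at ~{peak_time * 1000:.0f} ms")   # close to 300 ms
```

Averaging N trials shrinks the noise by about the square root of N, so with 400 trials a 2-microvolt response invisible in any single sweep stands out plainly. This is the workhorse behind all the P300 experiments proposed here.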
This will show that grammar as we know it is NOT as important an aspect of communication as has been thought; rather, the COMP recognition shown by the P300 IS what is meaningful. The real, truly underlying basis of most all language is the Comparison Process. It's how the words compare to each other, which can be detected by the P300, that makes language work, that makes sense. It's the sensory and word context which supplies the meaning, not the formal grammar. It is MEANING which is important, not the grammatical names of the words. The basic linguistic structures will mostly be related by comparison processes in terms of word/phrase recognitions, not the single words of basic school grammar. The P300 will show that what is meaningful to the brain is not necessarily our formal grammar, either. It will give us a far deeper insight into how our higher cortical processes work, that is, the Comparison Process. This is the mind/brain interface. This is the Praxis. It will show that Chomsky's Language Acquisition Device is indeed the cortical cell columns in our speech centers, and that language is innate, because the CER P300s will be mostly the same during language use and recognition across all languages, not just English. It will also show the finer details of how unlike languages are differently processed in the brain, too.

Linguistically, we will take a user who is otherwise cortically normal by MRI, and present to him/her certain signals, such as "What is this?" while holding up a banana, for instance. There will be a P300 when and if she recognizes the banana. There will be a simple motor-related P300 when she's getting ready to sign the word "banana". And this will confirm she knows what it was, and had "re-cognized" the banana. Several other words can also be used, and several other subjects can be tested to find out the normal ranges of these P300 patterns, i.e., to establish Rule of Commonality comparisons and standards.

Now, and the astute know where this is going (their P300s are registering recognition, again empirical introspection), we take Koko and do the same with her. We will see very similar patterns in her as she detects the signed sentence "What is this?", and her cortical P300 will register it. Then when she signs back "banana" we will see a similar P300 in her motor cortex as we have seen in human subjects. If it's not understood, then there will be only delayed, background-noise activity. If she understands it and signs "banana", a similar P300 effect will be seen, comparing well to human detection criteria. Those P300s, the human's and the gorilla's, or the chimp's, will compare favorably and reliably. And this will show not only that humans and higher primates can communicate, but that they are using very similar responses, because our neurophysiologies are very similar, too. The neurophysiology of recognition: the Comparison Process. It will establish that recognition among animals, especially the higher primates, is real and existing, and, if birds, dogs, cats, or other creatures are used, will show that other species are recognizing, too, by an analogous process. This is the comparative neurophysiology of recognition, that is, the Comparison Process. It will establish that Koko and other apes are indeed capable of recognizing Ameslan, and are using it just as we do, given some modest allowances due to species differences and the fact that they have fewer Comparison Processors than we do. Recognition is the key characteristic of the Comparison Process.
And it will guide, in the following ways, our understanding of how our brains' cortical functions work. Currently, functional MRI (fMRI) is being combined (a comparison process, of course) with magnetoencephalography (MEG) to study structure/function relationships in the brain. We can detect brain function by increased blood flow on MRI and then correlate (compare) it with MEG signals, too. These are positive signs and methods, showing real, existing comparison processes going on inside the brain. This is a positive sign that our diagnostic methods are actively using the comparison process in combining fMRI/MEG testing to get more reliable data, as the comparison process shows will, necessarily, happen. Just as a genetic defect can often provide the comparison necessary to better understand normal function, by having something abnormal to compare to normal.

But there is something lacking, and it's this. What happens if those localized comparison processes are turned off momentarily in the brain? Specific functions will be interrupted or be unable to be initiated. That will provide further solid evidence that a specific higher-level function in the brain is being shut down at a certain specific site, previously revealed to be active there: yet another structure/function relationship, which is basic brain anatomy. The means to do this non-invasively is here. It's called magnetic stimulation. Grossly, if we stimulate the brain with a high-Tesla field, it will make the neurons momentarily depolarize. Then we can selectively depolarize, by repeated stimulation, a spot in the cortex/brain where a cortical activity is being done, such as saying the word "longitude". We can find out where this process is going on by interrupting it temporarily with magnetic brain stimulation, can we not? And that will tell us what is going on there by the fact that it stops, having previously located that function in that area by fMRI and MEG. This will complete, non-invasively, the chain of structure/function relationships which can be found in the living brain, will it not?

Now how do we do this? Very easily. We have seen how Venn diagrams overlap (and once again, in the more astute, there is a strong P300 very quickly being created right now). Each of the three magnetic field stimulations, overlapping in three dimensions, is a sub-threshold depolarization of the brain. Where they overlap, however, they would be made safely supra-threshold. And the desired point to target in the brain can be accurately found to within a few millimeters by the already well-worked-out neurosurgical methods of stereotactic mapping. Essentially, the gyri of the cortex of the human brain could be sequentially and repeatedly studied non-invasively to find out what is going on at most points in the brain: the fMRI/MEG studies to locate where brain activity is going on, and the MagStim point method to confirm the function at the specific sites where activity is going on, by temporarily stopping it. The three comparisons would establish a high degree of reliability and confirm a working structure/function model of the brain to a degree of precision and refinement never seen before. The mind fairly boggles at this potential to utterly revise and deepen our understanding of how things work in our brains/minds.
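The arithmetic behind those overlapping sub-threshold fields can be sketched directly. Suppose each of three coils alone produces at most half the depolarization threshold, falling off with distance; only where all three focal regions overlap does the sum exceed threshold. This is a toy geometric model, with coil positions, falloff width, and peak strength chosen purely for illustration:

```python
import numpy as np

# Three hypothetical coil focal points (mm), converging on a target at the origin.
coils = np.array([[30.0, 0.0, 0.0],
                  [-15.0, 26.0, 0.0],
                  [-15.0, -26.0, 0.0]])

def stimulation(point, peak=0.5, width=50.0):
    """Summed stimulation at a point, in units of the firing threshold.
    Each coil alone peaks at half threshold and falls off with distance."""
    d = np.linalg.norm(coils - point, axis=1)
    return float(np.sum(peak * np.exp(-(d / width) ** 2)))

threshold = 1.0
for label, p in [("target (triple overlap)", np.zeros(3)),
                 ("15 mm off-target", np.array([15.0, 15.0, 0.0])),
                 ("directly under one coil", np.array([30.0, 0.0, 0.0]))]:
    s = stimulation(p)
    print(f"{label}: {s:.2f} x threshold -> "
          f"{'depolarizes' if s > threshold else 'sub-threshold'}")
```

Only the triple-overlap point fires; directly under any single coil, or a centimeter and a half off target, the sum stays safely sub-threshold, which is the whole point of the method.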
As a further benefit, it's known that pain stimuli are processed in the brain through a "pain matrix" model of about 10 known sites which mediate/modulate pain. Using the MagStim point system, what each site does can to some extent be figured out. And if one or two certain areas, when stimulated, block pain completely, the implications would be considerable for pain control, especially using placement of superficial scalp magnetic stimulation on affected areas of the sensory cortex or subcortical targets.

How this relates to creativity and its study is clear. By the COMP model, creativity is very much a form of recognition: that "Aha!" moment. Again, a bit of dopamine release, too. Using the simple pennies method of counting, we could map creativity as better and better means of counting are created by each subject, at the point where they "get" the new idea, either by description or when they suddenly realize/recognize a better method. In telling jokes, also, there would be a P300 when subjects get the joke, as compared to when they do not, showing again that the COMP underlies the telling of jokes and humor of most all types. This would confirm the COMP/dopamine-boost aspect of known humor rather convincingly. And using the fMRI/MEG plus MagStim points method would show where in the brain this humor is being mediated. The same for music, for locating the swearing center, the conscience in the frontal lobes, and so forth: exploring up and down the cortices of the brain, going over each gyrus in the whole brain with ever finer and finer investigations.

We see here, in the Praxis, the value of the Comparison Process, which can not only empirically and introspectively see inside and explain our minds/brains, but find ways to decode and understand them. This is the power of the Comparison Process model. It means that the tarab of Oum Kalthoum can be studied in those listening to her. It means that basic human thinking processes will be open to study as well, from math to music, from the sciences to the arts. Most all human brain activities of the cortex can then be more carefully studied.

Let us think about one more subject: exactly what kind of stimulus is necessary for Long Term Memories to be laid down. Clearly, we know that it's due to repeated reinforcements. The more we go over something, the more likely it is to be remembered, and the more easily recalled. This is called facilitation in neurophysiological terms. But consider what happens during highly charged emotional moments. This will very likely enlighten us as to the nature of what is going on. During those times, such as Archimedes' "Eureka!" moment, or those mountaintop experiences we have all had, there is a LOT more dopamine release and much longer reinforcement of the memories being made. How often do we have to purposefully reinforce those awe-filled moments to recall them? Not often. It gets done by the high dopamine release those events create. That's the secret, we see. Some dopamine release is seen at most all creative and recognition events, some more so than others. It's the dopamine boost, which begins the LTM event of protein synthesis and synapse creation, that creates the stability of the Long Term Memory traces which Wilder Penfield found in his noteworthy studies on the living brain in Montreal. Those stable memory traces are what create the platform upon which our consciousness and thinking are founded and stabilized. More in the article on "Brain Hardwiring".

Further, consider what happens when photic stimulation occurs. The brain will create an evoked potential which is visible on the EEG. Similar effects have been seen during music played with a heavy beat.
These will entrain the brain widely. This can have an almost hypnotic effect on people, alone or in groups. Thus the tarab of Cairo, the effects of rock groups, the crowds' responses to a charismatic speaker, the effect of the 1-2, 1-2 of the 4/4-time beats of the marching band we hear as it passes by. Compare the flocks of birds veering and flying together, the schooling of fishes, or the herd behaviors of animals running. How is this any different from our entrainment when people sing in choirs or play in a band or orchestra together, when they become as one? This is the social entrainment effect. If CERs were done on those in such collective groups, we would see the careful and close entrainment of visual and sensory evoked responses acting as one in those groups of players. The same with a good movie, where all are on the edges of their seats during brilliantly acted and produced scenes, or in the listeners to those inspiring choirs and orchestras. They would be entrained just as birds and fish fly together.

What of empirical introspection? The investigation of CERs can surely give insights into it and confirmations of it, especially in known tasks, such as indexing and reading indexes looking for a specific target word or word placement. So we see the Comparison Process at work, finding correspondences, relationships, associations and creativity, which bring us better understanding and the best performances of our species.

http://newsoffice.mit.edu/2014/expanding-our-view-of-vision-0126 (brain mapping using combined fMRI and MEG)

Target words: Recognition; Cognition; Comparison Process; Cortical Evoked Potentials; Functional MRI (fMRI); Magnetoencephalography (MEG); Magnetic Stimulation of Brain (MagStim); Humor; Long Term Memory; Dopamine

Sunday, May 11, 2014

Le Chanson Sans Fin: Table of Contents

Le Chanson Sans Fin: Table of Contents

1. The Comparison Process, Introduction, Pt. 1
http://jochesh00.wordpress.com/2014/02/14/le-chanson-sans-fin-the-comparison-process-introduction/?relatedposts_hit=1&relatedposts_origin=22&relatedposts_position=0
2. The Comparison Process, Introduction, Pt. 2
http://jochesh00.wordpress.com/2014/02/14/le-chanson-sans-fin-the-comparison-process-pt-2/?relatedposts_hit=1&relatedposts_origin=3&relatedposts_position=1
3. The Comparison Process, Introduction, Pt. 3
http://jochesh00.wordpress.com/2014/02/15/le-chanson-sans-fin-the-comparison-process-pt-3/?relatedposts_hit=1&relatedposts_origin=7&relatedposts_position=0
4. The Comparison Process, The Explananda 1
http://jochesh00.wordpress.com/2014/02/28/the-comparison-process-explananda-pt-1/
5. The Comparison Process, The Explananda 2
http://jochesh00.wordpress.com/2014/02/28/the-comparison-process-explananda-pt-2/
6. The Comparison Process, The Explananda 3
http://jochesh00.wordpress.com/2014/03/04/comparison-process-explananda-pt-3/?relatedposts_hit=1&relatedposts_origin=17&relatedposts_position=1
7. The Comparison Process, The Explananda 4
http://jochesh00.wordpress.com/2014/03/15/the-comparison-process-comp-explananda-4/?relatedposts_hit=1&relatedposts_origin=38&relatedposts_position=0
8. The Comparison Process, The Explananda 5: Cosmology
http://jochesh00.wordpress.com/2014/03/15/cosmology-and-the-comparison-process-comp-explananda-5/
9. AI and the Comparison Process
http://jochesh00.wordpress.com/2014/03/20/artificial-intelligence-ai-and-the-comparison-process-comp/
10. Optical and Sensory Illusions, Creativity and the Comparison Process (COMP)
http://jochesh00.wordpress.com/2014/03/06/opticalsensory-illusions-creativity-the-comp/
11. The Emotional Continuum: Exploring Emotions with the Comparison Process
http://jochesh00.wordpress.com/2014/04/02/the-emotional-continuum-exploring-emotions/
12. Depths within Depths: the Nested Great Mysteries
http://jochesh00.wordpress.com/2014/04/14/depths-within-depths-the-nested-great-mysteries/
13. Language/Math, Description/Measurement, Least Energy Principle and AI
http://jochesh00.wordpress.com/2014/04/09/languagemath-descriptionmeasurement-least-energy-principle-and-ai/
14. The Continua, Yin/Yang, Dualities; Creativity and Prediction
http://jochesh00.wordpress.com/2014/04/21/the-continua-yinyang-dualities-creativity-and-prediction/
15. Empirical Introspection and the Comparison Process
http://jochesh00.wordpress.com/2014/04/24/81/
16. The Spark of Life and the Soul of Wit
http://jochesh00.wordpress.com/2014/04/30/the-spark-of-life-and-the-soul-of-wit/
17. The Praxis: Use of Cortical Evoked Responses (CER), functional MRI (fMRI), Magnetoencephalography (MEG), and Magnetic Stimulation of brain (MagStim) to investigate recognition, creativity and the Comparison Process
http://jochesh00.wordpress.com/2014/05/16/the-praxis/
18. A Field Trip into the Mind
http://jochesh00.wordpress.com/2014/05/21/106/
19. Complex Systems, Boundary Events and Hierarchies
http://jochesh00.wordpress.com/2014/06/11/complex-systems-boundary-events-and-hierarchies/

Friday, May 9, 2014

Cosmology and the Comparison Process (COMP); Explananda 5


By Herb Wiggins, discoverer/creator of the Comparison Process/COMP Theory/Model, 14 Mar. 2014, USA.

In cosmology, let's lay the groundwork, and then show the breakthroughs which have come, and can come, from the COMP's creative capabilities. Humason and Hubble did a great deal of work in the early to mid 20th century observing many extragalactic objects via the Mt. Wilson telescope, then the largest and best in the world. They established that galaxies existed outside of our own, and Humason's arduous work in obtaining detailed images and spectra of near and distant galaxies showed, on the basis of the red shift of the galaxies, that the universe was expanding. This expansion of the universe, based upon the massive COMPARISON of these red-shifted galaxies, makes their work a well-established scientific fact, which cannot be rationally disputed by educated and carefully thinking people. Those galaxies which are more distant have a greater red shift, due to their increased velocity away from the earth as the fixed point of reference, i.e., in comparison (COMP) to the earth. The further away these galaxies are located, the further the emission lines in their spectra shift toward the red, longer-wavelength end of the spectrum. And indeed many have been found which are so distant, by COMP, that those emission lines have been moved nearly into the infrared, and probably some DO lie in the IR, too. But these are harder to look for, because not only will they be dimmer, but they will require a different kind of detector to find.
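For reference, the standard quantities behind this comparison are the redshift, which compares observed to emitted wavelength, and Hubble's law, which ties recession velocity to distance:

\[ z = \frac{\lambda_{\text{obs}} - \lambda_{\text{emit}}}{\lambda_{\text{emit}}}, \qquad v = H_0\, d \]

With the commonly quoted modern value of H_0, about 70 km/s per megaparsec, a galaxy 100 megaparsecs away recedes at roughly 7,000 km/s, and every one of its emission lines is stretched by the same fractional amount z, which is exactly what lets the line-by-line comparison be made.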

But the point is this, and it seems somehow to have been missed. When we look at the emission spectra of the elements on the earth, they are clearly unique to each element, that is, to each atom of a specific atomic number. This is true because the emission lines are most commonly created by the excitation of electrons to discrete, stable outer levels of the atom; when the electrons jump back to lower levels around the nucleus, in an arrangement unique to each element, they give off a highly specific and characteristic set of emission lines. These lines are the same for each element. They are the same for the elements in the sun and the planets. They are the same for the nearby stars, and for the more distant ones in our own Orion spiral arm of the galaxy, both outwards and inwards to the Orion group of nebulae some 1600 LY (light years) distant. In the next spiral arms over, the emission lines are the same, and exactly so. In the most distant parts of our galaxy that we can see using the COMP, about 80K LY distant, where the SagDEG (Sagittarius Dwarf Elliptical Galaxy) is now passing through/near the disc edge of our galaxy opposite ours, the stars have the same emission lines as they do on the earth, and so do ALL the stars/objects at the intervening distances. The stars of SagDEG, with its quite separate, discrete and different history and origin, also show the same emission lines as we see on the earth. The spectra of the Large and Small Magellanic Clouds, satellite galaxies at 120K LY and further out from our own Milky Way galaxy, ALSO have the same emission lines. That of the nearest large galaxy, the Andromeda, also has the same elemental emission lines. Exactly. The same emission lines are seen in a smaller spiral galaxy satellite to the Andromeda, the M33 galaxy.

And so on and on, as far out as we can look with our most advanced telescopes. Both on earth and in space, we see the same exact frequencies of the elemental emission lines in all of the other galaxies, closer or more distant, even those at incredible distances. This is the important point. A linear model of the universe's expansion (probably incorrect, as the universe is a complex system and so not linear) and those redshifts put the most distant visible galaxies (ignoring IR-shifted distant galaxies) at about 14 billion light years (B LY) away. They are also 14 B years back in time, too. And even at those distances, and at all the intervening distances, the emission lines are the same, by the Comparison Process. THAT is utterly extraordinary. We have visual proof that the elemental electron levels peculiar and particular to each element are the same all over the universe. We know that the electron levels are built upon the nuclear structure of the elements, and thus those are all the same, from here on earth to the most distant places and the most ancient times. Thus nuclear physics and chemistry and biochemistry here on earth and everywhere in our UNI-verse are all the same.

But there is more. Einstein postulated that a photon's path would be bent by passing thru a strong enough gravitational field, and his theory of relativity was confirmed in this when the light of a star passing near the sun was found to bend exactly as Einstein predicted, in 1919, and whenever rechecked (COMP) since. Further, when we look at distant galaxies, we can see their images bent where their photons pass very near to very massive objects, too. In addition, Einstein stated that if another large galaxy were interposed between a very, very distant galaxy and the earth, on our line of sight, we would see distorted images of that distant galaxy around the interposed one. And those images, the Einstein arcs and crosses of the light of such distant galaxies, have been found in enormous numbers, even at distances of nearly 10-12 B LY. And this means that the laws of gravity as we understand them so far are the same here on the earth and near our sun, and also as distantly as we can see, and in all the myriad places in between: both distant in the past and up to the present, i.e., distant in space and time compared to the earth.

As an aside:
There is yet one more interesting inference we need to look at. If these Einstein crosses have been seen clearly in the visible light spectrum, and if there is a continuing red shift, then in a very old universe such as ours there may be redshifts into the infrared frequencies. Correspondingly, there should be real Einstein crosses and arcs in the IR, similar to what we see in the visible spectrum, tho as the IR images would be far more distant, they would be correspondingly fainter and harder to see. We have the IRAS satellite. Are we looking for those? Imagine the change in cosmology and in our estimate of the age of the universe should a number of IR Einstein crosses and arcs be seen. And what of microwave Einstein arcs?

If we don't look, we won't find. The Low Surface Brightness galaxies of David Malin's astute observation, huge in size and numbers but very faint, whose existence had not even been expected, were found by just this same COMP-driven creativity. The existence of the very distant satellites of our sun, in highly eccentric orbits, was simply missed because no one would spend the time and money to look for weeks at a field of stars to see if those planetoids beyond Neptune could be found. But when astronomers looked, they found them by the hundreds, including bodies over 2,000 km in diameter, too.

Do the cosmologists and astronomers dare to look for IR and microwave Einstein arcs and crosses? And if not, why not? This would find more highly relevant data to compare to what we know, and confirm or disconfirm current estimates of the universe's age. These questions show, once again, the creativity of the COMP.

This shows the power of the COMP. It can take known, real proven facts and then extend them, showing once again the creativity of the COMP and where creativity comes from. The Comparison Process, everywhere, immanent, working.

This is confirmation of the enormous structural integrity, constancy and stability of our universe, to have lasted at least 14 billion years (BY) and to extend, at the greatest known distances, in all directions at least 14 B LY (probably longer and more distant, too). The fact that our universe appears to be accelerating its expansion indicates its non-linear nature, and if so, indicates our universe is far, far older than the linear model implies, upwards of 15-20 B years old. This has protean implications. We know that life as we know it took about 4.5 BY to develop on earth, even tho simple early life appeared within the first billion years or so of the earth's existence. Given the right conditions, at any time and any place in our universe where conditions like ours can exist, and given that these laws of nature will continue for billions of years into the future, life could have, and probably has, come about by evolution not once, but unlimited numbers of times inside our galaxy alone. We are not alone in the universe. This is proved beyond all reasonable doubt by the findings of Hubble and Humason, and by the observations of gravitational effects at similarly huge distances in space and time. Therefore it's wise to look for other life out there. And we have come 15 billion years, and VERY late, to the game.

In addition, not only can we live here on the earth with the proper conditions, but we can live ANYWHERE and ANYWHEN in our universe, as well, through all spaces and all times. The physics and the chemistry are the same, everywhere, every part of it. We do indeed live in one UNI-verse.

But there is more, much more, to this COMP of the distances and times of the emission spectra seen. How does the universe stay the same, in all times and all spaces? That is a weighty question. How does this sameness of physics, chemistry and gravity, everywhere in all spaces and all times, come about? Occam's Razor, a form of the least energy principle (a complex-system characteristic which rules our universe), states that the simplest explanation accounting for the salient facts is usually the correct one.

And this is most likely the simplest answer: the universe and all of its parts are the same because, at the most fundamental and deepest quantum level, they are all in touch with the other parts of the universe instantaneously. That means, in other words, that every part of our universe is in constant and immediate contact with every other part, at the very microscopic, small level where quantum events take place. It means the observed laws of chemistry, physics and gravity, including relativity effects and rules, all come from the deepest quantum effects and laws from which our universe arises.

Well, that's all very well and good, but do we have any other physical evidence for instantaneity in our universe? The answer is that we have several more solidly based lines of evidence. First of all, Einstein's thought experiment about a rider on a light beam. Time slows down more and more as matter is sped up faster and faster relative to a fixed reference; at light speed, everything around the photon seems to happen instantaneously, while no time at all passes for the photon. Einstein's creative COMP, you see, once again. When one hits light speed, all time stops for the photon. The travelling twin never ages at all. And this is instantaneity. There is no time for the photon: everything outside of it happens all at once.
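The formal statement behind this thought experiment is the time-dilation factor of special relativity: a clock moving at speed v relative to an observer runs slow by

\[ \Delta t' = \frac{\Delta t}{\sqrt{1 - v^2/c^2}} \]

As v approaches c, the denominator goes to zero and the dilation grows without bound, which is the mathematical counterpart of "no time passes for the photon."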

Another well-known example is that of the EPR paradox. Einstein, Podolsky and Rosen postulated that if one were to entangle a particle or photon with another, and then separate them, then measuring the spin of one of the pair would instantly fix the spin of the other. In 1964, Bell described a way to test this, and his Bell test has now confirmed that indeed, the spin of the opposite particle is instantly fixed. Not at light speed, not at more than light speed, but INSTANTANEOUSLY: current measurements put the lower limit on any speed of transfer of spin information between the two particles at 40,000 (!) times c, the speed of light, which is, within measurement limits, effectively instantaneous. That's convincing evidence of the universality of instantaneity, too. It's predicted to be so by quantum mechanics, and it's real and true.
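The Bell-test logic can be checked numerically from the textbook formulas. Quantum mechanics predicts, for the spin-singlet state, a correlation E(a, b) = -cos(a - b) between measurements along angles a and b, while any local hidden-variable account must keep the CHSH combination at or below 2. This little script verifies the quantum prediction at the standard optimal angles; it is a check of the formula, not a simulation of any particular experiment:

```python
import numpy as np

def E(a, b):
    """Quantum-mechanical correlation for the spin-singlet state."""
    return -np.cos(a - b)

# Standard CHSH measurement angles (radians).
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(f"CHSH S = {S:.3f} (local realism requires S <= 2)")
```

S comes out to 2 times the square root of 2, about 2.83, above the classical bound of 2, which is what the laboratory Bell tests confirm.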

Because it doesn't matter how FAR the two particles are separated, the opposite spin will be instantaneously created in the other one, even if it is on the other side of the universe, and in all spaces and times distant, too. And if this can join two entangled particles together, then it can join everything else, too. It's all tied together, you see, at the deepest quantum levels of our universe.
There is yet another evidence of this. We know that in a gravitational field, time slows down compared to a region of lower field strength. The higher the gravitational field strength, the slower time goes; the lower the field strength, the faster events go compared to a standard reference clock. Events on the earth proceed, even in earth's tiny gravity, slightly slower than do similar, highly synchronized atomic clocks orbiting the earth.

http://en.wikipedia.org/wiki/Gravitational_time_dilation
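The linked article gives the standard Schwarzschild form of this effect: for a clock at distance r from a mass M,

\[ t_0 = t_f \sqrt{1 - \frac{2GM}{r c^2}} \]

where t_0 is the time kept by the clock in the gravity well and t_f the time kept far away. The deeper the well (larger M, smaller r), the slower the local clock; GPS satellite clocks, for example, must be corrected by tens of microseconds per day for precisely such effects.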

Now consider this further. What happens if we continue to reduce gravitational fields, not only locally, but at great distances, say in the great voids left in space? In those areas, hundreds of millions to billions of light years across, we should see a substantial increase in the speeds of measurable events. Taking this further in a thought problem: if we should cut out all gravity altogether in some way, such that there was NO mass and thus no gravity in the universe, what happens?

We would have instantaneity. So we can see, by this careful reasoning from real events, that mass/gravity not only created time by slowing down an innate instantaneity but, in creating time, has created all space as well.
We see an expanding universe. This means that the density of mass/gravity fields is steadily decreasing. Mass and energy cannot be created or destroyed in this universe; mass can be converted to energy, but the total still exists. This means that in the very long run, what we are now seeing might be the same kind of illusion as comparing (COMP) the apparent flatness of a very large round globe and missing the actual curvature because the view is too local. A very real decline in mass/energy density, which misses local time speeding up, observable only over millions to billions of years, could be the same kind of illusion. We can't yet see this because we haven't observed long enough. Nor have we looked for it, either. Just like the trans-Neptune objects were not seen, because they weren't looked for.

If one puts a neutron into a nucleus, it becomes more stable. If it's ejected from a nucleus, it will decay in 15 minutes or so. So are we to assume that the nucleus changes the decay rate of the neutron? Yes: on this model, the locally ultra-high gravitational density of the nucleus slows down time to such an extent that the neutron's lifespan is greatly prolonged.
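For concreteness, free-neutron decay follows the usual exponential law, with a measured mean lifetime of roughly 880 seconds, i.e., about 15 minutes:

\[ N(t) = N_0\, e^{-t/\tau}, \qquad \tau \approx 880\ \text{s} \]

Any mechanism that stretched or compressed local time would show up directly as a shift in the measured \( \tau \), which is what the argument here turns on.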

Now consider, well into the universe's future, comparing the present epoch to one where gravity/mass density has decreased to 1/10 of that found now. The speed of neutron decay will increase both inside and outside the nucleus. The mass density decline will trigger instabilities in the neutron and thus in the isotopes of the elements. This will very likely affect the higher atomic numbers first. So there will be a shift from stability in the formerly non-radioactive atoms to their becoming radioactive and decaying. This is where radioactive decay comes from. It explains neatly and clearly why it exists. It's likely to be true by Occam's Razor. Steadily over time, with the expansion of the universe, the largest and heaviest elements will begin to decay faster and faster, until the periodic table of stable elements is substantially smaller than it is now.

Again, massive use of the COMP, its power and creativity, to explain and predict previously unknown and unsuspected phenomena in our complex-system universe. In the huge voids in our universe, we should see this happening: increasing instability of matter (again the complex-system characteristics of stability/instability, endlessly observed, real, and used for explanations by the COMP). Extending this logically over time, when the density of the universe has fallen through exponentiating expansion over tens of billions of years, we'll see matter coming apart at the high-atomic-number end of the elements, and eventually all matter will come apart, perhaps even the proton and electron. At those points, according to E = mc^2, tremendous amounts of energy will be released, involving masses beyond comprehension, as even dark matter, about 10 times more massive than the visible mass, converts to energy.

There may be another tremendous explosion at that point, comparable to the Big Bang in size. We have then the potentiality for the creation of another universe, do we not? A cyclical system of the creation and eventual destruction of one universe, leading to the creation of another. Almost Hindu.

These instantaneity findings postulate the existence of an underlying bosonic quantum level of which instantaneity is a part. Where there is matter in large enough amounts, space/time can be created, because matter slows down this instantaneity and creates a universe. From this underlying bosonic ylem we see, in our own material universe, the Casimir effect, where the gravitational/mass effects not only sustain mass/gravity and space and time, but spontaneously create virtual particles all the time, thus continuing the mass/gravitational fields. From a massive quantum fluctuation of this underlying instantaneity condition arose our universe, and to it it may eventually return, as mass/gravitational fields lose their strength from the expansion of the universe and matter falls apart.

It may be that the Casimir effect shows this constant, low-level creation of virtual particles, which is this interconnection of all with everything else. It must take place by some means, this constant interactivity with the underlying postulated bosonic ylem, and entanglement seems to be one way. There are probably other evidences, too.


Where do the electron and other particles/photons go when they are quantum tunneling (QT'ing)? Via the instantaneous bosonic ylem of the universe, very likely.

There are probably other examples of instantaneity, but it's likely that the entire universe is connected instantaneously with itself by this underlying bosonic, instantaneous quantum level: again, by COMP of the exactitude of the laws of physics over all the billions of light years and years, the fact that for the photon the whole universe happens all at once, and the results of the Bell test. When particles become entangled, they may partake of this instantaneity. When quantum tunneling occurs, they may briefly pass thru this bosonic ylem of the quantum level which generated/generates our universe.

And it might also explain the probable existence of faster-than-light (FTL) events at the quantum level. In 1955 Wigner, who later won the Nobel Prize in physics, argued that alpha particles emitted in the radioactive decay of an isotope occasionally moved FTL. Hawking radiation from the probable evaporation of black holes (and, by the COMP, of neutron stars to white-dwarf matter) is consistent with quantum mechanics. The too-early arrival of neutrinos, before the light from a supernova explosion reaches us, may be yet another example. Quantum mechanics does not prohibit FTL speeds. If true, and it seems likely, then the QT events seen in our enzymes which perform biochemical transformations may also be FTL QT, because that is the ultimate in least-energy/time production of such events: again, by picking up the instantaneity of travelling thru the underlying instantaneous bosonic ylem.

Bose-Einstein condensates may also show some of this instantaneity: a mass of matter cooled to very near absolute zero becomes a single quantum state, very nearly bosonic in nature, too, even potentially transparent to mass passing thru it, which is a bosonic trait, not that of a fermion.

Let's look at a simple process, to give some idea of how a recursive yet simple observation can give insights into the origin of our universe. What happens when a single high-energy gamma ray is converted to an electron/positron pair? These have opposite charges, and mass. The gamma ray has no charge and no rest mass, as it's a boson. The e-/e+ pair have mass and are fermions, not bosons. From a boson, mass has been created in the fields of our universe, as well as antimatter and charge. How this comes about is yet another mystery of the complex system of our universe. But it's altogether real and true.

Bosons giving rise to matter/antimatter pairs: the COMP shows that's what may have happened at the Big Bang, too. Some kind of quantum fluctuation occurred in the underlying bosonic ylem, and there was the Big Bang. An asymmetry in the process seems to have created slightly more matter than antimatter, which is why matter predominates in our universe.

But let's look further. When the gamma ray converts to the e-/e+ pair plus kinetic energy, its energy is roughly halved between the two particles. The e- goes on, but if the e+ collides with another e-, they will annihilate each other, and annihilation photons will be given off. The energy of the individual particles is declining; the energy is spreading out. Such a photon may in turn be scattered by an electron, which re-emits a lower-energy photon, again, and so on and on until just IR is scattered around.

COMP shows us this is exactly governed by the laws of thermodynamics. The energy/mass will diffuse outwards, and the energies of the pair, and so on, will steadily decline. This is entropy, the so-called heat-death-of-the-universe principle. That is the way things go in our universe: from higher energies to lower, from more mass to less, as hydrogen is converted to photons of energy in the stars, which then spread out, again. This is, in short, the arrow of time, and it's thermodynamic physics at the quantum level, on average, which creates this arrow of time. Processes of this sort do not reverse. It's created again and again, in lots of other ways. It's the second law of thermodynamics which creates the arrow of time, in our macroscopic universe as well as at the quantum level, tho in the latter it's a probability, which, when summed over some 10^23 particles, shows us the laws of thermodynamics. Time moves in the direction it does by the laws of thermodynamics. The arrow of time IS the 2nd law of thermodynamics, and the two are equivalent by the COMP.
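The statistical statement underneath this is Boltzmann's: entropy counts the number of microstates \( \Omega \) compatible with a macrostate,

\[ S = k_B \ln \Omega, \qquad \Delta S \ge 0 \ \text{(on average)} \]

and because there are overwhelmingly more high-entropy arrangements than low ones, for a system of some 10^23 particles the "on average" becomes certainty for all practical purposes.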

The expansion of the universe is driven by the 2nd law. Energy tends to diffuse. Matter tends to spread out, using energy. The mass density of the universe tends to decline over the eons. This is no accident, but lies at the core of the bosonic ylem which underlies our universe. Therefore, by the Comparison Process, if the universe is expanding faster and faster, tho the speed-up in expansion is slight, we should see, in this same way, the gamma ray converting into an e-/e+ pair faster and faster, too, over the long haul, just as we should see the neutron and the isotopes decaying faster.

Why is the speed of light 300,000 km/sec? Current physics gives us no answer. But now we know that matter exists between no time and instantaneity, which are the same thing, the dual nature of the same event. In the early universe, with very high gravity, light was a lot slower. As the mass density of the universe decreased to that of our epoch, c became what it is. On the way to instantaneity, light velocity may steadily speed up. We have not yet checked the speed of light off our earth to any great or accurate degree. In a much lower gravity, light should move faster than it does on the earth, too. We can't do that experiment very well yet, because we are not living off the earth or very far from the sun. But when we do....... (grin)


Language/Math, Description/Measurement, Least Energy Principle and AI

By Herb Wiggins, Discoverer/Creator of the Comparison Process/COMP Theory/Model; 14 Mar. 2014

“Man is the measure of all things.” –Protagoras 5th C. BC

Table of contents
1. Inability of mathematics to describe language; inability to describe biological taxonomies and medical language and processes.
2. The universe is NOT mathematical, but partly describable with math.
3. Flexibility of language in descriptions markedly superior to math; useful biological/medical examples
3a. The comparative forms of adjectives as incontrovertible PROOF of the presence of the COMP in all language/descriptions.
4. Measuring is ALL a Comparison Process (COMP): distance, weights, time, etc.
5. Descriptions mostly cannot be measured; they lack the numericity used in the sciences.
6. Visual tracking as a predictive COMP; Butterfly chaotic flight and tracking; missile control by math/geometry versus avian tracking systems; human tracking while driving is much the same.
7. Predicting the future and the Least Energy Principle (LEP); value of the rule of 72;
collapse of the USSR and the LEP;
8. Stock market collapse of 2000 and predictions/prophecies.
9. Understanding the structure/function relationship of the comparison process in the cortex of brain; why it’s very hard to understand complex systems esp. of the cortex;
10. Can mathematics, if it cannot describe language much at all, describe human cortical cell functions which arise from the cortex?
11. Can present day math learn how to speak language, or write creatively?
12. A COMP possible solution to the problem of re-creating by machines, human cortical creativity; increasing speed of human creativity by computer modeling.
13. How do programmers create new programs, new operation processes, etc.?
A new form of relational mathematics is needed. Math needs to grow a new form, more descriptive, as languages are.
14. The COMP which creates language is more important than mere grammar.
15. The use of empirical introspection to analyze and model programmer creativity processes, as it has that of scientific creativity. Creating creativity on computers by studying how programmers do their work.
16. Empirical introspective study of programmers' skills and how their cortex's output creates new programming. Successfully understanding programmers' creativity can lead to a creative computer and substantially speed up programming progress. Creating creativity by computers will then be directly applicable to understanding language, emotions, and so forth, and to creating true AI.

1. The real problem has been for years that language and mathematics are not consonant. We can say everything in language, even complex mathematics, yet we can write a great deal in language that we are NOT able to translate into math. For instance, the entire taxonomy of living and extinct species of all life, all the kingdoms and phyla, cannot be translated into mathematics. A bit of the description can, but very little of it. Images of the living species cannot, either. This is a real problem. The math does not exist which can describe a living species, except in trivial measurements.

In the same way, the entire compendium of medicine, the texts of each specialty, the physical exam, physical findings, differential diagnosis, the complex system of steps of testing toward a reasonably secure diagnosis, and the treatment protocols cannot be mathematized. We cannot describe the intricacies of psychology and psychiatry, let alone the anatomy, physiology and structure/function relationships derived from neurology, in math either. It's impossible with math as it exists at present.

In the same way we cannot translate a dictionary into mathematics, nor a novel, nor a play, nor a movie. Yet we can say and speak about all of mathematics. Teachers do this every day all over the world. The descriptions using words can describe math, but math cannot describe very much which is verbal.

2. I recall hearing many years ago at university the claim that the universe was mathematics. I just looked at the speaker and asked: then mathematize anatomy, the differential diagnosis, and the entire DSM-III!! He got very quiet and muttered something rude, and also logically irrelevant to the obvious. The universe is not any more mathematical than it's English, French, or Latin, and those languages, especially in the biological world, describe it far, far better than math ever can. In the arts and religions of the world, we can defy anyone to translate the Bible, New and Old Testaments, into math, or for that matter the Koran, or the Bhagavad Gita, or the Buddhist texts. It can't be done. Or to translate an entire movie into mathematical terms, or an opera or symphony? Impossible!! Clearly.

There is an extreme limit to the ability of math to take on physical descriptions, especially images. A picture is worth 1K words; an image rendered in math would take hugely more, perhaps tens of thousands more!! And neither could the math identify what the objects were, either, especially not famous places.

3. Verbal descriptions, on the other hand, are very, very flexible and useful, as anyone in the biological fields, including medicine, knows from working every day. Let's describe a beetle, for example. We can tell about size, altho we can use measurement to describe it in more precise terms. But we use colors, and patterns of colors, for the overall description. There are 2 antennae and 6 legs, the latter often swept back in the Scarabaeidae family. There is a hard, protective, chitinous covering over the wings called the elytra. There is the head, the thorax and the abdomen. Each of these in many beetle families has its own shape, as in the Coccinellidae, the ladybug family, where all conform to the rounded shape, tho the 3 major body divisions are still there. We describe these often with a drawing or image, so when we see them we can recognize them. The entire taxonomy of all beetles, and indeed of all known species, has been described using words. Measurement is useful, but incidental to it. These descriptions are in fact sorts of measurements, tho they are qualitative, not quantitative; yet they are highly useful in the description of almost all living forms.

3a. The most convincing demonstration of the ubiquity of the Comparison Process, and that it is at the core of language and its descriptions, is the comparative forms of adjectives. Endless and unlimited, just like the COMP. Here is the proof. Good and bad; good, better, best: the trinary forms of the dualities, the comparative adjectives. Nice, nicer, nicest. Lowest, lower, low; high, higher, highest. Here is a continuum built of two continua!! Very much so. Two together. Comparing, combining, ever additive, endless. Very nice; somewhat nice; very, very nice. Endless comparatives. Take each letter of the alphabet and start listing the easiest to think of. Above, almost, below; before, a bit before, just before; after, nearly after, just after. Cool, cooler, coolest; close, closer, closest. Dull, duller, dullest; very dull, most dull. Happy, happier, happiest; very happy, much happiness, more happiness, most happiness. Etc., etc., etc., right to the end of the alphabet, and in any modern language you will find the same. Universal, real, existing, solidly evidenced AND confirmed by unlimited examples, which anyone can create, any time.
Again, the COMP, endless, unlimited, undeniable, incontrovertible, essential, ever present, at the very core of description and language. The Comparison Process creates language and is the engine of language creation and usage.

4. In measuring, we use the Comparison Process overtly and completely. If we are to measure distance, we compare it to a known standard, be it a ruler, a tape measure, or, in surveying, the theodolite, which exploits the known phenomenon that the apparent size of an object shrinks in proportion to its distance. Angles can be very precisely measured and then compared to known standards to establish quickly and easily the sizes and dimensions of large areas of land, without dragging around the long ropes, chains and other formerly used methods. Each of these cases shows that measuring COMPARES the thing measured to a fixed, graduated standard to arrive at a unit measure.

When we measure weights, they are measured most accurately using a balance scale, which weighs against (compares to) a known standard weight. When we step on a scale, it compares our weight to what has already been standardized. All weights are measured using comparison.
When we measure time, we do so by comparing to UTC, kept at Greenwich, UK, where the time is known and broadcast around the world by radio so the actual time can be known in each of the time zones. For more precise measurement, we compare the second to the vibrations of a quartz crystal in a watch, counting the vibrations per second to create an accurate timepiece. For more precision still, we use the microwave radiation given off in a specific atomic transition between two energy levels, which occurs at a very precise rate; the earliest effective atomic clocks were accurate to 1 part in 10 billion. So time is measured by comparison to the transition between two levels in a suitable atom, usually cesium. Time, again, is measured by this comparison.
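Concretely, the modern standard makes this comparison exact by definition: one SI second is exactly 9,192,631,770 cycles of the cesium-133 hyperfine transition, so timekeeping reduces to counting cycles against that standard. A trivial sketch:

```python
CS133_HZ = 9_192_631_770  # cycles of the Cs-133 hyperfine transition per SI second

def elapsed_seconds(cycle_count: int) -> float:
    """Convert a counted number of cesium transition cycles to seconds."""
    return cycle_count / CS133_HZ

# One full day's worth of cycles compares back to exactly 86400 seconds:
print(elapsed_seconds(CS133_HZ * 86_400))   # 86400.0
```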

When we synchronize, we use the comparison process twice. First, the clock standard was historically set to noon, when the sun is at its highest point; noon was used because the day was always measured from noon to noon. For obvious reasons, such as overcast days, this method had to be modified, but we still standardize to noon, even today, worldwide, where the center of each time zone is offset 1 hour for every 15 degrees of longitude east or west of Greenwich. Then we look at the standard clock, usually with a second hand or digital readout, wait for the second hand to reach 12, for instance, and set our watch to compare precisely to the standard clock time. Comparison all the way through.
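The zone arithmetic itself is mechanical: one hour per 15 degrees of longitude, east positive. A sketch (the rounding convention is an assumption, and real civil zones also follow political boundaries):

```python
def nominal_zone_offset(longitude_deg: float) -> int:
    """Nominal time-zone offset in hours from Greenwich: one hour per 15 degrees."""
    return round(longitude_deg / 15)

print(nominal_zone_offset(-74.0))   # near New York's longitude -> -5
print(nominal_zone_offset(139.7))   # near Tokyo's longitude    ->  9
```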

When we read time, we do so by comparing events to a standard timekeeper. When we measure speeds, we divide distance by time, two comparison processes combined to give speed in meters/second, or whatever units are in use. In every case we compare the event being measured against fixed standards. Measuring is clearly, plainly, a pure comparison process.

5. Now look: how is description any different from measuring, except that it lacks the valuable numbering system? It's no different in fact. When we state something, we are actually using a qualitative description to gauge a quantity without numbers. Whether it's a color which corresponds (Comparison Process) to known frequencies, or brightness measured with a photometer counting incident photons, it's all the same thing.

Descriptions can be compared to some measurements. But some descriptions cannot BE measured: for instance, when we call something a leg, or tail, or head, or wing. Those do not carry numericity, which is so valuable in measuring events in the sciences. And indeed, it's very hard to introduce numericity into ordinary language. The great breakthrough which created the sciences was that ingenious, creative (COMP) methods were found to measure where we could not measure before. It's number use, numericity, which has made science so successful, because it permits more precise description than is possible with verbal descriptions alone. It also creates more predictive capability.

6. Consider the tracking of an object. Our visual systems are especially good at this and devote a good deal of the nervous system to yoking each eye precisely in line with the other, so we don't get double vision. This double input allows us to estimate distance using a parallax method (depth perception). It also allows us to determine which direction a flying or moving object is going, so we can estimate where it will be in a few seconds (bird flight), minutes (cloud movement, wind speed), or hours (movements of the sun, moon, or fixed stars through the sky; calendars). Our visual systems can then predict where things are going.

Now this is an interesting thing when we think about butterflies, because they fly in such an irregular, almost chaotic way. Most people have seen this but never figured out WHY the butterfly does it. It's very easy. Birds also have visual tracking systems, and they can predict which way an object is going to go. So they can intercept an insect flying in a more or less straight line. Yum! Butterflies are much larger and cannot easily outrun a bird, especially with their colored wings. But if they fly chaotically, how can the bird's brain track them? There is no regularity for the bird to recognize and use to target the insect. So butterflies escape very easily and cannot be readily caught. It's a survival mechanism built against the bird's tracking system, which has a very hard time following the chaotic butterfly's irregular flight.

How this compares to missile interception and fire control is much the same. Essentially it can all be understood as a series of comparison processes. First, there must be detection, usually by radar. The targeting mechanism figures out where the target is moving in space by comparing successive radar returns, and how far away it is by timing how long each pulse takes to bounce off the target and return. If the target is moving left or right, the directional system works this out by comparing time and location within an internal frame set up for the purpose, usually a mathematical program built on geometry, that is, a comparison to a 3D coordinate system. Then it compares the differences between a series of carefully timed pulses to determine the speed, and when those are done, it has a "lock". The system gives a series of beeps or blinking lights, and the operator fires the missile. The missile homes in on the target using constant radar updates to fix its position, and if the target's evasive maneuvers are not fast enough or large enough, it is hit and damaged or destroyed.
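
A hedged sketch of the comparison at the heart of tracking, with toy numbers of my own and a constant-velocity assumption (real fire control adds Doppler measurement, filtering, and much else): two timed position fixes are compared to yield a velocity, and the velocity is extrapolated to predict where the target will be.

```python
def predict_position(fix1, fix2, dt_between_fixes, lead_time):
    """Compare two radar fixes to get velocity, then extrapolate ahead."""
    vx = (fix2[0] - fix1[0]) / dt_between_fixes
    vy = (fix2[1] - fix1[1]) / dt_between_fixes
    return (fix2[0] + vx * lead_time, fix2[1] + vy * lead_time)

# Target seen at (0, 0) km, then at (1.0, 0.5) km two seconds later:
print(predict_position((0.0, 0.0), (1.0, 0.5), 2.0, 10.0))  # -> (6.0, 3.0) km ahead
```

A chaotically flying butterfly defeats exactly this step: with no stable velocity to extract from successive fixes, the extrapolation is worthless.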

The bird does this, but we don't know how. Clearly it must have some kind of internal representation, a neural system which can model the changes over a few dozen milliseconds, arrive at an approximation of where the insect will be a few seconds later, and dive toward it, comparing each position to the previous ones to keep updating the approximate spot the bug will occupy, coupling this to the wing beats for diving and the speed of approach, and then grab the bug in its beak within some range of movement depending on how long its neck is and how fast its beak can shut. We've all seen them do it, and their capacity for intercepting flying insects is remarkable. We cannot yet duplicate this system, for bird and bug are both constantly changing position and direction; and though the bird might not always capture the bug, it often does. Yum! Comparison processing throughout. And the bird does NOT use mathematics to do it, but neural networks, whatever those are.

Once we understand that this sort of thing, i.e., the comparison process, is going on to measure, move toward, and intercept, then we can more easily figure out how to duplicate the process in some way. In driving cars we do the same thing. We know that if we speed up too much we will reach the light before it changes to green, so we learn an internal algorithm relating how fast we are going to how long it will take to reach the light after it changes. So we do the same thing as a bird. I knew a student who was so good at this he could pluck a buzzing fly out of the air. His Comparison Processes were working very quickly indeed. No math involved, just the superb tracking and predicting system his brain had, using the Comparison Process.

The Comparison Process can not only predict speeds and directions; it can also predict to some extent what people are going to say or do in set circumstances. It can sight down time lines, extrapolate from current data, and predict that some event is going to occur. This is, in fact, a kind of prophecy. For instance, many of us knew that the USSR's communism/state socialism was doomed. In Nixon's memoirs he recounted a meeting with the Dalai Lama, who stated that the USSR was not acting according to the rules governing human greed and incentive, and so must eventually fail for that reason alone.

7. The real reason the USSR failed was the Least Energy Principle. This is purely a comparison process method. It measures outcomes, compares them, and finds the one which uses the least cost, time, and distance in accomplishing a set goal. It's the basis of efficient production, work, and all known tasks. The entire universe uses it: a photon's path is the most direct, least-energy path, even through a gravitational field. The orbits of the planets are least energy. The paths cows take back from the fields to the milking barn in the afternoon are least energy. So are the conformations of clusters of soap bubbles. From the trivial to the mighty galactic clusters, all is least energy. The windings and bends of river courses are likewise least energy for flowing water.

This principle is seen everywhere and is a basic tool anyone using the Comparison Process must know about and use. If a method uses less time, fewer resources, and less travel to get a set goal done, for instance mining coal and getting it to customers, or generating electricity with the least waste in production and transmission, that advantage compounds. By the Rule of 72, which estimates doubling time as 72 divided by the percentage rate of growth, if one factory's manufacturing is 10% more efficient than its competitor's, given similar market conditions (yet another comparison process), then in about 7 years its advantage will have doubled. In 14 years it is 4-fold, and within about 20 years that factory will dominate, if not own, the market.
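
A quick sketch of that Rule of 72 arithmetic, with the approximation compared (fittingly) against the exact doubling time from compound growth:

```python
import math

def doubling_time_rule72(rate_percent: float) -> float:
    """The rule of thumb: 72 divided by the growth rate in percent."""
    return 72.0 / rate_percent

def doubling_time_exact(rate_percent: float) -> float:
    """Exact doubling time under compound growth at the given rate."""
    return math.log(2) / math.log(1 + rate_percent / 100.0)

for rate in (5, 10):
    print(rate, round(doubling_time_rule72(rate), 1), round(doubling_time_exact(rate), 1))
# A 10% edge doubles in ~7.2 years by the rule, ~7.3 exactly:
# doubled in ~7 years, quadrupled in ~14, as above.
```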

This was largely what went on in the USSR. There was terrific waste in food production at all levels: planting, seed quality (which determined the percentage of sprouting), cultivation, and a lack of harvesting machines and tractors. And they could not get the food to market. Bad roads, bad trucks, inefficient storage, and labor problems meant they had to fight the "Battle of the Harvest" every year, when even students and factory workers had to turn out to get the food harvested, stored, and shipped. This drained education and production and affected the entire USSR during harvest times. While US farmers were only 2% of the population, Soviet farmers were 30-40% of the population, to grow about the same amount of food. Comparatively telling, as it's an outcome statistic.

This problem occurred all over the USSR, in all areas. It got so they could not drill oil wells much deeper than about 10,000 feet, taking several days, where US firms would drill to 25,000 feet in only a day, giving the US a huge comparative advantage in drilling efficiency: more wells, and deeper, too. As a result, Soviet oil production began peaking out around May 1984. The Soviets knew this would happen and built many large, cheap, simple, graphite-moderated nuclear power reactors, the RBMK-1000 models, in groups of 2 to 4. This was Chernobyl, and Sosnovy Bor as well, among others. Everyone knows what happened there, especially when an estimated 30% of Soviet workers were drunk most days, a problem which continues to the present.

Upon this basis of inefficiency, it was predicted that the USSR would collapse if we held strong. Reagan increased pressure using direct embargoes of computers and other strategic goods and then forced the Soviets into a massively costly arms buildup they could not afford. In 1991 the USSR collapsed under its inefficiencies, many of which have not been reduced even today. This was no surprise to most of us.

8. Before the stock market crash of 2000, the "Economist" of London ran two front-page cover articles about the USA's stock market bubble, where prices were WAY in excess of reasonable, some with price/earnings ratios of 50-60 to one, and some effectively infinite because there were no earnings to divide by. I can still recall those two cover images. Further, sitting at a dinner meeting with some associates in March 2000, I told them our stock markets would collapse and to be ready to get out quickly to cut losses. There were two responses. One wife said, "The stock market can't collapse. All of our pension money is in stocks." I looked at her and said, "How can I be overdrawn? I've still got checks!!" And another rather overconfident person said, "No one can predict the future at all." "The London Economist believes they can, and that's good enough for me," I said.

That April the collapse began, the NASDAQ falling from over 5,000 to near 1,100 and the Dow from over 11,000 toward 7,300 over the following two years. Many suffered serious losses. None of those persons EVER acknowledged to me what had happened to the market. Another acquaintance of mine made $50K on the fall in prices. Further, "A prophet is without honor even in his own land." The gift of prophecy of Cassandra, daughter of Priam of Troy, was well recognized, but the gods had cursed her: no one would believe her. This is the hidden power of the ancient Greek myths. Do you see how all of these things fit together, creatively, using the Comparison Process?

Future predictions ARE possible using the Comparison Process. It's the gift of prophecy. If you know enough, and can wrap your concepts creatively around events tightly enough to discern their velocity and direction, you know where the thing will land, and land hard. LBJ had this gift. So did that seer Winston Churchill, as reported in C.P. Snow's "Variety of Men". This is another gift of the Comparison Process.

Understanding this, recall that Bayesian mathematical methods can create predictive values and are widely used in machine recognition programs for voice or images. In this way, these programs are doing a simple comparison process: how we recognize voices and persons, the same way many other animals do, too. Recognition is a very important part of the COMP, as has been shown repeatedly above.
Recognition of words, landmarks, the creation of maps, and so forth. The Comparison Process is Bayesian-plus, and it resides in all human cortices, performing recognition, creating creativity, creating and understanding language and math, and many, many other tasks, constantly while we are awake, and often in dreams, too. But I have digressed in order to make more important points about the COMP.
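
A minimal, hedged sketch of recognition-as-comparison (a toy, not a real Bayesian recognizer, and all names and patterns here are mine): an input is compared against every stored Long Term Memory template, and the best match above a threshold counts as "recognized". Real Bayesian systems weigh likelihoods against priors; this keeps only the bare comparison step.

```python
def similarity(a, b):
    """Fraction of positions where two equal-length patterns agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def recognize(pattern, memory, threshold=0.75):
    """Compare the input to every stored template; best match wins if good enough."""
    best_name, best_score = max(
        ((name, similarity(pattern, stored)) for name, stored in memory.items()),
        key=lambda pair: pair[1],
    )
    return best_name if best_score >= threshold else None

LTM = {"my_name": "HERB", "a_tune": "CDEC"}
print(recognize("HERB", LTM))  # 'my_name' -- a match, the P300 moment
print(recognize("XQZW", LTM))  # None -- noise, nothing recognized
```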

9. A further problem is understanding the major functions of the cortical cell columns of the human brain devoted to the Comparison Process. The next question is how the neurophysiology of the 6 layers of the cell columns creates the processes which result in the COMP. That question is insoluble at present. Using the structure/function relationship and an analogy with E = mc², it can be understood better. Einstein wrote his famous equation relating matter and energy in 1905. Nuclear fission did not come along until the 1940s, and fusion about 6-7 years later. Now at last, at Cadarache, France, the International Thermonuclear Experimental Reactor (ITER) is planned to come on line well over breakeven in the years ahead. That is roughly 100 years of lag time before the Structure side of the S/F relationship caught up with the Function side.

Now, the structure of the cortical cell columns creates the COMP. We know what the function is. But what is the structure which does it, neurophysiologically? We don't know, and there is an impenetrable block here, too: the N-body problem. For N of 3 or greater there is no general closed-form solution; the equations grow so complex, even chaotic, that they can only be approximated numerically, step by step. Consider that the number of interacting neurons PLUS neurochemicals is in the tens of thousands, at least, in the cortical columns. The number of genes interacting to create the human body is around 25,000, interacting with probably more than 25,000 MORE chemicals and biochemicals. When we cannot solve the general case for N of 3 or more, how can that be done, when each of those neurons may be interacting via synapses with thousands of others?
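
A hedged sketch of why N of 3 or more resists a closed form: three gravitating bodies (crude Euler integrator, toy units and starting values of my own) can only be stepped forward numerically, never written down as a general formula. And this is three bodies, not tens of thousands of neurons.

```python
G = 1.0  # gravitational constant in toy units

def euler_step(bodies, dt):
    """Advance each body one small time step under mutual gravity."""
    for i, b in enumerate(bodies):
        ax = ay = 0.0
        for j, other in enumerate(bodies):
            if i == j:
                continue
            dx, dy = other["x"] - b["x"], other["y"] - b["y"]
            r3 = (dx * dx + dy * dy) ** 1.5
            ax += G * other["m"] * dx / r3
            ay += G * other["m"] * dy / r3
        b["vx"] += ax * dt
        b["vy"] += ay * dt
    for b in bodies:
        b["x"] += b["vx"] * dt
        b["y"] += b["vy"] * dt

bodies = [
    {"m": 1.0, "x": 0.0, "y": 0.0, "vx": 0.0, "vy": 0.0},
    {"m": 1.0, "x": 1.0, "y": 0.0, "vx": 0.0, "vy": 0.8},
    {"m": 1.0, "x": 0.0, "y": 1.0, "vx": -0.8, "vy": 0.0},
]
for _ in range(1000):
    euler_step(bodies, 0.001)
print([(round(b["x"], 3), round(b["y"], 3)) for b in bodies])
```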

So it will take a while to figure that out. The difficult we do today, the impossible takes a bit longer, to paraphrase the wag. It took us thousands of years to figure out what the cortex did, in a basic, fundamental way and will take us a lot longer to work out the unlimited workings of the Comparison Process in the cortex.

But the point is, we have mathematical Bayesian predictive methods which can create basic machine recognition of voice, fingerprints, and even some simple images. These statistical methods don't give us language, but we now have the functional origin of language, which has not heretofore been known. It's the COMP, clearly, that repeating simple process which is the right side, the Function side, of the S/F equation. It will take a while before we can generate all the details of how the COMP creates a real, existing language, although we have the E = mc² of it, the COMP. The same goes for the personality disorders, let alone the emotional system, though some headway has been made recently. Now we need to solve the N-body problem for the neurophysiology/genetics/embryology of the brain's cortical cell columns.

10. Let us treat description in the same way. Math cannot do much toward translating any but the simplest language into numbers and equations. Describe the colors and sky of a beautiful Western sunset. Language can give us some idea. With certain known landmarks it can even tell us where that sunset took place, at Point Loma in San Diego, or watching the alpenglow in the Colorado Rockies around the Fraser River. Math can create a digital summary of the image, as we use JPEGs all the time. But it cannot give a meaningful description of what is being seen using the numbers in the JPEG. That is, qualitatively and quantitatively, a wholly different task.

Consider this, and it's the critical point. Can mathematics give us a way to create language? No. Can it at present give us a way to re-create creativity, modeling in some way how Einstein, Darwin, and Wallace used the Comparison Process to create relativity and evolutionary theory? No. Until it can, true, complete AI cannot come about. Until math can create relational methods, comparison processes which re-create the complexities of language, which the Comparison Process handles every day in our cortices, math will not be able to handle meanings and much else. This limit to mathematics must limit its use in creating AI which can realistically model human cortical functions.

Now consider measurement. We use rulers, tapes, etc. to measure lengths and distances, clearly comparing the graduations on those tools to arrive at the values we get. When we measure colors of light, we can analyze the brightness with photometers, and the saturation, amplitudes, and frequencies of the colors. But that would take a very long time to do. Our visual cortices do it all in a few hundred milliseconds of work. When we compare images in our minds while thinking about events in existence, such as Darwin's finches, can mathematics recognize what is going on? Would it have the judgment and sense to realize what this means, as did Darwin, and then Wallace in his own way in Indonesia?

11. For the same reason, mathematics, even of the Bayesian kind, cannot figure out language. It can detect targets, compute their trajectories, and hit those targets, but it's a long way from that simpler task to understanding language well enough to speak it, except in a stereotyped, pre-programmed, limited way. For that to happen there must very likely be a number of important breakthroughs in pure and applied mathematics.

12. Let’s consider an easier course of action. The Comparison process in our brains is a massive, simple process which creates creativity. It does this by means unknown to us. We CAN however describe what is happening by looking at the process using the COMP. We know there is stream of consciousness going on, where a lot of processors are doing the work, by associations and finding the relationships among events/words by the COMP. So if we can speed up this creativity, then we will be closer to creating a system which can model the Comparison Processes going on in the cortex.

How do we do this? The COMP is a self-organizing, ordering system. It's a big problem-solving process operating in our cortex. It can look at a lot of comparisons, possibly using parallel processing, and come up with answers by this same creativity. The COMP can create a lot of ideas, and as one scientist said, to be good you have to have a lot of ideas; ideas are a dime a dozen. This implies some natural filtering is going on. The Least Energy Principle is a major filter. The structure/function relationship is another. When progress is made in understanding language better, that can be translated into creativity, that is, finding out what words/ideas/events have in common and finding bridging concepts, too. These creative recombinings of words/ideas/images are the clue we need. When we create sentences which clearly describe what we have never seen before, that is creativity, purely. The COMP does this all the time. It's uniquely creative.
What kinds of methods are used in programming to create recognition? What are the structure/function relationships of those many programs, compared to each other? This should reveal the common basis by which they all create, roughly, the same process which can ID fingerprints, faces, or voice commands.

The vocoder can convert the amplitude/frequency pattern of a voice into an electrical signal, which can then be converted to a microwave signal, received by antennae, reverse-translated into electrical signals again, shipped by fiber optics to the tower nearest the receiving cell phone, which then receives the microwave signal and turns it back into sound by electromechanical vibrations, re-creating the voice of the sender.

But this tells us nothing useful about language, because the device does not understand the meanings and use of language at all. It does a great job of transmitting the signal with good clarity, translating microwave signals to sound and vice versa, but it's empty of meaning.

What is the difference between this and the voice recognition systems now in use? Such a computer can transcribe voice commands into written words, then act upon those words, running the search which returns answers to the person's mobile phone, freeing his hands for other things. However, the meaning is still not there.

13. What would it take to create that meaning, that understanding of language? That's why true AI has not yet been found. Math doesn't yet capture the complex relationships among words: why we must put words in a particular order to create meaning. The words must relate to meaning; there must be relationships among those words to create meaning, and the computer MUST understand/know what those are. When THAT can be done, and it can be done using analyses based on the COMP, then true AI using language can be created. That is the missing step, the step which the Comparison Process can give us. It's the relationships among the words that give us the meaning of a word like "can". It's NOT grammar by itself. The word string must make sense; it must have internal consistency among the meanings of the words.

14. Take the sentence, "The beans were in the can." This makes sense. "The can was in the beans" doesn't make the same kind of sense. So it's the sequence which determines meaning, too. This is the aspect of grammar which is necessary, but NOT sufficient, to create language, as has been shown before using the COMP. Knowing whether a word is a noun or an adverb is not the point. It's the complex relationship among the words which creates verbal logic as well. And this is why AI has so far failed both to create meaningful sentences and to understand/interpret them: the COMP, which DOES give meaning by comparing words to each other, was not invoked. Context gives meaning. Until that contextual sense can be performed by computers, by careful analysis modeling the COMP, AI will not easily speak good, sensible English. AI cannot understand the word "can".
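
A hedged sketch of context-sense as comparison, using the beans example (the tiny pair memory and function names are mine; real language models are vastly richer): sentences are scored by comparing each adjacent word pair against pairs encountered before, so the very same words in a different order score worse.

```python
SEEN_PAIRS = {
    ("the", "beans"), ("beans", "were"), ("were", "in"),
    ("in", "the"), ("the", "can"),
}

def sense_score(sentence: str) -> float:
    """Fraction of adjacent word pairs that match remembered pairs."""
    words = sentence.lower().split()
    pairs = list(zip(words, words[1:]))
    return sum(p in SEEN_PAIRS for p in pairs) / len(pairs)

print(sense_score("The beans were in the can"))  # 1.0 -- every pair recognized
print(sense_score("The can was in the beans"))   # 0.6 -- same words, order breaks the match
```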

15. Let us analyze, using the COMP, how the programmer thinks creatively. When that is done, using the COMP's empirical introspection, we will see how new programming methods are created. Programming is NOT science. It's an art, like composing music or creative writing. It's creative. There is no mathematical description of it possible at this time, because we do NOT have a mathematics of the verbal relationships which create meaning. However, we CAN study a number of creative programmers and learn what steps they take, using the comparison processes which do that kind of creative work; and it will be a form of the Comparison Process. We can compare their work to each other and learn from those comparisons. Once that is known, computers can be programmed to create new, creative programming methods by trial and error, the usual way creativity proceeds. And once that can be done, progress in the field will expand exponentially, because each newly created method can be compared to the others, creating more and more new programming methods to solve the problem of AI. We create the tools to create the tools which create the solutions.

This is what the Comparison Process model can give us: an empirical, introspective approach to the living art of the cortical processes of creativity in programmers. Computers don't create programming progress; human brains' cortical processors do. If we understand that, then creating better and more methods to solve any problem will be much easier, and go much faster. Otherwise it's all trial and error, and unlikely to go anywhere very soon unless someone gets extraordinarily lucky. But winning the lottery is exceedingly unlikely in this case, although many are willing to bet on it. As is said, playing the lottery is a voluntary tax on those who don't know the laws of probability. We MUST play when the odds are for us, not against us.

As Einstein so wisely wrote, "An epistemological advance always precedes progress in physics." The same is true of AI. The Comparison Process is that epistemological advance in understanding the massive comparison processors in the human brain/mind interface. We don't have to understand the complex N-body problem of the brain's neurophysiology, which creates the COMP. We HAVE the needed function already identified, and it's the COMP.

16. We only have to understand the Function side: how programmers are creative and what skills they use, as any professional does when he works so much better and faster than others without those skills. We must learn more about programmers' creativity; it's a good bet that many of those skills are already known. Those programming skills make a person a better, faster, more accurate programmer, and creative besides. That is the key here. Understand the creativity of the programmers, and design it into a computer to create more computer-driven, computer-originated creativity. That by itself will bring AI far faster. There is no theoretical reason, given the success of basic voice/image recognition programming, to believe anything but that it can be done with work, trial and error, and an understanding of the basics. Otherwise, without the COMP, it could take 100 years, and there would be little or no understanding of how it worked, either.

We know how Einstein, Wallace/Darwin, Edison, Archimedes, and others created their new understandings. We need only compare THOSE examples to the methods used by programmers to find new solutions. When the COMP enhances creative progress by direct use by programmers, progress in creating computer/mathematical models of human language will proceed very quickly, because the same processes which create new creativity for computer programmers will create computer handling of human language, language use, and understanding. It's recursive, self-reflexive, and will likely work. One method of creating a COMP model in machines can create all the rest of AI. Create the tools which create the tools, and the rest follows.

I wrote all of these articles simply by using the COMP, in less than 3 months' time, as a man in his mid-60s who supposedly should not have much creativity left. The COMP and its reinforcing dopamine reward can give that to many persons. And what could it do to boost the output of a 30-year-old, already creative computer programmer and analyst? The mind fairly boggles. See the rest at:
jochesh00.wordpress.com

This is what the COMP portends. If we can understand understanding, if we can create creativity, then there are far more and better answers ahead.


The Continua; Yin/Yang, Dualities; Creativity, and Prediction

By Herb Wiggins, Discoverer/Creator of the Comparison Process/COMP Theory/Model; 14 Mar. 2014

“Every structure has its capabilities and its limits.”

“Observe the wisdom of the Ant.”
— Proverbs 6:6-8, 30:24-25

Contents:
1. The Continua and dualities as the source of the same, opposites and complementary forms.
2. Kinds of continua: EM spectra; periodic table of elements and isotopes; listing of subatomic particles; biochemicals, IUPAC listings; evolutionary Tree of Life
3. The electromagnetic spectrum continuum: extending and creating the EM spectrum from the visible light range; description and measurement; many examples of discovery/creativity along the continuum
4. Sound continuum.
5. Motion and speeds, from at rest to near light speed.
6. Introducing the Exponential Barriers (ExponBars) as part of natural laws; what these can mean as limits to measurements.
7. Sense of heat, as a source of heat/energy continua; comparison process of detecting heat by the skin; ExponBars at absolute zero and at the highest heats approachable by matter; continuum from absolute zero to Cee.
8. Touch, hardness, and pressure continuum. Mohs scale as comparison process; pascal continuum of hardness; Pauli Exclusion Principle (PEP).
9. Thesaurus as the best source of possible continua.
10. More ExponBars: particle physics, Heisenberg Uncertainty Principle; Bell’s test of measurement of spin in entangled particles/photons.
11. ExponBar of the Perfect line or circle measurement thought experiment
12. The continuum of time, from the Big Bang to present
13. The continuum of the brightness vs. number of galaxies and the Low Surface Brightness Galaxies, Disney/Malin.
14. The chemical continuum.
15. DeBroglie matter/wave graph.
16. Taste, discussion deferred.
17. Feynman diagrams as a special case of continua for describing particle interactions simply, another case of creativity.
18. The dialectic as a special case of creativity using the dualities along a continuum.
19. Matter density continuum, and the discontinuities.
20. Evolutionary Tree of Life
21. Plate tectonic model
22. Verbal continua describe complex systems in very fine detail, arriving at an understanding of N-body problems without using mathematics, a capability of the COMP denied to mathematics.
23. Traveling salesman problem
24. The complex medical history, physical examination, differential diagnosis, and treatment protocol tables used in the practice of medicine, as yet another case of a complex system yielding to COMP methods: an N-body solution using the COMP of language to do what math cannot.

1. The Continua are modern, updated, vastly extended forms of what in older times were considered Dualities, such as love and hate, up and down, light and dark, good and evil, and indeed any pair of synonyms/antonyms. Many of these lack numericity, though some have acquired it: tall/short, lean/overweight, and the many unlimited (Comparison Process) other ways each can be stated, almost endlessly.

2. In the sciences, the electromagnetic spectrum of light is one of the commonest, most visible of the continua. There are very many others: the continuum of the periodic table of the elements, including all the isotopes, well self-organized, clearly showing how they all relate to each other in terms of nuclear/electronic structures, proton/neutron numbers, and so forth. There is the continuum of particles, from the smallest stable neutrinos, up through the unstable mesons, then to the stable protons, electrons, and neutrons; thence up to atoms, to the compounds, inorganic and organic, and the endless biochemicals of the polypeptide/protein chains; up to the continuum of the Tree of Life, upon which all known species are located, generally organized by phylum, up to the great apes, our closest cousins, and the latest series of humans: Neanderthal man, Cro-Magnon, and modern humans.

3. The EM spectrum is a good one to start with, because it's a continuum of light in all its various forms. At first we had only the visual sense, a system of black, shades of grey, and white, dark/light, and all the related dualities in our languages. Colors were also seen, which have very ancient names as well. This is the continuum of light given to us by our visual system: the rods giving black/white, the cones giving the colors, ROYGBIV.

The study of optics goes back to ancient times, when ground convex lenses were used to magnify small objects. But not much happened until Newton passed a beam of white sunlight through a prism and saw the spectrum, realizing at once that it was the rainbow, the naturally occurring form. The rainbow is created by refraction of light within water droplets; Newton understood it because he could create the spectrum at will with a prism and sunlight. Frequency and energy are related by the equation E = hν, the photon's energy being Planck's constant times the frequency of the light, and wavelength relates to frequency by c = λν. This was the creative insight of Newton: he saw at once, by the comparison process, that the rainbow and the visible light spectrum were the same.
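
A quick sketch of those two relations at work, with standard physical constants and representative visible wavelengths (the function name is mine):

```python
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s

def photon(wavelength_nm: float):
    """Compare a wavelength to c to get frequency, then apply E = h * nu."""
    nu = C / (wavelength_nm * 1e-9)
    return nu, H * nu   # frequency (Hz) and photon energy (J)

for color, wl in (("red", 700), ("green", 550), ("violet", 400)):
    nu, energy = photon(wl)
    print(f"{color}: {nu:.2e} Hz, {energy:.2e} J")
# Violet photons carry nearly twice the energy of red: the continuum is ordered.
```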

The EM spectrum/continuum begins at the lowest (duality) frequency end, the long wavelengths of alternating-current fields, then moves up (duality) through radio waves, microwaves, infrared, the Red/Orange/Yellow/Green/Blue/Indigo/Violet series of visible light, then ultraviolet (UV, "ultra" being yet another relational duality), X-rays, and gamma rays. (Cosmic rays are often placed beyond these, though they are mostly not light at all but particles, largely protons and atomic nuclei accelerated to near light speed, Cee.) This is a physical, real continuum, and the relationship of each frequency to the others can be read by scanning the frequencies from lower to higher, yet another continuum. Every part is related to every other part by frequency, wavelength, and energy. So it's not only created by the comparison process; just like a dictionary, it can be read by the COMP, because it has been ordered by numericity. This is where the measurement of light came in. Numbers, in terms of frequencies, were applied to ROYGBIV, which is what science does in order to describe/measure phenomena more accurately. And measuring is done by comparing an arbitrary metric standard against that which is being measured.

But now look: our eyes "measured" the light as well, by comparing dark to light, which is the number of photons coming in, few to many, and all the shades of grey in between. This is physiological description by our visual systems. And our cones measured light by creating colors which corresponded to frequencies, too, though we didn't know the order of the frequencies, from lower to higher, before Newton. Yet our eyes did, in their way: the three cone pigments respond to different bands of frequencies, and this is how the red/green/blue base colors came about. We just didn't know it; our visual system, we see in retrospect, did.

Eventually it was discovered that there was more to light than the visible, though how our eyes "tuned" themselves to the brightest, relatively non-injurious frequencies, matching those of our sun, is also apparent. We see better when it's brightest, because more photons carry more information to our eyes. The most efficient way to "see" is therefore to use the brightest part of the solar spectrum, i.e., the yellow-greens, to which our cones are most sensitive. This is no accident. It's a Least Energy Principle (LEP) approach. Using the COMP, we can see this.

IR was found; radio waves were found by detecting how electric sparks created waves which induced slight currents in antennae, and the radio/TV/microwave systems we all use today were created in response, again by the COMP. By extending the spectrum to the higher frequencies, UV, X-ray, gamma ray, and cosmic rays, using instruments which basically translate radiation invisible to our eyes into forms our senses can detect, the whole continuum was found. Note that our senses were the starting point of this progress. By creating a continuum from the spectrum, then creatively extending that continuum, out popped the entire EM spectrum. It was ordered by the comparison process, measured by combining it with mathematics, and became a great deal more understood, organized, and predictable. All mediated by the COMP.

4. Let’s look at sound, which our ears pick up, too. We organize it in music by loudness which is the force of sound(Newtons/meter-squared) waves hitting our eardrums/second, which was first measured by our ears as quiet and loud sounds and all those in between. We also heard in our musical instruments the lower pitches and the higher pitches. and we organized our string and wind instruments, as well as percussion instruments in the same way. It’s very analogous to how the sciences organized sight, isn’t it? By a relational scale of soft and loud and low tones and higher tones, which is also the comparison process, too, And also the duality which it can create, too. It’s relational, not absolute. We can hear a set of intervals, compare them in terms of rhythm and pitch and we can ID the melody, “Mary had a Little Lamb”. and in the same way any other melody we “know”, that is we have in our musical Long Term Memory. It doesn’t matter what key the melody is in, it’s all related to the intervals of the notes. This is no different for speech, language, vision, and so forth. It’s all the same, the COMP, which re-Cognizes, I.E. re-KNOWS what it’s heard before. It’s the basic cognitive process, the COMP.

The series of pitches is the frequency of the tones: each C is double the frequency of the C an octave below it, middle C near 256 Hz on the old "scientific" scale (about 261.6 Hz in modern concert pitch), with the Cs above and below at 512 and 128. All relational, all ordered, all knowable, all predictable, so we can play a tune. But we don't NEED to know the key the tune is played in, just the relationships of the notes to each other. We "read" sheet music in just the same way we read words. We compare the notation to the keys on the piano, the frets on the guitar, the levers and holes on the wind instruments, each of them corresponding to what we see on the sheet. It's the same thing as reading, the comparison process, exactly, just altered for tonality and processed in the music/hearing centers of our right and left hemispheres.
From the notes people could hear, or not, we realized by comparing people's hearing that some could not hear tones others could; some could not hear softer sounds, either. So it was asked: what if we play a still higher note? And through that, more of the sound continuum was found. Ultrasound whistles were made and ultrasonic frequencies discovered; they go very, very high, each step higher in frequency and more energetic, just like light. Eventually infrasonic tones were found, at frequencies lower than the audible. The same continuum, you see? The same line.

5. Let’s take motions and speeds. Essentially we “see” a movement across our line of sight by comparing each position to another at intervals in our eyes. The visual system smooths out those jumps so we see it as continuous. In the same way, repeated flashing of images on a screen too fast for our eyes to see each of them, sets up the illusion of movement. We call these motion pictures, now videos. The movement can be measured, also, again comparing distance the object travels compared to time. That is, it’s a ratio, a proportion, a comparison process.

Speeds can be seen as no speed, then slow, and up to very, very fast. Again the slow/fast duality, and the continuum built around it, extending to speeds faster and slower than we can detect. From being at rest, compared to a fixed spot, to very close to the speed of light as seen by our instruments: that is the current continuum, and it is measurable. Objects can move so fast our eyes don't see them; for this reason high-speed photography was developed, instrumentally allowing our visual systems to detect very fast processes.

But we know that all speeds are relative, that is, compared to some fixed spot; there is no absolute resting spot. The ultimate speed is light speed, according to the physics of the last century. We can approach that speed starting from a full stop, but only with increasing difficulty: the faster an object already moves, the more energy it takes to speed it up further toward light speed. At ordinary speeds, kinetic energy is approximately E = ½mv², one-half the mass times the velocity squared.

6. Einstein showed that we could never reach light speed by ordinary acceleration. We move up an exponential barrier (ExponBar), like covering 1/2 of the distance to a spot, then 1/4, then 1/8, then 1/16, and so on: never arriving. The actual relation is the Lorentz factor, γ = 1/√(1 − v²/c²), from the Lorentz-FitzGerald work; the relativistic kinetic energy is (γ − 1)mc², which grows without limit as v approaches c. E = ½mv² is only the low-speed approximation, not the reality. This is a highly confirmed fact, and it is what creates the exponential barrier. We will return to the ExponBar soon.
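
A tiny sketch makes the barrier vivid (the sample speeds and function name are mine): tabulating the Lorentz factor as v approaches c shows each step closer costing vastly more than the last.

```python
import math

def gamma(v_over_c: float) -> float:
    """Lorentz factor: the energy multiplier that diverges as v -> c."""
    return 1.0 / math.sqrt(1.0 - v_over_c ** 2)

for beta in (0.1, 0.5, 0.9, 0.99, 0.9999):
    print(f"v = {beta}c  ->  gamma = {gamma(beta):10.2f}")
# gamma climbs from ~1.005 at 0.1c to ~70.7 at 0.9999c: the ExponBar itself.
```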

Essentially, our nervous systems can also create motions, using muscle contractions. We can throw a ball, swing a tennis racket to hit a ball, the same with a bat in cricket or baseball, or in jai alai. We can run fast, walk fast or slowly, or not at all. Thus the same rest/slow/fast duality which is seen also creates the continuum, though a much slower one than many processes, in which we live and act. This is what the motor strips, the cerebellum, and the rest of our motor systems do, mediated in some way by the comparison process going on there. But I will not tackle that yet.

Touch is also a sense we have, and there are many descriptive aspects to it, many dualities. We can feel whether something is cold or hot, cool or warm, soft or hard, smooth or rough, wet or dry, etc., but I will not review all the human senses of touch, which are manifold.

7. Taking cold and hot, the major duality of temperature as it's now known: we can detect these differences, feel a fever, or tell that something is too cold. These are also comparison processes. We can describe these things, although within some major limits. How this relates to the continuum of cold and hot is obvious. Eventually means were created to measure temperature more accurately than our senses can judge it. Temperature is basically a measure of how much thermal energy a substance holds: if it holds more, it's hotter. We can heat something up by many means, and the scale starts near, but never at, absolute zero, then extends up very high, well past the plasma levels in our sun. Notice that all of our adjectives here are comparatives, not absolutes. Again, the Comparison Process.

How does the skin know what temperature something is? By comparing its own resting temperature to that of what is being touched. If the skin is warm, anything warmer than the skin will feel warm, and anything cooler will feel cool. Anything scalding will burn and create pain, so that, too, is another scale, so far not well understood. The temperature sense of the skin is a comparison process. If the skin is very cool on a cold day, even a modestly warm object against it feels uncomfortably HOT! Putting something colder than 50 degrees against very warm skin on a hot day can feel extremely cold. Temperature sensation uses the comparison process to detect relative temperature differences, thus hot/cold, warm/cool, etc. This is simply a fact.
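
A minimal sketch of that relative-not-absolute claim, with toy thresholds of my own (real thermoreceptors are far subtler): the reported sensation depends only on the difference between the object and the skin's own temperature.

```python
def feels(object_temp_c: float, skin_temp_c: float = 33.0) -> str:
    """Sensation from the object-minus-skin comparison, not the absolute temp."""
    diff = object_temp_c - skin_temp_c   # the comparison itself
    if diff > 10:
        return "hot"
    if diff > 0:
        return "warm"
    if diff > -10:
        return "cool"
    return "cold"

print(feels(30.0, skin_temp_c=20.0))  # 'warm' -- mild water feels warm to chilled skin
print(feels(30.0, skin_temp_c=36.0))  # 'cool' -- the same water feels cool to warm skin
```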

This establishes a continuum once more, from near absolute zero up through water's freezing and boiling points: the Comparison Process again, which by now should surprise no one, given its ubiquity. But approaching absolute zero is very difficult, because the closer we get, the more work it takes to get closer still. This is in fact yet another exponential barrier, an ExponBar. It has been hard to find an equation for it that matches the Lorentz-FitzGerald form for approaching the speed of light. Thus temperature is bounded at the highest levels by the exponential barrier of approaching light speed, and at the lowest level, absolute zero, by a similar exponential barrier. Wouldn't it be interesting if the equations were equivalent?
http://en.wikipedia.org/wiki/Absolute_zero

The highest temperatures we can reach go through the states of solid, liquid, gas, plasma, and then higher still. As temperature reflects the speeds of moving atoms and particles, and the velocity of particles cannot exceed the speed of light, the highest possible temperatures meet an exponential barrier as well. So heat has an ExponBar at the low end, absolute zero, and another at the high end, set by Cee. What does this mean? More later. Two more ExponBars. Hmm. Are we going to keep on seeing these? Yes. They are ubiquitous and unending, which sounds like what? The COMP.

8. Touch also extends to softness and hardness, as relative scales. If something is too hard it can hurt us, causing pain; if soft, we feel the difference. Again the duality, and once more the continuum. The Mohs scale was long used to measure the hardness of minerals. Geologists assigned fixed hardnesses to talc, calcite, quartz, corundum (sapphire), and diamond, in ascending order. Again, softer or harder: the comparison process. Again, a measurement of substances compared to each other. If one substance could scratch another, it was the same hardness or harder, and that was how the scale was determined. Once again, the COMP. Are we having a pattern recognition event yet?
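
A hedged sketch of the Mohs procedure as pure comparison: the hardness numbers below are hidden inside the simulation only so the scratch question can be answered; the ordering itself uses nothing but pairwise scratch comparisons, never an absolute measurement.

```python
from functools import cmp_to_key

TRUE_HARDNESS = {"talc": 1, "calcite": 3, "quartz": 7, "corundum": 9, "diamond": 10}

def scratches(a: str, b: str) -> bool:
    """Does mineral a scratch mineral b? Equal hardness: each marks the other."""
    return TRUE_HARDNESS[a] >= TRUE_HARDNESS[b]

def mohs_order(minerals):
    """Order minerals using only pairwise scratch tests, never the numbers."""
    compare = lambda a, b: int(scratches(a, b)) - int(scratches(b, a))
    return sorted(minerals, key=cmp_to_key(compare))

print(mohs_order(["quartz", "diamond", "talc", "corundum", "calcite"]))
# -> ['talc', 'calcite', 'quartz', 'corundum', 'diamond'], softest to hardest
```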

Eventually the pascal scale of pressure was set up, measuring from a resting state of no pressure up to the highest states possible, where even diamond can be scratched or fractured by pressure, and not just any diamond, but a carbon-13 diamond, which at roughly 125 gigapascals (GPa) of hardness has been reported as the hardest substance known.
But we are not done yet. The force which determines hardness is the electron repulsion between the interacting objects, say quartz and glass. Each will usually scratch the other; they are of about the same hardness, both being mostly SiO2, and the scratching breaks the bonds of the solid SiO2. The pascal is measured relative to set standards, too, once again showing the comparison process in its fullest glory as the basis of all measurement.

Here we have another continuum, do we not? Based originally upon the senses, and so on and on: mathematized and measured, arising yet again from the dualities, from the sensations.

What is the hardness of the electron and proton? It's the Pauli Exclusion Principle (PEP), which forbids two fermions from occupying the same state at the same time. The pressure which prevents us from passing a hand through steel is just that. But how hard is the proton? We know there is neutronium, created in ultra-high gravitational fields where almost all protons and electrons are compressed into neutrons. On the hardness line above ordinary matter we find white dwarf star matter, then neutron star matter, and beyond that only one thing, the black hole, which cannot easily be examined. This is the hardness scale, the continuum as we now know it, lined with discoveries and creativity events.

Surprisingly enough, the PEP is essentially the source of almost all of our motor vehicle laws: don't try to violate the PEP, that is, don't collide with anything! Simple, real, exact, basic. Again, the creativity of the comparison process, simplifying and increasing our understanding of events.

9. But are there any others? Turn to a thesaurus and see how many are known. It's the endlessness of the comparison process all over again, is it not?

10. How about some more ExponBars? First, particle physics. Particle physicists have been using higher and higher energies and velocities, at higher and higher costs, for the last 70 years in order to investigate matter, basically by speeding it up toward light speed, an ExponBar in and of itself. Every time, they built a bigger machine, until the largest, the Large Hadron Collider, was created at CERN on the French/Swiss border. That one cost nearly as much as all the others combined. Finding the Higgs took a great deal of time and energy: roughly the cost of building the machine, some $10 billion, plus the cost of the experiments themselves. The figures were found and confirmed by two detector teams, but at a single site, on a single machine, then published; strictly, results need independent confirmation at separate sites to be fully reliable and certain. Will another $10B+ be spent? It's not likely, nor is it affordable. This is the ExponBar at work again, also called the law of diminishing returns.

Let’s take the Heisenberg Uncertainty Principle(HUP), which states that we can determine the spin of an electron, but we cannot simultaneously determine its exact location, or vice versa. If we do try to do so, what do we get? BingO!, the ExponBar. The HUP is yet another form of the general class, ExponBar. This has not been understood or known before.

If we do Bell’s test to determine the speed of exchange of information between the 1st entangled particle/photon of a pair, it then determines that of the other, an instantaneous speed of information transfer between the two particles/photons. So we find at first that to be at least under the speed of light. So we try it again and it rises to several times Cee. We try it again, with more and more energy/cost/time and we find it 1000′s of times Cee. The latest experiment shows 40,000 times Cee, rising up the ExponBar again. Hmm. What is going on?

11. When we try to find the perfect line or the perfect circle, we again run into the Exponential Barrier. Is it in fact a limit of our systems of measurement, or is it real? What does the ExponBar signify? No matter what system we use, no matter how we do it, we will always find the ExponBar at the limits of our ability to measure anything by the Comparison Process. No matter where, no matter when, no matter how. There is a limit to knowledge built into the system, of which the speed of light, absolute zero, and the HUP were the first known. There are endless others, too.

12. Let’s go back to some more dualities/continua, such as time. Past and future. Life and death. Young and old. Early and later. Morning and evening. Day and night. Each of these terms relates to the others by the COMP, please note. This is no accident. This is the way measurement/descriptions work.

We measure time relative to the system in use in each culture. Each has a single point which was year one: the founding of Rome, the birth of Jeshua ben Josef, the start of the Jewish calendar, the first year of Ramesses II's reign, the date of the Hegira, the date of the Nile inundation in Khemet (ancient Egypt), and so on. Each of these time lines counts a series of repeating intervals, using the lunar or solar method of comparison. Time, like everything else described or measured, is always measured relative to a set standard. Nothing is absolute; we can start anywhere and end anywhere, but we use utility, practical value, and the least energy principle to set these systems up so they will be stable.

At first, time was the endless cycle of the years, from the heliacal rising of Sirius to the first day of spring, when the day was split equally between daylight and night. It was measured against the fixed stars, or against a set number of days in the lunar calendar, but ALWAYS, always compared to some standard, a COMP. Eventually we learned that time stretched far, far longer behind us than we had imagined. The evidence kept piling up: the long durations needed for weather, water, and rivers to erode the mountains and plains; the breakdown of radioactive isotopes; the immense times necessary to explain the worldwide layers of sedimentary rock, and so forth. The age of the earth was found to be first thousands, then millions, and then billions of years.

The continuum of time was being created as we understand it today, just like all the others: initially based upon comparison processes relating words to each other, just as we still do. "See you in the morning": explained/defined as early in the day, after sunrise, part of day and not night. Again, the comparison process. Constant, measured against stable processes such as astronomical observations, clocks, atomic clocks, and even the processes going on in pulsars.

Much creativity and discovery can be found inside these continua. The colors were found by Newton's prism. Radio waves were discovered by Hertz and developed by Tesla and Marconi; X-rays by Roentgen, and so forth. The sound continuum also had its discoverers, as did the other continua, such as the Tree of Life, the listing of all the species of life on earth and how they all relate to each other by the Comparison Process. That too is a continuum.

13. Another continuum which has recently been added to is that of the kinds of galaxies. A very thoughtful astronomer named Michael Disney plotted the brightness of the known galaxies against their number and found no smooth tail-off where one would be expected. So he reasoned there must be huge numbers, tens of billions, of hard-to-see, low-brightness (COMP) galaxies. David Malin of the Australian Astronomical Observatory, using long exposures and sensitive detection techniques which can register extremely low numbers of photons that ordinary photographic paper cannot, found them: the Low Surface Brightness Galaxies. Billions of them, including a real monster, the 650,000-light-year-diameter Malin 1! A wonderful discovery, again made by creatively extending a known continuum, just as has been seen scores of times before.

14. So this provides yet another route to creativity, does it not? Find a continuum and extend it. Trial and error. Sooner or later we will make more discoveries based upon this simple process. For instance, when organic molecules are made, we can add all sorts of groups, such as -OH, -COOH, -CH3, etc., to them. By simply listing the known molecules and adding groups onto them, we can create endless numbers of new compounds, can we not, as long as we know the rules for adding on groups? This is the comparison process which creates word strings, the number line, the alphabet, etc. This is essentially the basis of the much-cited combinatorial chemistry method, which created such a stir in the early 1990s. It's also a basis of the current revolution in pharmaceuticals: it used to take months to make a few new compounds of, say, candidate antibiotics; now 100K compounds can be synthesized in micro quantities in a short time and tested for effectiveness simultaneously, including antibiotics. Since no known bacterium can be resistant to 100K antibiotics at once, this means, does it not, an effective end run around bacterial resistance?
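
A purely illustrative sketch of that combinatorial extension (the scaffold and group names are schematic placeholders of mine, not real syntheses): crossing a scaffold's substitution sites with a small library of groups multiplies a few reagents into a large candidate set.

```python
from itertools import product

SCAFFOLD = "X-benzene-Y"                 # a schematic core with two open sites
GROUPS = ["OH", "COOH", "CH3", "NH2"]    # a toy library of substituent groups

# Enumerate every combination of groups at the two sites.
candidates = [
    SCAFFOLD.replace("X", gx).replace("Y", gy)
    for gx, gy in product(GROUPS, repeat=2)
]
print(len(candidates))   # 16 compounds from 4 groups at 2 sites
print(candidates[:3])
# With 10 sites and 100 groups the count is 100**10: the continuum, extended.
```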

15. In the de Broglie matter/wave picture, listing at one end the photons and then moving up through the electrons, the protons, and so forth, we can construct a matter-wave line: the de Broglie wavelength is λ = h/mv, shrinking as mass grows. At the point where the waves become very hard to detect lies the borderline between the quantum level and the macroscopic level of existence we call normal. Such a duality of wave vs. matter characteristics can be set up to measure the transition between where probabilistic quantum effects dominate and where the deterministic, stoichiometric mass-action rules of chemistry do. This is how the border zone can be more easily explored: set up the continuum from where quantum effects prevail, such as quantum tunneling, to where they do not occur, or are much less likely. Diagram it; make it real and visual.
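
A quick sketch of that borderline, using the standard de Broglie relation with representative masses (the function name is mine): wavelengths shrink with mass, from the atomic scale of a slow electron to the utterly undetectable scale of a thrown baseball.

```python
H = 6.626e-34  # Planck's constant, J*s

def de_broglie_wavelength(mass_kg: float, speed_m_s: float) -> float:
    """lambda = h / (m * v), the matter-wave scale of a moving object."""
    return H / (mass_kg * speed_m_s)

print(de_broglie_wavelength(9.11e-31, 1e6))   # electron at 10^6 m/s: ~7e-10 m, atomic scale
print(de_broglie_wavelength(0.145, 40.0))     # 145 g baseball at 40 m/s: ~1e-34 m
```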

16. Taste is a more complicated sense, but will be treated later.

17. In the Feynman diagrams we see that continua have been created which can show relationships far more easily than talking about them or doing the full mathematical work. The system is more energy-efficient than the raw math, and this is why Feynman's diagrams superseded Schwinger's ponderous methods, which took weeks of complicated mathematics. Feynman's solution was creative genius at work, again the comparison process: don't do all the math, draw the diagrams following simple rules. We see his genius at work, introspectively, using the COMP. The comparison process allows us to look inside the mind, empirically.

Look around, find some dualities, create a continuum around them, and make some discoveries. Or graph two characteristics against each other, creating a continuum, and make new discoveries. That's another way creativity can be done.

18. Regarding the dialectic, this can also be understood as a kind of continuum arising from the duality of thesis/antithesis, which when combined create a synthesis. This explains the limited but real creativity of the dialectic, though it has largely been ignored for some time as being not as useful as first claimed. It's a duality which can create a continuum of the form: A + opposite-A > synthesis. There are complementary dualities, such as male/female and right/wrong, versus true opposites, such as yes/no, good/evil, up/down, in/out; well, check the thesauri for the rest.

19. The densities of matter are also a continuum. Compare the most dense, the black hole, to neutron star matter, then to white dwarf compact matter, each with its alinear discontinuities due to the above, then to compressed matter, to solids, liquids, gases (vapours), to plasmas. Then on up to vacuums: the space vacuum of about 1 hydrogen atom per cubic meter, then space which contains a good deal of dark matter (neutrino gas) at lower densities, and from places in space where there is a marked Casimir effect to places of even lower effects. These latter cannot yet be quantitated to any great degree. This is where discovery and creativity come in. In time we will be able to find out where the neutrino gas is of higher density vs. lower, and how to quantitate it, too, just as the Casimir effect will differ in various areas of the universe due to gravitational/mass density changes near large galaxies and out in the great voids between the star clusters seen by astronomical surveys. Again, the density continuum, with many, many discovery/creativity events lying upon that great line, and many more to be found.
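A minimal sketch of the continuum's span, using rough, order-of-magnitude textbook figures; the black hole is omitted because its "density" depends on what volume one assigns it, and the more exotic entries are illustrative estimates.

```python
# A minimal sketch of the density continuum: approximate densities (kg/m^3)
# for the states of matter named above, sorted to make the continuum explicit.
densities = {
    "neutron star core":                  5e17,
    "white dwarf matter":                 1e9,
    "solid (iron)":                       7.9e3,
    "liquid (water)":                     1.0e3,
    "gas (air at sea level)":             1.2,
    "interplanetary space (solar wind)":  8e-21,
    "interstellar space (~1 H atom/m^3)": 1.7e-27,
}

for name, rho in sorted(densities.items(), key=lambda kv: -kv[1]):
    print(f"{name:>36}: {rho:10.2e} kg/m^3")
# Roughly 44 orders of magnitude separate a neutron star core from the space
# between the stars: a continuum with room for many discoveries along it.
```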

Let's look at 3 major complex systems: the Tree of Life, the plate tectonic model, and the Hertzsprung-Russell Diagram (HRD) of the known star types.

20. The evolutionary Tree of Life was established by comparing all known living species, including viruses, and organizing them into groups wherein they share major characteristics with others. For instance, the six-legged insects are found in the phylum Arthropoda, where the ten-legged decapods, such as shrimp, and many hundreds of species of plankton and other forms are also found. The eight-legged spiders are placed among the arachnids, the many-legged centipedes and millipedes among the myriapods, and the chitinous segmented worms correspond to many of these, too, just with a lot more legs.

Each of these is related to all of the others by the branching diagrams, simply by following a branch down to where the tree forked: where modern Homo sapiens (socialis) diverged from the Cro-Magnons, the same H. sapiens; where Homo neanderthalensis (Neanderthal man) branched off the line; where Homo erectus (Java Man and Peking Man) separated out; on to the newer finds of H. floresiensis on the island of Flores, Indonesia, and to the latest human form, H. denisovensis, found in central Asia. There are a number of branches of early man yet to be found, lying, yet again, on the fertile continuum of human evolution.
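That "follow the branches down to the fork" operation is simple to state in code. A minimal sketch, with a simplified and partly conjectural hominin lineage used purely for illustration:

```python
# A minimal sketch: store the tree as child -> parent links and find the most
# recent common ancestor of two species by comparing their ancestor paths.
# The lineage below is simplified and illustrative, not settled paleontology.
parent = {
    "H. sapiens":          "H. heidelbergensis",
    "H. neanderthalensis": "H. heidelbergensis",
    "H. heidelbergensis":  "H. erectus",
    "H. floresiensis":     "H. erectus",
    "H. erectus":          "H. habilis",
}

def ancestors(species):
    path = [species]
    while path[-1] in parent:          # walk up the branch to the root
        path.append(parent[path[-1]])
    return path

def common_ancestor(a, b):
    seen = set(ancestors(a))
    return next(s for s in ancestors(b) if s in seen)

print(common_ancestor("H. sapiens", "H. floresiensis"))  # -> H. erectus
```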

We cannot extend this family line into the future, except by pointing out all of the variations of the wolf, which has been domesticated into the 100's of known dog breeds, a VERY prominent and undeniable example of evolution in action. In this case man was the evolver, not a more natural cause. The same is true of all the domesticated varieties of plants, now numbering in the 100,000's. Again, the discovery/creation of new variations of plants and animals is, yet again, the creativity we necessarily find along a continuum, in this specific case the Tree of Life. All the myriad ways, from the simple to the complex: the comparison process at work in creating new forms of life as well as cataloging, organizing, and understanding what has been found. Le Chanson Sans Fin, the universal song without end.

21. Looking at the plate tectonic model will reveal something very unexpected and useful. Essentially, using the basic concept of the geological plate (see the maps linked below), the geologists created a working model of how geological processes act and where they occur. This is in fact a sort of round comparison-process continuum. All plates are related to other plates. No plate is absolute, and their complex movements and descriptions are based upon a few simple rules/processes. Again, a visual, diagrammatic, 3-D model, where a few lines are worth 1000 words and umpteen pages of math, if the math can be done at all.

http://vulcan.wr.usgs.gov/Glossary/PlateTectonics/Maps/map_plate_tectonics_world.html
The simplified version.

http://volcanoes.usgs.gov/about/edu/dynamicplanet/nutshell.php

The geological processes are: sea-floor spreading, due to upwelling at the oceanic ridges; subduction zones, where seafloor moves down under continental margins, creating (1) volcanic zones (the Cascade Mountains, N. Am.) from subducted materials melting and rising up, and (2) subduction-zone earthquakes, the most powerful of all quakes; transform fault movements, such as along the San Andreas fault; and hot spots on the ocean floors, with their related effects. The East African Rift System spreading zone is seen as a land version of the seafloor spreading zones.
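The model's structure is exactly the kind of comparison network a few lines of code can capture. A minimal sketch, with real plate names but only an illustrative handful of boundaries:

```python
# A minimal sketch of the plate model as a comparison structure: plates are
# nodes, shared boundaries are edges labeled with one of a few process types.
# The boundary list is a small illustrative subset of the full model.
boundaries = [
    ("Pacific",        "North American", "transform"),   # San Andreas fault
    ("Juan de Fuca",   "North American", "convergent"),  # Cascade volcanoes
    ("South American", "African",        "divergent"),   # Mid-Atlantic Ridge
    ("Nazca",          "South American", "convergent"),  # Andes, megaquakes
]

effects = {
    "divergent":  "sea-floor spreading: ridges, new crust",
    "convergent": "subduction: volcanoes and the largest earthquakes",
    "transform":  "lateral faulting: San Andreas-type quakes",
}

for a, b, kind in boundaries:
    print(f"{a} / {b}: {kind} -> {effects[kind]}")
```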

Listed from the first plates found: the sea-floor spreading at the Mid-Atlantic Ridge, which separates Africa/South America in the south and Europe/North America in the north. Then the rest of the circular continuum was constructed around that, a classical extending of the continuum until it was far, far more complete, though more work is going on now. Each new plate found was another creative act/discovery, and much more is yet to be found. The finding of living systems completely independent of the sun at the East Pacific Rise, yet another kind of discovery, also lies on the spheroidal geometry of plate tectonics. With more such spots to be found, undoubtedly very ancient ones as well.

22. Plate tectonics is complex systems theory, a descriptive solution of the N-body problem, is it not? No math, but visual descriptions and occasional measurements in 3-D which show how the geological processes work. We see that the COMP can create a predictive theory based upon descriptions which are NOT mathematical, yet real and useful. We successfully model complex systems, but have a great deal of trouble developing these systems using math, esp. biological systems. Where such models do work, they show the structure/function relationships using programming. This is also probably another way to AI modeling of the human brain. What works for complex systems such as plate tectonics should logically, by analogy, work for living systems, organs, and genetics, once we have enough of the basics to model along the same lines as in plate tectonics. Which, come to think of it, was how the characteristics of the comparison process (work in progress) were found: recognition with LTM, ordering (empirical introspection), etc. It should even work to model the complex system which is our solar system.
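That solar-system remark can be made concrete: the gravitational N-body problem has no closed-form solution for N of 3 or more, yet a program that steps a few simple rules forward in time models it well, which is the rule-based, programmatic approach argued for above. A minimal sketch in toy units, with illustrative masses and positions:

```python
# A minimal N-body sketch: no equation is "solved"; one simple rule (every
# body accelerates toward every other body) is stepped forward repeatedly.
# Units, masses, and starting values are toy numbers for illustration.
import numpy as np

G, dt, steps = 1.0, 0.01, 1000
pos = np.array([[0.0, 0.0], [1.0, 0.0], [-1.0, 0.5]])   # 3 bodies, 2-D
vel = np.array([[0.0, 0.2], [0.0, 1.0], [0.0, -1.0]])
mass = np.array([1.0, 0.001, 0.001])

for _ in range(steps):
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i != j:
                r = pos[j] - pos[i]                       # vector toward body j
                acc[i] += G * mass[j] * r / np.linalg.norm(r) ** 3
    vel += acc * dt                                       # rule: update velocity
    pos += vel * dt                                       # rule: update position

print(pos)  # positions after stepping the rules forward in time
```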

23. The traveling salesman problem has been solved, approximately but efficiently, by bees.
http://www.wired.com/2012/09/bumblebee-traveling-salesman/

This was imitated by man and then computerized, once the COMP had created a translation of what went on with the bees and how we can understand it in human language. Again, the translation of a bee creation/discovery into human terms/language, made possible by comparison processes. Again, trial and error as a basic part of creativity.

Bees did NOT solve this problem using math, any more than birds compute complex trajectories by math. But they did use a form of the COMP, because they have recognition, and thus the comparison process, in their analogous hymenopteran nervous systems. Ours can solve this problem, too, using the COMP. We translate the bees' method into descriptive language. It's roughly analogous to the mathematical series approach: the first few elements of the series are not as exact, but over time it becomes a good solution. And again, the Least Energy Principle in action!! This is yet another example of how the COMP works and what it can do which math can't. Do you get it yet?
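A minimal sketch of that trial-and-error refinement, with random "flower patch" coordinates standing in for the bees' real foraging sites: try a change to the route and keep it only if the round trip shortens. Early routes are rough; they converge toward a good solution, like the first terms of a series.

```python
# A minimal sketch of route refinement by trial and error: propose a random
# change (reverse one leg of the route), keep it only if the trip shortens.
# Site coordinates are random stand-ins for real foraging locations.
import math, random

random.seed(1)
sites = [(random.random(), random.random()) for _ in range(8)]  # flower patches

def trip_length(order):
    return sum(math.dist(sites[order[i]], sites[order[(i + 1) % len(order)]])
               for i in range(len(order)))

route = list(range(len(sites)))
best = trip_length(route)
for _ in range(5000):                       # repeated trials
    i, j = sorted(random.sample(range(len(sites)), 2))
    trial = route[:i] + route[i:j + 1][::-1] + route[j + 1:]
    if trip_length(trial) < best:           # keep only improvements
        route, best = trial, trip_length(trial)

print(f"route {route}, length {best:.3f}")
```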

And the bees DO have maps AND language, because they have recognition, too. They can recognize hive mates versus enemies/not self. They know their hives and where those are. They can fly out and come back. These are LTM (Long Term Memory) and the COMP, just like birds know where their nests/territories are, just like they know who their mates, foes, food, and predators are, too.

http://www.mpiwg-berlin.mpg.de/en/news/features/feature5/
Karl von Frisch's pioneering work into seeing what went on inside the bees' nervous systems (empirical introspection).

24. The complex medical differential diagnosis is just more of the same: complex system descriptions, lying on a complex continuum, which relate in all known detail the kinds of diseases of the various organ systems and how to diagnose and treat them, though it is far, far more complicated, too. See the Duchenne Muscular Dystrophy article (in progress) for more details on the processes/sequences of medical creativity and discovery, which have been and will be done. This is the unifying process underlying the Comparison Process. This is its unlimited utility and value. It can produce a COMP solution to the N-body problem.
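A minimal sketch of the differential as a comparison process, with toy disease profiles that are illustrative only, not clinical guidance: compare the patient's findings against each disease's known features and rank the candidates by match.

```python
# A minimal sketch (toy disease profiles, not clinical guidance) of the
# differential diagnosis as comparison: score each candidate disease by how
# much of its known feature set the patient's presentation matches.
diseases = {
    "influenza":     {"fever", "cough", "myalgia", "fatigue"},
    "strep throat":  {"fever", "sore throat", "swollen nodes"},
    "mononucleosis": {"fever", "sore throat", "fatigue", "swollen nodes"},
}

patient = {"fever", "sore throat", "fatigue"}

ranked = sorted(diseases.items(),
                key=lambda kv: len(kv[1] & patient) / len(kv[1]),
                reverse=True)
for name, features in ranked:
    match = len(features & patient) / len(features)
    print(f"{name:>15}: {match:.0%} of known features matched")
# The best match heads the differential: recognition by comparison to stored
# patterns, exactly as described in the text.
```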

Find your continua, describe them and understand, create and discover more. That’s a form of problem solving/creativity.