> Depending on what's being studied, I don't think
> it's necessary, or even desirable, to have
> majority (let alone complete) agreement with
> whatever turns out to be the facts.
"Species" is irrelevant. "Peer review" is irrelevant. "Human knowledge" is irrelevant. There are no "humans", merely 7+ billion individuals.
All that exists anywhere are individuals, and all individuals everywhere (other than us) think alike. It's so easy for us to see species and the like because our brains like to categorize and organize. We even think in such terms because we must learn language, not only to communicate but to learn what others who came long before us did. Language itself forces things into categories and types. Our math does this as well. Our perspective tends to grossly overestimate the importance of these things and even hides the fact that it's not necessary to think in such terms.
> I mean, if
> three people on Earth discover that the planet is
> in fact going to be destroyed 53 years from now,
> would we be far better off if all possible
> resources focused on how to get us out of that
> mess rather than "wasting" years of valuable
> damage control time just to corroborate that the
> end is morally, ethically, spiritually,
> physically, positively, absolutely, undeniably
> and reliably nigh?
Consciousness is provided to all life to estimate the odds of the outcomes of various behaviors. The "brain" is merely a biological machine that calculates these odds, a calculation the individual perceives as consciousness. Animals that survive to adulthood simply have a great deal of experience learning what works and what doesn't. A rabbit's reality is neither defined nor known because "rabbits" (read "all individuals with these characteristics") lack a complex language to pass on knowledge. Each "rabbit" is a scientist that must start all over at birth to learn the ways of reality.
If three people believe anything, it has no bearing on reality. It also has no bearing on other individuals unless they are believed. If those individuals are the three finest scientists, it changes nothing except to make the story more believable and more likely to be supported with evidence and logic.
> Again, maybe it's a tough
> line to draw, and maybe there isn't really a line
> at all, but more like a fuzzy, sketched line; a
> spectrum or confidence band that's based on the
> logic base of the audience. Maybe that, too, is
> built into the DNA code as one way Nature hedges
> its "bet" on each new "mutation". Again, random?
These are "human" concerns that are disconnected from reality.
All ideas and all thought are individual. No group of rabbits and no bunch of humans can think or has ever had an idea.
To consciousness, reality is probabilities or confidence bands, but groups can't access these except to the degree they can be legitimately quantified.
Humans are social animals by nature and appear to be a mostly homogeneous group because of the way we think. We are adept at making taxonomies and are the first to occupy one.
> Anyway, it took many years to turn Einstein's
> hypothesis into a testable theory. Physics didn't
> shut down in the interim though. If I recall
> correctly, not all components of his Relativity
> have been verified yet. Science has its
> foundations in speculation. Some fail validation
> while others pass. Quantitative science is also
> based on being able to independently test and
> corroborate the validity of a speculation.
It's still entirely possible that another theory exists to explain the observations. There are some good hypotheses out there.
> At my orientation in grad school, my department
> chairman lined up all of us newbies and asked us
> "What does it take to become a Ph.D.?" Most
> of the students suggested the standard academic
> and research achievement stuff, and he nodded but
> then at some point interrupted and said "You
> can't be a Ph.D. until you learn how to say 'I
> don't know'. But then you damn well better know
> how to go about trying to find the answer!".
I agree with your teacher and would take it a step or two further. It's not enough merely to say "I don't know"; one must sally forth and seek out what one doesn't know. More progress results from finding anomalies than from any other form of serendipity. One should try to learn to see anomalies in preference to beliefs. This may not be for everyone, since there may be no way for most individuals both to specialize and to learn to see anomalies. I don't know if I could have done it, because I don't know if I could have specialized in any major branch of science; my math skills were never that high. I even had some trouble with calculus.
Humans (all individuals) simply think differently. The way we think is derived from a confusion of Ancient Language overlain with massive amounts of knowledge from diverse sources, made possible by language. We don't notice that we each hold different beliefs because we so often arrive at the same answer. We don't notice that each listener takes a different meaning because there is so rarely a test, and we prefer not to think about it. With our language it is hard to imagine another way to think, but it is the way all other life and ancient people thought. If I'm right, the only way to study this would be to devise experiments accordingly. Or we could try to program a computer to process information the way animals do and let it make the necessary observations or comb the relevant databases.
Theory has been stuck with Einstein since the 1920s. I don't know what the problem is.
But from my perspective it seems we might be at the limits of the tool (scientific metaphysics) that we use to understand reality. 100 years is a long time to be stuck, but Egyptologists can do it standing on their heads. ;)