Discussion of everything related to the Theory of Evolution.
what the hell is current Darwinian theory???
Anyway, the 300k random mutations are not those SINEs and LINEs, but the ones making the genomes different, as you might notice if you wanted to. I think that the retrotransposons use some kind of homology-based insertion, so it's not so surprising. Moreover, it's quite understandable that these elements won't be found inserted into protein-coding regions, because that would be lethal in many cases, and such mutations are therefore eliminated by natural selection. Pretty nice, isn't it?
Cis or trans? That's what matters.
Hmm... I thought I had made this clear in my précis in a previous post on this thread; however, for the sake of clarity, let me repeat.
“The fundamental tenets of the modern standard theory are random mutations of genetic material followed by natural selection operating on populations; the process is gradual and drives towards speciation (new species).”
A central tenet, in the words of Simon Conway Morris, is “that evolution is for all intents and purposes open-ended and indeterminate in terms of predictable outcomes.”
If you don’t find either my précis or Simon Conway Morris’s sufficient, how about the description of the current ideas on evolution, referred to as the Modern Synthesis, presented by Futuyma:
"The major tenets of the evolutionary synthesis, then, were that populations contain genetic variation that arises by random (ie. not adaptively directed) mutation and recombination; that populations evolve by changes in gene frequency brought about by random genetic drift, gene flow, and especially natural selection; that most adaptive genetic variants have individually slight phenotypic effects so that phenotypic changes are gradual (although some alleles with discrete effects may be advantageous, as in certain color polymorphisms); that diversification comes about by speciation, which normally entails the gradual evolution of reproductive isolation among populations; and that these processes, continued for sufficiently long, give rise to changes of such great magnitude as to warrant the designation of higher taxonomic levels (genera, families, and so forth)."
- Futuyma, D.J. in Evolutionary Biology, Sinauer Associates, 1986; p.12
So, as far as description goes, can we agree to be on the same page?
I can then continue.
So are we on the same page as to what theory is being discussed?
The current Darwinian theory that I am referring to is the one I qualified previously; for the sake of removing any misunderstanding, I deferred to the Futuyma description.
Is it in order for me to respond to your comments now?
You're right: there is no theory named “Current Darwinian theory”, any more than there is an “evolutionary theory” as such.
There are different evolutionary theories around.
The standard theory, known as the Modern Synthesis or Neo-Darwinian Synthesis, is the one I have referred to as the current Darwinian theory, as described by, among others, Futuyma. I had thought I made that clear, but obviously not.
Punctuated Equilibrium – not based on the gradualism of the Darwinian explanation
http://www.pbs.org/wgbh/evolution/libra ... 35_01.html
Evo Devo – again not based on the gradualism of Darwinian explanation
And now the Post Modern Synthesis – You will find it here
The Modern/Neo-Darwinian Synthesis is the current standard theory and the one I assume that you, Jack, and others refer to as evolutionary theory.
I hope this clears matters up.
Interesting though your private hypothesis is, you are factually wrong when you say that the “300k random mutations are not those SINEs and LINEs”
Please read the paper carefully. In particular, may I refer you to the section
Co-Localization of SINEs in Rat and Mouse
The very first sentence of that section reads:
“Despite the different fates of SINE families, the number of SINEs inserted after speciation in each lineage is remarkably similar: 300,000 copies.”
The section "Gains and losses of DNA" reads:
“In addition to large rearrangements and segmental duplications, genome architecture is strongly influenced by insertion and deletion events that add and remove DNA over evolutionary time.”
The fact is that SINEs, being retrotransposons, are mutagens; their insertions/deletions are mutational events and, by their very nature, are therefore random events.
Now please try to understand the section
Co-Localization of SINEs in Rat and Mouse
Francis Collins calculated there were some 300,000 mutational insertion/deletions in both the Rat and Mouse lineages.
What he found remarkable was that the resulting density patterns of the two separate lineages, after all their individual histories, were almost carbon copies of each other.
Now why, did he find the density patterns remarkable?
Because he looks at the data through the prism of the Modern/Neo-Darwinian Synthesis.
According to that theory, the rat and mouse had a common ancestor, and they diverged from that ancestor some 12-22 million years ago, forming the two separate species.
Since there were now two separate lineages, each with its own history of mutational impacts on its genome over millions of years, and since, again according to the theory, these mutational impacts were random and undirected, how is it that the data show not only that the density patterns are highly non-random, but also that the individual patterns are almost exact copies of each other?
The other observation he noted was that the same patterns exist in the equivalent Alu (which are SINES) densities of the human genome. This pattern also exists in other mammalian genomes.
That is why he refers to the patterns as being remarkable.
Now it only requires a modicum of schoolboy logic to conclude that either the data are wrong or the prism through which the data are being interpreted is faulty.
Since some three hundred scientists have been involved in the project and they are satisfied with the veracity of the data, my conclusion is that the theory is wrong.
It is not wrong just at the peripheral edges, but right at its very heart.
Remember: according to the theory, random, undirected mutations in the genomes are what natural selection filters.
Now there is more that this study reveals but I think that is enough to be getting on with.
May I return to the Lenski experiment, just to nail down these data on non-randomness.
http://myxo.css.msu.edu/lenski/pdf/2008 ... t%20al.pdf
Please refer to the "Discussion and Future Directions" section of this paper (page 6) and note the replays of the experiment from archived generations that are referred to there.
Adaptation by mutation under stress is a well-known phenomenon. If such adaptations are the result of totally random mutations, as the theory teaches, then each resultant population must be different. However, repeating the final stages of this experiment on samples preserved from earlier generations of the original population, before it developed the citrate-feeding capability, resulted in identical citrate-feeding populations, all of which emerged after the same total number of generations.
How could the random mutations of each replay produce the same result? The answer is obvious: those mutations were not random but highly non-random.
This fact is also borne out by Raymond Huey et al. at the University of Washington in Seattle. They discovered that populations of fruit flies on three separate continents have independently evolved identical gene changes within just two decades (that's 20 years, not 20 million years), apparently to cope with global warming.
http://www.newscientist.com/article/dn9 ... rming.html
The fact that identical genetic changes, as in Darwin's finches, were duplicated within a short time; the fact that the genetic changes in the fruit flies on three separate continents were identical; and the fact that many samples of early generations of E. coli produced the same adaptations and identical populations, prove that:
These genetic changes are not the result of random mutations. (If they were random, the results would not have been identical.) Therefore the ability to adapt must be pre-programmed into the genetic code.
The mutations can still be random (I expect a tendency / predilection toward a particular TYPE of mutation - the inversions mentioned - might be involved, though), but selection can stabilize the same useful mutation when it appears in different populations dealing with the same environmental shift.
I find it difficult to follow your line of reasoning.
If something is random, it is by nature unpredictable. There cannot be a "tendency / predilection" toward anything if it is random. If there were such a tendency, then it would not be random.
What is the difference between the expression you use and, say, "random pattern", "random order", or "random logic"?
I appreciate we all use oxymorons in everyday language. (One of my favourites in this economic climate is the term “rising deficits”). However to use an oxymoron to define data is not scientific.
Non-randomness explains the data quite easily does it not?
Why therefore is this not acceptable?
True randomness as a mathematical abstraction and life do not completely intersect. Mutations can be generated randomly, but several things may directly affect the visible outcome:
- The mechanisms generating the mutations might act randomly but be unable to generate every type of mutation. For example, DNA repair mechanisms can introduce mutations, but they will be constrained by the repair they make (e.g. replacing a G oxidized to 8-oxoG with a T (IIRC); the location of the oxidation will still be random, but the outcome will not be).
- Some mutations will be immediately lethal, so you will never see them in the experiments; and because Lenski's experiment deals with millions of bacteria, the death toll will not be visible in the growth rate.
- Some mutations will provide an immediate and appreciable advantage to the mutant, will have a greater chance of being fixed quickly, and so will appear to be more frequent than other mutations.
- Some physical characteristics of the DNA (frequency of transcription and of replication, interaction with other proteins, physical location in the cell and accessibility to external mutagens...) might make some parts of the genome more or less likely to see a mutation appear, so the chances of a mutation appearing in a given section of the genome are not equally distributed.
- And some paths through the evolutionary landscape might be more likely, and so appear favoured (while not necessarily being the most efficient).
So my point is that the appearance of non-randomness can be created by the accumulation of random events, and that this appearance cannot be taken as proof of design. The flow of water out of a tap is created by the mostly random movements of billions of individual water (and dissolved salt) molecules. The path of any individual molecule cannot be predicted with certainty, and yet the overall flow can be accurately predicted and modelled.
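That last point can be put in concrete terms with a toy simulation (a sketch of my own, with made-up parameters such as `beneficial_site` and the 10% advantage `s`; it is not data from any of the papers discussed): if mutation is random but only one variant is favoured, independent populations that never interact will still converge on the same outcome, purely through selection.

```python
import random

def evolve(seed, genome_len=100, beneficial_site=42,
           pop_size=500, generations=200, s=0.1, mutations_per_gen=20):
    """Toy model: each generation, mutations land on random sites, but only
    the variant at `beneficial_site` is favoured; selection then sweeps it
    towards fixation via a discrete logistic update."""
    rng = random.Random(seed)
    p = 0.0  # frequency of the beneficial variant in the population
    for _ in range(generations):
        # random, undirected mutation: a handful of random sites get hit
        hits = [rng.randrange(genome_len) for _ in range(mutations_per_gen)]
        if beneficial_site in hits and p == 0.0:
            p = 1.0 / pop_size  # the favoured variant appears once
        # selection with advantage s: the variant's odds grow by (1 + s)
        if p > 0.0:
            p = (1 + s) * p / (1 + s * p)
    return p

# five independent "replays" with different random mutational histories
outcomes = [evolve(seed) for seed in range(5)]
```

Every replicate ends with the beneficial variant at or very near fixation, even though the mutational hits in each run were random and different; convergent outcomes alone therefore cannot distinguish "non-random mutation" from "random mutation plus strong selection".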
Science has proof without any certainty. Creationists have certainty without any proof. (Ashley Montagu)
May I correct you: randomness is not a mathematical abstraction.
If a sequence is unpredictable, we call it "random". Stated another way, outcomes which lack discernible patterns are said to be "random".
We don't define randomness by what it is, but by what it isn't. Of course an apparently "random" sequence may have an underlying order that we just haven’t yet understood.
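A toy example of that last sentence (my own illustration; the constants are the classic glibc-style LCG parameters): a linear congruential generator is a completely deterministic rule, yet its output looks patternless until you know the rule, at which point every "random" value becomes predictable.

```python
def lcg(seed, n, a=1103515245, c=12345, m=2**31):
    """Linear congruential generator: x -> (a*x + c) mod m.
    Fully deterministic, but the output stream shows no obvious pattern."""
    values, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        values.append(x % 100)  # reduce to 0..99 for readability
    return values

seq = lcg(seed=2009, n=10)
# Anyone who knows (a, c, m) and the seed reproduces the sequence exactly:
assert seq == lcg(seed=2009, n=10)
```

So "looks random" and "is random" are different claims: the sequence has an underlying order that is simply invisible unless you know the generating rule.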
The problem is, we are not finding apparently random sequences; the data are revealing, as you acknowledge, apparently non-random patterns.
If they are random, as you seem to wish to assert, then the onus must be on you to show how that is so.
With respect, all you have done is provide another hypothesis to support the first one of randomness. This is not evidence.
Occam's razor is a logical principle: it states that one should not make more assumptions than the minimum needed. It underpins all scientific modelling and theory. It admonishes us to choose, from a set of otherwise equivalent models of a given phenomenon, the simplest one.
In any given model, Occam's razor helps us to "shave off" those concepts, variables or constructs that are not really needed to explain the phenomenon. By doing that, developing the model will become much easier, and there is less chance of introducing inconsistencies, ambiguities and redundancies.
I am not providing proof of design.
I have been providing data that show the non-random sequences that falsify the Darwinian concept of randomness. An essential requisite of any scientific theory is that it be falsifiable.
So may I raise the question again
Non-randomness does explain the data quite easily, does it not?
Why therefore is this not acceptable?