    Are Quarks And Leptons Elementary Or Composite ?
    By Tommaso Dorigo | February 21st 2010 04:14 PM | 51 comments
    There are twenty-four elementary fermions in the standard model. Sure, they are arranged in a very tidy, symmetrical structure of three families of eight fermions (two leptons and six quarks), which is not too unpleasant to behold. And of course, if one is willing to forget the fact that the quantum-chromodynamical charge of quarks does make them different, then the picture is even tidier: 12 fermions, six of them quarks and six of them leptons, arranged in three families of four.

    Tidy or not, the two-digit number of matter fields in the standard model may suggest that these bodies are not, in fact, elementary. What does "elementary" mean, after all ? Well, elementary has several meanings: something elementary is something simple; something that does not have any structure, and cannot be divided into simpler parts; but also something that is the basic building block of more complex structures. Do we really need twenty-four different elementary building blocks to generate the universe ?

    This question is at the basis of searches for a substructure of quarks and leptons. Of course, to investigate the structure of things that we consider point-like, we need the highest-energy probes available. The hope is that we will experience something like what Lord Rutherford did a hundred years ago: by shooting alpha particles at a gold foil, he saw some of them deflected at very large angles. In Rutherford's own words,

    "It was quite the most incredible event that ever happened to me in my life. It was almost as incredible as if you fired a 15-inch shell at a piece of tissue paper and it came back and hit you."

    Today's analogue of the back-scattering of alpha particles, in the 2-TeV proton-antiproton collisions provided by the Tevatron collider, is an excess of events with very large energy emitted at large angles from the beams. The similarity may not appear so clear to you, but please consider: the momentum transfer from the gold atom to a back-scattered alpha particle is very large, and it indicates a strong force acting on the projectiles, produced by some hard substructure within the atom. Similarly, the large energy of the products of a hadron collision, radiated transversally from the incoming beams, may indicate that we are starting to see some substructure in the quarks.

    In 1996 CDF found in the Run I data a large excess of events with a jet of hadrons emitted with very high energy. You can see the excess in the original figure from the 1996 paper below.

    [Figure: the inclusive jet transverse-energy spectrum from the 1996 CDF Run I paper, compared to the QCD prediction.]
    The excess of high-E_T jets (which you can see in the form of an upward deviation of the black points with error bars from the line at zero) could be the result of an underestimate of the cross section for "normal" quantum-chromodynamical processes, or the first footprint of quark compositeness. The effect made headlines back then, but it eventually died away once it was discovered that by tweaking the momentum distribution of gluons in the proton the number of energetic collisions could be boosted up to match the observations.

    I can almost hear some readers screaming profanities as they leave this site in rage, after fighting with their guiltless neurons to make any sense of the above sentence. For those who are left, an explanation of what I mean above is in order.

    When you collide a proton and an antiproton, each of them possessing an energy of 900 giga-electronvolts (the beam energy in Run I), you are not supposed to see the full 1800 GeV in your detector, transported by new particles flying out in all directions. That is, you expect that most of the incoming energy will be retained by the incoming proton and antiproton, or what has become of them after the collision; these energetic remnants will escape through the same apertures that allowed the projectiles in. Only a smaller, variable amount of energy will be emitted at large angles from the beams, and will thus be detectable by your instrument.

    The reason for the above is that what actually collide are two constituents - a quark and an antiquark, a gluon and a quark, et cetera - and these carry only a small fraction of the energy of the proton which contains them. How much energy each of them is expected to carry is encoded in the so-called "parton distribution functions", or "momentum distributions".
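    (To make this concrete, here is a minimal toy simulation; the falling momentum-fraction shape below is a crude stand-in of my own choosing, not a real parton distribution function. It shows that the typical parton-parton collision at an 1800 GeV machine involves only a few tens of GeV:)

    import math, random

    def sample_x(a=1e-3):
        # Toy parton momentum fraction: density proportional to 1/x on [a, 1]
        # (a crude stand-in for a real parton distribution function)
        return a ** (1.0 - random.random())

    # Effective parton-parton collision energy: sqrt(x1 * x2) * sqrt(s)
    energies = sorted(1800.0 * math.sqrt(sample_x() * sample_x())
                      for _ in range(100000))
    print("median parton-parton energy: %.0f GeV" % energies[len(energies) // 2])
    print("fraction above 500 GeV: %.3f" % (sum(e > 500.0 for e in energies) / 1e5))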

    Now, it is not necessary to look for high-energy jets of hadrons to search for compositeness. It turns out that the modifications to the theory due to the structure of what we call "elementary" today are detectable by observing any kind of process yielding large energy emitted at large angle from the beams, because these very high-momentum-transfer collisions get their rate modified from the original standard model rate. Take the graph below as an example.

    [Figure: the dilepton invariant-mass spectrum expected by DZERO, comparing the standard model prediction (red) with composite-fermion scenarios (purple and blue).]
    In the figure, you can see that the invariant mass of pairs of leptons expected by the DZERO collaboration has a very different high-mass shape from the standard model expectation (red curve) if one assumes that fermions are composite (purple and blue lines). The modifications to the equations due to compositeness may in fact either increase or decrease the rate of high-energy processes; in either case, they remain detectable by comparing expectations with the observed spectrum.
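    (A rough numerical illustration of why the high-mass tail is where to look; the shapes and numbers below are toy choices of mine, not the real DZERO calculation. In the usual contact-interaction parametrization the rate picks up an interference term scaling like 1/Lambda^2 and a pure contact term scaling like 1/Lambda^4, so the relative correction grows with the dilepton mass:)

    def rate(m, lam, eta):
        # Toy dilepton spectrum: a steeply falling Drell-Yan-like shape times a
        # schematic contact-interaction factor; eta = +1 is constructive
        # interference, eta = -1 destructive; lam is the scale in GeV
        sm = (100.0 / m) ** 4.5  # toy shape, arbitrary units
        return sm * (1.0 + eta * (m / lam) ** 2 + (m / lam) ** 4)

    for m in (200.0, 500.0, 800.0):
        sm = (100.0 / m) ** 4.5
        up, down = rate(m, 3000.0, +1), rate(m, 3000.0, -1)
        print("m = %3.0f GeV: %+.1f%% / %+.1f%%"
              % (m, 100 * (up / sm - 1), 100 * (down / sm - 1)))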

    DZERO analyzed a very large dataset of electron-positron pairs, and recently produced a comparison of the mass spectrum with theory, extracting limits on the "scale" of compositeness - the energy above which the effect becomes important. The dielectron mass distribution extracted by DZERO is shown below.

    [Figure: the dielectron invariant-mass distribution measured by DZERO, compared to the sum of QCD and Drell-Yan backgrounds and to one compositeness scenario.]
    As you can see, the data is well represented by the sum of backgrounds from QCD and Drell-Yan processes as predicted by the standard model (the dashed blue line). There is no excess nor deficit in any region of the spectrum, and this may be used to set limits on the energy scale at which compositeness would start to make itself felt - a scale which is inversely proportional to the distance scale at which a hypothetical substructure of quarks and leptons would be present. In the figure, a particular possibility for compositeness is shown by the purple line, which is clearly inconsistent with the data.
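    (The conversion between the energy scale and the distance scale is just the usual hbar-times-c factor; a two-line sketch, with the scales picked arbitrarily:)

    HBARC = 0.1973e-15  # hbar * c in GeV * meters

    for lam in (1000.0, 3000.0, 10000.0):  # compositeness scale in GeV
        print("Lambda = %5.0f GeV  ->  probed distance ~ %.1e m" % (lam, HBARC / lam))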
    [Figure: summary of the DZERO limits on the compositeness scale for a variety of contact-interaction models.]
    A summary of the limits obtained by DZERO on a variety of different models of compositeness is shown above. The bars show the range of the compositeness scale which is excluded for constructive (right) or destructive (left) interference between the terms introduced in the interactions by the new physics process and the standard model terms. I will omit discussing the details of each of the models here, but if you are interested please have a look at the public document made available on the DZERO site.

    Will we ever discover that quarks and leptons are not elementary ? That would indeed be a revolution in physics. I may be old-fashioned, but I like the standard model too much to believe that its beauty is an artifact, an accidental arrangement. But the search for compositeness should and will continue. The Large Hadron Collider will push the investigation to distance scales an order of magnitude smaller. Now, we found a structure in the atom by using MeV-energy projectiles; and we found a structure in the proton by using GeV-energy projectiles. Maybe using TeV-energy probes we will again strike gold. Maybe.

    Comments

    I'm reminded of a statement made by Enrico Fermi, "If I knew there were going to be so many particles, I would have become a botanist instead."

    It seems from what you have written that we may very well be at that point again. I too have become rather comfortable with the standard model. But if the energy levels indicate that there are still even smaller constituents than quarks and leptons, I guess it's something we're just going to have to get used to.

    One disturbing thought though. If it turns out that quarks and leptons are composites of even smaller particles, how do we know that's the end of it? How do we know that this may not be an infinite regression? Can we ever be certain that we have finally found the truly elemental?

    Maybe Voltaire was right when he said, "God is a comedian playing to an audience too afraid to laugh."
    dorigo
    Nice quotes Eric - both of them. The question is one of the toughest to answer. Maybe we in fact will never reach the bottom. That does not mean we should not continue looking - the things we learn change our lives every once in a while.

    Cheers,
    T.
    John Starrett
    I, for one, would be perfectly happy if we discovered it's turtles all the way down. There is something elegant about the idea that the whole shebang is just pattern and not stuff.
    John Starrett
    This is a stunning question. It has been presumed for so long that we had finally reached the end of the compositional line on particles, with leptons and quarks. It's good to challenge that when there's cause. I'm amazed the key challenging data comes from 1996. Anyway, how would a "structureless" particle like a muon have anything "clock-like" within it to drive probabilistic decay? Note that nuclear decay can be expressed in terms of, e.g., the tunneling time of alpha clumps. As for the standard model, I thought it was already toast due to neutrino oscillation. But doesn't dark matter and dark energy turn everything upside down anyway?

    Don't be too quick on the muon, Neil. Don't forget that it wasn't all that long ago that we thought it didn't have any mass. And, therefore we didn't know that it could change flavors while traveling through dense objects like the Sun and Earth.

    Remember the saying, "The universe is not only stranger than you imagine, but stranger than you can possibly imagine."? ;-)
    dorigo
    Hi Neil,

    the weak decay of muons -same thing as beta decay of neutrons- is "probabilistic" as you call it, or rather, governed by an exponential law with fixed half-life, because of a quantum mechanical rule which is not too different from the quantum tunnelling of alpha particles you mention. The mechanics of weak decay is not too complicated to explain actually - I might consider posting about it one day.
    The discovery of neutrino oscillations is to some "new physics" beyond the standard model. To others, like me, they are just a rearrangement of the furniture in the living-room. As for dark things, let us leave them to cosmologists. The particle interpretation of dark matter is not proven yet.

    Cheers,
    T.
    I have never found an explanation of how a muon can decide to turn into a bunch of other fundamental particles - I know there are rules governing what it can turn into based on conserved quantities. But what actually happens at the moment the muon ceases to exist and its decay products come into being?

    dorigo
    Imagine the muon as emitting a W boson. The muon does not have enough energy to emit a real W, but it constantly emits and reabsorbs a virtual one; each time, it becomes a muon neutrino, until its "muon-ness" is reconstituted by the reabsorption. There is a finite probability that the W, in turn, splits into a pair of virtual fermions while it is "in the air", before it gets reabsorbed. Notice that the virtual W cannot borrow from the original muon more energy than half the muon mass: therefore, any fermion pair that the W splits into is virtual; only an electron-neutrino pair has less mass, and is thus allowed.

    What we can picture, therefore, is that at some point the muon emits a virtual W and the W, before rejoining with the neutrino to reconstitute the original muon, splits into a real electron-neutrino pair. At that point the muon neutrino is screwed: it does not have anything to reabsorb! The muon has decayed for good.

    A constant probability that the virtual W splits into a real electron-neutrino pair is just what it takes to explain an exponential decay rate for the muon. Mind you, this explanation is not to be taken literally; but it is a nice way to picture the process.
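    (To see that last statement at work numerically - a toy simulation of my own, assuming nothing but a fixed decay chance per small time step:)

    import math, random

    def lifetime(p):
        # count time steps until the virtual W "materializes" (probability p per step)
        t = 0
        while random.random() > p:
            t += 1
        return t

    p = 0.01
    lifetimes = [lifetime(p) for _ in range(50000)]
    for T in (50, 100, 200):
        frac = sum(t >= T for t in lifetimes) / len(lifetimes)
        print("survival at t = %3d: simulated %.3f vs exp(-p*t) = %.3f"
              % (T, frac, math.exp(-p * T)))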

    Cheers,
    T.
    lumidek
    Dear Tommaso, I am not sure you have satisfied the reader with this (very relevant and detailed) picture. I think that the reader was actually asking:
    How does Nature dare not to be deterministic? Who will tame the bitch [your term] into behaving according to some deterministic variables, and what those deterministic variables will be for a muon? ;-)
    hmm... thanks for the explanation. I think I can understand it: when the muon emits the W, the energy to do so is "stolen" from the mass of the muon along with its electric charge, and thus the remainder we are left with, after the W is emitted, is a neutrino.

    dorigo
    Yes, in a way this is so. Mind that at most the virtual W can steal half the rest mass of the muon, in the muon rest frame. This is because for the decaying muon the neutrino and the W leave in opposite directions, each with momentum equal (roughly) to half the muon mass.
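    (For the record, the two-body kinematics behind that statement - a back-of-the-envelope sketch, treating the decay heuristically as muon -> "W" + massless neutrino, since the W here is of course virtual:)

    M_MU = 0.10566  # muon mass in GeV

    # In the muon rest frame, a two-body decay into a massless particle plus a
    # recoiling system of mass m gives each a momentum p = (M^2 - m^2) / (2M),
    # which tends to M/2 as m goes to zero.
    for m in (0.0, 0.05, 0.09):
        p = (M_MU ** 2 - m ** 2) / (2.0 * M_MU)
        print("recoil mass %.2f GeV -> momentum %.4f GeV (M/2 = %.4f)"
              % (m, p, M_MU / 2.0))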

    I will have a post on these things out tomorrow.
    Cheers,
    T.
    So another way of thinking about it is that the neutrino and muon are really the same basic object? But it's a question of changing the properties of that object (add or remove electrical charge, increase or decrease the mass)?

    Mark

    dorigo
    Well, to the extent that we are talking about elementary particles, stripping one of its mass or charge is about as much as one can do to them.

    Cheers,
    T.
    Eric: Note for clarity, the muon is a lepton with significant mass (around 106 MeV, over 200 times the electron's) whereas the neutrino has at most a few MeV. There is the family of electron, muon, and tau; then the family of neutrinos associated with each of those (and their antiparticles, making for 12 total).

    Neil, please excuse my error. I was thinking about the neutrino and not the muon when I wrote what I did. Of course, you are correct. I'm just a little tired. It's the neutrino that changes flavors when traveling through dense objects. Every neutrino starts out as an electron neutrino in the core of the sun and can change to a muon neutrino or a tau neutrino, which accounts for the total number of solar neutrinos predicted. Sorry about that Neil, and thanks for catching my error. : )
    You do not need dense matter for neutrino flavor oscillation to happen.

    AFAIK anonymous is right about density - clearly the chance of interaction is less for less density, but I can't see a reason for there to be a threshold (not that Eric necessarily implied one. BTW, people make these tired mistakes about what they really know more and more often as we tend towards less and less sleep.) But is it a simple linear relation? I mean, could there be some degree of global influence there?

    BTW even though neutrinos are so weakly interacting, a supernova emits enough to kill living organisms nearby (even if they could protect themselves from everything else - so you can't hide in the center of a rocky planet. I wonder about absorption coefficient in neutronium, like of a neutron star.) Finally, think of the wonderful views if we could make neutrino telescopes - see stars hidden behind any amount of gas and dust.

    Yes, lethal amounts of neutrinos are most definitely associated with supernovae. And because they are so weakly interacting, they're a great way to be alerted to an impending supernova, since they arrive here at Earth before the photons do.

    I'm excited to read from Tommaso's comment below that they are constructing a neutrino telescope. I agree that it will be so cool once they get the imaging part of it down.
    dorigo
    Hi Neil,
    we are making neutrino telescopes in fact... But focusing them into an image is the trickiest part :)
    Cheers,
    T.
    I didn't know that, Tommaso. Now, that is really cool!
    When I say you do not need dense matter I mean you do not need any matter at all. Flavor oscillation by itself is not related to any kind of interaction with matter. Of course the electron neutrino eigenmass spectrum is modified by neutral current interactions with the electron gas in the sun (MSW effect) and this modifies the vacuum oscillation probability. But for instance, the first evidence of flavor oscillation was provided by atmospheric muon neutrinos detected in Super-Kamiokande.
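    (As a concrete illustration that vacuum oscillation needs no matter at all, here is the standard two-flavor formula in a small script; the mixing parameters below are roughly the atmospheric ones, inserted only for illustration:)

    import math

    def p_oscillation(L_km, E_GeV, sin2_2theta=1.0, dm2_eV2=2.4e-3):
        # Two-flavor vacuum oscillation probability:
        # P = sin^2(2 theta) * sin^2(1.267 * dm^2[eV^2] * L[km] / E[GeV])
        return sin2_2theta * math.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

    # Atmospheric muon neutrinos at E = 1 GeV: downward-going (L ~ 15 km)
    # vs upward-going through the Earth (L ~ 13000 km)
    print(p_oscillation(15.0, 1.0))     # ~0.002: essentially unoscillated
    print(p_oscillation(13000.0, 1.0))  # large, rapidly oscillating probability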

    Yes, anon (you aren't really a troll, good comments) I see articles about vacuum oscillation too.
    For example: http://supernova.lbl.gov/~evlinder/umass/neu.html
    Matter can also instigate it. I had been thinking of the old days of massless neutrinos. In that case, they presumably couldn't change due to not having a distinct proper time (since being luxons) to embody the decay probabilities. Another interesting angle is: if you could go fast enough past their direction of travel, it would reverse relative chirality and make "in effect" a new type of neutrino (from the handedness built into their *creation* being reversed.)

    Muons - yes, the QM "rule" is similar, I mean as per "structure" and components. Yes, a post would be useful, tx. As for neutrino telescopes, yes, we are detecting them and sorting by energy etc. (i.e., spectroscopy.) It would be great to have imagery someday, though that is one tall order - isn't it?

    Young man, if I could remember the names of these particles, I would have been a botanist. — Enrico Fermi. Quoted in Helge Kragh, Quantum Generations (1999), according to the Dictionary of Science Quotations.

    As for "How do we know that this may not be an infinite regression?", there was the program of nuclear democracy carried in the early sixties, and that eventually transmuted into String Theory. My own version of it, where quarks and leptons are still allowed to be elementary, appears as a invited post in the older blog of Tommaso (discussion is still welcome there)

    rholley
    As part of a comment on the LHC, I wrote this on Scientific Blogging just over a year ago, but I think it is even more relevant here:

    «Many years ago, I attended a talk by D. T. Lewis of the Laboratory of the Government Chemist, in which he aimed to simplify the jungle of subatomic particles on the basis of two proposed entities called the tamaid and the bach (Welsh for "bit" and "little"), as published in Nature.  With the Standard Model simplification seemed to have been achieved with the quarks, but then along came charm, colour, flavour, Old Uncle Tom Cobley and all.  I wonder, will the LHC turn up something new which will simplify that lot down again, maybe to tamaids and bachs?»
    Robert H. Olley / Quondam Physics Department / University of Reading / England
    lumidek
    Well, a particular evolution - such as substructure involving more point-like particles inside - may repeat several times, but the "revolutionary charge" of it decreases. In the particular situation, the evidence and even the motivation - both theoretical and experimental - of such a substructure is largely non-existent.
    The leptons and quarks are still "made out of" something, like "string bits", if perturbative string theory is a good approximation, but the idea that it must be always the same thing, another layer of point-like substructure, is a naive preconception. To say the least, it surely breaks at the Planck scale where no "finer" point-like subcomponents can exist.
    Perhaps someone will resurrect the "rishon" model (Harari, Shupe, Seiberg.) It might turn out that we cannot even specify a definitive breakdown, and that the presumed "composition" of these particles is ultimately tied to relative context and interaction energies. Maybe that will fit in with string theory, but I presume each "string" corresponds to some ultimately fundamental particle like a lepton or quark. So how many strings in each particle is how many "constituents"? LuMo probably has an opinion.

    http://en.wikipedia.org/wiki/Rishon_Model

    lumidek
    Dear Neil, it's a fun model but I don't think that anyone believes it seriously. Unlike the ordinary quarks, these rishons don't simplify the diversity of the known particles. Recall how many hadrons were explained by a few quarks. For rishons, you have to add hypercolor to color, and nothing really simplifies.
    There are no known incorporations of rishons in string theory. It may be because no one has tried but I actually doubt you would succeed.

    Now, the compositeness is surely a vague question and depends on the context, but only if the coupling is strong. For example, a magnetic monopole is a heavy composite - it comes from a classical, topologically nontrivial solution for the fields creating the electrically charged particles. It's as composite as you can get - infinitely many photons and additional fields that hold the monopole together have to conspire to create the "vortex" in the field.

    And as other - more important - papers by Seiberg and others showed, these monopoles may interact and behave in an indistinguishable way when the coupling between the electrically charged particles gets strong. S-duality means that magnetic monopoles at coupling 1/g behave in the same way as electric monopoles at coupling g. This is the simplest example, but Seiberg and Witten, and Seiberg himself, found more complex manifestations of the fact that there's no "objective" way to distinguish electrically charged "elementary" particles and the magnetic monopoles that seem "composite".

    However, when you keep the coupling weak, "g" much smaller than one, then the compositeness becomes absolute: it is not a matter of convention. In particular, if you take the string coupling "g" to be much smaller than one, then strings are the lightest objects, and they're also the objects that everything else is made out of. So they're elementary building blocks and everything else is composite (and usually heavier).

    In that limit, the first one that was understood in string theory, you can say damn sharply how many strings there are in well-known particles, and the answer is always one. Of course, this number or statement gets "renormalized" - much like the statement that there are "three quarks" within a proton. But in the same sense as protons contain three quarks, quarks and leptons contain one string each - in all known classes of realistic vacua of string theory.

    It makes no sense to ask how many "constituents" a single string has. In the perturbative expansion, the string *is* the most fundamental constituent you can have. There's one of them. I just wanted to emphasize that a string is not a point-like particle. You may interpret a closed string as a loop of pearls - string bits - that hold together because of some forces between the bits. This loop of pearls can be quantized and the spectrum gives you what string theory usually teaches you. This is a heuristic picture but in some sense, a string is equivalent to infinitely many bound point-like particles. Perturbatively, it has infinitely many degrees of freedom. But it is "one" infinity, not two or three.

    Non-perturbatively, strings become strongly coupled and we get the full theory of quantum gravity. Quantum gravity actually seems to be holographic - it has many fewer degrees of freedom than a local quantum field theory based on point-like particles. The inequality goes in the opposite way here.

    One must carefully formulate the questions. The question "how many particles are there" usually fails to be strictly well-defined in theories as sophisticated as quantum field theory or more. The number is usually just a matter of a description and an approximation: it can't be given an operational meaning, just like the question of how many angels fit on the tip of a needle.
    Magnetic monopoles, S-duality, strings, quantum gravity... nothing but speculations based on speculations, you are living in a fantasy world Lubos.

    Yes, Lubos. Can you tell me what is the smallest bit of information in a particle? If there are infinite degrees of freedom maybe the quantum computers will never be built. In fact Anton Zeilinger and co. get the Wolf Prize this year for their work on this. Also illuminated matter is entangled, as you claimed it would not be. Entanglement is not only quantal. And the periodic table is hierarchical, why not the elementary particles? Some say the Higgs particles are as many as five. Two intervals are already pointed at. One light and one very heavy. How is the hierarchy in the heavy Higgs?

    lumidek
    Hi Anonymous!
    All realistic physical systems have an infinite-dimensional Hilbert space. For example, a simple Hydrogen atom can be found in 1s, 2s, 2p, 3s, 3p... and so on, and there are additional labels that I could have omitted. To pick one of the states - e.g. the electron in 3s - you would need an infinite number of (quantum) bits if all the states of the Hydrogen atom were equally likely.

    However, they're not equally likely. In the region of a large "n", the Hydrogen atom is approaching the continuum (ionization). It is becoming large. And it's unlikely for the electron to be exactly in the 2010th excited state, for example. If you calculate the information carried by the data about the state of the electron but if you take into account the fact that some states (the less excited) are more likely than others, the total information becomes finite.
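    (A toy numerical version of this point, with a geometric distribution standing in for the falling occupation probabilities - my own illustration, not part of the argument above:)

    import math

    # Infinitely many states n = 0, 1, 2, ..., occupied with geometrically
    # falling probability p_n = (1 - r) * r^n. The Shannon entropy - the
    # information needed to specify the state - converges despite the
    # infinite number of states.
    r = 0.5
    entropy_bits = sum(-(1 - r) * r ** n * math.log2((1 - r) * r ** n)
                       for n in range(200))  # tail beyond n = 200 is negligible
    print(entropy_bits)  # -> 2.0 bits for r = 1/2: finite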

    It doesn't really matter whether you consider a single Hydrogen atom or quantum field theory or quantum physics of one string or quantum physics of the whole string theory which allows any number of strings. All these physical situations have an infinite-dimensional Hilbert space that is completely isomorphic to the Hilbert space of the Hydrogen atom. In all cases, you may choose a "discrete", countable basis - so that the basis vectors may be ordered and numbered by integers - like in the case of the Hydrogen atom. (Even continuous degrees of freedom may be rewritten in a discrete basis - recall e.g. the harmonic oscillator that has nice discrete energy eigenstates that are enough to describe any continuous position of the electron, too).

    So if quantum mechanics is valid, and everything indicates that it is, quantum computers may surely be built in principle. How good ones will be built in practice is an open question. But people have actually already used quantum computers in a helpful way - in some simulations of chemistry that would be very hard for classical computers.  It was on my blog a few months ago.

    The smallest bit of information is arbitrarily low, arbitrarily close to zero. If you have e.g. a two-state system that has the probability "p" to be "0" and "1-p" to be "1", it carries the entropy -p.ln(p) - (1-p).ln(1-p). Divide it by ln(2) to get the result in ordinary bits. If either p or 1-p is close to zero, the information carried by such a single unit of information will go to zero, too. Of course, how much information you actually get in a single case will depend on whether you will hear 0 or 1. The less likely one gives you more information. Every time something unusual happens, you learn more than if the expected stuff takes place.
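    (The same formula in a few lines of code, for the record:)

    import math

    def entropy_bits(p):
        # binary entropy: [-p * ln(p) - (1-p) * ln(1-p)] / ln(2)
        if p in (0.0, 1.0):
            return 0.0
        return (-p * math.log(p) - (1 - p) * math.log(1 - p)) / math.log(2)

    print(entropy_bits(0.5))   # 1.0 bit: maximally uncertain
    print(entropy_bits(0.01))  # ~0.081 bits: a nearly certain outcome carries little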

    Anton Zeilinger surely does fun work and understands quantum mechanics well, and deserves some prizes - although the real essence of what he's doing has been known for 30-85 years.

    I didn't understand your comments about entanglement. Entanglement is usually possible in quantum mechanics only - at least, its classical counterparts are so simple that they don't deserve this complicated name. The periodic table is hierarchical because the periodicity increases. It increases because the new shells one has to fill are growing bigger - they have more boxes in them. Every time you fill a whole shell, you get an inert gas. You add one electron above the shell, and you get a cousin of hydrogen or lithium or Na or K ...

    There may be several Higgses, indeed. Supersymmetric standard models require at least two neutral ones, and then charged ones. Instead of 4 real degrees of freedom, minus 3 removed from broken symmetries, you have 8 real degrees of freedom, minus 3 removed by fixing the symmetries, so there are indeed 5 of them left. They must all be pretty light for the physics to work. The hierarchy problem could still be there, except that supersymmetry actually explains why the Higgses must - or at least may - be kept naturally light.

    Cheers
    LM


    Zeilinger writes: The simplest proposition is an answer to a single yes-or-no question. It is impossible to imagine a more elementary proposition.
    We suggested that a natural understanding of some essential features of quantum physics, such as the irreducible randomness of individual events, quantum complementarity and entanglement, results when one accepts that the most elementary physical system represents the truth value of one single elementary proposition. Alternatively, one can say that the most elementary system carries one bit of information. http://www.quantum.at/research/quantum-information-theory/information-th...

    How can this ever be possible if what you say is right?

    Zeilinger searches for the essence of quantum mechanics--the irreducible kernel from which everything else flows. He believes that he has found it. If he is right, all the mysteries of the quantum world will turn out to be inescapable consequences of a single, simple idea.
    Quantum theory describes the world with astonishing precision, whether applied to elementary particles a hundred thousand times smaller than atoms or to currents in superconducting rings a billion times larger. And yet it seems to present a catalogue of intertwined conundrums. The most fundamental is quantisation, the notion that energy, spin and other quantities only come in discrete steps.
    Zeilinger thinks that before we can truly understand quantum theory, it must be connected in some way to what we know and feel. The problem, he says, is the lack of a simple underlying principle, an Urprinzip.
    http://homepage.univie.ac.at/Caslav.Brukner/media/bit.html

    So, discrete steps, isn't this hierarchy? And also consciousness (observer) and emotions must be part of it.

    What if it would be so, that it is our intentions of how many Higgs we will find, and what mass they will have, that steer the LHC? It is not at all impossible.

    lumidek
    Dear Anonymous,
    one bit or qubit of information is the minimal nonzero information that can be carried by a system with a finite number of equally likely choices. That's because 2 is the smallest integer with a positive logarithm.


    But the information or entropy is not a multiple of one bit in general. It is given by continuous formulae, and in almost all situations, it is a fractional multiple of one bit. Even a simple thing such as a system with 3 equally likely states carries ln(3)/ln(2) = 1.58 bits  or so. But one can get much more complicated and general values, too.

    I don't understand what you think that Zeilinger is looking for - it seems to make no sense.

    Best
    LM
    To Lubos, not at all dear
    "When physicists try to calculate the properties of a quantum theory of gravity, they find quantities that become infinite -- infinities that are so bad they can't be removed by mathematical gambits that work in other areas of physics.
    Previous attempts at removing the fatal infinities in quantum gravity calculations collapsed when researchers discovered that you would need an infinite number of parameters. The problem stems from the point-like and thus infinitesimally small fundamental particles in the theories, so some physicists have developed string theory as a possible approach: instead of point particles, the fundamental entities are vibrating loops of string. But string theory is beset with its own difficulties, as it lays out a "landscape" of possibilities with an astronomical number of scenarios." http://www.sciencedaily.com/releases/2009/08/090817143556.htm

    This is exactly the problem. Infiniteness must mean nonlocal particles. If this infinity problem is escaped there must be (introduced) some 'collapsing' functions in string theory, giving finiteness and pointlike particles. Otherwise there is a clear contradiction in what you say.

    Is there any 'bottom' in the Universe? Or is the 'bottom' in oscillations of non-local fields, exactly as in the Platonic solids and the aether concept? Aether was the fifth element, and composite, but Einstein showed only that it was not material. He did not show it didn't exist. What if there is no bottom at all, only entropy hierarchies like the Einsteinian gravity?

    This Hilbert space, is it only waves or strings? How would a non-local string be? What would be its cosmological and Planck constant or entropy? Aren't those waves or strings massless?

    Not every number is 'nice'. There are infinities too, like the primes, that are very important. Also this points to a 'bottomless' universe with hierarchy. Particles come from decays of hadrons, quarks, muons etc. p-adic scaled?

    You did not want to discuss that heavy Z-boson, 500-800 GeV. It would perhaps be a composite, if it turns out it is a valid result. Why so little research in that interval?

    You have surely guessed who I am. That obnoxious crackpot, as you said. You preferred to discuss by deleting my answers, but here you can't do that. Sorry Tommaso for this (hope you will not delete it :)).
    Ulla.

    lumidek
    Sorry, I have no clue what you're talking about.
    A particular number of solutions to particular equations is a mathematical fact, or a prediction, if you wish to use physics terminology, not a "problem". You may get emotional about mathematical facts, but that's just your psychological problem, not a problem of a theory.

    At any rate, this issue has nothing whatsoever to do with nonlocality or Z-bosons or other things that you have mixed into your comment. You should avoid drugs.
    That is your opinion, not mine.

    You have a very bad habit of becoming very ignorant when it suits you. I know the behaviour, since my brother is exactly like that :)

    Thanks Tommaso.

    (As we already see intimated, hence some further developed notion.)

    Semantically, the conceptual difference between composite and elementary particles is subtle, if not infinitesimal...

    http://tinyurl.com/ybxo5ko

    The particle on the right is supposedly composite, but it wouldn't differ in its properties very much from the elementary one on the left.

    In AWT, observable reality appears like unparticle stuff similar to Perlin noise from a sufficiently global perspective. From a more distant perspective such noise appears like a nested foam of stringy stuff.

    http://tinyurl.com/yfafclb

    Are these density fluctuations composite or elementary? Can some object remain composite if it's not composed of unique objects of the same class?

    Here is a different view on particles as waves.
    Space exists and everywhere in space there are waves (Zero Point Energy, Quantum Vacuum Fluctuations).
    Matter 'particles' are just regions in space where the waves are coherent.
    The two fundamental 'particles', the electron and positron are just opposite phase spherical standing waves. When we collide high energy positrons and electrons we get the charm quark / gluon - but it is just another more complex wave pattern.
    All matter is just coherent patterns of waves in space (full of incoherent waves).
    See;
    http://www.spaceandmotion.com/truth-statements-physical-reality.htm
    http://www.spaceandmotion.com/wave-equations-elastic-medium-space.htm

    Thus our metaphysics changes from the motion of matter particles in space and time to the wave motion of space that causes matter and time.
    This is the most simple foundation for describing reality - and really the most obvious once considered.
    Geoff Haselhurst

    This hierarchical question is not new. Remember Holger Nielsen, the guy who said God would prevent the LHC from producing the Higgs?

    "an exceptionally strongly bound highly exotic meson of 6 top quarks and 6 antitop quarks bound together by Higgs exchange just in such a way as to produce a degenerate vacuum with this type of exotic meson forming a Bose-condensate."
    " if for some reason the second minimum in the Weinberg-Salam Higgs effective potential were of the order of some grand unifying scale or the Planck scale or a fundamental scale, then the ratio of this scale to the weak scale would be explained to have to be an exponentially big ratio from the derived multiple point principle in the present article. In this sense we can claim that the multiple point principle solved the question as to why so big a scale ratio problem, a problem which is really behind the more technical hierarchy problem."
    "But even with only a mild assumption that the order of magnitude of the Higgs field in the high Higgs VEV alternative vacuum we got a very good value for the top quark mass 173GeV +- 6GeV . Taking our previous Multiple Point fitting most seriously with three degenerate vacua in the Standard Model alone we actually could claim that Higgs-mass of 115GeV/c2 seemingly found at L.E.P. is quite well matching as being our prediction. "
    http://arxiv.org/PS_cache/hep-th/pdf/0701/0701018v1.pdf

    Also in TGD there is a beautiful hierarchy, and he claims too that the hierarchy solves the prediction of the Higgs particle mass.

    "...elementary has several meanings: something elementary is something simple; something that does not have any structure, and cannot be divided into simpler parts; but also something that is the basic building block of more complex structures."

    I think an electron is not elementary. When you hit an elementary object with another elementary one, only elastic processes are possible. In the case of electrons the collisions are always inelastic (photon emission), so
    1) the electron cannot be treated separately from the EMF (they are permanently coupled);
    2) forming such a compound system, the electron is not elementary. We may consider radiation as an electron feature (inelastic channel).

    Vladimir.

    A Coulomb potential can "mediate" the charge interaction, for example. I explain it in my articles, by the way. See "Atom as a 'Dressed' Nucleus", for example.

    Regards,

    Vladimir.

    dorigo
    If that is true, what mediates the elastic scattering between pointlike objects, Vladimir? Your definition of "elementary" implies the existence of such processes, but you do not explain how they take place.

    Cheers,
    T.
    rholley
    A point which would have made complete sense to Ruđer Josip Bošković (Ruggiero Giuseppe Boscovich, 1711 – 1787).

    He is, to my mind, one of the few post-Classical philosophers who have proved really useful to science.  Borrowing words from English Wikipedia, “his atomic theory, given as a clear, precisely-formulated system utilizing principles of Newtonian mechanics, inspired Michael Faraday to develop field theory for electromagnetic interaction.”

    Or in addition, Google translated from Hrvatska Wikipedija:
    The atom is reduced to a central point around which spread clouds of attractive-repulsive forces (chemistry, fields).
    Robert H. Olley / Quondam Physics Department / University of Reading / England
    John Starrett
    This gossip from a friend: "...during the CERN Video Press Conference last week, Dr. Pushpa Bhat let slip that the CMS experiment had produced evidence that leptons are composite not elemental."

    You didn't hear it from me.
    John Starrett
    Whatever happened to the term "strations", which I heard decades ago? This avoids the need to "decide" that any "fundamental" particle is fundamental until we can reach higher energies. We can just say the quark, electron or whatever is a "Level X, Y or Z Stration".

    Vincent Abbott.

    dorigo
    No comment... If these are roses, they will blossom!
    Cheers,
    T.
    If electrons, positrons and photons/gluons are the real elementary building blocks of the material world, then all other leptons and quarks should be composites of one or more electrons/positrons in combination with one or more gluons/photons. See: 3-Dimensional String Based Alternative Particle Model. http://vixra.org/abs/1103.0002 In figure 6, you will find the leptons (anti-)muon and tau to be the same as the (anti-)green-down and red-charm quark respectively, so fully deterministic. http://bigbang-entanglement.blogspot.com/
    Please read, at www.jjanson2004.republika.pl, the work "Multi-quark structures inside the leptons".

    Hi Janson, and others. You should consider the option of creating a single pdf document from your notes and uploading it to the particle section of vixra.org. This has some advantages: it provides an independent date for your work, and it allows for different versions (v1, v2, v3, etc.) of the same article, letting one see the evolution of the work if you incorporate new bibliography, for instance. And, very important, it provides visibility: if someone is curious about amateur approaches to preons, he is surely going to check vixra, but it is only random luck if he finds some particular website with a few Google searches. Be sure you write a good abstract with the right keywords.

    I am not sure if English is compulsory. If your work is in a different tongue, it could be good to include the original in the same pdf, or to use the version control for this, setting v1 and odd versions to the original language and v2 to English. Of course you should check with the vixra administrators for better options.

    Of course, quarks are composite!!!! Somebody may say now 'come on, we haven't seen it', and the truth is that we have seen several indications of it. The first one was found in 1956 by Hofstadter when he determined the charge distributions of both nucleons (one can see them around p. 450 (depending on the edition) of the Berkeley Physics Course, vol. 1 (Mechanics)). We clearly see that both nucleons have two layers (shells) of internal constituents. Unfortunately these results were put aside from 1964 on due to the success of the quark model and of QCD later on. From 1985 on we began to see more indications of compositeness, but we were so enthusiastic with the Standard Model that we didn't pay much attention to them. A partial list of them:
    1) in 1983 the European Muon Collaboration (EMC) at CERN found that the quarks of nucleons are slower when the nucleons are inside nuclei;
    2) in 1988 the SLAC E143 Collaboration and the Spin Muon Collaboration found that the three quarks of the proton account for only half of its total spin (other subsequent collaborations (EMC in 1989 and Hermes in 2007) have confirmed this result, which is called THE PROTON SPIN PUZZLE);
    3) in 1995 CDF at Fermilab found hard collisions among quarks indicating that they have constituents (this was not published because CDF didn't reach a final consensus);
    4) Prof. Gerald Miller at Argonne (Phys. Rev. Lett. 99, 112001 (2007)) found that close to its center the neutron has a negative charge equal to -1/3e (inside the positive region with +1/2e);
    5) new measurements of the EMC effect have been carried out by J. Arrington et al. at Jefferson Lab and they have shown that the effect is much stronger than was previously observed;
    6) the ad hoc Kobayashi-Maskawa matrix elements;
    7) the null charge dipole moment of the deuteron and its non-null charge quadrupole moment, etc.
    Gerald Miller wrongly attributed to d quarks the -1/3e charge at the neutron center, but as the neutron is a udd system we know (from QCD) that none of the 3 quarks spends much time at the center.
    The relevant paper on this subject is "Weak decays of hadrons reveal compositeness of quarks", which can be accessed from Google (it is at the top of the lists on the subjects Weak decays of hadrons, Decays of Hadrons and Weak decays).

    Therefore, we should go back and probe the nucleons further at the low energy scale, and carry out Miller's experiment with the proton.