There's still a few red-hot chillies in there to chew through yet!...
Stepping back to the slits: the single-photon self-interference example suggests that the requirement is not for coherent light but merely for monochromatic light. Is that the case?
At what point does the interference fail as the two slits are placed further apart?
Quote:
Take neutrons from a given source, say a reactor. Various energies/momenta/wavelengths. Have them head towards a lattice of some ( non-absorbing ) material that has a path/channel through it with a suitably different lattice spacing to the surrounding volume. If you make that pathway long enough then short-wavelength ( as per de Broglie's formula ) neutrons will not emerge from the other end of that path. The long wavelength ones will...
I would expect a similar effect with the Young's slits as you make the material for the slits thicker (greater depth for the slits).
An interesting question is:
Do you get a constant for "separation + depth" for the frequency cut off?...
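To put rough numbers on the quoted neutron picture, here's a back-of-envelope sketch using de Broglie's formula, lambda = h / p. The energies chosen are just illustrative, not tied to any particular reactor:

```python
import math

# CODATA constants, SI units
h = 6.62607015e-34       # Planck constant, J*s
m_n = 1.67492749804e-27  # neutron mass, kg
eV = 1.602176634e-19     # joules per electronvolt

def de_broglie_wavelength(kinetic_energy_eV):
    """lambda = h / p, with p = sqrt(2 m E) for non-relativistic neutrons."""
    p = math.sqrt(2 * m_n * kinetic_energy_eV * eV)
    return h / p

# Cold, thermal and fast neutrons (illustrative energies)
for E in (0.001, 0.025, 1e6):
    print(f"E = {E:g} eV -> lambda = {de_broglie_wavelength(E) * 1e9:.4g} nm")
```

Thermal neutrons come out near 0.2 nm - comparable to crystal lattice spacings, which is why a lattice can act as a wavelength filter at all.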
Here's a java applet on Young's double-slit experiment – you can adjust the frequency of light used, the distance between the slits, and you can also adjust the distance to the detector screen from the slits:
Java applet on Interference
No substitute for an optical bench, but you can get an idea of what's expected to happen when those parameters are adjusted. Some really good ideas, Martin, for using different media, etc. It's possible these days to engineer a 'metamaterial' with a negative index of refraction, so I was thinking that might also be another way to adjust various parameters ....
There's still a few red-hot chillies in there to chew through yet!...
Vegemite's the go! Weird taste but very nourishing ...... :-)
Quote:
Stepping back to the slits: the single-photon self-interference example suggests that the requirement is not for coherent light but merely for monochromatic light. Is that the case?
Ah, a single photon is going to be monochromatic with respect to itself! And coherent with respect to itself, for that matter. The interference is between alternate possible paths which give the same indistinguishable result, for a single photon. If you detect at a particular point you have to consider phase differences between paths that lead to that point as if they had all been travelled simultaneously. However the nature of QM predictions is that you only get a probability for outcomes, and one photon does not an ensemble make. A probability is a ratio between two numbers, the numerator representing the outcome(s) of interest and the denominator representing the total set available for the circumstances. Hence a probability will range from zero ( never ever, no way ) to one ( every time, no doubt ).
So if we'd like to demonstrate these various behaviours we must have a group of events of some size to form a picture of the underlying probability distribution. The counts obtained from our photomultiplier ( or density of changes in a photographic emulsion etc ) are a sampling of that probability function. The more photons we involve, the more accurate the 'tracking' of the probability becomes. We assume, or attempt to prepare, the photons reasonably similarly, though there will be irreducible variation ( ie. Heisenberg uncertainty ). We could use photons of wildly varying energy/wavelength/frequency and chuck them all in to interfere, but that wouldn't display much QM. It'd resemble a big classical smear, as it should!
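The point that 'one photon does not an ensemble make' can be illustrated with a toy Monte Carlo: sample single detections from the two-slit cos^2 probability density and watch the fringes only emerge with numbers. The wavelength, slit separation and screen size below are made-up illustrative values:

```python
import math, random

random.seed(1)  # reproducible toy run

wavelength = 500e-9  # illustrative: green-ish light
d = 50e-6            # illustrative slit separation
L = 1.0              # slit-to-screen distance, metres

def intensity(y):
    """Unnormalised two-slit probability density at screen position y."""
    return math.cos(math.pi * d * (y / L) / wavelength) ** 2

def detect(n):
    """Accumulate n single-photon detections by rejection sampling."""
    hits = []
    while len(hits) < n:
        y = random.uniform(-0.05, 0.05)  # a 10 cm wide screen
        if random.random() < intensity(y):
            hits.append(y)
    return hits

few, many = detect(10), detect(20000)
# 10 photons look like random speckle; 20000 trace out fringes 1 cm apart
```

Each individual detection is a single dot; only the accumulated counts track the underlying probability function.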
Quote:
At what point does the interference fail as the two slits are placed further apart?
It doesn't fail so much as become hard to define. The overall rule is that slit separation and pattern features vary inversely. So widely separated slits give a narrow gap between an interference maximum and the immediately adjacent minimum, and vice versa. Thus beyond some slit separation the pattern variations at our distal detection plane become unresolvable. This is why headlights way down the road must come close enough for us to decide if it's a motorbike ( one lamp ) or a car ( two lamps ). Or why a wide aperture lens gives a sharper picture than a narrower one, and the tendency for astronomical telescopes to be built with greater aperture - or to do tricks with arrays of them to much the same effect.
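A quick sketch of that inverse relation, plus the Rayleigh criterion behind the headlight and telescope examples ( the 550 nm wavelength, 5 mm pupil and 1.5 m headlight spacing are just illustrative assumptions ):

```python
import math

wavelength = 550e-9  # illustrative: green light

def fringe_spacing(d, L=1.0):
    """Double-slit fringe spacing on a screen at distance L: inversely proportional to d."""
    return wavelength * L / d

for d in (10e-6, 100e-6, 1e-3):
    print(f"d = {d * 1e6:7.1f} um -> fringe spacing = {fringe_spacing(d) * 1e3:.3f} mm")

def rayleigh_angle(D):
    """Rayleigh criterion: smallest resolvable angle for an aperture of diameter D."""
    return 1.22 * wavelength / D

# Headlights 1.5 m apart viewed through a 5 mm pupil
theta = rayleigh_angle(5e-3)
print(f"Headlights resolvable out to roughly {1.5 / theta / 1000:.0f} km")
```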
Quote:
Do you get a constant for "separation + depth" for the frequency cut off?...
I don't know, though that has an intuitive feel to it. But alas the problem with QM is that our intuition regularly augers in! :-)
That's a great applet, Chipper! Nice demo indeed. Terrible explanatory notes though - seemingly classically based. The coherence comment: I've done Young's with a slit lamp - that source being a gas discharge where the relative phases will be all over the shop. Ditto for polarization. As for the superposition principle - it always applies for EM! Note that Young himself didn't have access to a laser or polarizing optical components!
That's why it really is best to think of "waves" as only an approximate model that can be applied to QM.
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
OK... Here's hoping for the 'killer' slit experiment...
We have lasers/masers that produce coherent bunches of photons. Can the phase and polarisation be (reasonably accurately) controlled?
The thought experiment is that the intensity of the fringes could be 'tuned' if so.
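As a sketch of that tuning idea: in a simple two-beam model only the co-polarised field components interfere, so a relative polarisation angle winds the fringe visibility down from 1 to 0. This is a classical toy model under those assumptions, not a full QM treatment:

```python
import math

def screen_intensity(phase, pol_angle_deg, I1=1.0, I2=1.0):
    """Two-beam interference; only co-polarised field components interfere,
    so the cross term is scaled by cos of the relative polarisation angle."""
    cross = 2 * math.sqrt(I1 * I2) * math.cos(math.radians(pol_angle_deg))
    return I1 + I2 + cross * math.cos(phase)

def visibility(pol_angle_deg, I1=1.0, I2=1.0):
    """Fringe visibility (Imax - Imin) / (Imax + Imin)."""
    Imax = screen_intensity(0.0, pol_angle_deg, I1, I2)
    Imin = screen_intensity(math.pi, pol_angle_deg, I1, I2)
    return (Imax - Imin) / (Imax + Imin)

for angle in (0, 45, 90):
    print(f"polarisation angle {angle:3d} deg -> visibility {visibility(angle):.3f}")
```

Crossed polarisations ( 90 degrees ) kill the fringes entirely, and a controllable relative phase slides the whole pattern sideways.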
Two thoughts are:
1: That we have misshapen Gaussian 'blobs', wobbling with whatever oscillations, that each get through one slit or simultaneously through both;
2: Is what we are seeing a 'perturbation' in the space-time fabric for something that isn't actually there? Pretty much as positive hole conduction is modelled in p-type semiconductors (as an absence of electrons)?...
So what does hold an electron together in the first place? It is all negative so it should rip itself apart! ;-)
When two then-unknown Dutch physicists, Goudsmit and Uhlenbeck, put forward the hypothesis of electron spin to explain doublet spectral lines in atomic spectra, the great Austrian-born theoretical physicist Wolfgang Pauli demonstrated that this was not possible, since the surface of the electron, at the classical electron radius, would be rotating at a speed greater than that of light. Frightened, the two tried to recall their article, but it was already being printed. They were right, of course, but never got a Nobel Prize.
Tullio
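That objection is easy to check roughly: model the electron as a classically spinning solid sphere of classical electron radius carrying angular momentum hbar/2, and the required equatorial speed comes out at over a hundred times c ( the solid-sphere moment of inertia is of course just an illustrative classical assumption ):

```python
import math

# CODATA constants, SI units
hbar = 1.054571817e-34  # reduced Planck constant, J*s
m_e = 9.1093837015e-31  # electron mass, kg
c = 2.99792458e8        # speed of light, m/s
e = 1.602176634e-19     # elementary charge, C
eps0 = 8.8541878128e-12 # vacuum permittivity, F/m

# Classical electron radius: electrostatic self-energy e^2/(4 pi eps0 r) = m c^2
r_e = e**2 / (4 * math.pi * eps0 * m_e * c**2)

# Spinning solid sphere with angular momentum hbar/2:
# L = (2/5) m r^2 w  =>  equatorial speed v = w r = 5 hbar / (4 m r)
v = 5 * hbar / (4 * m_e * r_e)

print(f"classical electron radius: {r_e:.3e} m")
print(f"equatorial speed needed:   {v / c:.0f} c")  # far beyond light speed
```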
I found a really good lecture (translated by J.H. van der Waals) given by Goudsmit where he tells the story from his perspective, very honest and down-to-earth: The discovery of the electron spin – S.A. Goudsmit
It's hard to see how an electron is a single thing, especially an excited electron: a positron and an electron combine to form a photon, which can then be absorbed by another electron. So, how much matter/antimatter (i.e., how many photons) can fit in one electron?
Maybe the deficit of antimatter exists as positrons flowing backwards through time, which would be experienced as electrons by matter moving forward through time ...?
... It's hard to see how an electron is a single thing, especially an excited electron: a positron and an electron combine to form a photon, which can then be absorbed by another electron. So, how much matter/antimatter (i.e., how many photons) can fit in one electron?
Is there any limit to how many photons can 'occupy' the same point in space?
Can photons be 'dissociated' into an electron and a positron?
So, can you get 'heavy' electrons? Or are such entities known by another name already?
Cheers,
Martin
Photons are bosons, so there is no exclusion principle for them and they can occupy the same cell in phase space. An energetic photon can materialize in an electron-positron pair and then recombine. This can be shown by a Feynman diagram. A muon is a heavy electron.
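The 'energetic' qualifier can be quantified: creating the pair costs at least two electron rest energies, about 1.022 MeV, putting the photon well into gamma-ray territory ( and for real pair production a nearby nucleus is needed to balance momentum ). A quick check:

```python
m_e_c2_MeV = 0.51099895  # electron rest energy, MeV (CODATA)
h = 6.62607015e-34       # Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s
MeV = 1.602176634e-13    # joules per MeV

threshold = 2 * m_e_c2_MeV              # must pay for both particles
wavelength = h * c / (threshold * MeV)  # photon wavelength at threshold

print(f"pair-production threshold: {threshold:.3f} MeV")
print(f"photon wavelength there:   {wavelength * 1e12:.2f} pm")
```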
Eeeeeek... This is quickly getting deeper. Good summary of photons at Wikipedia, as to be expected.
What would prompt a photon to transform into an electron and positron? Is any energy converted in doing that?
Could a photon be oscillating between the states of photon and electron-positron pair?
Cheers,
Martin
This is deep water. I can only refer you to Feynman's lectures, you can find them also on line. Cheers.
Ah, well now we're into weird stuff for sure. :-)
The modern program of science ( QM particularly ) is that one can only constrain a theory's components by their success in predicting observables. Thus non-observable components are unconstrained and can have any form, provided the predictions run true to experiment.
[ Simplicity is also preferred, and perhaps some elegance etc too ... ]
Take the atom around the time of Bohr's model. The theory used definite orbits, radii and whatnot, giving some success with energy level prediction via an overlay of empirically defined rules. However it was found that one can't actually observe an electron orbit. To get sufficient resolution to 'image' an orbit with a photon, say, the energy corresponding to the required wavelength ionizes the atom. Unfair but true! So a group of suitable atoms in known energy states doesn't imply electrons that can be seen tracing orbits. The orbit concept was therefore discarded - atomic systems with specific energies were deemed to not have well defined momenta and paths.
Take a photon passing from point A to point B. Then who is to know what happened in between? Only what is consistent with our only firm empirical knowledge - that a photon left A and later one like it arrived at B. Now the Heisenberg principle goes one better than just defining the mutual precision of certain measurable quantities. It also implies that some things can be unmeasurable, but can be deemed to have occurred precisely because we'll never know. Hence the idea of real vs virtual particles. A real particle is one that interacts or influences events in a detectable way. A virtual one may ( or may not ) have been present either way.
Why bother conceiving of, and loading up a theory with unmeasurable objects? Probably because it's useful for some purpose. Oddly enough if one includes in the Feynman diagrams - which categorise the various unobserved alternative paths alluded to earlier in this thread - things like a photon turning into a virtual positron/electron pair which then shortly recombine back to a photon, then it yields all known behaviours of photons. It would predict results consistent with the 'shortest distance between two points' character of photon travel for instance.
But if you take this type of program and apply it to the much more complex scenarios of particle physics, say, it gives an excellent predictive capability for the relative rates of decay modes of particles. Those relative rates come from probabilities based on the amplitudes for the modes, which in turn depend on the virtual particles schemes. But the rates are an observable - some of this, so much of that, and more of the other thing.
With the energy form of the uncertainty relations :
dE * dt > h
[ dE = energy uncertainty, dt = time uncertainty, h = Planck's constant ]
then I can make particles with any total energy ( dE ) I like ( from 'empty' vacuum if I'm keen ) provided dt is short enough that the product dE * dt stays within that bound - the loan is repaid before anyone could measure it. Near a black hole's horizon one member of such a virtual pair can fall in while the other escapes as real radiation, which is why the hole evaporates ....... eventually :-)
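Plugging numbers into that relation: borrowing enough energy for a virtual electron-positron pair ( dE = 2 x 0.511 MeV ) limits how long the loan can last, and hence how far the pair can range:

```python
h = 6.62607015e-34     # Planck constant, J*s ( using the h of the relation above;
                       # hbar/2 conventions differ only by a numerical factor )
c = 2.99792458e8       # speed of light, m/s
MeV = 1.602176634e-13  # joules per MeV

dE = 2 * 0.511 * MeV   # energy loan for a virtual electron-positron pair
dt = h / dE            # longest the loan can last before dE * dt exceeds h
print(f"dt ~ {dt:.2e} s, range ~ {c * dt * 1e15:.0f} fm")
```

The range comes out around a picometre - which is why such virtual pairs only matter at very short distances.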
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal