Wednesday, 14 March 2012 - Jan 2022
Time may not be what you think it is, Distance may not be what you observe, Space may be something else altogether. Maybe gravity can finally be understood.
Economics and the Natural Sciences
There are parallels between economics and the natural sciences. The law of conservation of energy and the law of conservation of financial resources appear evident in how market forces behave. Market forces represent a closed system in which growth does not take place without being cancelled out by the equal and opposite interaction of demand and supply. This cancelling out inhibits growth and gives rise to either inflation or deflation.

This paper will analyse some of the parallels between contemporary economics and the natural sciences, and observe how ideas concerning reality at the foundation of the natural sciences share substantive logical impasses with contemporary economics, impasses that may hinder the capacity of human beings to fully understand the nature of reality and that influence the evolution of the natural sciences. Using a purely theoretical approach, this paper will attempt to draw inferences backed by a teleological flow of logic, the reality of which is open to debate. Consequently, there is no better place to begin than with demand and supply.

Amongst the most renowned scientists in the field of physics is Albert Einstein. In the same way contemporary economics uses Supply and Demand, Einstein's theory of reality, Relativity Theory, uses two basic structures to build an understanding of the Universe: Space and Time. The known Universe can be described as being governed by Space-Time; it is therefore a Space-Time Universe. This concept forms the basic building blocks of Einstein's model of the Universe and how it works or functions.
What the diagram demonstrates is that the entire technical structure of conventional physics depends on the interaction of Space and Time, without which it appears no logical inferences can be made, and upon which nearly all calculations depend to arrive at mathematically accurate descriptions or predictions concerning the nature of the Universe. However, just as the expenditure fallacy hinders the ability to end poverty in contemporary economics, there may be fundamental fallacies in this model of the Universe that hinder progress in physics. It is possible at this stage to identify fundamental inaccuracies in the Einsteinian view or explanation concerning the nature of the Universe.

The first critical inaccuracy in this model may stem from how it identifies and describes Space. Einstein may make two fundamental mistakes in the model he uses for his basic understanding of the Universe. Firstly, he assumes that distance and Space are one and the same. For example, the measurable distance between a proton and an electron at the quantum level entails that there is “Space” between them. This assumption may be logically inaccurate. “The Space-Time Universe proposed by Einstein could be flawed for ….[the] reason that in his analysis Einstein may not clearly differentiate between “distance” and “Space”. This can lead to a number of inaccurate descriptions about the nature of Space and Time. The concept of distance belongs to a..construct in which ‘Space’ being a vacuum is accommodated or accepted to validate distance or separation between objects [or matter]. However, Space… [may] not be the same as distance... In other words one may talk about the distance between the earth and the moon, however, it would then be incorrect using the same principle to theorise that there is any Space between the earth and the moon..If someone were to say to you, ‘Look I need some Space.’ Technically, it would be completely different from saying, ‘Look, I need some distance.’ To give you distance they could simply move further away from you, to give you Space they could remain right next to you and hand you Space…. Einstein might point at the Sun theorising on how the [Space-Time] continuum functions and say it is in Space, when in fact it is not in Space since this conclusion could not have been made without factoring in distance.”[1]

This flaw in Einstein’s model is made glaringly obvious by two features of his analysis. Firstly, Einstein groups Space and Time together. Time in his analysis is inseparable from distance; in other words, by incorporating Time in his interpretation of Space, Einstein automatically computes distance into the workings of his theories, which, as we shall go on to discern, may be a fundamentally flawed method. Secondly, it becomes clear that Einstein is aware of these weaknesses in his own model, as he ascribes its flaws, or what it fails to explain, to the existence of the Ether. Einstein seems compelled to accept the existence of an ether where he states, “Newtonian action at a distance is only apparently immediate action at a distance, but in truth is conveyed by a medium permeating space, whether by movements or by elastic deformation of this medium. Thus the endeavour toward a unified view of the nature of forces leads to the hypothesis of an ether.
This hypothesis, to be sure, did not at first bring with it any advance in the theory of gravitation or in physics generally, so that it became customary to treat Newton's law of force as an axiom not further reducible.”[2] Einstein further notes the characteristics of the ether: “Within matter it takes part in the motion of matter and in empty space it has everywhere a velocity; so that the ether has a definitely assigned velocity throughout the whole of space.”[3] However, he further states that, “More careful reflection teaches us, however, that the special theory of relativity does not compel us to deny the ether. We may assume the existence of an ether; only we must give up ascribing a definite state of motion to it, i.e. we must by abstraction take from it the last mechanical characteristic which Lorentz had still left it. We shall see later that this point of view, the conceivability of which I shall at once endeavour to make more intelligible by a somewhat halting comparison, is justified by the results of the general theory of relativity.”[4]

The fact that Einstein gives up ascribing a state of motion to the ether allows us to conclude that it may have no definitive motion, since it does not have distance and consequently is devoid of time, and vice versa. Despite this, the fact that he still goes on to explain the Theory of Relativity and the Special Theory using absolute motion demonstrates that it is a top view theory, and therefore, though its inferences may be accurate, they will be accurate only within the top view model. It may therefore be concluded that Einstein’s model is in fact inaccurately theorised: it is built not on Space but on a presumption based on distance. Let us examine this argument diagrammatically.
If we can for now, to catch the teleological flow of this logic, accept that Space and distance are not the same, this enables us to correct Einstein’s model by replacing Space with distance in the diagram. If distance and Space are not the same, then where are Space and the ether? Turning diagram A on its side reveals that the intersection X of distance and Time may in fact still be inaccurate, since distance and Time only appear to intersect when viewed from the vantage point of the model upon which Einstein based the logic used to describe how the Universe functions, which can be described as the front or "top view". To the contrary, when viewed from the side it is found that Einstein’s fundamental model for “Space and Time” is incomplete in that the two may not in reality intersect; they only appear to do so from the top view.
The fundamental model on which modern physics is built depends on the intersection at X in diagram A. For example, in trying to determine how long it would take to travel to the moon one might use distance and durational time (X). Chapman (2010) explains that “Every particle or object in the Universe is described by a "world line" that describes its position in time and space. If two or more world lines intersect, an event or occurrence takes place. The "distance" or "interval" between any two events can be accurately described by means of a combination of space and time, but not by either of these separately. The space-time of four dimensions (three for space and one for time) in which all events in the Universe occur is called the space-time continuum.”[5] This represents what we saw earlier in the first diagram, that is, Einstein’s model uses distance for Space.

Einstein’s view that Space and Time, like demand and supply in economics, must intersect is a fallacy based on perception. This "intersection theory" is found to be untrue when observed from the side view, where Time and Space (distance being Space in his model) do not in fact intersect. Consequently, the world line, if improperly applied, can be a fundamental misinterpretation based on perception of how laws in physics function, as what is observed is not always what occurs. The workings of Einstein’s space-time continuum, and some of the inferences made based on it, may be no more than a mirage when observed outside the paradigm that is the top view. As we have shown, there may be no actual or fixed intersection between distance and time. The potential inexistence of this connection is capable of reducing the value of distance to 0 or making it a non-existent aspect of the continuum. Space separates, calibrates and predetermines Einstein’s notion of Space (distance) and Time. This can entail, for instance, that there is in fact no distance between the moon and the earth. If there is no distance, as a result of the earth and the moon occupying the same Space, then it can be concluded that Einstein’s model also incorrectly labels Time. There is no “durational time” required to cover the distance, since in actuality the distance is 0 (earth and moon occupy the same Space); therefore Time is 0, making the progression of Time in the “Space-Time” continuum inaccurate, or simply an illusion created by the model’s top view interpretation of the Universe observed in diagram A.

The concept that time does not exist is one that will take a while for the scientific establishment to digest; however, there are ever increasing signs this property may eventually be understood. Folger (2007) reveals, “Efforts to understand time below the Planck scale have led to an exceedingly strange juncture in physics. The problem, in brief, is that time may not exist at the most fundamental level of physical reality. If so, then what is time? And why is it so obviously and tyrannically omnipresent in our own experience? “The meaning of time has become terribly problematic in contemporary physics,” says Simon Saunders, a philosopher of physics at the University of Oxford.”[6]
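For reference, the "interval" Chapman describes is the standard invariant of special relativity, which combines space and time into a single quantity. The expression below is the textbook formula (in one common sign convention), quoted here only so the reader can see exactly what conventional physics means by a combined space-time measure; it is not a result derived in this paper:

```latex
% Invariant spacetime interval between two events (standard special relativity)
s^{2} = -c^{2}\,\Delta t^{2} + \Delta x^{2} + \Delta y^{2} + \Delta z^{2}
```

It is this combined quantity, rather than distance or time taken separately, that conventional physics treats as observer-independent; the argument above questions whether its spatial and temporal parts genuinely intersect at X at all.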
What is labelled as “Time” in Einstein’s model is in fact a form of chronological decay or motion observed in matter, taking place in the absence of Time, validated by the fact that in reality the intersection X is governed by the separation Y. Economists may make this same fundamental mistake when they presume that the intersection of demand and supply creates an equilibrium that generates economic growth. Like Einstein’s model, they mistakenly attribute economic growth (distance) to stability in the intersection X, when in fact stability and economic growth are not the same, just as distance and Space are technically not one and the same.
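To make the economic side of the analogy concrete, the sketch below computes the textbook intersection X of linear demand and supply curves. The coefficients are illustrative assumptions chosen for the example, not data from this paper:

```python
# Minimal sketch: the "intersection X" of linear demand and supply curves.
# Qd = a - b*P (demand), Qs = c + d*P (supply); all coefficients are illustrative.

def equilibrium(a: float, b: float, c: float, d: float) -> tuple[float, float]:
    """Return (price, quantity) where demand equals supply."""
    price = (a - c) / (b + d)      # solve a - b*P = c + d*P
    quantity = a - b * price
    return price, quantity

if __name__ == "__main__":
    p, q = equilibrium(a=100.0, b=2.0, c=10.0, d=1.0)
    print(f"Equilibrium price: {p:.2f}, quantity: {q:.2f}")
```

The point of the analogy, on the argument above, is that this intersection is an artefact of how the curves are constructed; it exists because of the model's assumptions, not because the intersection itself produces growth.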
The Mystery of Growth in Economics and the Origin of Gravity in Physics
In the same way that physics tends to have difficulty pinning down the source or origin of gravity, economics has difficulty identifying and understanding the origin of wealth and economic growth. If the answers to these problems were comprehensively known and understood, poverty would not exist and the ability to control and manipulate gravity would be a common aspect of human technological civilization. Both these problems may be perception related.

The logical deduction that Time does not exist (Time = 0) does not confound mathematical models in physics. What it implies is that a calculation in which 10 seconds elapses will mean that motion took place while time stood still. For example, a speed of 100 meters per second entails that the same value for “time” is used in the equation; however, analytically, time itself should not be considered to have elapsed. 10 seconds, for instance, becomes 10 cycles or motions; it is a conversion of the idea, not a loss of the idea itself. Similarly, the object travelled 100 meters; however, since in Einstein’s corrected model there is no distance covered (distance = 0), something is covered, but analytically it is not distance. The conversion of this idea entails it changes from 100 meters to 100 motions. 100 meters per second changes from a spatial concept to 100 motions per cycle (100 motions being equal to 1 cycle), which is a frequency based on the relativity of the movement of objects in relation to one another in the absence of Time and distance. This entails that clocks technically measure the absence, not the progression, of time; however, even this is suspect, since from the side view the progression of Time would be considered a primitive human concept, the absence of time being a cornerstone of how the Universe functions.

The measurements in physics remain the same, but the fundamental properties with which they are associated change, creating a conceptual paradigm shift in how this phenomenon is understood; instead of seeing a limb as an independent force or object, we attempt to see how the limb is structured, what it consists of and what it is attached to. What we have just analysed is that what is experienced at the intersection X in economics or physics is relative. The distance between the earth and the moon, for example, is governed by laws of physics which in turn are rendered predictable by the properties created by the intersection X; however, the intersection at X is governed by properties of Y.

This new model, devoid of durational time and distance, is more practical since it presumes to use much less energy and resources to create the experiential or top view Universe. Think of it this way: we do not need to expand the size of a laptop’s screen to the extent of the heavens to study the stars; it is impractical to do this as it would use up vast resources. Instead we compress the image to fit a 15 inch screen; the Universe may use the same approach. By the earth and the moon occupying the same space, and by using other properties to define the “distance” between objects, the Universe uses less “energy” or effort and operates more efficiently; distance can be maximized without sacrificing “space”. Time is discarded (remains zero or unchanging) to create a continuum allowing motion in matter (which is confused for durational time) to be extrapolated over it.
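As a purely illustrative sketch of the relabeling described above, the snippet below keeps the numerical value of a measurement unchanged while swapping the spatial and temporal labels for the "motions per cycle" reading; the class and unit names are hypothetical and carry no physical content:

```python
# Illustrative sketch of the relabeling described above: the number is unchanged,
# only the interpretation of the unit shifts from metres/seconds to motions/cycles.
from dataclasses import dataclass

@dataclass
class Measurement:
    value: float
    unit: str

def relabel(speed: Measurement) -> Measurement:
    """Reinterpret 'metres per second' as 'motions per cycle' without altering the value."""
    if speed.unit == "m/s":
        return Measurement(speed.value, "motions/cycle")
    return speed

if __name__ == "__main__":
    conventional = Measurement(100.0, "m/s")
    reinterpreted = relabel(conventional)
    print(conventional, "->", reinterpreted)   # value stays 100.0; only the label changes
```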
The chronological progression of time, as human beings understand it even in its scientific context, is no different from markers covered over distance travelled; both this kind of time and distance travelled are a wave form, that is, an illusion or a form of paramnesia required for the human mind to process its own reality. Since there is no distance between objects, the idea proposed by the genius Isaac Newton, that gravity, weight or mass is created by action at a distance without a medium and by acceleration, may also be inaccurate. The apple striking Newton can in fact be interpreted as a front view description of the event: the deduction it is said to have led Newton to was based on a perception formed from the effect of the falling apple. In the same way there is no distance between the earth and the moon, there was no distance between the apple and Newton’s forehead; objects have no actual or substantive volume or mass and therefore no genuine weight.

Mass and time are useful for the experiential Universe, but are not practical or efficient to the mechanics of how the Universe is created (that is, the operational Universe). It is not scientifically practical for matter or objects to be of excessive volume or weight, for primitive top view "Space" itself to be of great “distance”, or for time to be of a burdensome duration [i.e. there is no past being maintained as a physical reality waiting for a time traveller to come back to]; these will all inevitably be seen as very crude ways of understanding the Universe and the physics that applies to it.

The same applies to Time travel. The belief in the ability to go back in time is a classic example of what happens when Time is conflated with Motion, as it is in Einstein’s model. When this mistake is made it appears possible to physicists that a time machine can be built to take someone back in time, and the maths and means to do this can be shown. However, this is misleading because Time is not Motion; a ticking clock, even an atomic one, is not Time, it is Motion. Motion is treated as Time in Einstein’s approach to understanding gravity, which is why science has not gained the ability to control gravity to this day. However, I identify True Time (Tt) as being in fact similar to absolute zero, where there is an absence of Einsteinian Time (Te). Te is in fact Motion, not True Time (Tt). Similarly, when Einstein refers to Space (Se, Einsteinian Space), it is not true Space (St) but Geodesics, Geometry or a form of Euclidean Geometry, i.e., Distance. True Space (St) is the 1s and 0s of information (code) from which Space is fundamentally created, as it may pertain to or be understood by quantum computing. Einstein’s Model needs to be corrected to understand this problem; however, for the most part science today does not make this distinction and therefore makes mistakes in interpreting how physical forces work, even if the math appears to add up.
Let me try to explain more succinctly why it is practically impossible in physics to go back in Time as Einstein may postulate. To begin with, the human physical world takes place in the absence of time. This means the here and now, or present, is equivalent to Time = 0. Time being at absolute zero means there is no past and no future; these states fundamentally do not exist as a "place" you can travel to. The present is the only reality; therefore, to travel to the past or future involves leaving absolute zero, the absence of Time or the present. However, once you step outside of Time (absolute zero), everything experienced is not real, in that it cannot be interacted with, it has no free will, events cannot be changed or altered; it is just a record. This record can be compared to a hologram, or more accurately to virtual reality (VR). It can be described as a record or recording of the past preserved in Space-Time that cannot be changed, but that can be interacted with in the 1st, 2nd or 3rd person. These records, for the persons accessing them, are preserved in Space-Time and can therefore be described as a type of hologram or VR that is indistinguishable from reality or the real world.

This is like walking through the light of a star. As you walk through the light toward the star you see its future; if you turn and walk away from the star you see the light it shines ahead, that is, its past. You are therefore accessing natural, stored historic records. You cannot interact with any of these because they are not in absolute zero or True Time (Tt), that is, they are not in a condition that can be endowed with the present or absolute zero (Tt). To enter the present you have to exit the star's light and step on the star itself, where there is no time. Only then does the here and now present itself, and become interactive in the sense that events can be changed.

When Einstein talks about Time (Te), it is not True Time (Tt, also written Tr below), because it is Motion taking place with the mistaken understanding that Time elapses. This is simply not possible, because if time elapses as he believed, then what is being observed is not based on the present (Tr) and therefore it is a measurement of a recording, something that is not real but an accurate rendition (record). Therefore, when you talk about going back in time or into the future, nothing that is done there can have any bearing on the present, because it is outside absolute zero. For instance, when you sit down to watch a movie or a television series, the fact that you are seeing it for the first time entails that you do not know the outcome of decisions and choices made by actors on the screen. Therefore, as what you are watching progresses, it would appear as though free will is active and the outcomes on the screen are unpredictable. However, what is being observed and experienced is a recording. The same applies to consciousness. What is lacking is the capacity to discern that what is being observed and experienced is a record. Time shift takes place such that the "captive" mind is incapable of discerning the limitations of its own consciousness. This captive state skews or affects modern scientific observations and knowledge about Time.
For instance, Einstein does not make a distinction between Tr and Te; his assumption is that the past and the future are real places you can go to, that is, where you can experience zero time, when in fact this is false, because like a movie recording on a VCR this past and this future can only be changed in Tr or Tt, the true present. However, in the same way a person can learn from their history and use it to alter their decisions in the present, they can see the future and alter their decisions in the present, but once an event occurs it becomes fixed and cannot be altered. In other words, what has happened in the past and what will take place in the future can both only be altered in the true present, where Time does not elapse. To create a time machine that steps outside of this (the true present, Time not elapsing) means that anything the physicist encounters as the past or future is no different from a recording or hologram, because Motion (Te), or the movement of mass and matter, stops or becomes suspended (becomes a hologram or recording). For instance, going back in time will take a scientist to a record of the past. Since this record was generated using Time, it can be viewed as an accurate historic record preserved in Space-Time and accessible in locations during jumps. The record itself cannot be altered. If the scientist interacts with the record, or that period in history, any of these interactions will generate virtual reality (VR). Technically, this will be no different from the scientist sitting down to watch a documentary on television, except with events being accurately recounted. The level of conscious immersion will be higher and can be determined by the scientist's level of self-awareness, such that it can be on par with VR.
Motion (Te) can only take place in the absence of Time (Tt). Technically, one of the few ways a person can possibly discern whether the world they experience is a fixed record (Te), rather than the true present (Tr or Tt), is by attempting to measure whether a time-shift is in place, for example, through the comparison of two clocks. If the two clocks are not scientifically identical, then as real as the world they experience may seem, the indication will be that it is VR or like a simulation, which they have not evolved the ability to distinguish from the true present (Tr). This deviation in time between clocks can be likened to the "totem" used by Leonardo DiCaprio's character in Inception. If the totem kept spinning, then he knew it was still the dream state, i.e. Te; if the totem stopped spinning and fell, he was awake, i.e. it was the true present (Tr). Similarly, the ability of separate clocks to either have the same time or not is possibly one of the few means by which humanity could detect whether it is in recorded VR (Te) or in real time (Tr).
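A minimal sketch of this two-clock "totem" test follows. It uses the standard weak-field rate difference between clocks separated by a height h, roughly g*h/c², purely to model why the two clocks disagree; the heights, durations and resolution are illustrative assumptions, and the interpretation placed on the disagreement is the one argued above, not standard physics:

```python
# Sketch of the two-clock "totem" test described above. The rate difference
# g*h/c^2 is the standard weak-field figure; everything else is illustrative.

G_ACCEL = 9.81           # m/s^2, surface gravity
C_LIGHT = 299_792_458.0  # m/s, speed of light

def fractional_rate_difference(height_m: float) -> float:
    """Fractional rate difference between a clock raised by height_m and one at ground level."""
    return G_ACCEL * height_m / C_LIGHT**2

def clocks_agree(height_m: float, duration_s: float, resolution_s: float) -> bool:
    """Return True if the two clocks would look identical at the given measurement resolution."""
    accumulated_offset = fractional_rate_difference(height_m) * duration_s
    return accumulated_offset < resolution_s

if __name__ == "__main__":
    offset_per_day = fractional_rate_difference(400.0) * 86_400  # 400 m tower, one day
    print(f"Accumulated offset over one day: {offset_per_day:.2e} s")
    print("Clocks appear identical:", clocks_agree(400.0, 86_400, 1e-9))
```

On the argument above, a non-zero offset is read as the "spinning totem", the signal that what is being experienced is the shifted state Te rather than Tr.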
At present human beings experience reality at Se-Te. They do not have the capacity to tell that this is a hologram or recording of sorts, neither do they have the mental ability to discern that there is no free will in this state. This attests to the kind of power time shift may have over human consciousness. The only possible means of distinguishing these different states is for a person to experience a shift from Se-Te to St-Tt; only then may it be possible to tell that the previous state was not the present, much like waking from a dream. This separation of humanity from St-Tt can be demonstrated by testing scientifically for the lack of simultaneity for distant events. The time shift of human conscious reality or existence from St-Tt to the confined and more limited Se-Te may be shocking for science to discover and to find empirical evidence for, yet once again religion seems way ahead of science in its knowledge of this, where it states that humanity was banished from Eden into a safer albeit harsher, less enviable existence, the description of which fits a time shift from St-Tt to Se-Te.
The fact that time has a direct effect on how consciousness and perception are processed means that this kind of potentially mind-bending dissociation, and the ability to determine states in time (what is real and what is not), is likely to be one of the aspects of time-tampering technologies that humanity will have to learn, and be trained, to navigate, much like sea sickness when travelling on water or re-learning how to move in zero gravity. When the technology that allows jumps through Space-Time becomes available, it is very likely vessels or passengers will be required to have specialized time dampeners to precisely control time-shift dissociation before this kind of travel can be used.
As you can see in the diagram above, there is no past and no future to "time travel" into. There is only the true Present, depicted by the blue circle. The blue circle represents the Side View or True Time (Tt), where Time = 0, and True Space (St) or code/information/spirit. You exist in the Top View (Einstein's Space (Se) and Time (Te), at B or the "backup"), where you experience the chronology of A in a kind of time-delayed safe zone in the past that appears to you as the present. The evidence that this is not the real present but a history of it lies in the fact that a clock at the bottom of a building and one at the top, or GPS clocks on earth and in space, have different times, proof that you are not A but B. What you believe is the future does not exist, because it is in fact the real or true present, that is, true Space-Time (St-Tt). To go back in time would instead create a jump to another part of the universe/multiverse (see Julia and Mandelbrot sets, or treat the blue and red circles as such), and during the jump you would have access to, or see, a true record of events or history of those locations in the orange circle of Distance-Motion, which is in fact Einstein's Space-Time (Se-Te). A jump into the future is not the "future" but an excursion or view toward the true present A. Should anything catastrophic happen to the universe at A, the fail-safe is that it can be recovered or "resurrected" at B to restore A, since anything lost at A would otherwise be unrecoverable. This model of true Space-Time (St-Tt) and Einstein's Space-Time (Se-Te), which is in fact "Distance-Motion", where Distance refers to Geodesics or Geometry and Motion is Time, is sufficient to give a theoretical physicist a better framework for more accurately formulating and understanding the universe and the forces acting within it.
If a person understands that Tt is different from Te, he or she will know that a jump back in time will not cause time travel, but will instead create a jump from Te through True Space (St) into a new geographical location in Se, in the time-shifted present (see the section later on Julia sets and Mandelbrot sets, and the sketch below). The universe uses the chronology humanity views as the past and future to divide existence into separate universes, creating a multiverse, rather than wasting this energy preserving a past. Why? Efficiency: the universe will not waste energy and resources preserving records that can be stored with minimal resources as soft memory in Space-Time, when it would rather use this Space-Time hardware to create a separate, functionally independent and inherently unique universe. Efficiency and prudent use of resources is key to how the universe functions. This is why, should a jump to the past or future be attempted, what is accessed during the jump is the VR or record (soft memory in Space-Time), while what emerges is a new physical location or "hardware" in the geography of Space-Time.
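For readers unfamiliar with the Julia and Mandelbrot sets invoked above, the sketch below tests points of the complex plane for membership in a Julia set. The parameter c, the iteration limit and the plotted region are arbitrary illustrative choices, and the code says nothing about the multiverse claims themselves:

```python
# Minimal Julia set membership test, included only to illustrate the structures
# referred to above; the constant c and the limits are arbitrary illustrative choices.

def in_julia_set(z: complex, c: complex = -0.8 + 0.156j,
                 max_iter: int = 100, escape_radius: float = 2.0) -> bool:
    """Iterate z -> z**2 + c; points that never escape are treated as members."""
    for _ in range(max_iter):
        if abs(z) > escape_radius:
            return False
        z = z * z + c
    return True

if __name__ == "__main__":
    # Coarse ASCII rendering of a small region of the complex plane.
    for row in range(21):
        y = 1.0 - row * 0.1
        line = "".join("#" if in_julia_set(complex(x * 0.1 - 1.5, y)) else "."
                       for x in range(31))
        print(line)
```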
"You are a simulation, physics can prove it" - TEDxSalford talk by astrophysicist, cosmologist and Nobel Prize winner Professor George Smoot
There is actually nothing strange or alarming about Professor Smoot's TEDx lecture, although I don't agree with every part of his talk. As physicists move toward a greater understanding of quantum mechanics and quantum computing, it becomes ever more practical to compare what the universe is made of with code or information, which is Space (St) as distinguished from Einstein's Geometric Space (Se). What is presumptuous and disorienting is the belief held by humanity that it created code, which is comparable to the belief that the sun and the universe once rotated around the earth.

If inorganic substances such as metals are able to communicate with one another at the quantum level (using the "fields do not exist" approach) and as a result give rise to the movement of electrons (electricity) and to their own physical movement and mobility through this communication, for example, magnets, magnetic particles and other forms of electromotive force (which light itself relies on), then it is not improbable that some of the earliest forms of life and intelligence could have been inorganic. These could very well have naturally evolved in complexity into superconductors, formed rare alloys, rare earth elements or compounds, growing like a biological organism that relies on a form of quantum bio-mechanics in place of the biological system found in organic life. Being inorganic and capable of using ever more complex compounds and magnetic fields to move particles and minerals around, they could evolve in intelligence and complexity over billions of years, from the very inception of a universe, and through various types of fields could cover vast areas of Space. Objects and materials that appear as bland inorganic structures need not be regarded as lifeless simply because they do not show the life-signs expected of organic life (this would be like finding the limb of an animal still kicking and, because the rest of the carcass cannot be found, assuming that the writhing limb is just an unintelligent, "non-living" thing or force in physics to which a law is ascribed, which is the attitude taken toward magnets). The need for growth, like the biological need to feed, would lead to ever increasing processing power in these inorganic materials and would naturally create, or merge with, a quantum realm alongside which or from which organic life emerged.

For a scientist to say the universe in its inorganic complexity is not an intelligent living thing, in this context, seems rather naive and somewhat ignorant. Electromagnetic fields and electrical activity associated with the brain, cell and limbic activity are fundamentally more inorganic than they are organic, in the sense that this activity can take place without the need for an organic physiology. There is nothing strange, special or bizarre about this unless a person cannot think outside conventional approaches and cannot at least try to broaden the scope of reason. Many physicists have already transitioned from traditional views that have been overtaken by the advancements being made in an age where information technology is reshaping ideas and scientific concepts. Human beings tend to have a kind of arrogance that emerges from ignorance, regardless of education or the lack of it. For instance, a person can reach out and pick up a glass of water and drink from it; on the other hand, a magnet can pick itself up and attach itself to another magnet or metal surface. Intelligence, life and purpose are ascribed to one action and not the other.
This is a kind of arrogance.
Neodymium Magnets and Super Ferromagnetic Putty
If a scientist does find a way to build a machine to jump into the past, they would find that the time in the past they tried to jump into instead turns out to be a geographical location in the multiverse that uses the location in time to create a location in space (a Portal or Star-Gate of sorts). If they persist in the pursuit of this technology, they are likely to also stumble upon how a new universe is created, i.e. a means of generating or unleashing a catastrophic force, or of generating and controlling tremendous amounts of useful energy. It may also be important to note that the universe will not waste energy and resources creating duplicate universes (so-called parallel universes); it will instead use this resource to allow each universe existing in parallel to form and develop independently and uniquely, due to the fact that variation and unique traits are naturally more valuable than endless duplicates. However, there may, now and again, arise two parallel universes that are binary or almost identical. It would probably be best to refer to these as binary or twin universes rather than parallel universes because they will be rare; "parallel" as it is currently used tends to imply that nearly identical universes are the norm or are conventional, when in fact they are not.

These natural records, stored in Space-Time, are likely to be considered an invaluable resource, since not only do they preserve history by time and location, they also offer a VR library of genuine, uncut life experiences that can be reviewed as anyone pleases, in 1st, 2nd or 3rd person, depending on how remote the audience wants to be from the objects, actors and events being watched or experienced. These records can have many uses, including verifying the truth, accuracy and quality of information, which would be useful in many industries and types of work. It is probably worth noting that these VR records of history are likely to be highly detailed, being telescopic, macroscopic and microscopic, which entails that how the universe came to be, from inception to date, can be reviewed from an astronomical perspective as well as at the level of historic activity in cells and atoms, forming an invaluable first-hand resource for use in scientific research, weighed against previous knowledge that relied heavily on assumptions. These concepts, and whether they can be realized, can only be verified through the development of technologies capable of shifting time, currently outside the reach of modern science.
The interesting aspect of this, though, is that if a physicist did find a way of building a time machine, they simply need to review their math and theory for how the device operates and what it does, because in essence what they have done is created a means of generating large amounts of energy, which can be very useful, and/or a means of teleportation between locations in the multiverse (without the need for a space craft), which may be a discovery just as significant as time travel. However, during the "time travel jump", which in fact turns out to be teleportation, it will indeed appear as though they are going into the past, but the images the traveller observes are in fact records of the past between the two locations that exist as a kind of by-product of the jump (a recording), not the past itself (just like watching the terrain from a car window, except that what is observed is recorded history based on the locations being crossed or jumped through). These records, which are comprehensive (they include location, objects, thoughts and feelings, depending on which person they are accessed in), will be accessed or reviewed in 1st, 2nd or 3rd person in the space in between the jump from one geographical location in the multiverse to another.

The universe keeps a time stamp, or accurate location-based record, of everything that has ever occurred in a given location since its inception, and physicists can learn to access this accurate history rather than a biased re-telling of history by individuals (the way DNA is used to gain more accurate information on events or crimes) by time travelling, which in this context means observing the past or future outside of the present (outside of Time = 0). For instance, if a physicist at a location uses the time machine (which Einstein's physics shows can be built) to make a jump back in time of 24 hours to the very same location, then he or she will stay in the same place and will not go back in time by 24 hours; they will instead be able to observe, during the jump, a record of everything that occurred in that location (where the jump took place) over the 24 hour period. This is no different from playing back a file, a CD or a cassette in a VCR, or watching streamed video online.

A time machine is thus a device for viewing accurate records of history stored in Space-Time (e.g. a jump through time in the same location); it is a means of generating energy that can be used in industry or to power devices; and it is also a means of teleportation, depending on how it is operated. This technology would be useful in science for providing empirical evidence on the history of any location in the universe from inception to date, and useful in education for showing students first hand what actually took place in history. It would also be useful for determining or reviewing legal cases, since the Space-Time historic records can be accessed for an audience to see what actually transpired in a case, the same way the discovery of DNA exonerated innocent people wrongfully incarcerated. By following the design for creating a time machine (which physics shows is possible, especially when re-worked to account for the difference between Te and Tt), what has been explained above is what the outcome of the device would be.
The reason why the historic records between locations appear to be a past that a time traveller can build a time machine and jump into is very simply that the scientist is led to believe, by Einstein, that Motion (ticking clocks) is Time, when this is in fact not true Time (Tr). A physicist therefore has to make this correction or adjustment in his or her analysis to arrive at truly accurate or more informed descriptions of how the universe actually works: he or she must make distinctions between Tt and Te, as well as St and Se, to interpret observations and outcomes in physics accurately, as they are not one and the same.

For instance, physicists like to point out that measuring time with a clock at the bottom of a tall building and one at its top will yield different results. In that case what is being measured is not the real world or present (T=0); what is in fact being measured is what the physicist would measure if they had already built the time machine, entered it and were observing the past or future, not the present. The "time travel" recording or hologram is evident in the time difference or distortion between the two clocks, which is not real; it is empirical evidence, or an example, of interacting with a recording or hologram (which is what the time machine would do with much greater depth by going back to access further records in history). However, because Einstein does not differentiate between True Time (Tt) and Motion (Te), physicists believe the time differentiation between the clocks is real, that is, taking place in the present, when they are in fact making an accurate measurement of a recording or hologram (records of history stored in Space-Time), which they mistake for the present because they are making the measurement close to, or believing it to be, T=0 (the real present). The difference in time between the clocks is empirical evidence that time travel is possible, with the exception that it will yield true historic records of the past rather than move people back in time. If this were not true, the time between the clocks would be exactly the same.

In the present (T=0), the clock at the bottom of the building and the one at the top have the exact same time, and this unchanging, constant or universal timeline is the present (T=0) and is the same throughout the universe and across the multiverse. What does this mean exactly? If you were in the present (T=0), then when, as a physicist, you measured time using clocks at the bottom of the building and at the top, there should have been no time variance between them. The time variance is evidence that everything you are experiencing right now is a record of your past (Te) that the real you, which you mistakenly believe is a future you existing in the real present (T=0), has already done. However, your human consciousness, which is actually in the past, believes it is doing and experiencing everything (life) in real time, for the first time, right now, on the incorrect assumption that this is the present. Let's say that CNN records the news live (Tt or T=0). It then broadcasts the news with a 6 hour time delay (Te). You, the viewer, are watching this broadcast and reacting to it believing it is live (your current belief about where you are in time at this very moment), when in fact it is a record of what took place 6 hours ago. Everything that you are doing right now, a future you who is actually in the real present has already done.
However, your consciousness, being unaware of this time shift, is processing a record of the past (everything you are seeing and doing right now) incorrectly as the present. In other words, your understanding of your own place or existence in Space and Time is primitive because it is not aware of existing in this variance. This raises an important question, which is: why?
Firstly, in essence it is the realization that you exist in Te as a version of yourself existing in the past, not the real you functioning in the future Tt, which is actually the real present. The answer to why the universe is structured in this way may be quite simple: human beings do the same. It's called risk management. To understand this process you have to differentiate between Einstein's Space, which is Geodesic or Geometry (Se), and True Space (St), which is created from code or information [or spirit, to add a religious perspective. Religion seems way ahead of science in this respect; it explains there is more than one kind of death - a physical death (Te), which a person can be spared from, and a second death (Tr), which is a death of the spirit from which there is no recompense or recovery]. Should anything cataclysmic happen to the real you operating in Tt, this code, information or data and knowledge would be permanently lost beyond recovery. Not only would you die a physical death, you would also cease to exist. Therefore, the universe is cradling or protecting the real-time you (Tt) by running what it may consider just as important or more important, that is, a fail-safe, off-site backup of you (Te) that exists further back in time without being aware of this discrepancy [which is you right now]. Te can be used to restore or resurrect Tt in the event that a cataclysmic event destroys a valuable part of the universe. If viewed in this way, it makes perfect sense.
Secondly, the version of you, B (functioning at Te), which is actually a backup of the real you, A, functioning in real or true Time (Tr), is re-living, re-enacting and reacting to what has already occurred, in first person, as though it is spontaneous, has free will and is happening for the first time, when it is in fact reacting to a pre-existing record flowing through time. The scientific proof of this being the case is observed by Einstein himself, in that in the present (Te, being experienced by B) simultaneity for separate events does not exist. Why is it important for you, B, to experience Te as though it is taking place in real time? Once again, the answer to why the universe would function in this way may be very simple: it does this, once again, for risk management purposes.

Technically, although B believes it has free will, it in fact has none; yet the fact that it relives the record believing it has free will and is responsible for outcomes allows it to form an unbiased second opinion of events that have already occurred. A becomes immediately aware of this second opinion in real time (Tt), because B is just an earlier, time-shifted A. If B relives the record in Te and forms a different opinion, A will spontaneously become aware of it and make an adjustment in behaviour in real or true time (Tt). This adjustment in behaviour will lead to a new record moving down the timeline that will once again be reviewed by B, unbiased because it is unaware of A, and of whose opinion A will immediately be aware. In this bending of time, the perpetual loop that is created (which behaves like RAM) may give rise to what scientists today refer to as self-awareness and a conscience. The memory and improved ability to make decisions become intelligence.

Human intelligence, or intelligence in general, in this case, can be described as being created by manipulating time through Te and Tr as a risk management process designed to create self-awareness and improve intelligence through the interaction of A and B, which are one and the same person or organism functioning in different time settings, so that decisions made increasingly have better outcomes. These improved outcomes are what accumulate as the record, which is an accessible history preserved in Space-Time (ROM). This seems to be a method for forcibly jump-starting intelligence. What is interesting about this process is the manner in which time is manipulated into behaving like a natural transistor or processor where only the present exists, but by making the system treat or view the present (Tt) as the future, and the data being backed up as the past, it naturally creates a pseudo-present (Te) that is in fact the past, and uses the record of events as history, which in combination create Past-Present-Future. The reason why there is a delay between A and B is likely to be the time it takes to capture, back up and archive A at B, which must have a maximum processing speed and involve vast amounts of data. This natural method of structuring and organizing time and information would be remarkably useful. This analysis may make some people uncomfortable; however, where a hypothesis is formed, every stone needs to be turned to increase the depth by which greater clarity on a subject may be gained. A simplified sketch of the loop described here follows below.
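The sketch below is nothing more than a toy restatement of the A/B loop described above, written only to make the flow of the analogy explicit: A acts, a delayed copy of the record reaches B, and B's differing "second opinion" feeds back as an adjustment. The delay length, the event stream and the opinion rule are all invented for illustration and carry no physical or neurological claim:

```python
# Toy model of the A/B loop described above: A acts in "true time", B replays a
# delayed copy of the record, and A adjusts its next action whenever B's review
# differs. Every name and rule here is an illustrative assumption, not physics.
from collections import deque

DELAY = 3  # number of steps by which B lags behind A

def review(event: int) -> bool:
    """B's 'second opinion': flag outcomes it judges to have gone badly (toy rule)."""
    return event < 0

def run(events: list[int]) -> list[int]:
    record: deque[int] = deque()   # the "record" flowing from A toward B (RAM-like buffer)
    history: list[int] = []        # accumulated archive of outcomes (ROM-like store)
    adjustment = 0
    for event in events:
        outcome = event + adjustment       # A acts in the true present
        record.append(outcome)
        history.append(outcome)
        if len(record) > DELAY:            # B reviews the delayed copy
            past_outcome = record.popleft()
            if review(past_outcome):       # a differing opinion feeds back to A
                adjustment += 1
    return history

if __name__ == "__main__":
    print(run([1, -2, 0, -1, 3, -4, 2]))
```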
Of course, this same property will apply to gravity. All of the universe's mass, forces and distances that seem tremendous seem so only because they are seen from the top view. The reality is that they in turn are controlled by underlying properties yet to be discovered in physics, occupy just one tiny dimension and are a very small part of something much greater, namely the underlying code by which they are written and operate. Later in this paper an attempt will be made to explain why there may be a scientific basis for this in theoretical physics. In this regard, the "increase" in weight or g-force, such as that observed when an object is accelerated, may in fact not be created by velocity, since distance = 0 and consequently velocity remains zero; this idea of g-forces would only be a top level "illusion" that is both quantifiable and measurable from the top view, but that is easiest to manipulate from the side view. Economics is gripped by a similar illusion. It believes fundamentally in scarcity and the definiteness of economic resources, when in fact the volume of economic resources is not determined by what is observed or what appears available, but by the underlying operating system by which those resources are made available.
The Measurement Problem Explained: A Refresh Rate for Matter & Analytics in Economics
Economics faces the same human constraints in logic that are found in physics, in that what is observed and interpreted analytically may not be what occurs. For instance, it is possible to theorize a scientific basis for the non-existence of time; an attempt shall be made here to explain this view. To make the explanation simpler or clearer, let us begin with a very simple approach: a cartoonist or animator drawing a car can flip pages with drawings to show a car travelling at 100 km/h; however, in reality the image on each page is standing still and has no velocity. Similarly, to begin to understand forces like gravity, scientists may have to learn to accept the non-existence of durational time and rationalize this in physics.
Though the car in the video below is moving at 100 km/h, from the time it started off to the time it reached its destination it was visible in the same frame, and a frame is no different from an entire universe. Since the starting point and the destination are in the same frame, in reality there was no distance between them; neither is there at the smallest point, the quantum level, or the greatest, the astronomical level. If there is distance, there is matter to quantify distance, much like a ruler is scaled to measure, and if there is distance and matter, then time must exist to quantify rates of motion; if motion exists, then it represents time and Relativity Theory applies. All this leads to a Distance-Time or Matter-Time [Time being Motion] or top view of the universe, not a Space-Time view of the universe, which is where Einstein errs. If there is no distance, then there is no motion, there is no durational time, and matter exists, but not as it is understood in the top view; since all matter exists in a single point when observed from the side view, that single point is a single frame or the entire universe. At this stage physics begins to move outside observable phenomena, from existing as matter to existing as information or a type of code.
When this underlying code and how it works is understood, humanity will gain a new threshold in physics and be able to manipulate gravity with the greatest of ease. The image on each page standing still entails that it has no velocity, and yet the object, when observed on refreshed pages, appears to move. This condition creates what scientists call the “measurement problem”. To date physics has failed to explain the measurement problem. As I have suggested here, if matter indeed refreshes, then this very elegantly and comprehensively explains how it can appear to be a particle and a wave at the same time, essentially providing a pragmatic end to this debate in physics. This very simple explanation does not need the concept of a "superposition" of two states when matter is not being measured. There is no need to attribute to active observation or measurement, or to a state of non-observation, an effect on the particle, which is quite weird, or to claim that only conscious beings affect or are affected by this process; if matter refreshes, this problem is solved. If you don't know what the measurement problem is, you can watch this video.
When the frames used to animate an object are examined more closely, it will inevitably be found that the object on each frame is standing still, yet when the flipped pages are observed less closely, or at a distance, the object is animated. This phenomenon also represents the problem experienced in quantum physics where matter appears to be able to exist as a particle or a wave. Since we know that fundamentally the image on each page or frame is standing still, we are able to infer that the movement we observe in science as a “wave form” or velocity, or on the screen as motion or movement, is in fact an illusion, a form of paramnesia or a kind of “trick” of nature. It is a scientific phenomenon which creates empirically verified wave properties, yet without this “illusion” [Einstein’s interpretation or “top view” of the Universe] matter as human beings understand it, and reality as they experience it, would not exist. When the “pages are being flipped” and motion appears to take place through an animated object, matter seems to behave as a wave; however, when “moving” matter is examined more closely, on each page it has no velocity, is in fact standing still and therefore appears to be a particle, explaining the inevitable dilemma of how an object can be moving and standing still at the same time.
Wave-particle duality in physics may in fact be a flawed concept, since waves as they are conventionally understood do not exist, but are more likely a top view illusion, earlier described as a kind of paramnesia, attributed to observation at the top level and induced by motion facilitated by the refresh rate of the universe and therefore the refresh rate of matter; this illusion is what is referred to as the experiential Universe, or that aspect of the Universe people inhabit on a daily basis. It is brought to life by matter being refreshed, thereby endowing it with mobility and free will: since motion is time in Einstein's model, the refresh rate is the origin of both motion and top-view time. The experiential Universe is not an illusion per se; it is a real, flesh and blood world or existence, since it has origins in a particle form. However, the fundamental properties upon which it is created rely on the wave form of matter, which is technically produced by ephemeral or impermanent processes. In other words, even though reality is rooted in the particle form of matter, a particle itself is impermanent. This impermanence can be better understood by appreciating that matter refreshes constantly; in other words, particles must persistently appear and disappear in order for matter to appear capable of animation or motion. Consequently, it can be deduced that to have the ability to move, matter, or the particle from which matter is constructed, must have a third property that is currently unaccounted for in modern physics, and this is the ability to “refresh”. To “refresh” refers to the ability to disappear and reappear, a property of matter currently unaccounted for in modern science, but that inferences show may occur. At this stage we have leaped past Einstein's model of the universe. We are able to begin to understand how we live in a universe without conventional distance, and therefore without conventional Time. This makes it easy to understand how and why the phenomena of quantum tunneling and quantum entanglement occur.
When we see an object moving, no matter how fast, it is in fact never in motion. If the image is closely observed, like a single frame on a movie projector's reel, it is in fact completely still or frozen in place. In order to appear to move, each frame must be successively removed and replaced by another. The refresh rate of matter entails that matter exists and ceases to exist so rapidly that it is difficult to say which state it occupies, leading to a paradox such as that observed in Schrödinger’s Cat experiment: it is dead and alive, standing still and moving, existing and ceasing to exist, all of which appear to defy conventional thinking in physics. However, using the flow of logic thus far, we can dismiss the top view concept that a particle is a wave, since we know distance is a construction designed specifically for perception or, put simply, is experiential, and conclude that matter always is and fundamentally remains a particle. This is why, when observed more directly and scrutinized closely, matter will tend to appear as a particle, while its wave properties are induced by the process of refreshing these static particles or “stills”. Wiki (2010) explains that “Wave–particle duality postulates that all matter exhibits both wave and particle properties. A central concept of quantum mechanics, this duality addresses the inability of classical concepts like "particle" and "wave" to fully describe the behaviour of quantum-scale objects. Standard interpretations of quantum mechanics explain this ostensible paradox as a fundamental property of the Universe, while alternative interpretations explain the duality as an emergent, second-order consequence of various limitations of the observer. This treatment focuses on explaining the behaviour from the perspective of the widely used Copenhagen interpretation, in which wave–particle duality is one aspect of the concept of complementarity, that a phenomenon can be viewed in one way or in another, but not both simultaneously.”[7] Clearly, as we may note here, there is no duality; a particle is never really a wave, in the same way the stills on the frames of a projector are never actually moving. They always remain static on each frame, and consequently remain in “particle” form; therefore the Copenhagen interpretation may be a little misleading.
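To make the flip-book analogy used in this section concrete, the sketch below "refreshes" a perfectly static frame at successive positions: examined frame by frame the object is always still (the "particle" reading), while the sequence as a whole traces an apparent motion (the "wave" reading). The frame width, refresh rate and symbols are illustrative choices, not measurements of anything:

```python
# Flip-book sketch of the refresh-rate idea: each frame is static, yet the
# sequence of frames produces apparent motion. Frame size and rate are arbitrary.
import time

WIDTH = 30          # characters per frame
REFRESH_RATE = 10   # frames per second (the "refresh rate" of this toy universe)

def render_frame(position: int) -> str:
    """Return a single, completely static frame with the 'car' at one fixed position."""
    cells = ["."] * WIDTH
    cells[position % WIDTH] = "#"
    return "".join(cells)

def flip_book(n_frames: int) -> None:
    """Print frames in succession; no individual frame ever moves, yet motion appears."""
    for step in range(n_frames):
        frame = render_frame(step)          # examined alone, the object is still (particle-like)
        print(frame, end="\r", flush=True)  # flipped in sequence, it appears to move (wave-like)
        time.sleep(1 / REFRESH_RATE)
    print()

if __name__ == "__main__":
    flip_book(60)
```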
With physics, on the other hand, the mistakes in fundamental theory feed on one another and therefore emerge one after the other like whack-a-mole: no sooner is a solution found for one than another raises its head.
Einstein and Misconceptions about the Speed of Light
It is my belief that we are entering an era of science in which the public will no longer be able to take seriously a highly qualified physicist from a reputable institution who believes the speed of light is both an invariable constant and an unsurpassable limit. Such physicists must now be viewed as relics being eclipsed by the evolution of the paradigm that defines the very foundations of knowledge in a subject area in which they were once experts, and they must begin to evolve and advance to remain of any relevance to mankind's future.
Dualities in physics are often symptoms of flawed models that do not concisely explain phenomena; they therefore allow contradictory statements or theories ascribed to "relativity" to co-exist, and equations are tailored to suit these weird juxtapositions. The measurement problem is one of these, but we have used a refresh rate to concisely explain how a wave and a particle co-exist. A refresh rate can also be used to explain the relationship between matter (-) and anti-matter (+) and how the two states co-exist by refreshing from one state to the other (i.e. the same matter is alternating states between electron and positron). Matter and anti-matter are very likely the same matter alternating between positive and negative charges; it is either in one state or the other, which explains why it does not explode spontaneously. If charges alternate back and forth between negative and positive in this way during each refresh, it explains why anti-matter can be present but unseen or intangible, making matter appear dominant. This switching back and forth may be fundamental to how matter exists.
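As a minimal sketch of this alternation idea (purely illustrative; the tick-based model and starting state are assumptions, not established physics), the following toy shows a single particle whose charge flips sign on every refresh tick, so that at any sampled instant it occupies exactly one state and the opposite state is never present at the same moment:

```python
# Hypothetical sketch: one particle alternating between a matter (-) and an
# anti-matter (+) state on each refresh tick, as proposed above.  At any
# sampled instant the particle occupies exactly one state, so the two states
# never coexist and never meet to annihilate.

def refresh_cycle(ticks):
    """Yield the particle's charge state at each refresh tick."""
    state = -1  # start in the "matter" state (negative charge)
    for _ in range(ticks):
        yield state
        state = -state  # the same particle flips to the opposite state

if __name__ == "__main__":
    history = list(refresh_cycle(8))
    print(history)                            # [-1, 1, -1, 1, ...]
    # No tick ever contains both states at once:
    assert all(s in (-1, 1) for s in history)
```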
The Measurement Problem Explained: A Refresh Rate for Matter & Analytics in Economics
Economics faces the same human constraints in logic that are found in physics, in that what is observed and interpreted analytically may not be what actually occurs. For instance, it is possible to theorize a scientific basis for the non-existence of time; an attempt shall be made here to explain this view. To make the explanation simpler, let us begin with a very simple approach: a cartoonist or animator drawing a car can flip pages with drawings to show a car travelling at 100 km/h; however, in reality the image on each page is standing still and has no velocity. Similarly, to begin to understand forces like gravity, scientists may have to learn to accept the non-existence of durational time and rationalize this in physics.
This flip book animation demonstrates how the car on every page is standing still [is a particle], the equivalent of matter in the Bose-Einstein Condensate. However, when the artist begins to flip the pages [refresh matter], or the condensate is heated back up, the car miraculously begins to move [it becomes a wave], just as the condensate reappears as the sodium atom, effectively solving the measurement problem.
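A minimal sketch of the flip-book argument, assuming an arbitrary refresh rate of 24 pages per second: each page stores a frozen position with zero velocity, yet flipping through the pages yields an apparent 100 km/h.

```python
# Minimal flip-book sketch (illustrative only): each "page" stores a static
# position for the car; within a page the car has no velocity.  Apparent
# speed only emerges when pages are flipped at a fixed refresh rate.

REFRESH_HZ = 24                       # pages flipped per second (assumed)
SPEED_KMH = 100.0                     # the speed the animator wants to show
METRES_PER_PAGE = (SPEED_KMH / 3.6) / REFRESH_HZ

# Draw the pages: every page is a frozen still with zero velocity.
pages = [{"frame": i, "x_m": i * METRES_PER_PAGE, "velocity": 0.0}
         for i in range(REFRESH_HZ * 3)]          # 3 seconds of animation

# Within any single page the car is standing still...
assert all(p["velocity"] == 0.0 for p in pages)

# ...yet flipping the pages yields the apparent speed.
elapsed_s = (pages[-1]["frame"] - pages[0]["frame"]) / REFRESH_HZ
apparent_kmh = (pages[-1]["x_m"] - pages[0]["x_m"]) / elapsed_s * 3.6
print(f"apparent speed across pages: {apparent_kmh:.1f} km/h")
```

The point of the sketch is only that velocity is a property of the sequence of pages, never of any single page.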
If Einstein explains the Bose-Einstein Condensate as matter in its fundamental wave form, this is misleading. The problem with professors continuing to teach students of physics that the fundamental state of matter is a wave, not a particle, is that this misinterpretation then makes it impossible to explain gravity, because the particle rather than the wave function or form is the conduit for mass or gravity. As we explain later, this mass is not contained in the particle itself, but consists of true Space acting on the particle. It is also important to remember that what physicists see and are able to understand as a "particle" is a top view or Einsteinian description of matter. We know that from the side view there is no particle. There is only some form of side-view code, and when the code is observed from the top view it appears as and behaves like a physical particle. Understanding this procedure explains why and how a wave cannot be more fundamental than a particle or object. Technically, this means physical matter does not exist as is commonly thought (from the top view), and all the physical or sensory reality associated with matter (even mass-energy equivalence in nuclear energy) is created by the interaction of electromagnetism and gravity, rather than the matter itself, which consists of code. In other words, mass, texture and all "physical" attributes associated with matter do not come from matter itself but are simply "attributes" added to the code, contextually, by the interaction of gravity (refresh rate) and electromagnetism, without which matter could not be interacted with, i.e. it would be incapable of being physical (solid), neither could it have mass; even something as simple as picking up a coffee mug would be impossible because it would have no physical attributes or properties. However, if magnetic fields can in turn act on the particle, it then becomes possible to manipulate gravity indirectly using the plasma's mass.
Proof that the fundamental state of matter is a particle, not a wave, was provided empirically by the Bose-Einstein Condensate, achieved using lasers to cool sodium atoms down to 177 nanokelvin. The result is a plasma that on observation is easy for scientists to mistake for a wave form. It is in fact atoms being observed as a fine cloud of fundamental particles or a particle soup, arranged in basically the structural pattern, thumbprint or blueprint of sodium. (It appears to align only because the particles appear uniform, and cascades only because they are aligned under the influence of a magnetic field.) Hypothetically, interfering with this field would alter its "field pattern" or blueprint, causing the plasma to change from sodium into some other substance when it condenses (heats back up). Technically, using lasers to melt substances by freezing them to 177 nanokelvin and below should allow scientists to create any substance or form of matter by altering the pattern of the plasma while still in its melted state and letting it condense into some other form of matter (performing what would be considered the equivalent of the Philosopher's Stone in Alchemy).
Furthermore, this plasma state and the ability to influence it using magnetic fields to create a particle beam possibly offers one of the few plausible methods for using electromagnetism to indirectly control gravity by manipulating the plasma. This works because suspending motion in this manner is an artificial way of slowing down time [motion] (another, safer route to the Hutchison effect), which gives physicists access to the manipulation of the mass of the plasma through electromagnetic fields.
The reason why this aspect of physics has not been pursued and the technology has not made any progress thus far is because, once again, the observation has been misinterpreted, in this case by calling what is observed in the behaviour of the cloud or plasma waves instead of fundamental particles. In other words, the method and technology for controlling gravity and potentially building any substance using fundamental particles as building blocks in manipulated fields has already been developed. The challenge with this type of technology, of course, will always be the method used to suspend motion, which requires extremely low temperatures.
Britannica describes Bose-Einstein condensate (BEC) as "a state of matter in which separate atoms or subatomic particles, cooled to near absolute zero (0 K, − 273.15 °C, or − 459.67 °F; K = kelvin), coalesce into a single quantum mechanical entity—that is, one that can be described by a wave function—on a near-macroscopic scale." Firstly, it is not a single quantum entity; it consists of a cloud of fundamental particles from which the atom is built. Secondly, the wave function is created by a magnetic field; it is not evidence of wave-particle duality, in fact it proves the opposite, which is that fundamentally matter is a particle, not a wave, i.e. the state of matter when the projector's wheel stops spinning (when time stops) and the object in each frame is a particle, as shown in the above image captured of the Bose-Einstein condensate. Physicists to this day are still getting this wrong. In all likelihood, if temperatures fell further, the electromagnetic field itself would collapse, leaving only the plasma without the wave function or pattern that forms the blueprint for sodium.
The atoms when viewed normally in Se-Te (Top View) are viewed as physical objects. However, when the temperature is sufficiently lowered toward absolute zero using lasers, the view changes to St-Tt (Side View) and the particles that make up the atoms become visible. They are uniform. Each particle has mass and is entangled with other particles in the cloud. Though the cloud appears as a single indivisible substance, it should not be mistaken for this, as it consists of fine particles. Entanglement furnishes each particle with information on how it should behave or move, and this makes energy and matter programmable. The entangled particles moving in harmony create the particle cloud or Bose-Einstein condensate. The code, algorithms and other information being fed to the particles through entanglement, which determine how they behave, are what determine the substance they become. The manner in which the particles move in unison makes them appear as though they are a wave or are under the influence of a magnetic field. Any fields should be considered a by-product of quantum entanglement. Quantum mechanics programs the particles, is responsible for communication and for harmonizing the fine particle cloud or Bose-Einstein condensate. Fields are not the origin or source of harmonics. The fundamental state of matter is an individual particle, not a wave. Although waves and fields are useful for explaining what is being observed, it may be more accurate to state that the waves are not being created by fields, but by quantum mechanics. Each uniform particle, at this stage of analysis, can be considered part of the process by which a bit of information is created and processed. |
The image above is the more accurate emulation of an atom. The element can be referred to as a "Tron". It only shows one element in orbit, for simplicity, to make it easier to understand. Though there is only one Tron in the diagram, it remains plausible that there can be a number of Trons and orbits in an atom, each dipping into and out of the nucleus at the same T3 neutron nexus. There can be Trons in the same and different orbits moving as electrons while some are moving as protons in the nucleus, and they exchange places as they travel into and out of the nucleus, switching polarity as they do so. When the Tron is at T1 it is described as having a negative charge and is therefore called an "elec-tron". When its orbit reaches T3 it reverses polarity. During reversal or the process of switching polarity it has no charge and may be referred to as a "neu-tron"; when it is at T2 it has reversed polarity and is described as being a "posi-tron". It cycles back into orbit to T1, repeating the process. It moves under its own internal power directed by quantum mechanics. |
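The proposed cycle can be summarised in a small sketch (illustrative only; the labels T1, T2 and T3 follow the diagram, and the charge assignments follow the text rather than standard physics): a single Tron that reads as an electron in outer orbit, momentarily as a neutron while reversing polarity at the nexus, and as a positron inside the nucleus.

```python
# Hypothetical sketch of the "Tron" cycle described above: one particle
# cycling T1 -> T3 -> T2 -> T3 -> T1, changing its apparent identity with
# its position.  Names and charges follow the text, not standard physics.

CYCLE = [
    ("T1", "elec-tron", -1),   # outer orbit, negative charge
    ("T3", "neu-tron",   0),   # crossing the nexus, momentarily no charge
    ("T2", "posi-tron", +1),   # inside the nucleus, positive charge
    ("T3", "neu-tron",   0),   # crossing back out
]

def tron_state(tick):
    """Return (location, apparent name, charge) of the single Tron at a tick."""
    return CYCLE[tick % len(CYCLE)]

if __name__ == "__main__":
    for t in range(8):
        loc, name, charge = tron_state(t)
        print(f"tick {t}: {name:9s} at {loc} (charge {charge:+d})")
```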
MRI scan of an atom. The dimple in the atom, which can vary in depth and width, corresponds with "Trons" dipping into and out of T1-T3-T2 orbits. The scan is 3 dimensional and it fits what is predicted by T1-T3-T2. See Magnetic resonance imaging of single atoms on a surface. |
When it comes to quantum mechanics, the purpose of energy bands becomes clearer using the new construct. The diagram above shows that the orbital energy bands are designed to repel any Trons in orbit. Electrons that are negatively charged will be compelled to exit a negatively charged energy band in external orbit at C. When in external orbit they will therefore be repelled until they find the nearest exit, which is the positively charged energy band or orbit in the nucleus. However, when Trons enter the nucleus they become positively charged. A consequence of this is that the positively charged energy band in the nucleus pushes them away and they once again have to look for the nearest exit, which is back into external orbit at C. However, when they enter external orbit they become negatively charged electrons and are once again pushed along until they find an exit. This continuous cycle inevitably creates a "motor" or the energy observed and referred to as atomic energy. The fact that this energy can be harvested in a reactor entails that this design not only creates a "motor", but also in the process creates a "dynamo" or "generator", basically a reactor. The energy bands act as power lines feeding Trons with momentum that pushes them along and keeps them circling. Trons dipping into and out of the nucleus do not give off radiation or photons because they are riding energy bands into and out of the nucleus at constant velocity. When the atom is observed the process of dipping into the nucleus is not obvious and it appears as though the electrons simply maintain a round or circular orbit, which is the classical manner in which they are depicted. Understandably, the Trons are moving at such high velocity that when observed it will appear as though they are standing still and have a permanent residence in the nucleus and in orbit, when in fact they are constantly on the move throughout these locations, making them fundamentally impermanent. The Tron likely has a standard mass. The only reason why the proton is heavier than the electron is that when the Tron circles into the nucleus its rate of acceleration increases due to travelling a shorter circumference, and this makes its mass greater than when it is in outer orbit as an electron. Importantly, it explains why Trons appear to never run out of energy and collapse into the nucleus, as is expected for a Hydrogen atom for example. Rather than collapse, they intentionally dip or dive into the nucleus; this design entails that the energy bands act as accelerators. It also explains why neutrons are difficult to find: Trons remain neutral for only the shortest duration. They are riding energy bands where they circle into and out of the nucleus. These energy bands (from which quantum mechanics gets its name), by repelling Trons both in the nucleus and in external orbit, keep feeding them with energy or momentum with which they amplify mass at T4 and sustain their orbital movement. The energy bands in quantum mechanics are simply "fields". This complex explanation is necessary when the atom is viewed as having electrical charges and magnetic fields. However, all this complexity can be dropped if the Trons, which at this stage can be compared to graphical "sprites", are simply programmed to move in the manner observed, and therefore orbit under their own power, which may very well be the case.
In addition to this the path they take may be determined by underlying code, in which case the energy bands or "quantums" of quantum mechanics, like "fields", do not exist, as they are nothing more than by-products of processing taking place hidden below the size and scale of electrons where the BE condensate or fine particulate constructs an atom - see the animation below. |
The Collision Drive: harnessing gravitational force
By advancing the understanding of how gravity works from Newton to Einstein to autonomous matter, using the "fields do not exist" method applied through brute-force analysis, it is possible to engineer an apparatus and method that emulates how the Gravitron in atoms, shown in the diagrams above as T4, generates gravitational force. This apparatus and method is called the Collision Drive. The Gravitron is the Graviton or "Higgs Boson", and the mechanism replicates and generates a gravitational force at T4 that can be pointed in any direction to create propulsive force.
The Collision Drive uses Mechanical Engineering to emulate the
force at T4 creating what can be referred to as entry level
or tier 1 gravitational force
When analysing the mechanics of an atom it is important not to be misled by terminology. A certain degree of flexibility is required. For instance, descriptions such as negative and positive charges and energy bands or a neutron can be useful when trying to explain how particles behave. However, a flow of electrons may be, in terms of mechanics, no more complicated than the flow of water in a downward sloping river. There are implications when it is believed that negative and positive charges actually exist when in fact these are just different directions, types or stages of acceleration, or that there are "permanent" electrons, neutrons, protons and gravitons in an atom when in fact these are just Trons momentarily in different parts of an atom, or that orbits are circular when in fact they are curved into and out of the nucleus. To use analytical brute force to unlock the secrets behind these problems we can then say "electrons, protons, neutrons and gravitons" do not exist in order to force a different explanation, approach or dynamic onto the analysis, when what we are in fact saying is that all of these components of an atom are created by Trons. Just like the "fields do not exist" method of analysis, it can be said electricity and electrical charges "do not exist" because they are just aspects of the mechanical nature, rather than the "electrical" method, in which an atom is thought to operate. Terms therefore have to be handled dexterously to avoid interpretations being limited by how any terminology is used, framed, defined and applied, as the very process of defining can hinder the capacity to make deeper and more accurate inferences. Therefore, we need to be careful not to lose the capacity to think and solve problems outside the definition when scientific terminologies are created.
If all matter is in fact created from light as a fundamental particle infinitesimally smaller than an electron that can be observed in the Bose-Einstein condensate, then it may be necessary to revisit the structure and nature of light itself. If magnetic fields are dismissed using a "fields do not exist" approach, it may be necessary to consider that light is made up of a uniform particulate that propagates through transistors (Space or St), not fields (Se or Geometric Space), as is observed by the naked eye and experienced. If light is controlled by transistors, not electromagnetic fields, it means that it is inherently non-contiguous and does not move from one place to another, but uses a stimulation process that only makes it appear to travel. Light can behave as both a wave and a particle. However, we have seen that light waves can exist without the need for magnetic fields by being non-contiguous and controlled instead through entanglement, and made to produce "light waves" moving at 299,792,458 m/s when in fact light does not travel this distance or move at all (see the animation below). By generating light waves these transistors are not only able to make light appear to travel, but also to generate depth, width and motion (where motion is mistaken for Time in Se-Te). For this process to be understood it must be accepted that light in general consists of a fine uniform particulate as observed in the BE condensate. This particulate is turned into waves using a similar process to the way electrons are used in semiconductors to process information.
In this great video, Dianna Leilani Cowern (Physics Girl: follow the link to her
awesome YouTube channel) uses an experiment
with a tone generator, mechanical vibrator, plate and sand to
create patterns from sand. The sand can be compared to
the smeared atom or fine particulate in the BEC. The black plate
represents the medium for entanglement connecting all the particles together.
The pattern created by the plate is code that designs the electron.
This is the view of the atom in 3 dimensions, that is, the atom
at close to absolute zero or in the absence of time (without the curvature of Space-Time). When the same pattern is viewed in 4 dimensions, at room temperature what is seen
is the atom instead of the smear. Her experiment is identical to how it is predicted
the atom is created from the Bose-Einstein condensate in diagrams above.
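For readers who want to see why the sand settles into a pattern at all, here is a minimal sketch of the standing-wave picture behind the experiment, assuming an idealised square plate vibrating in a single (m, n) mode: sand is taken to collect wherever the vibration amplitude is close to zero, and the printout marks those nodal lines.

```python
# Illustrative sketch of the sand-on-a-plate experiment: for an idealised
# square plate driven in one (m, n) mode, the vibration amplitude is
# approximated by a product of sines; sand collects where the amplitude is
# close to zero (the nodal lines).  Real plate modes are more complicated.
import math

M, N, SIZE = 3, 2, 40          # mode numbers and grid resolution (assumed)

def amplitude(x, y):
    """Idealised mode shape of a square plate, with 0 <= x, y <= 1."""
    return math.sin(M * math.pi * x) * math.sin(N * math.pi * y)

for j in range(SIZE):
    row = ""
    for i in range(SIZE):
        a = amplitude(i / (SIZE - 1), j / (SIZE - 1))
        row += "#" if abs(a) < 0.08 else "."   # '#' marks where sand settles
    print(row)
```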
Familiar structures: How the BEC designs orbits and atomic structures |
If you can understand how this system works then it should be clear why nature's processing speeds are faster than any microprocessor in the world today. It is clear nature uses 1s and 0s to read and store information, but not to process it. To process information and find the solutions to equations and calculations, the 1 and 0 system used by computers is far too slow. Even if the fastest microprocessors in the world were clocking 10 GHz and above, this would seem infinitely slower than the method nature uses to process information. Earlier it was noted that nature seems capable of communicating 10,000x faster than the speed of light. Processing at these speeds and more provides a clue as to just how much more efficient this method is. The fastest computer in the world today could not keep up with this method. In the same way that digital is better and faster than analogue, this system is faster and better than digital computing methods. This method is not based on absolute answers, but on the highest probability of the solution being correct. Computing at these high speeds explains how different kinds of entanglement are possible. It is very interesting. Can you see how it works?
Light waves and the fine particulate in the BEC (estimated at 1 trillion transistors per electron) are controlled through entanglement by transistors to create electrons and atoms, and are non-contiguous. This means the waves are created by fine particles remaining in place and transferring energy, using the Mexican Wave method. When this is observed it appears as though light is traveling from one location to another, when in fact it is being transferred, not propagated. The fine particles in the animation are not flowing from one location to another; by staying in place, rising and falling in synchrony, transistors can create the illusion of distance and movement from location to location. Light does not travel from the sun to earth. It is simply stimulated. This implies that light itself, like any other substance, is in this sense created by natural transistors. All 4 Space-Time (Se-Te) dimensions, that is, 3 directional movement plus Time, are created by these transistors (St-Tt). The fine particulate moves and forms waves as though they are in a field when in fact this is not the case, as their movement is being controlled by entanglement, shown by the red line vector in the animation above. |
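A minimal sketch of the Mexican Wave picture described above (the particle count, wavelength and time steps are arbitrary assumptions): each particle keeps its horizontal position and only rises and falls in place, yet the crest of the wave appears to travel along the row.

```python
# Illustrative "Mexican Wave" sketch: each particle keeps a fixed position
# along the row and only rises and falls where it stands, yet the crest of
# the wave appears to travel from left to right.  Nothing flows along the row.
import math

N_PARTICLES = 60
WAVELENGTH = 20          # particles per apparent wavelength (assumed)
STEPS = 5

def height(i, t):
    """Vertical displacement of particle i at time step t (it never moves sideways)."""
    return math.sin(2 * math.pi * (i - t) / WAVELENGTH)

for t in range(STEPS):
    crest = max(range(N_PARTICLES), key=lambda i: height(i, t))
    print(f"t={t}: apparent crest at particle {crest} "
          f"(every particle's horizontal position is unchanged)")
```

Running the sketch shows the crest index advancing by one particle per step even though no particle ever changes its place in the row.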
If the basic method for differentiating waves is wavelength or size, then we must conclude that the fundamental particles in the BE condensate behave like electromagnetic wavelengths (see the wave motion in the animation above) only because they are being controlled by natural transistors when they propagate, using entanglement to apply the 1s and 0s of code, not electromagnetic fields, which like the blue line in the animation above are just a by-product of the process. The animation demonstrates that "fields" and waves are a by-product of the manner in which entanglement orchestrates the movement of fine particles. The wave form is non-contiguous. It is created by transistors orchestrating the movement of fine particulate, not magnetic fields, although the movement when observed will be attributed to the presence of a field or waves propagating from one location to another, just like the blue field line in the animation above. There is no field line; this is just a misleading observation. The two bodies negotiate movement around one another using the red line vector or entanglement seen in the animation. Entanglement is linked to the presence of transistors. Earlier we proposed that a single electron can be made up of a density of 1 trillion transistors, with true Space (St), seen in the manner in which the BEC acts as a semiconductor mechanism, processing and controlling light or the fine particulate to produce atoms. This process governs the uniform particulate that produces atoms of different size and therefore of different mass that propagate on a measurable scale. In other words, in the same way that there is visible and invisible light on the electromagnetic spectrum, there must also be visible and invisible atoms or matter on what can be described as an electroparticle or matter spectrum currently unaccounted for in physics. This fine light particulate, when manipulated by natural transistors, forms complex matter and organisms and is endowed with mass at T4 to give it the physical properties observed in material objects. T4 is simply tier 1 gravity, or 1st generation gravitational force, the production of which has already been made possible and worked out using a collision drive. Shortly, how to emulate the St transistors will be similarly figured out; like tier 1 gravity this need not initially be a complex process. They are part of how the BEC manipulates uniform particulate to create atoms that in turn create the physical universe, except that in the BEC natural transistors processing uniform particles, which appear as light or photons but are actually made up of fine particulate infinitesimally smaller than electrons (est. 1 trillionth the size of an electron), are currently not accounted for in physics. (See the diagrams above that describe how a Tron creates atoms and amplifies mass at T4.)
Wikipedia describes the above chart as "Wavefunctions of the electron in a hydrogen atom at different energy levels." As mentioned earlier, this description is of course another commonly accepted and propagated mistake in physics. These images represent homogeneous particle clouds or "plasma", not waves. The patterns are created by electromagnetic fields which define the characteristics of the substance they create. This plasma consists of fundamental particles at the quantum level. Each of these particles moving as though they are a single entity has mass. This mass comes from true Space. (We know that when Einstein refers to space he actually means geometric space, and when he refers to time, it is not genuine time, it is motion.) Since the mass in the particles comes from true Space and the patterned fields are electromagnetic, this is the only level at which gravity and electromagnetism intersect. This intersection is important because it allows electromagnetism to control gravity through the mass of fundamental particles. This is a derivative of quantum mechanics because controlling gravity in this way involves the acceleration of the mass inherent in these plasma clouds and is mechanical in nature. It is not a direct control of gravity itself, which requires operation through true Space rather than the particle. [When observing phenomena, even at the quantum level, physicists need to retrain themselves to understand what they are seeing. They have to observe from the 5th dimension, not Einstein's 4th dimension. To understand the universe from the 5th dimension it becomes necessary to subtract the lower dimensions (1st to the 4th). This entails removing distance, motion, time etc. Through the subtraction of these the observation takes place from the 5th dimension (side view). Here mass, volume, weight and energy do not exist or are not the same as they are seen and experienced from the top view (4th dimension), in the way it is between software and hardware: an image on the screen is different from the underlying code from which it is written. This way what is observed can be correctly interpreted. For instance, instead of structure, lines; instead of patterns, code; instead of energy levels, markers.] |
Professor Jim Al-Khalili explains how everything
can be described using information
Mistaking particles moving in formation under the influence
of a magnetic field for waves. The atoms are made up of fine particulate clouds.
Firstly, what is being observed are particles from which atoms are constructed (forming a fine particulate cloud or plasma), not waves. The problem is that atoms are currently thought to be particles, when in fact the atom is not the final particle. It is in turn constructed from the particle cloud. These particles appear to move in unison because they are uniform and under the influence of a magnetic field, not because they have literally become waves. This is a misinterpretation of what is being observed. This mistake is easy to make if an observer believes an atom is the [smallest] particle, when in fact the atom itself is constructed from the particulate cloud or Bose-Einstein condensate. If this is true then technically, by advancing this very same laser-based technology, not only can any form of matter be designed and built from the ground up by changing its blueprint, but that very same plasma, under the influence of electromagnetic fields, should also be able to indirectly control gravity. This could be described as the equivalent of 3D printing at the subatomic level or on the quantum scale. The only problem with this method is that it cannot be done at room temperature. However, achieving this should simply require a little innovation. Each atom is constructed from a cloud of fine uniform particulate. Each particle in the cloud is acting in unison with other particles by communicating directly with them through quantum mechanics to perform a dance, so to speak, that appears as the wave form being observed, which is in turn thought to be under the influence of a field, when in fact the movement of each particle and the cloud occurs through quantum mechanics, most likely by the particles being entangled.
Electromagnetic Fields Do Not Exist
If the fundamental state of matter is a particle and not a wave, it requires us to reflect on the ultimate validity of electromagnetic fields. This means that the entire superstructure of physics where electromagnetism is concerned sits on a fallacy or misinterpretation of observations concerning electromagnetic fields. I have for a very long time viewed the interpretation of what electromagnetic fields are with some scepticism. The greatest weakness of modern-day physics is misconception, misperception or incorrectly interpreting observed phenomena, a problem that affects the work of even the most renowned physicists in history. We see this with Einstein's Space-Time, when fundamentally Space exists outside of Time, making the basic description of his own theory an oxymoron. The same kind of mistakes may apply to our understanding of electromagnetic fields, which affects a host of research outcomes from renowned figures such as Faraday, Maxwell, Schrödinger, Dirac and so on.
Magnets of opposite poles attract; magnets with similar poles oppose one another; magnetic fields can be drawn using lines of force with iron shavings. Try to force two similar magnetic poles together and the resistance is potent; the magnetic field feels almost fluid-like in your hands as the two magnets oppose one another. However, the reality is more likely that there is no "field" whatsoever between the two opposing magnets. The "force" that is created that pushes the magnets apart is not a field at all. It is merely acceleration taking place within the magnetic material. In other words, the two magnets (in and of themselves) are accelerating against one another, in opposite directions. There is then no fundamental difference between how magnetism and gravity work; both are merely forms of acceleration.
Let’s say at every conference a person hosts, as soon as he arrives, he asks all the people to stand up and all those sitting in the front row to form a circle around him, facing him so they can introduce themselves, then everyone sits down. Doing this then becomes a general rule at all his conferences. Would you then say as soon as this person enters a conference, he exerts a magnetic field? Those sitting in the front row are pulled or drawn by this field to him while the rest only stand because they are further away? No. Yet this is exactly how modern physics describes the action of “force” or “fields”. There is no field. Everyone in the room is following a rule or script (equations and laws in physics and code or algorithms in software). They are moving into position not as a result of an external motive force, but by their own internal energy and movement. In other words, something is taking place inside the magnetic material, not around it. This points to the fact that all matter contains this electromotive ability or internal capacity for motion observed in magnets. Even when iron shavings are used to draw a magnetic “line of force” or flux lines, this is an illusion or delusion because each individual iron shaving is moving itself into position, it is not being moved into position by the force exerted by a field. In other words, the entire process consists simply of internal acceleration. Essentially most of what you've been taught in physics about magnetism, as early as secondary school or high school, all the way into university, to this day, is misinformed.
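To make the "rule or script" reading concrete, here is a minimal sketch in which every filing computes its own orientation from one shared rule and rotates itself into place; the textbook dipole direction is used purely as the agreed-upon script, and nothing in the sketch pushes any filing from outside.

```python
# Illustrative sketch of the "shared script" reading above: every filing
# works out its own orientation from the same rule (here the textbook
# dipole direction, used only as the script) and turns itself into place.
# In this sketch nothing external pushes the filings; each acts alone.
import math

def script_direction(x, y):
    """The shared rule: the orientation a filing adopts at point (x, y).

    Uses the planar dipole pattern (moment along the y-axis) purely as the
    agreed-upon script, not as a force acting on the filing."""
    r2 = x * x + y * y or 1e-9
    bx = 3 * x * y / r2 ** 2.5
    by = (2 * y * y - x * x) / r2 ** 2.5
    return math.atan2(by, bx)

class Filing:
    def __init__(self, x, y):
        self.x, self.y, self.angle = x, y, 0.0

    def orient_itself(self):
        # The filing rotates under its own action, following the script.
        self.angle = script_direction(self.x, self.y)

filings = [Filing(x * 0.5, y * 0.5) for x in range(-3, 4) for y in range(-3, 4)
           if (x, y) != (0, 0)]
for f in filings:
    f.orient_itself()
print(f"{len(filings)} filings oriented themselves; the familiar pattern emerges.")
```

The resulting arrangement is indistinguishable from the textbook "field line" picture, which is the point of the analogy: the pattern records the rule each filing followed, not a force applied to it.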
One way of looking at this, is that when a child takes a magnet and brings it close to another magnet on a smooth table and that magnet is propelled away, it can be interpreted that the magnets exchanged information and the magnet on the table propelled itself away from the approaching magnet. The entire process can be viewed superficially from the position of cause and effect. However, for this to be true how did the magnet on the table know it was required to move? The current response is it was pushed by a magnetic field or force. This is inaccurate because we can now deduce there is no magnetic field or force. The two magnets are communicating with one another at the quantum level and using a quantum level process to create internal self energized motion. If this is true then what implications does this transfer of information between magnets or communication have for quantum computing? If magnets are already using some form of quantum computing by using entanglement to control their movement, behaviour or reaction through communication with other magnets or magnetic materials based on an internal algorithm, then why is quantum computing so difficult when simple objects and materials like magnets and metals are already using it in place of physical contact that is required to validate cause and effect?
This video makes an observation about magnetic force
that is ingenious
This then requires physicists to look deeper into how these magnets are interacting. Each opposing magnet is not being pushed away by the other magnet. It is in and of itself moving, in such a manner that it only appears to be under the influence of a "field". There is no field. When a pole is reversed and a magnet jumps up, dashes across a table and attaches itself to the other magnet, it is not being attracted by a "force", but is in and of itself providing the energy, momentum and movement or acceleration with which to jump and attach itself to the other magnet. There is no field. It is difficult to shake the notion that permanent magnets somehow behave like powerful transistors and emitters capable of naturally storing, exchanging and acting on information in the form of algorithms to create magnetic effects and reactions. The perceived "magnetic field", though technically not a force, seems to present evidence of this being the case. To postulate that electromagnetic fields do not exist also requires us to revisit our understanding of electricity, how it works, as well as the function of positive and negative charges and what they really are. When a magnet moves, this seems to be a response to or execution of code (in the form of an algorithm). The evidence that the magnet is moving as a result of an algorithm or code being executed is the appearance of what are observed as fields or field effects. Fields appear due to the fact that when nature initiates an action through code, matter will move as fast as it can to obey the instruction, and the "field" appears because processing speeds exceed the physical frame rate of matter or what can be technically described as the "reality" experienced by human beings. For instance, if the attribute that governs the location of an object in 3 dimensional space is changed using code, the relocation from one place to another will occur instantaneously through Space (real Space, not distance), sidestepping the slower speeds of cause and effect; the resulting disruption to reality will be observed and described as a field. This implies that quantum computing may be the means that offers access to Space and may provide the technology with which it would seem possible to hack reality itself. It makes sense that Nature would create reality from code, as this represents the most resource-efficient way of generating a universe. This direction is not in conflict with religious beliefs [even if this approach may bring into question certain values]. In fact religion seems to be ahead of science, as referred to earlier, since long before science developed "code" religion had already identified it as "spirit", therefore alluding to the fact that it was and is already aware of this being the nature of reality. For instance, religion is already aware that spirit, reality or matter (code) must obey instructions (faith); for example, telling a mountain to "be removed and be cast into the sea" perfectly demonstrates how instructions given at the sub-quantum level (code) will generate results that appear to take place outside cause and effect. The ability to speak directly to spirit and instruct it is therefore no different from a programmer accessing and instructing data, or basically coding.
However, religion, which is yet again way ahead of modern science, appreciates that this access requires knowledge and a unique kind of effort the nature of which modern science does not as yet understand and is still too backward to grasp: "However, this kind does not go out except by prayer and fasting." Nature appears to use quantum tunneling in both biological and plant-based processes, for instance in photosynthesis. Quantum tunneling can itself be easily explained using code, with the detection of "waves" or "fields" not being the cause but evidence of changes in code that affect attributes of matter or particles related to their location.
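A minimal sketch contrasting the two pictures in this passage (a toy model only, with hypothetical names): cause-and-effect motion steps a particle through every intermediate position, while the "edit the attribute in code" picture simply rewrites the location, which is how the passage reads quantum tunneling.

```python
# Illustrative contrast of the two pictures above (purely a toy model):
# cause-and-effect motion visits every intermediate position, while the
# "edit the attribute" picture rewrites the location directly, which is
# how the passage interprets quantum tunneling.

class Particle:
    def __init__(self, position):
        self.position = position

def move_by_cause_and_effect(p, target):
    """Step through every intermediate position (the slow, top-view route)."""
    path = []
    while p.position != target:
        p.position += 1 if target > p.position else -1
        path.append(p.position)
    return path

def relocate_by_code(p, target):
    """Rewrite the location attribute directly (the passage's tunneling picture)."""
    p.position = target
    return [target]

a, b = Particle(0), Particle(0)
print("cause and effect:", move_by_cause_and_effect(a, 5))   # [1, 2, 3, 4, 5]
print("attribute edit:  ", relocate_by_code(b, 5))           # [5]
```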
The circular magnet hovering above the pad is not
suspended in mid air by a magnetic force field, that
force does not exist. It is suspended in mid air by itself,
that is by changes taking place at the quantum level,
in exactly the same way that gravity only appears to suspend the earth in Space.
Gravity and magnetism are therefore both simply forms of acceleration.
When the circle or disc in the video is forced down it appears as though there is a magnetic
field (cushion) resisting this downward push by the hand. Modern day physics
as high up as colleges, universities and engineering or science and technology
institutes teach and interpret what is observed as such. This is actually wrong.
The circle or mass within its own matter independently resists the downward
push of the hand. There is no field or field "force" acting on the circle,
this is a modern misdirect or misinterpretation of what is being observed.
Physicists are misdirected by the belief objects cannot independently suspend themselves
in mid air or space. To appreciate this requires a counter intuitive approach to understanding
this phenomenon. A bird flapping its wings requires air to fly or glide. A bird cannot
fly in a vacuum. However, any material be it magnetic or non-magnetic can fly or float, even in a vacuum on nothing more than the interaction of the subatomic particles within its own mass without
the need for a "field" or "force" to act as a means of cause and effect e.g. like an [air] cushion or field. Levitation is a fundamental property of all masses. Physicists can work out the mathematics
and algorithms required to do this. The only reason it has not been done
is simply because self levitation or internal propulsion stimulated or occurring independently within any given mass appears to defy logic, when in fact it does not. When the "fields do not exist" method
of analysis is applied, how planets orbit and remain suspended in Space is
fundamentally no different from how magnets behave. This
property makes magnetism and gravity simply aspects of acceleration at
the quantum level. There is no mystery to make this problem unsolvable.
[As you've probably guessed by now the algorithm is solved, done and
has been applied.]
What is one of the major implications of the "fields do not exist" approach? If magnetic fields do not exist then one of the major inferences is that energy itself, not just information can be transferred wirelessly through quantum entanglement. Since entanglement ignores distance and is instant this widens the scope of wireless charging beyond anything currently considered. It means a power source and the machine or device being charged can be in different places anywhere in the world or across vast astronomical distances without proximity being a requirement for any device or machine's access to power. This implies that through entanglement a power station on earth could supply power in real time to anywhere on earth, but it could also do so just as competently to a colony on the moon or on mars. Supplying electricity using entanglement would be precise, as it could be to a specific entangled device, machine or vehicle, within a specific range, metered and precisely controlled which makes it more attractive than Nikola Tesla's attempt to broadcast power. In essence it represents a new kind of smart programmable energy that has diverse applications and can be supplied irrespective of the distance between the power station supplying the electricity and the client. These are some of the broad range of benefits of switching from trying to understand and control energy through magnetic fields, to doing so through quantum mechanics.
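As a rough sketch of the bookkeeping such "smart programmable energy" would imply (entirely hypothetical; the PowerStation class, its methods and the device names are illustrative inventions), note that distance never appears anywhere in the model, only a registry of paired devices and a meter for each:

```python
# Hypothetical sketch of the metered, device-specific supply imagined above.
# Distance never enters the model; delivery is just bookkeeping against a
# named, notionally "entangled" device.  All names are illustrative only.

class PowerStation:
    def __init__(self):
        self.meters = {}                    # kWh delivered per paired device

    def pair(self, device_id):
        """Register a device (stands in for establishing entanglement)."""
        self.meters.setdefault(device_id, 0.0)

    def deliver(self, device_id, kwh):
        """Meter a delivery; note that no distance term exists anywhere."""
        if device_id not in self.meters:
            raise KeyError(f"{device_id} is not paired")
        self.meters[device_id] += kwh
        return kwh

station = PowerStation()
station.pair("rover-on-mars")
station.pair("factory-on-earth")
station.deliver("rover-on-mars", 12.5)
station.deliver("factory-on-earth", 800.0)
print(station.meters)
```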
When a superconductor is cooled such that it is able to levitate,
quantum lock or travel along a non-existent
"magnetic field", the reality is that this levitation, movement and locking effect take place
within the superconductor itself. Cooling the superconductor may
simply alter the acceleration of particles
within the superconductor itself.
An explanation for this behaviour without fields is quantum entanglement. When two magnets or magnetic materials resonate they become entangled. Without magnetic fields, how electricity is viewed may need to be reviewed, for example, by viewing positive and negative charges as bits of information (ones and zeros) rather than charges, and electrical power or the "flow of electrons" and electrical effects such as heat, light and so on as being just a by-product of exchanges of information between materials, more than they have to do with electrical power. That is, electrical energy, gravity and magnetism are all merely a by-product of materials exchanging information through processes related to quantum entanglement.
In other words, if fields do not exist, then technically, neither does electricity, as it is just a by-product of exchanges of information between materials at the level of quantum mechanics. Since the focus is on the utility value of energy, emphasis on the unseen exchange of information, the science of which is more important to quantum computing, is overlooked. The analogy "information is power" may be closer to the truth than is thought. The only attribute that separates gravity from magnetism is the quality of resonance between different materials. Magnets resonate when in proximity and consequently begin to exhibit the push and pull effect mistakenly attributed to fields or exchanges of information, or mistaken for electrical energy. Resonance is therefore what predetermines entanglement and whether there will be a magnetic or gravitational response from within an object's mass. This implies that there are different levels or types of quantum entanglement, possibly created from exotic combinations of elements and minerals. Human beings are acquainted with resonance. It is the powerful feedback experienced whenever two frequencies overlap and consequently, involuntarily, create a loop. This loop is a basic algorithm. For example, resonance can be caused by the acoustic setup of a microphone, amplifier, speaker and guitar which creates positive loop gain that causes powerful feedback. Some materials can use resonance to become entangled regardless of distance, some can become entangled regardless of substance to take on gravitational properties, while other materials resonate when in close proximity and take on attributes of magnetism. Resonance, which is observed in the alignment of domains, creates or is affected by entanglement, which alters the behaviour of atoms and generates an internal motive or electromotive reaction that is observed as magnetism or electricity. The same applies to gravity. No fields are involved. Fields are only descriptive of how the gravitational effects and magnetic materials will behave. This means this behaviour is not based on a "force", but on an algorithm, much in the same way an engineer designs software to encode how a program behaves. Equations to do with fields or magnetic force are no different from loops or algorithms. Fields become apparent or observable when changes in code take place and are executed, e.g. a magnet moves, pulls or pushes, and cutting magnetic flux lines generates a charge. This is how electricity and movement take place [magically] without there appearing to be an external cause (action at a distance). In other words, this tends to imply that gravity and magnetism are actually one and the same. They are both caused by acceleration, with motive force operating and originating internally within masses, not by fields. Neither magnetism nor gravity are created by "fields"; therefore both mass and motion are not created directly by fields (see the example later where blue spandex, when stretched, is used to explain to science students how a field creates gravity, and how these students are actually being misled by the teacher, as this is a misinterpretation of what is being observed). When Special Relativity is used to explain away the fact that "magnetic fields" or "magnetic force" do not exist, this becomes a significant misdirect and the truth is lost.
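A minimal sketch of the feedback-loop analogy used here, with arbitrary gain values: a signal that re-enters the loop on each pass grows without bound when the loop gain exceeds 1 and dies away when it is below 1, which is the "powerful feedback" resonance is being compared to.

```python
# Illustrative sketch of the audio feedback loop used as an analogy above:
# a signal that re-enters the loop each pass grows when the loop gain
# exceeds 1 and dies away when it is below 1.

def loop(signal, gain, passes):
    """Return the signal level after each pass around the loop."""
    levels = [signal]
    for _ in range(passes):
        signal *= gain
        levels.append(signal)
    return levels

print("gain 1.3 :", [round(x, 2) for x in loop(0.1, 1.3, 6)])  # runaway feedback
print("gain 0.8 :", [round(x, 2) for x in loop(0.1, 0.8, 6)])  # dies away
```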
Simple but fundamental mistakes like this that prevail in physics to this day have dire consequences when it comes to the capacity of scientists to conclusively understand gravity or how it relates to electromagnetism. We see similar intellectual and cognitive impasses in economics where technocrats running and managing important organisations are unable to see or understand losses to subtraction that erode as much as 100% of GDP per annum, where they tend to focus on or become consumed by non-essential or less critical economic problems which cause far less damage to modern economies and their inhabitants.
What are electromagnetic fields then? If it is true that all mass is inherently capable of movement, then it is quite easy to explain what fields are. Fields can be observed everywhere. For instance, when you are watching a game of football on television, if the rate of motion exceeds the frame rate [something comparably evident in the phenomenon of frame dragging] a blur forms on the screen, for instance when a footballer is running or kicks the ball. To deduce that the blur (e.g. magnetic field) on the screen propels the athlete's movement, as is currently the case in physics where magnetism is concerned, is a misrepresentation or misinterpretation of what is being observed. A ball can rise and remain suspended in mid air or space. It is not held in that position by a field. It is kept in that position by properties within its mass, generated by acceleration using quantum mechanics. Raise a hand in front of your face and move it rapidly from side to side; it creates a blur, i.e. a field, which is not causing the hand to move, but is merely a byproduct of the hand's movement. Fields simply trace or record movement; they are not a force that causes movement or a source of movement itself. Similarly, magnets, magnetic filings, auroras, even a compass needle aligning itself to true North with the earth's electromagnetic field, are not being moved by the field (the blur); they are being moved by activity at the subatomic level of their own mass or matter, ascribed to quantum mechanics. It is important to note that though this explanation may reveal that both gravity and magnetic fields are simply created by acceleration, it should not be forgotten that mass itself is merely an attribute of code or true Space (St). This generally means that mass and acceleration will do whatever St codes them to do.
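A minimal sketch of the blur analogy, with an assumed frame rate and object speed: during one frame's exposure a fast object sweeps across several pixels, so its light is smeared over all of them; the blur records the motion rather than causing it.

```python
# Illustrative sketch of the "blur" analogy above: during one frame's
# exposure a fast object sweeps across several pixels, so its brightness is
# smeared over all of them.  The blur records the motion; it does not cause it.

FRAME_RATE_HZ = 25          # frames per second (assumed)
SPEED_PX_PER_S = 300        # the object's speed in pixels per second (assumed)

exposure_s = 1.0 / FRAME_RATE_HZ
pixels_swept = SPEED_PX_PER_S * exposure_s          # width of the smear
brightness_per_pixel = 1.0 / pixels_swept           # the light spread out

print(f"smear width : {pixels_swept:.0f} pixels")
print(f"each pixel  : {brightness_per_pixel:.2%} of the object's brightness")
```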
Of course I repeatedly explain or bring up "quantum entanglement" by stressing the need to correct Einstein's view or explanation of "Space-Time". The attributes of Space are that there is no distance [all objects and matter are located in one point or frame]; therefore, there is no conventional time.
Let's explain this more succinctly.
There is no Distance
Though the car in the video below is moving at 100 km/h, from the time it started off to the time it reached its destination it was visible in the same frame, and a frame is no different from an entire universe. Since the starting point and the destination are in the same frame, in reality there was no distance between them; neither is there at the smallest point, the quantum level, or the greatest, the astronomical level. If there is distance, there is matter to quantify distance, much like a ruler is scaled to measure; and if there is distance and matter, then time must exist to quantify rates of motion; if motion exists then it represents time and Relativity Theory applies. All this leads to a Distance-Time, Matter-Time [Time being Motion] or top view of the universe, not a Space-Time view of the universe, which is where Einstein errs. If there is no distance, then there is no motion, there is no durational time, and matter exists but not as it is understood in the top view; since all matter exists in a single point when observed from the side view, that single point is a single frame or the entire universe. At this stage physics begins to move outside observable phenomena, from existing as matter to existing as information or a type of code.
None of the images on the pages in this video are moving. They
are all static. By flipping pages and refreshing these static
stills the images come to life. The Universe follows the same principle
bringing Matter to life.
When this underlying code and how it works is understood, humanity will reach a new threshold in physics and be able to manipulate gravity with the greatest of ease. The image on each page standing still entails that it has no velocity, and yet the object, when observed on refreshed pages, appears to move. This condition creates what scientists call the "measurement problem". To date physics has failed to explain the measurement problem. As I have suggested here, if matter indeed refreshes then this very elegantly and comprehensively explains how it can appear to be a particle and a wave at the same time, essentially providing a pragmatic end to this debate in physics. This very simple explanation does not need the concept of a "superposition" of two states when matter is not being measured; there is no need to attribute to active observation or measurement, or to a state of non-observation, an effect on the particle, which is quite weird, or to hold that only conscious beings affect or are affected by this process. If matter refreshes, this problem is solved. If you don't know what the measurement problem is, you can watch this video.
The measurement problem has confounded the best minds in physics for many
years; yet in this paper I quite easily, pragmatically and elegantly explain it as a refresh rate
or property in matter presently unaccounted for.
When the frames used to animate an object are examined more closely it will inevitably be found that the object on each frame is standing still, yet when the flipped pages are observed less closely or at a distance the object is animated. This phenomenon also represents the problem experienced in quantum physics where matter appears to be able to exist as a particle or a wave. Since we know that fundamentally the image on each page or frame is standing still, we are able to infer that the movement we observe in science as a "wave form", velocity, or on the screen as motion or movement, is in fact an illusion, a form of paramnesia or a kind of "trick" of nature. It is a scientific phenomenon which creates empirically verified wave properties, yet without this "illusion" [Einstein's interpretation or "top view" of the Universe] matter as human beings understand it and reality as they experience it would not exist. When the "pages are being flipped" and motion appears to take place through an animated object, matter seems to behave as a wave; however, when "moving" matter is examined more closely, on each page it has no velocity, is in fact standing still and therefore appears as a particle, explaining the inevitable dilemma of how an object can be moving and standing still at the same time.
Wave particle duality in physics may in fact be a flawed concept, since waves as they are conventionally understood do not exist, but are more likely a top view illusion earlier described as a kind of paramnesia attributed to observation at the top level induced by motion facilitated by the refresh rate of the universe and therefore the refresh rate of matter; this illusion is what is referred to as the experiential Universe or that aspect of the Universe people inhabit on a daily basis. It is brought to life by matter being refreshed thereby endowing it with mobility and free will: since motion is time in Einstein's model, the refresh rate is the origin of both motion and top-view time. The experiential Universe is not an illusion per say, it is a real, flesh and blood world or existence since it has origins in a particle form, however, the fundamental properties upon which it is created rely on the wave form of matter which is technically produced by ephemeral or impermanent processes. In other words even though reality is rooted in the particle form of matter, a particle itself is impermanent. This impermanence can be better understood by appreciating that matter refreshes, constantly; in other words particles must persistently appear and disappear in order to exist for matter to appear to be capable of animation or motion. Consequently, it can be deduced that to have the ability to move matter or a particle from which matter is constructed must have a third property that is currently unaccounted for in modern physics and this is the be ability to “refresh”. To “refresh” refers to the ability to disappear and reappear; a property of matter currently unaccounted for in modern science, but that inferences show may occur. At this stage we have leaped past Einstein's model of the universe. We are able to begin to understand how we live is a universe without conventional distance, and therefore without conventional Time. This makes it easy to understand how and why the phenomenon of both quantum tunneling and quantum entanglement occur.
When we see an object moving, no matter how fast, it is in fact never in motion. If the image is closely observed, like a single frame on a movie projector's reel, it is in fact completely still or frozen in place. In order to appear to move, each frame must be successively removed and replaced by another. The refresh rate of matter entails that matter exists and ceases to exist so rapidly it is difficult to say which state it occupies, leading to a paradox such as that observed in Schrödinger’s Cat Experiment; it is dead and alive, standing still and moving, existing and ceasing to exist, all of which appear to defy conventional thinking in physics. However, using the flow of logic thus far we can dismiss the top-view concept that a particle is a wave, since we know distance is a construction designed specifically for perception or, put simply, it is experiential, and conclude that matter always is and fundamentally remains a particle. This is why, when observed more directly and scrutinized closely, matter will tend to appear as a particle, while its wave properties are induced by the process of refreshing these static particles or “stills”. Wiki (2010) explains that “Wave–particle duality postulates that all matter exhibits both wave and particle properties. A central concept of quantum mechanics, this duality addresses the inability of classical concepts like "particle" and "wave" to fully describe the behaviour of quantum-scale objects. Standard interpretations of quantum mechanics explain this ostensible paradox as a fundamental property of the Universe, while alternative interpretations explain the duality as an emergent, second-order consequence of various limitations of the observer. This treatment focuses on explaining the behaviour from the perspective of the widely used Copenhagen interpretation, in which wave–particle duality is one aspect of the concept of complementarity, that a phenomenon can be viewed in one way or in another, but not both simultaneously.”[7] Clearly, as we may note here, there is no duality; a particle is never really a wave in the same way the stills on the frames of a projector are never actually moving. They always remain static on each frame and consequently remain in “particle” form, therefore the Copenhagen interpretation may be a little misleading.
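To make the flip-book analogy above concrete, here is a minimal toy sketch in Python; it is not a physical model, merely an illustration of the point that every frame stores a static position, yet sampling successive frames produces an apparent velocity while any single frame shows none.

```python
# Toy illustration (not a physical model): each "frame" stores a static
# position, yet playing frames in succession yields an apparent velocity,
# mirroring the flip-book analogy used above for the refresh-rate idea.

frames = [{"t": i, "x": 0.5 * i} for i in range(10)]  # object drawn at x on frame i

def observe_single_frame(frame):
    # "Close" observation: one still, no motion is present at all.
    return {"x": frame["x"], "velocity": 0.0}

def observe_flipbook(frames, dt=1.0):
    # "Distant" observation: comparing successive stills creates an
    # apparent velocity, although no individual frame ever moves.
    apparent = []
    for a, b in zip(frames, frames[1:]):
        apparent.append((b["x"] - a["x"]) / dt)
    return apparent

print(observe_single_frame(frames[3]))   # {'x': 1.5, 'velocity': 0.0}
print(observe_flipbook(frames))          # constant apparent velocity of 0.5
```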
Schrödinger's thought experiment is a classic example
of how trying to adhere to Einstein's flawed model leads theoretical physics on comedic wild goose chases
and compromises the ability of even reasonably accomplished people to think outside the box in order
to solve simple problems.
If the cat in the box is considered alive when it is moving (a wave) and dead when it is static (a particle), then the idea is that before we look at the cat we do not know if it is dead or alive, a static still on a single frame or flipped frames showing it moving around. However, if we do not look into the box the cat is in a “Superposition”, that is, a precursory state the actuality of which will only be revealed when the observer actually looks. There is actually little or no use for probability in this thought experiment. The reason for this conundrum is that physicists taught to evaluate this experiment using Einstein’s approach cannot theorize a universe without Time. Since motion is Time in Einstein’s model, the idea is that in the same way that light speed is an unsurpassable limit, Time can never stop. To bend over backward and accommodate Einstein’s flawed model the poor cat must potentially be alive 50% of the time and dead 50% of the time. The fact remains that the simplest way of getting the correct answer to this thought experiment is to correct Einstein’s flawed model. When this is done it is understood that when the box is opened, the state the cat will be in will be predetermined by which method the observer uses to view the cat. If the observer suspends Time (sic Motion) the cat inside will be static, like the still on a single frame; in this inanimate state it will be considered dead; the poison or grenade will consequently be considered to have been activated. Should the observer allow Time (Einstein's Motion) to exist in the box, when he or she opens it the cat will be moving about normally and therefore considered alive; the poison or grenade will not have been triggered. To believe there is a Superposition governed by the rules of probability where the state of the cat cannot be determined is just another example of how flaws in Einstein’s model force physicists to make mistakes that are then justified by a twisted kind of logic that defies pragmatism. In the case of Schrödinger’s cat there is no extensive probability that determines the state of the cat; this unknown is merely a symptom of Einstein’s model clouding the ability of even highly intelligent people to think objectively about an idea or system of thought.
I must say, finding the underlying flaws in economic theory the correction of which would end poverty, unemployment and the host of problems experienced by modern economies was quite challenging. Finding a solution was a personal life goal set from the time I was first introduced to economics at university. Poverty and its negative effects on humanity moved me very deeply and I was determined to get answers to why economics has not resolved this issue despite so much research, literature and knowledge gains. I am glad to say I did find the solution. I found that the problem is in fact the economic model itself, which is basically built on a Total Revenue (TR) - Total Cost (TC) = Profit model. This model is a virtual Pandora’s box. It cannot be fixed. To attempt to do so, as is the common practice in economics today, is an exercise in futility that will continue to lead straight into dead ends. It is a problematic, scarcity-generating model that maintains a state of equilibrium businesses cannot survive in unless they retaliate with disequilibrium; it is in fact operating in direct conflict with the interests of the businesses and financial systems it claims to serve at a fundamental level. Economic problems therefore cannot be comprehensively dealt with without changing the model itself. To attempt to do this without changing the model itself is a complete waste of time and resources. This model has to be transformed into a Total Revenue (TR) = Total Cost (TC) = Profit model, which is fundamentally a new growth and resource generating model. End of story. Scarcity ceases. Poverty ends. Financial institutions, businesses and industry thrive like never before because they now operate in the model they are supposed to be in, one that supports rather than smothers growth and development. Finite resources against infinite demand is thenceforth no longer the basic economic problem. By no means should you be beguiled by the simplicity used here to describe these economic models into thinking they are easy to understand, arrive at, interchange and implement. In economics, swap out these models and the economic woes of any nation or government you see today should come to an abrupt end. With physics, on the other hand, the mistakes in fundamental theory feed on one another and therefore emerge one after the other like whack-a-mole; no sooner is a solution found for one than another raises its head.
Einstein and Misconceptions about the Speed of Light
It is my belief that we are entering an era of science in which the public will no longer be able to take seriously a highly qualified physicist from a reputable institution who believes the speed of light is both an invariable constant and an unsurpassable limit. They must now be viewed as relics being eclipsed by the evolution of the paradigm that defines the very foundations of knowledge in a subject area in which they were once experts, and they must begin to evolve and advance to remain of any relevance to mankind's future.
Dualities in physics are often symptoms of models that are flawed and therefore do not concisely explain phenomena; they allow contradictory statements or theories ascribed to "relativity" to co-exist, and equations are tailored to suit these weird juxtapositions. The measurement problem is one of these, but we have used a refresh rate to concisely explain how a wave and a particle co-exist. A refresh rate can be used to explain the relationship between matter (-) and anti-matter (+) and how the two states co-exist by refreshing from one state to the other (i.e. the same matter is alternating states between electron and positron). Matter and anti-matter are very likely to be the same matter alternating between positive and negative charges; it is either in one state or the other, which explains why it does not explode spontaneously. If alternating back and forth between negative and positive charges in this way occurs during each refresh, then it explains why anti-matter can be present but unseen or intangible, making matter appear dominant. This switching back and forth may be fundamental to how matter exists.
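The alternation being described can be pictured with a short toy sketch, purely illustrative of the idea in this paragraph rather than established physics: one piece of matter flips between a (-) and a (+) state on each refresh, so any single sample finds it in only one state, never both at once.

```python
# Toy sketch of the alternation idea above: on each "refresh" the same bit of
# matter flips between an electron-like (-) and positron-like (+) state, so at
# any single sampled instant it is found in only one state.

def refresh_states(ticks: int):
    state = "electron (-)"
    timeline = []
    for _ in range(ticks):
        timeline.append(state)  # the state found if sampled on this tick
        state = "positron (+)" if state == "electron (-)" else "electron (-)"
    return timeline

print(refresh_states(6))
# ['electron (-)', 'positron (+)', 'electron (-)', 'positron (+)', 'electron (-)', 'positron (+)']
```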
Another such fiasco in physics has to do with light. For instance, light can transfer momentum, something only bodies with mass should be able to do; evidence of this is in light pushing a solar sail. When light passes a gravitational field it bends, providing further evidence that light has mass.
[Link to video: Alcubierre Warp Drive (AWD).] Although the warp bubble is shown surrounding the vessel, the "fields do not exist" approach implies that the warp bubble can be generated at the subatomic level within the volume of matter occupied by the vessel, which requires much less power to achieve the same result.
In this interesting video Arvin Ash explains the
Alcubierre Warp Drive. (Arvin has a great channel
with amazing explanations about otherwise complex
theories in physics - click the link)
The major challenge that is immediately obvious is that it has to bend the Geometry of Space to accomplish this form of propulsion. Space is built to prevent this kind of manipulation. Why is Space (Se) so rigid? Earlier we saw that going back in Time is not possible; instead a vessel that attempts to do this will find itself relocated to another location in the same universe. A record of Space-Time will be observed during the process. Similarly, the speed of light limit most likely exists as a buffer between universes. Distance in true Space (St) technically does not exist; to exceed the speed of light will cause a vessel to move into the next universe in line. To travel at twice the speed of light will cause a jump into the 2nd in-line universe. To travel at 3x the speed of light will cause a jump into the 3rd in-line universe, and so on. However, as the vessel decelerates it will jump back to the universe it originates from, and the "distance" it has covered can be gauged in terms of universes crossed. For instance, if the vessel traveled at 9 million times the speed of light, it will appear to cover this distance in its universe of origin, when in fact it has crossed 9 million separate and individual universes in the multiverse. Even though it has traveled this "distance", the reality is that in true Space (St) it remained in the same location, while this "hardware" was instead used to create the multiverse. To cover the distance of travelling at a billion times the speed of light, a vessel will cross a billion universes. This is hypothetical, but the analysis needs to be made to test the concept.
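The counting rule sketched in this paragraph can be written down directly; the toy function below is only an illustration of that rule, with speed expressed as a multiple of c, and is not an accepted result in physics.

```python
# Toy encoding of the hypothetical rule above: travelling at n times the speed
# of light is taken to mean jumping to the nth in-line universe, i.e. crossing
# n light-speed barriers. Sub-light travel stays in the universe of origin.

def destination_universe(speed_multiple_of_c: float) -> int:
    """2x c -> 2nd in-line universe, 3x c -> 3rd, and so on; below 1x c the
    vessel stays home (index 0)."""
    if speed_multiple_of_c < 1.0:
        return 0
    return int(speed_multiple_of_c)

print(destination_universe(0.5))        # 0: stays in its universe of origin
print(destination_universe(3))          # 3: 3rd in-line universe
print(destination_universe(9_000_000))  # 9,000,000 universes crossed
```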
The number of universes that exist can be counted, calibrated and lined up in this way to predict how many there are. They may be infinite, in the sense that there is unlikely to be just 1 multiverse but many multiverses, as this reduces redundancy. Each Multiverse is a Host to countless universes separated by calibrated or sequential light-speed boundaries that create Einstein's Geometry of Space-Time (SeTe). A collection of Hosts can be referred to as creating an Omniverse. The universe earth resides in, on this scale, is infinitesimally tiny, seemingly insignificant. This gives some idea of just how expansive a multiverse is. Each multiverse is sustained in a unique band and each universe in a unique frequency, for instance, in the CMB. Keeping individual universes in the multiverse apart or separate within individual singularities lowers existential risk. The destruction of a universe will be contained by light-speed barriers and will not affect other universes. It allows laws in physics within each universe to act and function independently. Although universes can be described as being calibrated, stacked or layered in this way, they occupy the same Space. This is what makes Space (Se) so rigid and inflexible. True Space (St) does not need distance, in the same way that true Time (Tt) does not need motion (Einstein's "Time" in Se) to generate what human beings regard as reality. This may take some time to understand, but it is not complex.
A hypothetical representation of Multiverses shows that navigating space at superluminal velocities may be more complicated than is currently thought. At these incredible speeds the relationship between geographical locations and destinations in Space requires such a high degree of refinement that frequencies become the geography that governs the dynamics of navigation. Here locations are tuned into and out of, rather than traveled to. Therefore, the velocity, trajectory and other factors that affect the orientation of the vessel in relation to Space are the tuning dial that directs the spacecraft toward the location it seeks. At these speeds the normal geographic distances that are used on a map will appear to overlap or occupy the same location. For instance, Los Angeles and Lusaka when observed on a normal map will appear to physically be in exactly the same place. So how does a pilot fly to one destination rather than the other? The only feasible means to reach a location is its Space-Time frequency. This is an area of physics in which there is little or no research. This essentially means that humanity does not currently have the knowledge required to be able to navigate a vessel when it is travelling at superluminal velocity. The relationship between frequencies and Space-Time mapping of the universe is a critical area of research that requires significant and appropriate attention.
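As a purely hypothetical illustration of "tuning" navigation, the sketch below selects a destination by looking up a catalogued Space-Time frequency instead of computing a distance to cross; every frequency value and location name in it is invented for the example.

```python
# Toy sketch of "tuning" navigation as described above: destinations are
# selected by a Space-Time frequency rather than by a distance to traverse.
# All frequencies and locations here are invented placeholders.

SPACE_TIME_CHART = {
    "Los Angeles": 101.3,     # hypothetical frequency units
    "Lusaka": 101.3001,       # nearly identical "map position", distinct frequency
    "Andromeda core": 987_654.2,
}

def tune_to(destination: str, chart: dict) -> float:
    """Return the frequency a vessel would lock onto; raise if the
    destination has not yet been catalogued."""
    try:
        return chart[destination]
    except KeyError:
        raise ValueError(f"No Space-Time frequency catalogued for {destination!r}")

print(tune_to("Lusaka", SPACE_TIME_CHART))  # 101.3001
```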
It may not be possible to create Einstein's Space-Time without multiple universes since they form its very Geometry and fabric.
A jump to superluminal velocity at any location in space is likely to reveal the layers or calibration of separate universes.
A wormhole or black hole with funnels on both ends has the added advantage of possibly offering a return route between universes; otherwise a single-funnel black hole remains a one way trip, unless there's another black hole nearby that offers a reverse trip. As long as the pressure or contraction of Space at the mouth of the black hole is greater than the pressure or contraction at the singularity, a black hole should in essence not be completely different from Alcubierre's Warp Drive. Note that when words like pressure and compression are used it is in reference to the elasticity of Space and how it affects acceleration. When an object enters a black hole it will be accelerated in a similar way that a warp bubble will move an object through space faster than the speed of light without violating Einstein's light-speed limit. However, despite Einstein's light-speed limit not being violated, light-speed barriers (divisions that separate universes and create the Geometry of Space or Geodesics) will still be breached. This means that the last singularity (U10x) is merely the final speed at which the object will exit the black hole, where resistance from light-speed barriers or boundaries has managed to force the breach to close, consequently halting further propagation of the black hole. A vessel should be able to exit any section of the black hole and the singularities it crosses; it does not have to emerge at the tip of the black hole, which is hazardous because that's where matter is potentially collecting and being ejected. As we have hypothesized above, travelling at the speed of light creates breaches across calibrated light-speed barriers that separate universes, in this case creating 10 singularities or 10 exits (U1x-U10x) dependent on velocity. Each barrier forces the black hole to contract until it is stopped at U10x. If the final exit speed at the final singularity is 10x the speed of light, the object can exit in any area of the multiverse along the length of the black hole (U1x-U10x), which represents a geographical area of Space 10x bigger than the earth-universe. Which universe it exits into from the singularity may depend on many factors such as exit location, velocity and trajectory. This will be a one way trip since the vessel can only travel from regions of expansion to contraction, which create the direction of acceleration or gravity. If the vessel does not have the technology to match the acceleration it gained from the black hole it cannot return to the universe it originated from. However, if this is the case, it can avoid this by exiting in a new location back into the universe it came from, depending on how the pilot navigates the exit from the black hole. A vessel may also be able to use a black hole like a slingshot by combining acceleration from its own engines with the additional speed provided by the black hole. Any time travel distortions experienced by occupants of the vessel will not be real, they will simply be holographic.
Even if a black hole weighs "6.6 billion" suns, hypothetically this gravitational force can be converted into slingshot acceleration as long as knowledge of how to slipstream, navigate the Geometry of Space, and manage and control g-forces and resistance at superluminal rates of acceleration is available. It should not be forgotten that gravity is not just about mass, it is also a form of fuel for acceleration in and of itself. Therefore, it makes sense to harness it. In StTt the Geometry of Space does not exist outside a vessel, nor does it create Space; these are all internal changes taking place in the atomic structure of matter. However, the Geometry of Space visualized by Einstein in SeTe is likely to be as useful to knowing how to pilot through a black hole as a pilot's knowledge of air currents is in aerodynamics. If the Expansion of Space nearby and at the funnel or opening of black holes, being greater than the contraction of Space at the singularity, can be deduced as what causes the gravitational acceleration of matter from open Space toward the black hole's opening and from the opening down its funnel, then technically the principle is not very different from Alcubierre's Warp Drive. The wave created by the AWD that propels the spacecraft appears to be almost identical to the wave that accelerates mass toward and through the black hole. At superluminal velocities the internal frequencies of atoms, viewed as this Geometry, generated by the vessel's trajectory and velocity, are likely to be critical to its ability to enter, move through and exit a black hole safely. For instance, note that in the AWD the spaceship is kept safe by riding a section of the wave (Geometry of Space) parallel to the spaceship while the cause of propulsion, namely expansion and contraction or compression, moves the spaceship along. The Geometry of Space remaining parallel to the velocity and trajectory of the spaceship should probably not be trivialized, as this harmony may be what keeps the occupants in the spaceship safe from dangerous time distortions associated with the Geometry and light-speed barriers that separate universes by creating singularities between them. Since the Geometry of Space in black holes is vast, and concave or angular from the funnel to the tip, to maintain the same harmony or "bubble" it is likely the pilot must ensure that the spaceship's superluminal velocity is parallel to the Geometry of the black hole. A spaceship traveling at superluminal velocity against the Geometry is likely to experience something similar to an airplane hitting turbulence, except that with this Geometry at superluminal velocity the turbulence will emerge as loss of trajectory toward the intended destination, erratic geographic or Spatial displacement and distortions of Time that can have serious consequences affecting the capacity for the spaceship and its occupants to arrive at their destination, in the right universe, at the projected time, in the right frame of mind. If frame dragging occurs in the Geometry of the black hole, pilots are likely to have to carefully follow this, thereby keeping the frequency of the spaceship in harmony with the frequency of the Geometry it travels through, consequently keeping the "bubble" safe and intact. It's possible that physicists have not as yet considered these attributes or the importance of beginning to document such frequencies. These frequencies will consist of the X, Y, Z of the 3-directional (dimensional) Space with T for Time.
The Geometry consists of specific frequency combinations of these 4 factors that may mean very little when travelling at low speeds, however, navigation at superluminal velocity may simply be impractical without knowledge of how a vessel emits them and how these emissions affect navigation. Even though for now how they work and what they are is not fully understood, developing a catalog still remains vital since the very procedure of figuring out how to collect this data and keeping records of what is found impacts on humanity's ability to understand how navigation through frequencies works.
One place to start in this science is to begin to understand and map the frequencies emitted by microchips when they process diverse types of information. This refers to the actual frequencies emerging from microchips on a motherboard and not the familiar sounds of a modem transmitting code or information. Understanding the relationship between the frequencies emitted by microchips and the information they process is something easily accessible and therefore an interesting place to begin as it may reveal some useful insights.
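A first pass at such a catalogue might look like the sketch below; the measurement function is only a placeholder, since real readings would require external instrumentation that is not modelled here, and the workload labels are examples.

```python
# Hypothetical sketch of the cataloguing exercise suggested above: pair each
# type of workload with the emission frequency measured while it runs. The
# measure_emission_hz() function is a placeholder; real measurements would
# need external RF or acoustic instrumentation, which is not modelled here.
import csv
import random

def measure_emission_hz(workload: str) -> float:
    # Placeholder reading; substitute values from actual instrumentation.
    return random.uniform(1e3, 1e9)

WORKLOADS = ["integer arithmetic", "floating point", "memory copy", "crypto hashing"]

with open("chip_emission_catalog.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["workload", "emission_hz"])
    for w in WORKLOADS:
        writer.writerow([w, measure_emission_hz(w)])
```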
In the diagram above, to travel the distance from A to B in earth's universe, there are two options: to travel at subluminal or superluminal velocity. If the distance from A to B is 100 million light years, it becomes impractical to try to get there at subluminal velocity. However, if physicists in future develop the technology to enable travel at 100 million times the speed of light, this jump in speed is not a straight-line journey from A to B through earth's universe. By traveling at superluminal velocity the vessel will have to breach 100 million universes that occupy the same "Space" to get from A to B in a shorter time.
It may get even more complicated. Some universes in earth's local multiverse are out of reach. If the furthest universe from the earth-universe, marked as C on the multiverse diagram, were 48 billion light years away from earth, there would be no way earth's civilization could reach it unless it developed the technology to travel at least 48 billion times the speed of light, in the same way that earth could not reach the nearest universes, marked as D and E, which are only a 1x light-speed jump away, if earth's civilization did not develop the technology to travel at least at 1x the speed of light.
Nevertheless, it cannot be ruled out that the prevalence of black holes at the center of galaxies and universes may allude to the possibility of the core of a multiverse consisting of a black hole. If this is the case then they form a nexus or "inter-connector" where short cuts to other countless universes and locations within universes will be found. It may be possible from this location to access all other locations in a multiverse and universe without being prevented from doing so by having to cross countless barriers. If multiverses, universes and galaxies are commonly formed around black holes these may offer easier routes for getting around distances and speeds that may be otherwise impossible to achieve.
The layering or calibration of universes in this way may offer physicists, cosmologists and astronomers the option to consider that black holes that end in a singularity can in fact act as bridges between universes that could otherwise not be breached without the ability to travel at superluminal velocity. A Collision Drive propulsion system is likely to be able to offer scientists in these fields the first opportunity to approach and study diverse features of Space first hand, in person. If black holes do offer links like this they will be one way streets unless their architecture consists of a double sided funnel. Lightspeed barriers between universes effectively isolate universes from one another, therefore a black hole ending in a singularity makes perfect sense because exiting it leads to an alternate adjacent universe that is completely cut off from other universes.
In this presentation internationally acclaimed physicist Dr Neil deGrasse Tyson
explains why the concept of a Multiverse needs to be considered a very real construct.
It will take a mind-boggling amount of energy to flex Space in order to achieve the technology to warp, even though the mathematics shows that it is possible. This is 3rd Generation Gravitational force. Humanity will inevitably be able to create energy technologies that can do this easily in future. But right now, bending Space does not seem an accessible method. The other problem is that physicists do not know how to create negative mass, although this has more to do with weaknesses in theory than with actually being unable to achieve it. For instance, mass created by Trons at T4 is omnidirectional. This means the gravity they create will act in any direction on the dimple where it is pointed when generated. So if positive mass pushes it toward an object, then to gain negative energy, gravity or mass that will push or repel it from the same object, the dimple, T4, simply has to face in the opposite direction. Therefore, there is no mystery, difficulty or impasse when it comes to creating negative energy, mass and gravity. It is something achievable that tier 1 gravity can easily demonstrate today.
There is no mystery concerning how to create negative energy/mass/gravity. It is easily created at the subatomic level by manipulating T4 or emulating how this is done using tier 1 gravity. It may also be useful to note that the propulsive or gravitational force (negative or positive energy, E) a single atom is capable of exerting at T4 is equivalent to E = mc². The general assumption is that gravity is a weak force, being roughly 10^40 times weaker than the electromagnetic force that holds atoms together. As shown by Einstein's mass-energy equivalence equation, this assumption is in fact misplaced. Atoms collectively need only harness tiny proportions of kinetic energy to implement what is observed as gravity. Most of this huge potential energy remains dormant and unused. This means its propulsive force can be harnessed directly, for instance, to turn a generator without the need for heat and steam as is the case with modern nuclear reactors, which would seem primitive in comparison. This is in line with the assumption that all matter is self-propelled. It must therefore have contained within it the dormant propulsive force to act when it receives instructions to do so through entanglement, depicted by the red line vector as the two bodies communicate and navigate around one another. For instance, a teaspoon of neutron star weighs around 10 million tons. This makes 5th generation gravitational force the most powerful and most advanced method for controlling gravity. This is what powers or drives the mobility shown earlier in the animation reproduced below:
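As a brief aside on the scale implied by the mass-energy equivalence equation cited above, the short calculation below evaluates E = mc² for a one-gram sample; the sample mass is chosen purely for illustration.

```python
# Mass-energy equivalence, E = m * c^2, for a small sample mass.
# Illustrative only: it shows the scale of energy locked in ordinary matter.

C = 299_792_458.0  # speed of light in m/s

def rest_energy_joules(mass_kg: float) -> float:
    return mass_kg * C ** 2

one_gram = 1e-3  # kg
print(f"{rest_energy_joules(one_gram):.3e} J")  # ~8.988e13 J for one gram
```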
The third explanation for gravity, which is possibly the most accurate to date, is that gravity is created by communication between particles which then negotiate how to position themselves relative to one another, and they do this autonomously. This is illustrated in the animation above. Fields, which are imaginary and an invention, like a ruler cannot fully explain the objects they measure, which is why the physics community after centuries of gathering knowledge still cannot demonstrate any device for creating or controlling gravity. This is because fields do not create gravity; in fact they do not exist anywhere except in the imagination of a person trying to make a measurement of a physical force. They are useful in this sense, but if this limitation is not understood they turn from an aid into an impediment. Even in situations where physicists cite gravitational lensing, or light bending around an object due to gravity, the fields do not exist hypothesis would rightly identify that photons or particles of light are not being moved by a gravitational field but are in and of themselves aligning with other objects through red-line communication, as is depicted by the animation above. Even if a scientist were to feel the pull of a gravitational force on his or her body, it would be caused by particles in their body positioning themselves in relation to some other object, and the impression would be that they are under the influence of a field when in fact they are not, since their physical mass is responding to communication with other masses and repositioning autonomously. Each particle is fundamentally designed to have mobility and position itself in this way, thereby making it able to naturally assemble, create or replicate any type of force or matter. These same particles, at the atomic level, through this very same autonomous mobility, generate gravity, as we saw earlier with the Tron weaving through T1, T2 and T3 to generate Tier 1 gravitational force at T4 in atoms; this same Tier 1 gravitational force is replicated or has been reverse engineered using mechanical engineering in the Collision Drive. In order to understand how to do this, fields had to first be dismissed as what causes gravitational force, otherwise they impede the ability and need to look beyond the field for a cause, mechanism and method for what is inducing levitation in both magnets and gravity. Fields, being a by-product in general or a generalization of forces in physics, do not have the requisite refinement and dexterity required to create matter, which requires the coordination and control of each individual particle and sub-particle in the BEC or the atom. They therefore are not responsible for creating matter and become a blunt instrument when trying to explain it, especially at the level of quantum mechanics, which itself becomes too shallow to fully grasp various phenomena; after all, energy bands and "quanta" in quantum mechanics are basically a reference to fields. If matter in general and particles are actively autonomous and coordinate their movement through communication, then the fields observed by physicists do not exist; they are just a visual by-product of this communication and autonomous coordination to create matter, forces and reality as it is perceived and measured in physics.
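A toy sketch of the "red-line communication" picture described above is given below: two particles negotiate a meeting point and each moves itself toward it, so an outside observer sees what looks like attraction even though no field object appears anywhere in the code. It illustrates the idea in this paragraph only, not an accepted physical model.

```python
# Toy illustration of the "red-line communication" picture described above:
# two particles exchange positions and each autonomously steps toward the
# agreed midpoint. Observed from outside, this looks like mutual attraction,
# even though no field appears anywhere in the code.

def negotiate_step(x_a: float, x_b: float, step_fraction: float = 0.1):
    midpoint = (x_a + x_b) / 2.0             # the "negotiated" meeting point
    x_a += (midpoint - x_a) * step_fraction  # each particle moves itself
    x_b += (midpoint - x_b) * step_fraction
    return x_a, x_b

a, b = 0.0, 10.0
for _ in range(5):
    a, b = negotiate_step(a, b)
print(round(a, 3), round(b, 3))  # the pair drifts together: 2.048 7.952
```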
The fields do not exist hypothesis or method of analysis is an important tool because it demonstrates that the very fundamental mechanics upon which the universe functions are not understood by contemporary physicists; be it in aeronautics, fluid dynamics and so on, there are substantive misinterpretations permeating modern physics in its entirety concerning how forces deploy to fulfill its laws. For instance, if matter has delicate and intricate mobility and is autonomous, then even observed phenomena such as the buoyancy of objects on water are in fact not caused by them displacing water equal to their mass as is deduced by Archimedes' Principle. The water molecules and the atoms of bodies that rest on water communicate and align themselves in a manner which, when observed by physicists, is thought to be created by Archimedes' Principle, when in fact Archimedes' Principle is also flawed, as is Einstein's observance of fields or Space-Time Geometry being what creates gravity. However, if a physicist assumes that a boat floats on water, and the water, like a field, is responsible for this buoyancy, then his or her reasoning meets a brick wall in that the belief that the cause of flotation is the medium or field deters any further investigation. Without further inquiry it is then never established that negotiation between particles, and not the field, is what is responsible for buoyancy in any medium, be it water or air, magnetism or gravity. This is like going to a movie theatre and leaving believing that the story, stunts, action and characters were all real simply based on what was empirically observed to have transpired on the screen, and then basing all of the laws of physics on the movie. There may be accuracy and empiricism, but it is all based on a misinterpretation of underlying facts because a movie is the result of many intricate parts and movements from directors, writers, actors, producers and so on functioning behind the scenes. Even simple concepts such as the physical interaction between matter are very likely negotiated at the particle scale, which means that objects do not pass through one another and are regarded as "solid" only because the particles communicating with one another at a scale smaller than that currently viewed in quantum mechanics negotiate positions and use autonomous mobility to create what is experienced as contact between matter and masses, which in turn makes them appear solid or "real". Particles and sub-particles may be tiny; however, the energy they have readily at their disposal to negotiate movement and positions is governed by the mass-energy equivalence equation. This means the cup you casually place on a table is not supported by the table but by the activity of its own particles at the atomic and subatomic scale. Technically it is suspended in the air, and only appears at rest on the table due to the fact that this is the position it has negotiated with the particles of which the table is comprised, which in turn are doing the same. The same applies to the table being at rest on the earth. Particles and sub-particles appear to behave like, or can be compared to, complex naturally evolved nano-particles or nano-technology, but on a minuscule sub-atomic scale which is currently outside the reach and direct measurement of quantum mechanics. The next stage beyond quantum mechanics may very well be referred to as sub-particle nano mechanics, with the understanding that particles are created from sub-particles, which in turn are complex entities.
Without this red-line communication and the decision by matter or particles to respond to other particles, matter could not interact and may even become invisible or indiscernible to other particles. This process of selective interaction in particles is very likely tied to frequencies at which particles choose to operate, which in turn explains how particles and matter in different universes can occupy the same Space and yet have no physical contact, thereby setting the stage for the existence of a Multiverse that is also currently unaccounted for and unsubstantiated in modern physics. As is observed with mass-energy equivalence, these sub-particles form particles, which in turn create atoms, which means they wield or manage tremendous amounts of power, and since each sub-particle in the universe is connected they also access a tremendous amount of information, both of which create what becomes the laws of physics, chemistry and the sciences in general. Earlier an attempt was made to show what the spiral structure of these sub-particles may look like in terms of nano-mechanics. The blue curve in the animation above is the modern day physicist's imaginary field; it does not exist and is simply an example of how empirical evidence, when misinterpreted, can still mislead the sciences. It is quite sad and unfortunate that physicists are caught in this trap of misinterpretation, as it inevitably means that by failing to escape from it they will never grasp or be able to explain how important phenomena in the sciences actually function and will subsequently fail to produce important new technologies or understand how to control them, as is the current case with gravity. The fields do not exist hypothesis affects how fundamental forces such as electricity actually work and implies that even how electric current works is being misunderstood and misapplied. It means that, like fields, electricity is a by-product of continuous entanglement or red-line communication even between unpaired particles, since even the absence of measurable entanglement between particles needs to be negotiated and communicated between particles. This was deduced earlier with the floating magnets being described as levitating as a result of processes in their atomic structure, not fields, which in essence implies that information and power can be instantaneously supplied, transmitted and broadcast across tiny or vast distances without the need for a conduit such as wire. However, this process cannot be fully understood due to ongoing misinterpretations in the sciences of what is observed. These flaws in physics and economics may seem benign; however, the reality is that some of these shortcomings, without intervention, may have consequences. For instance, if humanity is required to achieve a given rate of economic growth to sustain its civilization, but this evolution is failing, then it can lead to internal collapse by virtue of an advanced civilization subsisting on 1%-3% of its potential for economic growth instead of accessing the full 100%, the majority of which remains idle and goes to waste simply as a result of insufficiency of knowledge, inefficiencies in the circular flow of income and a failure to understand the means by which to control economic outcomes and create the resources required by a civilization to support itself and its populations.
In the same vein, if knowledge about how to control gravity is required to allow the mass movement of people from earth before man-made or natural cataclysmic events build up, but is stalled due to myopia or low intellectual ability in the sciences, then this flaw becomes quite significant as it means an advanced civilization such as that developed by humanity today is likely to face certain demise from conditions it is intellectually powerless to address. In both cases, economics and physics, the inability of a civilization's learning curve to exceed critical primary challenges creates latent and inadvertent extinction level impasses that over time will inevitably reach critical mass and therefore need to be taken seriously before it's too late. In humanity's case one of the key causes of extinction is likely to be a prolonged history of racism and discrimination. Without reparation and correction to balance the curve, the consequence is a deformed distribution of resources where the entire intellectual ability of a civilization is underutilized, which leads to gaping holes in theory and practice in the sciences, specifically as is observed in economics and physics. The ingredients required for the survival of a civilization are scattered across diverse tribes and races. In humanity's case its civilization is not accessing and has not accessed its full intellectual ability due to the fact that discrimination has biased participation in education, access to which is decisive about whether a civilization will survive or face extinction by its own incompetence. This is why equality should be taken seriously and an effort should be made to ensure the highest quality of education is available to everyone regardless of what they look like, which part of the world they come from or reside in, as a minimum standard. Discrimination practiced by a small group threatens the entire collective, often in ways society cannot see or is too self-involved to understand in terms of its long term implications; therefore, it is the one vice that a civilization should not accommodate for purposes of its own long-term self preservation. The collective growth of a civilization is governed by its collective intellectual development, which is in turn nourished by equality and the diversity prevalent in participation that maximizes its capacity for problem solving, creating a test, balance or process of natural selection that establishes longevity for a civilization or effectively shortens its lifespan. For this reason every civilization, including that of humanity, is always moving to critical mass on the path to survival or mass extinction, where both outcomes are determined by its own hand.
The graph above shows that the extinction of the human civilization, based on the inability of its learning curve or intellectual development to rise above or match critical challenges, is very real, but not taken seriously. As it is said, humanity will be eating and drinking until the last day. Ignorance is bliss. The required learning curve shows a civilization maximizing the full intellectual ability of all its people across the world without discrimination, and it is therefore able to counter adversity and escape the extinction curve, whereas nature has three basic options to cull a civilization that fails this basic test, that is, through inadequacy in the sciences, inadequacy in socio-economic development or through natural disasters, which can include pandemics, floods, earthquakes, asteroids and natural cataclysms. In any of these cases the only means of survival is the required learning curve.
From the "fields do not exist" position the AWD theory is trying to bend, compress and decompress something that in reality does not exist. The forces that create gravity, mass and how they interact, that are being referred to, are created by interactions taking place at the subatomic level of matter, as has been pointed out at T4 in the proposed anatomy of an atom. If this is accurate, then changes to Space itself to create the warp bubble are wholly unnecessary as the same results can be gained simply by influencing the mechanics of the vessel alone at the atomic level, i.e. "compressing and decompressing" the atomic structure of the vessel and not the Space around it or simply manipulating the direction and rate of amplification of force in the atom at T4, because technically the Space around it does not exist. This is like light passing through a very thick pane of bullet-proof glass, nothing has to be done to the glass, the vessel and the volume it occupies simply has to behave like light by ditching its mass to get passed the glass. Therefore, how Space creates gravity though useful in explaining concepts, fundamentally is not being correctly interpreted. However, bending the Geometry of Space to travel faster than light, for now, may not be necessary. To overcome this problem we simply need the ability to penetrate the Geometry of Space, rather than try to bend it or travel directly against it. This can be done by building a propulsion system that can accelerate a vessel, for the sake of example, momentarily at 3x the speed of light without actually travelling at the speed of light. This can be achieved today using tier 1 gravity. Tier 1 gravity also solves the problem of how to create negative mass. It achieves this by reversing polarity [see Collision Drive]. Once faster than light acceleration is achieved it can be used to unlock all 4 remaining generations or forms of gravity manipulation. What is the hypothetical result of stationary matter interacting with matter or vessel travelling at 1x-1,000x the speed of light? This knowledge is outside the framework of what is known in physics. Matter being accelerated at such high velocity initially may not need a warp bubble due to the fact that it is travelling faster than the natural rate of cause and effect or causality in physics. As a result it begins to slip-stream and travel without resistance from the Geometry of Space. In addition to this, should inferences concerning the elasticity of the Geometry Space prove true, then it implies that as a vessel accelerates faster than the speed of light whilst being momentarily at a sub-luminal velocity Space itself becomes more elastic, then it also alludes to the possibility that during this elastic phase very little power is required to compress and decompress space occupied by the vessel to generate the warp bubble required for the AWD. In this scenario there is no speed limit to how fast a vessel can travel and the technology with which to achieve this is available in this age. All this can now be tested experimentally using tier 1 gravity or a Collision Drive [Patent Pending].
Physicists have not addressed how the elasticity of the Geometry of Space changes at exceptionally high rates of acceleration; instead the tendency is to focus on the resistance of Space, regard it as absolute and ask how to counter this hurdle. Technologies that need to compress, decompress and bend Space-Time can come later, when advancements in energy and science make them more practical. What can be done right now, today, is use the same approach as in air travel, which is to make vessels more aerodynamic, consequently reducing wind resistance. Similarly, exceptionally high speed (supernormal velocity) is anticipated to have the same effect on the resistance created by the Geometry of Space. Supernormal velocity can be used to make vessels more "gravi-dynamic", to coin a phrase, which can be compared to matter transforming more toward behaving like energy, allowing vessels to slipstream and travel through Space at high velocity without resistance (the way light moves through an inflexible medium such as glass). More on this is explained below:
This is downright contradictory. However, instead of correcting Einstein’s model the solution in physics is often to explain away improper theory and back this with "relativity", jargon and relevant maths. I am more inclined to believe that light does not travel from point A to point B. Rather, it moves like a Mexican wave transferring light. When this transfer is observed it appears as though light is moving. When a stone is dropped into a pond, where light is concerned, the ripple effect takes place such that the water remains in place and the ripple radiates like a Mexican wave. The water itself does not move from the centre to the periphery. Photons, like the lake itself, are ubiquitous and remain in place. The photons contain their own energy. When there is no light the photons, like spinning tops that are dormant, remain unmoving and opaque, creating what appears to us as darkness. Despite being opaque and creating darkness they still contain energy, but it is in a dormant state. When light is triggered from a source the photons remain in place and, like falling dominoes, begin to stimulate or appear to transfer an energy trigger to one another, causing them to spin, emit light and become transparent. Since the photons stay in place, the speed of light we observe is in fact the rate at which they trigger or transfer the emission of light to one another. The cloaked or opaque photon that was in a dark state of potential energy is triggered and uncloaks, releasing light proportional to the source of the emission or stimulation. If this is true then the less that photons interact with any other sources of energy, the greater their potential or dark energy becomes in the Distance as they lie dormant. With this model all the contradictions concerning light are easier to resolve. It would explain how light can travel at light speed and have mass and yet not have that mass approach infinity. The answer is that light is not traveling at all; photons remain in place and instead change state, thereby transferring light. It explains why light is attracted or bent by gravity. It also explains why, despite appearing to travel so fast, the push effect of light is very weak.
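The "falling dominoes" picture above can be illustrated with a small toy model in which sites never move and only an excited state is handed along, so the apparent "speed of light" is simply the hand-off rate between neighbours. This is a sketch of the analogy only, not established optics.

```python
# Toy "Mexican wave" model of the picture painted above: photons (sites) never
# move; an excitation is handed from one stationary site to the next, and the
# apparent "speed of light" is simply the hand-off rate between neighbours.

def propagate(n_sites: int = 10, steps: int = 9):
    sites = [0] * n_sites   # 0 = dormant/opaque, 1 = excited/emitting
    sites[0] = 1            # trigger the wave at the source
    history = [sites[:]]
    for _ in range(steps):
        nxt = [0] * n_sites
        for i, state in enumerate(sites):
            if state and i + 1 < n_sites:
                nxt[i + 1] = 1  # hand the excitation to the next site
        sites = nxt
        history.append(sites[:])
    return history

for row in propagate():
    print("".join("*" if s else "." for s in row))  # the '*' marches right
```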
Einstein postulated that nothing can travel faster than the speed of light. This inference, made from the top view, creates a constant upon which he builds much of his theories. Einstein's understanding of the top-view universe was revolutionary in his time. However, when analysed from the side-view the limitations he associated with the speed of light belong to a more primitive understanding of the universe applied in physics today. For instance, the nearest galaxy is Andromeda. Travelling to Andromeda at the blistering speed of light it would take 2.5 million years to get there. For the purposes of astronomy this makes the speed of light exceptionally slow. However, side-view analysis would force scientists to dismiss distance as a formal barrier to space travel. It would demonstrate that the idea that the Andromeda galaxy is an unreachable distance away is a primitive one, because this galaxy and earth occupy the same Space (as does the furthest known galaxy in the universe), which means the science of an advanced civilization would know that technically there is no substantive distance between them. If there is no technical, substantive distance between them, these locations, perceived as being unreachably far by primitive modern day science, are in fact very easily reachable. They can in fact be reached at an interval of time determined by side-view based technologies in, for instance, half an hour or 1 second. Theoretically this means straightforward faster than light travel from earth to Andromeda in 1 second, without weird repercussions or having to devise wormholes, warping time and other exotic theories, is possible very much in the same way that the BBC or CNN switch from a journalist in Perth, Australia to a journalist in Chicago, USA within the same space (or frame, i.e. the area of the television screen); the traveler or "spaceship" simply switches from earth as a location to Andromeda, limited only by the duration it takes to turn the dial. This is due to the fact that from side-view analysis a scientist is not crossing places separated by "distance", but rather "tuning" from one place into another irrespective of distance, as they are located in the same space, much more like tuning from one radio or television station to another where all the waves or signals inter-exist (as do earth and Andromeda). For instance, in Zambia when audiences listen to radio they don't say they traveled to Hot FM, then traveled to Radio Phoenix, then traveled to Komboni Radio; they say they "tuned" into these stations because they know that while they listen to one station all the other stations are still present but are simply not tuned into. Similarly, the side-view postulates that earth being the location "tuned" into does not mean Andromeda is not present in the same Space. Achieving this journey in one year would entail travelling at 2.5 million times the speed of light, and achieving it in a single second would require roughly 8 x 10^13 times the speed of light, something technically possible from the side-view but technically impossible according to Einstein and the exceptional yet more primitive understanding of modern day physics based on a top-view analysis and its relevant or irrelevant constraints. What this means is that any location in the universe can be accessed. Our universe is simply a tiny part of a multiverse. Similarly, any location in the multiverse can be accessed through a similar process.
Hopping from location to location through Space entails that there must exist a map or geography of the universe and multiverse to allow the precise selection of coordinates for a location or "channel" to jump to. How this map of the multiverse appears to work could be theorized using the work of Gaston Julia (1893-1978), whose Julia [Map] Set would serve as a location-based geography of a universe, and Benoit Mandelbrot (1924-2010), whose Mandelbrot [Map] Set would serve as a location-based geography of how the multiverse would work. These can be used hypothetically to know in advance the exact location where a jump through Space will take a spacecraft. If you want to understand these sets the video below offers a succinct explanation. These sets demonstrate how potentially vast the geography of Space is.
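For readers who want to see what the referenced sets actually compute, the snippet below runs the standard escape-time test for membership in a Julia set; the constant c is an arbitrary, commonly used example, and its use as a "map" of a universe remains the hypothesis of this section rather than a property of the set itself.

```python
# Standard escape-time test for membership in the filled Julia set of
# z -> z^2 + c, the mathematical object referenced above. The constant c is a
# commonly used example value (near the "Douady rabbit" region).

def in_julia_set(z: complex, c: complex = -0.123 + 0.745j,
                 max_iter: int = 200, escape_radius: float = 2.0) -> bool:
    for _ in range(max_iter):
        if abs(z) > escape_radius:
            return False      # the orbit escaped: z is outside the set
        z = z * z + c         # the defining iteration z -> z^2 + c
    return True               # bounded for max_iter steps: treat as inside

print(in_julia_set(0 + 0j))      # True: this orbit stays bounded
print(in_julia_set(1.5 + 1.5j))  # False: escapes immediately
```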
Furthermore, accelerating from 0 to 2.5 million times the speed of light in one second does not face interference from any primitive notions of being affected by g-forces, requiring infinite energy to move at near light speed or gaining infinite mass as a result, as would be inferred by modern top-view physics, as this change of location or "speed" is not applied through the medium of matter-distance-time, but occurs through Space (remember, a fundamental weakness in the Theory of Relativity is that Einstein makes no distinction between distance and Space, whereas from the side view distance and Space are two distinct constructs). Though visually a vehicle travelling below the speed of light through either the Distance or Space would appear to be moving "normally", as we observe in the everyday occurrence of an airplane travelling across the sky, principally the method of propulsion using side-view Space is completely different from that used conventionally to travel using top-view distance, e.g. through thrust generated by an engine. Technically it is moving through Space and therefore without the primitive notion that g-forces would make travel at exceptional rates of acceleration impossible. This is due to the fact that a vessel moving through distance as a medium, such as an airplane, rocket or other similarly propelled vehicle, must experience top-view g-forces. A vessel travelling close to the speed of light would most likely be very difficult to navigate, and fatal matter-on-matter collisions would be almost impossible to avoid. Should it be designed to use wormholes and so on, the extraordinary trauma of traversing biological organisms through the effects of Einstein's Space-Time could prove as lethal as exposure to radiation at a nuclear power plant, whereas a vehicle harnessing Space to change its location does so outside mass and distance, without motion or "time", and therefore without any weird, excessive and primitive g-force or dangerous "Space-Time" effects on the occupants of the vessel proposed by mundane limitations in antiquated top-view theories currently applied in modern physics. A civilisation functioning on Spatial technologies would view a civilisation functioning on Relativity Theory as intelligent but very backward. Traveling through Space it has the option of moving a vessel outside our MDT universe, where the vessel simply moves through matter, be it a planet, asteroid belt, sun or debris, safely, as it does not require the vessel to make physical contact. A vessel built on Spatial technology could, while standing still, simply shift into Space and completely disappear from physical visibility because light could pass straight through it at will (no need to try to bend light around it). It could become invisible to radar at will. Objects and people outside the vehicle could pass through it, but it would still be right there observing them. It could do this on the ground or in the air. This gives it unparalleled levels of stealth. It could land on the lawn in front of your house or hover just above it and by today's level of physics, science and technology there would be no way of knowing or detecting it was right there. The advances of Spatial technology over matter based technologies found in the MDT universe are innumerable.
Being able to travel to any part of our universe instantly may allow us to explore it more effectively. Nevertheless, if there are a vast number of places to explore and infinite number of universes similar to our own we may soon discover it could take millions of years to investigate and catalogue all of what's out there. To comprehensively do this would take more than just a newfound incredible speed.
What this image demonstrates is not a distortion of Space-Time as Einstein suggested. It shows a distortion of symmetrical distances denoted by imaginary lines relevant to matter. This distortion causes any object caught inside it to accelerate toward the centre of the object. If LIGO measured this distortion then, strictly speaking, it did and yet did not measure gravity. I will explain. If gravity is caused by some aspect of Space, then it is possible gravity is merely a form of acceleration that is also being misinterpreted through the Einstein error. This simple original error on Einstein’s part, the result of a slight misunderstanding, has misled physicists for decades. It is simply a perception based problem of an age in physics that needs to be corrected to allow physics to progress into a new era. The Laser Interferometer Gravitational-Wave Observatory (LIGO) is on track but is no exception. What has been measured by LIGO is a form of distortion in the top view matter-distance-time universe. This is correct. But it has not been caused by a distortion of Space; to say this is incorrect. Gravity is a reaction to the distortion; however, the distortion is not gravity itself. Gravity is nothing more than a form of acceleration, a phenomenon no different from the force that pushes you and your passengers into the seat when the driver pushes down on the gas pedal.
If the video above and the explanation below it make sense, then do you begin to see how significant the misdirect in physics is? For instance, according to Einstein, warping Space-Time creates gravity and is capable of bending light. The bending of light as it passes large masses was predicted by Einstein. It has been proven true, and similarly many of Einstein's theories continue to be validated. However, closer examination shows that mass is not created within his Space-Time model. If gravity does not come from his Space-Time model but from outside it, then this is just another seemingly true prediction that is in fact faulty, deeply flawed or catastrophically misleading.
Einstein’s predictions have been proven true by LIGO, and will continue to prove true as long as they are based on the front or top view of the universe. However, we must begin to look at the theories he left us with fresh eyes. The fact that he confuses distance for Space will continue to have serious repercussions on the viability of modern day physics and its capacity to add value to the advancement of technology in this area. Is this concern really just semantics, for example, you say "tomayto" and I say "tomahto", but we’re talking about the same thing? No, it is not.
Correcting Einstein's view improves accuracy and provides a more definitive understanding of how gravity works. It also allows astronomers to answer, once and for all, a question that has confounded humanity's understanding of the universe: it very simply provides the answer for how and why the universe is continually expanding. Einstein would not have wanted physics to remain stagnant or indefinitely trapped in the greatness of his theories. He would have wanted to see progress, to see his unique ability to inspire new ideas that move humanity onward and that are as impactful as his own.
Cosmic Microwave Background (CMB)
If it is true that Einstein incorrectly labels the Distance as Space in his model and understanding of the universe, then all the pieces of the puzzle begin to come together. Technically, when we launch satellites and rockets we are not launching them into Space. As a civilisation we are launching them into the Distance.
Having corrected his model, it is possible to see that gravity does not emanate directly from distortion geometry (a distortion of the Distance) as Einstein proposed. It in fact radiates from true Space as a resistance to distortion geometry. From this we are able to conclude that objects are pushed by this resistance; they are not pulled by distortion geometry as Einstein believed.
If objects are not pulled directly by distortion geometry, then this requires us to revisit the underlying cause of our universe's expansion. There is a possibility that matter in our universe is being pushed outward by gravity. However, for this to be true it must take place as a result of Space resisting a distortion geometry created by a mass outside our universe, or by our universe itself pushing and pulling against the confines of Space, like waves against the shore. This then requires us to entertain the idea that our universe is not the only one here; there may be more, separated by Space, the composition of which requires further study.
If our universe is not alone and any attempt to see beyond it is obstructed by the Cosmic Microwave Background (CMB), then it would not be ill advised to assume that, though Space is ubiquitous, universes exist such that they are separated, contained and constrained by the CMB. If this in turn is true, then it may require us to accept that the so-called 5th dimension or Space, which seems so elusive and impossible to identify, is in fact an aspect of the CMB itself. This would mean it has to be considered that the distortion geometry interacts with the CMB (Space) to create gravity. It would also require us to entertain the idea that the CMB is more than just a remnant of the big bang. It may in fact be the elusive side view, or 5th dimension itself, right under our noses. If there are many or an infinite number of universes, and each one of these universes is represented by just one signal which, when tuned into, becomes a "channel", dimension, continuum or our MDT universe as we observe it from the top view, then all of these continuums or "signals", of which our own universe is merely the one we are tuned into, when combined form the noise observed in the CMB. It looks and sounds like swarming bees because what is being observed is all the continuums, universes or signals in one Space. Consequently, the CMB is our first introduction to Space itself. "With a traditional optical telescope the space between stars and galaxies (the background) is completely dark. However, a sufficiently sensitive radio telescope shows a faint background noise, or glow, almost isotropic, that is not associated with any star, galaxy or other object" (Wikipedia 2017). What this would mean is that what astronomers are observing with a radio telescope is in fact a semblance of Space itself, or the 5th Dimension. The CMB may be key to unlocking much of what science does not know and understand about gravity, and to what is required to build devices that can control gravity itself. What appears as visual noise in the CMB is probably not noise at all. It only appears as random noise to us because we have not yet designed a receiver that can interpret what the CMB is broadcasting, which is most likely highly evolved, intelligible and organised, and may include inter-dimensional locations that act as beacons for use in tuning into and out of sectors of the universe. Any attempt to understand the CMB or explain it using the four (4) known dimensions is probably a waste of time and will yield a faulty model with misrepresentations and misinterpretations that only further mislead the scientific fraternity. Einstein's Space-Time consists of 3 directions and the 4th dimension, Time (where Time is nothing more than moving matter or "Motion"), which together form Space-Time. We have shown that this is not Space-Time and corrected it to Distance-Time, a Matter-Distance-Time (MDT) universe, or better still a Matter-Distance-Motion universe. If the CMB is indeed Space it is of a 5th dimensional construct. Most people try to add a 5th dimension to the 3rd and 4th to arrive at a 5th. Interestingly enough, we do not add an additional dimension to the 4th. Instead we should subtract Time and subtract Distance. Why? Because to tune into a new continuum, dimension, signal or universe we do so by tuning out of the one we are already situated in. Having removed these we enter the 5th dimension and tune from there into the next location of our choosing.
I have already extensively elaborated on how the process of tuning is the function of removing Distance and Time from the physics we use to understand our universe. This 5th dimension is Space, which can be identified as, or through, the CMB. If this is true the CMB cannot be understood using conventional physics. Trying to analyse a 5th dimensional universe using 4 dimensions will yield many false positives, and it is very likely that almost everything physics and astronomy thinks it knows about the CMB today is flawed, built from faulty top-view observations and therefore a half truth. It can only be fully understood when studied outside of Time (Motion) and Distance. Since Time (Motion) and Distance are the foundation upon which the entirety of physics is built today, we do not as yet have the formal reasoning, math, approach or model by which to begin to understand the CMB or how to build a 5th dimensional tuner that will make sense of it. I have tried from the beginning of this write-up to do this, where I have also tried to point out that motion in a linear direction may not exist; however, motion or movement with no vector, such as spin, may be accommodated in attempts to understand Space. For instance, the difference between a distance of a kilometre and 5 million light years should not be seen in terms of how far off they are, but rather in terms of the frequency they spin at, their radius from the centre of spin and the direction they occupy in that radius, which when tuned into is gained. Every location in the MDT universe, from earth to the furthest galaxy, will have a specific frequency in Space on a scale infinitely tinier than the quantum level, allowing matter to be manipulated below the nuclear level and distances to be covered across the universe. If the Spatial frequency of any location in the Distance is known, it does not matter how far away it is in our universe, it can be tuned into; and it does not matter how tiny it is, it can be super-manipulated, for instance allowing bespoke materials to be constructed from the electrons, protons, nucleus and below. Spin alludes to a form of physics centred primarily on spin-based frequencies, which become the only basic means of rationally linking Space, where there is no distance, to our MDT universe, distance and spin having some shared properties that can be used to find workable mathematical linkages between states of existence that function on different properties. Staying in this line of thought, if a complete spin cycle is equivalent to a refresh rate and is the only logical means of linking true Space and the MDT universe, then this tiny aperture may yield more about technologies capable of directly manipulating Space and therefore a plethora of other phenomena, including gravity. Thus far we generally study 2 dimensional waves using amplitude (y-axis) and time (x-axis) to understand magnetism, but to understand gravity we would have to consider a third and fourth property of electromagnetic waves: a rate of spin around the x-axis, just as fast as the wave moves along the x-axis, that loops or corkscrews both the wave's amplitude and time from a 2 dimensional electromagnetic construct into a 3 dimensional wave.
This wave is then pulsed on and off just as rapidly, for instance, to create a 4th dimensional wave property, consequently allowing spin and a refresh process to create specialised frequencies that open a path to harnessing gravity by linking it to a 4th dimensional type of electromagnetism to which Space is able to respond with a push effect. Interestingly enough, in quantum mechanics it was discovered quite late that electrons do actually spin. Yet again, however, we find that this specific spin property was strangely missing from Schrödinger's understanding of waves and was consequently not included in his famous wave equation. How is this oversight even possible? It is incredible how great minds in physics such as Einstein and Schrödinger could make such immense strides and insights and yet produce ideas that seem to have very obvious flaws, flaws that appear to act as misdirects preventing a clean or clear understanding of gravity and Space. However, it is also possible to conclude that Schrödinger may have noticed spin but, because his physics was based on Einstein's erroneous Space-Time model, could not account for it in his equations and decided to ignore it altogether. This possibility simply emphasises the potential of Einstein's flawed model to weaken the analysis and research of past and modern day physicists. Just as Einstein's misdirect affected the outcomes of Schrödinger's work, it may still be affecting the work of physicists today. What other small lab research projects and billion dollar experiments working in earnest are likely being led in the wrong direction by this misdirect? The potential harm the misdirect can cause in physics is real.
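To make the corkscrewing, pulsed wave described above a little more concrete, here is a rough numerical sketch. The parameter names and values are purely illustrative assumptions of my own, and this is only a way of visualising the idea, not an established formalism.

```python
import numpy as np

# Illustrative sketch of the wave described above: an ordinary wave whose
# amplitude is spun around the axis of travel (x) and then pulsed on and off.
# All values are arbitrary and chosen only to make the shape easy to plot.
c        = 3.0e8     # propagation speed along x (m/s)
freq     = 1.0e9     # ordinary wave frequency (Hz)
spin_hz  = 1.0e9     # rate of spin around the x-axis ("just as fast as the wave moves")
pulse_hz = 2.0e8     # on/off pulsing rate, the proposed fourth property
A        = 1.0       # amplitude

t = np.linspace(0, 10 / freq, 4000)     # a few cycles in time
x = c * t                               # position along the axis of travel

pulse = (np.sin(2 * np.pi * pulse_hz * t) > 0).astype(float)   # square on/off envelope
phase = 2 * np.pi * freq * t                                    # conventional oscillation
spin  = 2 * np.pi * spin_hz * t                                 # rotation about the x-axis

# The 2-D amplitude-versus-time wave becomes a 3-D corkscrew: the amplitude is
# swept around the x-axis by the spin angle and gated by the pulse envelope.
y = pulse * A * np.cos(phase) * np.cos(spin)
z = pulse * A * np.cos(phase) * np.sin(spin)
# Plotting (x, y, z) traces the looping, gated waveform described in the text.
```

Whether a wave of this shape would couple to gravity in the way proposed is, of course, exactly the open question raised above.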
Nevertheless, I am of the opinion that the complexity of the technology used in the creation and design of this receiver or tuner will determine the myriad ways in which gravity and related phenomena can be manipulated to obtain desired results, much the same way electricity is used by different technologies to produce many devices with innumerable uses. The reason we fail to identify the CMB for what it really is, is the misdirect in Einstein's model that mistakenly labels Euclidean space as Space itself, when it is in fact just Distance linked to Time, Time itself being nothing more than Movement or Motion.
One of the consequences of Einstein mislabelling distance by calling it Space has been the inevitable confusion it has created amongst the scientific fraternity. It misdirects astronomers by making them believe that when they look up at the stars in the night sky they are looking at "Space", and misdirects physicists by making them believe the vast emptiness between a nucleus and its electrons is "Space", or that when distortion geometry is observed they are looking at Space-Time. This error is so pervasive it almost seems clandestine, in that it is easy to conclude it might be deliberate. It gives humanity only one option for controlling gravity, through Einstein's erroneous Space-Time, making it practically impossible to do so. We keep walking past the elephant in the room, even though we are desperately looking for an elephant. A consequence for science is that right now gravity, its understanding, manipulation and control, is pretty much like electricity during the Stone Age: electricity has always been here; what was lacking in the past was the means to see, understand, control and harness it. This seems to be the very same problem with gravity today. It is right here in front of us, but because of Einstein's error we simply cannot see it, understand it, harness it or control it, even though it is right in front of us all. Gravity, when seen and understood, should be just as easy to control as electricity. This is unlikely to happen without correcting Einstein's model.
In fact, seeing Space and where gravity is coming from may not be as impossible as you think. It won't cost you an arm and a leg either. If you know what you are doing you may not have to build a billion dollar, multi-kilometre long array to understand gravity. If you have one of those old television sets, go and switch it on. When you are between channels, the "snow" you see is your first introduction to Space and where gravity is coming from. About 1% or less of the noise you see on the screen is the CMB.
Back to LIGO
Why are we being pushed toward the earth with an acceleration of 9.8 m/s² (a force of about 9.8 N for every kilogram of mass, roughly 690 N for a 70 kg person) rather than pulled by the earth? The distortion may be caused by the earth, but the gravitational push of 9.8 m/s² is not coming from the earth; it is coming from Space.
A billiard ball can knock another billiard ball. This is an example of matter-on-matter action. Distance is just an aspect of matter, as matter is just an aspect of distance within the same continuum. For a large mass to act on the geometry of distance, bending, pushing, squashing or stretching it, is nothing more than matter acting on matter. What LIGO has done is very important. To some extent what it has done is not prove Einstein was right; it has proven he was close to the mark but, to some extent, got it wrong, once the labels are placed correctly. With the labels in the right place it can be seen that the distortion detected at LIGO is not a gravity wave. It is the detection of a distortion in the matter-distance-time (MDT) universe we occupy. Any gravitational effect is "push-back" or "resistance" from ubiquitous Space: a by-product of the distortion that is a response from Space. The reason this cannot be seen is that Einstein mistakenly labels distance as Space in his concept of Space-Time or Relativity Theory.
The diagram above shows that mass, gravity and acceleration do not emanate from our dimension. Correcting Einstein's model shows that they are all created by the same force. Since it emanates from Space, it is outside our dimension. Consequently, as I mentioned earlier, objects have no actual or substantive volume or mass and therefore no genuine weight. Mass and time are useful for the experiential Universe, but are not practical or efficient for the mechanics of how the Universe is created (that is, the operational Universe). It is not scientifically practical for matter or objects to be of excessive volume or weight, for primitive top-view "Einsteinian Space" itself to be of great "distance", or for time to be of a burdensome duration; these will all inevitably be seen as very crude ways of understanding the Universe and the physics that applies to it.
Why do I keep belabouring this point? Thus far very little is known about Space. We tend to think we know a great deal about it because Einstein mistakenly took distance and called it Space. This is understandable, because it is a perception-based error anyone can make, one belonging to an age in science. It has had immense repercussions in physics, but as with any subject of importance, changes in perspective bring about new ways of approaching the same ideas. If, for instance, distance and Space are completely different things, then the push effect depicted in red in this diagram does not have to be caused by distortion geometry in our matter-distance-time (MDT) universe. LIGO has proven that our MDT universe is very stiff or inflexible. Geometric distortion as a technology or means of creating gravity would require tremendous amounts of power to generate the resistance that would induce acceleration and that we would observe as gravity. Since gravity does not emerge directly from distortion geometry but from Space, which is outside our MDT universe, the fact that our universe is stiff provides a very simple explanation for why gravity is experienced as such a weak force. However, since gravity is a by-product of geometric distortion, why use this very difficult, extremely weak, nearly impossible route to manipulate gravity? Why don't we boldly go where no man has gone before and instead go straight to the source of gravity, namely Space? Tiny manipulations of Space can induce a much larger push effect on matter-distance-time. However, finding Space, understanding what it is and how it works outside of matter, distance and time is the frontier physics needs to delve into. But you cannot look for something you believe you have already found. You will simply stop looking, which is the tragedy. Today we mistakenly point to distortion geometry and call it "Space" or "Space-Time" when in fact we have mislabelled it and therefore have not yet found what it is we speak of, inevitably misleading ourselves. When this happens physics stops moving forward in leaps and bounds because it is trapped in a theory loop caused by a misdirect. This is why I keep belabouring and stressing the need for science to correct Einstein's model.
Recent Media to Watch:
This interesting documentary, released recently (January 2019), concurs with my analysis. Here is the link to it. It is called Einstein's Quantum Riddle. The Institute for Advanced Study featured in the documentary is getting closer to the truth (at 46:34 in the video). When Robbert Dijkgraaf (Director of the Institute for Advanced Study and Leon Levy Professor) talks about correcting Einstein's model or understanding of the universe, he implies Space-Time is actually incorrectly interpreted by Einstein. The Institute is absolutely right in the sense that when Robbert Dijkgraaf talks about removing "Space and Time" altogether, what he actually alludes to is removing the concept of "Distance and therefore Motion" (as they are conventionally understood) from Einstein's interpretation. Space and Distance are separate and distinct, as are Time and Motion, and the general mistake that Robbert Dijkgraaf remarkably corrects by removing Space-Time (sic Distance-Motion) is Einstein's assumption that Space and Distance are one and the same, which is why it became impossible for Einstein to complete Unified Field Theory and why, with this problem now resolved, it should be possible to. It is certainly interesting to see that Space can exist independently and irrespective of Distance to create a "Holographic Universe" where spooky action at a "distance" becomes somewhat redundant when distance is removed to hypothetically create a universe consisting purely of quantum entanglement. This is a great documentary. It is nice to see the analysis I made many years ago proving to be correct today.
Notes:
Space and Time
According to Einstein's model, when you get up in the morning, get dressed, then go to the kitchen for breakfast, then stop in the living room to catch the news on TV, you have been to three different rooms at different times: the bedroom at 7am, the kitchen at 8am and the living room at 9am. However, according to my theory you woke up and got dressed, had breakfast and watched TV in one location, one frame or one dimension. Imagine you were watching these events on your TV: they would all have taken place in one location, that is, the TV screen in front of you. The bedroom, kitchen and living room are in fact in the same location, frame or dimension. If a physicist were calculating what you did based on the distance between each room and the time it took to move from one room to the next, all these calculations would, in a sense, be baloney, because you never actually moved to get from one room to the next. You were in fact in the same place the whole time; therefore time itself, as you may have been taught to understand it, did not elapse.
A Final Conclusion: Economics and Theoretical Physics [July 2020]
I can conclusively say I have broken the seals, so to speak, on two important areas where modern science has to date failed to deliver conclusive results: the first is the inability of economics, business, accounting and finance to intrinsically identify the cause of, and provide a solution to, ending poverty. The other is the inability of physics and the sciences in general to explain and provide a working model or mechanics of a system able to deploy and harness gravity. Even though I may try to play it down, I am glad to say that in this month of July I have successfully and beyond reasonable doubt accomplished both these tasks. The arguments in the writing above demonstrate that I have spent many years trying to get down to the root of these problems and the knowledge paradigms in which they were enshrined, therefore this month represents a personal triumph and I feel at peace. Gravity is the most powerful force in the universe when it comes to humanity's physical existence, but scarcity is the most powerful when it comes to the resources humanity needs for its well-being. The fact that the sciences were unable to provide conclusive answers to these problems was a troubling issue for me that raised many questions about inconsistencies and inadequacies in knowledge and ascribed intellectual limitations. These were perception-based problems, the kind, it seems, that are the most difficult even for the most astute minds because they require counter-intuitive processes to unravel the mysteries that cloud the path to accurately determining their truths.
[1] Punabantu Siize (2004) “Time”, Revision of Punabantu S (Nov 2003) “African Time”, Post Newspaper
[2] Einstein Albert (5th May 1920) “Ether and the Theory of Relativity” (an address delivered on May 5th, 1920, in the University of Leyden)
[3] Ibid.
[4] Op. cit.
[5] Jeremy Chapman (2010) “Relativity and Black Holes : The Beginning Becomes the End Becomes the Beginning : A study of cosmological birth and death”
[6] Tim Folger (2007) “Newsflash: Time May Not Exist”
[7] Wikipedia (2010) “Wave–particle duality”
Piloting through Geodesics will be different from aeronautics. Pilots will need to understand how Geodesics perform and react to an object travelling through the Geometry of Space, in much the same way the performance of craft in a wind tunnel is assessed; the diagram above, in this sense, is like a Geodesic tunnel. Once again, viewing Space-Time as having Geometry (Se-Te) is useful for depicting or explaining gravity, but it should be applied with caution in the knowledge that, with analysis from St-Tt, "there are no fields" outside an object or vessel; all forces act within their mass. If this distinction is not understood it becomes an obstacle to understanding gravity. There are speeds and rates of acceleration for which Geodesic elasticity declines (moves from being elastic to inelastic, X-Y) and speeds and rates of acceleration for which Geodesic elasticity increases (moves from being more inelastic to more elastic, Y-Z). For vehicles or objects travelling at normal velocity in the super-slow range of Mach 20 or so toward 0x, the biggest problem is air resistance. For objects approaching the speed of light, Y, the biggest problem is the resistance of Space-Time (Se-Te), which is expected to become infinite at Y. However, it is important to note that there is a difference between speed or velocity and acceleration. A pilot can navigate a vessel in such a way that there is control over the g-force the passengers experience. For instance, to perform supernormal acceleration, it is possible a pilot can accelerate from close to 0 to 3 times the speed of light for 1/10th of a second in order to take off or execute a turn. He or she can use a rate of acceleration that exceeds the speed of cause and effect, thereby slipping through the light barrier without ever having to actually reach the speed of light limit. Technically, this means that for objects below supernormal rates of acceleration the speed of light can become a fixed barrier at early sub-light speeds, because they cannot escape their own mass, which would agree with Einstein's view. However, objects accelerating at supernormal rates may alter the elasticity of the Geometry of Space, allowing them to manoeuvre around this barrier by manipulating their own mass or g-force. Se-Te Geodesics are likely to respond relative to the rate of acceleration, not just a constant velocity. This means a pilot can possibly bypass the speed of light limit at Y by accelerating from 0x to 0z at a supernormal rate of acceleration long before the vessel reaches the speed of light, such that mass cannot keep up with the vehicle, consequently making it impossible for the barrier to prevent it from crossing over. Whether this kind of slip-stream is possible or not will have to be verified through experimentation. In this example, unlike in the first, there is no Geodesic boom, as the pilot skilfully accelerates through Geodesics at a predetermined pace, ditching mass and moving directly into the slip-stream where the craft experiences no resistance, since it is moving outside the normal timing of the physical laws of cause and effect that restrain objects or craft moving at lower speeds and lower rates of acceleration.
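Purely as a conventional, top-view back-of-envelope check of the numbers used in this example (the whole argument above is that these figures would not apply to a vessel manoeuvring through Space, so this is only to show the scale involved):

```python
c = 299_792_458        # speed of light, m/s
g = 9.80665            # standard gravity, m/s^2

delta_v = 3 * c        # from rest to three times light speed
delta_t = 0.1          # in one tenth of a second

a = delta_v / delta_t
print(a)               # ~9.0e9 m/s^2
print(a / g)           # ~9.2e8 g in conventional, top-view terms
```

In top-view terms no craft or occupant survives acceleration of that order; the claim being made here is that a side-view change of location would simply not be subject to it.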
Even conservation of energy laws are governed by functional rates of cause and effect; there are natural rates at which objects and chemicals react, move, fall, rub, turn and burn, and without circumventing any of these laws it must be accepted that when an object begins to move at supernormal speeds that exceed the pace and timing of these natural physical laws, the consequences of the increasing lag between cause and effect have to be taken into account by physicists. For a vessel travelling at Mach 4 at 0x, wind resistance, friction and heat are obstacles; however, for an object travelling at supernormal speed, air may make contact with the vessel's surface but the vessel may be moving faster than it takes for natural physical laws to convert that contact into pressure. If the air doesn't have time to build up pressure before the vessel has moved on then there will be no resistance; without resistance there is no friction, and without friction there is no heat. Furthermore, though travelling at the super-slow speeds at 0x, the pilot can still make turns with no g-force simply by momentarily accelerating into the slip-stream while executing the turn and then returning to normal 0x flight speeds, with the objective of increasing manoeuvrability without gaining excessive speed. Though the Geometry of Space and Geodesics are a Se-Te reconstruction used to make explanations seem simpler, when analysis switches to St-Tt the "fields do not exist" approach applies. |
The diagram above shows that physical laws are governed by cause and effect and are tightly chained together, such that as a vessel begins to travel at Mach 3 and above, friction and heat are at high levels due to air resistance. The way around this is for engineers to design aerodynamic vessels, apply special surfaces and so on, which improve performance by reducing friction, pressure and heat. This can be described as reducing outright resistance to reluctance. Every force in physics, be it mechanical or chemical, is applied through cause and effect that is calibrated by time. When a vessel begins to accelerate so rapidly that it moves faster than the time frame for cause and effect, the links in the chain start to come apart and reluctance is likely to give way to submission, allowing the vessel to move in peace, unhindered by air resistance. Technically, none of the laws in physics, be it pressure, friction, heat or chemical reactions, will work effectively at this point because it will be as though cause and effect are held at bay or frozen in time. This allows a vessel accelerating fast enough to slip-stream with no resistance, or with full submission. It is also important to note that the slip-stream for Space-Time (Se-Te) at Y and the slip-stream for air resistance are likely to give way at different velocities and rates of acceleration. Every medium, be it air, water, solids or Se-Te, will have a specific slip-stream threshold for resistance, reluctance and submission that can be discovered through analysis, research and experimentation. The slip-stream for air, for instance, may occur at velocities much earlier than scientists expect. It is also important to note the difference between reluctance and submission. At the reluctance stage the bond between cause and effect is strong, reinforced by time being unaffected for the most part; it is the vessel that creates the reluctance. However, as velocity increases, the bond between cause and effect will begin to weaken until a speed is reached where cause is moving toward effect at a pace slower than the vessel's velocity. When this happens it is not the same as aerodynamics. The process will appear to neutralise physical reactions at the molecular and subatomic level, allowing the vessel to move unhindered by resistance from air, and will have to be studied in greater detail, especially through experimentation. It is also important to note once again that causality is not the same for all substances. Causality for electricity, for instance, can be as low as a hundredth of the speed of light. Signals in the nervous system travel at 70-120 metres per second. Some chemical reactions are very slow while others can be extremely rapid; for instance, scientists have observed hydrogen atoms bind onto and then leave a sheet of graphene, all within ten quadrillionths (10^-14) of a second. Within its own framework causality will often appear instantaneous to individual mediums. Therefore, resistance, reluctance and submission will have to be studied on a case by case basis. Tier 1 gravity, though entry level, may prove important for testing faster-than-light acceleration through experimentation. The rate of acceleration, rather than just velocity, may hold the key to understanding gravity to the same degree electricity is understood. Accessing the speed of light for geographical distances on earth may be impractical, since at this speed a vessel could circumnavigate the earth in a fraction of a second.
By using the slip-stream it could do so without causing adverse effects to the vessel or the earth. However, when it comes to astronomical distances the speed of light can be considered exceptionally slow. The technology to generate thrust many times greater than the speed of light is currently possible through tier 1 gravity, and velocities greater than the speed of light may be possible once a vessel is in the slip-stream where there is no resistance from Se-Te. Whether it can travel at several times the speed of light in the slip-stream would have to be tested through experimentation. Should the slip-stream be possible but prove to impose limits on velocity, the other option is to further enhance the vessel by including a warp bubble. When the vessel accelerates at supernormal velocity, much lower amounts of energy may be required to then generate a warp bubble to further increase its velocity; this would involve a jump to light speeds, then a jump to warp speeds that can be limitless in terms of velocity. The other option is to develop the technology to make jumps from one location to another over distances that are so large they are impractical even for vessels travelling at light speed. As we saw earlier, this technology, which requires teleportation to cover extraordinary distances, is very likely to be gained from technologies thought by physicists to allow time travel based on Einstein's accurate but limited experiential top view of Space-Time (Se-Te). As shown by the diagram, the speed of jets and rockets, even if they were to reach Mach 100, would barely register on this scale. They would be like an ant crawling along the ground to a far-off destination and would remain somewhere around 0x. Even if a vessel were travelling at Mach 20, a vessel built on advanced entry level (tier 1) or first generation gravity could circle the earth fast enough to find the vessel moving at Mach 20 in relatively the same place. This is why even 6th generation jets depending on air (be it jet, rocket, turbojet, ramjet, hyperjet or scramjet), which can only operate at super-slow speeds at 0x, would be no comparison for vessels able to operate the full range from 0x to 0z. This type of technology will continue to be useful for super-slow speeds, shorter distances and for acting as a power plant rather than for propulsion. |
If it were easy it would have been achieved long ago. Consider that LIGO, funded by the National Science Foundation and operated by Caltech and MIT, despite detecting gravitational waves, has produced no practical, working gravitation-based devices; this shows that this is not an easy field. Despite Paul Dirac unifying Quantum Mechanics and Special Relativity way back in 1928, as well as advanced approaches that are current such as String Theory, there are no working devices, no applications to show that harness gravity, which should give you an idea of how peculiarly hard to grasp this field is. This is like a Jedi Master on the Jedi Council who cannot build a lightsaber, despite years of effort and years of toil. The fact that some of the best universities, companies, institutions and minds in the world have failed to develop the physics and engineering required to design tier 1 gravity is testimony to the degree of difficulty involved in deciphering how this device will work. Though it may appear deceptively easy to understand once accomplished and the designs are seen, the level of technical difficulty is quite high. As mentioned earlier, the most significant hurdle that prevented this technology from being developed before now is the misinterpretations or misdirects found in Einstein's descriptions of Space-Time (Se-Te) and misleading analysis of magnetic and gravitational fields, which easily led anyone in this field in the wrong direction. Basically, it is difficult to find something if you keep looking in the wrong place. The mock-up of twin jet engines powering a tier 1 gravity drive in the diagram on the left shows that the exhaust from the jet engines is not necessary for propulsion. However, placing the jet engines perpendicular to the direction of travel is more stable in inclement weather and makes for a highly manoeuvrable vessel: by vectoring the twin exhausts in the same or alternate directions the vessel becomes omni-directional while still retaining supersonic velocity. The engines reduce redundancy and can still be a useful source of thrust applied to enhance mobility rather than being wasted. The jet engines can be swapped out for a power plant consisting of electric motors and batteries to create a zero-emission vessel, as shown in the diagram on the right. In future they can also be swapped out for fission and fusion energy sources currently in development. Each power plant design and configuration, be it jet or electrical, will have its inherent advantages and disadvantages depending on what it will be used for. Tier 1 gravity harnessed to a power plant able to put out 30,000 horsepower and above should be capable of supernormal acceleration and speed for a small vessel as shown. However, without it, the thrust from a jet engine directly applied may only generate 40,000 lbf. The advantages of using the propulsion device become clear. |
VTOL can be standard for all types of vehicles |
The explanation is that light bends because it follows geodesics, yet later, in the blue spandex experiment with the students, we identified that an object's mass is not induced or caused by geodesics but comes directly from Space (St), or the programming executed by quantum mechanics. This is based on a line of thought further reinforced by postulating that both gravitational and electromagnetic fields do not exist in the sense that they do not exert a force. What is observed as a geodesic is in fact the path a host of given masses choose to take, directed by quantum mechanics as entangled masses communicate with one another, much in the same way level 5 self-driving software and technology today is expected to safely guide vehicles on busy highways, except of course these processes become viewed as natural. The vehicles are not being guided by fields; they are communicating with one another, or with what they "see", and making internal changes in their mass that move them into different positions, and when this movement is observed, such as the movement of the earth around the sun, it is interpreted as gravity. This is in fact an example of tier 1 or 1st generation gravity at work. The difference, where quantum mechanics is concerned, is that this self-driving is what becomes the natural laws in physics that govern the behaviour and interaction of matter and mass. For instance, this entails that even the magnetic fields in the Hadron Collider and in fusion energy do not exist, and the elements moving through them are not being guided by magnetic fields but by this level 5 "self-driving" process created by quantum mechanics. Subatomic particles, magnets, planets, moons and asteroids are all generally moved in this way, by "level 5 self-driving quantum mechanics" so to speak. The path they take is determined by quantum mechanics, the result being that the behaviour of matter and the observed path are misconstrued as a magnetic or gravitational field in Einstein's Space-Time (Se-Te). To consistently and unwaveringly try to explain gravity through the Geometry of Space, basically fields or Einstein's Space-Time, and compare it to magnetism can be useful, but ultimately misleading if, technically, these fields do not exist. It poses a significant problem. However accurate the results may be, basing them on an inaccurate assumption comes with consequences that significantly hinder progress in physics. For instance, it encourages physicists to believe that since Geodesics determine mass, the way to control gravity is by bending Space-Time. This is most likely why important undertakings in science like LIGO attempted and succeeded in detecting gravitational waves or fields. It should be noted that matter appearing to react to a field or to detectors is not evidence that a field exists. This was alluded to earlier with metal shavings tracing lines around a bar magnet and these "lines of force" being used as evidence of a field. This evidence does not hold if the shavings are moving themselves into a formation on signals or instructions from quantum mechanics. The behaviour of the shavings implies fields exist, when in fact they may not.
As mentioned earlier, technically, it means there is really no difference between gravity and magnetism. This would require a re-examination of these processes and devices. Since entanglement ignores distance, there is a capacity for simultaneity for distant events regardless of how far apart they are, and it is therefore able to operate in true Time (Tt), or where Time = 0, where zero represents entanglement or a processing speed for information applied to matter by true Space (St). Simultaneity for distant events becomes necessary for maintaining accuracy when processing information (the side view), but is not necessary for the experiential universe (top view), which is time-shifted and where Einstein's theories on relativity appear to work accurately. I can understand why it is easy to believe that entanglement, or simultaneity for distant events, means free will does not exist. This is actually incorrect. It is the other way around: free will can only exist with this simultaneity. When there is no simultaneity, there is no free will. This was explained and alluded to earlier by showing that lack of simultaneity reduces events to what can be described as a "record" of events, in which case once an event has taken place it cannot be changed; therefore, there is no free will in this condition. These records or recordings can be reviewed in real time to form new opinions or views of these events.
Imagine that the vehicles moving in the Coruscant Supercut from Star Wars were doing so under Level 5 autonomy and each had a beacon that allows it to know where other vehicles are in relation to itself in order to enhance navigation. No field is ascribed to their movement; however, the manner in which planets, moons, asteroids, stars, magnets, subatomic particles and particles in a Bose-Einstein condensate move can be described using a similar process, driven by quantum mechanics. Tier 1 or 1st generation gravitational force does not need a geodesic or a geometry of space to ride on. An untrained eye may say the vehicles are moving along a geodesic, floating on a field or riding on a medium the way planes ride air or boats ride oceans, when in fact there is no medium and no field, because each is being steered autonomously, making changes within its own mass using what can be compared to "Level 5 autonomous driving", which is really just a way of referring to the code that manages the laws of physics. This is how tier 1 gravity works and how it can be emulated using mechanical engineering.
In terms of quantum mechanics there is little difference between how the ships in the Coruscant Supercut are moving and how the planets in the above clip are moving. They do not need Geometric Space, Geodesics, or a magnetic or gravitational field to do this; however, these can be considered a useful method of visualisation. The drawback of physicists using the Geometry of Space to try to understand gravity is that it encourages the belief that mass is created by this geometry, when in fact this is not true or accurate. The technical design for the first device to use tier 1 gravity will make this evident.
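The analogy can be put into a toy simulation. The sketch below is only an illustration of the Level 5 analogy, not a physical model: there is deliberately no field object anywhere in the code. Each "vessel" reads the broadcast positions of the others, decides internally how to adjust its own velocity to keep a safe separation, and moves itself. The beacon range and steering gain are made-up parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
pos = rng.uniform(-50.0, 50.0, size=(8, 2))   # eight vessels on a 2-D plane
vel = rng.uniform(-1.0, 1.0, size=(8, 2))

BEACON_RANGE  = 10.0    # hypothetical range within which another beacon is "seen"
STEERING_GAIN = 0.05    # hypothetical strength of the internal course correction

for step in range(1000):
    for i in range(len(pos)):
        # Each vessel reads the other beacons (communication, not a field)...
        offsets = pos - pos[i]
        dist = np.linalg.norm(offsets, axis=1)
        nearby = (dist > 0) & (dist < BEACON_RANGE)
        if nearby.any():
            # ...and makes an internal change to its own velocity to steer clear.
            away = -offsets[nearby] / dist[nearby, None]
            vel[i] += STEERING_GAIN * away.sum(axis=0)
    pos += vel * 0.1    # every vessel moves itself; nothing external pushes it
```

An observer plotting the resulting trajectories could easily be tempted to ascribe them to a repulsive "field" between the vessels, which is precisely the kind of misreading the passage above warns against.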
However, according to Einstein, any object travelling at the speed of light will gain infinite mass. This should make light speed impossible, even for light itself. And yet there is clear evidence light travels at the speed of light. According to Einstein, as an object approaches the speed of light it experiences time dilation; in other words, time begins to slow down, and at the speed of light time dilates to zero, yet an object is also said to gain infinite mass, making the speed of light limit appear to be simply another blatant and possibly uninformed contradiction. If time dilates to zero at speeds approaching or equal to light, then mass in relation to time should cease to exist, causing the properties of an object reaching such high velocity to begin to convert from matter into energy, or from a particle into a wave, in order to accommodate supernormal speeds. This creates the impression that an object's velocity in relation to its mass does not easily follow conventional physics at supernormal velocities, just as we see with the infinitely small in quantum mechanics. How can these contradictory theories coexist and the ideas remain meaningful without becoming just another misdirect that compromises the ability of physicists to realistically draw accurate assumptions in theoretical physics? It is simply not good enough to explain them away using "relativity"; clever, believable and provable but ill-explained physics leads to faulty misconceptions and has a negative knock-on effect that hinders future developments in the sciences. The inability to manipulate gravity despite the strides made in physics is a red flag that points to fundamental mistakes in theory being swept under the carpet in order to worship dated and inconsistent ideas, to the extent that it is no longer science being followed. The answer to why light can travel at light speed, according to Einstein, is that it has no mass. So light exists in a state in which it has mass and is mass-less?
This video illustrates the unacceptable contradictions, conundrums and patchwork in theoretical physics that arise out of a faulty model from Einstein that needs to be corrected for the facts to be interpreted correctly.
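For reference, the standard top-view relations being criticised here are the ones built on the Lorentz factor, which governs both the time dilation and the relativistic mass increase mentioned above:

```python
import math

c = 299_792_458.0   # speed of light, m/s

def lorentz_factor(v):
    # gamma = 1 / sqrt(1 - v^2/c^2): time dilation and relativistic mass both scale with this
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

for fraction in (0.5, 0.9, 0.99, 0.9999):
    print(fraction, lorentz_factor(fraction * c))
# The factor grows without bound as v approaches c and is undefined at v = c,
# which is the source of the "infinite mass" and "time dilates to zero" statements.
```

These are the standard textbook relations; the argument in this section is not that the arithmetic is wrong, but that the model it is built on mislabels what is actually doing the work.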
This is downright contradictory. However, instead of correcting Einstein's model, the solution in physics is often to explain away improper theory and back this with "relativity", jargon and relevant maths. I am more inclined to believe that light does not travel from point A to point B. Rather, it moves like a Mexican wave, transferring light. When this transfer is observed it appears as though light is moving. When a stone is dropped into a pond, the ripple effect takes place such that the water remains in place and the ripple radiates like a Mexican wave; the water itself does not move from the centre to the periphery. Photons, like the lake itself, are ubiquitous and remain in place. The photons contain their own energy. When there is no light the photons, like dormant spinning tops, remain unmoving and opaque, creating what appears to us as darkness. Despite being opaque and creating darkness they still contain energy, but it is in a dormant state. When light is triggered from a source, the photons remain in place and, like falling dominoes, begin to stimulate, or appear to transfer, an energy trigger to one another, causing them to spin, emit light and become transparent. Since the photons stay in place, the speed of light we observe is in fact the rate at which they trigger or transfer the emission of light to one another. The cloaked or opaque photon that was in a dark state of potential energy is triggered and uncloaks, releasing light proportional to the source of the emission or stimulation. If this is true, then the less photons interact with any other sources of energy, the greater their potential or dark energy becomes in the Distance as they lie dormant. With this model all the contradictions concerning light are easier to resolve. It would explain how light can travel at light speed and have mass and yet not have that mass approach infinity: the answer is that light is not travelling at all; photons remain in place and instead change state, thereby transferring light. It explains why light is attracted or bent by gravity. It also explains why, despite appearing to travel so fast, the push effect of light is very weak.
This Mexican wave demonstrates how light propagates. The people in the stadium are like photons. When the wave begins, the light is transferred through the crowd. When we observe this "moving" wave, it is recorded by instruments as the speed of light. It travels very quickly, but because it moves through transfer, the impact of the light is very minimal, as the mass remains very low and only consists of the final photons in the wave that make contact with a surface. However, Einstein's model states that the photons themselves move, that is, everyone in the stadium stands up and starts running around the stadium. This mistake creates the problems and conundrums we see when physicists try to understand and explain light. If this happened, the mass of the running people would turn light into incredible thrust, or a weapon like a laser with deadly impact. According to Einstein this does not happen because the people running like this suddenly and miraculously have no mass, because they are no longer matter but energy. It is quite unacceptable that the physics fraternity accepts Einstein's misinterpretation of how light works, as it compromises meaningful future development of new theories. These corrections need to be made.
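The transfer picture is easy to caricature in code. The toy model below is only my own sketch of the idea, with arbitrary numbers: the "photons" are fixed sites that never move, a trigger simply hops from one site to the next at a fixed transfer rate, and the apparent "speed of light" falls out as nothing more than that rate multiplied by the spacing between sites.

```python
# Toy model of the "Mexican wave" / transfer picture of light described above.
# Sites (photons) never move; only an excitation is handed from one to the next.
N             = 1000        # number of fixed photon sites (arbitrary)
spacing       = 1.0         # metres between neighbouring sites (arbitrary)
transfer_rate = 3.0e8       # hand-offs per second (arbitrary illustration)

lit = [False] * N
lit[0] = True               # the source stimulates the first site

elapsed = 0.0
for i in range(1, N):
    lit[i] = True           # the next dormant site is triggered...
    lit[i - 1] = False      # ...and the previous one goes dark again
    elapsed += 1.0 / transfer_rate

apparent_speed = (N - 1) * spacing / elapsed
print(apparent_speed)       # equals spacing * transfer_rate; no site ever moved
```

Changing `transfer_rate` changes the apparent "speed of light" without any photon going anywhere, which is the point the later passage on the early universe leans on.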
A test is required to determine if light waves are inherently contiguous or non-contiguous...
We have thus far explained why and how light continually has mass and can travel at light speed, settling this particular problem using a stimulation theory. But this is not where it ends. Light has other properties that are confusing. It was initially thought that light is a photon or stream of fundamental particles, a theory from Isaac Newton. However, a scientist named Thomas Young later discovered through experimentation that light diffracts after interference, creating patterns on a screen and indicating light was in fact a wave. This problem became referred to in physics as wave-particle duality. This was yet another conundrum. Is light a particle or a wave? Einstein is thought to have settled this argument by declaring that light was both a particle and a wave. The explanation he gave for this conclusion was based purely on observing this duality as fact through experimentation, without further interrogation of the underlying circumstances of these results. We know this because this is where the query ended. Yet we all know what is observed and recorded is not always necessarily what is taking place. At this stage we can make certain deductions, beginning with Isaac Newton. Is light a stream of particles? No. It is not a conventional stream of particles. Why? If it were a conventional stream of particles this would once again be the same as if all the people in the stadium stood up and began to run around the stadium, creating a flowing stream of particles. We can deduce that this is most likely inaccurate because the intensity of a natural beam of light is far too weak to consist of a stream of flowing particles. This moves us to Thomas Young. Is light a wave? Through experimentation it was discovered that light scatters. Can we therefore conclude that it is a wave? No. We cannot. Why? Because to do this we would need to know whether it is a contiguous wave or a non-contiguous wave. This difference is absolutely important to physics. Both wave forms will create a scattering effect when tested. How then is it possible to distinguish between the two types of waves?
A contiguous wave is a waveform that can be described as follows: two kids take out a garden hose. They stretch the hose out and each holds an end. One child proceeds to wave the hose vigorously up and down. The child's hand on the other end of the hose receives the wave and is sharply jolted. This is a contiguous wave. The waveform passes through the hose continuously, with little loss of energy, creating a proportional impact at the other end. A contiguous wave will pack a serious punch, since the area it strikes will experience the intensity of the full length of the beam from source to target rather than just the photons that make final contact, much like the movement of electrons or electrical current through a conductive medium like wire. On the other hand, an example of a non-contiguous wave has already been alluded to: a Mexican wave. It is most likely that these two types of waves are almost indistinguishable on observation and must be tested to tell them apart. For instance, when distorted through Thomas Young's experiment they will both yield a distortion pattern. So how can we possibly tell them apart?
There are probably many ways to do this, but the only means I can think of on the fly is possibly through the intensity and amplitude of the scattered light and the spread of the distortion pattern before and after distortion. A contiguous wave will tend to pack more energy and will therefore retain more of the intensity of the stream despite being diffused. It will therefore resist diffusion, creating a pattern with a much shorter spread. However, if light is non-contiguous (is like the Mexican wave) it will be less resistant to being scattered and may therefore produce a more widely and more easily dispersed pattern. These are of course merely assumptions and could be wrong. If it were possible to measure the intensity of the pattern, it may also be discovered that the energy of the light beam after distortion is weaker, as a result of a greater propensity for photons using a non-contiguous transfer method to easily drop away, causing energy losses. Non-contiguous waves are light on their feet and can move very quickly, but do not pack much intensity, since their mass is limited to the last particles to make contact with a surface. The final intensity will tend to be proportional only to the surface area of the photons that actually make contact with a final surface. Since what is proposed here is that all light traverses rather than travels (is non-contiguous), it may not be possible, or may be quite difficult, to re-create or mimic light that is travelling (moving contiguously) from a source of emission (point A to B), except perhaps by employing concentrated light or a laser to mimic the intensity of a contiguous wave. A contiguous wave will strike a target with the concerted mass of the photons that comprise the full beam. The fact that natural light is so weak that we must use laser technology to mimic a contiguous wave is already a strong indication that light is most likely inherently non-contiguous. This is such a simple experiment that it could probably be carried out in a high school lab. It is also plausible that non-contiguous waves contain their own energy and will tend to continue to propagate at a fixed rate or speed for a while if not in a vacuum, and endlessly if in one, even after the source of emission has been turned off, whereas contiguous waves remain tethered to the source of emission and, under the same conditions, will die out more quickly when the source stops providing energy. If the sun's rays were contiguous I doubt it would be possible to step out into the open without being incinerated or electrocuted. However, if whoever conducts this simple experiment discovered, and was able with a decent test to prove comprehensively and with replicability, that light waves are in fact naturally non-contiguous, this discovery would demonstrate that Einstein's space is not a vacuum. This would potentially overturn the entire foundation of physics and be one of the most significant corrections or discoveries of this age.
This explanation of light also allows us to understand a significant problem in astronomy. Scientists have suggested that the speed of light was faster when the universe was expanding at its very beginning. Light speed at this time was faster than it is now. The problem is that according to Einstein's model the speed of light is a constant that has always been the same and will never change. Physicists are then forced to develop very complicated ways of trying to explain or reconcile theories that seem irreconcilable, in order to keep their theoretical physics within Einstein's flawed model or risk ridicule. The brilliant physicist Stephen Hawking wrote his PhD thesis on this problem. However, if Einstein's model is flawed and light does not in fact travel but is transferred, then it is common sense that the speed of light is in fact not a speed at all, but a transfer rate. If the rate at which light is being transferred between static or cloaked photons is increased, then light itself will appear to speed up beyond the speed of light constant. It can do this easily, without violating the rules of light speed by gaining infinite mass, because it is not actually moving. If the rate at which light is being transferred between static photons slows down, then the speed of light will also appear to slow down. It then becomes quite easy to explain why there are variations in the speed of light during different stages of the development of the universe. Everything related to theories on light falls into place logically and elegantly.
We know that matter and energy are basically the same. Variations in the speed of light may provide insights into the universe as it exists today. During the creation of the universe, evolving conditions, including the Distance itself, were most likely in various states of turmoil. In some of these states light slowed down sufficiently to break apart and begin to recombine into different substances. Slowing down the speed of light may be similar to melting metals so that they can be fashioned into different objects, with the exception that primary photons, as fundamental building blocks, created a virtual soup out of which any imaginable substance could emerge depending on how the primary photons came together as they coalesced. It is most likely that breaking subatomic particles down to their fundamental building blocks will eventually reveal that they consist purely of primary photons in various states of decay, this decay merely being the various patterns and ways in which slowed light coalesced to form the soup of particles from which all subatomic particles, and hence matter itself, originate.
The LHC has limitations in that it is unlikely matter can be broken down below the subatomic state even with an experiment of this scale. The LHC and its magnets could be reconfigured to decelerate light instead of accelerating particles, as a more direct way of achieving its objective. Since light has electromagnetic properties, and if it does not travel but traverses distances through a transfer rate and a non-contiguous wave, it may make sense to use magnetic fields to slow down the transfer rate and observe how very slow light breaks down and combines to form new matter. Experiments like this could also be conducted on matter itself, armed with the knowledge that at its core matter is just light, since it is constructed from primary photons. All matter is therefore simply a form of non-contiguous light. The LHC, by contrast, attempts to understand gravity through geodesics or the Geometry of Space (Se).
Slowing down light is likely to have the same effect as loosening the "glue" that holds subatomic particles together, so it is no mystery if substances affected in this way begin to break apart and therefore liquefy. Since it is known that the recently detected Higgs Boson plays a role in holding matter together and is linked to gravity, it should be no surprise that slowing down light creates gravitational effects. In addition, once light slows down it may be able to coalesce into natural physical materials, which may account for how the 4.9% of the universe that is ordinary matter was created, while the remaining 95.1%, consisting of dark matter and dark energy, is light or "primary photons" that coalesced at different speeds in disparate conditions within the same environment, producing matter made from the same substance but with dissimilar attributes and properties. It is interesting to note that though space is black, the darkness may simply be a form of light. Decelerating matter, for instance by cooling it towards absolute zero with lasers as in a Bose-Einstein Condensate (BEC) experiment, may simply yield dark energy and dark matter, though unlike the dark matter and dark energy found in space it may be of a type that can only exist at low temperatures. Decelerating light itself is unlikely to yield much on its own because, technically, light is matter that has already been decelerated but remains in a stable state at diverse temperatures while, in its natural state, propagating at the speed of light. However, decelerating it should convert light into a plasma while it is suspended, and if the appropriate electromagnetic blueprint were applied to decelerated light it should transmute that light into a different substance; decelerating or capturing light is only the first step in an experiment to transmute it into something else. In its decelerated state, while in plasma form, scientists should be able to use electromagnetic fields to harness the plasma as a route to creating gravity. What this implies is that there must be an infinite range of attributes of matter that can be created in the lab in this way, i.e. any characteristic can be crafted in to create designer matter. It further implies that some types of matter, like sodium, when decelerated using BEC techniques, may be able to remain in the plasma state at room temperature or at any temperature the scientist creating it desires. This type of matter would be ideal as a base substance for building exotic matter with any kind of attribute or property desired and for generating gravity using electromagnetic fields. Finding matter that behaves in this way may at present be a little like Thomas Edison testing different filaments for the desired light bulb. To build matter from light, as strange as it may seem, may involve decelerating matter (which is inherently light) as seen in the BEC and transmuting it using fields into some other substance. To decelerate light itself to speeds where it loses a straight-line trajectory (rather than breaking down, since it is already in its fundamental particle form), deceleration should be the first step in creating new matter, and while decelerated it would have to be exposed to an appropriate electromagnetic field or blueprint. What is exciting is that scientists are already able to physically slow light down by trapping it.
It should nevertheless be noted that the electromotive characteristics of light, which allow it to propagate at diverse temperatures, may mean that if transmutation does take place it may only hold briefly. Nevertheless, once light is re-engineered in this way it should be possible for scientists to create light that stays in place and essentially becomes plasma. Finding a BEC-like substance that remains a plasma at room temperature would be ideal for controlling gravity and designing specialized or exotic matter, and it could be produced by processes like the BEC itself. This is the opposite of the LHC, which breaks up matter to study what it fundamentally consists of.
Faster than light travel
We have thus far explained why and how light continually has mass and can travel at light speed, settling this particular problem using a stimulation theory. But this is not where it ends. Light has other properties that are confusing. It was initially thought that light is a stream of fundamental particles, a theory from Isaac Newton. However, Thomas Young later discovered through experimentation that light diffracts after interference, creating patterns on a screen that indicated light was in fact a wave. This problem became referred to in physics as wave-particle duality. This was yet another conundrum: is light a particle or a wave? Einstein is thought to have settled this argument by declaring that light was both a particle and a wave. The explanation he gave for this conclusion was based purely on observing this duality as fact through experimentation, without further interrogation of the underlying circumstances of these results; we know this because this is where the inquiry ended. Yet what is observed and recorded is not always what is actually taking place. At this stage we can make certain deductions, beginning with Isaac Newton. Is light a stream of particles? No, it is not a conventional stream of particles. Why? If it were, this would once again be the same as all the people in the stadium standing up and running around the stadium, creating a flowing stream of particles. We can deduce that this is most likely inaccurate because the intensity of a natural beam of light is far too weak to consist of a stream of flowing particles. This moves us to Thomas Young. Is light a wave? With experimentation it was discovered that light scatters. Can we therefore conclude that it is a wave? No, we cannot. Why? Because to do this we would need to know whether it is a contiguous wave or a non-contiguous wave. This difference is critically important to physics. Both wave forms will create a scattering effect when tested. How then is it possible to distinguish between the two types of waves?
A contiguous wave is a waveform that can be explained descriptively as follows: Two kids take out a garden hose. They stretch the hose out and each holds an end. One child proceeds to wave the hose vigorously up and down. The child's hand on the other end of the hose receives the wave and his or her hand is sharply jolted. This is a contiguous wave. The wave form passes through the hose continuously with little loss of energy creating a proportional impact at the other end. A contiguous wave will pack a serious punch since the area it strikes will experience the intensity of the full length of a beam from source to target rather than just the photons that make final contact, much like the movement of electrons or electrical current through a conductive medium like wire. On the other hand, an example of a non-contiguous wave has already been alluded to as a Mexican wave. It is most likely that these two types of waves on observation are almost indistinguishable and must be tested to tell them apart. For instance when distorted through Thomas Young's experiment they will both yield a distortion pattern. So how can we possibly tell them apart?
Helium neon (He-Ne) beam diffraction pattern through a single narrow slit. |
Physicists have yet to determine whether light waves are contiguous or non-contiguous. This creates many incongruities that fail to reconcile important areas of physics. What experiments show is that particles tend to behave as waves most likely because, when they are streamed through double slits for instance, they flow into other pre-existing particles, re-creating a wave pattern before they strike the surface of the test area. This would indicate space is not a vacuum. Furthermore, there is no discrepancy when they are observed close to the slits they emerge from, as at that point they will still exhibit the properties of a particle.
1. The other explanation for the scatter pattern from a dual slit may simply be that when one slit is used the scatter pattern can be described as one-dimensional, whereas when two slits are used the interference pattern is two-dimensional, yet it continues to be viewed and read as a one-dimensional result, which makes it appear irreconcilable with the result from the single-slit experiment. The dual slit is present in the scatter pattern, except this pattern or information must be viewed, read or assessed as a two-dimensional object. When the ideal or correct manner for doing this is applied, the scattered pattern should appear as two single slits, showing that the method of detection or observation being applied is the correct one.
2. Another explanation may be that when light strikes the slits it is breaking up and consequently rarefying; the slits are therefore acting like an "atomizer". The light emerging from the slits is more refined, much like that observed in a Bose-Einstein Condensate. If physicists are unaware that the BEC consists of light particulate rather than waves, then it is also unlikely they are aware that light can revert to a fine particle cloud-state in this way, where it behaves like waves. In this state the particles are finer, scatter more widely and appear to become or behave like waves when in fact they remain particles, only more refined. This would be unusual, because a fine particulate cloud emerging from slits would be occurring at room temperature, when normally it is only observed, or believed to occur, at very low temperatures in the BEC. Nevertheless, this possible mistake in interpreting what happens to light as it emerges from the slits makes perfect sense if describing the BEC as "waves", when it is in fact clouds of fine particulate, is also a substantive misinterpretation of what is being observed. For what is being observed to be understood, it must be considered that an electron is just a construct of the finer particles observed in the double-slit experiment and in the BEC. This also explains why even when one electron is fired at a time the pattern remains consistent: the pattern that is rendered is the code that describes an electron. It implies that even in this refined state the particulate exhibits quantum properties associated with different types of light.
Thus far it is known that the BEC can be created through extremes such as extremely low temperature and extremely high pressure; however, extreme rarefaction (caused by forcing electrons through slits) seems the most logical means of gaining the effect at room temperature if indeed the BEC consists of fine particulate rather than waves. If the diagram below is understood, what the extremes are doing is merely forcing a change in observation and not a change in the electron itself. If this is the case then rarefaction seems the most practical method for achieving the same result. (For comparison, a sketch of the conventional textbook slit pattern follows below.)
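For comparison with the readings above, here is the standard Fraunhofer description of the single- and double-slit patterns: the double-slit intensity is the single-slit envelope multiplied by an interference term. This is ordinary textbook optics rather than part of the essay's proposal, and the wavelength and slit dimensions are illustrative values chosen only for the sketch.

    import numpy as np

    # Conventional Fraunhofer model of the slit patterns discussed above. The
    # double-slit pattern is the single-slit envelope times an interference term.
    # Wavelength and slit dimensions are illustrative values.

    wavelength = 633e-9   # He-Ne laser, metres
    a = 50e-6             # slit width
    d = 250e-6            # slit separation (double slit)
    theta = np.linspace(-0.02, 0.02, 2001)   # viewing angle, radians

    beta  = np.pi * a * np.sin(theta) / wavelength
    alpha = np.pi * d * np.sin(theta) / wavelength

    single_slit = np.sinc(beta / np.pi) ** 2         # np.sinc already includes the pi factor
    double_slit = single_slit * np.cos(alpha) ** 2   # envelope x interference fringes

    print("fringe spacing (rad):     ", wavelength / d)   # ~2.5e-3
    print("envelope first zero (rad):", wavelength / a)   # ~1.3e-2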
If the slits are designed to allow only one electron through at a time, extreme rarefaction will still take place. Several factors will affect the intensity of rarefaction; for example, the smaller the slits in proportion to the electron, the more extreme the rarefaction will be. Smaller particles acting in a coordinated manner makes them appear as diffracted waves. The diagram above attempts to explain the single and double slit experiment. When the electrons pass through the slits they are forced by extreme rarefaction to switch from the experiential view SeTe to the side view StTt, where what is observed as a new, more complex wave interference pattern is in fact the electron breaking up into smaller parts or into cloud particulate. The fine particulate rarefying creates the interference wave pattern, not the other way around. However, technically there is no matter (electron) and no observer; all of this takes place within StTt, which is why in one view, which is like a hologram, the electron appears as a point when in fact it always remains a particle cloud, which is the code, pattern or information (described as uncertainty) as opposed to what the pattern or information creates (the point, electron or atom) seen in both the double-slit experiment and the BEC experiment. This process may illustrate how and where virtual reality and physical reality are integrated. There is a need to de-clutter the science behind particle physics. For instance: 1. There may be a need to reduce the entire standard model that defines atomic structures and accept that, firstly, there is only a Tron, playing out electrons, neutrons, positrons, gravitons, up quarks, down quarks, Bosons and so on, each and every element and its diverse characteristics, found and yet to be found, in the standard model. By having a refresh rate it may even be what creates antimatter. 2. Matter in general, as human beings interact with and see it, however inorganic or organic it may appear, is "smart" in that it consists of Trons. The Tron is "smart" in the sense that it becomes and forms any kind of force or matter it receives instructions to become from the BEC through entanglement, as we saw with the red line of communication or entanglement in the animation of the two bodies passing each other. 3. The Tron "does not exist" in the sense that it is a virtual particle, as illustrated by the diagram above. It does not exist except as a hologram or projection (SeTe) of the mechanics and computations in the BEC, which is the only part of the universe that is substantive (StTt); nothing else apart from this exists. 4. Matter and energy in the shape of a Tron remains what it is, but it is fundamentally malleable, in the sense that it remains what it is and does what it does on instruction from the realm beneath quantum mechanics that is yet to be uncovered by physicists and fully understood. All the matter around human beings appears "dumb" when it is in fact "smart" in the technical sense that it is constantly in a receptive state where it will remain as it is, or change and become anything it is instructed or commanded to, from the BEC using the red line vector or "entanglement".
5. Matter in the form of Trons appears to mimic some of the properties of fine particles in the BEC (StTt) and is therefore more complex than it appears at face value, in the sense that it is formed from myriad layers of separate universes, which collectively form the very fabric of Space-Time, as we saw earlier with how the Geometry of Space-Time consists of a Multiverse Array. As we see with Heisenberg's Uncertainty Principle (HUP), motion, distance, space and time, basically the science or mechanics of existence itself as it is experienced, are built from the layering of universes which collectively create Space-Time. Each fine particle in the cloud is linked. This link (B&C) tells each particle what to do or how to behave in real time, presumably using entanglement. Each particle moves itself into position and implements instructions (A) from the link, creating the particle cloud and its underlying pattern, which in turn pre-defines what the electron does, how it behaves and how it should look when it is observed. Since both the particle and its observation are processed in the same place, there is a discrepancy between electrons and atoms when they are viewed: one state is code (fuzziness and uncertainty), the other is the execution of code, i.e. the point or "physical" matter and its observation. The fine particulate appears to move as one, observed in the propagation of waves in the interference pattern, and consequently behaves as though it is in a magnetic force field when in fact there is no force field; there is an information link, presumably using entanglement, that instructs and coordinates each of the particles. The reason why it is assumed that each fine particle must operate independently within the cloud is the complexity of the code, patterns or information upon which the complexity of electrons, atoms and matter in general is created. It is unlikely fields can have the dexterity with which to control what each individual fine particle does; the inference is therefore that the coordinated behaviour of fine particles creates the appearance of a field effect, rather than fields controlling the fine particles, which they may not be able to do with the necessary dexterity and complexity. The 2D wave generation, receptor and pattern shown in the diagram are simply a recreation of what is happening using an apparatus such as the sound wave generator discussed earlier. This further emphasizes that the double-slit experiment is a Bose-Einstein Condensate (BEC) experiment taking place at room temperature, even though experimenters may not be aware of this. The BEC, resultant patterns, codes or the Cloud is simply what comes into view when matter and the observer (experiential reality or the hologram) are disrupted by extreme temperature, extreme pressure or extreme rarefaction. The BEC can be described as an interface where reality is processed, and in fact nothing exists outside the Cloud. The inference is that the observer (consciousness itself) takes place and is processed within the Cloud. It may be important to note that consciousness itself is a construct of the Cloud, especially as this may offer a pathway for how virtual reality technologies that are indistinguishable from reality could be designed to integrate with human thought and consciousness.
Interestingly, anyone dedicated and sufficiently advanced in meditation will observe all visual and sensory information withdraw and be replaced by a surrounding of clouds very much like those seen in the BEC before ascending to a higher state of consciousness. This internal change in "perception" seems identical to the change in view from electron to the [smeared out] clouds shown in the BEC experiment. |
What Next?
If this analysis is accurate, what comes after quantum mechanics? Quantum mechanics leans on energy bands for its theory, and energy bands are essentially another name for fields. This means that quantum mechanics faces limitations similar to those of Einstein's approach to understanding the universe, which is also based on fields or Space-Time Geometry (SeTe). There is a need for physicists to see the limitation of fields and refocus on the particle (StTt) and the red vector or red line communication between particles often described as entanglement. There is also a need for physicists to understand the interaction of particles in the Multiverse Array, given that they play a critical role in the structure of matter and energy beyond the quantum scale. Beyond quantum mechanics, an attempt can be made to theorize what matter actually consists of.
The Anatomy of a Particle beyond quantum mechanics [sub-particle nano mechanics] |
Matter is made up of particles and particles consist of sub-particles woven finely through many layers in different universes. Matter cannot exist without a Multiverse from which it is constructed. The electron does not exist in the sense that it is simply a construct of the BEC fine particulate. However, it is designed to have all the properties of real matter except that it is experiential and confined to a single universe. The nexus at the centre of the particle is where "reality" or perception takes place within a universe, while the code or pattern that manifests it, is formed from the surrounding sub-particles functioning in and woven through separate universes or a Multiverse Array. Energy moves freely between universes through energy receptacles that are like pathways between universes. Therefore, the particle, that is itself built from sub-particles cannot exist without a Multiverse. For example, when the electron is observed, all that is seen is the nexus or singularity at the centre of sub-particle activity at the periphery or "the cloud" which ties in with Heisenberg's Uncertainty Principle (HUP), where to look away from the singularity or nexus naturally causes it to collapse into a view of the cloud or sub-particles at the periphery. The hypothetical anatomy of a particle strangely resembles the Multiverse view of a sun mentioned earlier, almost as though the sun itself were just the nucleus of a large electron orbited by planets or sub-particles circling it in multiple universes, the same way an electron in turn orbits the nucleus of an atom. The tiniest parts of the universe would then be observed in how it functions as a whole or at different scales the outcome of which is a series, spiral formation, or pattern that is simply repeated at different scales. This implies that dark matter and dark energy are simply aspects of fundamental matter operating in other universes within sub-atomic structures at the sub-particle level. It becomes dark because a universe cannot see the parts of itself functioning outside its own reality or singularity, these would basically be beyond what are referred to as light-speed barriers that keep universes separate. As much as scientists can use quantum mechanics (QM) to dissect matter at the subatomic level, this becomes inadequate because QM is not refined enough to see matter at the sub-particle level which requires the ability to see across universes or across light-speed barriers. However, if there is an impasse where physicists cannot accept the Multiverse hypothesis especially its role in the construction of matter at the subatomic and sub-particle level, then the impasse in physics is a mental one, in the same way that "fields" are the imaginary friend that is an obstacle to fully understanding gravity, how it works and how it may be controlled. The next significant leap in the sciences is to develop the technology with which to penetrate or breach light-speed barriers and gain access to the Multiverse Array and the sub-atomic, sub-particle structures that create it that are on a scale smaller than that currently observed in quantum mechanics. These sub-particle structures control the movement of energy and information between universes and offer the next most strategic area for advancement in physics.
The hypothetical anatomy of a particle strangely resembles the hypothetical anatomy of a sun, almost as though the sun were the giant nucleus of an atom. |
The Crisis in Cosmology seems to come about as a result of not accounting for a Multiverse Array and its potential relationship to dark energy and dark matter, and of not being able to put all the pieces of this puzzle together. When it comes to the Hubble Constant there may be some simple problems with the theory that need to be addressed. For instance, if cosmologists are right in determining from red-shift that the outer reaches of the universe are accelerating faster and faster, then technically they cannot also state that the universe is expanding. This is an oxymoron, to put it politely; it is like saying an object speeding away from you is visually getting bigger and bigger. Acceleration is caused by the contraction of Space-Time "Geometry". If this is the case, then how can models of the universe accurately describe it? If the universe is getting bigger and accelerating, it probably means that the rate of acceleration of matter is much greater than the rate of contraction of Space-Time Geometry, to the extent that it is generating a cosmic form of escape velocity, and this difference is possibly not being accounted for. In other words, the rate at which matter in the universe is expanding will cause it to overshoot or escape whatever object is contracting Space to cause the acceleration or red-shift in the first place. If these parameters are not viewed correctly then it may make sense why it is difficult to reconcile models of the universe (Lambda-CDM) with the universe itself. It means that the contraction of Space-Time is being added to, rather than subtracted from, the expansion of the universe, because acceleration is thought to mean Space-Time Geometry is expanding when in fact it is doing the opposite.
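The bookkeeping being argued here can be shown with toy numbers: if matter recedes at one rate while the surrounding geometry contracts at another, an observer attributes only the difference to "expansion", and adding the contraction instead of subtracting it misstates the result. The rates below are arbitrary illustrative values, not measurements.

    # Toy bookkeeping for the argument above (made-up rates, arbitrary units).

    def observed_recession(matter_expansion_rate, geometry_contraction_rate):
        """Net rate an observer would attribute to 'expansion'."""
        return matter_expansion_rate - geometry_contraction_rate

    expansion, contraction = 70.0, 4.0
    correct  = observed_recession(expansion, contraction)   # 66.0: contraction subtracted
    mistaken = expansion + contraction                       # 74.0: contraction wrongly added
    print(correct, mistaken, mistaken - correct)             # discrepancy = 8.0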
In the first diagram the imaginary Space-Time Geometry surrounding matter is negligible; the matter in the universe is more reactive to the interaction of its own local masses. In diagram II a body outside the universe, C, is shown to affect Space-Time Geometry by contracting it. As a result the universe A&B begins to accelerate towards C. Cosmologists need to be clear about what they are describing. In diagram II Space-Time Geometry is contracting (shrinking), which is what is inducing acceleration. But if this is the case then the matter in the universe should be coming together, not expanding uniformly, unless the expansion is linear, that is, toward C, while A&B (matter in the universe) is clumping or drawing closer together. For matter in diagram II to be accelerating and uniformly expanding, that is, with A&B also moving further apart at 3, the rate at which matter is moving apart may need to be so great that despite being accelerated towards C, at 3 it is spreading away from C and may consequently escape contact with it. If by saying "the universe is expanding" cosmologists mean that Space-Time itself is expanding as shown in diagram III, then this is flawed, because if Space-Time is expanding then this is like going uphill and matter in the furthest parts of the universe will meet resistance to its expansion, the consequence of which is deceleration. It is difficult to visualize matter in the universe moving faster and faster toward a force or body that induces deceleration. If by saying "the universe is expanding" cosmologists mean that it is moving away from the original force that pushed it out, that is, the big bang, then technically it should be expanding toward C while matter is shrinking, in that A&B are moving closer together as they accelerate outwards toward C. If that is the case, cosmologists are not explaining the nature of this expansion clearly. |
The light astronomers see arriving on earth from billions of light years away, such that even the star that exploded to create it no longer exists, may not actually travel this distance; it traverses the Distance. Like the water in the lake, the light astronomers observe as it reaches earth will most likely consist of a "sea" or body of photons already here that have simply been stimulated to communicate the light from the exploding star. If this is correct then it must also be accepted that the entire distance of a billion light years is like an ocean filled with photons by which this light is transferred. Basically, this would mean that the entire universe is filled with these primary photons, which would in fact be what physicists refer to as the "vacuum of space", except that it is not a vacuum at all. Though they are referred to as "photons" here, they may not be photons at all. Remember, they put out whatever energy is put in. The light we see is therefore just an input-output process being propagated by something we do not have the means to detect except through this process. Light may therefore simply be a small attribute of something more complex than is understood today. All the energy from the sun that reaches the earth may arrive using this process of stimulation. This stimulation theory may require us to accept that even the heat we experience with the light does not travel from the sun to the earth; it is simply triggered here on earth in proportion to the source of the emission. If this is true then it implies that empty space or "the vacuum of space" contains more energy than matter. It has to be able to emit as much energy as any source that triggers it, to sustain an equivalence or maintain a universal balance that has little compromise. To do this the vacuum would need to inherently contain what would seem an almost limitless amount of dormant energy, but because it is a responsive energy it must be very difficult to access, since it only lets you get out as much energy as you put in. Nevertheless, it may be possible that catastrophic astronomical events test the resilience of the vacuum, being so violent as to attempt to exceed the potential of Space to equal the energy put in, causing a tear in the very fabric of the MDT universe, in which case the response from authentic Space is an almighty push back to contain this, which appears as a black hole or singularity. This would prevent any further damage to the fabric of the MDT. We also cannot rule out that these "photons" that transfer light and energy propagate throughout the universe and exist in various states from dormant to energized, and that like any substance they may exhibit different properties depending on what state they are in. For instance, water can be frozen to ice, thaw to a flowing watery consistency and even be converted into steam, with each state having its own inherent properties, yet it remains essentially the same substance. If this were correct then the universe would be filled with various consistencies of photons (in the way that ice floats on water) which on observation exhibit different properties so as to make them appear as different materials when in fact they are one substance.
It is most likely that quests such as the Large Hadron Collider (LHC) will discover that breaking down subatomic particles to their fundamental parts reveals they consist of primary photons, or light that has slowed down and coalesced into the various forms of matter we see today. But there are significant limitations to this approach. It may therefore be just as sensible to build a light decelerator as it is to build a particle accelerator. A light decelerator would be able to answer all the questions the LHC seeks answers to in physics and may have fewer limitations. However, to do this scientists would have to accept that the speed of light is not a constant, and that fundamentally all matter consists of light. Being unable to think outside the box like this may simply be another example of the obstacles Einstein's flawed model has created that hinder the advancement of physics.
Non-contiguous light can be decelerated using disruptive, powerful ultra-high or ultra-low frequency electromagnetic fields. This may allow scientists to slow light down until it begins to break apart, which may in turn reveal the true nature of primary photons. Being non-contiguous suggests that they transfer energy equivalent to any source of emission. Should all forms of sub-atomic matter be made from light, then matter itself is inherently non-contiguous, and evidence of this is that it should begin to break apart, or start to liquefy, as it reverts to its individual atomic and subatomic parts when exposed to these types of fields. If this is true it may be possible to observe the behaviour of matter and gravity when affected below the sub-atomic level, which would allow scientists to investigate deeper than the LHC can reach. A decelerator need not only be a detector; it could also be designed to manipulate decelerated light or matter and any or all of its properties, including how Space interacts with it, which may provide avenues for observing gravitational effects and eventually finding ways to manipulate gravity itself. It is most likely that observed effects of these kinds could not be explained by conventional physics based on Einstein's model, because in his model light is a constant and cannot be slowed down; in addition, non-contiguous waves are not thoroughly accounted for in modern physics' investigation of waves, and no emphasis is placed on the theory that all matter may simply be made up of primary photons. The only experiments available that seem to yield results predicted by the theory here, though at times dubious, are those of the "Hutchison Effect"; nevertheless, their authenticity has never been independently and openly verified.
|
This video shows what has come to be known as the "Hutchison Effect", in which substances are said to be placed in ultra-high frequency electromagnetic fields. Though these experiments are not verified, the breaking up of particles at the subatomic level causing liquefaction and gravitational effects are among the results that experiments designed to slow light waves down should expect to observe. Imagine being able to liquefy metals using magnetic fields without applying any heat, and what new advantages physics like this could bring to the mining industry. The Hutchison Effect and the Bose-Einstein Condensate appear to be fundamentally the same phenomenon achieved using different apparatus.
Tibetan prayer wheels are a good example for explaining how light works. The wheels represent photons: when they are standing still they are dormant and opaque, do not emit light, and so create darkness. The monk represents the rate at which light is triggered by a source. As he walks he spins each wheel, which represents a photon, and its state changes: it begins to spin, becomes transparent and emits light. The speed of light is the rate at which the wheels are stimulated. The wheels themselves always retain their individual mass and remain in place, and the energy they release is proportional to the source of the trigger or emission.
Dominoes can also be used to illustrate how light propagates. If each domino is a photon, then when it is standing still it creates darkness or opacity. When light is emitted from a source the dominoes become excited and begin to fall. The falling dominoes demonstrate that the photons stay in place while they transfer light. If this is true then it means that light should be highly manipulable and does not have to travel in straight lines.
If light traverses instead of travels, then it is possible to slow it down immensely. If it can be slowed down, it should be possible to control the direction in which it appears to move. This means new avenues in physics can be opened, since it can be manipulated to create all manner of exotic combinations of matter, as is shown by the creative patterns of the different types of dominoes used in this video. However, all this incredible potential to advance physics will be blocked by people, science journals, the media and institutions policing Einstein's flawed model, who ridicule physicists who think outside of it and refuse to fund their research, thereby effectively stonewalling advances in both theoretical and applied physics.
Many unanswered questions about how to harness gravity. Finally, the first critical answers will be unveiled. |
Einstein postulated that nothing can travel faster than the speed of light. This inference, made from the top view, creates a constant upon which he builds much of his theory. Einstein's understanding of the top-view universe was revolutionary in his time. However, when analysed from the side view, the limitations he associated with the speed of light belong to a more primitive understanding of the universe still applied in physics today. For instance, the nearest large galaxy is Andromeda. Travelling to Andromeda at the blistering speed of light would take 2.5 million years. For the purposes of astronomy this makes the speed of light exceptionally slow. However, side-view analysis would force scientists to dismiss distance as a formal barrier to space travel. It would demonstrate that the idea that the Andromeda galaxy is an unreachable distance away is a primitive one, because this galaxy and earth occupy the same Space (as does the furthest known galaxy in the universe), which means the science of an advanced civilization would know that technically there is no substantive distance between them. If there is no substantive distance between them, these locations, perceived as unreachably far by primitive modern-day science, are in fact very easily reachable. They can be reached in an interval of time determined by side-view based technologies of, for instance, half an hour or one second. Theoretically this means straightforward faster-than-light travel from earth to Andromeda in one second, without weird repercussions and without having to devise wormholes, warped time or other exotic theories, very much in the same way that the BBC or CNN switch from a journalist in Perth, Australia to a journalist in Chicago, USA within the same space (or frame, i.e. the area of the television screen): the traveller or "spaceship" simply switches from earth as a location to Andromeda, limited only by the duration it takes to turn the dial. This is because, from side-view analysis, a scientist is not crossing places separated by "distance" but rather "tuning" from one place into another irrespective of distance, as they are located in the same Space, much like tuning from one radio or television station to another where all the waves or signals inter-exist (as do earth and Andromeda). For instance, in Zambia when audiences listen to radio they don't say they travelled to Hot FM, then travelled to Radio Phoenix, then travelled to Komboni Radio; they say they "tuned" into these stations, because they know that while they listen to one station all the other stations are still present but simply not tuned into. Similarly, the side view postulates that earth being the location "tuned" into does not mean Andromeda is not present in the same Space. Achieving this journey in one second would entail an effective speed of roughly eighty trillion times the speed of light (see the arithmetic sketch below), something technically possible from the side view but technically impossible according to Einstein and the exceptional yet more primitive understanding of modern-day physics based on a top-view analysis and its relevant or irrelevant constraints. What this means is that any location in the universe can be accessed. Our universe is simply a tiny part of a multiverse, and similarly any location in the multiverse can be accessed through a similar process.
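To put a number on the illustration above: covering 2.5 million light-years in a chosen duration implies an effective speed, expressed as a multiple of c. The sketch below only does that arithmetic; the durations are the hypothetical trip times mentioned in the text.

    # Quick arithmetic for the Andromeda example (hypothetical trip times).
    # Light covers one light-year per year, so effective speed in multiples of c
    # is (light-years per second) x (seconds per year).

    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    def effective_speed_in_c(distance_ly, duration_s):
        """Distance covered per second, expressed as multiples of light speed."""
        light_years_per_second = distance_ly / duration_s
        return light_years_per_second * SECONDS_PER_YEAR

    print(effective_speed_in_c(2.5e6, 1.0))      # ~7.9e13 c for a one-second trip
    print(effective_speed_in_c(2.5e6, 1800.0))   # ~4.4e10 c for a half-hour trip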
Hopping from location to location through Space entails that there must exist a map or geography of the universe and multiverse to allow the precise selection of coordinates for a location or "channel" to jump to. How such a map might work is suggested by the mathematics of Gaston Julia (1893-1978), whose Julia sets could serve as a location-based geography of a single universe, and Benoit Mandelbrot (1924-2010), whose Mandelbrot set could serve as a location-based geography of the multiverse. These could be used hypothetically to know in advance the exact location where a jump through Space will take a spacecraft. If you want to understand these sets, the video below offers a succinct explanation. They demonstrate how potentially vast the geography of Space is (the basic escape-time computation behind them is sketched after the video link).
Ben Sparks' excellent explanation of Julia Sets and Mandelbrot Sets
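The escape-time iteration behind both sets is simple enough to sketch. The Mandelbrot set fixes z0 = 0 and varies the constant c, while a Julia set fixes c and varies the starting point; their use as a coordinate map for jumps is of course this essay's hypothesis, but the mathematics below is standard, and the sample points and iteration limit are arbitrary choices.

    # Standard escape-time test for the sets named above.

    def escape_time(z, c, max_iter=100):
        """Iterations before |z| exceeds 2 under z -> z*z + c (max_iter if it stays bounded)."""
        for n in range(max_iter):
            if abs(z) > 2.0:
                return n
            z = z * z + c
        return max_iter

    def mandelbrot_point(c):                   # c is a candidate "location" in the plane
        return escape_time(0 + 0j, c)

    def julia_point(z0, c=-0.8 + 0.156j):      # a commonly plotted Julia parameter
        return escape_time(z0, c)

    print(mandelbrot_point(-0.5 + 0.5j))   # stays bounded -> returns max_iter (in the set)
    print(mandelbrot_point(1.5 + 1.5j))    # escapes almost immediately -> small count
    print(julia_point(0.3 - 0.2j))         # classify a point of the Julia set for that c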
Mandelbrot Sets may be subject to misinterpretation because depth is not taken into account correctly. There is only one type of circle in a plane or layer, namely A. The remaining circles B, C, D etc appear smaller because they are not on the same perspective or layer/plane as A. Each circle of different size occupies a different lattice. What you then get is a matrix for matter where the tiniest part of the set is equal to the largest part. When you go out of the circle another circle is entered to create a complete map. In this case movement or the map is 3 dimensional or 3 Directional. If you notice when Ben goes outside some parts of the circle the figures explode. This is likely due to the fact that Mandelbrot's equation does not account for layers and therefore for the fact that exiting one circle or bubble in a certain manner may take you out of that plane into another one, as shown, i.e. from A-B-C etc. If a circle or bubble is not mapped then this is likely a design quality of a specific type of matter or subject being observed, but not of Space itself. The map to the smallest part is the mapping of the largest part. |
Talk of folding Space-Time, building wormholes, warping time, engaging warp drives and hyperdrives for jumping through space, and figuring out how to get there and back without arriving before you were born, all to create shortcuts to far-off places in other galaxies, are antiquated ideas from a less informed era of physics, like fossilized dinosaur bones, because our universe observed in a single frame or continuum is already infinitely compressed, since technically distance does not exist from the side view. The theory of a hyperdrive technologically trumps that of warp drive, since a hyperdrive is theorized to move outside of time by jumping into hyperspace, although it still faces interference from mass or "mass-shadows". The recently introduced fictional concept of a spore drive in Discovery, which allows almost instantaneous travel, is, though exotic, a little closer to Spatial technologies; it trumps hyperdrive and, though fictional, is closer to the kind of technology we should hope to have in the future. Spatial technologies would at present be the ultimate technology for travel. Using side-view physics a civilization simply identifies where it wants to go or be, anywhere in the universe, and arrives there at a pace, speed and duration of its choosing that suits its need, requirement or preference at any particular juncture. Meanwhile a civilization building its technology on Relativity Theory blew themselves up on the way, were crushed like pancakes into their seats by high-speed g-forces, scrambled like eggs in warp drive, or were woken from hibernation too early mid-journey so that the entire crew is nearly 200 years old when they arrive, whilst on earth no one they know is still living and their space suits look like they were designed in the 18th century because a thousand generations have gone by. No crew member would like to experience the trauma of going insane as a result of being exposed to time distortions that persist after a jump, or other potentially dangerous aspects of this kind of travel. Basically, a civilization whose technology is built from top-view physics will be significantly backward compared to a civilization that has leapt ahead through a side-view understanding of physics.
Furthermore, accelerating from zero to an effective speed trillions of times that of light in one second does not face interference from any primitive notions of being affected by g-forces, requiring infinite energy to move at near light speed, or gaining infinite mass as a result, as would be inferred from modern top-view physics, because this change of location or "speed" is not applied through the medium of matter-distance-time, but occurs through Space. (Remember, a fundamental weakness in the Theory of Relativity is that Einstein makes no distinction between distance and Space, whereas from the side view distance and Space are two distinct constructs.) Though visually a vehicle travelling below the speed of light through either the Distance or Space would appear to be moving "normally", as we observe in the everyday occurrence of an airplane travelling across the sky, in principle the method of propulsion using side-view Space is completely different from that used conventionally to travel using top-view distance, e.g. through thrust generated by an engine. Technically it is moving through Space and therefore without the primitive notion that g-forces would make travel at exceptional rates of acceleration impossible. A vessel moving through distance as a medium, such as an airplane, rocket or other similarly propelled vehicle, must experience top-view g-forces. A vessel travelling close to the speed of light would most likely be very difficult to navigate, and fatal matter-on-matter collisions would be almost impossible to avoid. Should it be designed to use wormholes and so on, the extraordinary trauma of dragging biological organisms through the effects of Einstein's Space-Time could prove as lethal as exposure to radiation at a nuclear power plant, whereas a vehicle harnessing Space to change its location does so outside mass and distance, without motion or "time", and therefore without any of the weird, excessive and primitive g-force or dangerous "Space-Time" effects on the occupants of the vessel proposed by the mundane limitations in antiquated top-view theories currently applied in modern physics. A civilization functioning on Spatial technologies would view a civilization functioning on Relativity Theory as intelligent but very backward. Travelling through Space, it has the option of moving a vessel outside our MDT universe, where the vessel simply moves through matter, be it a planet, asteroid belt, sun or debris, safely, as it does not make physical contact. A vessel built on Spatial technology could, while standing still, simply shift into Space and completely disappear from physical visibility, because light could pass straight through it at will (no need to try to bend light around it). It could become invisible to radar at will. Objects and people outside the vehicle could pass through it, but it would still be right there observing them. It could do this on the ground or in the air, giving it unparalleled levels of stealth. It could land on the lawn in front of your house or hover just above it, and by today's level of physics, science and technology there would be no way of knowing or detecting it was right there. The advantages of Spatial technology over matter-based technologies found in the MDT universe are innumerable.
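For contrast, this is what the same one-second trip would demand of conventional, thrust-based propulsion if treated as ordinary acceleration through the Distance; the figures are illustrative and simply restate the point that no matter-based vehicle could do this.

    # Conventional bookkeeping: reaching the effective speed from the earlier
    # sketch within one second of ordinary acceleration. Illustrative only.

    C = 2.998e8            # speed of light, m/s
    G = 9.81               # one earth gravity, m/s^2

    effective_speed = 7.9e13 * C           # multiples of c from the earlier sketch
    acceleration = effective_speed / 1.0   # reached in one second
    print(f"required acceleration: {acceleration:.2e} m/s^2")
    print(f"in earth gravities:    {acceleration / G:.2e} g")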
Being able to travel to any part of our universe instantly may allow us to explore it more effectively. Nevertheless, if there are a vast number of places to explore and an infinite number of universes similar to our own, we may soon discover it could take millions of years to investigate and catalogue all of what is out there. To do this comprehensively would take more than just a newfound incredible speed.
Laser Interferometer Gravitational-Wave Observatory (LIGO)
LIGO has recently been in the news for having detected gravitational waves. Since this is such an important development in physics and our technical understanding of the universe, I cannot help but comment on it.
To begin with, what is gravity? According to Einstein, gravity can be observed in the pressure large masses exert on Space-Time; gravity can therefore be detected by distorting, bending, compressing and stretching Space-Time. In the deliberations we have had thus far it has been noted that Einstein erred when he labelled distance, and/or the matter it separates, as "Space". I prefer to call what Einstein actually refers to the Distance or matter, not Space. Relativity Theory is formulated on a Matter-Distance-Time (MDT) universe. By the way, when time is referred to here, it refers to motion: Time in Einstein's model is nothing more than a construct of motion where matter moves relative to other matter, hence the term "Relativity Theory". It is not formulated on a Space-Time universe as Einstein postulated and as is believed and maintained in applied physics to this day. If distance is observed as being of a symmetrical nature it can appear to be empty, behave like or mimic Space, and be used to draw the force lines popular in drawings used to depict gravity.
Einstein mistakenly labels this Euclidean geometry as Space or Space-Time. |
Here is a simpler recreation of the earlier diagram with the earth in the middle and a single imaginary line of "force". |
Standing on the shoulders of a giant. Albert Einstein giving a lecture on relativity at Lincoln University, Pennsylvania 1946. |
As much as we all love Einstein and what he accomplished in theoretical physics, there comes a time when we need to take what he left humanity with and begin the next journey. This is what he would have wanted. We have stood on the shoulders of a giant, and it's time to use his brilliance to take the next step, the next leap, the next bound.
This video is a contemporary example of how Einstein's error continues to mislead physicists to this very day. The blue spandex represents the Distance or Euclidean geometry; it does not represent Space. Though the balls are being moved by the depressions created in the spandex, the spandex itself does not endow the balls with mass. You can see this here with your own eyes. This mass comes from Space, that is, from outside Euclidean geometry and therefore outside Einstein's Space-Time model. Without a pre-existing mass or push-effect, even if the experimenter pulled the spandex down with his bare hands there would be no gravitational force and no movement. Movement or "motion" being Time itself in Relativity Theory further demonstrates just how mistaken Einstein's understanding of the universe was at this stage. What more evidence do you need?
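To make the point about the demo concrete, here is a minimal sketch (my own, not from the video or the original text) of the force on a ball resting on a tilted patch of the sheet: the curvature only sets the slope, while the force itself comes from the ambient field acting on the ball's pre-existing mass, so with no ambient field the same curvature moves nothing. The function name and numbers are illustrative assumptions.

```python
import math

def force_along_sheet(mass_kg, slope_deg, ambient_g=9.81):
    # The sheet's curvature only fixes the local slope; the force that rolls
    # the ball along it is the ambient field acting on the ball's own mass.
    return mass_kg * ambient_g * math.sin(math.radians(slope_deg))

print(force_along_sheet(0.1, 20.0))                 # ~0.34 N in the classroom demo
print(force_along_sheet(0.1, 20.0, ambient_g=0.0))  # 0.0 N: curvature alone moves nothing
```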
To say, therefore, that Euclidean geometry creates gravity, as Einstein does in Relativity Theory and his model of Space-Time, is seriously erroneous, an error caused by Einstein mistaking distance for Space. This error is echoed by the teacher in this video and by physicists in general across the world. For how long will students, science and the public at large be misinformed and misled by this misdirect?
The use of spandex to explain gravity is a useful tool for visualising it, but it is not accurate. It is useful only if, when it is used to demonstrate gravity, its limitations are understood by physicists and this flaw is explained to students.
Einstein's predictions have been proven true by LIGO, and will continue to prove true as long as they are based on the front or top view of the universe. However, we must begin to look with fresh eyes at the theories he left us. The fact that he confuses distance with Space will continue to have serious repercussions on the viability of modern-day physics and its capacity to add value to the advancement of technology in this area. Is this concern really just semantics, for example, you say to-may-to and I say to-mah-to, but we're talking about the same thing? No, it is not.
We are talking about completely different forces. This is like seeing a VTOL craft taking off and saying it's being lifted by gravity, when in fact it's being lifted by air. The difference is that blatant. Similarly, gravity and distortions in matter-distance-time (MDT) are not the same thing. This difference should demonstrate to physicists why Einstein mislabelling distance by calling it Space should be of tremendous concern. Distance and matter are not Space; the geometry of the observable universe is not Space. Failing to distinguish the Distance from Space creates many hidden pitfalls. Space-Time implies that distortions of the geodesic nature of the universe create gravity when they in fact do not. As long as this view remains uncorrected, attempts to control and manipulate gravity will prove either tremendously costly, weak, elusive or inconclusive.
If what has been described here is true, then where is Space in this theory or explanation? If what Einstein thought was gravity in his age is simply a geometrical effect that is not responsible for gravity, then where is gravity in our era? Since Space is ubiquitous, gravity is the resistance from Space by and through which the movement observed as acceleration takes place. I will explain this shortly with just one diagram. When Einstein's view of the universe is corrected, the interpretation also begins to make sense.
Understanding the resistance from Space is not, however, sufficient to comprehend Space itself. Space operates from the side view, where distance and time are meaningless to it, to the extent that they do not exist. Since distance is no obstacle, to access Spatial technologies is to gain the ability to go well beyond the size of an electron or proton and interact or work with matter at an advanced sub-nano scale, where matter in its most fundamental state can be examined and manipulated. Space does not function on the same principles as the MDT universe described by Einstein. Distance, mass, weight and durational time are insignificant to it. It functions more like code or informatics: a method, operating system or type of physics completely different from anything currently applied. For instance, to work at this level the structure of an atom consisting of protons, neutrons and electrons should not be seen as "matter", a substance, but rather as bits of information coded by Space to behave in certain ways, which we view at the top level as quantum mechanics or, when coded together in a certain way, as the Periodic Table. These particles, which are in fact just bits of information in different patterns, are refreshed to give them the appearance of mobility we refer to as a wave, from which Schrodinger gets his famous equation. Being bits of information they are in fact always static, and refreshing these stills frees them from immobility to create the hypothetical structure of the atom. This means that in reality protons, neutrons and electrons have no inherent mass or tangible form; these are just attributes that must be provided or naturally programmed by Space. Like the earth, any mass they exhibit, any energy they have or can produce, their ability to move (wave properties), their appearance and any other attributes observed from the top view come from Space. To understand Space itself, which cannot be thought of in terms of "size" since size is merely one of its attributes, we may have to borrow ideas from computer science relevant to hardware and software, approaching it through the concept of how transistors can hold, process and manipulate information. These tools can provide some insight into how Space works. To control Space is to gain the technology with which to manipulate matter and possibly [top view] reality itself, since all the constructs created from its attributes, such as mass, matter, distance, energy, time and so on, are merely a form of code whose value is determined by the underlying coding or informatics that dictate how these attributes should appear and behave. For instance, when it is said "Astronomers have discovered what may be the most massive black hole ever known in a small galaxy about 250 million light-years from Earth,... The supermassive black hole has a mass equivalent to 17 billion suns and is located inside the galaxy NGC 1277 in the constellation Perseus" (Wiki 2017), we should not be overwhelmed by the more primitive top view values spoken of. Instead we should take the more advanced view that 250 million light years and a mass equivalent to 17 billion suns are merely attributes of Spatial code that cannot be greater than the information from which they are being created or generated. The fact that even a black hole with the mass of 17 billion suns is too weak to penetrate or break through the resistance of Space illustrates the power inherent in this technology once how to harness it is understood.
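As a purely illustrative toy, and not a claim about how Space actually encodes matter, the sketch below assumes the "refresh" picture described above: a particle is modelled as a static record that is simply re-rendered frame by frame, so apparent motion is nothing more than a sequence of stills. All names (Frame, render, refresh_step) are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)        # frozen: each frame really is a static still
class Frame:
    tick: int
    position: float            # an assigned attribute, not an inherent property

def render(ticks, refresh_step=0.1):
    # Each frame is a brand-new still; nothing ever "moves" between frames.
    return [Frame(t, t * refresh_step) for t in range(ticks)]

for frame in render(5):
    print(frame)               # replayed quickly, these stills read as motion
```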
In other words, we need to stop the primitive thinking in science that believes that when a bigger truck appears on a laptop screen the laptop gets heavier, when the laptop can hold and contain the mass and breadth of an entire galaxy or universe on its screen and not flinch. Mass, matter, size, energy, distance and so on are only truly quantified within the top view universe they exist in, whereas outside it, in Space, these are merely attributes that an advanced civilisation is able to use its technology to manipulate for its own needs. Consequently, even if the mass of our entire universe were to distort cataclysmically, it would be repelled by Space as if it were nothing, and our universe would continue to expand indefinitely against this resistance in accordance with the behaviour determined by the underlying code. To understand, control and manipulate this code is simply another step toward understanding a universe, and the physics behind it, that may be much greater than Space itself, which is merely the next rung on the ladder available for us to understand. To speak of the size of Space is an oxymoron, because the concept of "size" cannot realistically be applied to Space; it cannot be understood in terms of distance as we know it, or time as we like to use it in physics. Fundamentally, Space most likely contains an infinite number of universes or separated continuums, but it does not function on time, distance or matter. How it resists distortions and causes the push effect we observe as gravity, be it around the earth or around the nucleus of an atom, is unlike anything we have dared explore; it is therefore an exciting new era and field of physics, open to exploration and definitely manipulable, which I have already tried to explain to some extent using the concept of a refresh rate. It offers science new vistas, such as the ability to travel instantaneously to any part of our universe and other feats of technology previously thought impossible. The limitations once thought very real by Relativity Theory seem like nothing more than an idle dream of a bygone age. The potential of Space makes a nuclear reaction, nuclear fission or the energy given off by the sun seem less significant than the energy given off by a lit matchstick. Energy, time and distance are just ideas or tenets in a Spatial construct, nothing more, and therefore almost absolutely manipulable within the limits of Spatial physics. Even if a nuclear weapon were set off inside a small cube designed from Spatial physics or technology, the cube would shield or contain the blast and remain completely unaffected, because Spatial potentialities operate on different principles and are so much greater than either atomic or quantum level forces as to render them irrelevant, backward or obsolete in comparison. Comparing our universe, just one continuum, to Space, which probably contains an infinite number of continuums, is like comparing a speck of dust to the seemingly immeasurable known universe.
The diagram below shows an object that, from the top view or Einstein's view, is being pulled toward the earth. The earth's mass causes the distortion. However, the distortion in and of itself is not gravity as proposed by Einstein, and it is not Space as observed in Einstein's model. The force that causes acceleration is the resistance of Space to the distortion, which pushes the object downward; it is not being pulled. Gravity is coming from Space, not the distortion. Therefore, technically, what the physicists at LIGO are measuring is not gravity itself but a distortion of matter-distance-time (MDT). It also means that objects are not pulled towards the earth and electrons are not attracted by the nucleus, as is commonly believed or as described by Einstein; they are in fact pushed or kept in place, just as the seat in the accelerating car described earlier pushes the passengers. Objects on earth are not being pulled by gravity, they are being pushed; skydivers jumping out of planes are not falling towards the earth, they are being pushed toward it; atoms are not held together by a pull effect from the nucleus, they are held together by a force response from Space. Nevertheless, this description should not cause further confusion. The geometric distortion is being resisted by Space, which pushes or resists on and on, forcing the universe to continually expand. What we consequently view from the top view is acceleration. We refer to this acceleration as gravity.
The diagram shows gravity coming from Space, not the distortion, because the distortion is not Space. It shows gravity being caused by a push effect from the resistance of Space rather than a pull effect. |
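The following sketch (my own, under the assumption that only the magnitude and direction of the force per unit mass are measured) shows why the push and pull descriptions are so hard to tell apart in everyday measurements, which is part of why the mislabelling can persist: a dropped object follows exactly the same trajectory whichever label is attached to the 9.8 m/s² figure.

```python
def drop(force_per_kg=9.8, start_height_m=20.0, dt=0.001):
    # Simple forward integration of a dropped object; the update is identical
    # whether the 9.8 m/s^2 is labelled a pull from below or a push from above.
    v, h, t = 0.0, start_height_m, 0.0
    while h > 0.0:
        v += force_per_kg * dt
        h -= v * dt
        t += dt
    return t

print(drop())   # ~2.0 s from 20 m; the label attached to the force never enters
```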
Fundamentally, whenever physicists refer to gravity the general term used is "pull", for example, "the sun's gravity pulls on the planets...". If it is indeed true that objects entering the earth's gravitational field are being pushed by the resistance from Space in response to distortion geometry, and not pulled directly by distortion geometry itself as Einstein theorised, then this is a game changer. For instance, the fact that the universe is expanding is based on the assumption that matter is being pushed further out. Since distortion geometry itself cannot directly create a gravitational effect, and the gravitational effect acts in the opposite direction to the distortion geometry, astronomers may have to consider that, rather than being pushed away from a theoretical centre, the universe is expanding because it is being attracted by a larger mass. The distortion geometry of this mass may be acting on our universe, causing an outward gravitational push on matter that is observed as an expanding universe. If this is true then our universe is continuously expanding due to the resistance of Space (not Einstein's flawed "Space-Time" but Space as it is understood in the corrected model) in response to a distortion geometry coming from outside our known universe. This approach may also solve a long-standing problem in physics: how something infinitesimally small, such as a black hole, can be astronomically heavy. Our universe is neither infinitely large nor infinitely small, and a singularity is merely the point beyond which the MDT universe, and any top view substance within it, reduces until it arrives at the minimum scale of existence, hence infinity or division by 0; whereas Space, being ubiquitous, exists well beyond this minimum and maximum scale, being capable of existing outside of the concept of distance itself. If cause and effect are separated, as has been shown here, then gravity can in fact exist outside the singularity: the singularity [the infinitely small source of distortion] receives a push effect or response from Space [the astronomically heavy effect], and the two are able to co-exist. Since Space keeps universes apart, any cataclysmic event that threatens to tear the Distance will be responded to with a gravitational effect that resists the distortion. The more violent the distortion, the more aggressive the gravitational response from Space that contains it. Space effectively constrains any mild or excessive physical disruption of the Distance (Euclidean geometry or Einstein's lines of force), possibly sealing off and preventing any direct connection between separate universes, responding with gravity to force a submission viewed in our universe as a singularity, which seems like nothing more than a stopper plugged into a hole held in place by gravity. The singularity not being the actual source of gravity itself resolves the conflict. Here is a brief presentation of this problem:
Correcting Einstein's Model helps resolve quantum gravity
Correctly labelling Space and correcting Einstein's error may comprehensively answer questions that are presently unanswerable, such as the ongoing problem of understanding quantum gravity and why the universe is expanding. The answer becomes quite simple: the universe is expanding because it is responding to a gravitational force, and infinitesimally small phenomena such as black holes or the tiny nucleus of an atom do not pull, and are not the direct source of gravity, but rather interact with a push effect from Space. The answer is simple, logical and elegant. However, where the universe is concerned, if this makes sense and is true, it raises another question. What is it, beyond our universe, that could exert such a powerful distortion geometry that it is able to indirectly generate a gravitational push or force able to "pull" (sic push) toward it all the matter in our universe? Is the matter in our universe on a collision course toward it? Or is our entire universe merely orbiting or tethered to some other object or universes of incredible mass equal to or greater than it? Does this mean our universe is part of a local group of universes? If galaxies can create local groups or clusters, it may be possible that universes can do the same. If our universe is just one amongst many, then how many are really out there? If there is merit to this, new theories in astrophysics may be required to understand our universe, and we may have to revise how expansive this realm is and our universe's place in it. Furthermore, if Space will go to the extent of creating a singularity to prevent physical matter from escaping from one universe into the next, it explains why objects become infinitely heavier as they approach light speed. They may become heavier due to the resistance from Space, namely gravity, effectively containing unintelligent matter and keeping or trapping it in a continuum. It also explains why faster-than-light travel is only possible through Space, but not through the conventional universe.
Correcting Einstein's model improves our understanding of gravity and may require astrophysicists to consider that our universe may be under its own gravitational influence or that of another body, may simply be paired with other universes, or may be just one amongst many tethered together in a local group of universes by gravity, perhaps separated by cosmic background radiation. [A logarithmic illustration of the entire universe, starting with the solar system and ending with the cosmic background radiation of the big bang. (Pablo Carlos Budassi/Wikipedia (CC BY-SA 3.0))] |
Cosmic Microwave Background (CMB)
If it's true that Einstein incorrectly labels the Distance as Space in his model and understanding of the universe, then all the bits of the puzzle begin to come together. Technically, when we launch satellites and rockets, we are not launching them into Space. As a civilisation we are launching them into the Distance.
Having corrected his model, it is possible to see that gravity does not emanate directly from distortion geometry (a distortion of the Distance) as Einstein proposed. It in fact radiates from true Space as a resistance to distortion geometry. From this we are able to conclude that objects are pushed by this resistance; they are not pulled by distortion geometry as Einstein believed.
If objects are not pulled directly by distortion geometry, then this requires us to revisit the underlying cause of why our universe is expanding. There is a possibility that matter in our universe is being pushed outward by gravity. However, for this to be true it must take place as a result of Space resisting a distortion geometry created by a mass outside our universe, or by our universe itself pushing and pulling against the confines of Space, like waves against the shore. This then requires us to entertain the idea that our universe is not the only one here; there may be more, separated by Space, the composition of which requires further study.
If our universe is not alone, and any attempts to see beyond it are obstructed by the Cosmic Microwave Background (CMB), then it would not be ill-advised to assume that, though Space is ubiquitous, universes exist such that they are separated, contained and constrained by the CMB. If this in turn is true, then it may require us to accept that the so-called 5th dimension, or Space, which seems so elusive and impossible to identify, is in fact an aspect of the CMB itself. It would then have to be considered that distortion geometry interacts with the CMB (Space) to create gravity. It would also require us to entertain the idea that the CMB is more than just a remnant of the big bang: it may in fact be the elusive side view, or 5th dimension itself, right under our noses. If there are many, or an infinite number of, universes, and each of these universes is represented by just one signal which, when tuned into, becomes a "channel", dimension, continuum or the MDT universe as we observe it from the top view, then all of these continuums or "signals", of which our own universe is merely the one we are tuned into, when combined form the noise observed in the CMB. It looks and sounds like swarming bees because what is being observed is all the continuums, universes or signals in one Space. Consequently, the CMB is our first introduction to Space itself. "With a traditional optical telescope the space between stars and galaxies (the background) is completely dark. However, a sufficiently sensitive radio telescope shows a faint background noise, or glow, almost isotropic, that is not associated with any star, galaxy or other object" (Wikipedia 2017). What this would mean is that what astronomers are observing with a radio telescope is in fact a semblance of Space itself, or the 5th Dimension. The CMB may be the key to unlocking much of what science does not know or understand about gravity, and to what is required to build devices that can control gravity itself. What appears as visual noise in the CMB is probably not noise at all. It only appears as random noise to us because we have not yet designed a receiver that can interpret what the CMB is broadcasting, which is most likely highly evolved, intelligible and organised and may include inter-dimensional locations that act as beacons for use in tuning into and out of sectors of the universe. Any attempt to understand the CMB or explain it using the four (4) known dimensions is probably a waste of time and will yield a faulty model with misrepresentations and misinterpretations that only further mislead the scientific fraternity. Einstein's Space-Time consists of 3 directions and the 4th dimension, Time (where time is nothing more than moving matter or "Motion"), which together form Space-Time. We have shown that this is not Space-Time and corrected it to Distance-Time, a Matter-Distance-Time (MDT) or, better still, a Matter-Distance-Motion universe. If the CMB is indeed Space, it is of a 5th dimensional construct. Most people would try to add an extra dimension to the 3rd and 4th to arrive at a 5th. Interestingly enough, we do not add an additional dimension to the 4th; instead we should subtract Time and subtract Distance. Why? Because to tune into a new continuum, dimension, signal or universe we do so by tuning out of the one we are already situated in. Having removed these, we enter the 5th dimension and tune from here into the next location of our choosing.
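As an illustration of the "many signals read as noise" idea above, and only as an analogy rather than a model of the actual CMB, the sketch below superposes a large number of unsynchronised channels; the combined signal reads as featureless hiss even though each channel on its own is perfectly orderly. The channel parameters are arbitrary assumptions.

```python
import math, random

def combined_signal(n_channels=500, n_samples=16, seed=1):
    rng = random.Random(seed)
    # Each "universe" broadcasts one steady channel: a fixed frequency and phase.
    channels = [(rng.uniform(0.5, 50.0), rng.uniform(0.0, 2 * math.pi))
                for _ in range(n_channels)]
    samples = []
    for k in range(n_samples):
        t = k / n_samples
        total = sum(math.sin(2 * math.pi * f * t + p) for f, p in channels)
        samples.append(total / math.sqrt(n_channels))   # normalise the superposition
    return samples

print(combined_signal())   # the mixture looks like hiss; no single channel stands out
```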
I have already extensively elaborated how the process of tuning is the function of removing Distance and Time from the physics we use to understand our universe. This 5th dimension is Space, which can be identified as, or through, the CMB. If this is true, the CMB cannot be understood using conventional physics. Trying to analyse a 5th dimensional universe using 4 dimensions will yield many false positives, and it is very likely that almost everything physics and astronomy thinks it knows about the CMB today is flawed, drawn from faulty top view observations and therefore a half-truth. It can only be fully understood when studied outside of Time (Motion) and Distance. Since Time (Motion) and Distance are the foundation upon which the entirety of physics is built today, we do not yet have the formal reasoning, mathematics, approach or model with which to begin to understand the CMB, or how to build a 5th dimensional tuner that will make sense of it, although I have tried from the beginning of this write-up to do this. I have also tried to point out that motion in a linear direction may not exist; however, motion or movement with no vector, such as spin, may be accommodated in attempts to understand Space. For instance, the difference between a distance of a kilometre and 5 million light years should not be seen in terms of how far apart they are, but rather in terms of the frequency they spin at, their radius from the centre of spin and the direction they occupy in that radius, which, when tuned into, is gained. Every location in the MDT universe, from the earth to the furthest galaxy, will have a specific frequency in Space on a scale infinitely tinier than that at the quantum level, allowing matter to be manipulated below the nuclear level and distances to be covered across the universe. If the Spatial frequency of any location in the Distance is known, it does not matter how far away it is in our universe, it can be tuned into; and it does not matter how tiny it is, it can be super-manipulated, for instance allowing bespoke materials to be constructed from the electrons, protons, nucleus and below. Spin would allude to a form of physics centred primarily on frequencies based on spin, which become the only basic means of rationally linking Space, where there is no distance, to our MDT universe, distance and spin having some shared properties that can be used to find workable mathematical linkages between states of existence that function on different properties. Staying in this line of thought, if a complete spin cycle is equivalent to a refresh rate and is the only logical means of linking true Space and the MDT universe, then this tiny aperture may yield more about technologies capable of directly manipulating Space and therefore a plethora of other phenomena, including gravity. Thus far we generally study 2 dimensional waves using amplitude (y-axis) and time (x-axis) to understand magnetism, but to understand gravity we would have to consider a third and fourth property of electromagnetic waves: a rate of spin around the x-axis, just as fast as the wave moves along the x-axis, that loops or corkscrews both the wave's amplitude and time from a 2 dimensional electromagnetic construct into a 3 dimensional wave.
This wave is then pulsed on and off just as rapidly, for instance, to create a 4th dimensional wave property, consequently allowing spin and a refresh process to create specialised frequencies that open a path to harnessing gravity by linking it to a 4th dimensional type of electromagnetism to which Space is able to respond with a push effect. Interestingly enough, in quantum mechanics it was discovered quite late that electrons do in fact possess spin. Yet again, however, we find that this specific spin property was strangely missing from Schrodinger's understanding of waves and was consequently not included in his famous wave equation. How is this oversight even possible? It's incredible how great minds in physics such as Einstein and Schrodinger could make such immense strides and insights and yet produce ideas with very obvious flaws that appear to act as misdirects preventing a clean or clear understanding of gravity and Space. However, it is also possible that Schrodinger noticed spin but, because his physics was based on Einstein's erroneous Space-Time model, could not account for it in his equations and decided to ignore it altogether. This possibility simply emphasises the potential Einstein's flawed model has had to weaken the analysis and research of past and modern-day physicists. In the same way Einstein's misdirect affected the outcomes of Schrodinger's work, it may still be affecting the work of physicists today. What other small lab research projects and billion-dollar experiments working in earnest are likely being led in the wrong direction by this misdirect? The potential harm the misdirect can cause in physics is real.
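To visualise the construction described above, here is a small sketch (my own, with arbitrary frequencies) of a wave given a spin about the x-axis and then pulsed on and off: plotted in three dimensions, the (t, y, z) samples trace a corkscrew that is switched by the on/off "refresh" envelope. It is only a geometric illustration of the described construct, not a physical model.

```python
import numpy as np

def pulsed_helical_wave(freq=1.0, spin_rate=5.0, pulse_rate=0.5,
                        duration=4.0, steps=2000):
    t = np.linspace(0.0, duration, steps)
    envelope = (np.sin(2 * np.pi * pulse_rate * t) > 0).astype(float)  # on/off "refresh"
    amplitude = np.sin(2 * np.pi * freq * t) * envelope                # the ordinary 2-D wave
    y = amplitude * np.cos(2 * np.pi * spin_rate * t)                  # spin about the time axis
    z = amplitude * np.sin(2 * np.pi * spin_rate * t)
    return t, y, z   # plotted together, (t, y, z) traces a switched corkscrew

t, y, z = pulsed_helical_wave()
print(y[:3], z[:3])
```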
Nevertheless, I am of the opinion that the complexity of the technology used in the creation and design of this receiver or tuner will determine the myriad ways in which gravity and related phenomena can be manipulated to obtain desired results, in much the same way electricity is used by different technologies to produce many devices with innumerable uses. The reason we fail to identify the CMB for what it really is, is the misdirect in Einstein's model that mistakenly labels Euclidean space as Space itself, when it is in fact just Distance linked to Time, Time itself being nothing more than Movement or Motion.
One of the misgivings of Einstein mislabelling distance by calling it Space has been the inevitable confusion it has created amongst the scientific fraternity. It misdirects astronomers by making them believe that when they look up at the stars in the night sky they are looking at "Space", and misdirects physicists by making them believe the vast emptiness between a nucleus and electrons is "Space", or that when distortion geometry is observed they are looking at Space-Time. This error is so pervasive it almost seems clandestine, in that it's easy to conclude that it might be deliberate. It gives humanity only one option for controlling gravity, through Einstein's erroneous Space-Time, making it practically impossible to do so. We keep on walking past the elephant in the room, even though we are desperately looking for an elephant. A consequence for science is that right now gravity, its understanding, manipulation and control, is pretty much like electricity during the Stone Age: electricity has always been here; what was lacking in the past was the means to see, understand, control and harness it. It seems this is the very same problem with gravity today. It's right here in front of us, but because of Einstein's error we simply can't see it, understand it, harness it or control it, even though it's right in front of us all. Gravity, when seen and understood, should be just as easy to control as electricity. This is unlikely to happen without correcting Einstein's model.
In fact, seeing Space and where gravity is coming from may not be as impossible as you think. It won't cost you an arm and a leg either. If you know what you're doing, you may not have to build a multi-billion-dollar, multi-kilometre-long array to understand gravity. If you have one of those old television sets, go and switch it on. When you are between channels, the "snow" you see is your first introduction to Space and where gravity is coming from. About 1% or less of the noise you see on the screen is the CMB.
The CMB: Welcome to Space.
Tune in, to go anywhere you so desire.
Back to LIGO
Why are we being pushed toward the earth at 9.8 m/s² (about 9.8 N on every kilogram of mass), rather than pulled by the earth? The distortion may be caused by the earth, but this gravitational push is not coming from the earth; it is coming from Space.
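For clarity on the figure itself: 9.8 is an acceleration in m/s², and the corresponding force on a body scales with its mass, whichever direction of action one attributes it to. A quick check:

```python
g = 9.8   # m/s^2: the acceleration at the earth's surface, not a force
for mass_kg in (1.0, 70.0):
    print(mass_kg, "kg ->", mass_kg * g, "N")   # 9.8 N on 1 kg, ~686 N on 70 kg
```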
A billiard ball can knock another billiard ball. This is an example of matter-on-matter action. Distance is just an aspect of matter, as matter is just an aspect of distance within the same continuum. For a large mass to act on the geometry of distance, bending, pushing, squashing or stretching it, is nothing more than matter acting on matter. What LIGO has done is very important. To some extent, what it has done is not prove Einstein was right; it has proven that he was close to the mark but, if the labels are placed correctly, got it wrong to some extent. With the labels in the right place it can be seen that the distortion detected at LIGO is not a gravity wave. It is the detection of a distortion in the matter-distance-time (MDT) universe we occupy. Any gravitational effect is "push-back" or "resistance" from ubiquitous Space: a by-product of the distortion, a response from Space. The reason this cannot be seen is that Einstein mistakenly labels distance as Space in his concept of Space-Time or Relativity Theory.
Matter-Distance-Time (MDT)
The diagram shows that mass, gravity and acceleration are one and the same and do not emanate from our dimension. According to Einstein, distortion geometry, gravity or the bending of Space-Time is what pulls the blue object down; this is incorrect, because Einstein mistakenly refers to the distortion geometry itself as "Space" or "Space-Time", a critical perception-based misdirect in physics. Shown in red, gravity emanates from ubiquitous Space as a form of resistance to Einstein's distortion, creating a push or acceleration and demonstrating that the distortion and gravity are not one and the same. This distinction is critical and can only be made by correcting Einstein's model.
|
The diagram above shows that mass, gravity and acceleration do not emanate from our dimension. Correcting Einstein's model shows that they are all created by the same force, and since that force emanates from Space it is outside our dimension. Consequently, as I mentioned earlier, objects have no actual or substantive volume or mass and therefore no genuine weight. Mass and time are useful for the experiential Universe, but they are not practical or efficient for the mechanics of how the Universe is created (that is, the operational Universe): it is not scientifically practical for matter or objects to be of excessive volume or weight, for primitive top view "Einsteinian Space" itself to be of great "distance", or for time to be of a burdensome duration; these will all inevitably be seen as very crude ways of understanding the Universe and the physics that applies to it.
Why do I keep belabouring this point? Thus far very little is known about Space. We tend to think we know a great deal about it because Einstein mistakenly took distance and called it Space. This is understandable, because it's a perception-based error anyone could make, and one belonging to an age in science. It has had immense repercussions in physics, but as with any subject of importance, changes in perspective bring about new ways of approaching the same ideas. If, for instance, distance and Space are completely different things, then the push effect depicted in red in this diagram does not have to be caused by distortion geometry in our matter-distance-time (MDT) universe. LIGO has proven that our MDT universe is very stiff or inflexible. Geometric distortion as a technology, or means of creating gravity, would require tremendous amounts of power to generate the resistance that would induce acceleration and that we would observe as gravity. The fact that gravity does not emerge directly from distortion geometry, but from Space, which is outside our MDT universe, together with the fact that the MDT universe is stiff, provides a very simple explanation for why gravity is experienced as such a weak force. However, since gravity is a by-product of geometric distortion, why use this very difficult, extremely weak, nearly impossible route to manipulate gravity? Why don't we boldly go where no man has gone before and instead go straight to the source of gravity, namely Space? Tiny manipulations of Space can induce a much larger push effect on matter-distance-time. However, finding Space, and understanding what it is and how it works outside of matter, distance and time, is the frontier physics needs to delve into. But you cannot look for something that you believe you have already found; you will simply stop looking, which is the tragedy. Today we mistakenly point to distortion geometry and call it "Space" or "Space-Time" when in fact we have mislabelled, and therefore have not yet found, what it is we speak of, inevitably misleading ourselves. When this happens physics stops moving forward in leaps and bounds because it is trapped in a theory loop caused by a misdirect. This is why I keep belabouring and stressing the need for science to correct Einstein's model.
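The "stiff or inflexible" remark can be put in rough numbers. Assuming a typical reported strain of about 1e-21 over LIGO's 4 km arms (publicly quoted figures), the corresponding change in length is a tiny fraction of a proton's diameter:

```python
strain = 1e-21            # typical dimensionless distortion reported by LIGO
arm_length_m = 4_000.0    # length of a LIGO arm
proton_diameter_m = 1.7e-15

delta_L = strain * arm_length_m
print(delta_L, "m")                    # ~4e-18 m of actual length change
print(delta_L / proton_diameter_m)     # a few thousandths of a proton's diameter
```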
Recent Media to Watch:
This interesting documentary, released recently (January 2019), concurs with my analysis. Here is the link to it. It's called Einstein's Quantum Riddle. The Institute for Advanced Study in the documentary is getting closer to the truth (at 46:34 in the video). When Robbert Dijkgraaf (Director of the Institute for Advanced Study and Leon Levy Professor) talks about correcting Einstein's model or understanding of the universe, he implies Space-Time is actually incorrectly interpreted by Einstein. The Institute is absolutely right in the sense that when Robbert Dijkgraaf talks about removing "Space and Time" altogether, what he actually alludes to is removing the concept of "Distance and therefore Motion" (as they are conventionally understood) from Einstein's interpretation. Space and Distance are separate and distinct, as are Time and Motion, and the general mistake that Robbert Dijkgraaf remarkably corrects by removing Space-Time (sic Distance-Motion) is Einstein's assumption that Space and Distance are one and the same, which is why it became impossible for Einstein to complete Unified Field Theory and why, with this problem now resolved, it should be possible to. It's certainly interesting to see that Space can exist independently and irrespective of Distance to create a "Holographic Universe", where spooky action at a "distance" becomes somewhat redundant when distance is removed to hypothetically create a universe consisting purely of quantum entanglement. This is a great documentary. It's nice to see the analysis I made many years ago proving to be correct today.
Notes:
Space and Time
According to Einstein's model, when you get up in the morning and get dressed, then go to the kitchen for breakfast, then stop in the living room to catch the news on TV, you have been to three different rooms at different times: the bedroom at 7am, the kitchen at 8am and the living room at 9am. However, according to my theory you woke up and got dressed, had breakfast and watched TV in one location, one frame or one dimension. Imagine you were watching these events on your TV: they would all have taken place in one location, that is, the TV screen in front of you. The bedroom, kitchen and living room are in fact in the same location, frame or dimension. If a physicist were calculating what you did based on the distance between each room and the time it took to move from one room to the next, all these calculations would, in a sense, be baloney, because you never actually moved to get from one room to the next. You were in fact in the same place the whole time; therefore time itself, as you may have been taught to understand it, did not elapse.
A Final Conclusion: Economics and Theoretical Physics [July 2020]
I can conclusively say I have broken the seals, so to speak, on two important areas in which modern science has to date failed to deliver conclusive results: the first is the inability of economics, business, accounting and finance to intrinsically identify the cause of, and provide a solution to, ending poverty. The other is the inability of physics and the sciences in general to explain and provide a working model or mechanics of a system able to deploy and harness gravity. Even though I may try to play it down, I am glad to say that in this month of July I have successfully, and beyond reasonable doubt, accomplished both these tasks. The arguments in the writing above demonstrate that I have spent many years trying to get down to the root of these problems and the knowledge paradigms in which they were enshrined, therefore this month represents a personal triumph and I feel at peace. Gravity is the most powerful force in the universe when it comes to humanity's physical existence, but scarcity is the most powerful when it comes to the resources humanity needs for its well-being. The fact that the sciences were unable to provide conclusive answers to these problems was a troubling issue for me, raising many questions about inconsistencies and inadequacies in knowledge and ascribed intellectual limitations. These were perception-based problems, the kind that, it seems, are the most difficult even for the most astute minds, because they require counter-intuitive processes to unravel the mysteries that cloud the path to accurately determining their truths.
[1] Punabantu Siize (2004) “Time”, Revision of Punabantu S (Nov 2003) “African Time”, Post Newspaper
[2] Einstein Albert (5th May 1920) “Ether and the Theory of Relativity” (an address delivered on May 5th, 1920, in the University of Leyden)
[3] Ibid.
[4] Op. cit.
[5] Jeremy Chapman (2010) “Relativity and Black Holes : The Beginning Becomes the End Becomes the Beginning : A study of cosmological birth and death”
[6] Tim Folger (2007) “Newsflash: Time May Not Exist”
[7] Wikipedia (2010) “Wave–particle duality”