speed of light slowed down

Status
Not open for further replies.
I should rephrase the title of this thread as "light slowed down", because the speed of light is not actually completely constant: any medium that is not a perfect vacuum slows light down a bit.
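To put rough numbers on that: the speed of light in a medium is c divided by the medium's refractive index n. A quick sketch (the indices below are assumed typical textbook values, just for illustration):

```python
C_VACUUM = 299_792_458  # speed of light in vacuum, m/s (exact by definition)

# Assumed typical refractive indices, for illustration only
media = {"air": 1.0003, "water": 1.33, "glass": 1.5}

for name, n in media.items():
    # v = c / n: light travels slower in optically denser media
    print(f"{name}: light travels at about {C_VACUUM / n:,.0f} m/s")
```

Even ordinary glass slows light to about two thirds of c; the experiments discussed here go far beyond that.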

The implications of this experiment are huge, really. Right now, electrons are used to send information, but with these new breakthroughs photons (the smallest particles of light) can be used instead, and they are smaller, faster, and produce less heat and use less energy. They claim to have commercial products using this technology by 2007. Could be exciting...
 
What's to keep IBM from patenting this technology and hiking the prices to incredible amounts? I can see how this will majorly affect the architecture of the computer itself, as opposed to using traditional 1s and 0s. A whole new way to look at computers! Good stuff. I'm really looking forward to seeing how these will be priced.
 
just to get a little more abstract:
I don't think it will do away with 1's and 0's as we are thinking, how bout maybe boosting it to say base 10 or hex ---- opening more pathways, boosting "logic" as it is in computers. With this we could very well realize the technology needed to replicate natural computers - the brain. Suddenly full 3D video and Stereo sound in tandem plus who know's how many other I/O methods could be realized, Im lovin this, but

"With great power comes great responsibility" - Ben Parker, right before his death.
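On the base-10/hex idea: it's worth noting that a base is only a way of writing a quantity down, not extra information. A quick illustration in Python:

```python
n = 202  # an arbitrary example value

# The same quantity written in three bases; no information is gained or lost
print(bin(n))  # binary representation
print(n)       # decimal representation
print(hex(n))  # hexadecimal representation
```

What actually changes between bases is how many distinguishable states each symbol (and each physical storage cell) must hold.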
 
TheMajor said:
I doubt Einstein will like this.
How does this contradict any of Einstein's work?

Originally posted by hillbillybob
just to get a little more abstract:
I don't think it will do away with 1's and 0's as we are thinking, how bout maybe boosting it to say base 10 or hex ---- opening more pathways, boosting "logic" as it is in computers. With this we could very well realize the technology needed to replicate natural computers - the brain. Suddenly full 3D video and Stereo sound in tandem plus who know's how many other I/O methods could be realized, Im lovin this, but
Not quite. Why would it be necessary to boost it to base 10? There are very basic issues with doing something like that, and no real need for such a change; it would not have a direct effect on boosting 'logic'.

The reasons are the practical barriers, starting with basic data storage. The whole point of binary is that it can be represented with a simple on and off: a 1 or a 0, a yes or a no. For example, when you store data on an optical disc such as a CD, bits are represented by pits and non-pits in a spiral around the written surface of the medium. How could you change this to something more complex, where there aren't two possibilities but ten? It would take much research to develop methods of levelling, or a more detailed 'bit'.

It is also totally unnecessary, and it could actually decrease data density, since such a 'bit' would now take up much more space. What's the point? We can increase 'logic' and data detail by adding parallelism and finding better ways to move the data around and process it. The binary system is incredibly easy to represent in simple systems. What benefits can there be from complicating this?
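The trade-off above can be made concrete: a higher base needs fewer symbols per value, but each physical cell must then reliably hold more distinguishable levels. A sketch, using exact integer arithmetic:

```python
def symbols_needed(num_values: int, base: int) -> int:
    """How many symbols of a given base are needed to distinguish
    num_values states. Integer arithmetic avoids floating-point log errors."""
    capacity, count = 1, 0
    while capacity < num_values:
        capacity *= base
        count += 1
    return count

# One byte's worth of states (256 values):
print(symbols_needed(256, 2))   # 8 binary symbols (simple on/off cells)
print(symbols_needed(256, 10))  # 3 symbols, but each cell needs 10 levels
print(symbols_needed(256, 16))  # 2 symbols, but each cell needs 16 levels
```

Whether three 10-level cells beat eight 2-level cells depends entirely on how reliably the medium can distinguish levels, which is exactly the practical barrier described above.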

Originally posted by molsen
Perhaps replace 1's and 0's with colors? Any thoughts?
Thoughts? Yes. When you're dealing with INDIVIDUAL PHOTONS, you do NOT have colours. Edit: oops, that was a careless statement. It's not entirely true: since light can exhibit properties of both matter and a wave, each photon *technically* has a wavelength, but measuring that per photon would be a bit ridiculous. Better just to use pulses of light to represent the 'on's and the 'off's.
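A minimal sketch of that pulses idea — on-off keying, where each bit of a byte stream becomes a pulse (1) or no pulse (0); the function name here is mine, not from any real optical library:

```python
def to_pulses(data: bytes) -> list[int]:
    """On-off keying: map each bit to a light pulse (1) or its absence (0)."""
    pulses = []
    for byte in data:
        for shift in range(7, -1, -1):  # most significant bit first
            pulses.append((byte >> shift) & 1)
    return pulses

print(to_pulses(b"A"))  # 'A' is 0b01000001 -> [0, 1, 0, 0, 0, 0, 0, 1]
```

Real optical links layer clock recovery and error correction on top, but the representation itself really is this simple: two states, pulse or no pulse.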
 
Qiranworms said:
How does this contradict any of Einstein's work?


I have to agree, in that Einstein theorized the universal speed limit is the speed of light in a vacuum. He said nothing about being able to slow light itself down. I wonder if he ever considered it?
 
Not quite. Why would it be necessary to boost it to base 10? [...] The binary system is incredibly easy to represent in simple systems. What benefits can there be from complicating this?


Thoughts? Yes. When you're dealing with INDIVIDUAL PHOTONS, you do NOT have colours.

Good point. I guess something I was trying to get at is this:

All the data of life is based on a base 4 (DNA), and this data is very compact (due to the molecular structure) and also very easy to read, with transcription and translation of "miles" of the stuff taking milliseconds. I figured that with base ten we would be able to address the basic shortcoming of computers: the inability to reason. I honestly don't think base-2 logic, though correct and extremely durable, will be propelled into a reasoning logic.

If you think about the human brain, we think in base 10. It takes a damaged mind, like most of us here, to think in base 2 on a regular basis. That said, just think about how much data our brains process on a very real basis. Sight and sound are two basic forms of data that computers now process; we still have touch, taste, and smell. We are trying to create computers in our image, so let's take a logical look at humans. Every second, our brain is processing terabytes or even more of data in real time, while right now CPUs are lucky to get one terabyte of data every second. On top of this, our brains are also processing other streams: thoughts, ideas, what we need to do that day, things to remember, memories. I guess, why re-invent the wheel? We already have one 4311 of a good template to follow.
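The base-4 point can be sketched in code: each DNA letter carries exactly two bits, so four letters encode one byte. The letter-to-bits mapping below is arbitrary, chosen purely for illustration:

```python
NUCLEOTIDES = "ACGT"  # arbitrary mapping for the sketch: A=00, C=01, G=10, T=11

def bytes_to_dna(data: bytes) -> str:
    """Pack each byte into four base-4 'nucleotide' symbols."""
    out = []
    for byte in data:
        for shift in (6, 4, 2, 0):  # two bits at a time, high bits first
            out.append(NUCLEOTIDES[(byte >> shift) & 0b11])
    return "".join(out)

def dna_to_bytes(seq: str) -> bytes:
    """Inverse: reassemble bytes from groups of four symbols."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for ch in seq[i:i + 4]:
            byte = (byte << 2) | NUCLEOTIDES.index(ch)
        out.append(byte)
    return bytes(out)

print(bytes_to_dna(b"Hi"))  # eight symbols encode two bytes
```

The round trip is lossless, which is the real lesson: base 4 (or base 10) stores the same information as base 2, just packed into symbols with more levels each.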

I like this kind of discussion. :D:D:D
 