speed of light slowed down

If you must, point by point:

hillbillybob said:
Life, all the data of life is based off of a base 4 (DNA) and this data, very compact (due to the molecular structure) is also very easy to read with transcription and translation times for "miles" of the stuff taking milliseconds.

These aren't directly comparable. DNA is totally different, and without going into much detail, it isn't strictly the same type of 'information' that must be processed. These four 'bases' are grouped into 64 possible codons (AAT or GCT, for example), 61 of which specify the 20 available amino acids, while the remaining few serve as 'stop' or 'start' signals. These amino acids are used to build proteins. In other words, DNA is more of a blueprint. Information is never actually processed in a form represented as codons.
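As an aside, the codon arithmetic is easy to check, and it also shows that a base-4 alphabet carries exactly two bits per symbol, so there is nothing in DNA's encoding that binary couldn't express. A quick sketch (Python, purely illustrative):

```python
# Illustrative only: count the codons a 4-letter alphabet yields, and how many
# bits of information each base and each codon carries.
from itertools import product
from math import log2

BASES = "ACGT"
codons = ["".join(c) for c in product(BASES, repeat=3)]

print(len(codons))        # 64 possible codons (4**3)
print(log2(len(BASES)))   # 2.0 -> each base is worth exactly two bits
print(log2(len(codons)))  # 6.0 -> each codon is worth six bits
```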

I figured that with base ten, we would be able to compensate for the basic shortcoming of computers: the inability to reason. I honestly don't think base 2 logic, though correct and extremely durable, will ever be propelled into a reasoning logic.
I still don't see how changing the format of the computed information will change what the computer can actually do with it. Reasoning logic will not be derived from complicating a system that works. In theory, everything can be broken down into a binary representation to an arbitrary level of detail. If you change the information structure so that the numbers it deals with have 10 possibilities instead of 2, it won't suddenly give the computer the ability to 'think'. I still fail to see how you arrive at this conclusion.

If you think about the human brain, we think in base 10. It takes a damaged mind, like most of us here, to think on a regular basis in base 2.
What? No, we don't... show me a source for that. If you mean that we generally think of numbers in terms of base ten, that's irrelevant, and furthermore, it's learned, not natural.

That said, just think about how much data our brains process in real time. Sight and sound are two now-basic forms of data that computers process, and we still have touch, taste, and smell. We are trying to create computers in our image, so let's take a logical look at humans. Every second, our brain is processing terabytes or more of data in real time; right now, CPUs are lucky to handle 1 terabyte of data per second. On top of this, our brains are also processing other streams: thoughts, ideas, what we need to do for that day, things to remember, memories. I guess, why re-invent the wheel - we already have one 4311 of a good template to follow.
Why are we trying to do that? There are some robotics projects like that, but otherwise, what's the point? Computers are not meant to be like us; they are meant to be useful to us.
 
Interesting. Point and counterpoint.

Hmm... give me a little while to think on this and to dig up where I got that information. For right now, we can agree to disagree.

Thank you, this has been most enlightening.

Any other hypotheses? This thread has gone really deep in a short while. I'm still interested.
 
Yes, they aren't comparable, because they are different. I was simply using it as an analogy. What I'm talking about is completely restructuring the logic. Base 10 will handle everything base 2 will, precisely, while providing the space and capacity to handle the type and amount of data that will need to be processed. Yes, computers should be useful to us, that's what they are made for, but to continue to be useful, they will need to evolve and be able to reason.

Yes, I agree, DNA is a blueprint, but I don't want computers to copy a blueprint. What I'm trying to espouse is an attitude, not an algorithm. For technology to continue to progress, there will need to be a huge change in attitude. The basic concepts developed in the 40's, 50's, and 60's will still have a bearing, but how they're used, their application, will change.

By changing from base 2 to base 10, we will have the ability to process more data in a smaller frame. It's not complicating things, because you can use ranges, limits, and so forth when needed. Take, for example, the base 10 number "128" and the same number in base 2, 8 bits: "10000000". The B10 number takes 3 characters; the B2 number takes 8 characters. Which takes longer to process? All things aside, literally, which is a longer string? No, this will not suddenly give the computer the ability to reason, but it will allow the computer to see other avenues, to see the requisite "grey areas", if you will, that are part and parcel of reasoning. It will also allow more data to be conveyed in a smaller space.

We don't just think of numbers in base ten; we think of other things in base 10 too. I have to disagree on it being learned. Work done with apes shows they think in base 10. A child, before induction into formal education, will classify and categorize data based (not wholly, but closely) on their fingers, which are base 10. Base 10 is just a natural progression of our own manipulative limbs (fingers). Which is just as well, because we could have just as easily thought in base 2 (two arms), but we use base 10 because it applies more to our daily world, our environment.
lol, as for the damaged mind portion, obviously you didn't see the tongue planted firmly in cheek when I mentioned "we" have a damaged mind.

Yes, in a very real sense, we are trying to create computers in our own image, and I'm not talking physical - mental, psychological, absolutely. I don't care if it's the most basic PDA or a highly advanced robot; it all stems from the same thing: we are expecting computers to do more and more of our daily menial tasks (calculations, welding, typing, time management). Is it not natural for them to be expected to follow in our image?

As for all this, this is mostly my opinion, thoughts derived from thinking (albeit creatively), and if given the chance, I would pursue this line of thinking into chip and program design. I adamantly believe in this philosophy. You are more than welcome to ask questions, as I will be asking questions, but by no means am I pushing my philosophy as right or wrong - that's too binary - but simply as another angle. The ideas of right, wrong, AND otherwise are what is going to send computers into the next generation; note that we are still operating 4th-generation machines.
 
There are fundamental flaws in your logic here...

hillbillybob said:
What I'm talking about is completely restructuring the logic. Base 10 will handle everything base 2 will, precisely, while providing the space and capacity to handle the type and amount of data that will need to be processed.
No, no, and no. Here's the issue with this whole thing: base 10 offers NO advantages over base 2. There is no extra space or capacity it offers that base 2 doesn't. If you have five apples, you can represent the number as '5', or as '101'. It doesn't change the number of apples you have. Any number that can be represented in base 10 can be represented in binary.
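If it helps, here's a trivial sketch (Python, purely illustrative) of that point: the radix changes only the notation, never the value.

```python
# The value is the same no matter which base we write it in.
n = 5
assert int("101", 2) == n        # "101" read as base 2
assert int("5", 10) == n         # "5" read as base 10
assert format(n, "b") == "101"   # and back to a binary string
# Five apples are five apples, whichever string we use to write the number.
```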

By changing from base 2 to base 10, we will have the ability to process more data in a smaller frame.
No, we won't. In base 2, we can use the smallest possible representation of each bit, because all it needs is to be one way or another. If you were to change it to base 10, you would suddenly have to work on a larger scale to represent the ten different 'kinds' of digits. You need 10 unique states, instead of just 2. If you can take a particle and point it one way or another, and have the ability to read which it is, you have a bit. With base 10, you are now required to have many more states of this particle. You can't just have one direction or the other; you have to have partial directions, which would require much more precise reading heads. And if you have such precise reading heads, you might as well use them to make the bit much smaller, instead of strangely detailed at the larger size.
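A tiny sketch of the detector side of this (Python, purely for illustration): a binary cell needs one decision threshold, while a ten-level cell needs nine squeezed into the same physical swing.

```python
# Illustrative: how many cut-points a reader must resolve to tell the states apart.
def thresholds(levels):
    """Decision thresholds needed to distinguish `levels` states on one carrier."""
    return levels - 1

print(thresholds(2))    # 1 -> binary: a single "above or below the midpoint?" test
print(thresholds(10))   # 9 -> decimal: nine cut-points within the same signal range
```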

It's not complicating, because you can use ranges, limits, so on and so forth when needed.
That doesn't make any sense at all...

Take, for example, the base 10 number "128" and the same number in base 2, 8 bits: "10000000". The B10 number takes 3 characters; the B2 number takes 8 characters. Which takes longer to process? All things aside, literally, which is a longer string? No, this will not suddenly give the computer the ability to reason, but it will allow the computer to see other avenues, to see the requisite "grey areas", if you will, that are part and parcel of reasoning.
Which takes longer to process? 128. Why? Because 10000000 is representable by simple pulses of electricity: an on or an off. You have a pulse, or you don't have a pulse, and that can be modulated at extremely high frequencies. When you have 9 different types of pulses instead of 1 (the 10th being 'no pulse'), you have to vary the intensity to distinguish between them. You need extremely accurate methods of measuring the received pulses, and the sent ones, so that the type of digit can be determined. It isn't a simple 'is there a pulse or not'; it becomes 'if there's a pulse, on a scale of 1-9, which is it?' That's ridiculously impractical, and it will actually slow things down, since sending the data as electrical modulation through a wire would have to become MUCH more precise.

Then it becomes extremely complicated when you have to increase cable length. The signal at the end won't have as much energy as when it started, right? What if it was then mistaken for the digit one intensity level lower? That would mean increasing the wire length could fill your data with errors! Keeping data accurate in a base 10 system would require that the receiving end take into account the length of the wire. That would mean that any time you plug a cable into your computer, or build a circuit, you can't just send the data through it; you now have to calibrate both sides of the connection. Now think of the complications that would cause WITHIN a CPU, for example. It would slow computing down enormously.
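Here's a toy model of that attenuation problem (Python, with completely made-up voltages and loss figures, purely to illustrate the idea): the same signal loss that a binary receiver shrugs off pushes a ten-level symbol into the wrong bin.

```python
# Illustrative only: decode an attenuated pulse with 2-level vs. 10-level signalling,
# assuming levels are evenly spaced on a 0..1 volt scale.
def decode(voltage, levels):
    # Snap the received voltage to the nearest of `levels` evenly spaced symbols.
    step = 1.0 / (levels - 1)
    return round(voltage / step)

attenuation = 0.9                      # signal arrives at 90% of its sent amplitude

# Ten-level signalling: send symbol 7 (out of 0..9)
sent = 7 / 9                           # ~0.778 V
print(decode(sent * attenuation, 10))  # -> 6, the wrong symbol

# Binary signalling: send a 1
print(decode(1.0 * attenuation, 2))    # -> 1, still correct
```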

It will also allow more data to be conveyed in a smaller space.
Again, no, it won't. Bits with only two possible states can be scaled down to the smallest size we can practically work with. When you need ten states per cell, it results in a LARGER size; it actually has the opposite effect on storage space. Binary is the simplest possible system, and because of this, it can continue to be scaled down many times smaller than a base 10 system could be.
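To put a rough number on it (Python, illustrative only): count the symbols each base needs for a value and weight each symbol by the number of states it has to hold. The 'shorter' decimal string always ends up costing at least as much total state as the longer binary one.

```python
from math import log2

# A decimal character must hold ten states, i.e. about 3.32 bits' worth of state.
for n in (5, 128, 10**6):
    dec_symbols = len(str(n))
    bin_symbols = n.bit_length()
    print(f"{n}: {dec_symbols} decimal symbols ~ {dec_symbols * log2(10):.1f} bits of state, "
          f"vs {bin_symbols} binary symbols = {bin_symbols} bits of state")
```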

We don't just think of numbers in base ten; we think of other things in base 10 too. I have to disagree on it being learned. Work done with apes shows they think in base 10. A child, before induction into formal education, will classify and categorize data based (not wholly, but closely) on their fingers, which are base 10. Base 10 is just a natural progression of our own manipulative limbs (fingers). Which is just as well, because we could have just as easily thought in base 2 (two arms), but we use base 10 because it applies more to our daily world, our environment.
Base 10 is a number system that we work with probably, as you stated, because we have ten fingers to count on. Note, however, that base 10 did not always exist. Moreover, we do not THINK in base 10. You've compared the human brain to a computer a number of times, but it doesn't work like that. We don't process information with a number system. We count in base 10, but only because we are used to it. Apes do not *think* in base 10; they may 'use' their ten fingers to count something, however. I don't know if it's true that they count, and I'd appreciate a link for that. When you think about it, the immediately apparent way to count, assuming you weren't raised understanding base 10, would be to give every number a symbol. There is absolutely NOTHING special about ten, other than the number of fingers we have. We do not PROCESS INFORMATION the way a computer does. The functions of our thought and feeling are not derived from millions of numerical calculations every second.

lol, as for the damaged mind portion, obviously you didn't see the tongue planted firmly in cheek when I mentioned "we" have a damaged mind.
I got that, I was responding to the first line where you stated that we think in base 10. We do not think in base ten, read my above paragraph.

Yes, in a very real sense, we are trying to create computers in our own image, and I'm not talking physical - mental, psychological, absolutely. I don't care if it's the most basic PDA or a highly advanced robot; it all stems from the same thing: we are expecting computers to do more and more of our daily menial tasks (calculations, welding, typing, time management). Is it not natural for them to be expected to follow in our image?
It isn't natural to expect that. They are tools, not new people. Granted, there are future projects that will allow us to interact with computers in a more 'human-like' way, but the goal isn't to make them 'humans'.

I adamantly believe in this philosophy.
Wait a minute... how old are you?

The ideas of right, wrong, AND otherwise are what is going to send computers into the next generation; note that we are still operating 4th-generation machines.
There can still be right, wrong, and otherwise in our systems if you need it; just program for those options. Any number of possibilities can be represented in the binary system; it does not impose any such limitation. I assure you that our programs already perform tasks that have more possible outcomes than 'right and wrong'. That isn't what is going to 'send computers into the next generation'.
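For instance, here's a quick sketch (Python; the names and the scoring rule are made up for the example) of how 'right, wrong, and otherwise' is already an ordinary thing to encode on a binary machine:

```python
from enum import Enum

class Verdict(Enum):
    RIGHT = 0
    WRONG = 1
    OTHERWISE = 2   # the "grey area" fits in the same two bits

def judge(score):
    # Hypothetical rule, purely for illustration.
    if score > 0.7:
        return Verdict.RIGHT
    if score < 0.3:
        return Verdict.WRONG
    return Verdict.OTHERWISE

print(judge(0.5))   # Verdict.OTHERWISE
```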

You are more than welcome to ask questions, as I will be asking questions, but by no means am I pushing my philosophy as right or wrong - that's too binary - but simply as another angle.
No. I'm sorry. There is no way this can work. A computer with a base 10 design would be extremely expensive, prone to errors and corrupted data, would compute more slowly than base 2 built with the same technologies, would store much less data in the same amount of space, and in some ways would barely be possible to create. All in all, it unnecessarily complicates EVERYTHING established today, for NO BENEFIT at all. Base 10 has NO ADVANTAGES over base 2. There is nothing it would improve in computing. There is absolutely no reason to get rid of binary. There is NOTHING base 10 can represent that binary cannot. There is nothing you can process in base 10 that you cannot process in binary. Number systems simply don't work that way. They are just that: systems. Your system will not work. It will not happen. People doing research in these fields clearly understand this; otherwise it would seem like an obvious thing to do. Sorry.
 
Thank you, but I think we can agree that we disagree on this subject. You've made some valid points, and this is all in the spirit of sharing ideas, insight and knowledge.

By the way, *grin*, I'm 23.
 
Thank you, but I think we can agree that we disagree on this subject. You've made some valid points, and this is all in the spirit of sharing ideas, insight and knowledge.

Does this mean you cannot dispute any of my points?
 
No, that does not mean I cannot dispute any of your points. Rather, I saw it as an opportunity to throw some ideas around. I've spent a lot of time thinking about this. That doesn't mean I'm 100 percent right, but I do like to seek an audience with an open mind. I believe that was part of the spirit of discussing new technology.

Second, I for one was starting to lose my temper. I didn't want a fight; I think flaming is childish. I came here as a professional, and I will maintain my professionalism. That said, the problem is that one of us (or both, for that matter) was failing - whether it was my failure to convey my ideas in a concise, clear manner or your failure to have an open mind, we will never know, and it really isn't for either one of us to decide.

This said, I think moving on is the order of the day. It is a great joy of mine to carry on debates, but it has to stop when tempers start flaring.
 
I have to agree, in that Einstein theorized that the universal speed limit was the speed of light in a vacuum. He said nothing about being able to slow light itself down. I wonder if he ever considered it?

Yes. Yes, I have considered it.
 
I find this interesting. Speeding it up:


Light Exceeds Its Own Speed Limit, or Does It?
By JAMES GLANZ


The speed at which light travels through a vacuum, about 186,000 miles per second, is enshrined in physics lore as a universal speed limit. Nothing can travel faster than that speed, according to freshman textbooks and conversation at sophisticated wine bars; Einstein's theory of relativity would crumble, and theoretical physics would fall into disarray, if anything could.

Two new experiments have demonstrated how wrong that comfortable wisdom is. Einstein's theory survives, physicists say, but the results of the experiments are so mind-bending and weird that the easily unnerved are advised--in all seriousness--not to read beyond this point.

In the most striking of the new experiments a pulse of light that enters a transparent chamber filled with specially prepared cesium gas is pushed to speeds of 300 times the normal speed of light. That is so fast that, under these peculiar circumstances, the main part of the pulse exits the far side of the chamber even before it enters at the near side.
 