Greetings everyone, and thank you in advance if you are reading this thread. A little background on this challenge: I've been developing the "template" for how this program should work and operate over the last 2 years. Now, as I am NOT a programmer by any means, I will leave it to the community at large to use the template I'm including in this thread in any way they desire. A working model of this template would have very large implications for the way data is stored and retrieved across all disciplines.
On to the challenge!
Ok, first you must understand the basic principle behind this system.
To do that, you must convert all your thinking into Binary for a brief period.
Ok, now that we're all in the right headspace, I will explain the very heart of the situation.
Let's say you had a big pile of decimal numbers, no pattern, no organization at all. Let's say we're talking about the first 8192 digits of PI (after the decimal point). I'm not going to post them all here, as that would be redundant and they are easily found elsewhere, but I will show an example with the first 32 digits.
1415 9265 3589 7932 3846 2643 3832 7950
Now, while you may be able to find patterns of some sort in this small sample, consider the millions more digits that follow them as I explain the rest.
Taking the above, let's say I applied a 10-digit binary KEY to it.
9 8 7 6 5 4 3 2 1 0 would be the value places for each digit of the KEY.
0 0 0 0 0 0 0 0 0 0 would turn all the decimal digits in the sample to a binary 0
0 0 0 0 0 0 0 0 0 1 would turn all the decimal 0's in the sample to a binary 1, the rest to 0's
0 0 0 0 0 0 0 0 1 0 would turn all the decimal 1's in the sample into a binary 1, the rest to 0's
Etc., so on and so forth. Now, excluding 0 0 0 0 0 0 0 0 0 0 and 1 1 1 1 1 1 1 1 1 1, since they would turn any sample into all 0's or all 1's (which wouldn't be very useful), we can create 1022 new combinations of the sample. If we apply that to the first 8192 digits of PI, we have created 1022 samples of 1024 bytes each, or nearly 1 MB (minus 2048 bytes for the 2 omitted KEYs). A minimal sketch of the mapping is below.
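Here is a small Python sketch of the KEY mapping as I understand it from the rules above: bit 9 of the key covers decimal digit 9, bit 8 covers digit 8, and so on down to bit 0 for digit 0. A digit becomes binary 1 if its key bit is set, otherwise 0.

```python
PI_32 = "14159265358979323846264338327950"  # the 32-digit sample above

def apply_key(digits: str, key: int) -> str:
    """Turn a run of decimal digits into a bit string using a 10-bit key."""
    # bit 9 of `key` covers digit 9, ..., bit 0 covers digit 0
    return "".join("1" if (key >> int(d)) & 1 else "0" for d in digits)

# KEY 0 0 0 0 0 0 0 0 1 0: every decimal 1 becomes binary 1, the rest become 0.
print(apply_key(PI_32, 0b0000000010))
# -> 10100000000000000000000000000000
```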
Now, let's say we increase our pool to 40 MILLION digits of PI, taking 1022 samples for every 8192 digits, shifting over one digit, and repeating until the very end.
This would mean that you would have (39,991,808 x 1022) "frames" of 1 KB each.
39,991,808 x 1022 is roughly 40.9 billion frames, or nearly 39 TB of data that would have to be stored, as the quick check below confirms.
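A quick check of that storage figure, using the post's own assumptions (8192-digit windows over 40,000,000 digits, stepping 1 digit at a time, 1022 usable keys, 1 KB per frame):

```python
windows = 40_000_000 - 8192            # sliding 8192-digit windows
frames = windows * 1022                # one frame per (window, key) pair
print(f"{frames:,} frames")            # 40,871,627,776
print(f"{frames / 1024**3:.1f} TB")    # ~38.1 TB at 1 KB per frame
```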
And that's just the tip of the iceberg. Using the "frames", the goal is to create an algorithm that can correctly build 1 KB of data from multiple frames stored in the system, using a best-match method.
The result will be a system that can create 1 KB of data using only the 10-digit key, a 40-million-digit database of PI (about 20 MB at 4 bits per digit, stored on an average user's computer/phone/DVD player, etc.), and the location within the database of where to go and how to build. NOW, the secret to this system is that once it has determined how to build a given KB, it never needs to determine it again: it will use that code to build that segment every time. So in essence, you never need to store the original KB ever again, just the code to build it from the system. A toy sketch of the matching step follows.
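A toy sketch of the "best match" lookup. The real system would search 40 million digits; this searches a 100-digit pool for a 16-bit target, and the scoring (count of agreeing bits) is my own assumption, since the post doesn't define how frames are scored or combined.

```python
PI_POOL = ("14159265358979323846" "26433832795028841971"
           "69399375105820974944" "59230781640628620899"
           "86280348253421170679")  # first 100 digits of PI after the decimal point

def frame(offset: int, key: int, n: int) -> str:
    """Bits produced by applying `key` to the n digits starting at `offset`."""
    return "".join("1" if (key >> int(d)) & 1 else "0"
                   for d in PI_POOL[offset:offset + n])

def best_match(target: str):
    """Exhaustively find the (offset, key) frame agreeing with target on the most bits."""
    n = len(target)
    best = (-1, 0, 0)
    for offset in range(len(PI_POOL) - n + 1):
        for key in range(1, 1023):      # skip the two excluded keys
            score = sum(a == b for a, b in zip(frame(offset, key, n), target))
            if score > best[0]:
                best = (score, offset, key)
    return best

score, offset, key = best_match("1011001110100101")   # a 16-bit toy target
print(f"offset={offset}, key={key:010b}: {score}/16 bits match")
```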
The trick is, since we're limiting the system to only 40 million possible combinations, the addresses used to build the data will only be a few bytes long, stored in hexadecimal:
10 bits for the key + an offset of up to 2625A00 in HEX (40,000,000 in decimal; 7 hex digits x 4 bits = 28 bits)
38 bits + modifiers = 48 bits max per address. Since 1 KB of data is 8192 bits, you could use up to 8192/48, or about 170, different FRAMES to build 1 KB. The odds of being able to build 1 KB of data using 170 different frames are, I believe, astronomically good; in fact I put it at a 99.9999% probability that, along with special build instructions per KB, you can make any combination you wish. A sketch of the address layout follows.
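A minimal sketch of that 48-bit address layout: 10 bits of key, 28 bits of offset, and 10 bits left over for the "modifiers". The post doesn't define the modifiers, so giving them exactly the remaining 10 bits is an assumption on my part.

```python
KEY_BITS, OFFSET_BITS, MODIFIER_BITS = 10, 28, 10   # 48 bits total

def pack_address(key: int, offset: int, modifiers: int = 0) -> int:
    """Pack (key, offset, modifiers) into one 48-bit address."""
    assert 0 < key < 1023 and 0 <= offset < 40_000_000
    return (key << (OFFSET_BITS + MODIFIER_BITS)) | (offset << MODIFIER_BITS) | modifiers

def unpack_address(addr: int):
    modifiers = addr & ((1 << MODIFIER_BITS) - 1)
    offset = (addr >> MODIFIER_BITS) & ((1 << OFFSET_BITS) - 1)
    key = addr >> (OFFSET_BITS + MODIFIER_BITS)
    return key, offset, modifiers

addr = pack_address(key=0b0000000010, offset=1_234_567)
print(f"{addr:012X}")            # 48 bits = 12 hex digits
print(unpack_address(addr))      # (2, 1234567, 0)
```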
The biggest part of this... and you'll probably need to brace yourselves if you've read this far. Once you have all the build codes for each KB of data stored in a file, you can run a compression program such as WinZip or gzip over it to make a zip file, and then repeat the entire process again.
So a little math:
Take a 1 GB file, with each KB stored using ~100 frames: that's 4800 bits per 8192, or roughly 59%. Zip the codes, run the whole process again, and after a couple of rounds we're down to around 250 MB of codes, and so on. I estimate that if the system is perfected, it will be possible to compress any file to no less than 6 KB (provided the compression doesn't happen on a home computer, but at a facility designed specifically to build KBs of data for this type of compression), and then reverse the process on a home computer using a very small database comprising only the first 40 million digits of PI, the code to decompress any zip "midway" points, and the ability to connect to the main facility, so that users can also compress files they already own into this format. The sketch below tallies the rounds this would take.
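Rough arithmetic from the paragraph above, taken at face value: each pass is assumed to cut the code file to about half, so the size after n passes would be size / 2**n. Whether repeated compression can actually keep halving is exactly what the challenge asks someone to demonstrate.

```python
size = 1 * 1024**3                  # 1 GB input, in bytes
passes = 0
while size > 6 * 1024:              # the claimed 6 KB floor
    size /= 2                       # assumed ~50% reduction per pass
    passes += 1
print(passes, "passes under the stated assumptions")   # -> 18
```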
Thank you very much if you decided to read all of this. Any questions or comments, feel free to post.