Multinet: Overnet, Lednet and Internets
Web reinvented. Many times.
Proprietary protocols were developed by the I.D.E.A. to maintain contact with its Explorers on duty. Heavy and poorly scalable, they were not suited for mass usage, and the whole Dominium found itself unprepared. Dimensional exploitation facilities suffered from delays in the connection to their homeworld of Terra, but the employees had to bear with it. However, the first Terran settlers were not willing to completely give up on their beloved Web 6.0.
No solution was found to maintain the network as it was across multiple dimensions. The IP+ system worked poorly over the gateways and was not fit to handle the loss and corruption of data at such a level.
We tried everything we could, Donovan. Repeating the request countless times just overloads the system. Yes, we've tried to connect the dimensions using cables, do you think we're that stupid? It eats the matter; anything we've left in the aether for more than a few hours comes back completely corroded and unusable. If you're willing to spend hundreds of billions daily just to maintain a connection that would cut out for a few minutes every hour, tell me. Otherwise, let the experts do their jobs in peace. We'll find something. Eventually.
Overnet
There is always a need for a ubiquitous web, no matter how expensive. The ever-accessible Overnet is built on a heavily twisted version of the Gemini protocol. It hosts only static sites: governmental, news, or megacorporate pages. It is the front page for many big groups, with very little interactivity. The biggest websites carry links that connect to their counterparts on the local internet, for more modern browsing.
The Overnet relies on an updated client-side TLS and the one-line-per-content-type concept of Gemini, mostly for practical reasons. The lightness that made the essence of early Gemini capsules disappeared in favour of more robust communication. The revised protocol, OGemini, carries the size of the payload in the header, along with many parameters taken from the now-dead HTTP. The response is divided into chunks of different sizes, as one transmitted chunk is equal to one OGemini line, be it text, image, or link. Each chunk starts with four leading bytes: the first two specify the position of the size indicator within the chunk and the next two give the type of the data. Lastly, a size indicator is placed in the line, usually right after the first four bytes, and specifies the exact size of the data chunk.
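The chunk layout above could be parsed as in the following sketch. The byte widths and the type codes are assumptions for illustration; the article only specifies two bytes for the position, two for the type, and a size indicator somewhere after them.

```python
import struct

# Hypothetical OGemini chunk layout, following the description above:
#   bytes 0-1: offset of the size indicator within the chunk (big-endian, assumed)
#   bytes 2-3: content-type code of the line (codes below are invented)
#   at that offset: a 4-byte size indicator (width is an assumption)
TYPE_TEXT, TYPE_IMAGE, TYPE_LINK = 0x0001, 0x0002, 0x0003

def parse_chunk(buf: bytes):
    """Return (content_type, data) for one OGemini chunk."""
    size_offset, ctype = struct.unpack_from(">HH", buf, 0)
    (size,) = struct.unpack_from(">I", buf, size_offset)
    data_start = size_offset + 4  # data follows the size indicator
    return ctype, buf[data_start:data_start + size]

# Build a sample chunk with the size indicator right after the four leading bytes.
payload = b"=> gemini://idea.example news"
chunk = struct.pack(">HH", 4, TYPE_LINK) + struct.pack(">I", len(payload)) + payload
ctype, data = parse_chunk(chunk)
```

Keeping the size indicator's position in the header rather than fixing it lets a server prepend per-type metadata before the size field, which may be why the fictional spec leaves it movable.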
To counter the heavy data loss caused by interdimensional travel, each request and response is sent continuously until a pingback is received; the pingback itself is sent five times. If no pingback arrives after a certain time, the request times out to avoid cluttering the network. This model makes up for the loss and corruption of packets, as both clients and servers are expected to have utilities to rebuild a shattered request. The Overnet is centralised in Terra, and responses are dispatched through one of the many portal rooms.
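The resend-until-pingback loop can be sketched as below. The gateway I/O callbacks, the loss rate, and the retry cap are all assumptions; only the five pingback copies come from the description above.

```python
import random

PINGBACK_COPIES = 5  # the article states a pingback is sent five times

def send_until_pingback(send, recv_pingback, max_tries=50):
    """Resend a request until any pingback copy arrives.

    `send` and `recv_pingback` stand in for the gateway I/O and are
    assumptions for this sketch. Returns True on acknowledgement,
    False when the request times out (modelled here as a retry cap).
    """
    for _ in range(max_tries):
        send()
        if recv_pingback():
            return True
    return False  # timed out to avoid cluttering the network

# Simulate a lossy interdimensional gateway: 70% of packets vanish,
# but with five pingback copies at least one usually gets through.
random.seed(42)
lost = lambda: random.random() < 0.7
ok = send_until_pingback(
    send=lambda: None,  # payload transmission is irrelevant to the sketch
    recv_pingback=lambda: any(not lost() for _ in range(PINGBACK_COPIES)),
)
```

Sending five pingback copies turns a 70% per-packet loss rate into roughly a 17% chance that an entire acknowledgement round is lost, which the continuous resends then cover.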
Overnet addressing went back to the old IPv4 system, with the address representing the requesting dimension. There, the requested information is cached by the transit server, which uses the local protocol to convey the responses to the client.
Web evolution
The web has come a long way since its origins. First, it was designed to allow global access to information; its second version then focused heavily on website interactivity. The decentralised 3.0 met great resistance at first, but its model stayed in place for a while, until the creation of the United Nations of Terra. Web version 4 was about doing everything online, with access to wonderful virtual worlds through cheap VR gear. After the great recession of the 22nd century, this model was no longer suitable, and the short-lived Web5 was the only version to reduce functionality. Focused on sobriety and on alternative networks smaller than the web, it was mostly dominated by gated communities on private networks. The age of prosperity brought by the exploration allowed for a new breakthrough in augmented reality. The new web guarantees access to information and web content from anywhere at any time. The official change of version to Web 6.0 came well after this new paradigm, with the upcoming DNS/IP+ protocol.
DNS/IP+ Protocol
With the inevitable shortage of IPv6 addresses, a new way to specify the physical addresses of network endpoints had to be found. Rather than a 256- or 512-bit IPv7, which would eventually become limited as well, a new protocol was designed. Named DNS/IP+, it treats domain names themselves as physical addresses. Routing is done through a new URI scheme that orders the domain levels by their hierarchy. Hence, the top-level domain is resolved first, and the corresponding DNS server then sends a request for the second-level domain, and so on until the whole IP+ request is resolved. For example, the URI www.idea.com would become com.idea.www in the DNS/IP+ URI scheme, and would correspond to a physical address.
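The hierarchy-first reordering described above is just a reversal of the dot-separated labels. A minimal sketch, using the article's own www.idea.com example (the function name and the round-trip helper are illustrative, not part of the fictional spec):

```python
def to_ipplus(host: str) -> str:
    """Reorder domain levels by hierarchy, per the DNS/IP+ scheme:
    the top-level domain comes first so it is resolved first."""
    return ".".join(reversed(host.split(".")))

def from_ipplus(addr: str) -> str:
    """Inverse mapping back to a classic hostname (same reversal)."""
    return to_ipplus(addr)

converted = to_ipplus("www.idea.com")   # the article's example
roundtrip = from_ipplus(converted)
```

Since each label is resolved left to right, a DNS/IP+ resolver can forward the remaining suffix of the address down the chain of servers, one level per hop.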