
 

Achieving Server Autonomy

Started by ssnrobtcok, Aug 18, 2023, 06:01 AM


ssnrobtcok (topic starter)

The goal is to host a certain amount of information without losing it and to keep it accessible on the Internet indefinitely. Ideally, the server should require minimal or no maintenance, since there may be nobody available to support it in the future.

One option is an "eternal server," such as those offered at vdsina.ru/eternal-server, eternalhost.net/vps, or cloudatcost.com/dedicated-servers. It is even possible to run several servers and configure mirrors through DNS. However, a domain can only be prepaid for a maximum of ten years.
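
As a rough illustration of the DNS-mirror idea, the sketch below resolves every A record published for a hypothetical domain and checks that each mirror still answers HTTP, using only the Python standard library:

# A minimal mirror check, assuming the hypothetical domain
# archive.example.org publishes one A record per mirror.
import socket
import http.client

DOMAIN = "archive.example.org"

def check_mirrors(domain, timeout=5):
    # gethostbyname_ex returns (hostname, aliases, list_of_ip_addresses)
    _, _, addresses = socket.gethostbyname_ex(domain)
    results = {}
    for ip in addresses:
        try:
            # Connect to the mirror by IP but send the shared Host header,
            # so every mirror is asked for the same site.
            conn = http.client.HTTPConnection(ip, 80, timeout=timeout)
            conn.request("HEAD", "/", headers={"Host": domain})
            results[ip] = conn.getresponse().status
            conn.close()
        except (OSError, http.client.HTTPException) as exc:
            results[ip] = "unreachable: %s" % exc
    return results

if __name__ == "__main__":
    for ip, status in check_mirrors(DOMAIN).items():
        print(ip, status)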

Alternatively, it's worth exploring hosting technologies themselves to achieve the desired level of fault tolerance. While static hosting seems like an ideal solution, it may not be feasible in practice. The question then becomes: how can we ensure maximum fault tolerance in this situation?

Unfortunately, relying on modern technologies like Docker may not be viable long term: they are likely to evolve significantly within a decade and may no longer be compatible with the versions installed on the server.


debbiec12323

Ensuring maximum fault tolerance and long-term accessibility for your hosted information is indeed a challenging task. Here are a few approaches you can consider:

1. Distributed Storage: Instead of relying on a single server, you can distribute your data across multiple servers or storage systems. Technologies like RAID (Redundant Array of Independent Disks) or erasure coding can help protect against hardware failures. Additionally, using distributed file systems like IPFS (InterPlanetary File System) or Storj can keep data available even if individual servers go down (see the parity sketch after this list).

2. Blockchain-based Storage: Blockchain technology provides decentralized and immutable storage solutions. Platforms like Arweave or Sia utilize blockchain to store data securely and guarantee its long-term accessibility, as the data becomes part of a distributed ledger.

3. Decentralized Web: Projects like Solid, IPFS, or Dat aim to create a more resilient and long-lasting web infrastructure by decentralizing content storage and removing reliance on central servers. By hosting your data on these platforms, you can increase fault tolerance and ensure long-term accessibility.

4. Self-hosted Infrastructure: Instead of relying on external hosting providers, you can consider setting up your own self-hosted infrastructure using open-source software and commodity hardware. This way, you have full control over the environment and can adapt it to specific needs, ensuring compatibility and maintenance over time.

5. Document Standards: Using open and widely adopted document standards (such as HTML, XML, or PDF/A) ensures that your information remains accessible in the future. By adhering to universal formats, you reduce the risk of data loss due to technological obsolescence.

6. Redundant Data Centers: Hosting your data across multiple geographically dispersed data centers provides redundancy and minimizes the risk of data loss due to natural disasters or infrastructure failures. Cloud providers like Amazon Web Services (AWS) or Google Cloud Platform (GCP) offer data replication options across regions or availability zones.

7. Regular Backups: Implement a robust backup strategy to create regular copies of your data. Store backups in different physical locations or cloud storage services to safeguard against data loss or corruption. Automated backup solutions simplify this (a minimal script follows this list).

8. Version Control Systems: Utilize version control systems like Git to track changes and revisions in your hosted content. It provides a way to roll back to previous versions if necessary and ensures the integrity and availability of historical data.

9. Monitoring and Alerting: Implement automated monitoring systems that continuously check the health and availability of your hosting infrastructure, so issues can be detected and addressed before they affect accessibility (see the probe sketch after this list).

10. Documentation: Maintain comprehensive documentation that describes the structure, organization, and dependencies of your hosted information. This documentation will aid future administrators or users in understanding and maintaining the system.

11. Long-Term Planning: Acknowledge that technologies will evolve over time, and plan for periodic reviews and updates to your hosting infrastructure. This includes regular hardware upgrades, software updates, and migration strategies to adapt to changing technologies and ensure continued accessibility.
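
Regarding point 1, the simplest form of erasure coding is single parity (RAID-5 style): any one lost block can be rebuilt by XOR-ing the surviving blocks. A minimal sketch, assuming equal-sized blocks; a real deployment would use a proper Reed-Solomon library:

# Single-parity redundancy: the XOR of all data blocks is stored separately,
# and any one missing block equals the XOR of everything that survives.
def xor_blocks(blocks):
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

data_blocks = [b"AAAA", b"BBBB", b"CCCC"]   # equal-sized data blocks
parity = xor_blocks(data_blocks)            # kept on a separate disk or server

# Simulate losing block 1 and rebuilding it from the rest plus parity.
surviving = [data_blocks[0], data_blocks[2], parity]
recovered = xor_blocks(surviving)
assert recovered == data_blocks[1]
print("recovered:", recovered)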
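
Regarding point 7, even a small script driven by cron covers the basics. A minimal sketch with hypothetical paths, writing timestamped archives and pruning old ones; the resulting files should still be copied to a second location:

# Timestamped backup with simple retention (hypothetical paths).
import pathlib
import tarfile
import time

SOURCE = pathlib.Path("/srv/site")   # hypothetical data directory
DEST = pathlib.Path("/backups")      # hypothetical backup directory
KEEP = 30                            # number of archives to retain

DEST.mkdir(parents=True, exist_ok=True)
stamp = time.strftime("%Y%m%d-%H%M%S")
archive = DEST / ("site-%s.tar.gz" % stamp)

with tarfile.open(archive, "w:gz") as tar:
    tar.add(SOURCE, arcname=SOURCE.name)

# Drop the oldest archives beyond the retention count
# (the timestamped names sort chronologically).
for old in sorted(DEST.glob("site-*.tar.gz"))[:-KEEP]:
    old.unlink()
print("wrote", archive)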
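
Regarding point 9, monitoring can start as a plain availability probe that raises an alert when a check fails. A minimal sketch with hypothetical URLs; the print could be swapped for mail or a messenger webhook:

# Availability probe (hypothetical URLs); schedule it every few minutes.
import urllib.request

URLS = [
    "https://mirror1.example.org/",
    "https://mirror2.example.org/",
]

def is_up(url, timeout=10):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

for url in URLS:
    if not is_up(url):
        print("ALERT: %s is unreachable" % url)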

berilo

1. The concept of "eternal servers" is misleading; it is essentially a marketing pitch that stops meaning anything once the company changes hands, reorganizes, or closes. In reality, these servers are only promised to work for a limited period, typically three years, with no certainty beyond that.

2. The nature and scope of the information you possess, as well as the definition of the term "host," remain unclear. Furthermore, the criteria for autonomy are not well-defined. Should the information be made publicly accessible? Does it require indexing? Is access control necessary? Which protocol should be used for accessibility? Will the information solely benefit you in 10 years or others as well? Is it licensed content susceptible to removal under the DMCA (such as movies, TV series, or music)? Does it contain personal data (such as database leaks)? Is the information currently popular or in demand? Is there a likelihood of sustained relevance over the next decade? Is the information categorized or cataloged? Does the information revolve around a specific theme, scientific field, or dedicated archive? Lastly, does convenience and speed of access play a crucial role?

There exists a plethora of technologies, each with different purposes. It is imperative to answer these questions to discard unsuitable options and thoroughly consider appropriate alternatives.

3. If the information is public and will retain its relevance over the next decade, then a combination of DC++, BitTorrent, and web storage with direct file links works well, with the direct links added to the torrent as webseeds.
BitTorrent has been in wide use since the early 2000s and is a well-accepted, reliable solution supported on every major operating system.
DC++ is still popular, mainly because it allows easy file search by name or directory and effortless updating of the shared information without relying on a "directory" in the form of a .torrent file.
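
To make the webseed part concrete: a .torrent is just a bencoded dictionary, and a BEP 19 webseed is simply an extra "url-list" key pointing at a plain HTTP copy of the file. A minimal single-file sketch with hypothetical tracker and webseed URLs; in practice a tool such as mktorrent or a library would do this:

# Build a single-file .torrent with a webseed (BEP 19 "url-list" key).
import hashlib
import os

def bencode(value):
    # Bencoding knows integers, byte strings, lists and dictionaries only.
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, str):
        value = value.encode()
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        items = sorted((k.encode(), v) for k, v in value.items())
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
    raise TypeError(type(value))

def make_torrent(path, tracker, webseed, piece_len=256 * 1024):
    pieces = b""
    with open(path, "rb") as f:
        while chunk := f.read(piece_len):
            pieces += hashlib.sha1(chunk).digest()   # one SHA-1 per piece
    meta = {
        "announce": tracker,
        "url-list": [webseed],                       # BEP 19 webseed
        "info": {
            "name": os.path.basename(path),
            "length": os.path.getsize(path),
            "piece length": piece_len,
            "pieces": pieces,
        },
    }
    out = path + ".torrent"
    with open(out, "wb") as f:
        f.write(bencode(meta))
    return out

# Hypothetical usage:
# make_torrent("archive.zip", "udp://tracker.example.org:6969/announce",
#              "https://files.example.org/archive.zip")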

However, if the information is confidential or needs access control, and neither you nor anyone else will be able to maintain it for ten years (say, a scenario involving a ten-year prison sentence), then paying a reputable cloud storage provider such as Google, Yandex, Microsoft, or Apple for the whole period is likely more reliable than ordinary shared hosting, or even a virtual private server (VPS).

For smaller amounts of non-copyrighted, unindexed, but valuable data, free hosting platforms like Ucoz, Google Sites, Neocities, Bitbucket, GitHub, or SourceForge can be used. SourceForge in particular supports larger files and direct download links, which makes it suitable as a webseed for torrents.

If you are open to exploring developing technologies that have not yet gained widespread adoption, the InterPlanetary File System (IPFS) is worth considering. IPFS works on principles similar to BitTorrent while allowing the content to be reached over HTTP. It is supported by major players such as Cloudflare, which runs an HTTP-to-IPFS gateway: https://cloudflare-ipfs.com/
Personally, I store several static websites with their own domains on IPFS, hosted on my home computer behind the Cloudflare gateway. The advantages are all the benefits of BitTorrent, plus the site and its domain stay reachable, search engines can index it, there are options for long-term paid file storage (eternum.io, pinata.cloud), and updates are easy. The drawbacks are that it is still relatively slow and unstable, and that it only works for static sites.
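
For anyone who wants to try this, adding (and pinning) a file through a local IPFS (Kubo) node is a single HTTP call against the daemon's default API port 5001. A minimal sketch using the requests library; the returned CID is then reachable through any public gateway:

# Add and pin a file via a local IPFS (Kubo) daemon's HTTP API.
import requests

def ipfs_add(path):
    with open(path, "rb") as f:
        resp = requests.post(
            "http://127.0.0.1:5001/api/v0/add",
            params={"pin": "true"},
            files={"file": f},
        )
    resp.raise_for_status()
    return resp.json()["Hash"]   # the content identifier (CID)

cid = ipfs_add("index.html")     # hypothetical file
print("gateway URL: https://cloudflare-ipfs.com/ipfs/" + cid)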

4. Based on your previous statement, it seems you have a relatively small collection of media files, amounting to just 100 GB. While this may seem insignificant, if these files hold value, even within a limited community, you can consider sharing them through the Schare project: https://valdikss.org.ru/schare/
My primary criterion for autonomy is to achieve maximum independence from third-party infrastructure. To fulfill this goal, files are hosted on a personal home server and distributed across decentralized file sharing networks.

Tango6911

Split the service across several reputable hosting companies (none of the ones listed above) and either prepay for a decade or set up automatic payments. Put a load balancer in front of them. Even so, unforeseen circumstances are likely within ten years. If the data is not static, factor in the probability of a 0-day vulnerability being found, the service being compromised, and it being rightfully taken down as part of a botnet. It may be easier and more dependable to sign a support contract for the service, which could also turn out to be more cost-effective.

Dividing the service between different hosting providers spreads the workload and improves reliability, and prepaying for a long period or enabling automatic payments keeps it running without manual intervention. A load balancer further improves stability. Still, account for the unpredictable: new security holes will appear and botnet abuse is a real risk, so a support contract remains an alternative that adds reliability and can lower the overall cost.
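
If a full load balancer is more than you want to run, even the client side can fail over: try each provider in order and take the first one that answers. A minimal sketch with hypothetical mirror URLs:

# Client-side failover: return the first mirror that responds.
import urllib.request

MIRRORS = [
    "https://host-a.example.com/data.json",
    "https://host-b.example.net/data.json",
    "https://host-c.example.org/data.json",
]

def fetch_with_failover(urls, timeout=10):
    last_error = None
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except OSError as exc:
            last_error = exc   # remember the failure and try the next mirror
    raise RuntimeError("all mirrors failed: %s" % last_error)

data = fetch_with_failover(MIRRORS)
print(len(data), "bytes fetched")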

