The Pirate Bay Runs on 21 “Raid-Proof” Virtual Machines

139 points by tjaerv · over 10 years ago

6 comments

fooyc · over 10 years ago
"This saved costs, guaranteed better uptime, and made the site more portable and thus harder to take down"

Probably not true for "this saved costs". From what I've seen, virtual machines usually cost more than twice the price of renting the equivalent "real" machine monthly.

They could have used dedicated servers; there are more dedicated-server providers than VM providers, thus achieving the same goal less expensively.

Probably not true for "better uptime" either; VMs are still hosted on real hardware, which fails too. (Although distributing the work across more independent machines can improve uptime.)
TomAnthony · over 10 years ago
If the load balancer is the weak point that would be first to be discovered, then I imagine they must have some mechanism to stop it leaving evidence that leads to the other machines if it were to get raided (it isn't on their hardware, so they can't prevent the files being backed up).

Is there a way the codebase could be entirely encrypted and not even accessible to the cloud provider (with some 'boot password' needed each time the server starts up)?
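To make the 'boot password' idea concrete, here is a minimal sketch in Python, assuming a hypothetical encrypted bundle named app.bundle.enc (salt prefixed to the ciphertext) and the third-party cryptography package; the file layout and names are invented, and this only shows the decrypt-at-startup step.

```python
# Sketch: decrypt an application bundle at startup with an operator-supplied
# passphrase, so the code never sits on the provider's disk in plaintext.
# The bundle name and its salt-plus-ciphertext layout are hypothetical.
import base64
import getpass

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    """Turn the boot password into a Fernet key via PBKDF2."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))


def unlock_codebase(path: str = "app.bundle.enc") -> bytes:
    """Prompt for the boot password and return the decrypted bundle in memory."""
    passphrase = getpass.getpass("boot password: ").encode()
    with open(path, "rb") as f:
        salt, token = f.read(16), f.read()
    return Fernet(derive_key(passphrase, salt)).decrypt(token)


if __name__ == "__main__":
    code = unlock_codebase()
    print(f"decrypted {len(code)} bytes; hand off to the application loader here")
```

Full-disk encryption unlocked over an out-of-band console achieves much the same thing at the block-device level; either way, a provider who controls the hypervisor can still snapshot the VM's memory, so this raises the bar rather than removing the trust problem.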
tjaerv · over 10 years ago
"At the time of writing the site uses 21 virtual machines (VMs) hosted at different providers. [...] All virtual machines are hosted with commercial cloud hosting providers, who have no clue that The Pirate Bay is among their customers."
verroq · over 10 years ago
Why can't people find where their servers are? I understand they have their own IP allocation, and thus can use BGP tricks, but don't they need a sympathetic ISP or similar to help them get the routes in?
Nanzikambe · over 10 years ago
Interesting, so I'm presuming there are several VPNs involved between the load balancer and all the discrete servers. I wonder if they use a VPN provider with a static IP and a no-logs policy, or if it's simply yet another VPS.

I'd love to hear a little more about the architecture.
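As a rough sketch of what such a front end could look like (nothing in the article confirms the actual setup), here is a plain TCP relay in Python: the public load balancer only knows tunnel-side addresses, which are invented for illustration, and visitors never learn where the backends really are.

```python
# Sketch: a public front end that relays connections to hidden backends over
# VPN/tunnel addresses. Backend addresses and the listening port are hypothetical.
import asyncio
import itertools

BACKENDS = itertools.cycle([("10.8.0.2", 8080), ("10.8.0.3", 8080)])


async def pipe(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    """Copy bytes one way until the peer closes."""
    try:
        while data := await reader.read(65536):
            writer.write(data)
            await writer.drain()
    finally:
        writer.close()


async def handle(client_r: asyncio.StreamReader, client_w: asyncio.StreamWriter) -> None:
    host, port = next(BACKENDS)  # round-robin pick of a tunnel-side backend
    backend_r, backend_w = await asyncio.open_connection(host, port)
    # Shuttle bytes both ways; the visitor only ever talks to this machine.
    await asyncio.gather(pipe(client_r, backend_w), pipe(backend_r, client_w),
                         return_exceptions=True)


async def main() -> None:
    server = await asyncio.start_server(handle, "0.0.0.0", 8443)
    async with server:
        await server.serve_forever()


if __name__ == "__main__":
    asyncio.run(main())
```

Whatever the real tunnels are, the property the commenter describes would hold: a raid on the front end yields only the relay and its tunnel-side addresses, not the machines running the site.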
Theodores · over 10 years ago
> In total the VMs use 182 GB of RAM and 94 CPU cores. The total storage capacity is 620 GB, but that's not all used.

That level of hardware/cores seems a bit over the top given what TPB does.

When I was a boy we had this thing called 'Alta Vista'. It was *the* search engine before Bing! came along. Processors did not run at gigahertz speeds back then and a large disk was 2 GB. Nonetheless most offices had the internet, and when people went searching, 'Alta Vista' was the first port of call for many.

TPB has an index of a selective part of the internets, i.e. movies, software, music, that sort of thing. Meanwhile, back in the 1990s, AltaVista indexed everything, as in the entire known internets, with everything stored away in less than the 620 GB used by TPB for their collection of 'stolen' material.

From http://en.wikipedia.org/wiki/AltaVista:

Alta Vista is a very large project, requiring the cooperation of at least 5 servers, configured for searching huge indices and handling a huge Internet traffic load. The initial hardware configuration for Alta Vista is as follows:

Alta Vista -- AlphaStation 250 4/266, 4 GB disk, 196 MB memory. Primary web server for gotcha.com; queries directed to WebIndexer or NewsIndexer.

NewsServer -- AlphaStation 400 4/233, 24 GB of RAID disks, 160 MB memory. News spool from which the news index is generated; serves articles (via http) to those without a news server.

NewsIndexer -- AlphaStation 250 4/266, 13 GB disk, 196 MB memory. Builds the news index using articles from NewsServer; answers news index queries from Alta Vista.

Spider -- DEC 3000 Model 900 (replacement for Model 500), 30 GB of RAID disk, 1 GB memory. Collects pages from the web for WebIndexer.

WebIndexer -- Alpha Server 8400 5/300, 210 GB RAID disk (expandable), 4 GB memory (expandable), 4 processors (expandable). Builds the web index using pages sent by Spider; answers web index queries from Alta Vista.