Server mania

Five days later, I am still baffled by the amount of equipment we saw on our tour of the Middlebury servers. We take for granted every day that we can sit down with the simple equipment of a laptop and be “connected” to the internet, and to almost anything we want, with the click of a button and no wires or extra hardware. Walking into the server rooms and seeing how much we are ACTUALLY “connected” to made me rethink how we use our computers and other technologies. I realize how ignorant and uneducated most of us are about the technologies we use on a regular basis. How the heck does the internet actually work? I don’t get it. We’re connected to the servers, which are in a room connected to a bazillion wires that go into the ceiling and under the floors… and apparently some fiber runs underground connecting us to the internet connection in Albany… and that is connected to something else? Where is the central THING that makes the internet? And what is floating around in the air that allows us to have a “wireless” connection?

While our laptops fit nicely in a drawer or a backpack, and we “store” things online in some seemingly infinite electronic folder, someone else somewhere is watching towers of machinery, changing the tapes and hard drives that hold all that information in a tangible way.

It’s just crazy. I’d like to see what the server rooms look like for programs like “Second Life…”

2 thoughts on “Server mania”

  1. Ian McBride

    Second Life actually has a rather insane server architecture where they spin off virtual machines, or ‘shards’, on the fly to accommodate users. In 2006, they had about 2600 servers, each handling three in-game users: http://news.zdnet.com/2100-1040_22-148320.html?tag=zdnn.alert This has since evolved to a model where resources are shared between the company and users running their own servers: http://www.infoq.com/news/2008/12/Second-Life-Ian-Wilkes (1 hour presentation video).

    Another MMO with an unusual server architecture is EVE Online, which houses the entire game universe on a single server: http://www.massively.com/2008/09/28/eve-evolved-eve-onlines-server-model/ Usually, games like Warcraft and Everquest split their player populations between many servers to balance the load on machines. Blizzard maintains four datacenters with clusters of servers (there are additional servers to handle dungeons and chat not shown on this list) just in the US: http://www.wowwiki.com/US_realm_list_by_datacenter

    One of the more interesting cases of large scaling for a web application is Wikipedia, which takes 300 servers to run (http://meta.wikimedia.org/wiki/Wikimedia_servers). The blame for this often falls back on how wiki articles are stored in the database using wiki markup, rather than HTML. This makes editing them a low-overhead operation, since no translation needs to occur, but every view of an article requires it to be translated from wiki markup to HTML, which is why there are a large number of servers that do nothing but store cached copies of wiki articles already translated to HTML.
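    The store-markup, cache-rendered-HTML idea can be sketched in a few lines of Python. This is only an illustration of the caching pattern, not MediaWiki’s actual code: the toy `render` function and the `ArticleCache` class are invented for the example.

    ```python
    # Sketch of render caching: articles are stored as wiki markup,
    # translated to HTML on first view, and served from cache afterward.
    # (Illustrative only -- the real MediaWiki parser and cache layers
    # are far more elaborate than this.)

    import re

    def render(markup: str) -> str:
        """Toy wiki-to-HTML translation: '''bold''' and ''italic''."""
        html = re.sub(r"'''(.+?)'''", r"<b>\1</b>", markup)  # bold first, so ''' isn't eaten by ''
        html = re.sub(r"''(.+?)''", r"<i>\1</i>", html)
        return html

    class ArticleCache:
        def __init__(self):
            self.source = {}  # title -> wiki markup (the canonical storage)
            self.cache = {}   # title -> rendered HTML

        def edit(self, title, markup):
            # Editing is cheap: save the markup and drop the stale HTML.
            self.source[title] = markup
            self.cache.pop(title, None)

        def view(self, title):
            # Viewing renders once, then reuses the cached HTML.
            if title not in self.cache:
                self.cache[title] = render(self.source[title])
            return self.cache[title]

    wiki = ArticleCache()
    wiki.edit("Middlebury", "'''Middlebury''' is a town in ''Vermont''.")
    print(wiki.view("Middlebury"))  # -> <b>Middlebury</b> is a town in <i>Vermont</i>.
    ```

    The trade-off described above falls out directly: `edit` never touches the renderer, while `view` pays the translation cost only on a cache miss, which is why dedicating whole machines to the HTML cache pays off for a read-heavy site.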

    Check out http://highscalability.com for more.

  2. Jason Mittell

    Hannah – remind me on Wed, and I’ll give the class a quick “what is the Internet” overview. It got squeezed out of the schedule this year, but I can cover the info pretty quickly…
