One of the questions I received recently was about a website of ours that is hosted by a third party. The site, at times, experiences some slowness. One known cause is bandwidth consumption from our corporate site, but that was not the case today. I was asked why our site was slow when Facebook, the Weather Channel, and WRAL all work fine. Here is the answer:
Let’s look at these sites:
Facebook – Facebook owns two data centers, each approximately 307,000 sq. ft. (larger than two Wal-Marts, and about three times the size of Raleigh’s warehouse). They also lease space in six other data centers on both coasts. Facebook runs approximately 60,000 web servers (a 2010 estimate; based on the stats I see, that number could be nearing 250,000 servers today). They move more data than Google, and the network equipment behind their service is valued at over $1 billion (2011 SEC filing). According to the American Registry for Internet Numbers (ARIN), the organization that hands out IP addresses, Facebook has over 3.09 x 10^26 IPv6 addresses available (big growth plans there), plus several thousand of the older IPv4 addresses (spread across too many different networks to total quickly).
Weather Channel – I found one block of network addresses belonging to the Weather Channel, large enough for 4,096 servers. The data I can see shows they have servers on both coasts; beyond that, I’m hypothesizing about the number of locations their data is served from. Large organizations, though, typically use multiple servers in geographically diverse regions of the world to serve their content. I have no way of knowing how much bandwidth they have at each location.
WRAL – They have data centers in Raleigh, Charlotte, Indianapolis, and one location I can’t determine. From ARIN, they have address space for nearly 24,500 servers. I know that one of their hosting locations in Raleigh can provide nearly 1 Gb/s of data across 9 different providers (and that is their 10% number).
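If you’re wondering where those server counts come from, they are just arithmetic on the size of the address blocks ARIN lists. A quick sketch (the exact prefix lengths below are my assumptions, chosen to match the figures quoted above; ARIN only publishes the blocks):

```python
# Back-of-envelope address counts from CIDR prefix lengths.
# The prefix lengths below are assumptions that reproduce the
# figures quoted above, not confirmed ARIN records.

def addresses(prefix_len: int, total_bits: int) -> int:
    """Number of addresses in a block: 2^(total_bits - prefix_len)."""
    return 2 ** (total_bits - prefix_len)

print(addresses(20, 32))   # IPv4 /20  -> 4,096 (the Weather Channel figure)
print(addresses(40, 128))  # IPv6 /40  -> ~3.09e26 (the Facebook figure)

# WRAL's ~24,500 looks like several IPv4 blocks added together;
# for example, a /18 (16,384) plus a /19 (8,192) comes to 24,576.
print(addresses(18, 32) + addresses(19, 32))
```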
Our site – One server handling three websites, with unlimited bandwidth to the facility. From notices I receive, I know the hosting company has a couple of data centers, but our site is served from one location. They have also upgraded the amount of bandwidth they can handle, but with over 1.5 million sites hosted, it is nowhere near 1 Gb/s per website, even summed across all of their data centers.
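To see why 1 Gb/s per hosted site isn’t plausible, a hypothetical back-of-envelope helps. Even granting the host a very generous 100 Gb/s of total capacity (a number I made up for illustration, not their real figure), spreading it across 1.5 million sites averages well under 0.1 Mb/s per site:

```python
# Hypothetical shared-hosting bandwidth math. The 100 Gb/s total is an
# assumed figure for illustration, not the host's actual capacity.
total_gbps = 100
sites = 1_500_000

per_site_mbps = (total_gbps * 1000) / sites  # 1 Gb/s = 1000 Mb/s
print(f"{per_site_mbps:.3f} Mb/s per site on average")  # ~0.067 Mb/s
```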
So, we don’t have multiple servers in multiple locations serving the same data. There is also “network latency,” which can change on a second-by-second basis. When you go to a website, the browser first asks for the IP address of a server providing the information. Since this request is usually handled by a server a short distance away, the lookup is fast. Then routing takes over, tries to find the shortest path to that location, and sends your data along it. If there is a problem anywhere along the way, every slow response adds to the total time.

Most of the companies you suggested have very short paths to their data. Looking at the time to respond from Facebook, there were 19 hops (19 places your data traveled through to reach the FB page): the request goes through Raleigh to Atlanta, gets on the backbone of one of the major internet providers, and heads to Miami, where the page is served. Our site took 23 hops, going through Atlanta and DC to Los Angeles before hitting a server on the West Coast. Things started slowing down once the traffic hit the DC area and got onto cross-country fiber.

Right now the internet as a whole is showing an index of 94 out of 100 (yes, there is an internet traffic report: www.internettrafficreport.com). 100 means very good response times. 94 isn’t bad, but the trend is downward, so response times may increase before they start to decrease. This could change at any second.
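You can watch both of those pieces, the address lookup and the network path, for yourself. A traceroute (traceroute on Mac/Linux, tracert on Windows) shows the hops, and a few lines of Python can time the DNS lookup and the round trip to open a connection. The hostnames here are just examples:

```python
import socket
import time

def time_lookup_and_connect(host: str, port: int = 443) -> None:
    """Time the DNS lookup, then the TCP handshake to the resolved address."""
    t0 = time.perf_counter()
    ip = socket.gethostbyname(host)  # the "get a number" step
    t1 = time.perf_counter()
    # The connect time reflects the round trip over the network path.
    with socket.create_connection((ip, port), timeout=5):
        t2 = time.perf_counter()
    print(f"{host}: DNS {1000 * (t1 - t0):.1f} ms, "
          f"connect {1000 * (t2 - t1):.1f} ms via {ip}")

for site in ("www.facebook.com", "www.wral.com"):  # example hostnames
    time_lookup_and_connect(site)
```

A site served from a nearby data center will usually show a much shorter connect time than one served from across the country, which is exactly the hop-count difference described above.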
Any of these variables (server load, network latency across the country, capacity usage at the hosting site, capacity used at any of our locations, all of our locations funneling through one firewall) can change at any moment, for better or worse.