Task force/Recommendations/Local Language 4
Outline
- Question/Problem
Bandwidth differs significantly from region to region, and is a barrier to reach. If users within a region are not able to use any Wikimedia projects, be it their local Wikipedia or the English version, there will certainly not be any growth of the local projects from within that region.
This article contains a map that shows international bandwidth per capita as of 2005. In Europe and North America, where the growth of Wikimedia projects has been highest, bandwidth is likewise high. In large parts of the Middle East, South Asia, and Africa the bandwidth per capita is less than 1Mbps.
See also Regional bandwidth and this figure for more recent and detailed data.
- Strategy
There are a couple of things that can be done to limit these problems:
- Give visitors the option to turn off automatic loading of media. A highly visible button labeled "Turn off media", or something more explanatory such as "Does Wikipedia load slowly? Click here to turn off media and decrease loading times.", could be displayed.
- The local projects could be hosted locally to take advantage of the higher "within country" traffic speeds. (Be sure to catch the media's attention if such a step is taken. If handled right, this could be an essential outreach move as well.)
- Local mirrors or caches can, in the same way as local hosting, decrease loading times.
Note: There might be legal and technical problems with hosting, mirroring or caching Wikimedia projects outside the US that have to be considered. Caching might avoid some of the legal issues that hosting and mirroring present. Local hosting, mirroring and caching does not necessarily mean that this should be done in every country; it could, for example, be possible to find a country on each continent that complies with the legal framework the WMF operates under. Such per-continent hosting, mirroring and caching would also increase access speed.
Assertion: The amount of data that needs to be loaded when a new article is opened is estimated to average around 200 kB; in some cases more than 1 MB has to be loaded.
Using the Firefox extension Firebug, the amount of data that needed to be loaded when fully reloading the articles featured at en.wikipedia.org from 1 to 30 December, visiting the pages without being logged in, was around 340 kB. Although no article that large was found in December, there was at least one in November where 1.2 MB needed to be loaded, namely the article Wind, featured on 18 November.
Further, a random walk through Wikipedia articles, in which a random blue link was followed from a featured article, a new random blue link from that article, and so on, seemed to indicate that the amount of data that needed to be loaded mostly lay in the interval 10 kB to 500 kB. This time no reloading was done, so as to find out how much data has to be loaded when a random new article is opened but cached material is reused. According to Firebug, most of the loaded material came from upload.wikimedia.org; sometimes material from other destinations accounted for almost half the loaded data, but very often upload.wikimedia.org accounted for the significant part. This indicates that when visiting Wikipedia, most of the data that is loaded is media.
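The measurements above were done by hand with Firebug, but a similar estimate can be scripted. The sketch below is a rough approximation only, assuming the Python requests library is available: it fetches an article's HTML and sums the sizes of the images it references on upload.wikimedia.org. It ignores CSS, JavaScript and caching, so it approximates the media share rather than reproducing the exact Firebug numbers; the article Wind is used because it is mentioned above.
 # Rough sketch: approximate how much image data an article pulls in from
 # upload.wikimedia.org, similar in spirit to the Firebug measurement.
 import re
 import requests
 
 def page_and_media_bytes(title, lang="en"):
     url = f"https://{lang}.wikipedia.org/wiki/{title}"
     html = requests.get(url, timeout=30)
     page_size = len(html.content)
 
     # Image URLs in the rendered HTML are protocol-relative links
     # to upload.wikimedia.org.
     srcs = set(re.findall(r'src="(//upload\.wikimedia\.org/[^"]+)"', html.text))
 
     media_size = 0
     for src in srcs:
         # HEAD request: read the Content-Length header instead of
         # downloading the image itself.
         head = requests.head("https:" + src, timeout=30)
         media_size += int(head.headers.get("Content-Length", 0))
 
     return page_size, media_size
 
 if __name__ == "__main__":
     page, media = page_and_media_bytes("Wind")
     print(f"HTML: {page / 1024:.0f} kB, media: {media / 1024:.0f} kB")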
Assertion: On a 50 kbps connection, the average time it takes to load an article is estimated to be about 30 seconds.
Assume that 200 kB of data has to be loaded on average when visiting a new article, which seems reasonable given the random walk. With an Internet connection at 50 kbps it would, at full speed, take the user 200*8/50 = 32 seconds to load a new article. The featured article of 18 November would, under the same assumptions, take over three minutes to load at full speed (1200*8/50 = 192 seconds).
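To make the arithmetic explicit, here is a minimal sketch of the same back-of-the-envelope calculation, using the page sizes measured above and the connection speeds discussed below; it assumes the link runs at full speed with no latency or protocol overhead.
 # time (s) = size (kB) * 8 (bits per byte) / speed (kbps),
 # assuming full link utilisation and no latency or overhead.
 def load_time_seconds(size_kb, speed_kbps):
     return size_kb * 8 / speed_kbps
 
 # Page sizes from the Firebug measurements, speeds from the
 # "surf speed" discussion.
 for size_kb in (200, 340, 1200):
     for speed_kbps in (50, 100):
         t = load_time_seconds(size_kb, speed_kbps)
         print(f"{size_kb} kB at {speed_kbps} kbps: {t:.0f} s")
 # 200 kB at 50 kbps  ->  32 s
 # 1200 kB at 50 kbps -> 192 s (over three minutes)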
Assertion: It is likely that many Internet connections in the near future will be 50 kbps or slower. Also, "within country" speeds are often considerably higher than "outside country" speeds.
More interesting is the bandwidth per connection. Looking at "surf speed" statistics, the average "outside country" surf speeds in the listed countries are very often 100 kbps or below, and values below 50 kbps are not uncommon. Most of the countries in that list are also quite well developed, which makes it likely that Internet connection speeds in less developed countries are even lower. The statistics do, however, also show that "within country" speeds are often much higher than "outside country" speeds.
Given the large number of mobile phone subscriptions in developing countries, it is quite likely that many Internet connections in the near future will be made through mobile technologies. GPRS has maximum connection speeds of 56-114 kbps, while 3G tops out at 14 Mbps down and 5.8 Mbps up. Because these are the maximum speeds of the mobile connections, it is quite likely that many will connect to the Internet through connections that are effectively slower than this.