
Wikibooks was created in 2003.

1 answer


They are all wiki projects, e.g. Wikibooks, Wiktionary, Wikiquote, and even Wikipedia itself.

1 answer


It was invented by Niklaus Wirth in 1968 as a research project into the nascent field of compiler theory.

Source: Wikibooks

1 answer


According to Wikibooks and some other sources, St. Brendan's has 17% alcohol by volume (ABV), and is therefore 34 proof.

1 answer



Wikipedia is managed by a non-profit parent organization, the Wikimedia Foundation; there is no single owner. The Wikimedia Foundation also manages the operation of Wikipedia's sister projects, such as Wiktionary and Wikibooks.

1 answer


One can find a list of card games at the following places: Ranker, eZineArticles, Wikibooks, BoardGames, Pagat, Wikipedia, and The House of Cards.

1 answer


Lists of mathematical symbols can be found online at Wikipedia, Rapid Tables, the PSU website, Binghamton, Radio Electronics, Nikhef, J.D. Fielder, and Wikibooks.

1 answer


One can purchase books about embedded PCs on Amazon or through Wikibooks. Examples of such books are "Embedded Computer Vision" and "The Embedded PC's ISA Bus" by Ed Nisley.

1 answer


There are numerous places online where one can learn about the business strategy process. One can find information at any of the following sites: Wikibooks, Intellego, Thecqi, or Businessinsider.

1 answer


Several schools have great computer science and IT courses in their curriculum which will educate you on the entire topic of networking. In addition to enrolling in classes, there are several free resources scattered all throughout the Internet on websites like WikiBooks.

1 answer


Yes, Wikipedia is a non-profit.

Wikipedia is part of the Wikimedia Foundation, Inc., a non-profit charitable organization that also includes: Wikipedia, Wiktionary, Wikiquote, Wikibooks, Wikisource, Wikimedia Commons, Wikispecies, Wikinews, Wikiversity, Wikimedia Incubator and Meta-Wiki.

1 answer


Wikipedia, Wiktionary, Wikiquote, Wikibooks (including Wikijunior), Wikisource, Wikimedia Commons, Wikispecies, Wikinews, Wikiversity, Meta-Wiki, wikiHow, WikiAnswers, Uncyclopedia, Encyclopedia Dramatica, MarvelWiki, Bulbapedia, GuildWiki, Left4Dead Wiki, SleepWiki, YugiohWiki, to name a few of the most popular wikis.

1 answer


If you want to set the color of all your paragraphs to red, you should write:

p {color:red;}

You can also use CSS classes. A link to a "CSS" book on Wikibooks is provided for further information.
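To illustrate the classes mentioned above: a class selector lets you color only selected paragraphs instead of all of them. A minimal sketch (the class name `warning` is just an illustrative choice, not from the original answer):

```css
/* All paragraphs red, as in the rule above */
p { color: red; }

/* Only paragraphs with class="warning" are red */
p.warning { color: red; }

/* In the HTML: <p class="warning">This paragraph is red.</p>
   A plain <p>...</p> is unaffected by the second rule. */
```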

5 answers


It depends on how much of the other ingredients you use. The Wikibooks Cookbook says:

  • ½ pound Flour
  • ½ pound Butter
  • ½ pound Eggs (4 eggs)
  • ½ pound Sugar
  • ½ pound assorted dried fruit (Currants, Sultanas, and Raisins, with perhaps a lesser quantity of Glacé cherries).

The traditional recipe doubled the quantities above, and eliminated the fruit. The name comes from the fact that it used a pound each of sugar, flour, butter and eggs. Happy baking.

1 answer


There are a few places where one can find a water cycle diagram. The most reliable source for a water cycle diagram is offered by the USGS online site.

3 answers


YouTube would have several, or just start by getting a barebones kit (such as the type that TigerDirect sells... and no, I don't work for them) and read the assembly instructions that come with the motherboard.

If you are thinking of making a career of assembly/repair, I would suggest taking an A+ course.

3 answers


You can learn Internet Marketing online at WikiHow-Learn Internet Marketing. They have nine easy steps to learning Internet Marketing. You can also post on their Discussion page while you are there.

5 answers


Pucker your lips by tightening the edges. The tighter your lips are, the higher the notes will go; if you loosen your lips, the notes will go lower. Insert the mouthpiece into the trombone and twist it so that it is secure and won't fall out.

Unlike other instruments, the trombone player must move the slide to get the right pitch. The farther out the slide goes, the lower the notes go. WARNING! The slide is very fragile. Even a slight bend in the slide will ruin the trombone. You will have to oil the trombone very frequently to keep the slide moving fast.

Here are the notes for the trombone's 7 slide positions (from Wikibooks):

1: B♭, F, B♭', D', F'
2: A, E, A', C#', E'
3: A♭, E♭, A♭', C', E♭'
4: G', D, G, B, D' (alt)
5: G♭', D♭, G♭, B♭' (alt), D♭' (alt)
6: F', C, F (alt), A' (alt), C' (alt)
7: E', B, E (alt), G#' (alt), B' (alt)

To empty the spit out of the trombone, press the spit valve down and blow into the trombone.

2 answers


To find out how to register and where to take the SAT test, visit www.collegeboard.com. These are the makers of the test and will be able to give you any information you need including tips and strategies for taking the test.

7 answers


#include <stdio.h>
#include <stdlib.h>   /* for abs() */

int a[30], count = 0;

/* Returns 1 if a queen may be placed in column a[pos] of row pos,
   i.e. it shares no column or diagonal with an earlier queen. */
int place(int pos)
{
    int i;
    for (i = 1; i < pos; i++) {
        if ((a[i] == a[pos]) || (abs(a[i] - a[pos]) == abs(i - pos)))
            return 0;
    }
    return 1;
}

/* Prints the current board: 'Q' marks a queen, '*' an empty square. */
void print_sol(int n)
{
    int i, j;
    count++;
    printf("\n\nSolution #%d:\n", count);
    for (i = 1; i <= n; i++) {
        for (j = 1; j <= n; j++) {
            if (a[i] == j)
                printf("Q\t");
            else
                printf("*\t");
        }
        printf("\n");
    }
}

/* Iterative backtracking: row k tries columns 1..n; when none fit,
   backtrack to row k-1. */
void queen(int n)
{
    int k = 1;
    a[k] = 0;
    while (k != 0) {
        a[k] = a[k] + 1;
        while ((a[k] <= n) && !place(k))
            a[k]++;
        if (a[k] <= n) {
            if (k == n)
                print_sol(n);
            else {
                k++;
                a[k] = 0;
            }
        } else
            k--;
    }
}

int main(void)
{
    int n;
    printf("Enter the number of Queens\n");
    scanf("%d", &n);
    queen(n);
    printf("\nTotal solutions=%d\n", count);
    return 0;
}

4 answers


Types of Reasoning (Mga Uri ng Pangangatwiran):

1. Inductive Reasoning (Pangangatwirang Pabuod)

Inductive reasoning proceeds from small, particular truths toward a general principle or generalization. It is divided into three kinds:

a. Reasoning by analogy. Shared characteristics are laid out, examined, and the truth drawn out from them. A generalization formed this way is only provisional and can be disproved: the things being compared may be alike in one characteristic yet differ in others.

b. Reasoning by linking an event to its cause.

This rests on the belief that an event occurs because it has a cause.

c. Reasoning by evidence and proof. This involves evidence that further confirms or verifies the subject or situation in question.

2. Deductive Reasoning (Pangangatwirang Pasaklaw)

Deductive reasoning derives a particular conclusion by applying a general principle. The syllogism, as this form of reasoning is called, consists of a major premise, a minor premise, and a conclusion; it is a simple framework for reasoning.

5 answers


The Pan Galactic Gargle Blaster was invented by Zaphod Beeblebrox, a major character in Douglas Adams' novel The Hitchhiker's Guide to the Galaxy. The Pan Galactic Gargle Blaster has also been described in the novel as the alcoholic equivalent of a mugging: expensive and bad for the head. Its original, fictional recipe is as follows:

Recipe
  1. Take the juice from one bottle of that Ol' Janx Spirit.
  2. Pour into it one measure of water from the seas of Santraginus V.
  3. Allow three cubes of Arcturan Mega-gin to melt into the mixture (it must be properly iced or the benzene is lost).
  4. Allow four litres of Fallian marsh gas to bubble through it.
  5. Over the back of a silver spoon float a measure of Qalactin Hypermint extract.
  6. Drop in the tooth of an Algolian Suntiger.
  7. Sprinkle Zamphuor.
  8. Add an olive.
  9. Drink . . . but . . . very carefully . . .
Earth recipes

Recipe published in Mostly Harmless

Because of the popularity of the Hitchhiker's Guide series, several recipes exist for "Earth versions" of the Pan Galactic Gargle Blaster. The following is but one version, published in "Mostly Harmless", the newsletter of ZZ9 Plural Z Alpha, the official Hitchhiker's Guide appreciation society.

To make a Pan Galactic Gargle Blaster using Terran ingredients:
  1. Take the liquid contained in a 200 ml (6.75 oz) bottle of EverClear, to remind you that your head will be clear forever if you drink too many Pan Galactic Gargle Blasters, and that your brain will clear of anything soon after you start drinking some, if not before.
  2. Into it, slowly pour a 750 ml (25 oz) bottle of Bombay Sapphire, to remind you of the marvelous beauty of the old Santraginean seas, or an equal amount of Jeremiah Weed in acknowledgement of what has happened to the Santraginean seas and their lifeforms.
  3. Now add 750 ml (25 oz) of cold Wild Turkey, letting it run into the mixture as we run through life, to remind us of all the lifeforms we meet and experience while hitchhiking through the galaxy.
  4. Speedily stirring, add 375 ml (12.7 oz) of Herradura Tequila, mixing it in to commemorate the galactic hitchhikers who died of pleasure among the vapors and gasses in the marshes of Fallia.
  5. Over the bowl of a silver spoon, let flow 1 litre (34 oz) of rum in memory of the waterfalls and their glorious rainbows encountered on your journeys through the galaxy of life.
  6. Next, drop in the worm found in a bottle of Mezcal, watching it dissolve into the mixture. If the bottom falls out and the worm survives, drink at your own risk.
  7. Finally, sprinkle into the mixture some Gatorade to commemorate the lifeforms which have vanished and are becoming extinct, both sentient and non-sentient, especially those most in need of aid.

If this many Pan Galactic Gargle Blasters are too many for the number of people you think you are, mix together the following amounts of ingredients as described above for a single serving.

Recipe from Project Galactic Guide

Another recipe posted online calls for:

  • 1 oz. (30 ml) EverClear
  • 4 oz. (120 ml) Bombay Sapphire or Jeremiah Weed
  • 4 oz. (120 ml) Cold Wild Turkey
  • 2 oz. (60 ml) Herradura Tequila
  • 5 oz. (150 ml) Rum
  • 1 worm from bottle of Mezcal
  • 2 oz. (60 ml.) Gatorade

This makes one approximately 18 ounce (0.5 l) Pan Galactic Gargle Blaster. The reason this drink seems so large is that Zaphod Beeblebrox has two heads, so when he created it, it came out to 9 ounces (2.5 dl) per head, so both were happy.

Before drinking, eat one olive to create a sweetness in it which is not there. Drink extremely carefully at your own risk, and remember where your towel is (if you can).

Recipe from Drinksmixer

Yet another recipe, found online: Ingredients:

  • 35 ml Tia Maria® coffee liqueur
  • 35 ml vodka
  • 17.5 ml cherry brandy
  • 1 dash lime juice
  • 7-Up® soda
  • dry cider

Method: Pour the Tia Maria, vodka and cherry brandy together into a pint glass. Add a dash of lime juice. Fill with equal amounts of dry cider and 7-Up, and add ice cubes. Suggested serving: beer glass.

Recipe number four

Another recipe:

  • 1 1/2 shots 151 Proof Rum
  • 1/4 shot Tequila
  • 1/4 shot Gin
  • 2/3 shot Triple Sec
  • 1 shot Blue Curacao
  • 1 dash Bitters
  • 1 dash Grenadine

Recipe number five

  • 4 parts Bacardi Citron
  • 1 part Bacardi 151 (for color)
  • 1 part Goldschläger
  • 1 part Everclear (to cut the Goldschläger)

Recipe number six

Another recipe:

  • 1 part Everclear (or any other strong grain alcohol such as Bourbon, Moonshine, or Vodka)
  • 1 part Bitter Lemon (or plain Tonic Water)
  • 1 part Bombay Sapphire Gin (or other gin)
  • 1 part Yukon Jack Perma-Frost Schnapps (or other mint schnapps, or white crème de menthe)
  • Enough blue food coloring to make the mixture a very light sky blue

Also needed:

  • Sugar cubes
  • Cinnamon extract
  • Yellow food coloring (optional)
  • Angostura Bitters
  • Olives

Mix the first five ingredients and chill (usually for 24 hours). Then, take a sugar cube and let it absorb 1 milliliter of cinnamon extract and 1 drop of yellow food coloring (optional). Place three ice cubes in a glass and pour the chilled liquid mixture over these. Drop in the sugar cube and stir to dissolve, or just let it sit (if food coloring is used, often the sugar cube is just left to sit, to create a layering effect with the color). Sprinkle the Angostura Bitters in the drink, and add an olive. Drink, but very carefully.

Alternatively, one can let the sugar cube absorb 1 milliliter of Angostura Bitters, and sprinkle ground cinnamon on the top of the drink, thus omitting the cinnamon extract.

It is often more convenient to make a large amount of the mixture ahead of time and chill it for a day, rather than to constantly mix more of it. It's also best to save it for a special occasion, such as Towel Day, given the work put into making the drink.

Recipe from The Bartender's Best Friend

The Bartender's Best Friend also has a recipe for the Pan Galactic Gargle Blaster as follows:

  • 1/2 ounce Vodka
  • 1/2 ounce Triple Sec
  • 1/2 ounce Yukon Jack liqueur
  • 1/2 ounce Peach Schnapps
  • 1/2 ounce Jack Daniel's Tennessee whiskey
  • 1/2 ounce fresh lime juice
  • 1/2 ounce cranberry juice
  • Fill with lemon-lime soda

Build in an ice-filled Collins glass, filling it with the soda. Stir with a long straw.

Fan Modified Shooter Version:

  • ½ ounce Vodka
  • ½ ounce Triple Sec
  • ½ ounce Yukon Jack liqueur
  • ½ ounce peach schnapps
  • ½ ounce Jack Daniel's Tennessee whiskey
  • ½ ounce lime juice

Pour the ingredients into a shaker 2/3 full of ice. Shake well. Strain into a stemless cocktail glass. Top with one large ice cube.

Recipe from "Spiders" Nightclub in Hull, UK

Very nice, and a pint of it is served for the reasonable price of £2.80:

  • 25ml Vodka
  • 25ml Pernod
  • 25ml Galliano
  • 25ml Blackcurrant
  • 25ml Orange juice
  • Enough cider to top up

Pour over some ice, stir and enjoy!

Recipe reverse-engineered from a stain on RJ Lanning's towel

  • 1 oz vodka (that Ol' Janx Spirit)
  • 1 oz Clamato (oh, those Santraginus fish)
  • 1 oz ice cold gin (Arcturan Mega-gin)
  • 4 oz Zipang Sparkling Sake (Fallian marsh gas)
  • 1 oz Creme de Menthe (Qalactin Hypermint extract)
  • 1 Jalapeño (tooth of an Algolian Suntiger)
  • Sprinkle with lemon zest (Zamphuor)
  • add an olive

WARNING: Drinking Pan Galactic Gargle Blasters may cause serious damage to the rods and cones in the human eye, thus explaining why consumers of this beverage have often reported it to be green in color when, in fact, it is not.

Availability

A Pan Galactic Gargle Blaster may be had at the Zaphod Beeblebrox bar in Ottawa, ON (Canada), in Spiders gothic nightclub in Hull, UK (recipe mentioned above), and also at the Black Spot Cafe Bar in Hyannis, MA, USA. Other bars may be able to mix any of the recipes above (or one of their own). Pan Galactic Gargle Blasters were also served at a couple of the stage show adaptations of The Hitchhiker's Guide to the Galaxy in the late 1970s, according to Neil Gaiman in his book Don't Panic: The Official Hitchhikers Guide to the Galaxy Companion.

NOTE: This answer was taken from Wikibooks. I just thought that it would serve the people of WikiAnswers very well.

1 answer


Courtesy of Wikipedia:

Nigirizushi

  • Nigirizushi (握り寿司, lit. hand-formed sushi). This consists of an oblong mound of sushi rice that is pressed between the palms of the hands, usually with a bit of wasabi, and a topping draped over it. Toppings are typically fish such as salmon, tuna or other seafood. Certain toppings are typically bound to the rice with a thin strip of nori, most commonly tako (octopus), unagi (freshwater eel), anago (sea eel), ika (squid), and tamago (sweet egg). Nigiri is generally served in pairs.
  • Gunkanmaki (軍艦巻, lit. warship roll). A special type of nigiri-zushi: an oval, hand-formed clump of sushi rice that has a strip of nori wrapped around its perimeter to form a vessel that is filled with some soft, loose or fine-chopped ingredient that requires the confinement of nori, such as roe, natto, oysters, sea urchin, corn with mayonnaise, and quail eggs. Gunkan-maki was invented at the Ginza Kyubey (Kubei) restaurant in 1931;[6][7] its invention significantly expanded the repertoire of soft toppings used in sushi.
  • Temarizushi (手まり寿司, lit. ball sushi). A ball-shaped sushi made by pressing rice and fish into a ball-shaped form by hand using plastic wrap. They are quite easy to make and thus a good starting point for beginners.[8]

Rolling maki


  • Makizushi (巻き寿司, lit. rolled sushi). A cylindrical piece, formed with the help of a bamboo mat, called a makisu (巻き簾). Makizushi is generally wrapped in nori, but can occasionally be found wrapped in a thin omelette, soy paper, cucumber, or parsley. Makizushi is usually cut into six or eight pieces, which constitutes a single roll order. Below are some common types of makizushi, but many other kinds exist.
    • Futomaki (太巻き, lit. large or fat rolls). A large cylindrical piece, with nori on the outside. A typical futomaki is three or four centimeters (1.5 in) in diameter. They are often made with two or three fillings that are chosen for their complementary tastes and colors. During the Setsubun festival, it is traditional in Kansai to eat uncut futomaki in its cylindrical form. Futomaki is generally vegetarian, but may include toppings such as tiny fish eggs.
    • Hosomaki (細巻き, lit. thin rolls). A small cylindrical piece, with the nori on the outside. A typical hosomaki has a diameter of about two centimeters (0.75 in). They generally contain only one filling, often tuna, cucumber, kanpyō, thinly sliced carrots, or, more recently, avocado.
      • Kappamaki (河童巻き), a kind of hosomaki filled with cucumber, is named after the Japanese legendary water imp fond of cucumbers called the kappa. Traditionally, kappamaki is consumed to clear the palate between eating raw fish and other kinds of food, so that the flavors of the fish are distinct from the tastes of other foods.
      • Tekkamaki (鉄火巻き) is a kind of hosomaki filled with raw tuna. Although some believe that the name "tekka", meaning 'red hot iron', alludes to the color of the tuna flesh, it actually originated as a quick snack to eat in gambling dens called "tekkaba (鉄火場)", much like the sandwich.[9][10]
      • Negitoromaki (ねぎとろ巻) is a kind of hosomaki filled with scallion and chopped tuna. Fatty tuna is often used in this style.
      • Tsunamayomaki (ツナマヨ巻) is a kind of hosomaki filled with canned tuna tossed with mayonnaise.
  • Temaki (手巻き, lit. hand rolls). A large cone-shaped piece of nori with the ingredients spilling out the wide end. A typical temaki is about ten centimeters (4 in) long, and is eaten with the fingers because it is too awkward to pick up with chopsticks. For optimal taste and texture, temaki must be eaten quickly after being made, because the nori cone soon absorbs moisture from the filling, loses its crispness, and becomes somewhat difficult to bite.
  • Uramaki (裏巻き, lit. inside-out rolls). A medium-sized cylindrical piece, with two or more fillings. Uramaki differs from other maki because the rice is on the outside and the nori inside. The filling is in the center surrounded by nori, then a layer of rice, and an outer coating of some other ingredient such as roe or toasted sesame seeds. It can be made with different fillings such as tuna, crab meat, avocado, mayonnaise, cucumber, and carrots.
  • Oshizushi (押し寿司, lit. pressed sushi). Pressed sushi from the Kansai region, a favourite and specialty of Osaka. A block-shaped piece formed using a wooden mold, called an oshibako. The chef lines the bottom of the oshibako with the toppings, covers them with sushi rice, and then presses the lid of the mold down to create a compact, rectilinear block. The block is removed from the mold and then cut into bite-sized pieces.

  • Inari-zushi (稲荷寿司, stuffed sushi). A pouch of fried tofu typically filled with just sushi rice. It is named after the Shinto god Inari, who is believed to have a fondness for fried tofu. The pouch is normally fashioned from deep-fried tofu (油揚げ, abura age). Regional variations include pouches made of a thin omelette (帛紗寿司, fukusa-zushi or 茶巾寿司, chakin-zushi) or dried gourd shavings (干瓢, kanpyō). It should not be confused with inari maki, which is a roll filled with flavored fried tofu. A very large version, sweeter than normal and often containing bits of carrot, is popular in Hawaii, where it is called "cone sushi."

Chirashizushi

  • Chirashizushi (ちらし寿司, lit. scattered sushi). A bowl of sushi rice with other ingredients mixed in (also refers to barazushi). It is commonly eaten in Japan because it is filling, fast and easy to make. Chirashizushi often varies regionally because it is eaten annually as part of the Doll Festival, celebrated only during March in Japan. The ingredients are often the chef's choice.
    • Edomae chirashizushi (Edo-style scattered sushi). Uncooked ingredients arranged artfully on top of the sushi rice in a bowl.
    • Gomokuzushi (Kansai-style sushi). Cooked or uncooked ingredients mixed into the body of rice in a bowl.
  • Narezushi (熟れ寿司, lit. matured sushi) is a traditional form of fermented sushi. Skinned and gutted fish are stuffed with salt, placed in a wooden barrel, doused with salt again, then weighed down with a heavy tsukemonoishi (pickling stone). As days pass, water seeps out and is removed. After six months this funazushi can be eaten, remaining edible for another six months or more.

Western sushi

The increasing popularity of sushi in North America, as well as around the world, has resulted in variations of sushi typically found in the West and rarely, if at all, in Japan. Such creations to suit the Western palate[11] were initially fueled by the invention of the California roll. A wide variety of popular rolls has evolved since. Some examples include:

  • California roll: avocado, kani kama (imitation crab stick), and cucumber, often made uramaki (with rice on the outside, nori on the inside).
  • Caterpillar roll: generally includes avocado, unagi, kani kama, and cucumber.
  • Dynamite roll: includes yellowtail (hamachi) and fillings such as bean sprouts, carrots, chili and spicy mayonnaise. (Dynamite roll and Crunchy roll are essentially reversed in some parts of Canada, especially western Canada.)
  • Rainbow roll: typically a California roll topped with several various sashimi.
  • Spider roll: includes fried soft-shell crab and other fillings such as cucumber, avocado, daikon sprouts or lettuce, roe, and spicy mayonnaise.
  • Philadelphia roll: almost always consists of smoked salmon, cream cheese, cucumber, and/or onion.
  • Salmon roll: grilled salmon skin with sweet sauce and cucumber.
  • Crunchy roll: a California roll deep-fried tempura-style, often topped with sweet eel sauce or chili sauce.
  • Seattle roll: cucumber, avocado, and raw or smoked salmon.

Other rolls may include scallops, spicy tuna, beef or chicken teriyaki roll, okra, vegetables, and cheese. Sushi rolls can also be made with brown rice and black rice; these have also appeared in Japanese cuisine.

7 answers


A wiki is a web application that allows people to add, modify, and delete content collaboratively. "Wikipedia" is a made-up word: "wiki" combined with "-pedia" suggests an encyclopedia that is a wiki.

Wikipedia is a free, multilingual encyclopedia project operated by the non-profit Wikimedia Foundation. Its name is a portmanteau of the words wiki (a technology for creating collaborative websites) and encyclopedia. Wikipedia's 10 million articles, about a quarter of which are in English, have been written collaboratively by volunteers around the world, and almost all of its articles can be edited by anyone who can access the Wikipedia website. Launched in 2001 by Jimmy Wales and Larry Sanger, it is currently the largest and most popular general reference work on the Internet.

Critics of Wikipedia target its systemic bias and inconsistencies and its policy of favoring consensus over credentials in its editorial process. Wikipedia's reliability and accuracy are also an issue. Other criticisms are centered on its susceptibility to vandalism and the addition of spurious or unverified information. Scholarly work suggests that vandalism is generally short-lived.

In addition to being an encyclopedic reference, Wikipedia has received major media attention as an online source of breaking news as it is constantly updated. When Time magazine recognized "You" as its Person of the Year 2006, praising the accelerating success of online collaboration and interaction by millions of users around the world, Wikipedia was the first particular "Web 2.0" service mentioned, followed by YouTube and MySpace.

History

Wikipedia began as a complementary project for Nupedia, a free online English-language encyclopedia project whose articles were written by experts and reviewed under a formal process. Nupedia was founded on March 9, 2000, under the ownership of Bomis, Inc, a web portal company. Its main figures were Jimmy Wales, Bomis CEO, and Larry Sanger, editor-in-chief for Nupedia and later Wikipedia. Nupedia was licensed initially under its own Nupedia Open Content License, switching to the GNU Free Documentation License before Wikipedia's founding at the urging of Richard Stallman.

Larry Sanger and Jimmy Wales are the founders of Wikipedia. While Wales is credited with defining the goal of making a publicly editable encyclopedia, Sanger is usually credited with the counter-intuitive strategy of using a wiki to reach that goal. On January 10, 2001, Larry Sanger proposed on the Nupedia mailing list to create a wiki as a "feeder" project for Nupedia. Wikipedia was formally launched on January 15, 2001, as a single English-language edition at www.wikipedia.com, and announced by Sanger on the Nupedia mailing list. Wikipedia's policy of "neutral point-of-view" was codified in its initial months, and was similar to Nupedia's earlier "nonbiased" policy. Otherwise, there were relatively few rules initially and Wikipedia operated independently of Nupedia.

Wikipedia gained early contributors from Nupedia, Slashdot postings, and search engine indexing. It grew to approximately 20,000 articles, and 18 language editions, by the end of 2001. By late 2002 it had reached 26 language editions, 46 by the end of 2003, and 161 by the final days of 2004. Nupedia and Wikipedia coexisted until the former's servers went down permanently in 2003, and its text was incorporated into Wikipedia. English Wikipedia passed the 2 million-article mark on September 9, 2007, making it the largest encyclopedia ever assembled, eclipsing even the Yongle Encyclopedia (1407), which had held the record for exactly 600 years.

Citing fears of commercial advertising and lack of control in a perceived English-centric Wikipedia, users of the Spanish Wikipedia forked from Wikipedia to create the Enciclopedia Libre in February 2002. Later that year, Wales announced that Wikipedia would not display advertisements, and its website was moved to wikipedia.org. Various other projects have since forked from Wikipedia for editorial reasons. Wikinfo does not require neutral point of view and allows original research. New Wikipedia-inspired projects - such as Citizendium, Scholarpedia, Conservapedia and Google's Knol - have been started to address perceived limitations of Wikipedia, such as its policies on peer review, original research and commercial advertising.

The Wikimedia Foundation was created from Wikipedia and Nupedia on June 20, 2003. It applied to the United States Patent and Trademark Office to trademark Wikipedia on September 17, 2004. The mark was granted registration status on January 10, 2006. Trademark protection was accorded by Japan on December 16, 2004, and in the European Union on January 20, 2005. Technically a service mark, the scope of the mark is for: "Provision of information in the field of general encyclopedic knowledge via the Internet". There are plans to license the use of the Wikipedia trademark for some products, such as books or DVDs.

Nature of WikipediaEditing model

Unlike traditional encyclopedias such as Encyclopædia Britannica, no article in Wikipedia undergoes a formal peer-review process, and changes to articles are made available immediately. No article is owned by its creator or any other editor, or is vetted by any recognized authority. Except for a few vandalism-prone pages that can be edited only by administrators, every article may be edited anonymously or with a user account, while only registered users may create a new article. Consequently, Wikipedia "makes no guarantee of validity" of its content. Wikipedia also does not censor itself, and it contains materials that some people, including Wikipedia editors, may find objectionable, offensive or pornographic. For instance, in 2008, Wikipedia rejected an online petition against the inclusion of Muhammad's depictions in its English edition, citing this policy. The presence of politically sensitive materials in Wikipedia has also led China to block access to parts of the site.

Content in Wikipedia, however, is subject to the laws (in particular copyright law) of Florida, United States, where Wikipedia's servers are hosted, and to several editorial policies and guidelines that are intended to reinforce the notion that Wikipedia is an encyclopedia. Each entry in Wikipedia must be about a topic that is encyclopedic and thus worthy of inclusion. A topic is deemed encyclopedic if it is "notable" in the Wikipedia jargon; i.e., if it has received significant coverage in secondary reliable sources (i.e., mainstream media or major academic journals) that are independent of the subject of the topic. Second, Wikipedia must expose knowledge that is already established and recognized. In other words, it must not present, for instance, new information or original works. A claim that is likely to be challenged requires a reference to reliable sources. Within the Wikipedia community, this is often phrased as "verifiability, not truth" to express the idea that the readers are left to check for themselves the truthfulness of what appears in the articles and to make their own interpretations. Finally, Wikipedia does not take a side. All opinions and viewpoints, if attributable to external sources, must enjoy an appropriate share of coverage within an article. Wikipedia editors as a community write and revise those policies and guidelines and enforce them by deleting, annotating with tags or modifying article materials failing to meet them. (See also Deletionism and inclusionism)

Contributors, registered or not, can take advantage of features available in the software that powers Wikipedia. The "History" page attached to each article contains every single past revision of the article, though a revision with libelous content, criminal threats or copyright infringements may be removed afterwards. The feature makes it easy to compare old and new versions, undo changes that an editor considers undesirable, or restore lost content. The "Discussion" pages associated with each article are used to coordinate work among multiple editors. Regular contributors often maintain a "watchlist" of articles of interest to them, so that they can easily keep tabs on all recent changes to those articles. Computer programs called bots have been used widely to remove vandalism as soon as it is made, or to start articles such as geography entries in a standard format from statistical data.

The open nature of the editing model is central to most criticism of Wikipedia. For example, a reader cannot be certain, without consulting an article's "History" page, that the article has not been vandalized. Critics argue that non-expert editing undermines quality. Because contributors usually rewrite small portions of an entry rather than making full-length revisions, high- and low-quality content may be intermingled within an entry. Historian Roy Rosenzweig noted: "Overall, writing is the Achilles' heel of Wikipedia. Committees rarely write well, and Wikipedia entries often have a choppy quality that results from the stringing together of sentences or paragraphs written by different people." All of these concerns have led to questions about the reliability of Wikipedia as a source of accurate information.

In 2008, two researchers presented evidence supporting the hypothesis that the growth of Wikipedia is sustainable.

Reliability and bias

Wikipedia has been accused of exhibiting systemic bias and inconsistency; critics argue that Wikipedia's open nature and the lack of proper sourcing for much of its information make it unreliable. Some commentators suggest that Wikipedia is generally reliable, but that the reliability of any given article is not always clear. Editors of traditional reference works such as the Encyclopædia Britannica have questioned the project's utility and status as an encyclopedia. Many university lecturers discourage students from citing any encyclopedia in academic work, preferring primary sources; some specifically prohibit Wikipedia citations. Co-founder Jimmy Wales stresses that encyclopedias of any type are not usually appropriate as primary sources, and should not be relied upon as authoritative. Technology writer Bill Thompson commented that the debate was possibly "symptomatic of much learning about information which is happening in society today".

Concerns have also been raised about the lack of accountability that results from users' anonymity, and about Wikipedia's vulnerability to vandalism, the insertion of spurious information and similar problems. In one particularly well-publicized incident, false information was introduced into the biography of American political figure John Seigenthaler, Sr. and remained undetected for four months. Some critics claim that Wikipedia's open structure makes it an easy target for Internet trolls, advertisers, and those with an agenda to push. The addition of political spin to articles by organizations including members of the U.S. House of Representatives and special interest groups has been noted, and organizations such as Microsoft have offered financial incentives to work on certain articles. These issues have been parodied, notably by Stephen Colbert in The Colbert Report.

Economist Tyler Cowen writes, "If I had to guess whether Wikipedia or the median refereed journal article on economics was more likely to be true, after a not so long think I would opt for Wikipedia." He comments that many traditional sources of non-fiction suffer from systemic biases. Novel results are over-reported in journal articles, and relevant information is omitted from news reports. But he also cautions that errors are frequently found on Internet sites, and that academics and experts must be vigilant in correcting them.

In February 2007, an article in The Harvard Crimson newspaper reported that some professors at Harvard University included Wikipedia in their syllabi, but that there was a split in their perception of using Wikipedia. In June 2007, former president of the American Library Association Michael Gorman condemned Wikipedia, along with Google, stating that academics who endorse the use of Wikipedia are "the intellectual equivalent of a dietitian who recommends a steady diet of Big Macs with everything". He also said that "a generation of intellectual sluggards incapable of moving beyond the Internet" was being produced at universities. He complained that web-based sources discourage students from learning from rarer texts that are found only on paper or on subscription-only websites. In the same article, Jenny Fry, a research fellow at the Oxford Internet Institute, commented on academics who cite Wikipedia, saying: "You cannot say children are intellectually lazy because they are using the Internet when academics are using search engines in their research. The difference is that they have more experience of being critical about what is retrieved and whether it is authoritative. Children need to be told how to use the Internet in a critical and appropriate way."

There have been efforts within the Wikipedia community to improve the reliability of Wikipedia. The English-language Wikipedia has introduced an assessment scale against which the quality of articles is judged; other editions have adopted it as well. Roughly 2,000 English articles have passed a rigorous set of criteria to reach the highest rank, "featured article" status; such articles are intended to provide thorough, well-written coverage of their topic, supported by many references to peer-reviewed publications. To improve reliability, some editors have called for "stable versions" of articles that have been reviewed by the community and locked from further editing, but the community has been unable to form a consensus in favor of such changes, partly because they would require a major software overhaul. However, a similar system is being tested on the German Wikipedia, and there is an expectation that some form of it will make its way onto the English version in the future. Software created by Luca de Alfaro and colleagues at the University of California, Santa Cruz is now being tested that assigns "trust ratings" to individual Wikipedia contributors, with the intention that eventually only edits made by established "trusted editors" will be made immediately visible.
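The intuition behind content-driven trust ratings can be sketched in a few lines. This is a toy model, not de Alfaro's actual algorithm: the update rule, step sizes, and edit history below are all illustrative assumptions. The idea is only that a score rises when an editor's text survives later revisions and falls when it is reverted:

```python
def update_trust(trust, survived):
    """Nudge a trust score in [0, 1]: up on survival, down harder on revert."""
    step = 0.1
    if survived:
        return min(1.0, trust + step * (1.0 - trust))
    return max(0.0, trust - 3 * step * trust)

trust = 0.5                          # arbitrary starting score
history = [True, True, False, True]  # fate of four successive edits
for survived in history:
    trust = update_trust(trust, survived)
print(round(trust, 3))
```

Asymmetric penalties (reverts cost more than survivals gain) make vandalism expensive while letting constructive editors accumulate standing slowly.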

Wikipedia community

The Wikipedia community has its own power structure. It has been described as "cult-like", although not always with entirely negative connotations, and criticized for failing to accommodate inexperienced users. Editors in good standing can run for one of many levels of volunteer stewardship, beginning with "administrator" and continuing up through "bureaucrat" and "steward". Administrators, the largest group of privileged users (as of September 30, 2008, for the English edition), have the ability to delete pages, lock articles from being changed in cases of vandalism or editorial disputes, and block users from editing. Contrary to the name, administrators do not enjoy any special privilege in decision-making and are prohibited from using their powers to settle content disputes. Their roles, often described as "janitorial", are mostly limited to making edits that have project-wide effects and are thus disallowed to ordinary editors in order to minimize disruption, and to blocking users who make disruptive edits such as vandalism.

As Wikipedia grows with an unconventional model of encyclopedia building, "Who writes Wikipedia?" has become one of the questions frequently asked on the project, often with a reference to other Web 2.0 projects such as Digg. Jimmy Wales once argued that only "a community ... a dedicated group of a few hundred volunteers" makes the bulk of contributions to Wikipedia and that the project is therefore "much like any traditional organization". This was later disputed by Aaron Swartz, who noted that several articles he sampled had large portions of their content contributed by users with low edit counts. A 2007 study by researchers from Dartmouth College found that anonymous and infrequent contributors to Wikipedia are as reliable a source of knowledge as those contributors who register with the site. Although some contributors are authorities in their field, Wikipedia requires that even their contributions be supported by published and verifiable sources. The project's preference for consensus over credentials has been labeled "anti-elitism".

In August 2007, WikiScanner, a website developed by computer science graduate student Virgil Griffith, made its public debut. WikiScanner traces the source of millions of changes made to Wikipedia by editors who are not logged in; it revealed that many such edits came from corporations or government agencies, concerned articles related to them, their personnel or their work, and were sometimes attempts to remove criticism.

In a 2003 study of Wikipedia as a community, economics Ph.D. student Andrea Ciffolilli argued that the low transaction costs of participating in wiki software create a catalyst for collaborative development, and that a "creative construction" approach encourages participation. In his 2008 book, The Future of the Internet and How to Stop It, Jonathan Zittrain of the Oxford Internet Institute and Harvard Law School's Berkman Center for Internet & Society cites Wikipedia's success as a case study in how open collaboration has fostered innovation on the web.

Operation

Wikimedia Foundation and the Wikimedia chapters

Wikipedia is hosted and funded by the Wikimedia Foundation, a non-profit organization which also operates Wikipedia-related projects such as Wikibooks. The Wikimedia chapters, local associations of Wikipedians, also participate in the promotion, the development and the funding of the project.

Software and hardware

The operation of Wikipedia depends on MediaWiki, a custom-made, free and open source wiki software platform written in PHP and built upon the MySQL database. The software incorporates programming features such as a macro language, variables, a transclusion system for templates, and URL redirection. MediaWiki is licensed under the GNU General Public License and used by all Wikimedia projects, as well as many other wiki projects. Originally, Wikipedia ran on UseModWiki written in Perl by Clifford Adams (Phase I), which initially required CamelCase for article hyperlinks; the present double bracket style was incorporated later. Starting in January 2002 (Phase II), Wikipedia began running on a PHP wiki engine with a MySQL database; this software was custom-made for Wikipedia by Magnus Manske. The Phase II software was repeatedly modified to accommodate the exponentially increasing demand. In July 2002 (Phase III), Wikipedia shifted to the third-generation software, MediaWiki, originally written by Lee Daniel Crocker.
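The shift from UseModWiki's CamelCase links to the double-bracket style can be contrasted with two regular expressions. These patterns are simplified sketches of the two conventions, not MediaWiki's real parser:

```python
import re

# UseModWiki treated multi-hump CamelCase words as links;
# MediaWiki uses [[Target]] or [[Target|display text]].
camelcase = re.compile(r"\b([A-Z][a-z]+(?:[A-Z][a-z]+)+)\b")
brackets = re.compile(r"\[\[([^\]|]+)(?:\|[^\]]+)?\]\]")

old_style = "See WikiPedia and FreeSoftware for details."
new_style = "See [[Wikipedia]] and [[Free software|free software]] for details."

print(camelcase.findall(old_style))  # CamelCase link targets
print(brackets.findall(new_style))   # double-bracket link targets
```

The double-bracket form decouples the link target from its display text, which is why titles no longer need to be CamelCase words.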

Wikipedia currently runs on dedicated clusters of Ubuntu servers, 300 in Florida, 26 in Amsterdam, and 23 in Yahoo!'s Korean hosting facility in Seoul. Wikipedia employed a single server until 2004, when the server setup was expanded into a distributed multitier architecture. In January 2005, the project ran on 39 dedicated servers located in Florida. This configuration included a single master database server running MySQL, multiple slave database servers, 21 web servers running the Apache HTTP Server, and seven Squid cache servers.

Wikipedia receives between 20,000 and 45,000 page requests per second, depending on time of day. Page requests are first passed to a front-end layer of Squid caching servers. Requests that cannot be served from the Squid cache are sent to load-balancing servers running the Linux Virtual Server software, which in turn pass the request to one of the Apache web servers for page rendering from the database. The web servers deliver pages as requested, performing page rendering for all the language editions of Wikipedia. To increase speed further, rendered pages for anonymous users are cached in a distributed memory cache until invalidated, allowing page rendering to be skipped entirely for most common page accesses. Two larger clusters in the Netherlands and Korea now handle much of Wikipedia's traffic load.
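The request path described above (cache first, render from the database only on a miss, and skip the cache for logged-in users, whose pages are personalized) can be sketched as a cache-aside loop. All names here are illustrative, not Wikipedia's actual code:

```python
rendered_cache = {}  # stands in for the Squid / distributed memory cache layer
render_calls = 0

def render_from_database(title):
    """Hypothetical stand-in for an Apache web server rendering a page."""
    global render_calls
    render_calls += 1
    return f"<html>{title}</html>"

def serve(title, logged_in=False):
    # Anonymous requests are answered from the cache when possible.
    if not logged_in and title in rendered_cache:
        return rendered_cache[title]
    page = render_from_database(title)
    if not logged_in:
        rendered_cache[title] = page  # cache until invalidated by an edit
    return page

serve("Main Page")
serve("Main Page")                  # cache hit; no second render
serve("Main Page", logged_in=True)  # bypasses the cache
print(render_calls)
```

An edit to the page would invalidate the cached entry, forcing the next anonymous request to re-render.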

License and language editions

All text in Wikipedia is covered by the GNU Free Documentation License (GFDL), a copyleft license permitting the redistribution, creation of derivative works, and commercial use of content while authors retain copyright of their work. The position that Wikipedia is merely a hosting service has been successfully used as a defense in court. Wikipedia has been working on a switch to Creative Commons licenses because the GFDL, initially designed for software manuals, is not well suited to online reference works, and because the two licenses are currently incompatible.

The handling of media files (e.g., image files) varies across language editions. Some language editions, such as the English Wikipedia, include non-free image files under the fair use doctrine, while others have opted not to. This is in part because of differences in copyright laws between countries; for example, the notion of fair use does not exist in Japanese copyright law. Media files covered by free content licenses (e.g., Creative Commons' cc-by-sa) are shared across language editions via the Wikimedia Commons repository, a project operated by the Wikimedia Foundation.

There are currently 262 language editions of Wikipedia; of these, 22 have over 100,000 articles and 79 have over 1,000 articles. (See List of Wikipedias for the full list.) According to Alexa, the English subdomain (en.wikipedia.org; English Wikipedia) receives approximately 52% of Wikipedia's cumulative traffic, with the remainder split among the other languages (Spanish: 19%, French: 5%, Polish: 3%, German: 3%, Japanese: 3%, Portuguese: 2%). As of July 2008, the five largest language editions were (in order of article count) the English, German, French, Polish and Japanese Wikipedias.

Since Wikipedia is web-based and therefore worldwide, contributors to the same language edition may use different dialects or may come from different countries (as is the case for the English edition). These differences may lead to conflicts over spelling (e.g., color vs. colour) or points of view. Though the various language editions are held to global policies such as "neutral point of view", they diverge on some points of policy and practice, most notably on whether images that are not licensed freely may be used under a claim of fair use. Jimmy Wales has described Wikipedia as "an effort to create and distribute a free encyclopedia of the highest possible quality to every single person on the planet in their own language". Though each language edition functions more or less independently, some efforts are made to supervise them all. They are coordinated in part by Meta-Wiki, the Wikimedia Foundation's wiki devoted to maintaining all of its projects (Wikipedia and others). For instance, Meta-Wiki provides statistics on all language editions of Wikipedia and maintains a list of articles every Wikipedia should have. The list covers basic content by subject: biography, history, geography, society, culture, science, technology, foodstuffs, and mathematics. Otherwise, it is not rare for articles strongly tied to a particular language to have no counterparts in other editions; for example, articles about small towns in the United States might be available only in English.

Translated articles represent only a small portion of articles in most editions, in part because automated translation of articles is disallowed. Articles available in more than one language may offer "InterWiki" links, which link to the counterpart articles in other editions.
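An interlanguage link in wikitext pairs a language code with the counterpart article's title, e.g. [[fr:Chat]] on an English article about cats. A simplified extraction sketch (again, not the real parser):

```python
import re

# Match [[xx:Title]] pairs, where xx is a 2- or 3-letter language code.
interwiki = re.compile(r"\[\[([a-z]{2,3}):([^\]]+)\]\]")

text = "The cat is a small domesticated mammal. [[fr:Chat]] [[de:Hauskatze]]"
print(interwiki.findall(text))
```

Because the titles differ per language, these links are maintained per article rather than derived automatically.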

Several language versions have published a selection of Wikipedia articles on an optical disk version. An English version, 2006 Wikipedia CD Selection, contained about 2,000 articles. Another English version, developed by Linterweb, contains "1988+ articles". The Polish version contains nearly 240,000 articles. There are also a few German versions.

Cultural significance

In addition to logistic growth in the number of its articles, Wikipedia has steadily gained status as a general reference website since its inception in 2001. According to Alexa and comScore, Wikipedia is among the ten most visited websites world-wide. Of the top ten, Wikipedia is the only non-profit website. The growth of Wikipedia has been fueled by its dominant position in Google search results; about 50% of search engine traffic to Wikipedia comes from Google, a good portion of which is related to academic research. In April 2007 the Pew Internet and American Life project found that one third of US Internet users consulted Wikipedia. In October 2006, the site was estimated to have a hypothetical market value of $580 million if it ran advertisements.

Wikipedia's content has also been used in academic studies, books, conferences, and court cases. The Parliament of Canada's website refers to Wikipedia's article on same-sex marriage in the "related links" section of its "further reading" list for the Civil Marriage Act. The encyclopedia's assertions are increasingly used as a source by organizations such as the U.S. federal courts and the World Intellectual Property Organization, though mainly for supporting information rather than information decisive to a case. Content appearing on Wikipedia has also been cited as a source and referenced in some U.S. intelligence agency reports.

Wikipedia has also been used as a source in journalism, sometimes without attribution, and several reporters have been dismissed for plagiarizing from it. In July 2007, Wikipedia was the focus of a 30-minute documentary on BBC Radio 4 which argued that, with increased usage and awareness, the term has joined a select band of 21st-century nouns so familiar (Google, Facebook, YouTube) that they no longer need explanation, on a par with such 20th-century terms as Hoovering or Coke. Many parodies target Wikipedia's openness, with characters vandalizing or modifying the online encyclopedia's articles. Notably, comedian Stephen Colbert has parodied or referenced Wikipedia on numerous episodes of his show The Colbert Report and coined the related term "wikiality".

Wikipedia has also had an impact on other media. Some sources satirize its susceptibility to inserted inaccuracies, such as a front-page article in The Onion in July 2006 titled "Wikipedia Celebrates 750 Years of American Independence". Others draw on Wikipedia's claim that anyone can edit: in "The Negotiation", an episode of The Office, the character Michael Scott says that "Wikipedia is the best thing ever. Anyone in the world can write anything they want about any subject, so you know you are getting the best possible information". A select few parody Wikipedia's policies, such as the xkcd strip "Wikipedian Protester", which included the joke "Semi-protect the Constitution!"

The first documentary film about Wikipedia, entitled Truth in Numbers: The Wikipedia Story, is scheduled for 2009 release. Shot on several continents, the film will cover the history of Wikipedia and feature interviews with Wikipedia editors around the world. Dutch filmmaker IJsbrand van Veelen premiered his 45-minute documentary The Truth According to Wikipedia in April, 2008.

On September 16, 2007, The Washington Post reported that Wikipedia had become a focal point in the 2008 election campaign, saying, "Type a candidate's name into Google, and among the first results is a Wikipedia page, making those entries arguably as important as any ad in defining a candidate. Already, the presidential entries are being edited, dissected and debated countless times each day." An October 2007 Reuters article, entitled "Wikipedia page the latest status symbol", reported the recent phenomenon of a Wikipedia article serving as evidence of one's notability.

Wikipedia won two major awards in May 2004. The first was a Golden Nica for Digital Communities from the annual Prix Ars Electronica contest; this came with a €10,000 (£6,588; $12,700) grant and an invitation to present at the PAE Cyberarts Festival in Austria later that year. The second was a Judges' Webby Award in the "community" category. Wikipedia was also nominated for a "Best Practices" Webby. On January 26, 2007, Wikipedia was also awarded the fourth-highest brand ranking by the readers of brandchannel.com, receiving 15% of the votes in answer to the question "Which brand had the most impact on our lives in 2006?"

In September 2008, Wikipedia received the Quadriga award "A Mission of Enlightenment" of Werkstatt Deutschland, along with Boris Tadić, Eckart Höfling and Peter Gabriel. The award was presented to Jimmy Wales by David Weinberger.

Related projects

A number of interactive multimedia encyclopedias incorporating entries written by the public existed long before Wikipedia was founded. The first of these was the 1986 BBC Domesday Project, which included text (entered on BBC Micro computers) and photographs from over a million contributors in the UK, covering the geography, art and culture of the UK. This was the first interactive multimedia encyclopedia (and also the first major multimedia document connected through internal links), with the majority of articles accessible through an interactive map of the UK. The user interface and part of the content of the Domesday Project have since been emulated on a website. One of the most successful early online encyclopedias incorporating entries by the public was h2g2, also created by the BBC. The h2g2 encyclopedia was relatively light-hearted, focusing on articles that were both witty and informative. Both projects had similarities with Wikipedia, but neither gave full editorial freedom to public users.

Wikipedia has also spawned several sister projects. The first, "In Memoriam: September 11 Wiki", created in October 2002, detailed the September 11 attacks; it was closed in October 2006. Wiktionary, a dictionary project, was launched in December 2002; Wikiquote, a collection of quotations, was created a week after Wikimedia launched, followed by Wikibooks, a collection of collaboratively written free books. Wikimedia has since started a number of other projects, including Wikiversity, a project for the creation of free learning materials and the provision of online learning activities.

A similar non-wiki project, the GNUPedia project, co-existed with Nupedia early in its history; however, it has been retired and its creator, free software figure Richard Stallman, has lent his support to Wikipedia.

Other websites centered on collaborative knowledge base development have drawn inspiration from or inspired Wikipedia. Some, such as Susning.nu, Enciclopedia Libre, and WikiZnanie likewise employ no formal review process, whereas others use more traditional peer review, such as Encyclopedia of Life, Stanford Encyclopedia of Philosophy, Scholarpedia, h2g2 and Everything2.

Jimmy Wales, the de facto leader of Wikipedia, said in an interview regarding the online encyclopedia Citizendium, which is overseen by experts in their respective fields: "We welcome a diversity of efforts. If Larry's project is able to produce good work, we will benefit from it by copying it back into Wikipedia."

A wiki is a World Wide Web application that allows people to add, modify, and delete content collaboratively. Wikipedia is a made-up word: wiki combined with "-pedia" suggests an encyclopedia that is a wiki.

Wikipedia is a free, multilingual encyclopedia project operated by the non-profit Wikimedia Foundation. Its name is a portmanteau of the words wiki (a technology for creating collaborative websites) and encyclopedia. Wikipedia's 10 million articles, about a quarter of which are in English, have been written collaboratively by volunteers around the world, and almost all of its articles can be edited by anyone who can access the Wikipedia website. Launched in 2001 by Jimmy Wales and Larry Sanger, it is currently the largest and most popular general reference work on the Internet.

Critics of Wikipedia target its systemic bias and inconsistencies and its policy of favoring consensus over credentials in its editorial process. Wikipedia's reliability and accuracy are also an issue. Other criticisms are centered on its susceptibility to vandalism and the addition of spurious or unverified information. Scholarly work suggests that vandalism is generally short-lived.

In addition to being an encyclopedic reference, Wikipedia has received major media attention as an online source of breaking news as it is constantly updated. When Time magazine recognized "You" as its Person of the Year 2006, praising the accelerating success of online collaboration and interaction by millions of users around the world, Wikipedia was the first particular "Web 2.0" service mentioned, followed by YouTube and MySpace.

History

Wikipedia began as a complementary project for Nupedia, a free online English-language encyclopedia project whose articles were written by experts and reviewed under a formal process. Nupedia was founded on March 9, 2000, under the ownership of Bomis, Inc., a web portal company. Its main figures were Jimmy Wales, Bomis CEO, and Larry Sanger, editor-in-chief for Nupedia and later Wikipedia. Nupedia was licensed initially under its own Nupedia Open Content License, switching to the GNU Free Documentation License before Wikipedia's founding at the urging of Richard Stallman.

Larry Sanger and Jimmy Wales are the founders of Wikipedia. While Wales is credited with defining the goal of making a publicly editable encyclopedia, Sanger is usually credited with the counter-intuitive strategy of using a wiki to reach that goal. On January 10, 2001, Larry Sanger proposed on the Nupedia mailing list to create a wiki as a "feeder" project for Nupedia. Wikipedia was formally launched on January 15, 2001, as a single English-language edition at www.wikipedia.com, and announced by Sanger on the Nupedia mailing list. Wikipedia's policy of "neutral point-of-view" was codified in its initial months, and was similar to Nupedia's earlier "nonbiased" policy. Otherwise, there were relatively few rules initially and Wikipedia operated independently of Nupedia.

Wikipedia gained early contributors from Nupedia, Slashdot postings, and search engine indexing. It grew to approximately 20,000 articles, and 18 language editions, by the end of 2001. By late 2002 it had reached 26 language editions, 46 by the end of 2003, and 161 by the final days of 2004. Nupedia and Wikipedia coexisted until the former's servers went down permanently in 2003, and its text was incorporated into Wikipedia. English Wikipedia passed the 2 million-article mark on September 9, 2007, making it the largest encyclopedia ever assembled, eclipsing even the Yongle Encyclopedia (1407), which had held the record for exactly 600 years.

Citing fears of commercial advertising and lack of control in a perceived English-centric Wikipedia, users of the Spanish Wikipedia forked from the project to create the Enciclopedia Libre in February 2002. Later that year, Wales announced that Wikipedia would not display advertisements, and its website was moved to wikipedia.org. Various other projects have since forked from Wikipedia for editorial reasons: Wikinfo, for example, does not require a neutral point of view and allows original research. New Wikipedia-inspired projects, such as Citizendium, Scholarpedia, Conservapedia and Google's Knol, have been started to address perceived limitations of Wikipedia, such as its policies on peer review, original research and commercial advertising.

The Wikimedia Foundation was created from Wikipedia and Nupedia on June 20, 2003. It applied to the United States Patent and Trademark Office to trademark Wikipedia on September 17, 2004. The mark was granted registration status on January 10, 2006. Trademark protection was accorded by Japan on December 16, 2004, and in the European Union on January 20, 2005. Technically a service mark, the scope of the mark is for: "Provision of information in the field of general encyclopedic knowledge via the Internet". There are plans to license the use of the Wikipedia trademark for some products, such as books or DVDs.

Nature of Wikipedia

Editing model

Unlike traditional encyclopedias such as Encyclopædia Britannica, no article in Wikipedia undergoes a formal peer-review process, and changes to articles are made available immediately. No article is owned by its creator or any other editor, nor is any article vetted by a recognized authority. Except for a few vandalism-prone pages that can be edited only by administrators, every article may be edited anonymously or with a user account, while only registered users may create a new article. Consequently, Wikipedia "makes no guarantee of validity" of its content. Wikipedia also does not censor itself, and it contains material that some people, including Wikipedia editors, may find objectionable, offensive or pornographic; in 2008, for instance, Wikipedia rejected an online petition against the inclusion of depictions of Muhammad in its English edition, citing this policy. The presence of politically sensitive material has also led China to block access to parts of the site.

Content in Wikipedia, however, is subject to the laws (in particular copyright law) in Florida, United States, where Wikipedia servers are hosted, and several editorial policies and guidelines that are intended to reinforce the notion that Wikipedia is an encyclopedia. Each entry in Wikipedia must be about a topic that is encyclopedic and thus is worthy of inclusion. A topic is deemed encyclopedic if it is "notable in the Wikipedia jargon; i.e., if it has received significant coverage in secondary reliable sources (i.e., mainstream media or major academic journals) that are independent of the subject of the topic. Second, Wikipedia must expose knowledge that is already established and recognized. In other words, it must not present, for instance, new information or original works. A claim that is likely to be challenged requires a reference to reliable sources. Within the Wikipedia community, this is often phrased as "verifiability, not truth" to express the idea that the readers are left themselves to check the truthfulness of what appears in the articles and to make their own interpretations. Finally, Wikipedia does not take a side. All opinions and viewpoints, if attributable to external sources, must enjoy appropriate share of coverage within an article. Wikipedia editors as a community write and revise those policies and guidelines and enforce them by deleting, annotating with tags or modifying article materials failing to meet them. (See also Deletionism and inclusionism)

Contributors, registered or not, can take advantage of features available in the software that empowers Wikipedia. The "History" page attached to each article contains every single past revision of the article, though a revision with libelous content, criminal threats or copyright infringements may be removed afterwards. The feature makes it easy to compare old and new versions, undo changes that an editor consider undesirable, or restore lost content. The "Discussion" pages associated with each article are used to coordinate work among multiple editors. Regular contributors often maintain a "watchlist" of articles of interest to them, so that they can easily keep tabs on all recent changes to those articles. Computer programs called bots have been used widely to remove vandalism as soon as it was made, or start articles such as geography entries in a standard format from statistical data.

The open nature of the editing model has been central to any form of criticism on Wikipedia. For example, at any point, a reader of an article cannot be certain, without consulting its "history" page, whether or not the article she is reading has been vandalized. Critics argue that non-expert editing undermines quality. Because contributors usually submit edits, rewriting small portions of an entry rather than making full-length revisions, high- and low-quality content may be intermingled within an entry. Historian Roy Rosenzweig noted: "Overall, writing is the Achilles' heel of Wikipedia. Committees rarely write well, and Wikipedia entries often have a choppy quality that results from the stringing together of sentences or paragraphs written by different people. All of these led to the question of the reliability of Wikipedia as a source of accurate information.

In 2008, two researchers presented evidence for the hypothesis that the growth of Wikipedia is sustainable.

Reliability and bias

Wikipedia has been accused of exhibiting systemic bias and inconsistency; critics argue that Wikipedia's open nature and a lack of proper sources for much of the information makes it unreliable. Some commentators suggest that Wikipedia is generally reliable, but that the reliability of any given article is not always clear. Editors of traditional reference works such as the Encyclopædia Britannica have questioned the project's utility and status as an encyclopedia. Many university lecturers discourage students from citing any encyclopedia in academic work, preferring primary sources; some specifically prohibit Wikipedia citations. Co-founder Jimmy Wales stresses that encyclopedias of any type are not usually appropriate as primary sources, and should not be relied upon as authoritative. Technology writer Bill Thompson commented that the debate was possibly "symptomatic of much learning about information which is happening in society today".

Concerns have also been raised regarding the lack of accountability that results from users' anonymity, and regarding the project's vulnerability to vandalism, the insertion of spurious information, and similar problems. In one particularly well-publicized incident, false information was introduced into the biography of American political figure John Seigenthaler, Sr. and remained undetected for four months. Some critics claim that Wikipedia's open structure makes it an easy target for Internet trolls, advertisers, and those with an agenda to push. The addition of political spin to articles by organizations including members of the U.S. House of Representatives and special interest groups has been noted, and organizations such as Microsoft have offered financial incentives to work on certain articles. These issues have been parodied, notably by Stephen Colbert on The Colbert Report.

Economist Tyler Cowen writes, "If I had to guess whether Wikipedia or the median refereed journal article on economics was more likely to be true, after a not so long think I would opt for Wikipedia." He comments that many traditional sources of non-fiction suffer from systemic biases. Novel results are over-reported in journal articles, and relevant information is omitted from news reports. But he also cautions that errors are frequently found on Internet sites, and that academics and experts must be vigilant in correcting them.

In February 2007, an article in The Harvard Crimson newspaper reported that some professors at Harvard University include Wikipedia in their syllabi, but that there is a split in their perception of using Wikipedia. In June 2007, former president of the American Library Association Michael Gorman condemned Wikipedia, along with Google, stating that academics who endorse the use of Wikipedia are "the intellectual equivalent of a dietitian who recommends a steady diet of Big Macs with everything". He also said that "a generation of intellectual sluggards incapable of moving beyond the Internet" was being produced at universities. He complained that web-based sources were discouraging students from learning from rarer texts found only on paper or on subscription-only web sites. In the same article, Jenny Fry, a research fellow at the Oxford Internet Institute, commented on academics who cite Wikipedia, saying: "You cannot say children are intellectually lazy because they are using the Internet when academics are using search engines in their research. The difference is that they have more experience of being critical about what is retrieved and whether it is authoritative. Children need to be told how to use the Internet in a critical and appropriate way."

There have been efforts within the Wikipedia community to improve the reliability of Wikipedia. The English-language Wikipedia has introduced an assessment scale against which the quality of articles is judged; other editions have also adopted this. Roughly 2,000 articles in English have passed a rigorous set of criteria to reach the highest rank, "featured article" status; such articles are intended to provide thorough, well-written coverage of their topic, supported by many references to peer-reviewed publications. To improve reliability, some editors have called for "stable versions" of articles: articles that have been reviewed by the community and locked from further editing. But the community has been unable to form a consensus in favor of such changes, partly because they would require a major software overhaul. However, a similar system is being tested on the German Wikipedia, and there is an expectation that some form of it will make its way onto the English version at some future time. Software created by Luca de Alfaro and colleagues at the University of California, Santa Cruz is now being tested that will assign "trust ratings" to individual Wikipedia contributors, with the intention that eventually only edits made by those who have established themselves as "trusted editors" will be made immediately visible.

Wikipedia community

The Wikipedia community has a power structure, and it has been described as "cult-like", although not always with entirely negative connotations, and criticized for failing to accommodate inexperienced users. Editors in good standing in the community can run for one of many levels of volunteer stewardship; this begins with "administrator" and rises through "bureaucrat" and "steward". Administrators, the largest group of privileged users, have the ability to delete pages, lock articles from being changed in cases of vandalism or editorial disputes, and block users from editing. Contrary to the name, administrators do not enjoy any special privilege in decision-making and are prohibited from using their powers to settle content disputes. The roles of administrators, often described as "janitorial", are mostly limited to making edits that have project-wide effects and thus are disallowed to ordinary editors in order to minimize disruption, and to banning users who make disruptive edits such as vandalism.

As Wikipedia grows with an unconventional model of encyclopedia building, "Who writes Wikipedia?" has become one of the questions frequently asked on the project, often with a reference to other Web 2.0 projects such as Digg. Jimmy Wales once argued that only "a community ... a dedicated group of a few hundred volunteers" makes the bulk of contributions to Wikipedia and that the project is therefore "much like any traditional organization". This was later disputed by Aaron Swartz, who noted that several articles he sampled had large portions of their content contributed by users with low edit counts. A 2007 study by researchers from Dartmouth College found that anonymous and infrequent contributors to Wikipedia are as reliable a source of knowledge as those contributors who register with the site. Although some contributors are authorities in their field, Wikipedia requires that even their contributions be supported by published and verifiable sources. The project's preference for consensus over credentials has been labeled "anti-elitism".

In August 2007, WikiScanner, a website developed by computer science graduate student Virgil Griffith, made its public debut. WikiScanner traces the source of millions of changes made to Wikipedia by editors who are not logged in, revealing that many of these edits came from corporations or government agencies editing articles related to them, their personnel, or their work, and were attempts to remove criticism.

In a 2003 study of Wikipedia as a community, economics Ph.D. student Andrea Ciffolilli argued that the low transaction costs of participating in wiki software create a catalyst for collaborative development, and that a "creative construction" approach encourages participation. In his 2008 book, The Future of the Internet and How to Stop It, Jonathan Zittrain of the Oxford Internet Institute and Harvard Law School's Berkman Center for Internet & Society cites Wikipedia's success as a case study in how open collaboration has fostered innovation on the web.

Operation

Wikimedia Foundation and the Wikimedia chapters

Wikipedia is hosted and funded by the Wikimedia Foundation, a non-profit organization which also operates Wikipedia-related projects such as Wikibooks. The Wikimedia chapters, local associations of Wikipedians, also participate in the promotion, the development and the funding of the project.

Software and hardware

The operation of Wikipedia depends on MediaWiki, a custom-made, free and open source wiki software platform written in PHP and built upon the MySQL database. The software incorporates programming features such as a macro language, variables, a transclusion system for templates, and URL redirection. MediaWiki is licensed under the GNU General Public License and used by all Wikimedia projects, as well as many other wiki projects. Originally, Wikipedia ran on UseModWiki written in Perl by Clifford Adams (Phase I), which initially required CamelCase for article hyperlinks; the present double bracket style was incorporated later. Starting in January 2002 (Phase II), Wikipedia began running on a PHP wiki engine with a MySQL database; this software was custom-made for Wikipedia by Magnus Manske. The Phase II software was repeatedly modified to accommodate the exponentially increasing demand. In July 2002 (Phase III), Wikipedia shifted to the third-generation software, MediaWiki, originally written by Lee Daniel Crocker.
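As a rough illustration of the double-bracket link style mentioned above (this is a minimal sketch, not MediaWiki's actual parser; the function and URL prefix are hypothetical), a small regex can show how such links map to HTML anchors:

```python
import re

# Hypothetical sketch: convert [[Article|label]] wiki links to HTML anchors.
# Real MediaWiki parsing is far more involved; this only illustrates the syntax.
LINK_RE = re.compile(r"\[\[([^|\]]+)(?:\|([^\]]+))?\]\]")

def render_links(wikitext: str) -> str:
    def repl(m):
        target = m.group(1)
        label = m.group(2) or target          # [[Target]] uses the title as label
        href = "/wiki/" + target.replace(" ", "_")
        return f'<a href="{href}">{label}</a>'
    return LINK_RE.sub(repl, wikitext)

print(render_links("See [[Free content|free content]] and [[Wiki]]."))
```

Unlike the CamelCase convention of the original UseModWiki engine, this style lets any phrase, including multi-word titles, become a link.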

Wikipedia currently runs on dedicated clusters of Ubuntu servers, 300 in Florida, 26 in Amsterdam, and 23 in Yahoo!'s Korean hosting facility in Seoul. Wikipedia employed a single server until 2004, when the server setup was expanded into a distributed multitier architecture. In January 2005, the project ran on 39 dedicated servers located in Florida. This configuration included a single master database server running MySQL, multiple slave database servers, 21 web servers running the Apache HTTP Server, and seven Squid cache servers.

Wikipedia receives between 20,000 and 45,000 page requests per second, depending on time of day. Page requests are first passed to a front-end layer of Squid caching servers. Requests that cannot be served from the Squid cache are sent to load-balancing servers running the Linux Virtual Server software, which in turn pass the request to one of the Apache web servers for page rendering from the database. The web servers deliver pages as requested, performing page rendering for all the language editions of Wikipedia. To increase speed further, rendered pages for anonymous users are cached in a distributed memory cache until invalidated, allowing page rendering to be skipped entirely for most common page accesses. Two larger clusters in the Netherlands and Korea now handle much of Wikipedia's traffic load.
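The cache-then-render flow described above can be sketched in miniature (all names and structure here are illustrative assumptions, not Wikipedia's actual code): anonymous page requests are answered from a cache when possible, a miss triggers a full render, and an edit invalidates the cached copy.

```python
# Minimal sketch of a cache-then-render pipeline with invalidation.
class PageCache:
    def __init__(self):
        self._cache = {}      # rendered HTML kept for anonymous users
        self.renders = 0      # counts full page renders (the expensive step)

    def _render(self, title):
        self.renders += 1
        return f"<html>{title}</html>"   # stand-in for the web-server render

    def get(self, title):
        if title not in self._cache:     # cache miss: render and store
            self._cache[title] = self._render(title)
        return self._cache[title]        # cache hit: skip rendering entirely

    def invalidate(self, title):         # called when an article is edited
        self._cache.pop(title, None)

cache = PageCache()
cache.get("Main Page")
cache.get("Main Page")
assert cache.renders == 1                # second request served from cache
cache.invalidate("Main Page")
cache.get("Main Page")
assert cache.renders == 2                # re-rendered after invalidation
```

The design trade-off is the one the article describes: most traffic is anonymous reads of unchanged pages, so serving those from cache avoids the rendering step for the common case.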

License and language editions

All text in Wikipedia is covered by the GNU Free Documentation License (GFDL), a copyleft license permitting the redistribution, creation of derivative works, and commercial use of content while authors retain copyright of their work. The position that Wikipedia is merely a hosting service has been successfully used as a defense in court. Wikipedia has been working on a switch to Creative Commons licenses, because the GFDL, initially designed for software manuals, is not well suited to online reference works and because the two licenses are currently incompatible.

The handling of media files (e.g., image files) varies across language editions. Some language editions, such as the English Wikipedia, include non-free image files under the fair use doctrine, while others have opted not to. This is in part because of differences in copyright laws between countries; for example, the notion of fair use does not exist in Japanese copyright law. Media files covered by free content licenses (e.g., Creative Commons' cc-by-sa) are shared across language editions via the Wikimedia Commons repository, a project operated by the Wikimedia Foundation.

There are currently 262 language editions of Wikipedia; of these, 22 have over 100,000 articles and 79 have over 1,000 articles. (See List of Wikipedias for the full list.) According to Alexa, the English subdomain (en.wikipedia.org; English Wikipedia) receives approximately 52% of Wikipedia's cumulative traffic, with the remaining split among the other languages (Spanish: 19%, French: 5%, Polish: 3%, German: 3%, Japanese: 3%, Portuguese: 2%). As of July 2008, the five largest language editions are (in order of article count) English, German, French, Polish and Japanese Wikipedias.

Since Wikipedia is web-based and therefore worldwide, contributors to the same language edition may use different dialects or may come from different countries (as is the case for the English edition). These differences may lead to conflicts over spelling (e.g., color vs. colour) or points of view. Though the various language editions are held to global policies such as "neutral point of view", they diverge on some points of policy and practice, most notably on whether images that are not licensed freely may be used under a claim of fair use. Jimmy Wales has described Wikipedia as "an effort to create and distribute a free encyclopedia of the highest possible quality to every single person on the planet in their own language". Though each language edition functions more or less independently, some efforts are made to supervise them all. They are coordinated in part by Meta-Wiki, the Wikimedia Foundation's wiki devoted to maintaining all of its projects (Wikipedia and others). For instance, Meta-Wiki provides important statistics on all language editions of Wikipedia and maintains a list of articles every Wikipedia should have. The list concerns basic content by subject: biography, history, geography, society, culture, science, technology, foodstuffs, and mathematics. Beyond that, it is not rare for articles strongly related to a particular language to lack counterparts in another edition. For example, articles about small towns in the United States might only be available in English.

Translated articles represent only a small portion of articles in most editions, in part because automated translation of articles is disallowed. Articles available in more than one language may offer "InterWiki" links, which link to the counterpart articles in other editions.

Several language versions have published a selection of Wikipedia articles on an optical disk version. An English version, 2006 Wikipedia CD Selection, contained about 2,000 articles. Another English version developed by Linterweb contains "1988 + articles". The Polish version contains nearly 240,000 articles. There are also a few German versions.

Cultural significance

In addition to logistic growth in the number of its articles, Wikipedia has steadily gained status as a general reference website since its inception in 2001. According to Alexa and comScore, Wikipedia is among the ten most visited websites world-wide. Of the top ten, Wikipedia is the only non-profit website. The growth of Wikipedia has been fueled by its dominant position in Google search results; about 50% of search engine traffic to Wikipedia comes from Google, a good portion of which is related to academic research. In April 2007 the Pew Internet and American Life project found that one third of US Internet users consulted Wikipedia. In October 2006, the site was estimated to have a hypothetical market value of $580 million if it ran advertisements.

Wikipedia's content has also been used in academic studies, books, conferences, and court cases. The Parliament of Canada's website refers to Wikipedia's article on same-sex marriage in the "related links" section of its "further reading" list for the Civil Marriage Act. The encyclopedia's assertions are increasingly used as a source by organizations such as the U.S. Federal Courts and the World Intellectual Property Organization - though mainly for supporting information rather than information decisive to a case. Content appearing on Wikipedia has also been cited as a source and referenced in some U.S. intelligence agency reports.

Wikipedia has also been used as a source in journalism, sometimes without attribution, and several reporters have been dismissed for plagiarizing from Wikipedia. In July 2007, Wikipedia was the focus of a 30-minute documentary on BBC Radio 4 which argued that, with increased usage and awareness, the number of references to Wikipedia in popular culture is such that the term is one of a select band of 21st-century nouns that are so familiar (Google, Facebook, YouTube) that they no longer need explanation and are on a par with such 20th-century terms as Hoovering or Coke. Many works of popular culture parody Wikipedia's openness, with characters vandalizing or modifying the online encyclopedia project's articles. Notably, comedian Stephen Colbert has parodied or referenced Wikipedia on numerous episodes of his show The Colbert Report and coined the related term "wikiality".

Wikipedia has also created an impact upon other forms of media. Some media sources satirize Wikipedia's susceptibility to inserted inaccuracies, such as a front-page article in The Onion in July 2006 with the title "Wikipedia Celebrates 750 Years of American Independence". Others draw upon Wikipedia's premise that anyone can edit, such as "The Negotiation", an episode of The Office, in which the character Michael Scott says, "Wikipedia is the best thing ever. Anyone in the world can write anything they want about any subject, so you know you are getting the best possible information." A select few parody Wikipedia's policies, such as the xkcd strip "Wikipedian Protester", which included the joke "Semi-protect the Constitution!"

The first documentary film about Wikipedia, entitled Truth in Numbers: The Wikipedia Story, is scheduled for 2009 release. Shot on several continents, the film will cover the history of Wikipedia and feature interviews with Wikipedia editors around the world. Dutch filmmaker IJsbrand van Veelen premiered his 45-minute documentary The Truth According to Wikipedia in April, 2008.

On September 16, 2007, The Washington Post reported that Wikipedia had become a focal point in the 2008 election campaign, saying, "Type a candidate's name into Google, and among the first results is a Wikipedia page, making those entries arguably as important as any ad in defining a candidate. Already, the presidential entries are being edited, dissected and debated countless times each day." An October 2007 Reuters article, entitled "Wikipedia page the latest status symbol", reported on the recent phenomenon in which having a Wikipedia article vindicates one's notability.

Wikipedia won two major awards in May 2004. The first was a Golden Nica for Digital Communities of the annual Prix Ars Electronica contest; this came with a €10,000 (£6,588; $12,700) grant and an invitation to present at the PAE Cyberarts Festival in Austria later that year. The second was a Judges' Webby Award for the "community" category. Wikipedia was also nominated for a "Best Practices" Webby. On January 26, 2007, Wikipedia was also awarded the fourth highest brand ranking by the readers of brandchannel.com, receiving 15% of the votes in answer to the question "Which brand had the most impact on our lives in 2006?"

In September 2008, Wikipedia received the Quadriga award of Werkstatt Deutschland in the category "A Mission of Enlightenment", along with Boris Tadić, Eckart Höfling and Peter Gabriel. The award was presented to Jimmy Wales by David Weinberger.

Related projects

A number of interactive multimedia encyclopedias incorporating entries written by the public existed long before Wikipedia was founded. The first of these was the 1986 BBC Domesday Project, which included text (entered on BBC Micro computers) and photographs from over one million contributors in the UK and covered the geography, art and culture of the UK. This was the first interactive multimedia encyclopedia (and also the first major multimedia document connected through internal links), with the majority of articles accessible through an interactive map of the UK. The user interface and part of the content of the Domesday Project have since been emulated on a website. One of the most successful early online encyclopedias incorporating entries by the public was h2g2, which was also created by the BBC. The h2g2 encyclopedia was relatively light-hearted, focusing on articles that were both witty and informative. Both of these projects had similarities with Wikipedia, but neither gave full editorial freedom to public users.

Wikipedia has also spawned several sister projects. The first, the "In Memoriam: September 11 Wiki", created in October 2002, detailed the September 11 attacks; this project was closed in October 2006. Wiktionary, a dictionary project, was launched in December 2002; Wikiquote, a collection of quotations, was created a week after Wikimedia launched, followed by Wikibooks, a collection of collaboratively written free books. Wikimedia has since started a number of other projects, including Wikiversity, a project for the creation of free learning materials and the provision of online learning activities.

A similar non-wiki project, the GNUPedia project, co-existed with Nupedia early in its history; however, it has been retired and its creator, free software figure Richard Stallman, has lent his support to Wikipedia.

Other websites centered on collaborative knowledge base development have drawn inspiration from or inspired Wikipedia. Some, such as Susning.nu, Enciclopedia Libre, and WikiZnanie likewise employ no formal review process, whereas others use more traditional peer review, such as Encyclopedia of Life, Stanford Encyclopedia of Philosophy, Scholarpedia, h2g2 and Everything2.

Jimmy Wales, the de facto leader of Wikipedia, said in an interview regarding the online encyclopedia Citizendium, which is overseen by experts in their respective fields: "We welcome a diversity of efforts. If Larry's project is able to produce good work, we will benefit from it by copying it back into Wikipedia."



Angels & Demons is a bestselling mystery novel by American author Dan Brown. The novel follows the quest of fictional Harvard symbologist Robert Langdon to unravel the mysteries of a secret society called the Illuminati and to foil a plot to annihilate Vatican City with destructive antimatter. The story recounts the conflict between science and religion that brought about the founding of the Illuminati, a group that, after centuries of apparent non-existence, is thought to have resurfaced to take retribution against the Roman Catholic Church. Published in 2000, the novel introduces the character Robert Langdon, who is also the protagonist of Brown's subsequent novel, The Da Vinci Code. It shares many stylistic elements with that book, such as conspiracies of secret societies, a single-day time frame, and the Roman Catholic Church. Ancient history, architecture, and symbolism are heavily referenced throughout the novel. A film adaptation of the same name is due for release on May 15, 2009.

Plot summary

The plot follows Harvard symbologist Robert Langdon as he tries to stop what seems to be the Illuminati, a legendary secret society, from destroying Vatican City with the newly discovered power of antimatter. CERN director Maximilian Kohler discovers one of the facility's most respected physicists, Leonardo Vetra, murdered in his own secured, private quarters at the facility. Vetra's chest is branded with an ambigrammatic "Illuminati" symbol, and his eye has been dislodged. Instead of calling the police, Kohler researches the topic on the Internet and eventually contacts Langdon, an expert on the Illuminati, requesting his assistance in uncovering the murderer. What Langdon finds at the murder scene frightens him: the symbol appears to be authentic, and the legendary secret society, long thought to be defunct, seems to have resurfaced.
Kohler calls Vetra's adopted daughter Vittoria to the scene, and it is later revealed that the Illuminati have also stolen a canister containing a quarter of a gram of antimatter, an extremely dangerous substance with destructive potential comparable to a small nuclear weapon, unleashed upon contact with any form of normal matter. At CERN, the canister's electrically charged magnetic field keeps the drop of antimatter suspended in a high vacuum, ensuring safety; taken away from its electricity supply, the canister automatically switches to a back-up battery that can power it for only 24 hours. The horrible truth is that the Illuminati have placed the stolen canister somewhere in Vatican City, with a security camera in front of it as its digital clock counts down to the explosion.

Langdon and Vittoria make their way to Vatican City, where the Pope has recently died and the papal conclave has convened to elect the new pontiff. Cardinal Mortati, host of the election, discovers that the four Preferiti, the cardinals considered most likely to be elected, are missing. After they arrive, Langdon and Vittoria begin searching for the Preferiti in hopes that they will also find the antimatter canister in the process. Their search is assisted by Camerlengo Carlo Ventresca (the late Pope's closest aide) and the Vatican's Swiss Guard, including Commander Olivetti, Captain Rocher, and Lieutenant Chartrand. Convinced that the Illuminati are in some way responsible for the disappearance of the Preferiti, Langdon attempts to retrace the steps of the so-called "Path of Illumination", an ancient and elaborate process once used by the Illuminati as a means of inducting new members; prospective candidates for the Order were required to follow a series of subtle clues left in various landmarks in and around Rome.
If a candidate followed the clues properly, he would be able to locate the secret meeting place of the Illuminati and be granted membership in the Order. Using his extensive knowledge of religious and occult history, Langdon sets off on the Path of Illumination in hopes of uncovering clues to the disappearance of the Preferiti and the location of the antimatter canister. The Path leads Langdon to four major locations in Rome (Vatican City being an enclave within the city of Rome), each associated with one of what the Illuminati believed to be the four primordial elements of all things in existence: Earth, Air, Fire, and Water. Upon arriving at each location, Langdon finds one of the Preferiti murdered in a fashion appropriate to the location's element: the first cardinal was buried with soil lodged in his throat (Earth); the second's lungs were pierced (Air); the third was engulfed in flames and burned alive (Fire); and the fourth was drowned in a large fountain (Water). After finding the bodies of the first two Preferiti (Earth and Air), Langdon hurries to the Santa Maria della Vittoria Basilica and finds the Preferiti's abductor in the act of setting the third cardinal on fire. The kidnapper, who is also responsible for Leonardo Vetra's murder and the theft of the antimatter canister, is an unnamed Hassassin working under the orders of the Illuminati master "Janus", whose true identity is unknown. Commander Olivetti dies, and Langdon is nearly killed in this encounter with the Hassassin, who manages to kidnap Vittoria. Langdon escapes and meets the Hassassin yet again at the final element's landmark (Water), but is unable to save the fourth cardinal. Langdon nevertheless attempts to complete the Path of Illumination in order to find the Hassassin and rescue Vittoria.
His search leads him to an abandoned castle-like structure with an underground tunnel leading directly into the Pope's chambers in the Vatican. Langdon frees Vittoria, and together they send the Hassassin falling several hundred feet to his death. The two hurry back to St. Peter's Basilica, where they find that Kohler has arrived to confront the camerlengo in private. Langdon and Vittoria fear that Kohler is Janus and that he has come to murder the camerlengo as the final step in his plot against the Church. Hearing the camerlengo scream in agony, the Swiss Guards burst into the room and open fire on Kohler. Just before he dies, Kohler gives Langdon a videotape that he claims will explain everything. With time on the canister running out, the Swiss Guard begins to evacuate the Basilica. As he is exiting the church, the camerlengo apparently goes into a trance and rushes back into the Basilica, claiming that he has received a vision from God revealing the location of the antimatter canister. With Langdon and a few others in pursuit, the camerlengo ventures deep into the catacombs beneath the Basilica and finds the canister sitting atop the tomb of Saint Peter. Langdon and the camerlengo retrieve the antimatter and board a helicopter with only five minutes to spare. The camerlengo parachutes safely onto the roof of St. Peter's just as the canister explodes harmlessly in the sky. Langdon's fate is not immediately known, as there was no second parachute on board the helicopter. The crowd in St. Peter's Square looks on in awe as the camerlengo stands triumphantly before them. Because of this "miracle", the papal conclave debates whether an exception should be made to elect the camerlengo as the new Pope. Langdon, it turns out, survived the explosion by using a window cover from the helicopter as a parachute and landed in the Tiber River near Tiber Island, famous for its reputation as an island blessed with miracles of healing.
He is hurt, but not seriously. Langdon returns to St. Peter's and views Kohler's tape with the College of Cardinals. Langdon, Vittoria, and the cardinals confront the camerlengo in the Sistine Chapel, where the truth is finally revealed. Shortly before the events of the novel, the Pope was scheduled to meet with Leonardo Vetra concerning his research at CERN. Vetra, a devout Catholic, believed that science was capable of establishing a link between Man and God, a belief manifested in his research on antimatter. Vetra's beliefs caused great discomfort to the camerlengo, who firmly believed that the Church alone, not science, should dictate the moral creed of the Christian faithful. While discussing Vetra, the Pope revealed that his support was due to science having granted him a miracle: a son conceived by artificial insemination. Horrified that the Pope had fathered a child, the camerlengo plotted to "rectify" the situation. He poisoned the Pope and, under the guise of the Illuminati master Janus, recruited the Hassassin, a killer fueled by the same zeal and animus towards the Church as his ancestors during the Crusades, to kill Vetra, steal the antimatter, and kidnap and murder the Preferiti just as the papal conclave was set to convene. The camerlengo planted the antimatter in St. Peter's and feigned his last-minute "vision" from God in order to be seen as a hero and the savior of Christendom by those who witnessed his brave acts. The Illuminati thus had no actual role in any of the novel's events, and their "involvement" was merely a plot engineered by the camerlengo to cover his own plans. As Langdon suspected from the very beginning, the Order of the Illuminati was indeed long extinct. In one final twist, it is revealed that Camerlengo Ventresca was the birth son of the late Pope, conceived through artificial insemination.
Suddenly overcome with grief and guilt at having caused so much death, especially that of his own father, Ventresca soaks himself in oil and immolates himself before a crowd of onlookers in St. Peter's Square. The conclave elects Cardinal Mortati as the new pope. In an ironic twist, through a loophole in the papal election process known as election by acclamation, two popes were chosen: Ventresca, by all the cardinals cheering his name before he lights himself on fire, and Mortati, through normal means. Langdon and Vittoria retire to the Hotel Bernini. Lieutenant Chartrand delivers a letter and package to Langdon from the new Pope. The package contains the Illuminati Diamond brand, which is loaned indefinitely to Langdon. Characters * Robert Langdon - A professor of symbology at Harvard University and the main protagonist of the novel. He is flown to CERN to help investigate the murder of Leonardo Vetra. He is described as wearing chinos, a turtleneck, and a tweed jacket. His name is a tribute to John Langdon. * Leonardo Vetra - A scientist working at CERN and a priest. He is researching antimatter when he is murdered by the Hassassin. He is also the adoptive father of Vittoria. * Vittoria Vetra - The adopted daughter of Vetra. She, like her father, works at CERN. Her research focuses on biology and physics. The reader learns early in the novel that Vittoria worked with her father on their antimatter research. * Camerlengo Carlo Ventresca - The Camerlengo (Papal Chamberlain) during the conclave. He murdered the pope, who is later revealed to have been his father. In dealing with the Hassassin, he poses as Janus, named after the Roman god of beginnings and ends. * Cardinal Saverio Mortati - The most senior cardinal in the conclave, and the current Dean of the College of Cardinals. He was the Devil's Advocate for the late pope. * Commander Olivetti - The commandant of the Swiss Guard. 
He is initially skeptical of the claims of Langdon and Vittoria until he talks with the Hassassin. He, along with other Swiss Guards, searches desperately for the missing antimatter hidden somewhere in the Vatican. He is killed by the Hassassin at the church of Santa Maria della Vittoria. * Captain Rocher - The second in command after Commander Olivetti. He is contacted by Max Kohler, who shares his knowledge of the real cause of the events. He is killed by Lt. Chartrand, who was under the impression that Rocher was an Illuminatus. * Hassassin - The killer hired by Janus, the Camerlengo in disguise, to carry out his plans. He is of Middle Eastern origin and displays his sadistic lust for women throughout the novel. He murders Leonardo Vetra, the Preferiti, and Commander Olivetti. He dies after being pushed from a balcony by Langdon at the Castel Sant'Angelo and breaking his back on a pile of cannonballs below. * Maximilian Kohler - The director of CERN. He is feared at CERN despite his paralysis. His wheelchair contains electronic gadgets such as a computer, telephone, pager, video camera, and a gun. He contacts Langdon to help him find the killer of his friend, Leonardo Vetra. He believes religion prevented him from leading the life he could have led, and he became a scientist in rebellion against religion. * Gunther Glick and Chinita Macri - A reporter and camerawoman for the BBC. They are contacted by the Hassassin regarding the events happening in the Vatican. Glick has a notorious reputation as a sensationalist and conspiracy-theorist journalist. Macri, meanwhile, is a veteran camerawoman and a foil to Glick. They have the first-hand account of the events in the novel, from the beginning of the conclave to the election of Mortati as pope. * Lieutenant Chartrand - A young Swiss Guard. He, together with Commander Olivetti and Captain Rocher, searches desperately for the antimatter hidden somewhere in the Vatican. He shoots and kills Captain Rocher after mistaking him for an Illuminatus. 
At the end of the novel, he is sent by the new pope to give the Illuminati Diamond to Langdon as an indefinite loan. * Cardinal Ebner - One of the four Preferiti and a cardinal from Frankfurt, Germany. He is suffocated with dirt and soil forced into his mouth. * Cardinal Lamassé - One of the four Preferiti and a cardinal from Paris, France. He is killed by having his lungs punctured. * Cardinal Guidera - One of the four Preferiti and a cardinal from Barcelona, Spain. He is hanged and burned alive. * Cardinal Baggia - One of the four Preferiti, a cardinal from Milan, Italy, and the favorite to succeed as the new pope. He is drowned. Inaccuracies The book's first edition contained numerous errors regarding the locations of places in Rome, as well as incorrect uses of the Italian language. Some of the language issues were corrected in later editions.[1] Aside from the explicit introduction, the book depicts various fictional experts explaining matters of science, technology, and history in which critics have pointed out errors. One example is the antimatter discussion, wherein the book suggests that antimatter can be produced in useful and practical quantities and will be a limitless source of power. CERN has refuted this, noting that antimatter cannot be used as an energy source because it is artificial, and creating it takes more energy than it produces.[2] Another mistake made in the book is the claim that CERN is the organization that invented the Internet. 
In fact, Tim Berners-Lee and a small team at CERN invented the hypertext transfer protocol, which led to the World Wide Web, not the Internet, which was engineered in the United States by DARPA.[2] Fact and fiction behind the book For more information on these elements of the book, refer to the following articles: * Illuminati, a secret brotherhood at the heart of the book's plot * Lockheed Martin X-33, an aircraft described early in the book * CERN, a research laboratory * Freemasonry, a fraternal organization into which the Illuminati supposedly merged * Great Seal of the United States, background on the symbol included on the U.S. one-dollar bill, discussed in chapter 31 of the book Altars of Science The book fictionalizes a story about the Altars of Science in Rome, consisting of four locations, each representing one of the four elements (earth, air, fire and water), which are believed to form "the Path of Illumination", a trail to the meeting place of the Illuminati in Rome. According to the book, the "altars" were hidden as religious artwork in order to avoid the wrath of the Vatican and secure the secrecy of the Illuminati. The artworks that make up the four altars were all sculpted by Gian Lorenzo Bernini. Although the book is not clear about where exactly the meeting place was, it is stated to be within the famed Castel Sant'Angelo. The book lists the artworks as: * Earth — Habakkuk and the Angel in the Chigi Chapel of Santa Maria del Popolo * Air — West Ponente at Saint Peter's Square * Fire — The Ecstasy of St Teresa sculpture at the church of Santa Maria della Vittoria * Water — the famous Fountain of the Four Rivers at Piazza Navona Ambigrams Main article: Ambigram The book contains several ambigrams created by John Langdon.[3] Besides "Angels And Demons & The Illuminati", the title of the book is also presented as an ambigram on the hardcover book jacket, and on the inside cover of the paperback versions. 
The words 'Fire', 'Water', 'Earth' and 'Air' all appear in the book as ambigrams as well, and the 'Illuminati Diamond' mentioned in the book is also an ambigram: the four elements arranged in the shape of a diamond. Footnotes 1. ^ Gialli & Thriller: ANGELI E DEMONI di Dan Brown 2. ^ a b "CERN - Spotlight: Angels and Demons". CERN - European Organization for Nuclear Research. Archived from the original on 2008-01-29. http://web.archive.org/web/20080129234743/http://public.web.cern.ch/Public/en/Spotlight/SpotlightAandD-en.html. Retrieved on 2008-09-11. 3. ^ www.johnlangdon.net Official website of John Langdon, section "Angels & Demons" (retrieved 2007-01-30) References * Burstein, Dan (ed). Secrets of Angels & Demons: The Unauthorized Guide to the Bestselling Novel. CDS Books, 2004. ISBN 1-59315-140-3. A collection of essays by historians and other experts discussing the fact and fiction of the novel. * The Official Angels and Demons Tour in Rome * "Angels and Demons Draws Tourists to Rome", January 20, 2005, NPR * CERN's own page about fact and fiction in the novel * Angels and Demons movie news site * Tour in Rome that separates fact from fiction * Path of Illumination (with photos of the places of Angels & Demons) * Tour of Rome visiting the landmarks mentioned in the book * Dan Brown's own page * Antimatter: The Ultimate Mirror. ISBN 978-0521893091. External links * Angels & Demons Novel * John Langdon official website, "Angels & Demons" ambigrams * Vatican bans filming the movie in churches * Articles and term paper about Adam Weishaupt and the historic Illuminati (English/German)



Wikipedia is a free,[5] web-based and collaborative multilingual encyclopedia project supported by the non-profit Wikimedia Foundation. Its name is a portmanteau of the words wiki (a technology for creating collaborative websites, from the Hawaiian word wiki, meaning "quick") and encyclopedia. Wikipedia's 13 million articles (3 million in English) have been written collaboratively by volunteers around the world, and almost all of its articles can be edited by anyone who can access the Wikipedia website.[6] Launched in 2001 by Jimmy Wales and Larry Sanger,[7] it is currently the largest and most popular general reference work on the Internet.[3][8][9][10] Critics of Wikipedia accuse it of systemic bias and inconsistencies (including undue weight given to popular culture),[11] and allege that it favors consensus over credentials in its editorial process.[12] Wikipedia's reliability and accuracy are also an issue.[13] Other criticisms center on its susceptibility to vandalism and the addition of spurious or unverified information,[14] though scholarly work suggests that vandalism is generally short-lived.[15][16] Wikipedia's departure from the expert-driven style of traditional encyclopedia building and its large volume of unacademic content have been noted several times. When Time magazine recognized You as its Person of the Year for 2006, acknowledging the accelerating success of online collaboration and interaction by millions of users around the world, it cited Wikipedia as one of three examples of Web 2.0 services, along with YouTube and MySpace.[17] Some have noted the importance of Wikipedia not only as an encyclopedic reference but also as a frequently updated news resource, because of how quickly articles about recent events appear.[18][19]

History

Main article: History of Wikipedia

Wikipedia originally developed from another encyclopedia project, Nupedia.

Wikipedia began as a complementary project for Nupedia, a free online English-language encyclopedia project whose articles were written by experts and reviewed under a formal process. Nupedia was founded on March 9, 2000, under the ownership of Bomis, Inc., a web portal company. Its main figures were Jimmy Wales, Bomis CEO, and Larry Sanger, editor-in-chief for Nupedia and later Wikipedia. Nupedia was licensed initially under its own Nupedia Open Content License, switching to the GNU Free Documentation License before Wikipedia's founding at the urging of Richard Stallman.[20] Larry Sanger and Jimmy Wales are the founders of Wikipedia.[21][22] While Wales is credited with defining the goal of making a publicly editable encyclopedia,[23][24] Sanger is usually credited with the strategy of using a wiki to reach that goal.[25] On January 10, 2001, Larry Sanger proposed on the Nupedia mailing list to create a wiki as a "feeder" project for Nupedia.[26] Wikipedia was formally launched on January 15, 2001, as a single English-language edition at www.wikipedia.com,[27] and announced by Sanger on the Nupedia mailing list.[23] Wikipedia's policy of "neutral point of view"[28] was codified in its initial months, and was similar to Nupedia's earlier "nonbiased" policy. Otherwise, there were relatively few rules initially and Wikipedia operated independently of Nupedia.[23]

Wikipedia gained early contributors from Nupedia, Slashdot postings, and web search engine indexing. It grew to approximately 20,000 articles and 18 language editions by the end of 2001. By late 2002 it had reached 26 language editions, 46 by the end of 2003, and 161 by the final days of 2004.[29] Nupedia and Wikipedia coexisted until the former's servers were taken down permanently in 2003, and its text was incorporated into Wikipedia. The English Wikipedia passed the 2 million-article mark on September 9, 2007, making it the largest encyclopedia ever assembled, eclipsing even the Yongle Encyclopedia (1407), which had held the record for exactly 600 years.[30] Citing fears of commercial advertising and lack of control in a perceived English-centric Wikipedia, users of the Spanish Wikipedia forked from Wikipedia to create the Enciclopedia Libre in February 2002.[31] Later that year, Wales announced that Wikipedia would not display advertisements, and its website was moved to wikipedia.org.[32] Various other projects have since forked from Wikipedia for editorial reasons. Wikinfo does not require a neutral point of view and allows original research. New Wikipedia-inspired projects - such as Citizendium, Scholarpedia, Conservapedia, and Google's Knol[33] - have been started to address perceived limitations of Wikipedia, such as its policies on peer review, original research, and commercial advertising. Though the English Wikipedia reached 3 million articles in August 2009, the growth of the edition, in terms of the numbers of articles and of contributors, appears to have abruptly flattened around spring 2007.[34] In July 2007 about 2,200 articles were added daily to the encyclopedia; today that average is 1,300. A team led by Ed H. Chi at the Palo Alto Research Center speculated that this is due to the increasing exclusiveness of the project. New or occasional editors have significantly higher rates of their edits reverted than an "elite" group of regular editors. 
This makes it difficult for the project to recruit and retain new contributors, resulting in the stagnation in article creation.

In April 2009, the Wikimedia Foundation conducted a Wikipedia usability study, questioning users about the editing mechanism.[35]

In a departure from the style of traditional encyclopedias such as Encyclopædia Britannica, Wikipedia employs an open editing model called "wiki". Except for a few vandalism-prone pages that can be edited only by established users, or in extreme cases only by administrators, every article may be edited anonymously or with a user account, while only registered users may create a new article (in the English edition only). No article is owned by its creator or any other editor, nor is it vetted by any recognized authority; rather, articles are collectively owned by a community of editors.[36] Most importantly, when changes to an article are made, they become available immediately, before undergoing any review, whether or not they contain errors, are misguided, or are even patent nonsense. The German edition of Wikipedia is an exception to this rule: it has been testing a system of maintaining "stable versions" of articles,[37] to allow a reader to see versions of articles that have passed certain reviews. Other language editions have not reached a consensus to implement this "flagged revisions" proposal.[38][39] Another proposal is the use of software to create "trust ratings" for individual Wikipedia contributors and to use those ratings to determine which changes will be made visible immediately.[40] Editors keep track of changes to articles by checking the difference between two revisions of a page.

Contributors, registered or not, can take advantage of features available in the software that powers Wikipedia. The "History" page attached to each article records every past revision of the article, though a revision with libelous content, criminal threats or copyright infringements may be removed afterwards.[41][42] This feature makes it easy to compare old and new versions, undo changes that an editor considers undesirable, or restore lost content. The "Discussion" pages associated with each article are used to coordinate work among multiple editors.[43] Regular contributors often maintain a "watchlist" of articles of interest to them, so that they can easily keep tabs on all recent changes to those articles. Computer programs called Internet bots have been used widely to remove vandalism as soon as it is made,[16] to correct common misspellings and stylistic issues, or to start articles such as geography entries in a standard format from statistical data. Articles in Wikipedia are organized roughly in three ways: according to their development status, their subject matter, and the access levels required for edits. The most developed articles are designated "featured articles"; these are the ones that may someday be featured on the main page of Wikipedia.[44][45] Researcher Giacomo Poderi found that articles tend to reach featured status through the intensive work of a few editors, and that categories such as history, media, music and warfare have a higher ratio of featured articles than categories such as computing, mathematics, language & linguistics and philosophy & psychology, casting doubt on the equation "more edits equal higher quality." In 2007, in preparation for producing a print version, the English-language Wikipedia introduced an assessment scale against which the quality of articles is judged;[46] other editions have also adopted this. 
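The revision comparison described above, in which editors check the difference between two revisions of a page, can be illustrated with a minimal sketch using Python's standard difflib module. The two revision texts here are invented for illustration and are not taken from any real article history.

```python
import difflib

# Two hypothetical revisions of the same article paragraph.
old_revision = [
    "Wikipedia is a free encyclopedia.\n",
    "It was launched in 2001.\n",
]
new_revision = [
    "Wikipedia is a free, collaborative encyclopedia.\n",
    "It was launched in 2001.\n",
]

# unified_diff produces the familiar "-removed / +added" view that a
# history page's revision comparison is built on: changed lines are
# marked, unchanged lines are shown for context.
diff = list(difflib.unified_diff(old_revision, new_revision,
                                 fromfile="revision 1", tofile="revision 2"))
print("".join(diff))
```

The same pairwise comparison also underlies "undo": reverting a change amounts to re-applying the older of the two revisions.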
In 2008, two researchers theorized that the growth of Wikipedia is sustainable.[47] See also: Reliability of Wikipedia, Criticism of Wikipedia, Academic studies of Wikipedia. The open nature of the editing model has been central to most criticism of Wikipedia. For example, at any point, a reader of an article cannot be certain whether or not the article she is reading has been vandalized. Former Encyclopaedia Britannica editor-in-chief Robert McHenry once described this by saying:[48] "The user who visits Wikipedia to learn about some subject, to confirm some matter of fact, is rather in the position of a visitor to a public restroom. It may be obviously dirty, so that he knows to exercise great care, or it may seem fairly clean, so that he may be lulled into a false sense of security. What he certainly does not know is who has used the facilities before him." McHenry also popularized the claim that Wikipedia is a "faith-based encyclopedia."[citation needed] Critics argue that non-expert editing undermines quality. Because contributors usually rewrite small portions of an entry rather than making full-length revisions, high- and low-quality content may be intermingled within an entry. Historian Roy Rosenzweig noted: "Overall, writing is the Achilles' heel of Wikipedia. Committees rarely write well, and Wikipedia entries often have a choppy quality that results from the stringing together of sentences or paragraphs written by different people."[49] All of this has led to questions about the reliability of Wikipedia as a source of accurate information. John Seigenthaler has described Wikipedia as "a flawed and irresponsible research tool."[50]

As a consequence of the open structure, Wikipedia "makes no guarantee of validity" of its content, since no one is ultimately responsible for any claims appearing in it.[51] Concerns have been raised regarding the lack of accountability that results from users' anonymity,[52] the insertion of spurious information, vandalism, and similar problems. In one particularly well-publicized incident, false information was introduced into the biography of American political figure John Seigenthaler and remained undetected for four months.[50] John Seigenthaler, the founding editorial director of USA Today and founder of the Freedom Forum First Amendment Center at Vanderbilt University, called Jimmy Wales and asked him, "...Do you ...have any way to know who wrote that?" "No, we don't", said Wales.[53] Some critics claim that Wikipedia's open structure makes it an easy target for Internet trolls, spammers, and those with an agenda to push.[41][54] The addition of political spin to articles by organizations including members of the U.S. House of Representatives and special interest groups[14] has been noted,[55] and organizations such as Microsoft have offered financial incentives to work on certain articles.[56] These issues have been parodied, notably by Stephen Colbert on The Colbert Report.[57] See also: Notability in Wikipedia. As an encyclopedia-building project, Wikipedia seeks to create a summary of all human knowledge: all topics covered by a conventional print encyclopedia plus any other "notable" (therefore verifiable by published sources) topics, which are permitted by its effectively unlimited disk space.[58] In particular, it contains material that some people, including Wikipedia editors,[59] may find objectionable, offensive, or pornographic.[60] It has been made clear that this policy is not up for debate, and the policy has sometimes proved controversial. 
For instance, in 2008, Wikipedia rejected an online petition against the inclusion of depictions of Muhammad in its English edition, citing this policy. The presence of politically sensitive material in Wikipedia has also led the People's Republic of China to block access to parts of the site.[61] (See also: IWF block of Wikipedia) Content in Wikipedia is subject to the laws (in particular copyright law) of Florida, where Wikipedia's servers are hosted, and to several editorial policies and guidelines that are intended to reinforce the notion that Wikipedia is an encyclopedia. Each entry in Wikipedia must be about a topic that is encyclopedic and thus worthy of inclusion. A topic is deemed encyclopedic if it is "notable"[62] in the Wikipedia jargon; i.e., if it has received significant coverage in reliable secondary sources (i.e., mainstream media or major academic journals) that are independent of the subject of the topic. Second, Wikipedia must expose only knowledge that is already established and recognized.[63] In other words, it must not present, for instance, new information or original works. A claim that is likely to be challenged requires a reference to reliable sources. Within the Wikipedia community, this is often phrased as "verifiability, not truth", expressing the idea that readers are left to check for themselves the truthfulness of what appears in the articles and to make their own interpretations.[64] Finally, Wikipedia does not take sides.[65] All opinions and viewpoints, if attributable to external sources, must enjoy an appropriate share of coverage within an article.[66] Wikipedia editors as a community write and revise these policies and guidelines[67] and enforce them by deleting, annotating with tags, or modifying article material that fails to meet them. 
(See also deletionism and inclusionism)[68][69] However, Wikipedia has been accused of exhibiting systemic bias and inconsistency;[13] critics argue that Wikipedia's open nature and a lack of proper sources for much of the information make it unreliable.[70] Some commentators suggest that Wikipedia is generally reliable, but that the reliability of any given article is not always clear.[12] Editors of traditional reference works such as the Encyclopædia Britannica have questioned the project's utility and status as an encyclopedia.[71] Many university lecturers discourage students from citing any encyclopedia in academic work, preferring primary sources;[72] some specifically prohibit Wikipedia citations.[73] Co-founder Jimmy Wales stresses that encyclopedias of any type are not usually appropriate as primary sources, and should not be relied upon as authoritative.[74] Andrew Lih, author of the 2009 book The Wikipedia Revolution, notes: "A wiki has all its activities happening in the open for inspection... Trust is built by observing the actions of others in the community and discovering people with like or complementary interests."[75] Economist Tyler Cowen writes, "If I had to guess whether Wikipedia or the median refereed journal article on economics was more likely to be true, after a not so long think I would opt for Wikipedia." He comments that many traditional sources of non-fiction suffer from systemic biases: novel results are over-reported in journal articles, and relevant information is omitted from news reports. 
However, he also cautions that errors are frequently found on Internet sites, and that academics and experts must be vigilant in correcting them.[76] In February 2007, an article in The Harvard Crimson newspaper reported that some professors at Harvard University include Wikipedia in their syllabi, but that there is a split in their perception of using Wikipedia.[77] In June 2007, former president of the American Library Association Michael Gorman condemned Wikipedia, along with Google,[78] stating that academics who endorse the use of Wikipedia are "the intellectual equivalent of a dietitian who recommends a steady diet of Big Macs with everything". He also said that "a generation of intellectual sluggards incapable of moving beyond the Internet" was being produced at universities. He complained that web-based sources are discouraging students from learning from rarer texts that are found only on paper or on subscription-only web sites. In the same article Jenny Fry (a research fellow at the Oxford Internet Institute) commented on academics who cite Wikipedia, saying: "You cannot say children are intellectually lazy because they are using the Internet when academics are using search engines in their research. The difference is that they have more experience of being critical about what is retrieved and whether it is authoritative. 
Children need to be told how to use the Internet in a critical and appropriate way."[78] The Wikipedia community has established "a bureaucracy of sorts", including "a clear power structure that gives volunteer administrators the authority to exercise editorial control."[79][80][81] Wikipedia's community has also been described as "cult-like",[82] although not always with entirely negative connotations,[83] and criticized for failing to accommodate inexperienced users.[84] Editors in good standing in the community can run for one of many levels of volunteer stewardship; this begins with "administrator",[85][86] a group of privileged users who have the ability to delete pages, lock articles from being changed in case of vandalism or editorial disputes, and block users from editing. Despite the name, administrators do not enjoy any special privilege in decision-making; instead, they are mostly limited to making edits that have project-wide effects and are thus unavailable to ordinary editors, and to banning users who make disruptive edits (such as vandalism).[87]

As Wikipedia grows with an unconventional model of encyclopedia building, "Who writes Wikipedia?" has become one of the questions frequently asked about the project, often with a reference to other Web 2.0 projects such as Digg.[88] Jimmy Wales once argued that only "a community ... a dedicated group of a few hundred volunteers" makes the bulk of contributions to Wikipedia and that the project is therefore "much like any traditional organization". Wales performed a study finding that over 50% of all edits are made by just 0.7% of users (at the time: 524 people). This method of evaluating contributions was later disputed by Aaron Swartz, who noted that several articles he sampled had large portions of their content (measured by number of characters) contributed by users with low edit counts.[89] A 2007 study by researchers from Dartmouth College found that "anonymous and infrequent contributors to Wikipedia ... are as reliable a source of knowledge as those contributors who register with the site."[90] Although some contributors are authorities in their field, Wikipedia requires that even their contributions be supported by published and verifiable sources. The project's preference for consensus over credentials has been labeled "anti-elitism".[11] In August 2007, WikiScanner, a website developed by Virgil Griffith, began to trace the sources of changes made to Wikipedia by anonymous editors without Wikipedia accounts. The program revealed that many such edits were made by corporations or government agencies changing the content of articles related to them, their personnel or their work.[91] In a 2003 study of Wikipedia as a community, economics Ph.D. 
student Andrea Ciffolilli argued that the low transaction costs of participating in wiki software create a catalyst for collaborative development, and that a "creative construction" approach encourages participation.[92] In his 2008 book, The Future of the Internet and How to Stop It, Jonathan Zittrain of the Oxford Internet Institute and Harvard Law School's Berkman Center for Internet & Society cites Wikipedia's success as a case study in how open collaboration has fostered innovation on the web.[93] A 2008 study found that Wikipedia users were less agreeable and open, though more conscientious, than non-Wikipedia users.[94][95] A 2009 study suggested there was "evidence of growing resistance from the Wikipedia community to new content."[96] The Wikipedia Signpost is the community newspaper of the English Wikipedia,[97] and was founded by Michael Snow, an administrator and the current chair of the Wikimedia Foundation board of trustees.[98] It covers news and events from the site, as well as major events from sister projects, such as Wikimedia Commons.[99]

Wikipedia is hosted and funded by the Wikimedia Foundation, a non-profit organization which also operates Wikipedia-related projects such as Wikibooks. The Wikimedia chapters, local associations of Wikipedia users, also participate in the promotion, development, and funding of the project. The operation of Wikipedia depends on MediaWiki, a custom-made, free and open source wiki software platform written in PHP and built upon a MySQL database.[100] The software incorporates programming features such as a macro language, variables, a transclusion system for templates, and URL redirection. MediaWiki is licensed under the GNU General Public License and is used by all Wikimedia projects, as well as many other wiki projects. Originally, Wikipedia ran on UseModWiki, written in Perl by Clifford Adams (Phase I), which initially required CamelCase for article hyperlinks; the present double-bracket style was incorporated later. Starting in January 2002 (Phase II), Wikipedia began running on a PHP wiki engine with a MySQL database; this software was custom-made for Wikipedia by Magnus Manske. The Phase II software was repeatedly modified to accommodate the exponentially increasing demand. In July 2002 (Phase III), Wikipedia shifted to the third-generation software, MediaWiki, originally written by Lee Daniel Crocker. Several MediaWiki extensions are installed[101] to extend the functionality of the MediaWiki software. In April 2005 a Lucene extension[102][103] was added to MediaWiki's built-in search, and Wikipedia switched from MySQL to Lucene for searching. Currently Lucene Search 2,[104] which is written in Java and based on Lucene library 2.0,[105] is used.

Wikipedia currently runs on dedicated clusters of Linux servers (mainly Ubuntu),[106][107] with a few OpenSolaris machines for ZFS. As of February 2008, there were 300 servers in Florida, 26 in Amsterdam, and 23 in Yahoo!'s Korean hosting facility in Seoul.[108] Wikipedia employed a single server until 2004, when the server setup was expanded into a distributed multitier architecture. In January 2005, the project ran on 39 dedicated servers located in Florida. This configuration included a single master database server running MySQL, multiple slave database servers, 21 web servers running the Apache HTTP Server, and seven Squid cache servers. Wikipedia receives between 25,000 and 60,000 page requests per second, depending on the time of day.[109] Page requests are first passed to a front-end layer of Squid caching servers.[110] Requests that cannot be served from the Squid cache are sent to load-balancing servers running the Linux Virtual Server software, which in turn pass the request to one of the Apache web servers for page rendering from the database. The web servers deliver pages as requested, performing page rendering for all the language editions of Wikipedia. To increase speed further, rendered pages are cached in a distributed memory cache until invalidated, allowing page rendering to be skipped entirely for the most common page accesses. Two larger clusters in the Netherlands and Korea now handle much of Wikipedia's traffic load. Originally, users read and edited Wikipedia content using any standard web browser over a fixed internet connection. However, Wikipedia content is now also accessible through offline media and the mobile web. Access from mobile phones was possible as early as 2004, through the Wireless Application Protocol (WAP) via the Wapedia service. In June 2007, Wikipedia launched en.mobile.wikipedia.org, an official website for wireless devices.
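Stepping back to the request path described above (front-end Squid caches first, with page rendering from the database only on a miss), the caching logic can be sketched as a toy model. The class and its names are hypothetical stand-ins, not Wikipedia's actual configuration:

```python
# Toy model of a multi-tier request path: check the cache first,
# render (and cache the result) only on a miss.
class RequestPath:
    def __init__(self):
        self.cache = {}    # stands in for the Squid caching layer
        self.renders = 0   # counts requests reaching the Apache/DB layer

    def render(self, title):
        # Stand-in for page rendering from the database.
        self.renders += 1
        return f"<html>{title}</html>"

    def get(self, title):
        if title not in self.cache:            # cache miss
            self.cache[title] = self.render(title)
        return self.cache[title]               # cache hit skips rendering

site = RequestPath()
site.get("Main Page")
site.get("Main Page")   # served from cache; no second render
print(site.renders)     # 1
```

The real system adds an invalidation step (an edit must evict the stale rendered page), which is why the text says pages are cached "until invalidated".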
In 2009 a newer mobile service was officially released,[111] located at en.m.wikipedia.org, which caters to more advanced mobile devices such as the iPhone, Android-based devices, or the Palm Pre. Several other methods of mobile access to Wikipedia have emerged (see Wikipedia:Mobile access). Several devices and applications optimise or enhance the display of Wikipedia content for mobile devices, while some also incorporate additional features such as use of Wikipedia metadata (see Wikipedia:Metadata), such as geoinformation.[112] Collections of Wikipedia articles have been published on optical discs. An English version, 2006 Wikipedia CD Selection, contained about 2,000 articles.[113][114] The Polish version contains nearly 240,000 articles.[115] There are also German versions.[116] All text in Wikipedia was covered by the GNU Free Documentation License (GFDL), a copyleft license permitting the redistribution, creation of derivative works, and commercial use of content while authors retain copyright of their work,[117] until June 2009, when the site switched to the Creative Commons Attribution-ShareAlike (CC-BY-SA) 3.0 license.[118] Wikipedia had been working on the switch to Creative Commons licenses because the GFDL, initially designed for software manuals, is not suitable for online reference works, and because the two licenses were incompatible.[119] In response to the Wikimedia Foundation's request, in November 2008 the Free Software Foundation (FSF) released a new version of the GFDL designed specifically to allow Wikipedia to relicense its content under CC-BY-SA by August 1, 2009.
Wikipedia and its sister projects held a community-wide referendum to decide whether or not to make the license switch.[120] The referendum took place from April 9 to 30.[121] The results were 75.8% "Yes", 10.5% "No", and 13.7% "No opinion".[122] As a result of the referendum, the Wikimedia Board of Trustees voted to change to the Creative Commons license, effective June 15, 2009.[122] The position that Wikipedia is merely a hosting service has been successfully used as a defense in court.[123][124] As of July 2007, less than 23% of Wikipedia articles were in English.

The handling of media files (e.g., image files) varies across language editions. Some language editions, such as the English Wikipedia, include non-free image files under the fair use doctrine, while others have opted not to. This is in part because of differences in copyright law between countries; for example, the notion of fair use does not exist in Japanese copyright law. Media files covered by free content licenses (e.g., Creative Commons CC-BY-SA) are shared across language editions via the Wikimedia Commons repository, a project operated by the Wikimedia Foundation. There are currently 262 language editions of Wikipedia; of these, 24 have over 100,000 articles and 81 have over 1,000 articles.[1] According to Alexa, the English subdomain (en.wikipedia.org; English Wikipedia) receives approximately 52% of Wikipedia's cumulative traffic, with the remainder split among the other languages (Spanish: 19%, French: 5%, Polish: 3%, German: 3%, Japanese: 3%, Portuguese: 2%).[3] As of July 2008, the five largest language editions are (in order of article count) the English, German, French, Polish, and Japanese Wikipedias.[125] Since Wikipedia is web-based and therefore worldwide, contributors to the same language edition may use different dialects or may come from different countries (as is the case for the English edition). These differences may lead to conflicts over spelling differences (e.g. color vs. colour)[126] or points of view.[127] Though the various language editions are held to global policies such as "neutral point of view," they diverge on some points of policy and practice, most notably on whether images that are not licensed freely may be used under a claim of fair use.[128][129][130]
Contributors for English Wikipedia by country as of September 2006.[131]

Jimmy Wales has described Wikipedia as "an effort to create and distribute a free encyclopedia of the highest possible quality to every single person on the planet in their own language".[132] Though each language edition functions more or less independently, some efforts are made to supervise them all. They are coordinated in part by Meta-Wiki, the Wikimedia Foundation's wiki devoted to maintaining all of its projects (Wikipedia and others).[133] For instance, Meta-Wiki provides important statistics on all language editions of Wikipedia,[134] and it maintains a list of articles every Wikipedia should have.[135] The list covers basic content by subject: biography, history, geography, society, culture, science, technology, foodstuffs, and mathematics. Beyond that list, it is not rare for articles strongly related to a particular language to have no counterpart in another edition. For example, articles about small towns in the United States might be available only in English. Translated articles represent only a small portion of articles in most editions, in part because automated translation of articles is disallowed.[136] Articles available in more than one language may offer "InterWiki" links, which link to the counterpart articles in other editions.

Wikipedia shown in Weird Al's music video for his song "White & Nerdy".

Main article: Wikipedia in culture

In addition to logistic growth in the number of its articles,[137] Wikipedia has steadily gained status as a general reference website since its inception in 2001.[138] According to Alexa and comScore, Wikipedia is among the ten most visited websites worldwide.[10][139] Of the top ten, Wikipedia is the only non-profit website. The growth of Wikipedia has been fueled by its dominant position in Google search results;[140] about 50% of search engine traffic to Wikipedia comes from Google,[141] a good portion of which is related to academic research.[142] In April 2007, the Pew Internet and American Life Project found that one third of US Internet users consulted Wikipedia.[143] In October 2006, the site was estimated to have a hypothetical market value of $580 million if it ran advertisements.[144] Wikipedia's content has also been used in academic studies, books, conferences, and court cases.[145][146][147] The Parliament of Canada's website refers to Wikipedia's article on same-sex marriage in the "related links" section of its "further reading" list for the Civil Marriage Act.[148] The encyclopedia's assertions are increasingly used as a source by organizations such as the U.S. Federal Courts and the World Intellectual Property Organization,[149] though mainly for supporting information rather than information decisive to a case.[150] Content appearing on Wikipedia has also been cited as a source and referenced in some U.S. intelligence agency reports.[151] In December 2008, the scientific journal RNA Biology launched a new section for descriptions of families of RNA molecules and requires authors who contribute to the section to also submit a draft article on the RNA family for publication in Wikipedia.[152]

Wikipedia has also been used as a source in journalism,[153] sometimes without attribution, and several reporters have been dismissed for plagiarizing from Wikipedia.[154][155][156] In July 2007, Wikipedia was the focus of a 30-minute documentary on BBC Radio 4[157] which argued that, with increased usage and awareness, the number of references to Wikipedia in popular culture is such that the term is one of a select band of 21st-century nouns that are so familiar (Google, Facebook, YouTube) that they no longer need explanation and are on a par with such 20th-century terms as Hoovering or Coca-Cola. Many popular culture references parody Wikipedia's openness, with characters vandalizing or modifying the online encyclopedia project's articles. Notably, comedian Stephen Colbert has parodied or referenced Wikipedia on numerous episodes of his show The Colbert Report and coined the related term "wikiality".[57] The site has made an impact on several forms of media. Some media sources satirize Wikipedia's susceptibility to inserted inaccuracies, such as a front-page article in The Onion in July 2006 with the title "Wikipedia Celebrates 750 Years of American Independence".[158] Others draw upon Wikipedia's openness to editing, such as "The Negotiation," an episode of The Office, in which the character Michael Scott says, "Wikipedia is the best thing ever. Anyone in the world can write anything they want about any subject, so you know you are getting the best possible information". Other media sources parody Wikipedia's policies, such as the xkcd strip named "Wikipedian Protester."

Dutch filmmaker IJsbrand van Veelen premiered his 45-minute television documentary The Truth According to Wikipedia in April 2008.[159] Another documentary film about Wikipedia, entitled Truth in Numbers: The Wikipedia Story, is scheduled for a 2009 release. Shot on several continents, the film will cover the history of Wikipedia and feature interviews with Wikipedia editors around the world.[160][161] On September 28, 2007, Italian politician Franco Grillini raised a parliamentary question with the Minister of Cultural Resources and Activities about the necessity of freedom of panorama. He said that the lack of such freedom forced Wikipedia, "the seventh most consulted website", to forbid all images of modern Italian buildings and art, and claimed this was hugely damaging to tourist revenues.[162]

On September 16, 2007, The Washington Post reported that Wikipedia had become a focal point in the 2008 U.S. election campaign, saying, "Type a candidate's name into Google, and among the first results is a Wikipedia page, making those entries arguably as important as any ad in defining a candidate. Already, the presidential entries are being edited, dissected and debated countless times each day."[163] An October 2007 Reuters article, entitled "Wikipedia page the latest status symbol", reported the recent phenomenon of how having a Wikipedia article vindicates one's notability.[164] Wikipedia won two major awards in May 2004.[165] The first was a Golden Nica for Digital Communities of the annual Prix Ars Electronica contest; this came with a €10,000 (£6,588; $12,700) grant and an invitation to present at the PAE Cyberarts Festival in Austria later that year. The second was a Judges' Webby Award in the "community" category.[166] Wikipedia was also nominated for a "Best Practices" Webby. On January 26, 2007, Wikipedia was also awarded the fourth highest brand ranking by the readers of brandchannel.com, receiving 15% of the votes in answer to the question "Which brand had the most impact on our lives in 2006?"[167] In September 2008, Wikipedia received the Quadriga "A Mission of Enlightenment" award of Werkstatt Deutschland along with Boris Tadić, Eckart Höfling, and Peter Gabriel. The award was presented to Jimmy Wales by David Weinberger.[168] In July 2009, BBC Radio 4 broadcast a comedy series called Bigipedia, set on a website that parodied Wikipedia. Some of the sketches were directly inspired by Wikipedia and its articles.[169]

A number of interactive multimedia encyclopedias incorporating entries written by the public existed long before Wikipedia was founded. The first of these was the 1986 BBC Domesday Project, which included text (entered on BBC Micro computers) and photographs from over 1 million contributors in the UK, and covered the geography, art, and culture of the UK. This was the first interactive multimedia encyclopedia (and also the first major multimedia document connected through internal links), with the majority of articles accessible through an interactive map of the UK. The user interface and part of the content of the Domesday Project have since been emulated on a website.[170] One of the most successful early online encyclopedias incorporating entries by the public was h2g2, which was created by Douglas Adams and is run by the BBC. The h2g2 encyclopedia was relatively light-hearted, focusing on articles that were both witty and informative. Both of these projects had similarities with Wikipedia, but neither gave full editorial freedom to public users. A similar non-wiki project, the GNUPedia project, co-existed with Nupedia early in its history; however, it has been retired and its creator, free software figure Richard Stallman, has lent his support to Wikipedia.[20] Wikipedia has also spawned several sister projects, which are also run by the Wikimedia Foundation. The first, "In Memoriam: September 11 Wiki",[171] created in October 2002,[172] detailed the September 11 attacks; this project was closed in October 2006. Wiktionary, a dictionary project, was launched in December 2002;[173] Wikiquote, a collection of quotations, followed a week after Wikimedia launched, along with Wikibooks, a collection of collaboratively written free books.
Wikimedia has since started a number of other projects, including Wikiversity, a project for the creation of free learning materials and the provision of online learning activities.[174] None of these sister projects, however, has matched the success of Wikipedia. Some subsets of Wikipedia's information have been developed, often with additional review, for specific purposes. For example, the Wikipedia series of CDs/DVDs produced by Wikipedians and SOS Children (aka "Wikipedia for Schools") is a free, hand-checked, non-commercial selection from Wikipedia, targeted around the UK National Curriculum and intended to be useful for much of the English-speaking world. Wikipedia for Schools is available online; an equivalent print encyclopedia would require roughly twenty volumes. There has also been an attempt to put a select subset of Wikipedia's articles into printed book form.[175] Other websites centered on collaborative knowledge base development have drawn inspiration from, or inspired, Wikipedia. Some, such as Susning.nu, Enciclopedia Libre, and WikiZnanie, likewise employ no formal review process, whereas others, such as Encyclopedia of Life, Stanford Encyclopedia of Philosophy, Scholarpedia, h2g2, and Everything2, use more traditional peer review. Citizendium, an online encyclopedia, was started by Wikipedia co-founder Larry Sanger in an attempt to create an "expert-friendly" Wikipedia.[176][177][178]


Limitation means a disadvantage or weakness in somebody or something. Hope I helped!

Your answer is incorrect!! The answer is, for example: if you have a table with data, the limitation is the missing data in the table. ♥

Your answer was quite right!!!

Actually, I think she may be right. There are different ways to answer that question, depending on what you need the answer for. For example, a limitation in hairdressing means:

if a client has some sort of condition on their scalp, e.g. dandruff, alopecia, head lice or eczema, then depending on what condition the client has and how bad it is, just by looking the hairdresser will know how far she can take the treatment that the client wants on their hair. So if the client had alopecia, the hairdresser can carry out a service as long as they don't use chemical products on the scalp or hair.

I think that is right!


Nanotechnology advances affect all branches of engineering and science that deal directly with device components ranging in size between 1/10,000,000 millimeter (one ten-millionth of a millimeter) and 1/10,000 millimeter. At these scales, even the most sophisticated microtechnology-based instrumentation is useless. Engineers anticipate that advances in nanotechnology will allow the direct manipulation of molecules in biological samples (e.g., proteins or nucleic acids), paving the way for the development of new materials that have a biological component or that can provide a biological interface. In addition to new tools, nanotechnology programs advance practical understanding of quantum physics. The internalization of quantum concepts is a necessary component of nanotechnology research programs because the laws of classical physics (e.g., classical mechanics or generalized gas laws) do not always apply at the atomic and near-atomic level.

Nanotechnology and quantum physics. Quantum theory and mechanics describe the relationship between energy and matter on the atomic and subatomic scale. At the beginning of the twentieth century, German physicist Max Planck (1858-1947) proposed that atoms absorb or emit electromagnetic radiation in bundles of energy termed quanta. This quantum concept seemed counter-intuitive to well-established Newtonian physics. Advancements associated with quantum mechanics (e.g., the uncertainty principle) also had profound implications for philosophical arguments regarding the limitations of human knowledge. Planck's quantum theory, which also asserted that the energy of light (a photon) was directly proportional to its frequency, proved a powerful concept that accounted for a wide range of physical phenomena.
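The size range quoted at the start of this section is easy to check with a quick unit conversion; this sketch just restates the arithmetic in nanometers:

```python
mm = 1e-3   # meters per millimeter
nm = 1e-9   # meters per nanometer

# 1/10,000,000 mm and 1/10,000 mm, expressed in nanometers.
lower = (mm / 10_000_000) / nm
upper = (mm / 10_000) / nm
print(lower, upper)   # approximately 0.1 and 100.0
```

That is, the stated range runs from 0.1 nm (roughly the size of a single atom) up to 100 nm, matching the 1 to 100 nm scale quoted for nanotechnology later in this article.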
Planck's constant relates the energy of a photon to the frequency of light. Along with the constant for the speed of light, Planck's constant (h = 6.626 x 10−34 joule-seconds) is a fundamental constant of nature. Prior to Planck's work, electromagnetic radiation (light) was thought to travel in waves with an infinite number of available frequencies and wavelengths. Planck's work focused on attempting to explain the limited spectrum of light emitted by hot objects. Danish physicist Niels Bohr (1885-1962) studied Planck's quantum theory of radiation and worked in England with physicists J. J. Thomson (1856-1940) and Ernest Rutherford (1871-1937) to improve their classical models of the atom by incorporating quantum theory. During this time, Bohr developed his model of atomic structure. According to the Bohr model, when an electron is excited by energy it jumps from its ground state to an excited state (i.e., a higher energy orbital). The excited atom can then emit energy only in certain (quantized) amounts as its electrons jump back to lower energy orbits located closer to the nucleus. This excess energy is emitted in quanta of electromagnetic radiation (photons of light) that have exactly the same energy as the difference in energy between the orbits jumped by the electron. The electron quantum leaps between orbits proposed by the Bohr model accounted for Planck's observation that atoms emit or absorb electromagnetic radiation in quanta. Bohr's model also explained many important properties of the photoelectric effect described by Albert Einstein (1879-1955). Einstein assumed that light was transmitted as a stream of particles termed photons. By extending the well-known wave properties of light to include a treatment of light as a stream of photons, Einstein was able to explain the photoelectric effect. Photoelectric properties are key to the regulation of many microtechnology and proposed nanotechnology-level systems.
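Planck's relation between photon energy and frequency (E = hν, or equivalently E = hc/λ) is easy to evaluate numerically. The 656 nm wavelength below (hydrogen's n=3 to n=2 Balmer transition, the kind of orbit jump the Bohr model describes) is an illustrative choice, not a value from the text:

```python
h = 6.626e-34   # Planck's constant, joule-seconds
c = 2.998e8     # speed of light, meters per second

def photon_energy(wavelength_m):
    """Energy of a photon via E = h * nu = h * c / wavelength."""
    frequency = c / wavelength_m   # nu = c / lambda
    return h * frequency           # E = h * nu

# Hydrogen's red Balmer line (electron dropping from n=3 to n=2): ~656 nm.
E = photon_energy(656e-9)
print(E)   # ~3.0e-19 joules, i.e. about 1.9 electron-volts
```

The energy of this single photon exactly equals the difference between the two orbital energies, which is the quantization the Bohr model was built to explain.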
Quantum mechanics ultimately replaced the electron "orbitals" of earlier atomic models with allowable values for angular momentum (angular velocity multiplied by mass) and depicted electron positions in terms of probability "clouds" and regions. In the 1920s, the concept of quantization and its application to physical phenomena was further advanced by more mathematically complex models based on the work of the French physicist Louis Victor de Broglie (1892-1987) and Austrian physicist Erwin Schrödinger (1887-1961) that depicted the particle and wave nature of electrons. De Broglie showed that the electron was not merely a particle but a waveform. This proposal led Schrödinger to publish his wave equation in 1926. Schrödinger's work described electrons as a "standing wave" surrounding the nucleus, and his system of quantum mechanics is called wave mechanics. German physicist Max Born (1882-1970) and English physicist P. A. M. Dirac (1902-1984) made further advances in defining the subatomic particle (principally the electron) as a wave rather than as a particle and in reconciling portions of quantum theory with relativity theory. Working at about the same time, German physicist Werner Heisenberg (1901-1976) formulated the first complete and self-consistent theory of quantum mechanics. Matrix mathematics was well established by the 1920s, and Heisenberg applied this powerful tool to quantum mechanics. In 1927, Heisenberg put forward his uncertainty principle, which states that two complementary properties of a system, such as position and momentum, can never both be known exactly. This proposition helped cement the dual nature of particles (e.g., light can be described as having both wave and particle characteristics).
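De Broglie's matter-wave idea assigns any moving particle a wavelength λ = h/(mv). A quick comparison (the electron speed and baseball figures below are illustrative assumptions, not values from the text) shows why wave behavior matters only at atomic scales:

```python
h = 6.626e-34            # Planck's constant, joule-seconds
m_electron = 9.109e-31   # electron rest mass, kilograms

def de_broglie_wavelength(mass_kg, speed_m_s):
    """Matter wavelength: lambda = h / (m * v)."""
    return h / (mass_kg * speed_m_s)

# Electron at ~1% of light speed: wavelength ~0.24 nm, comparable to
# atomic spacing, so wave effects dominate at the nanoscale.
print(de_broglie_wavelength(m_electron, 3.0e6))

# A 0.15 kg baseball at 40 m/s: ~1e-34 m, utterly negligible,
# which is why classical physics works for everyday objects.
print(de_broglie_wavelength(0.15, 40.0))
```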
Electromagnetic radiation (of which visible light is one region of the spectrum) is now understood to have both particle-like and wave-like properties. In 1925, Austrian-born physicist Wolfgang Pauli (1900-1958) published the Pauli exclusion principle, which states that no two electrons in an atom can simultaneously occupy the same quantum state (i.e., energy state). Pauli's specification of spin (+1/2 or −1/2) on an electron gave the two electrons in any suborbital differing quantum numbers (a system used to describe the quantum state) and made the structure of the periodic table completely understandable in terms of electron configurations (i.e., the energy-related arrangement of electrons in energy shells and suborbitals). In 1931, American chemist Linus Pauling published a paper that used quantum mechanics to explain how two electrons, from two different atoms, are shared to make a covalent bond between the two atoms. Pauling's work provided the connection needed in order to fully apply the new quantum theory to chemical reactions. Advances in nanotechnology depend upon an understanding and application of these fundamental quantum principles. At the quantum level the smoothness of classical physics disappears, and nanotechnologies are predicated on exploiting this quantum roughness.

Applications. The development of devices that are small, light, and self-contained, that use little energy, and that will replace larger microelectronic equipment is one of the first goals of the anticipated nanotechnology revolution. The second phase will be marked by the introduction of materials not feasible at larger-than-nanotechnology scales.
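The electron-configuration picture above can be made concrete by enumerating the quantum states the Pauli exclusion principle allows: for principal quantum number n, counting the allowed (n, l, m_l, m_s) combinations reproduces the familiar shell capacities 2n². A minimal sketch:

```python
from fractions import Fraction

def allowed_states(n):
    """All (n, l, m_l, m_s) combinations for principal quantum number n.

    l runs 0..n-1, m_l runs -l..l, and spin m_s is +1/2 or -1/2;
    the Pauli exclusion principle allows one electron per combination.
    """
    spins = (Fraction(1, 2), Fraction(-1, 2))
    return [(n, l, m_l, m_s)
            for l in range(n)
            for m_l in range(-l, l + 1)
            for m_s in spins]

# Shell capacities follow the 2*n**2 rule: 2, 8, 18, 32, ...
print([len(allowed_states(n)) for n in range(1, 5)])   # [2, 8, 18, 32]
```

Since every tuple is distinct, each shell can hold exactly that many electrons, which is the counting argument behind the structure of the periodic table.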
Given the nature of quantum variance, scientists theorize that single-molecule sensors can be developed and that sophisticated memory storage and neural-like networks can be achieved with a very small number of molecules. Traditional engineering concepts undergo radical transformation at the atomic level. For example, nanotechnology motors may drive gears whose cogs are composed of the atoms attached to a carbon ring. Nanomotors may themselves be driven by oscillating magnetic fields or high-precision oscillating lasers. Perhaps the greatest promise for nanotechnology lies in potential biotechnology advances. Nano-level manipulation of DNA offers the opportunity to radically expand the horizons of genomic medicine and immunology. Tissue-based biosensors may unobtrusively monitor and regulate site-specific medicine delivery or regulate physiological processes. Nanosystems might serve as highly sensitive detectors of toxic substances or be used by inspectors to detect traces of biological or chemical weapons. In electronics and computer science, scientists assert that nanotechnologies will be the next major advance in computing and information processing science. Microelectronic devices rely on recognition of and flips in electron gating (e.g., where differential states are ultimately represented by a series of binary digits ["0" or "1"] that depict voltage states). In contrast, future quantum processing will utilize the identity of quantum states as set forth by quantum numbers. In quantum cryptography, systems with the ability to decipher encrypted information will rely on precise knowledge of the manipulations used to achieve various atomic states. Nanoscale devices are constructed using a combination of fabrication steps.
In the initial growth stage, layers of semiconductor materials are grown on a dimension-limiting substrate. Layer composition can be altered to control electrical and/or optical characteristics. Techniques such as molecular beam epitaxy (MBE) and metallo-organic chemical vapor deposition (MOCVD) are capable of producing layers a few atoms thick. The developed pattern is then imposed on successive layers (the pattern transfer stage) to develop desired three-dimensional structural characteristics.

Nanotechnology research. In the United States, expenditure on nanotechnology development tops $500 million per year and is largely coordinated by the National Science Foundation and the Department of Defense's Advanced Research Projects Agency (DARPA) under the umbrella of the National Nanotechnology Initiative. Other institutions with dedicated funding for nanotechnology include the Department of Energy (DOE) and the National Institutes of Health (NIH).

Research interests. Current research interests in nanotechnology include programs to develop and exploit nanotubes for their ability to provide extremely strong bonds. Nanotubes can be flexed and woven into fibers for use in ultrastrong but ultralight bulletproof vests. Nanotubes are also excellent conductors that can be used to develop precise electronic circuitry. Other interests include the development of nanotechnology-based sensors that allow smarter autonomous weapons capable of a greater range of adaptations en route to a target; materials that offer stealth characteristics across a broader span of the electromagnetic spectrum; self-repairing structures; and nanotechnology-based weapons to disrupt, but not destroy, electrical system infrastructure.

Further Reading

Books

Mulhall, Douglas.
Our Molecular Future: How Nanotechnology, Robotics, Genetics, and Artificial Intelligence Will Change Our World. Amherst, NY: Prometheus Books, 2002.

Periodicals

Bennewitz, R., et al. "Atomic scale memory at a silicon surface." Nanotechnology 13 (2000): 499-502.

Electronic

National Science and Technology Council. "National Nanotechnology Initiative." (March 19, 2003).

Buckminsterfullerene C60, also known as the buckyball, is the simplest member of the carbon allotropes known as fullerenes. Members of the fullerene family are a major subject of research falling under the nanotechnology umbrella.

Nanotechnology refers broadly to a field of applied science and technology whose unifying theme is the control of matter on the atomic and molecular scale, normally 1 to 100 nanometers, and the fabrication of devices within that size range. It is a highly interdisciplinary field, drawing from areas such as applied physics, materials science, colloid science, semiconductor devices, supramolecular chemistry, and even mechanical and electrical engineering. Much speculation exists as to what new science and technology may result from these lines of research. Nanotechnology can be seen as an extension of existing sciences into the nanoscale, or as a recasting of existing sciences using a newer, more modern term. Two main approaches are used in nanotechnology. In the "bottom-up" approach, materials and devices are built from molecular components which self-assemble chemically by principles of molecular recognition. In the "top-down" approach, nano-objects are constructed from larger entities without atomic-level control. The impetus for nanotechnology comes from a renewed interest in colloidal science, coupled with a new generation of analytical tools such as the atomic force microscope (AFM) and the scanning tunneling microscope (STM). Combined with refined processes such as electron beam lithography and molecular beam epitaxy, these instruments allow the deliberate manipulation of nanostructures and have led to the observation of novel phenomena.
Examples of nanotechnology in modern use are the manufacture of polymers based on molecular structure and the design of integrated circuit layouts based on surface science. Despite the great promise of numerous nanotechnologies such as quantum dots and carbon nanotubes, real commercial applications have mainly exploited colloidal nanoparticles in bulk form, as in sunscreens, cosmetics, industrial coatings, and stain-resistant clothing.


: Main article: History of nanotechnology

The first use of the distinguishing concepts of nanotechnology (predating use of that name) was in "There's Plenty of Room at the Bottom," a talk given by physicist Richard Feynman at an American Physical Society meeting at the California Institute of Technology on December 29, 1959. Feynman described a process by which the ability to manipulate individual atoms and molecules might be developed, using one set of precise tools to build and operate another proportionally smaller set, and so on down to the needed scale. In the course of this, he noted, scaling issues would arise from the changing magnitude of various physical phenomena: gravity would become less important, while surface tension and van der Waals attraction would become more important. This basic idea appears feasible, and exponential assembly enhances it with parallelism to produce a useful quantity of end products.

The term "nanotechnology" was defined by Tokyo University of Science professor Norio Taniguchi in a 1974 paper (N. Taniguchi, "On the Basic Concept of 'Nano-Technology'," Proc. Intl. Conf. Prod. Eng. Tokyo, Part II, Japan Society of Precision Engineering, 1974) as follows: "'Nano-technology' mainly consists of the processing of, separation, consolidation, and deformation of materials by one atom or by one molecule."
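Feynman's scaling observation can be made concrete with a back-of-the-envelope calculation: weight grows with volume (L^3) while surface forces grow roughly with area (L^2), so shrinking a part boosts the relative strength of surface effects. A minimal sketch; the exponents are the standard idealizations and the 1000x shrink factor is an illustrative assumption, not a figure from the text:

```python
def force_ratio(scale_factor, surface_exp=2, weight_exp=3):
    """Relative advantage of surface forces over weight after scaling.

    Weight scales as L**3 (volume); surface forces scale roughly as
    L**2 (area), so shrinking an object boosts surface effects.
    """
    return scale_factor ** surface_exp / scale_factor ** weight_exp

print(force_ratio(1.0))   # 1.0 -- reference scale
print(force_ratio(1e-3))  # ~1000 -- a 1000x shrink favors surface forces 1000x
```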
In the 1980s the basic idea of this definition was explored in much more depth by K. Eric Drexler, who promoted the technological significance of nano-scale phenomena and devices through speeches and the books Engines of Creation (1986) and Nanosystems: Molecular Machinery, Manufacturing, and Computation (1992, ISBN 0-471-57518-6), and so the term acquired its current sense. Nanotechnology and nanoscience got started in the early 1980s with two major developments: the birth of cluster science and the invention of the scanning tunneling microscope (STM). These developments led to the discovery of fullerenes in 1985 and carbon nanotubes a few years later. In another development, the synthesis and properties of semiconductor nanocrystals were studied; this led to a fast-increasing number of semiconductor and metal oxide nanoparticles and quantum dots. The atomic force microscope (AFM) was invented five years after the STM; it uses interatomic forces to image individual atoms.

Wikibooks has more about this subject: The Opensource Handbook of Nanoscience and Nanotechnology

One nanometer (nm) is one billionth, or 10^-9, of a meter. For comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12-0.15 nm, and a DNA double helix has a diameter of around 2 nm. On the other hand, the smallest cellular life forms, bacteria of the genus Mycoplasma, are around 200 nm in length. To put that scale into context: the comparative size of a nanometer to a meter is the same as that of a marble to the size of the Earth. Or, another way of putting it: a nanometer is the amount a man's beard grows in the time it takes him to raise the razor to his face.

Image: surface reconstruction on a clean gold (100) surface, as visualized using scanning tunneling microscopy. The individual atoms composing the surface are visible.
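The comparisons above are simple ratios that a short script can check. The marble diameter (~1.27 cm) and Earth diameter (~1.27e7 m) used below are illustrative assumptions chosen to make the quoted analogy come out:

```python
NM = 1e-9  # one nanometer in meters

# Sizes quoted in the text, converted to meters:
cc_bond = 0.15 * NM        # upper end of a carbon-carbon bond length
dna_diameter = 2 * NM      # DNA double helix
mycoplasma = 200 * NM      # smallest cellular life forms

# Checking the marble/Earth analogy (figures are assumptions, see above):
marble = 1.27e-2
earth_diameter = 1.27e7
print(NM / 1.0)                 # 1e-09 -- nanometer : meter
print(marble / earth_diameter)  # ~1e-09 -- the same ratio
```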

: Main article: Nanomaterials

A number of physical phenomena become noticeably pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the "quantum size effect" where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play in going from macro to micro dimensions; however, it becomes dominant when the nanometer size range is reached. Additionally, a number of physical properties change when compared to macroscopic systems. One example is the increase in the surface-area-to-volume ratio of materials, which alters their mechanical, thermal, and catalytic behavior. The heightened catalytic activity also opens potential risks in their interaction with biomaterials.

Materials reduced to the nanoscale can suddenly show very different properties compared to what they exhibit on a macroscale, enabling unique applications. For instance, opaque substances become transparent (copper); inert materials become catalysts (platinum); stable materials turn combustible (aluminum); solids turn into liquids at room temperature (gold); insulators become conductors (silicon). A material such as gold, which is chemically inert at normal scales, can serve as a potent chemical catalyst at nanoscales. Much of the fascination with nanotechnology stems from these unique quantum and surface phenomena that matter exhibits at the nanoscale.
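The surface-area-to-volume effect is easy to quantify for an idealized spherical particle, where the ratio simplifies to 3/r. A sketch, using arbitrary illustrative radii:

```python
import math

def surface_to_volume(radius_m):
    """Surface-area-to-volume ratio of a sphere; algebraically 3 / r."""
    area = 4 * math.pi * radius_m ** 2
    volume = (4.0 / 3.0) * math.pi * radius_m ** 3
    return area / volume

# Shrinking a particle from 5 micrometers to 5 nanometers multiplies
# the ratio -- and hence the exposed surface per unit material -- by 1000:
print(round(surface_to_volume(5e-9) / surface_to_volume(5e-6)))  # 1000
```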

: Main article: Molecular self-assembly

Modern chemical synthesis has reached the point where it is possible to prepare small molecules of almost any structure. These methods are used today to produce a wide variety of useful chemicals such as pharmaceuticals or commercial polymers. This ability raises the question of extending this kind of control to the next-larger level: seeking methods to assemble single molecules into supramolecular assemblies consisting of many molecules arranged in a well-defined manner. These approaches use the concepts of molecular self-assembly and/or supramolecular chemistry to arrange components automatically into some useful conformation through a bottom-up approach. The concept of molecular recognition is especially important: molecules can be designed so that a specific conformation or arrangement is favored due to noncovalent intermolecular forces. The Watson-Crick base-pairing rules are a direct result of this, as is the specificity of an enzyme being targeted to a single substrate, or the specific folding of a protein itself. Thus, two or more components can be designed to be complementary and mutually attractive so that they form a more complex and useful whole. Such bottom-up approaches should, broadly speaking, be able to produce devices in parallel and much more cheaply than top-down methods, but could potentially be overwhelmed as the size and complexity of the desired assembly increases.
Most useful structures require complex and thermodynamically unlikely arrangements of atoms. Nevertheless, there are many examples of self-assembly based on molecular recognition in biology, most notably base pairing and enzyme-substrate interactions. The challenge for nanotechnology is whether these principles can be used to engineer novel constructs in addition to natural ones.
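The Watson-Crick rules mentioned above are a simple example of designed complementarity. A toy sketch of base-pair matching (purely illustrative, not from the article):

```python
# Watson-Crick pairing rules: A pairs with T, G pairs with C
PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(strand):
    """Return the strand that would pair with `strand`, read 5' to 3'."""
    return "".join(PAIRS[base] for base in reversed(strand))

print(reverse_complement("ATGC"))  # GCAT
```

Two strands are mutually attractive exactly when each is the reverse complement of the other, which is the sense in which the components "are designed to be complementary".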

: Main article: Molecular nanotechnology

Molecular nanotechnology, sometimes called molecular manufacturing, is a term given to the concept of engineered nanosystems (nanoscale machines) operating on the molecular scale. It is especially associated with the concept of a molecular assembler, a machine that can produce a desired structure or device atom-by-atom using the principles of mechanosynthesis. Manufacturing in the context of productive nanosystems is not related to, and should be clearly distinguished from, the conventional technologies used to manufacture nanomaterials such as carbon nanotubes and nanoparticles.

When the term "nanotechnology" was independently coined and popularized by K. Eric Drexler (who at the time was unaware of the earlier usage by Norio Taniguchi), it referred to a future manufacturing technology based on molecular machine systems. The premise was that molecular-scale biological analogies of traditional machine components demonstrated that molecular machines were possible: the countless examples found in biology show that billions of years of evolutionary feedback can produce sophisticated, stochastically optimized biological machines. It is hoped that developments in nanotechnology will make possible their construction by some other means, perhaps using biomimetic principles.
However, Drexler and other researchers have proposed that advanced nanotechnology, although perhaps initially implemented by biomimetic means, ultimately could be based on mechanical engineering principles: namely, a manufacturing technology based on the mechanical functionality of components such as gears, bearings, motors, and structural members that would enable programmable, positional assembly to atomic specification (PNAS, 1981). The physics and engineering performance of exemplar designs were analyzed in Drexler's book Nanosystems. But Drexler's analysis is largely qualitative and does not address pressing issues such as the "fat fingers" and "sticky fingers" problems: in general it is very difficult to assemble devices on the atomic scale, because all one has to position atoms with are other atoms of comparable size and stickiness.

Another view, put forth by Carlo Montemagno, is that future nanosystems will be hybrids of silicon technology and biological molecular machines. Yet another view, put forward by the late Richard Smalley, is that mechanosynthesis is impossible due to the difficulties of mechanically manipulating individual molecules. This led to an exchange of letters in the American Chemical Society publication Chemical & Engineering News in 2003.

Though biology clearly demonstrates that molecular machine systems are possible, non-biological molecular machines are today only in their infancy. Leaders in research on non-biological molecular machines are Alex Zettl and his colleagues at Lawrence Berkeley National Laboratory and UC Berkeley, who have constructed at least three distinct molecular devices whose motion is controlled from the desktop by changing a voltage: a nanotube nanomotor, a molecular actuator, and a nanoelectromechanical relaxation oscillator.
An experiment indicating that positional molecular assembly is possible was performed by Ho and Lee at Cornell University in 1999. They used a scanning tunneling microscope to move an individual carbon monoxide molecule (CO) to an individual iron atom (Fe) sitting on a flat silver crystal, and chemically bound the CO to the Fe by applying a voltage.

Space-filling model of the nanocar on a surface, using fullerenes as wheels.


Graphical representation of a rotaxane, useful as a molecular switch.


This device transfers energy from nano-thin layers of quantum wells to nanocrystals above them, causing the nanocrystals to emit visible light. [1]

As nanotechnology is a very broad term, there are many disparate but sometimes overlapping subfields that could fall under its umbrella. The following avenues of research could be considered subfields of nanotechnology.

Subfields which develop or study materials having unique properties arising from their nanoscale dimensions:
* Colloid science has given rise to many materials which may be useful in nanotechnology, such as carbon nanotubes and other fullerenes, and various nanoparticles and nanorods.
* Nanomaterials can also be used for bulk applications; most present commercial applications of nanotechnology are of this flavor.
* Progress has been made in using these materials for medical applications; see nanomedicine.

Subfields which seek to arrange smaller components into more complex assemblies:
* DNA nanotechnology utilises the specificity of base pairing to construct well-defined structures out of DNA and other nucleic acids.
* More generally, molecular self-assembly seeks to use concepts of supramolecular chemistry, and molecular recognition in particular, to cause single-molecule components to automatically arrange themselves into some useful conformation.

Subfields which seek to create smaller devices by using larger ones to direct their assembly:
* Many technologies descended from conventional semiconductor device fabrication for making microprocessors are now capable of creating features smaller than 100 nm, falling under the definition of nanotechnology. Giant magnetoresistance-based hard drives already on the market fit this description, as do atomic layer deposition (ALD) techniques. Peter Grünberg and Albert Fert received the 2007 Nobel Prize in Physics for their discovery of giant magnetoresistance and contributions to the field of spintronics.
* Solid-state techniques can also be used to create devices known as nanoelectromechanical systems, or NEMS, which are related to microelectromechanical systems, or MEMS.
* Atomic force microscope tips can be used as a nanoscale "write head" to deposit a chemical on a surface in a desired pattern, in a process called dip-pen nanolithography. This fits into the larger subfield of nanolithography.

Subfields which seek to develop components of a desired functionality without regard to how they might be assembled:
* Molecular electronics seeks to develop molecules with useful electronic properties. These could then be used as single-molecule components in a nanoelectronic device. For an example, see rotaxane.
* Synthetic chemical methods can also be used to create synthetic molecular motors, such as in the so-called nanocar.


Mesa Boogie Mark IV, a guitar combo amplifier

A guitar amplifier (or guitar amp) is an electronic amplifier designed to make the signal of an electric or acoustic guitar louder so that it will produce sound through a loudspeaker. Most guitar amplifiers can also modify the instrument's tone by emphasizing or de-emphasizing certain frequencies and adding electronic effects.

Amplifiers consist of one or more circuit stages which have unique responsibilities in the modification of the input signal. The power amplifier, or output stage, produces a high-current signal to drive a speaker and produce sound. One or more preamplifier stages precede the power amplifier stage; the preamplifier is a voltage amplifier that amplifies the guitar signal to a level that can drive the power stage. There may be one or more tone stages which affect the character of the guitar signal: before the preamp stage (as in the case of guitar pedals), between the preamp and power stages (as in the case of effects loops or many dedicated amplifier tone circuits), between multiple stacked preamp stages, or in feedback loops from a post-preamp signal to an earlier pre-preamp signal (as in the case of presence modifier circuits). The tone stages may also add electronic effects such as equalization, compression, distortion, chorus, or reverb. Amplifiers may use vacuum tubes (called valves in Britain), solid-state (transistor) devices, or both.
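The division of labor between preamp and power stages can be illustrated with a toy gain-staging calculation. All figures below (pickup level, stage gains) are illustrative assumptions, not specifications from the text:

```python
import math

def db(gain):
    """Express a voltage gain ratio in decibels."""
    return 20 * math.log10(gain)

# Hypothetical gain chain -- every figure here is an assumption:
guitar_signal = 100e-3    # ~100 mV from a typical pickup
preamp_gain = 30          # preamp voltage gain
tone_stage_gain = 0.5     # passive tone stacks typically attenuate

signal = guitar_signal * preamp_gain * tone_stage_gain
print(round(signal, 3))           # 1.5 -- volts into the power amplifier stage
print(round(db(preamp_gain), 1))  # 29.5 -- preamp gain in dB
```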
There are two configurations of guitar amplifiers: combination ("combo") amplifiers, which include an amplifier and anywhere from one to four speakers in a wooden cabinet; and the standalone amplifier (often called a "head" or "amp head"), which does not include a speaker but rather passes the signal to a speaker cabinet or "cab". Guitar amplifiers range in price and quality from small, low-powered practice amplifiers, designed for students, which sell for less than US$50, to expensive "boutique" amplifiers which are custom-made for professional musicians and can cost thousands of dollars.

History

The first electric instrument amplifiers were not designed for use with electric guitars. The earliest examples appeared in the early 1930s, when the introduction of electrolytic capacitors and rectifier tubes allowed the production of economical built-in power supplies that could be plugged into wall sockets, instead of heavy multiple battery packs; lightweight rechargeable batteries would not arrive until much later. While guitar amplifiers were from the beginning used to amplify acoustic guitar, electronic amplification of the guitar was first widely popularized by the 1930s and 1940s craze for Hawaiian music, which extensively employed the amplified lap steel Hawaiian guitar. Tone controls on early guitar amplifiers were very simple and provided a great deal of treble boost, but the limited controls, the loudspeakers used, and the low power of the amplifiers (typically 15 watts or less prior to the mid-1950s) gave poor high-treble and bass output.
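The treble and bass behavior of simple tone circuits like those described above comes down to first-order filter corner frequencies, f_c = 1/(2*pi*R*C). A sketch; the component values are illustrative choices, not taken from any real amplifier schematic:

```python
import math

def cutoff_hz(resistance_ohms, capacitance_farads):
    """-3 dB corner frequency of a first-order RC filter: 1 / (2*pi*R*C)."""
    return 1.0 / (2 * math.pi * resistance_ohms * capacitance_farads)

# A 33 kOhm / 1 nF pair puts the corner near 5 kHz:
print(round(cutoff_hz(33e3, 1e-9)))   # 4823
# A larger RC product moves the corner down toward the bass region:
print(round(cutoff_hz(33e3, 68e-9)))  # 71
```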
Some models also provided effects such as spring reverb and/or an electronic tremolo unit. Early Fender amps labeled tremolo as "vibrato" and labeled the vibrato arm of the Stratocaster guitar as a "tremolo bar" (see vibrato unit, electric guitar, and tremolo).

In the 1960s, guitarists experimented with distortion produced by deliberately overdriving their amplifiers. The Kinks guitarist Dave Davies produced early distortion effects by connecting the already distorted output of one amplifier into the input of another. Later, most guitar amps were provided with preamplifier distortion controls, and "fuzz boxes" and other effects units were engineered to safely and reliably produce these sounds. In the 2000s, overdrive and distortion have become an integral part of many styles of electric guitar playing, ranging from blues rock to heavy metal and hardcore punk.

Guitar amplifiers were at first used with bass guitars and electronic keyboards as well, but these instruments produce a wider frequency range and need a suitable amplifier and full-range speaker system. Much more amplifier power is required to reproduce low-frequency sound, especially at high volume. Reproducing low frequencies also requires a suitable woofer or subwoofer speaker and enclosure; woofer enclosures need to be larger and more sturdily built than cabinets for mid-range or high-frequency (tweeter) speakers.

Types

Two combo amplifiers

Guitar amplifiers are manufactured in two main forms. The combination (or "combo") amplifier contains the amplifier head and guitar speakers in a single unit, typically housed in a rectangular wooden box. The amplifier head or "amp head" contains the electronic circuitry constituting the preamp, built-in effects processing, and the power amplifier. Combo amps have at least one 1/4" input jack where the patch cord from the electric guitar can be plugged in.
Other jacks may also be provided, such as an additional input jack, "send" and "return" jacks to create an effects loop (for connecting electronic effects such as compression, reverb, etc.), and an extension speaker jack (for connecting an additional speaker cabinet). Some smaller practice amps have stereo RCA jacks for connecting a CD player, iPod or other sound source, and a 1/4" headphone jack so that the player can practice without disturbing neighbours or family members.

Kustom 200 bass amp - amp head and speakers, 100 watts RMS, two channels, two 15" speakers, 1971

Some amplifiers have a line out jack for connecting the amplifier's signal to a PA system or recording console, or for connecting the amplifier to another guitar amp. In most styles of rock and blues guitar, however, the line out is not used to connect the guitar amp to a PA system or recording console, because the tonal coloration and overdrive from the amplifier and speaker are considered an important part of the amplifier's sound. Players do use the line out to connect one guitar amplifier to another, in order to create different tone colors or sound effects.

In the "amp head" form, the amplifier head is separate from the speakers and joined to them by speaker cables. The separate amplifier is called an amplifier head and is commonly placed on top of one or more loudspeaker enclosures; a separate amplifier head placed atop a guitar speaker enclosure or cabinet forms an amplifier "stack" or "amp stack". Amp heads may also have the different types of input and output jacks listed above in the combo section.

In addition to a 1/4" input jack, acoustic guitar amplifiers typically have an additional input jack for a microphone, easily identified because it uses a three-pin XLR connector. Phantom power is not often provided on general-use amps, restricting the choice of microphones for use with these inputs.
However, on high-end acoustic amplifiers, phantom power is often provided so that musicians can use condenser microphones.

Amplifiers used with electric guitars may be solid-state, which are lighter in weight and less expensive than tube amplifiers. Most guitarists, particularly in the genres of blues and rock, prefer the sound of vacuum tube amplifiers despite their higher cost, heavier weight, the need to periodically replace tubes, and the need to re-bias the output tubes (every year or two with moderate use). Some companies design amplifiers that require no biasing as long as properly rated tubes are used. Some modern amplifiers use a mixture of tube and solid-state technologies. Since the advent of microprocessors and digital signal processing, "modeling amps" have been developed (starting in the late 1990s) that can simulate the sounds of a variety of well-known tube amplifiers without needing vacuum tubes. Amplifiers with processors and software can emulate the sound of a classic amp well, but from the player's point of view the response may not feel the same, as digital modeling does not accurately capture all aspects of a tube amplifier.

A wide range of instrument amplifiers is available, some for general purposes and others designed for specific instruments or particular sounds. These include:
* "Traditional" guitar amplifiers, with a clean, warm sound, a sharp treble roll-off at 5 kHz or less and bass roll-off at 60-100 Hz, and often built-in reverb and tremolo (sometimes incorrectly called "vibrato") units. These amplifiers, such as the Fender "Tweed"-style amps, are often used by traditional rock, blues, and country musicians. Traditional amps have more recently become popular with musicians in indie and alternative bands.
* Hard rock-style guitar amplifiers, which often include preamplification controls, tone filters, and distortion effects that provide the amplifier's characteristic tone.
Users of these amplifiers rely on the amplifier's tone to add "drive", intensity, and "edge" to their guitar sound. Amplifiers of this type, such as Marshall amplifiers, are used in a range of genres, including hard rock, metal, and punk.
* Bass amplifiers, with extended bass response and tone controls optimized for bass guitars (or, more rarely, for upright bass). Higher-end bass amplifiers sometimes include compressor or limiter features, which help keep the amplifier from distorting at high volume levels, and an XLR DI output for patching the bass signal directly into a mixing board. Bass amplifiers are often provided with external metal heat sinks or fans to help keep them cool.
* Acoustic amplifiers, similar in many ways to keyboard amplifiers but designed specifically to produce a "clean," transparent, "acoustic" sound when used with acoustic instruments with built-in transducer pickups and/or microphones.

Vacuum tube amplifiers

: Main article: Valve amplifier

The glow from four "Electro-Harmonix KT88" brand power tubes lights up the inside of a Traynor YBA-200 bass guitar amplifier

Vacuum tubes (valves) were by far the dominant active electronic components in most instrument amplifier applications until the 1970s, when semiconductors (transistors) started taking over for performance and economic reasons, including heat and weight reduction and improved reliability. High-end tube instrument amplifiers have survived as one of the few exceptions, because of their sound quality. Typically, one or more dual triodes are used in the preamplifier section to provide sufficient voltage gain to offset losses from tone controls and to drive the power amplifier section.

Rear view of a tube (valve) combo guitar amplifier. Visible are two glass output tubes, six smaller preamp tubes in their metal tube retainers, and both the power transformer and the output transformer.
The output tubes are often arranged in a class AB push-pull configuration to improve efficiency; this requires another triode or dual triode to split the phase of the signal. The tubes of the power amplifier stage are almost always of the pentode or beam tetrode type (also known as "kinkless tetrodes", hence the KTxx nomenclature). Some high-power models use paralleled pairs of output tubes (four or more in total) in push-pull. Except for the light negative feedback from the secondary of the output transformer to the driver stage, most amplifying stages work in "raw" open-loop mode; some designs employ current feedback via unbypassed cathode resistors. Since most tubes show "soft clipping" gain non-linearity, applying an input signal high enough to overdrive any stage tends to produce favorably natural distortion.

Today, most vacuum tube amplifiers are based on the ECC83/12AX7/7025 (dual triode) tubes for the preamplifier and driver sections, and the EL84/6BQ5, EL34/6CA7/KT77, 6L6/KT66, or 6V6 tubes for the power output section. Some use the KT88/6550 beam power tubes in the output stage. The differing codes for equivalent tubes generally reflect those used by the original European or U.S.-based manufacturers; these tubes are now mainly manufactured in Russia, China, and Eastern European countries. Some amplifiers, such as the Marshall Silver Jubilee, use solid-state components in the preamp, most commonly diodes, to create distortion, a design feature known as diode clipping.

Tube instrument amplifiers are often equipped with lower-grade transformers and simpler power regulation circuits than those of hi-fi amplifiers. This is done not only for cost-saving reasons, but also because the simpler circuits are considered a factor in sound creation [citation needed]. For example, a simple power regulation circuit's output tends to sag under a heavy load (that is, at high output power), and vacuum tubes lose gain at lower supply voltages.
This results in a somewhat compressed sound that would be criticized as "poor dynamic range" in a hi-fi amplifier, but can be desirable as "long sustain" in a guitar amplifier. Some tube guitar amplifiers use a rectifier tube instead of solid-state diodes specifically for this reason. Most amplifiers, however, offer a fixed amount of sag, and that amount is only reached at full volume. A small minority of amplifiers offer sag control via either multiple rectifiers or the Sag Circuit (a non-traditional power supply design patented by Maven Peal® Instruments). Amplifiers with multiple rectifiers can offer up to two sag settings, while the Sag Circuit provides a sag control knob, which allows a range of sag control at all volumes (by interacting with a wattage control knob).

Some models have a "spring reverb" unit that simulates the reverberation of an ambient space. A reverb unit usually consists of one or more coil springs, driven at one end by the preamplifier section through a transducer similar to a loudspeaker, with an electro-magnetic pickup and preamplifier stage at the other end that picks up the long-sustaining spring vibration, which is then mixed with the original signal.

Some guitar amplifiers have a tremolo control. An internal oscillator generates a continuous low-frequency signal that modulates the input signal's amplitude or the output tubes' bias, producing a tremolo effect.

Tube amps have the following technical disadvantages compared to solid-state amps. They are bulky and heavy, primarily due to the iron in the power and output transformers; solid-state amplifiers still require power transformers, but are usually direct-coupled and do not need output transformers. Glass tubes are fragile and require more care and consideration when equipment is moved repeatedly. Tube performance can deteriorate slightly over time before eventual catastrophic failure.
When tube vacuum is maintained at a high level, though, excellent performance and long life are possible. Tubes are also prone to picking up mechanical noise (microphonics), although such electro-mechanical feedback from the loudspeaker to the tubes in combo amplifiers may contribute to sound creation. Tubes benefit from a heater warm-up period before high-tension anode voltages are applied; this allows the tube cathodes to operate without damage and so prolongs tube life. This is of particular importance for amplifiers with solid-state rectifiers.

Tube amps have the following technical advantages over solid-state amps. Compared to semiconductors, tubes show very little drift in their specifications over a wide range of operating conditions, specifically at high heat and high power; semiconductors are very heat-sensitive by comparison, which usually leads to compromises in solid-state amplifier designs. When a tube fails, it is replaceable. While solid-state devices are also replaceable, replacing them is usually a much more involved process (having the amplifier tested by a professional, removing the faulty component, and replacing it); for working musicians this is far less convenient than looking in the back of a tube amp and simply swapping the faulty tube. In addition, tubes can easily be removed and tested, while transistors cannot.

Tube amplifiers also respond differently from transistor amplifiers as signal levels approach and reach the point of clipping. In a tube-powered amplifier, the transition from linear amplification to limiting is less abrupt than in a solid-state unit, resulting in a less grating form of distortion at the onset of clipping. For this reason, some guitarists prefer the sound of an all-tube amplifier; the aesthetic merits of tube versus solid-state amps, though, remain a topic of debate in the guitarist community.
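The difference between gradual tube limiting and abrupt solid-state clipping can be sketched numerically. A common first-order approximation, used here purely for illustration and not as a model of any specific amplifier, represents tube overdrive with a tanh waveshaper and solid-state clipping with a hard limiter:

```python
import math

def soft_clip(x, drive=3.0):
    """Tube-style soft clipping: tanh rounds the peaks off gradually."""
    return math.tanh(drive * x) / math.tanh(drive)

def hard_clip(x, threshold=0.5):
    """Solid-state-style hard clipping: the waveform is truncated abruptly."""
    return max(-threshold, min(threshold, x))

# Drive one cycle of a sine wave well past the linear region of each stage:
samples = [math.sin(2 * math.pi * i / 64) for i in range(64)]
soft = [soft_clip(s) for s in samples]
hard = [hard_clip(s) for s in samples]

# Hard clipping pins every large sample to exactly +/-0.5 (a flat top),
# while soft clipping compresses the peaks smoothly with no flat region.
```

The smooth transition of the soft curve into limiting is what listeners describe as a less grating onset of distortion; the flat tops of the hard-clipped wave carry stronger high-order harmonics.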
Solid-state amplifiers

Most inexpensive guitar amplifiers currently produced are based on semiconductor (solid-state) circuits, and some designs incorporate tubes in the preamp stage for their subjectively warmer overdrive sound. Rather than cutting off the peaks of the signal abruptly, tubes compress them gradually (a form of natural compression), which creates a warmer, fuzzier overdrive sound. While this is a desirable attribute in many cases, the tube's character colors the sound at any volume, unlike a solid-state design.

High-end solid-state amplifiers are less common, since many professional guitarists favor vacuum tubes. Some jazz guitarists, however, prefer the "colder" sound of solid-state amplifiers, choosing not to color the sound of their guitar with the tube distortion and compression popular with rock, blues, and metal musicians. Solid-state amplifiers vary widely in output power, functionality, size, price, and sound quality, from practice amplifiers to professional models. Some inexpensive amplifiers have only a single volume control and one or two tone controls.

Hybrid amplifiers

A tube power amp may be fed by a solid-state preamp circuit, as in the Fender Super Champ XD and the Roland Bolt; such an amplifier is classed as a "hybrid" amp. Randall Amplifiers' current flagship models, the V2 and T2, use hybrid amp technology. Alternatively, a tube preamp can feed a solid-state output stage, as in models from Kustom and Vox. This approach dispenses with the need for an output transformer and allows modern power levels to be achieved easily.
[Caption: The Roland Micro Cube, a small and portable digital modeling amplifier.]

Modeling amplifiers

Modeling amplifiers use amplifier modeling to simulate the sound of well-known guitar amps, cabinets, and effects, as well as the way traditional speaker cabinets sound when miked with different types of microphones. They may also offer original creations not meant to simulate any particular real-world guitar amp, instead allowing the user to create a unique sound, as in models from companies such as AcmeBarGig or Peavey. This is usually achieved through digital processing. Modeling technology offers several advantages over traditional amplification: a modeling amp is typically capable of a wide range of tones and effects, and offers cabinet simulation, so it can be recorded without a microphone. Most modeling amps digitize the input signal and use a DSP, a dedicated microprocessor, to process the signal with digital computation. Some modeling amps combine vacuum tubes, digital processing, and some form of power attenuation.

Acoustic guitar amplifiers

These amplifiers are designed to be used with acoustic guitars, especially as those instruments are used in relatively quiet genres such as folk and bluegrass. They are similar in many ways to keyboard amplifiers in that they have a relatively flat frequency response, and they are usually designed so that neither the power amplifier nor the speakers introduce additional coloration. To produce this relatively "clean" sound, these amplifiers often have very powerful output stages (providing up to 800 watts RMS) for additional "headroom" and to prevent unwanted distortion. Since an 800-watt amplifier built with standard class AB technology would be very heavy, some acoustic amplifier manufacturers use lightweight class D amplifiers, also called "switching amplifiers".
Acoustic amplifiers are designed to produce a "clean", transparent, "acoustic" sound when used with acoustic instruments with built-in transducer pickups and/or microphones. The amplifiers often come with a simple mixer, so that the signals from a pickup and microphone can be blended. Since the early 2000s, it has become increasingly common for acoustic amplifiers to provide a range of digital effects, such as reverb and compression. These amplifiers often also contain feedback-suppressing devices, such as notch filters or parametric equalizers.[1]

Amplifier configuration

[Caption: A 3 x 6 stack of Marshall guitar cabinets for Jeff Hanneman of Slayer.]

In the case of electric guitars, an amplifier stack consisting of a head atop one cabinet is commonly called a half stack, while a head atop two cabinets is referred to as a full stack. The cabinet the head sits on often has an angled top in front, while the lower cabinet of a full stack has a straight front. The first version of the Marshall stack was an amp head on an 8x12 cabinet, a single speaker cabinet containing eight 12" guitar speakers. After six of these cabinets were made, the arrangement was changed to an amp head on two 4x12 cabinets (each containing four 12" speakers) to make the rig easier to transport. In heavy metal bands, the term "double stack" or "full stack" is sometimes used to refer to two stacks, with the main amplifier section of a second amplifier serving as a slave to the first and four speaker cabinets in total. Another name for the head-and-cab arrangement, dating from the 1960s and 1970s, is "piggyback". Vox amp stacks could be put on a tiltable frame with casters, and Fender heads could be attached to the cab and had "tilt-back" legs, like those used on larger Fender combo amps.
Typically, a guitar amp's preamplifier section (the "pre") provides sufficient gain for an instrument to be connected directly to its input, and its main amplification section (the "power stage") provides sufficient power for loudspeakers to be connected directly to its output, both without requiring extra amplification.

Some touring bands have used the appearance of a large array of guitar amplifiers for aesthetic reasons. Some of these arrangements include one or more actual guitar cabinets, while others do not. In reference to a photograph taken during an Immortal gig, showing that what appeared to the audience to be a wide wall of equipment was actually only amplifier fronts mounted on a large frame, Gizmodo writer Rosa Golijan investigated the phenomenon and found that it was "not too uncommon".[2]

Another arrangement, often used for public address systems, is to provide two stages of amplification in separate units. First, a preamplifier or mixer boosts the instrument output, normally to line level, and may mix signals from several instruments. The output from this preamplifier is then connected to the input of a power amplifier, which drives the loudspeakers. Performing musicians who use this "two-stage" approach (as opposed to an amplifier with an integrated preamplifier and power amplifier) often want to assemble a combination of equipment that best suits their musical or technical needs and gives them more tonal and technical options. Some musicians require preamps with specific features: acoustic performers may want notch filters (to prevent feedback), reverb, an XLR DI output, or parametric equalization, while hard rock, metal, or punk performers may want a preamplifier with a range of distortion effects.
As well, some musicians have specific power amplifier requirements, such as low-noise design, very high wattage, limiter features to prevent distortion and speaker damage, or biamp-capable operation.

With the two-stage approach, the preamplifier and power amplifier are often mounted together in a rack case, which may be free-standing or placed on top of a loudspeaker cabinet. If many rack-mounted effects are used, the rack may be a large unit on wheels; some touring players need several racks of effects units to reproduce on stage the sounds they have produced in the studio. At the other extreme, if a small rack case containing both preamp and power amp is placed on top of a guitar speaker cabinet, the distinction between a rack and a traditional amp head begins to blur. Another variation is to build the power amplifier into the speaker cabinet, an arrangement called a powered speaker, and use it with a separate preamp, sometimes combined into an effects pedal board or floor preamp/processor.

Preamplifiers are also used to connect very low-output or high-impedance instruments to instrument amplifiers. When piezoelectric transducers are used on upright bass or other acoustic instruments, the signal coming directly from the transducer is often too weak and does not have the correct impedance for direct connection to an instrument amplifier. Small, battery-powered preamps are often used with acoustic instruments to resolve these problems.

Distortion, power, and volume

Power output

For electric guitar amplifiers, there is often a distinction between "practice" or "recording studio" amps, which tend to have output power ratings from 20 watts down to a small fraction of a watt, and "performance" amps, which are generally 50 watts or higher. Traditionally, these have been fixed-power amplifiers, with a few models having a half-power switch to slightly reduce the listening volume while preserving power-tube distortion.
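The gap between these wattage classes is easier to reason about in decibels. The following minimal helper (the function name is illustrative) converts a power ratio to a level difference:

```python
import math

def power_ratio_db(p_new, p_old):
    """Level difference in decibels between two amplifier power ratings."""
    return 10.0 * math.log10(p_new / p_old)

# A tenfold power increase is +10 dB, commonly judged "twice as loud":
print(round(power_ratio_db(50, 5), 1))    # 10.0

# Doubling the power is only about +3 dB, roughly the smallest level
# change most listeners reliably notice:
print(round(power_ratio_db(100, 50), 1))  # 3.0
```

By the same arithmetic, a half-watt amplifier sits only 20 dB below a 50-watt amplifier, which listeners judge as roughly a quarter of the loudness rather than one hundredth.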
The relationship between perceived volume and power output is not immediately obvious. A 5-watt amplifier is perceived to be about half as loud as a 50-watt amplifier (a tenfold increase in power), and a half-watt amplifier is about a quarter as loud as a 50-watt amp. Doubling the power of an amplifier produces only a "just noticeable" increase in volume, so a 100-watt amplifier is only just noticeably louder than a 50-watt amplifier. Such generalizations are also subject to the human ear's tendency to behave as a natural compressor at high volumes. Power attenuation can be used with either low-power or high-power amplifiers, resulting in variable-power amplifiers; a high-power amplifier with power attenuation can produce power-tube distortion through a wide range of listening volumes. Speaker efficiency is also a major factor in a tube amplifier's maximum volume.

For bass instruments, higher-power amplifiers are needed to reproduce low-frequency sounds. While an electric guitarist could play a small club with a 50-watt amplifier, a bass player performing in the same venue would probably need 200 watts or more.

Distortion and volume

Distortion is a feature available on many guitar amplifiers that is not typically found on keyboard or bass guitar amplifiers. Tube guitar amplifiers can produce distortion through pre-distortion equalization, preamp tube distortion, post-distortion EQ, power-tube distortion, tube rectifier compression, output transformer distortion, guitar speaker distortion, and the frequency response of the guitar speaker and cabinet. The distortion sound or "texture" is further shaped or processed through the frequency response and distortion factors of the microphones (their response, placement, and multi-microphone comb-filtering effects), microphone preamps, mixer channel equalization, and compression.
Additionally, the basic sound produced by the guitar amplifier can be changed and shaped by adding distortion and/or equalization effect pedals before the amp's input jack, in the effects loop just before the tube power amp, or after the power tubes.

Power-tube distortion

Power-tube distortion is required for the amp sounds of some genres. In a standard master-volume guitar amp, power-tube distortion is produced as the amp's final or master volume is increased beyond the amplifier's full power. The "power soak" approach places the attenuation between the power tubes and the guitar speaker. In the re-amped or "dummy load" approach, the tube power amp drives a mostly resistive dummy load while an additional low-power amp drives the guitar speaker. In the isolation box approach, the guitar amplifier drives a guitar speaker in a separate, soundproofed enclosure; an isolation cabinet, isolation box, isolation booth, or isolation room can be used.

Volume controls

A variety of labels are used for level-attenuation potentiometers in guitar amplifiers and other guitar equipment. Electric guitars and basses have a volume control to attenuate whichever pickup is selected, and there may be two volume controls in parallel to mix the signal levels from the neck and bridge pickups. Rolling back the guitar's volume control also changes the pickup's equalization or frequency response, which can provide pre-distortion equalization.

The simplest guitar amplifiers have only a volume control; most have at least a gain control and a master volume control. The gain control is equivalent to the distortion control on a distortion pedal, and may similarly have the side effect of changing the proportion of bass and treble sent to the next stage. A simple amplifier's tone controls typically include passive bass and treble controls; in some cases, a midrange control is provided.
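The bass and treble controls just described can be approximated digitally by splitting the signal into two bands and scaling them independently. This is a loose sketch of the idea, not a model of any real amplifier's passive tone stack, and the parameter names are illustrative:

```python
import math

def tone_controls(samples, bass=1.0, treble=1.0, alpha=0.1):
    """Two-band tone control sketch: a one-pole low-pass extracts the low
    band, the remainder is the high band, and each band gets its own gain."""
    out, low = [], 0.0
    for s in samples:
        low += alpha * (s - low)   # low band (one-pole low-pass state)
        high = s - low             # high band (what the low-pass removed)
        out.append(bass * low + treble * high)
    return out

# With both controls at unity the two bands sum back to the input:
signal = [math.sin(2 * math.pi * 110 * i / 44100) for i in range(256)]
flat = tone_controls(signal)
```

Setting `bass` or `treble` below 1.0 cuts that band, mirroring how rolling a passive control back attenuates part of the spectrum rather than boosting the rest.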
The amplifier's master volume control restricts the amount of signal permitted through to the driver stage and the power amplifier. When a power attenuator is used with a tube amplifier, the master volume no longer acts as the master volume control; instead, the attenuator's control sets the power delivered to the speaker, and the amplifier's master volume control determines the amount of power-tube distortion. Power-supply-based power reduction is controlled by a knob on the tube power amp, variously labeled "Wattage", "Power", "Scale", "Power Scale", or "Power Dampening".

Use with other instruments

Musicians often run sound sources other than guitars through guitar amps. For live performances, synthesizers, drum machines, and keyboards are often put through guitar amps to create a richer sound than can be obtained by patching the direct outs straight into the PA system; guitar amplifiers can add tonal coloration, roll off unwanted high frequencies, and add overdrive or distortion. Deep Purple's Jon Lord played his Hammond organ through a distorted Marshall amp to create a sound more suitable for heavy rock. String instruments and vocals are also put through guitar amps for distortion effects, and some blues harp players use guitar or bass amps to create a warmer overdrive sound for their harmonica playing; 1950s-style "tweed" amps, such as Fender Bassman combos, are often used for this purpose.

Recording engineers occasionally run pre-recorded parts through miked guitar amps, a process called re-amping. When a guitar part is recorded "dry" (that is, without effects or distortion), straight into the mixing board, the producer and mixing engineer have much more flexibility to create new mixes or new tones from the recording.
If a guitar player records an electric guitar part through a chorus pedal and a distortion pedal, little can be done at the mix-down stage to change the sound of that recording beyond tweaking the equalization and adjusting the level. Since re-mixing or mixdown can take place weeks, months, or even years after the original recording session, it may be impossible to have the guitarist come in to re-record the part. If a dry guitar signal is recorded, though, the mixing engineers can add any effects they want and then replay the signal through a miked guitar amplifier, which is recorded; the effects, amplifiers, cabinets, and miking can be changed to any combination. A dry guitar recording can also be a useful tool for "updating" an older recording: if a band wants to re-release a 1980s-era album on which the guitar has a very dated sound, with heavy flanging and artificial-sounding electronic distortion, the band can update the guitar sound by re-amping the dry signal with 2000s-era effects.

Some musicians also mix guitar amp signals with other signals. Chris Squire of Yes produced his bass guitar sound by playing through a guitar amplifier with its bass turned down, treble turned up, and volume turned up well into distortion; the miked guitar speaker signal was then mixed with a direct-input (DI) signal, a technique that has also been used for processing synth keyboards. A bass guitar can likewise be played through a bass amp in parallel with a distorted guitar amp by using a DI box: the bass amp provides the low frequencies, while the guitar amp (which cannot reproduce the lowest frequencies of the bass guitar) emphasizes the upper harmonics of the instrument's tone.
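The parallel bass rig described above can be sketched in a few lines. The filter, drive, and mix values below are illustrative assumptions, not a model of any specific rig: one path keeps the low frequencies (the bass-amp role), while an overdriven path contributes upper harmonics (the guitar-amp role), and the two are summed.

```python
import math

def one_pole_lowpass(samples, alpha=0.1):
    """Low-frequency path: a one-pole low-pass rolls off the highs."""
    out, y = [], 0.0
    for s in samples:
        y += alpha * (s - y)
        out.append(y)
    return out

def distorted_path(samples, drive=6.0):
    """Overdriven path: heavy drive into a tanh waveshaper adds harmonics."""
    return [math.tanh(drive * s) for s in samples]

def blend(a, b, mix=0.5):
    """Sum two parallel signal paths sample by sample at a given mix."""
    return [(1 - mix) * x + mix * y for x, y in zip(a, b)]

bass = [math.sin(2 * math.pi * 55 * i / 44100) for i in range(512)]
mixed = blend(one_pole_lowpass(bass), distorted_path(bass), mix=0.4)
```

The `mix` parameter plays the role of the relative channel levels on the mixing board when the DI and miked-amp signals are combined.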
See also

List of guitar amplifier manufacturers
Power attenuator (guitar)
Instrument amplifier
Bass instrument amplification
Electronic amplifier
Valve sound
Guitar effects
Guitar speaker cabinet
Re-amp

References

1. This style of amplifier should not be confused with the brand of guitar and bass amplifiers called Acoustic, still available in second-hand music stores.
2. Golijan, Rosa (22 September 2010). "The Concert Speakers Are A Lie". Gizmodo. Retrieved 16 January 2011.

Further reading

Weber, Gerald. A Desktop Reference of Hip Vintage Guitar Amps. Hal Leonard Corporation, 1994. ISBN 0964106000.

External links

Duncan's amp pages - information on guitar amplifiers, especially tube designs
AX84 - build your own tube guitar amplifier with free schematics and plans
Amplified Parts Tech Corner - technical resources and diagrams for modifying and repairing guitar amps
Photos of Vintage Tube Guitar Amplifiers
Rebuilding a Fender Deluxe Reverb Tube Amplifier - information on modification and conversion of amplifiers
