Wikidata, a new project of the Wikimedia Foundation (WMF) – which also manages Wikipedia, Wiktionary and Wikiquote, among others – was officially launched in recent days.
On the introductory page of Wikidata we find this description, which is also a statement of intent: “Wikidata is a free, collaborative and multilingual secondary database of structured data, whose purpose is to provide support to Wikipedia, Wikimedia Commons, the other Wikimedia projects, and much more.”

The key to understanding the main difference between Wikidata and Wikipedia is “structured data”: on the portal you will not find media files or articles (as on Wikipedia), but data packets containing information to be processed. Because the data is structured, the information can be loaded automatically into a large number of languages (thus leading to the addition or expansion of articles in the various language versions of Wikipedia) and can be read and modified both by humans and by machines.

From Wikidata’s initial declaration a few key principles of the project emerge:

free: the data are published under a free license and are therefore reusable;
collaborative: the data are entered and managed by Wikidata’s own users, as with Wikipedia;
multilingual: data entered in one language are immediately available in all other languages, and you can contribute in any language;
a secondary database: in addition to the data themselves, Wikidata also records their sources, to support the notion of verifiability;
a structured data collection: collecting the data in structured form will allow easy reuse by Wikimedia projects and third parties, and will allow machines to read and process them (a minimal example of such machine access is sketched below).
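To make “machines reading the data” a little more concrete, here is a minimal sketch in Python that reads one item through Wikidata’s public API. The article does not describe the API itself; the wbgetentities module, the example item Q42 (Douglas Adams) and the parameters used here are assumptions chosen purely for illustration.

# Minimal sketch: read one Wikidata item as structured data (labels and Wikipedia links).
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://www.wikidata.org/w/api.php"
params = urlencode({
    "action": "wbgetentities",    # Wikibase API module for reading entities
    "ids": "Q42",                 # Q42 = Douglas Adams, a commonly used example item
    "props": "labels|sitelinks",  # fetch only labels and interlanguage site links
    "languages": "en|it|de",
    "format": "json",
})

with urlopen(f"{API}?{params}") as response:
    entity = json.load(response)["entities"]["Q42"]

# The same item carries its name in every requested language...
for lang, label in sorted(entity["labels"].items()):
    print(f"label [{lang}]: {label['value']}")

# ...and the links connecting the language versions of the corresponding Wikipedia article.
for site in ("enwiki", "itwiki", "dewiki"):
    link = entity["sitelinks"].get(site)
    if link:
        print(f"{site}: {link['title']}")

Requests of this kind are what let bots and the Wikimedia projects themselves reuse a single centrally stored fact across every language edition.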
In short, a large container of Big Data released in an open form. The main aim is to support Wikipedia, “making it easier to manage the links between the language versions and the infoboxes, reducing the related maintenance load on Wikipedia and improving its quality.” But it is obvious that such a pool of data may have many other uses, not least giving a further boost to the hot topic of Big Data.

The project, started by a group of researchers at Wikimedia Deutschland in October 2012, was financed by donations from the Allen Institute for Artificial Intelligence, the Gordon and Betty Moore Foundation and Google Inc., amounting to 1.3 million euros. After the first two phases (the inclusion of the links between the different language versions of Wikipedia, to create a single centralized system for managing interlinks, and the integration of reusable data), the third phase kicked off a few days ago; it aims to develop the ability to create lists and charts automatically, starting from the data stored in Wikidata.

Currently, the Wikidata page for La Stampa looks like this: a set of interlinks connecting the different language versions of the Wikipedia article dedicated to La Stampa. All that remains is to wait for the development of the third phase.

Finally, a curiosity: the original logo showed only the word “DATA” under the colorful bar code – a bar code that is not one, since it is actually the word “WIKI” encoded in Morse code.
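The correspondence is easy to check: below is a tiny sketch (not from the article) that spells out “WIKI” in Morse code, assuming, as the article states, that the short and long bars of the logo stand for dots and dashes.

MORSE = {"W": ".--", "I": "..", "K": "-.-"}  # only the letters needed for "WIKI"

word = "WIKI"
encoded = " ".join(MORSE[letter] for letter in word)
print(encoded)  # prints: .-- .. -.- ..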