Get your own copy of WikiData
Why would you want your own Wikidata copy?
The resources behind https://query.wikidata.org/ are scarce and shared by many users, so you might hit the query limits (https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/query_limits) quite quickly.
See SPARQL for some examples that work online (mostly) without hitting these limits.
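If you just want to try the public endpoint first, here is a minimal sketch, assuming Python with the requests library (any SPARQL client works just as well); the query is small enough to stay well within the public limits.

```python
# Minimal sketch: a small query against the public Wikidata endpoint,
# assuming Python with the requests library.
import requests

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

# Ten cat items (instances of Q146) with their English labels -
# small enough to stay well within the public query limits.
QUERY = """
SELECT ?cat ?catLabel WHERE {
  ?cat wdt:P31 wd:Q146 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 10
"""

response = requests.get(
    WDQS_ENDPOINT,
    params={"query": QUERY, "format": "json"},
    # WDQS asks clients to send a descriptive User-Agent
    headers={"User-Agent": "get-your-own-wikidata-copy-example/0.1"},
    timeout=60,  # the public service cuts long queries off after about a minute
)
response.raise_for_status()
for row in response.json()["results"]["bindings"]:
    print(row["cat"]["value"], row.get("catLabel", {}).get("value", ""))
```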
What are alternative endpoints?
* QLever: https://qlever.cs.uni-freiburg.de/wikidata
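The QLever instance above speaks standard SPARQL over HTTP as well, so heavier queries that tend to hit the limits of the shared public service can be sent there instead. A sketch follows; the API path /api/wikidata is an assumption derived from the UI URL above, and the explicit PREFIX declarations are included because an alternative endpoint may not predefine the Wikidata prefixes.

```python
# Sketch of an aggregate query against the QLever Wikidata instance.
# The API path below is an assumption derived from the UI URL
# https://qlever.cs.uni-freiburg.de/wikidata - check the instance if it differs.
import requests

QLEVER_ENDPOINT = "https://qlever.cs.uni-freiburg.de/api/wikidata"  # assumed API path

# Explicit prefixes, since an alternative endpoint may not predefine them.
QUERY = """
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
SELECT (COUNT(*) AS ?humans) WHERE { ?p wdt:P31 wd:Q5 . }
"""

response = requests.get(
    QLEVER_ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=300,
)
response.raise_for_status()
print(response.json()["results"]["bindings"][0]["humans"]["value"])
```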
Prerequisites
Getting a copy of Wikidata is not for the faint of heart.
You need quite a bit of patience and some hardware resources to get your own Wikidata copy working. The resources you need are a moving target since Wikidata is growing all the time.
On the other hand, solutions such as QLever are making progress, so you can now run your own copy of Wikidata on commodity hardware: with an AMD Ryzen 9 16-core processor, 32 GB of RAM and a 2 TB SSD, the indexing takes less than 5 hours. Together with a download time of some 6 hours, that is less than half a day to get a current copy, so a daily update is feasible these days.
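The raw data comes from the official Wikimedia entity dumps. The sketch below, assuming Python with the requests library, simply streams the current full dump to disk; building the index from it is then done with the triple store of your choice (for QLever, its indexer). The dump is a very large file whose exact size changes with every dump, so treat the figures above as rough estimates.

```python
# Minimal sketch, assuming Python with the requests library: stream the
# current full Wikidata dump from the official Wikimedia dump location.
import requests

DUMP_URL = "https://dumps.wikimedia.org/wikidatawiki/entities/latest-all.ttl.bz2"
TARGET = "latest-all.ttl.bz2"

with requests.get(DUMP_URL, stream=True, timeout=60) as response:
    response.raise_for_status()
    size_gb = int(response.headers.get("Content-Length", 0)) / 1e9
    print(f"downloading ~{size_gb:.0f} GB to {TARGET} ...")
    with open(TARGET, "wb") as target:
        for chunk in response.iter_content(chunk_size=1 << 20):
            target.write(chunk)

print("done - now feed the dump to the indexer of your triple store")
```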
Successes
Please contact Wolfgang Fahl if you'd like to see your own success report added to the "Reports" table below. The table is generated from the documentation of our own Wikidata import trials in this Semantic MediaWiki.
Reports
Date       | Source                          | Target      | Triples       | Duration              | RAM GB | CPU Cores | Speed         | Link
-----------|---------------------------------|-------------|---------------|-----------------------|--------|-----------|---------------|------------------------------------------
2024-01-29 | latest-all.nt (2024-01-29)      | QLever      | 19.1 billion  | 4.5 hours             | 32     | 16        | AMD Ryzen 9   | Hannah Bast - QLever
2022-07    | latest-all.ttl (2022-07-12)     | Stardog     | 17.2 billion  | 1 d 19 h              | 253    |           |               | Tim Holzheim - BITPlan Wiki
2022-06    | latest-all.nt (2022-06-25)      | QLever      | 17.2 billion  | 1 d 2 h               | 128    | 8         | 1.8 GHz       | Wolfgang Fahl - BITPlan Wiki
2022-05    | latest-all.ttl.bz2 (2022-05-29) | QLever      | ~17 billion   | 14 h                  | 128    | 12/24     | 4.8 GHz boost | Hannah Bast - QLever
2022-02    | latest-all.nt (2022-02)         | Stardog     | 16.7 billion  | 9 h                   |        |           |               | Evren Sirin - Stardog
2022-02    | latest-all.nt (2022-01-29)      | QLever      | 16.9 billion  | 4 d 2 h               | 127    | 8         | 1.8 GHz       | Wolfgang Fahl - BITPlan Wiki
2020-08    | latest-all.nt (2020-08-15)      | Apache Jena | 13.8 billion  | 9 d 21 h              | 64     |           |               | Wolfgang Fahl - BITPlan Wiki
2020-07    | latest-truthy.nt (2020-07-15)   | Apache Jena | 5.2 billion   | 4 d 14 h              | 64     |           |               | Wolfgang Fahl - BITPlan Wiki
2020-06    | latest-all.ttl (2020-04-28)     | Apache Jena | 12.9 billion  | 6 d 16 h              | ?      |           |               | Jonas Sourlier - Jena Issue 1909
2020-03    | latest-all.nt.bz2 (2020-03-01)  | Virtuoso    | ~11.8 billion | 10 hours + 1 day prep | 248    |           |               | Hugh Williams - Virtuoso
2019-10    |                                 | Blazegraph  | ~10 billion   | 5.5 d                 | 104    | 16        |               | Adam Shoreland - Wikimedia Foundation
2019-09    | latest-all.ttl (2019-09)        | Virtuoso    | 9.5 billion   | 9.1 hours             | ?      |           |               | Adam Sanchez - Wikidata mailing list
2019-05    | wikidata-20190513-all-BETA.ttl  | Virtuoso    | ?             | 43 hours              | ?      |           |               | -
2019-05    | wikidata-20190513-all-BETA.ttl  | Blazegraph  | ?             | 10.2 days             |        |           |               | Adam Sanchez - Wikidata mailing list
2019-02    | latest-all.ttl.gz               | Apache Jena | ?             | > 2 days              | ?      |           |               | corsin - muncca blog
2018-01    | wikidata-20180101-all-BETA.ttl  | Blazegraph  | 3 billion     | 4 days                | 32     | 4         | 2.2 GHz       | Wolfgang Fahl - BITPlan wiki
2017-12    | latest-truthy.nt.gz             | Apache Jena | ?             | 8 hours               | ?      |           |               | Andy Seaborne - Apache Jena mailing list