{"name":"how-to-exchange-data-between-db-systems-on-early-humans-a-case-study-based-on-the-sfb-801422984167","title":"How to exchange data between DB Systems on Early Humans. A case study based on the SFB 806 DB and the ROCEEH ROAD system","maintainer":"Christian Willmes","maintainer_email":"c.willmes@uni-koeln.de","notes":"In the recent past data base systems providing information on early humans and their environment are becoming more and more important and increase rapidly in number. However, this increase in different DB systems is concomitant with an increasing redundancy in the digital information stored in these database systems. Therefore, in this study we explore ways to reduce redundancies due to multiple storage of data and, hence, we show solutions to minimize the requirements to store and manage digital information in the prehistory and paleoenvironment domains.\r\n\r\nThe example is based on the database systems of the DFG financed SFB 806: \u201cOur Way to Europe\u201d and the Heidelberg Academy of Sciences and Humanities project entitled: \u201cThe Role Of Culture in Early Expansion of Humans (ROCEEH). We focus especially on the spatial data available in both systems as well as on the environmental information. Therefore, we examine and test exchange interfaces based on Spatial Data Infrastructure technology (OGC Standards) and metadata/schema mappings.\r\n\r\nThe poster presents currently implemented interfaces of both data base systems in terms of their main commonalities and differences. Based on this overview, we discuss ways of direct links between the DB systems. Moreover, we identify procedures that need to be developed in the future to integrate and exchange data between both systems.\r\n\r\nWe show that DB-linking activities based on the OGC standards yield valuable results and lead to a more efficient, sustainable management of these DB systems providing added values for the related research groups.","type":"researchdata_literature","resources":[{"description":"","format":"PDF","url":"http://afs/rrz.uni-koeln.de/project/sfb806db/Phase_1/Z/Z2/data/000/ROCEEH_SFB806DB.pdf","created":"2015-02-03T19:00:07.365725","state":"active","package_id":"e95ba22b-20b1-4537-9f03-40ed20922831","last_modified":"2021-10-23T06:10:41","id":"7bdc562e-a447-43cb-9bf5-f8ae060c92f9","name":"ROCEEH_SFB806DB.pdf"}],"tags":[],"extras":[{"key":"bibtex:address","value":"2nd Data Management Workshop, Cologne"},{"key":"bibtex:author","value":"Michael M\u00e4rker and Christian Willmes and Volker Hochschild and Georg Bareth"},{"key":"bibtex:booktitle","value":""},{"key":"bibtex:citation","value":""},{"key":"bibtex:doi","value":"10.5880/SFB806.11"},{"key":"bibtex:editor","value":""},{"key":"bibtex:howpublished","value":""},{"key":"bibtex:isbn","value":""},{"key":"bibtex:journal","value":""},{"key":"bibtex:month","value":""},{"key":"bibtex:number","value":""},{"key":"bibtex:organization","value":""},{"key":"bibtex:pages","value":""},{"key":"bibtex:publisher","value":"CRC806-Database"},{"key":"bibtex:school","value":"University of Cologne"},{"key":"bibtex:series","value":""},{"key":"bibtex:type","value":"poster"},{"key":"bibtex:key","value":""},{"key":"bibtex:url","value":""},{"key":"bibtex:volume","value":""},{"key":"bibtex:year","value":"2015"}],"groups":[{"image_url":"http://crc806db.uni-koeln.de/fileadmin/template/images/projects/imageZ2.png","name":"z2"}],"author":"Christian Willmes","author_email":false}{"help":"Update a dataset (package).\n\n You must be authorized to edit the dataset and the 