{"name":"data-integration-for-paleoenvironmental-and-archaeological-gis-based-analysis1428408827","title":"Data integration for paleoenvironmental and archaeological GIS based analysis","maintainer":"Christian Willmes","maintainer_email":"c.willmes@uni-koeln.de","notes":"This contribution presents a spatio-temporal database integrating paleoenvironmental and archaeological data. The database is developed using a collaborative prototyping approach based on Semantic MediaWiki (SMW, Kr\u00f6tzsch et al. 2006) and Spatial Data Infrastructure technology built on OGC standards.\r\nTime and space are the main integrating factors of the presented database. The data is integrated spatially by its spatial extent; for GIS datasets this extent is intrinsic. For data not given in a GIS format, or lacking explicit geographic coordinates, spatial integration is achieved by annotating spatial attributes with predefined regions or sites, which translate into bounding boxes, polygons, or point coordinates. The same is implemented for temporal data, which can be annotated with predefined periods and events; these translate into time spans (periods) between a start and an end date, or into simple dates (events).\r\nDeveloping the database using a prototyping approach (Naumann & Jenkins 1982) means deriving the data model and the interface from the integrated data and from user demands during internal use and operation of the database application. SMW proves to be an ideal technology for implementing this approach because model, data, and interface are editable and combined in one application.\r\nBased on example use cases for integrated paleoenvironmental and archaeological GIS analysis, it is shown how geographical and archaeological relations within the database can be revealed. SMW allows complex queries to be formulated and their results to be exported in different formats.\r\nThe complete process, from query formulation and query result export, to importing the data into a GIS and an example GIS analysis based on the data, is explained.","type":"researchdata","resources":[{"description":"","format":"PDF","url":"http://afs/rrz.uni-koeln.de/project/sfb806db/Phase_1/Z/Z2/data/000/CAA2015_5G_Willmes.pdf","created":"2015-04-07T14:13:47.840784","state":"active","package_id":"2bee2dbb-2ebe-4374-abaa-3ddc43dd9cc3","last_modified":"2021-10-25T06:10:53","id":"8b721a09-b7c2-41de-bb06-eca414a90531","name":"CAA2015_5G_Willmes.pdf"}],"tags":[],"extras":[{"key":"bibtex:address","value":"Siena, Italy"},{"key":"bibtex:author","value":"Christian Willmes and Daniel Becker and Georg Bareth"},{"key":"bibtex:booktitle","value":""},{"key":"bibtex:citation","value":""},{"key":"bibtex:doi","value":""},{"key":"bibtex:editor","value":""},{"key":"bibtex:howpublished","value":""},{"key":"bibtex:isbn","value":""},{"key":"bibtex:journal","value":""},{"key":"bibtex:month","value":""},{"key":"bibtex:number","value":""},{"key":"bibtex:organization","value":"Computer Applications in Archaeology (CAA)"},{"key":"bibtex:pages","value":""},{"key":"bibtex:publisher","value":"CAA Conference 2015"},{"key":"bibtex:school","value":""},{"key":"bibtex:series","value":""},{"key":"bibtex:type","value":"presentation"},{"key":"bibtex:key","value":""},{"key":"bibtex:url","value":""},{"key":"bibtex:volume","value":""},{"key":"bibtex:year","value":"2015"}],"groups":[{"image_url":"http://crc806db.uni-koeln.de/fileadmin/template/images/projects/imageZ2.png","name":"z2"}],"author":"Christian Willmes","author_email":false}{"help":"Update a dataset (package).\n\n You must be authorized to edit the dataset and the groups that it belongs\n to.\n\n Plugins may change the parameters of this function depending on the value\n of the dataset's ``type`` attribute, see the ``IDatasetForm`` plugin\n interface.\n\n For further parameters see ``package_create()``.\n\n :param id: the name or id of 
the dataset to update\n :type id: string\n\n :returns: the updated dataset (if 'return_package_dict' is True in the\n context, which is the default. Otherwise returns just the\n dataset id)\n :rtype: dictionary\n\n ","success":false,"error":{"message":"Unable to add package to search index: Solr returned an error: 500 Lock obtain timed out: NativeFSLock@\/var\/lib\/solr\/data\/index\/lucene-d6f7b3bf6fe64f362b4d45bfd4924f54-write.lock org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: NativeFSLock@\/var\/lib\/solr\/data\/index\/lucene-d6f7b3bf6fe64f362b4d45bfd4924f54-write.lock \tat org.apache.lucene.store.Lock.obtain(Lock.java:85) \tat org.apache.lucene.index.IndexWriter.init(IndexWriter.java:1562) \tat org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:1418) \tat org.apache.solr.update.SolrIndexWriter.<init>(SolrIndexWriter.java:191) \tat org.apache.solr.update.UpdateHandler.createMainIndexWriter(UpdateHandler.java:98) \tat org.apache.solr.update.DirectUpdateHandler2.openWriter(DirectUpdateHandler2.java:173) \tat org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:220) \tat org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:61) \tat org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:139) \tat org.apache.solr.handler.XMLLoader.load(XMLLoader.java:69) \tat org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:54) \tat org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:131) \tat org.apache.solr.core.SolrCore.execute(SolrCore.java:1317) \tat org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:338) \tat org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:241) \tat org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1157) \tat org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:388) \tat org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216) \tat org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182) \tat org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:766) \tat org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:418) \tat org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)","__type":"Search Index Error"}}