Data collectors use the Collector's App to check the quality of their data and to convert it into the CRITTERBASE format. The app is platform-independent (macOS, Linux and Windows) and written in Python.
The Collector's App does not depend on any server infrastructure. You can bring your data into the correct format and work with it locally and securely; you retain complete control over your data.
The CRITTERBASE Web service is hosted by the AWI Computing Centre and offers online access to publicly available data.
We store our research data securely on the GitLab service of the AWI Computing Centre. Any changes are logged, automatic backups are made regularly and any transport is SSL-encrypted directly via git client software like Sourcetree.
A direct machine-to-machine communication with the CRITTERBASE Web service is also possible through a REST interface to allow for software-based queries.
The CRITTERBASE Web service can also be accessed via the REST interface within the AWI JupyterHub or other instances of the AWI Marketplace, with all data and code remaining securely in one place in the AWI Computing Centre.
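As a rough illustration of such a software-based query, the sketch below builds a request URL for the REST interface using only the Python standard library. The base URL, route and query parameters here are hypothetical placeholders; the actual endpoints are defined by the CRITTERBASE REST documentation.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Hypothetical base URL and route -- consult the official CRITTERBASE
# REST documentation for the real endpoints and parameter names.
BASE_URL = "https://critterbase.example.org/api"

def build_sample_url(taxon, limit=50):
    """Assemble a query URL for samples of a given taxon (sketch)."""
    query = urlencode({"taxon": taxon, "limit": limit})
    return f"{BASE_URL}/samples?{query}"

def fetch_samples(taxon, limit=50):
    """Send the query and decode the JSON payload returned by the service."""
    with urlopen(build_sample_url(taxon, limit), timeout=30) as response:
        return json.load(response)
```

The same URL-building pattern works unchanged from the AWI JupyterHub, since the request never leaves the Computing Centre's network there.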
All data is subjected to rigorous quality control. This covers both the initial import of new data and checks on data that has already been imported.
The Collector's App can create useful reports and overviews on the overall quality of a CRITTERBASE. Especially with very large amounts of data, this is essential to guarantee high data quality. Particular emphasis was placed on user-friendly feedback.
When the Collector's App builds a CRITTERBASE, it sets up a clean PostgreSQL database. Especially when working locally, this means you can run SQL queries directly against your local CRITTERBASE. This can also be done from R and Python, keeping everything you need for analyses and modelling neatly in one place.
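A minimal sketch of such a direct SQL query from Python, using the standard DB-API pattern: against a real local CRITTERBASE you would connect with a PostgreSQL driver such as psycopg2, while here an in-memory SQLite database with a hypothetical `samples` table stands in so the example is self-contained. Table and column names are illustrative, not the actual CRITTERBASE schema.

```python
import sqlite3

# Stand-in database. For a local CRITTERBASE you would instead use
# a PostgreSQL driver, e.g.:
#   conn = psycopg2.connect(dbname="critterbase", user="collector")
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (taxon TEXT, abundance INTEGER)")
conn.executemany(
    "INSERT INTO samples VALUES (?, ?)",
    [("Asterias rubens", 12), ("Ophiura albida", 7), ("Asterias rubens", 5)],
)

def records_per_taxon(conn):
    """Count records per taxon -- the same SQL works against PostgreSQL."""
    cur = conn.execute(
        "SELECT taxon, COUNT(*) FROM samples GROUP BY taxon ORDER BY taxon"
    )
    return cur.fetchall()
```

Because the query runs through the generic DB-API interface, the analysis code stays the same whether it targets the stand-in database or a local PostgreSQL CRITTERBASE.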
Thanks to its modular, generic architecture and the choice of a flexible programming language, the entire CRITTERBASE system is relatively easy to adapt.
The whole project is Open Source and uses Ubuntu Linux, Python, Qt for Education, OpenJDK, jQuery, Apache, PostgreSQL and PL/pgSQL.
Alfred-Wegener-Institut, Bremerhaven
Karen Albers | Developer (Web service), computer science |
Jan Beermann | Project member BENOSIS - data provider, biology - ecology |
Thomas Brey | Main lead of CRITTERBASE, core member, biology - ecology |
Kerstin Beyer | Support with data collection, biology - ecology |
Daniel Damaske | Advisory on data management & digitization at Deutsche Allianz Meeresforschung |
Jennifer Dannheim | Project leader BENOSIS, core member, biology - ecology |
Stephan Frickenhaus | Strategic support in data science/publication enabling, computer science |
Manuela Gusky | Support in data collection, biology - ecology |
Birgit Glückselig | Support with data collection, biology - ecology |
Michael Günster | AWI GitLab support, computer science |
Miriam Hansen | Support in data entry in CRITTERBASE, biology - ecology |
AWI Helpdesk | Help with countless IT problems |
Kerstin Jerosch | EU project coordination CoastCarb |
Paul Kloss | Lead developer, core member, computer science |
Gesche Krause | Strategic support in enabling stakeholder interaction, social science |
Roland Koppe | Lead support of AWI Computing Centre, Lead developer (Web service, computer science) |
Peter Konopatzky | Support in mapping and geo-referencing, computer science |
Rebecca Konijnenberg | Project member WEECOS - data collector & R-code provider, modelling |
Casper Kraan | Support for data entry in CRITTERBASE, biology - ecology |
Joerg Matthes | Support in virtual machines |
Petra Meyer | Administration |
PANGAEA team | Support for making data citable |
Hendrik Pehlke | Project member WEECOS - R-code provider, modelling |
Dieter Piepenburg | Project leader PANABIO, core member, biology - ecology |
Stefan Pinkernell | Support on AWI JupyterHub and AWI Marketplace, computer science |
Katharina Teschke | Project leader WEECOS, core member, biology - ecology |
Tawfik Sabbagh | Developer (Web service), computer science |
Andreas Walter | Support in mapping and geo-referencing, computer science |
Paul Wachter | Support in data entry in CRITTERBASE, computer science |
Alexa Wrede | Project member of upcoming CRITTERTRAITS & R-code contributor, biology - ecology |
Bundesamt für Seeschifffahrt und Hydrographie, Hamburg
Anne Elsner | Data-exchange cooperation between MARLIN and CRITTERBASE |
Gregor v. Halem | Data-exchange cooperation between MARLIN and CRITTERBASE |
Financial support
AWI | Alfred-Wegener-Institut through ESKP (Earth System Knowledge Platform) |
BMEL | Bundesministerium für Ernährung und Landwirtschaft |
BSH | Bundesamt für Seeschifffahrt und Hydrographie, Hamburg |
DFG | Deutsche Forschungsgemeinschaft through Nationale Forschungsdateninfrastruktur (NFDI-4biodiversity) |
HIFMB | Helmholtz Institut für Funktionelle Marine Biodiversität an der Universität von Oldenburg |
LIVE DEMONSTRATION
Would you like a little demonstration of the CRITTERBASE backend?
- Introduction to all relevant subsystems.
- Everything is shown in real time.
- Helps you get a good overview of the entire system.
- Best enjoyed with a cup of coffee.
- It takes a while ... :)
- You may want to use the jump marks below for navigation.
⚠ for internal use only - do not share this file ⚠
- Create new CRITTERBASE
  - Create from scratch
  - Setup Credentials
  - Create empty CRITTERBASE
- Connect to new CRITTERBASE
- Harvesting FAO regions
- Export/Conversion
  - Export data to EXCEL
  - PANGAEA Format
  - SQL Export protocol
  - Convert to MARLIN
- Keeping track
  - DOI control
  - Updating Collector's App
WEB SERVICE
The more data marine ecologists contribute to CRITTERBASE, the more useful it will become for all of us.
- Our Web service is hosted by the AWI Computing Centre.
- Only quality checked samples can be found there.
- This is the official version.
- Here is a sneak peek of the upcoming version.
Take a look at some screenshots and images
RELEASE TRAILER
Our trailer for the release of the CRITTERBASE Web service. We have come a long way:
- 10 years of data mining and preparation
- 400,000 records (so far)
- 4 years of software development
- 130,000 lines of code (so far)
Download video file: normal quality (120 MB) (19.06.2021)