This is a web application built with Dash for visualizing data recorded with ecoPi real-time audio recorders. The application allows users to explore the data collected by ecoPi and learn more about the species present in the monitored areas.
ecoPi is a recording device used in various projects to perform acoustic monitoring and study the biodiversity of birds and other wildlife. The device collects audio data in real-time, which is then processed using machine learning models to identify species. The ecoPi Real Time Frontend (ecoPi-RTF) web application allows users to browse the data collected by ecoPi devices and view the species detected in the recordings.
To learn more about the recording units, visit the OekoFor website.
We currently support these monitoring projects:
- SWAMP: Sapsucker Woods Acoustic Monitoring Project - birdnet.cornell.edu/swamp/
- AMiC: Acoustic Monitoring in Chemnitz - birdnet.cornell.edu/amic/
Interested? Want to host your own project? Please don't hesitate to contact us at [email protected].
ecoPi-RTF is a collaboration between the K. Lisa Yang Center for Conservation Bioacoustics at the Cornell Lab of Ornithology, Chemnitz University of Technology, and OekoFor GbR.
- Clone the repository
```
git clone https://github.com/birdnet-team/swamp.git
```
- Create and activate a virtual environment
```
cd swamp
python3 -m venv .venv
source .venv/bin/activate
```
- Install the required packages
```
pip3 install -r requirements.txt
```
- Create a file `.env` and add your OekoFor API key to the `API_TOKEN` key
```
API_TOKEN=<your api token>
```
Note: You'll need an OekoFor API key to run the app. Please send an email to [email protected] to request API access.
- Get your OpenWeatherMap API key from OpenWeatherMap and add it to the `.env` file
```
OWM_API_KEY=<your api key>
```
- Add these additional environment variables to the `.env` file:
```
CONFIG_FILE=configs/swamp_config.yaml
SITE_ROOT=''
PORT=8050
```
Note: If you want to create a new project, you can create a new config file in the configs directory, using swamp_config.yaml as a template.
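For orientation, the snippet below is a minimal sketch of how these values can be read at startup. It assumes the app loads the `.env` file with python-dotenv; the key names match the variables above, but the actual loading code in app.py may differ.
```python
# Minimal sketch: loading the .env configuration at startup (assumes python-dotenv).
import os

from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env into the process environment

API_TOKEN = os.getenv("API_TOKEN")      # OekoFor API key
OWM_API_KEY = os.getenv("OWM_API_KEY")  # OpenWeatherMap API key
CONFIG_FILE = os.getenv("CONFIG_FILE", "configs/swamp_config.yaml")
SITE_ROOT = os.getenv("SITE_ROOT", "")
PORT = int(os.getenv("PORT", "8050"))

if not API_TOKEN:
    raise RuntimeError("API_TOKEN is missing - request API access from OekoFor first.")
```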
This is a Dash app, so you can run it with the following command:
```
python3 app.py
```
The app will be available at http://localhost:8050/.
You can also specify the config file, site root (in case of URL forwarding), and a dedicated port using command line arguments:
```
python3 app.py --config_file configs/swamp_config.yaml --site_root /swamp --port 8050
```
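For reference, a minimal app.py skeleton that accepts these flags and also exposes the underlying Flask server (used as `app:server` by Gunicorn below) could look like the sketch here. The flag and environment variable names match the commands in this README; everything else is illustrative and may differ from the actual implementation.
```python
# Minimal sketch of an app.py entry point (illustrative, not the actual implementation).
import argparse
import os

import dash
from dash import html

parser = argparse.ArgumentParser()
parser.add_argument("--config_file", default=os.getenv("CONFIG_FILE", "configs/swamp_config.yaml"))
parser.add_argument("--site_root", default=os.getenv("SITE_ROOT", ""))
parser.add_argument("--port", type=int, default=int(os.getenv("PORT", "8050")))
args, _ = parser.parse_known_args()  # tolerate Gunicorn's own CLI arguments

# A non-empty site root (e.g. /swamp) must start and end with "/" for Dash.
base_path = args.site_root.strip("/")
url_base = f"/{base_path}/" if base_path else "/"

app = dash.Dash(__name__, url_base_pathname=url_base)
app.layout = html.Div("ecoPi-RTF placeholder layout")

server = app.server  # exposed so Gunicorn can serve the app as "app:server"

if __name__ == "__main__":
    # Set debug=False before running in production.
    app.run(debug=True, port=args.port)
```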
We use Gunicorn to run the app in production. Install it with the following command:
```
sudo apt-get install gunicorn
```
You can now run the app with the following command (from the root directory of the project):
```
gunicorn app:server --bind 0.0.0.0:8050 --workers 4 --env CONFIG_FILE=configs/swamp_config.yaml --env SITE_ROOT=/swamp
```
The app will be available at http://localhost:8050/. You can specify the number of workers to run with the --workers flag based on the number of cores available on your machine. Make sure to set `debug=False` in app.py before running the app in production. You may also have to set SITE_ROOT when using URL forwarding on your domain.
To ensure that the cache is refreshed every 30 minutes, you can set up a cron job. Follow these steps:
- Open the crontab file for editing:
```
crontab -e
```
- Add the following line to the crontab file to call the cache route every 30 minutes:
```
*/30 * * * * curl -X GET http://localhost:8050/cache
```
- Save and close the crontab file.
Make sure your application is running and accessible at http://localhost:8050/ (or your specific site URL) before setting up the cron job.
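The /cache endpoint that the cron job calls is served by the app itself. The sketch below shows what such a route on the underlying Flask server might look like; only the route path matches the cron command above, and the cache-rebuild logic is purely hypothetical.
```python
# Hypothetical sketch of a /cache refresh route on the Flask server behind the Dash app.
import time

from flask import jsonify

from app import server  # the Flask instance exposed by app.py (see the sketch above)

_cache = {"payload": None, "updated_at": 0.0}


def _rebuild_cache():
    # Placeholder: in the real app this would re-fetch detections from the
    # OekoFor API and rebuild any precomputed figures.
    return {"detections": []}


@server.route("/cache")
def refresh_cache():
    _cache["payload"] = _rebuild_cache()
    _cache["updated_at"] = time.time()
    return jsonify({"status": "ok", "updated_at": _cache["updated_at"]})
```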
- Source Code: The source code for this project is licensed under the MIT License.
- Models: The models used in this project are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0).
Please ensure you review and adhere to the specific license terms provided with each model.
Please note that all educational and research purposes are considered non-commercial use, and it is therefore freely permitted to use BirdNET models in those contexts.
Feel free to use BirdNET for your acoustic analyses and research. If you do, please cite as:
```
@article{kahl2021birdnet,
  title={BirdNET: A deep learning solution for avian diversity monitoring},
  author={Kahl, Stefan and Wood, Connor M and Eibl, Maximilian and Klinck, Holger},
  journal={Ecological Informatics},
  volume={61},
  pages={101236},
  year={2021},
  publisher={Elsevier}
}
```
Our work in the K. Lisa Yang Center for Conservation Bioacoustics is made possible by the generosity of K. Lisa Yang to advance innovative conservation technologies to inspire and inform the conservation of wildlife and habitats.
The development of BirdNET is supported by the German Federal Ministry of Research, Technology and Space (FKZ 01|S22072), the German Federal Ministry for the Environment, Climate Action, Nature Conservation and Nuclear Safety (FKZ 67KI31040E), the German Federal Ministry of Economic Affairs and Energy (FKZ 16KN095550), the Deutsche Bundesstiftung Umwelt (project 39263/01) and the European Social Fund.
BirdNET is a joint effort of partners from academia and industry. Without these partnerships, this project would not have been possible. Thank you!

