AI in the Sint Eustatius Rainforest: Server Configuration

This article details the server configuration supporting the “AI in the Sint Eustatius Rainforest” project. This project leverages artificial intelligence to monitor and analyze biodiversity within the rainforest ecosystem of Sint Eustatius. This document is intended for new system administrators and developers joining the project, providing a comprehensive overview of the server infrastructure.

Project Overview

The “AI in the Sint Eustatius Rainforest” project utilizes a network of sensor nodes deployed throughout the rainforest, collecting data on audio, visual, and environmental conditions. This data is transmitted to a central server cluster for processing and analysis using machine learning algorithms. The primary goals are species identification, population monitoring, and early detection of ecological changes. See also Data Acquisition Pipeline and Machine Learning Models. Understanding the Sensor Network Topology is critical for troubleshooting.
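
To make the data flow concrete, a single sensor reading might be modeled as a simple record like the one below. This is an illustrative sketch only; the field names and the payload URI scheme are assumptions, not the project's actual schema (see the Data Acquisition Pipeline document for the real one).

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    """Hypothetical per-sample record; field names are illustrative."""
    node_id: str         # e.g. "node-042", a sensor deployed in the rainforest
    timestamp: datetime  # UTC acquisition time
    kind: str            # "audio", "visual", or "environmental"
    payload_uri: str     # where the raw sample lives in storage
    battery_pct: float   # remaining battery, for maintenance planning

reading = SensorReading(
    node_id="node-042",
    timestamp=datetime.now(timezone.utc),
    kind="audio",
    payload_uri="ceph://raw/node-042/sample.flac",
    battery_pct=87.5,
)
```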

Server Infrastructure

The server infrastructure is hosted in a secure, climate-controlled data center on Sint Eustatius. The core components consist of three primary server roles: Data Ingestion, Processing/AI, and Data Storage. All three roles run on bare-metal servers for performance and security reasons; we avoid virtual machines because of the real-time processing demands. See Security Protocols for details on data center security.

Data Ingestion Servers

These servers are responsible for receiving data streams from the sensor network. They perform initial data validation and buffering before forwarding the data to the processing cluster. These servers run a custom-built ingestion service written in Python. Refer to the Ingestion Service Documentation for details.

| Server Name      | Role                    | Operating System        | CPU                               | RAM            | Network Interface |
|------------------|-------------------------|-------------------------|-----------------------------------|----------------|-------------------|
| statia-ingest-01 | Data Ingestion          | Ubuntu Server 22.04 LTS | Intel Xeon Silver 4310 (12 cores) | 64 GB DDR4 ECC | 10 Gbps Ethernet  |
| statia-ingest-02 | Data Ingestion (Backup) | Ubuntu Server 22.04 LTS | Intel Xeon Silver 4310 (12 cores) | 64 GB DDR4 ECC | 10 Gbps Ethernet  |
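
As a rough illustration of the validation-and-buffering step described above, the sketch below accepts a raw message, checks it against a minimal schema, and queues it for forwarding. All names here are assumptions for illustration; the real service's interfaces are covered in the Ingestion Service Documentation.

```python
import json
import queue

# Hypothetical in-memory buffer between validation and forwarding.
buffer = queue.Queue(maxsize=10_000)

# Assumed minimal schema; the real required fields may differ.
REQUIRED_FIELDS = {"node_id", "timestamp", "kind"}

def ingest(raw: bytes) -> bool:
    """Validate one incoming message and buffer it; return True if accepted."""
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError:
        return False  # malformed payload is dropped (or dead-lettered)
    if not isinstance(msg, dict) or not REQUIRED_FIELDS.issubset(msg):
        return False  # reject messages missing required fields
    try:
        buffer.put_nowait(msg)
    except queue.Full:
        return False  # apply backpressure when the buffer is full
    return True
```

A bounded queue is used deliberately: when the processing cluster falls behind, the ingestion side rejects rather than exhausting memory.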

Processing/AI Servers

These servers are the heart of the project, performing the computationally intensive tasks of machine learning model training and inference. They utilize GPUs to accelerate these processes. These servers run a containerized environment using Docker and Kubernetes for scalability and manageability. See the Kubernetes Configuration document.

| Server Name  | Role          | Operating System        | CPU                             | RAM             | GPU                | Storage       |
|--------------|---------------|-------------------------|---------------------------------|-----------------|--------------------|---------------|
| statia-ai-01 | AI Processing | Ubuntu Server 22.04 LTS | Intel Xeon Gold 6338 (32 cores) | 128 GB DDR4 ECC | NVIDIA A100 (80GB) | 4 TB NVMe SSD |
| statia-ai-02 | AI Processing | Ubuntu Server 22.04 LTS | Intel Xeon Gold 6338 (32 cores) | 128 GB DDR4 ECC | NVIDIA A100 (80GB) | 4 TB NVMe SSD |
| statia-ai-03 | AI Processing | Ubuntu Server 22.04 LTS | Intel Xeon Gold 6338 (32 cores) | 128 GB DDR4 ECC | NVIDIA A100 (80GB) | 4 TB NVMe SSD |
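
To show how a GPU-backed workload might be scheduled onto these nodes, here is a minimal Kubernetes Deployment fragment. The image name and labels are placeholders, not the project's actual manifests (see the Kubernetes Configuration document); the `nvidia.com/gpu` resource is exposed by the standard NVIDIA device plugin.

```yaml
# Illustrative Deployment; image and names are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: inference-worker
spec:
  replicas: 3
  selector:
    matchLabels:
      app: inference-worker
  template:
    metadata:
      labels:
        app: inference-worker
    spec:
      containers:
      - name: worker
        image: registry.example/statia/inference:latest
        resources:
          limits:
            nvidia.com/gpu: 1   # one A100 per pod via the NVIDIA device plugin
```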

Data Storage Servers

These servers provide persistent storage for the raw sensor data, processed data, and model artifacts. They utilize a distributed file system for redundancy and scalability. We use Ceph as our distributed filesystem. See Ceph Cluster Configuration for details.

| Server Name       | Role         | Operating System        | CPU                               | RAM            | Storage           |
|-------------------|--------------|-------------------------|-----------------------------------|----------------|-------------------|
| statia-storage-01 | Data Storage | Ubuntu Server 22.04 LTS | Intel Xeon Silver 4310 (12 cores) | 96 GB DDR4 ECC | 16 TB HDD (RAID6) |
| statia-storage-02 | Data Storage | Ubuntu Server 22.04 LTS | Intel Xeon Silver 4310 (12 cores) | 96 GB DDR4 ECC | 16 TB HDD (RAID6) |
| statia-storage-03 | Data Storage | Ubuntu Server 22.04 LTS | Intel Xeon Silver 4310 (12 cores) | 96 GB DDR4 ECC | 16 TB HDD (RAID6) |
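
For capacity planning it helps to distinguish raw from usable space. Assuming Ceph's default 3-way replicated pools (an assumption; the actual pool layout is in the Ceph Cluster Configuration document), the three storage nodes above work out roughly as follows:

```python
# Rough usable-capacity estimate for the storage tier.
# Assumes 3-way replicated Ceph pools (the default), not erasure coding.
NODES = 3
TB_PER_NODE = 16          # post-RAID6 capacity listed per server
REPLICATION_FACTOR = 3    # assumed Ceph pool size

raw_tb = NODES * TB_PER_NODE
usable_tb = raw_tb / REPLICATION_FACTOR

print(f"raw: {raw_tb} TB, usable: {usable_tb:.0f} TB")  # raw: 48 TB, usable: 16 TB
```

An erasure-coded pool would trade some of this overhead for CPU cost; the replicated estimate above is the conservative baseline.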

Networking

All servers are connected via a dedicated 10 Gbps network. A firewall protects the network from external threats, and internal communication between servers is secured with TLS. See the Network Diagram for a visual representation of the network topology. Network monitoring is handled by Nagios.
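
For internal services, the TLS policy above might translate into a client-side context like this stdlib sketch. The certificate path is a placeholder, not the project's actual PKI layout:

```python
import ssl

# Build a client context enforcing the internal TLS policy:
# certificate verification on, TLS 1.2 as the floor.
ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
ctx.check_hostname = True
ctx.verify_mode = ssl.CERT_REQUIRED

# Trust the internal CA instead of the system bundle (path is a placeholder):
# ctx.load_verify_locations("/etc/statia/pki/internal-ca.pem")
```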

Software Stack
