AI in the Guernsey Rainforest

AI in the Guernsey Rainforest: Server Configuration

This article details the server configuration supporting the "AI in the Guernsey Rainforest" project. This project utilizes machine learning to analyze audio and visual data collected from sensors deployed within the Guernsey Rainforest, aiding in biodiversity monitoring and conservation efforts. This guide is intended for new system administrators joining the team and provides a comprehensive overview of the hardware and software setup.

Project Overview

The "AI in the Guernsey Rainforest" project aims to automatically identify and classify species within the rainforest using data gathered from a network of remote sensors. These sensors capture audio recordings and low-resolution images. The data is processed locally at the edge, and then transmitted to a central server cluster for more complex analysis and long-term storage. This server cluster is the focus of this document. The project relies heavily on Semantic MediaWiki for data organization and querying.

Server Hardware Specifications

The server cluster consists of three primary server nodes: a database server, an application server, and a processing/inference server. Each node is built with redundancy in mind.

Server Role | CPU / GPU | RAM | Storage | Network Interface
Database Server | Intel Xeon Gold 6248R (24 cores) | 256 GB DDR4 ECC | 4 x 4TB NVMe SSD (RAID 10) | 10 Gigabit Ethernet
Application Server | AMD EPYC 7763 (64 cores) | 128 GB DDR4 ECC | 2 x 2TB NVMe SSD (RAID 1) + 8TB HDD | 10 Gigabit Ethernet
Processing/Inference Server | Intel Xeon Gold 6338 (32 cores) + 2 x NVIDIA Tesla A100 GPUs | 512 GB DDR4 ECC | 1 x 2TB NVMe SSD | 100 Gigabit Ethernet

All servers run on a dedicated VLAN for security and network isolation. Power redundancy is achieved through dual power supplies and a UPS system. See Server Room Infrastructure for more details.

Software Stack

The software stack is built around open-source components to minimize licensing costs and maximize flexibility. The operating system of choice is Ubuntu Server 22.04 LTS.

Database Server

The database server hosts the project's primary data store: a PostgreSQL database.

Component | Version | Configuration Notes
Operating System | Ubuntu Server 22.04 LTS | Standard security hardening applied
PostgreSQL | 14.7 | Configured with WAL archiving and regular backups; the Database Backup Procedures document details the process.
pgAdmin 4 | 4.16 | Used for database administration and monitoring.

The database schema is designed to efficiently store and query sensor data, species classifications, and related metadata. Refer to the Database Schema Documentation for a complete description.
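As a minimal sketch of how the application layer queries this store, the example below uses psycopg2 with hypothetical hostnames, table names, and column names; the authoritative schema is in the Database Schema Documentation.

  import psycopg2

  conn = psycopg2.connect(
      host="db.rainforest.internal",   # hypothetical hostname on the project VLAN
      dbname="rainforest",             # hypothetical database name
      user="readonly",
      password="changeme",
  )

  with conn, conn.cursor() as cur:
      # Fetch the ten most recent classifications for one sensor.
      cur.execute(
          """
          SELECT c.species_name, c.confidence, o.recorded_at
          FROM classifications AS c
          JOIN observations AS o ON o.id = c.observation_id
          WHERE o.sensor_id = %s
          ORDER BY o.recorded_at DESC
          LIMIT 10
          """,
          ("sensor-042",),
      )
      for species, confidence, recorded_at in cur.fetchall():
          print(recorded_at, species, round(confidence, 2))

  conn.close()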

Application Server

The application server runs the core application logic, providing an API for data access and management. It's built using Python with the Flask web framework.

Component | Version | Configuration Notes
Operating System | Ubuntu Server 22.04 LTS | Standard security hardening applied
Python | 3.10 | Virtual environment managed with venv.
Flask | 2.2.2 | Deployed using Gunicorn as a WSGI server.
Nginx | 1.22.1 | Used as a reverse proxy and load balancer. See Nginx Configuration Guide.

The application server interacts with the database server to retrieve and store data. It also communicates with the processing/inference server to request species classifications.
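As a minimal sketch of that interaction, the Flask endpoint below forwards a classification request to the inference server and relays the result. The route, hostname, and port are assumptions for illustration rather than the project's actual API; see API Documentation for the real interface.

  from flask import Flask, jsonify
  import requests

  app = Flask(__name__)

  # Hypothetical internal endpoint exposed by the processing/inference server.
  INFERENCE_URL = "http://inference.rainforest.internal:8500/classify"

  @app.route("/api/v1/observations/<int:observation_id>/classify", methods=["POST"])
  def classify_observation(observation_id):
      # Forward the request to the inference server and return its answer to the caller.
      resp = requests.post(INFERENCE_URL, json={"observation_id": observation_id}, timeout=30)
      resp.raise_for_status()
      return jsonify(resp.json())

  if __name__ == "__main__":
      # For local testing only; in production the app is served by Gunicorn behind Nginx.
      app.run(port=5000)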

Processing/Inference Server

This server is responsible for running the machine learning models used to classify species. It utilizes the TensorFlow framework and is optimized for GPU acceleration.

  • Operating System: Ubuntu Server 22.04 LTS
  • CUDA Toolkit: 11.7
  • cuDNN: 8.4.0
  • TensorFlow: 2.10.0
  • Python: 3.10

The models are periodically updated and deployed using a CI/CD pipeline. See Model Deployment Process for detailed instructions. GPU utilization is monitored using NVIDIA System Management Interface (nvidia-smi).
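For orientation, the snippet below is a minimal inference sketch rather than the production pipeline: the model path, input shape, and dummy input are assumptions, and the deployed models are delivered through the CI/CD pipeline described in the Model Deployment Process.

  import numpy as np
  import tensorflow as tf

  # Confirm the A100 GPUs are visible to TensorFlow before running inference.
  print("Visible GPUs:", tf.config.list_physical_devices("GPU"))

  # Hypothetical model location; the real models are managed by the CI/CD pipeline.
  model = tf.keras.models.load_model("/opt/models/species_classifier")

  # One 224x224 RGB input (e.g. a camera frame or precomputed spectrogram), scaled to [0, 1].
  batch = np.random.rand(1, 224, 224, 3).astype("float32")

  probs = model.predict(batch)
  top = int(np.argmax(probs[0]))
  print(f"Predicted class index: {top}, confidence: {probs[0][top]:.3f}")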

Network Configuration

All servers reside within a dedicated VLAN (192.168.10.0/24). Firewall rules are configured to restrict access to only necessary ports. DNS resolution is provided by an internal DNS server. The servers are also accessible via a secure VPN connection for remote administration. Refer to Network Diagram for a visual representation of the network topology.
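As a quick illustration of the addressing scheme, the snippet below checks host addresses against the VLAN's subnet; the individual assignments are placeholders, not the actual address plan (see Network Diagram).

  import ipaddress

  vlan = ipaddress.ip_network("192.168.10.0/24")

  # Hypothetical host assignments within the dedicated VLAN.
  hosts = {
      "database": "192.168.10.10",
      "application": "192.168.10.20",
      "inference": "192.168.10.30",
  }

  for role, addr in hosts.items():
      print(f"{role}: {addr} in {vlan}: {ipaddress.ip_address(addr) in vlan}")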

Security Considerations

Security is paramount. All servers are regularly patched and updated, intrusion detection and prevention systems are in place, access control is strictly enforced, and data is encrypted both in transit and at rest. See Security Policy for a comprehensive overview of the project's security measures; regular security audits are conducted by the Security Team.

Future Expansion

As the project evolves, we anticipate scaling the server cluster to handle increased data volumes and processing demands. This may involve adding more processing/inference servers or migrating to a cloud-based infrastructure. The Scalability Planning document details the proposed expansion strategies.



