AI in the Samoan Rainforest: Server Configuration

This article details the server configuration for the “AI in the Samoan Rainforest” project, designed to process data from a network of sensors deployed throughout the rainforests of Samoa. This project utilizes artificial intelligence to analyze biodiversity, track invasive species, and monitor environmental changes. This document is aimed at new system administrators joining the team.

Project Overview

The “AI in the Samoan Rainforest” project relies on real-time data analysis of audio, visual, and environmental sensor data. Data is collected from remote sensor nodes and transmitted to a central server farm for processing. The server infrastructure is crucial for the project's success, requiring high availability, scalability, and robust data storage. Understanding the individual components and their interdependencies is key to effective maintenance and troubleshooting. We utilize a hybrid cloud approach, leveraging both on-premise hardware and cloud services for optimal performance and cost-effectiveness. See also Data Acquisition Protocols and Sensor Network Topology.

Server Hardware Specification

The core of the server infrastructure consists of three primary server types: Data Ingestion Servers, Processing Servers, and Database Servers. The following tables detail the specifications for each:

Server Type | CPU | RAM | Storage | Network Interface
Data Ingestion Servers | Intel Xeon Silver 4310 (12 cores) | 64 GB DDR4 ECC | 4 TB NVMe SSD (RAID 1) | 10 Gbps Ethernet
Processing Servers | AMD EPYC 7763 (64 cores) | 256 GB DDR4 ECC | 8 TB NVMe SSD (RAID 0) + 16 TB HDD (RAID 5) | 25 Gbps Ethernet
Database Servers | Intel Xeon Gold 6338 (32 cores) | 128 GB DDR4 ECC | 16 x 4 TB SAS HDD (RAID 6) | 10 Gbps Ethernet

These servers are housed in a dedicated, climate-controlled server room located at the research facility in Apia. Power redundancy is provided by a UPS system with a minimum of 30 minutes of backup power. A detailed inventory of all hardware is maintained in the Hardware Inventory Database.
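The RAID levels above trade raw capacity for redundancy, which matters when planning storage growth. As a quick sanity check, usable capacity can be estimated with a small sketch (drive counts taken from the specification table; this ignores filesystem and controller overhead):

```python
def raid_usable_tb(level: str, drives: int, size_tb: float) -> float:
    """Estimate usable capacity (TB) for common RAID levels."""
    if level == "raid0":   # striping: no redundancy, full capacity
        return drives * size_tb
    if level == "raid1":   # mirroring: half the raw capacity
        return drives * size_tb / 2
    if level == "raid5":   # one drive's worth of parity
        return (drives - 1) * size_tb
    if level == "raid6":   # two drives' worth of parity
        return (drives - 2) * size_tb
    raise ValueError(f"unsupported RAID level: {level}")

# Database Servers: 16 x 4 TB SAS HDD in RAID 6
print(raid_usable_tb("raid6", 16, 4.0))  # 56.0 TB usable
```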

Software Stack

The software stack is designed for efficient data processing and storage. The following table summarizes the key software components:

Component | Purpose | Operating System
Ubuntu Server 22.04 LTS | Base operating system for all servers | Ubuntu
Nginx | Reverse proxy and load balancer | Ubuntu
Python 3.9 with TensorFlow 2.8 | AI model execution and data analysis | Ubuntu
PostgreSQL 14 | Data storage and retrieval | Ubuntu
Prometheus & Grafana | System monitoring and visualization | Ubuntu
Docker & Kubernetes | Application deployment and orchestration | Ubuntu

All code is version controlled with Git; access details are documented in Git Repository Access. Deployment is automated through the Continuous Integration/Continuous Deployment (CI/CD) Pipeline. Regular security audits are performed as per the Security Policy.
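Prometheus, listed above for monitoring, scrapes each server's metrics in a plain-text exposition format. A minimal sketch of what an exporter on one of these servers might emit (the metric and label names here are illustrative assumptions, not the project's actual metrics):

```python
def render_metrics(metrics: dict, labels: dict) -> str:
    """Render gauge values in Prometheus text exposition format."""
    label_str = ",".join(f'{k}="{v}"' for k, v in sorted(labels.items()))
    lines = []
    for name, value in sorted(metrics.items()):
        lines.append(f"# TYPE {name} gauge")        # metadata line for the scraper
        lines.append(f"{name}{{{label_str}}} {value}")
    return "\n".join(lines) + "\n"

print(render_metrics(
    {"ingest_queue_depth": 42, "sensor_nodes_online": 118},
    {"role": "data-ingestion", "site": "apia"},
))
```

Grafana then visualises whatever Prometheus collects; the exporter only needs to serve text in this shape over HTTP.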

Network Configuration

The server network is segmented into three VLANs: Data Ingestion, Processing, and Database. This segmentation enhances security and improves network performance. Firewall rules are configured to restrict communication between VLANs to only necessary services.

Subnet | Purpose | Gateway
192.168.10.0/24 | Data Ingestion Servers | 192.168.10.1
192.168.20.0/24 | Processing Servers | 192.168.20.1
192.168.30.0/24 | Database Servers | 192.168.30.1

External access is provided through a dedicated internet connection with a static IP address. DNS records are managed internally using Internal DNS Configuration. Network diagrams are available in the Network Topology Documentation.
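The VLAN boundaries in the table can be checked programmatically, which is handy when triaging a misrouted host. A small sketch using Python's standard `ipaddress` module (subnets are from the table above; the helper name is our own):

```python
import ipaddress
from typing import Optional

# Subnet-to-role mapping from the VLAN table
VLANS = {
    "data-ingestion": ipaddress.ip_network("192.168.10.0/24"),
    "processing": ipaddress.ip_network("192.168.20.0/24"),
    "database": ipaddress.ip_network("192.168.30.0/24"),
}

def vlan_for(ip: str) -> Optional[str]:
    """Return the VLAN role a host address belongs to, or None if outside all VLANs."""
    addr = ipaddress.ip_address(ip)
    for role, net in VLANS.items():
        if addr in net:
            return role
    return None

print(vlan_for("192.168.20.57"))  # processing
print(vlan_for("10.0.0.5"))       # None: not in any project VLAN
```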


Data Flow

1. Sensor data is transmitted to the Data Ingestion Servers.
2. Nginx load balances the incoming data across multiple Data Ingestion Servers.
3. Data Ingestion Servers validate and pre-process the data.
4. Pre-processed data is sent to the Processing Servers via the network.
5. Processing Servers execute AI models to analyze the data.
6. Analyzed data is stored in the Database Servers.
7. Researchers access the data through a web interface. See Data Access Procedures.
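Step 3 (validation and pre-processing on the Data Ingestion Servers) might look roughly like the following sketch. The field names, sensor kinds, and timestamp handling are illustrative assumptions, not the project's actual schema:

```python
from datetime import datetime, timezone

REQUIRED_FIELDS = {"sensor_id", "timestamp", "kind", "payload"}
VALID_KINDS = {"audio", "image", "environment"}  # modalities from the project overview

def validate_reading(reading: dict) -> dict:
    """Validate a raw sensor reading and normalise its timestamp to UTC.

    Raises ValueError on malformed input so bad readings can be quarantined
    instead of being forwarded to the Processing Servers.
    """
    missing = REQUIRED_FIELDS - reading.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    if reading["kind"] not in VALID_KINDS:
        raise ValueError(f"unknown sensor kind: {reading['kind']!r}")
    ts = datetime.fromisoformat(reading["timestamp"]).astimezone(timezone.utc)
    return {**reading, "timestamp": ts.isoformat()}

ok = validate_reading({
    "sensor_id": "node-017",
    "timestamp": "2024-05-01T06:30:00+13:00",  # Samoa local time (UTC+13)
    "kind": "audio",
    "payload": "...",
})
print(ok["timestamp"])  # 2024-04-30T17:30:00+00:00
```

Normalising timestamps at ingestion keeps every downstream component working in UTC, regardless of the sensor clock's offset.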

Future Considerations

Future enhancements to the server infrastructure may include:

  • Expanding the server farm to accommodate increased data volume.
  • Implementing a distributed caching system to improve performance.
  • Migrating to a fully cloud-based solution for greater scalability and resilience. See Cloud Migration Strategy.
  • Exploring the use of GPU acceleration for faster AI model training and inference.
