# AI in the Pyrenees: Server Configuration

This document details the server configuration powering the "AI in the Pyrenees" project, which uses artificial intelligence for environmental monitoring and predictive analysis in the Pyrenees mountain range. The article is aimed at newcomers to our MediaWiki infrastructure and provides a technical overview of the hardware and software deployed.

## Overview

The "AI in the Pyrenees" project relies on a distributed server infrastructure to process data from a network of sensors deployed throughout the mountain range. These sensors collect data on temperature, humidity, wind speed, snow depth, and wildlife activity. The data is transmitted to regional hubs, which then forward it to the central processing servers located in a secure data center. The servers employ machine learning algorithms to identify patterns, predict environmental changes, and provide alerts to relevant authorities. We utilize a hybrid cloud approach, combining on-premise servers for low-latency processing with cloud resources for scalability and long-term storage. See also Data Acquisition Systems and Sensor Networks.
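The edge-to-hub flow described above can be sketched in Python. This is a minimal illustration only: the `SensorReading` fields match the measurements listed in this overview, but the class name and the hub payload format are assumptions, not the project's actual wire format.

```python
from dataclasses import dataclass


@dataclass
class SensorReading:
    """One measurement batch from a field sensor (fields per the overview)."""
    sensor_id: str
    temperature_c: float
    humidity_pct: float
    wind_speed_ms: float
    snow_depth_cm: float


def forward_to_hub(reading: SensorReading) -> dict:
    """Package a reading as a plain dict for transmission to a regional hub.

    The payload layout here is a hypothetical example, not the real protocol.
    """
    return {
        "sensor": reading.sensor_id,
        "payload": {
            "temperature_c": reading.temperature_c,
            "humidity_pct": reading.humidity_pct,
            "wind_speed_ms": reading.wind_speed_ms,
            "snow_depth_cm": reading.snow_depth_cm,
        },
    }
```

In practice the payload would be serialized (e.g. as JSON) before leaving the edge server; the dict form keeps the sketch self-contained.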

## Hardware Configuration

The core of our infrastructure consists of three tiers of servers: edge servers, regional hubs, and central processing servers.

### Edge Servers

These servers are located near the sensor networks to provide initial data processing and filtering. They are ruggedized for harsh environmental conditions.

| Specification | Value |
|---|---|
| Processor | Intel Xeon E-2388G (8 cores, 3.2 GHz) |
| RAM | 64 GB DDR4 ECC |
| Storage | 1 TB NVMe SSD |
| Network Interface | Dual Gigabit Ethernet |
| Operating System | Ubuntu Server 22.04 LTS |
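The "initial data processing and filtering" done on edge servers can be illustrated with a simple range check that drops physically implausible values before transmission. The field names and thresholds below are assumptions for the sketch, not the project's validated limits:

```python
# Plausible physical ranges for mountain conditions; these thresholds
# are illustrative assumptions, not the project's calibrated limits.
VALID_RANGES = {
    "temperature_c": (-40.0, 45.0),
    "humidity_pct": (0.0, 100.0),
    "wind_speed_ms": (0.0, 75.0),
    "snow_depth_cm": (0.0, 1000.0),
}


def filter_reading(reading: dict) -> dict:
    """Drop fields whose values fall outside their plausible range.

    Fields without a configured range pass through unchanged.
    """
    return {
        field: value
        for field, value in reading.items()
        if field not in VALID_RANGES
        or VALID_RANGES[field][0] <= value <= VALID_RANGES[field][1]
    }
```

Filtering at the edge keeps obviously faulty sensor values (e.g. a disconnected probe reporting -60 °C) from consuming uplink bandwidth to the regional hubs.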

### Regional Hubs

These servers aggregate data from multiple edge servers and perform preliminary analysis. They act as a gateway to the central processing servers. Consider reviewing Network Topology for more details.

| Specification | Value |
|---|---|
| Processor | AMD EPYC 7302P (16 cores, 3.0 GHz) |
| RAM | 128 GB DDR4 ECC |
| Storage | 2 x 2 TB NVMe SSD (RAID 1) |
| Network Interface | Quad Gigabit Ethernet |
| Operating System | CentOS Stream 9 |
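The aggregation step a regional hub performs can be sketched as a per-field average over a batch of edge readings. This is a simplified stand-in for the actual preliminary analysis, which is not specified here:

```python
from collections import defaultdict
from statistics import mean


def aggregate(readings: list[dict]) -> dict:
    """Average each numeric field across a batch of edge-server readings.

    A hub would typically also attach a window timestamp and region ID;
    those details are omitted to keep the sketch minimal.
    """
    grouped = defaultdict(list)
    for reading in readings:
        for field, value in reading.items():
            grouped[field].append(value)
    return {field: mean(values) for field, values in grouped.items()}
```

Aggregating at the hub reduces the volume forwarded to the central processing servers while preserving the regional signal the models need.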

### Central Processing Servers

These servers handle the bulk of the data processing, model training, and analysis. They are housed in a secure data center with redundant power and cooling. See Data Center Security for detailed information.

| Specification | Value |
|---|---|
| Processor | Dual Intel Xeon Platinum 8380 (40 cores per processor, 2.3 GHz) |
| RAM | 512 GB DDR4 ECC |
| Storage | 8 x 4 TB NVMe SSD (RAID 6) + 100 TB HDD Array |
| Network Interface | Dual 10 Gigabit Ethernet |
| GPU | 4 x NVIDIA A100 (80GB) |
| Operating System | Red Hat Enterprise Linux 8 |
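The alerting side of the central servers' analysis can be illustrated with a simple z-score anomaly check: flag the latest reading when it deviates too far from recent history. The real system trains machine learning models on the GPUs listed above; this stdlib-only version is a minimal stand-in, and the threshold of 3.0 is an assumption:

```python
from statistics import mean, stdev


def anomaly_alert(history: list[float], latest: float,
                  z_threshold: float = 3.0) -> bool:
    """Return True when `latest` deviates from `history` by more than
    `z_threshold` standard deviations. Requires at least two history points.
    """
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return False  # flat history: no baseline variance to compare against
    return abs(latest - mu) / sigma > z_threshold
```

A check like this would run after model inference as a sanity layer, so that authorities are alerted on sudden shifts (e.g. rapid snow-depth change) even between full model runs.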

## Software Configuration

The software stack is built around Python and various machine learning libraries. Review Software Dependencies for a complete list.
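Since the full dependency list lives on the Software Dependencies page, here is a dependency-free illustration of the kind of predictive analysis the Python stack performs: an ordinary least-squares fit of a linear trend (e.g. snow depth over time). The function name and use case are illustrative assumptions:

```python
def fit_linear_trend(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Ordinary least-squares fit of y = a*x + b, pure stdlib Python.

    Returns (a, b). In production this would be replaced by the project's
    ML libraries; the closed-form math is shown here for clarity.
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b
```

Fitting `[0, 1, 2, 3]` against `[1, 3, 5, 7]` recovers slope 2 and intercept 1, i.e. a steady upward trend.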
