# AI in the Mediterranean Sea: Server Configuration

This article details the server configuration supporting the "AI in the Mediterranean Sea" project, a research initiative utilizing artificial intelligence to monitor and analyze marine ecosystems. This guide is aimed at newcomers to our MediaWiki site and provides a detailed overview of the hardware and software infrastructure. It assumes a basic understanding of server administration and networking concepts. Please refer to Help:Contents for general MediaWiki help.

## Project Overview

The "AI in the Mediterranean Sea" project involves deploying a network of underwater sensors that collect data on temperature, salinity, marine life, and pollution levels. This data is transmitted to our central server cluster for processing and analysis using machine learning algorithms. The goal is to provide real-time insights into the health of the Mediterranean Sea and to predict potential ecological changes. See Project Goals for more details.

## Server Infrastructure

The project relies on a distributed server infrastructure composed of three primary tiers: Data Acquisition, Data Processing, and Data Storage. Each tier utilizes specialized hardware and software components. Refer to the Network Diagram for a visual representation of the server architecture.

### Data Acquisition Servers

These servers are located near the coastal monitoring stations. They receive data from the underwater sensors and perform initial data validation and pre-processing. They run a lightweight operating system and require only minimal processing power.
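The initial validation step can be sketched as a simple range check before a reading is forwarded. This is an illustrative example only; the bounds and field names below are assumptions, not the project's actual validation rules.

```python
# Illustrative pre-processing sketch for a data acquisition server.
# The physical bounds below are plausible for Mediterranean waters but
# are assumptions, not the project's real thresholds.

TEMP_RANGE_C = (5.0, 35.0)        # assumed plausible temperature range
SALINITY_RANGE_PSU = (30.0, 42.0)  # assumed plausible salinity range


def validate_reading(temp_c: float, salinity_psu: float) -> bool:
    """Reject readings outside plausible physical ranges."""
    return (
        TEMP_RANGE_C[0] <= temp_c <= TEMP_RANGE_C[1]
        and SALINITY_RANGE_PSU[0] <= salinity_psu <= SALINITY_RANGE_PSU[1]
    )
```

Readings that fail the check would be flagged or dropped before transmission, keeping obviously corrupt sensor values out of the processing tier.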

**Data Acquisition Server Specifications**

| Specification | Value |
|---|---|
| Operating System | Ubuntu Server 22.04 LTS |
| Processor | Intel Celeron J4125 |
| RAM | 8 GB DDR4 |
| Storage | 256 GB SSD |
| Network Interface | 1 Gbps Ethernet |
| Data Protocol | MQTT over TLS |

These servers run an MQTT broker for secure data transmission from the sensors. See Data Acquisition Protocol for specifics.
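As an illustration of what an MQTT-over-TLS listener might look like, here is a minimal Mosquitto configuration fragment. The file paths and certificate names are placeholders, not the project's actual deployment values.

```
# Illustrative mosquitto.conf fragment (paths are placeholders).
listener 8883
cafile /etc/mosquitto/certs/ca.crt
certfile /etc/mosquitto/certs/server.crt
keyfile /etc/mosquitto/certs/server.key
require_certificate true
allow_anonymous false
```

Requiring client certificates ensures that only provisioned sensors can publish data to the broker.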

### Data Processing Servers

This tier constitutes the core of the AI system. These servers are responsible for running the machine learning algorithms, analyzing the data, and generating insights. They require significant processing power and memory. These servers also use Docker containers to isolate and manage different AI models.
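A model could be packaged for this tier with a Dockerfile along the following lines. The base image tag, file names, and entry point are illustrative assumptions, not the project's actual build artifacts.

```dockerfile
# Illustrative Dockerfile for one containerized AI model service.
# Image tag and script names are placeholders.
FROM tensorflow/tensorflow:2.15.0-gpu

WORKDIR /app

# Install model-specific dependencies first to take advantage of layer caching.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the model code and start the inference service.
COPY . .
CMD ["python", "serve_model.py"]
```

Packaging each model in its own image keeps dependencies isolated, as the paragraph above describes.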

**Data Processing Server Specifications**

| Specification | Value |
|---|---|
| Operating System | CentOS Stream 9 |
| Processor | 2 x AMD EPYC 7763 (64 cores each) |
| RAM | 256 GB DDR4 ECC |
| Storage | 2 x 2 TB NVMe SSD (RAID 1) |
| Network Interface | 10 Gbps Ethernet |
| GPU | 4 x NVIDIA A100 (80GB) |

The machine learning framework used is TensorFlow, with models trained on large datasets. We also use Kubernetes to orchestrate the AI workloads.
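A Kubernetes workload on this tier might request GPUs as in the sketch below. The deployment name, image, and replica count are placeholders for illustration only.

```yaml
# Illustrative Kubernetes Deployment; names and image are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: marine-ai-inference
spec:
  replicas: 2
  selector:
    matchLabels:
      app: marine-ai-inference
  template:
    metadata:
      labels:
        app: marine-ai-inference
    spec:
      containers:
        - name: model-server
          image: registry.example.org/marine-ai:latest  # placeholder image
          resources:
            limits:
              nvidia.com/gpu: 1  # schedule one A100 per replica
```

Declaring the `nvidia.com/gpu` resource limit lets the scheduler place each replica on a node with a free GPU.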

### Data Storage Servers

These servers are dedicated to storing the raw and processed data. They require high storage capacity and reliability. Data is backed up daily to an offsite location using rsync.
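The daily offsite backup described above could be scheduled with a crontab entry like the following. The host name, paths, and schedule are illustrative placeholders.

```
# Illustrative crontab entry; host and paths are placeholders.
# Daily at 02:30, mirror the data pool to the offsite host over SSH.
30 2 * * * rsync -az --delete /tank/mediterranean/ backup@offsite.example.org:/backups/mediterranean/
```

The `-a` flag preserves permissions and timestamps, `-z` compresses data in transit, and `--delete` keeps the offsite copy an exact mirror.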

**Data Storage Server Specifications**

| Specification | Value |
|---|---|
| Operating System | Debian 11 |
| Processor | Intel Xeon Silver 4310 |
| RAM | 64 GB DDR4 ECC |
| Storage | 8 x 16 TB SAS HDD (RAID 6) |
| Network Interface | 10 Gbps Ethernet |
| File System | ZFS |

The database system is PostgreSQL, organized around time-series data. See Database Schema for detailed information.
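A time-series-oriented table in PostgreSQL might look like the sketch below. The table and column names are assumptions for illustration; the authoritative definitions live on the Database Schema page.

```sql
-- Illustrative schema sketch; names are placeholders, not the
-- project's actual Database Schema.
CREATE TABLE sensor_readings (
    station_id   INTEGER      NOT NULL,
    recorded_at  TIMESTAMPTZ  NOT NULL,
    temp_c       REAL,
    salinity_psu REAL,
    PRIMARY KEY (station_id, recorded_at)
);

-- Secondary index to speed up time-range queries across stations.
CREATE INDEX ON sensor_readings (recorded_at);
```

Keying on `(station_id, recorded_at)` makes per-station time-range queries efficient, which suits the monitoring workload described above.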

## Software Stack

The software stack is built entirely from open-source components: the operating systems (Ubuntu Server, CentOS Stream, Debian), Docker and Kubernetes for containerization and orchestration, TensorFlow for machine learning, MQTT over TLS for sensor transport, PostgreSQL for the database, ZFS for the storage file system, and rsync for backups.
