AI in Merseyside: Server Configuration Guide

Welcome to the Merseyside AI Initiative's server configuration documentation. This guide details the hardware and software setup powering our artificial intelligence projects. It is aimed at newcomers to the wiki and at anyone assisting with server maintenance. Understanding these configurations is vital for successful development and deployment.

Overview

The Merseyside AI Initiative leverages a hybrid server infrastructure, combining on-premise hardware with cloud-based resources. This allows us to balance cost, security, and scalability. This document primarily focuses on the on-premise server cluster located at the Liverpool Science Park. We utilize a distributed computing model, employing several dedicated servers for different tasks: data ingestion, model training, and inference. We also integrate with cloud services for burst capacity and specialized hardware, such as GPUs. See Cloud Integration Overview for details on that aspect.
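The division of labour described above — dedicated servers for ingestion, training, and inference — can be sketched as a simple routing table. The hostnames below are hypothetical placeholders (the real internal names are not documented in this guide):

```python
# Hypothetical hostnames for the three on-premise roles; substitute
# the actual internal names from your cluster inventory.
STAGE_TO_SERVER = {
    "ingestion": "ingest-01.liverpool-sp.internal",
    "training": "train-01.liverpool-sp.internal",
    "inference": "infer-01.liverpool-sp.internal",
}

def route(stage):
    """Return the server responsible for a given pipeline stage."""
    try:
        return STAGE_TO_SERVER[stage]
    except KeyError:
        raise ValueError(f"unknown pipeline stage: {stage!r}")

print(route("training"))  # train-01.liverpool-sp.internal
```

In practice this mapping would live in cluster configuration (e.g. Kubernetes service definitions) rather than in application code, but it captures the task-to-server model the cluster follows.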

Hardware Specifications

The core of our on-premise infrastructure consists of three primary server types. These servers are interconnected via a dedicated 10 Gigabit Ethernet network. Power redundancy is provided by a dual-UPS system, and the server room maintains a constant temperature of 22°C with humidity control. Refer to the Data Center Standards page for detailed environmental specifications.

Server Type           | Model                            | CPU                      | RAM             | Storage                               | Network Interface
Data Ingestion Server | Dell PowerEdge R750              | 2 x Intel Xeon Gold 6338 | 256 GB DDR4 ECC | 2 x 4TB NVMe SSD (RAID 1) + 16TB HDD  | 10 Gigabit Ethernet
Model Training Server | Supermicro SuperServer 2029U-TR4 | 2 x AMD EPYC 7763        | 512 GB DDR4 ECC | 4 x 8TB NVMe SSD (RAID 0)             | 10/40 Gigabit Ethernet
Inference Server      | HP ProLiant DL380 Gen10          | 2 x Intel Xeon Silver 4310 | 128 GB DDR4 ECC | 1 x 1TB NVMe SSD                    | 10 Gigabit Ethernet
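Note that the raw drive counts in the Storage column do not equal usable capacity: RAID 1 mirrors drives (usable capacity of one drive), while RAID 0 stripes them (capacities add, with no redundancy). A quick worked calculation from the table:

```python
# Usable storage per server type, in TB, derived from the table above.

def raid1_usable(count, size_tb):
    # Mirroring: usable capacity equals a single drive, regardless of count.
    return size_tb

def raid0_usable(count, size_tb):
    # Striping: capacities add, but there is no redundancy.
    return count * size_tb

ingestion = raid1_usable(2, 4) + 16  # 2 x 4TB NVMe (RAID 1) + 16TB HDD
training = raid0_usable(4, 8)        # 4 x 8TB NVMe (RAID 0)
inference = 1                        # single 1TB NVMe

print(ingestion, training, inference)  # 20 32 1
```

The RAID 0 array on the training server trades redundancy for throughput, which is acceptable there because training data can be re-staged from the ingestion server if a drive fails.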

Software Stack

All servers run Ubuntu Server 22.04 LTS. We employ a containerized environment using Docker and Kubernetes for application deployment and management. This ensures portability and scalability. We’ve standardized on Python 3.9 for our AI development, alongside libraries like TensorFlow, PyTorch, and scikit-learn. The Software Version Control page documents the precise library versions. All code is hosted on our internal GitLab Instance.
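Since development targets Python 3.9 with a fixed library set, a small environment check can catch mismatched containers before deployment. The sketch below only checks interpreter version and library presence; the exact pinned versions are documented on the Software Version Control page, and the library list here is assumed from the names in this guide:

```python
import importlib.util
import sys

REQUIRED_PYTHON = (3, 9)  # standardized interpreter version per this guide
# Import names for the libraries named in this guide (scikit-learn
# imports as "sklearn", PyTorch as "torch").
REQUIRED_LIBRARIES = ["tensorflow", "torch", "sklearn"]

def check_environment(version_info=sys.version_info,
                      libraries=REQUIRED_LIBRARIES):
    """Return a dict mapping each requirement to True/False."""
    report = {
        "python_ok": (version_info.major, version_info.minor) >= REQUIRED_PYTHON,
    }
    for lib in libraries:
        # find_spec checks availability without importing the library,
        # so this stays fast even for heavy packages.
        report[lib] = importlib.util.find_spec(lib) is not None
    return report

if __name__ == "__main__":
    for name, ok in check_environment().items():
        print(f"{name}: {'OK' if ok else 'MISSING'}")
```

A check like this fits naturally as a container entrypoint step or a CI job, so a misbuilt image fails loudly rather than at model-load time.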

Operating System

All servers run Ubuntu Server 22.04 LTS, applied as a standard image across the cluster so that container hosts behave identically regardless of hardware vendor.