# AI in the Rocky Mountains: Server Configuration

This article details the server configuration for our "AI in the Rocky Mountains" project, focused on analyzing wildlife patterns using machine learning. It's designed for newcomers to our MediaWiki site and provides a technical overview of the hardware and software powering this initiative. This document assumes a basic understanding of server administration and Linux concepts.

## Project Overview

The "AI in the Rocky Mountains" project aims to process data from a network of remote cameras deployed across the Rocky Mountain region. This data includes images and video of wildlife, which are then analyzed using machine learning models to identify species, track movements, and monitor population trends. The servers are located in a secure, climate-controlled data center in Boulder, Colorado, chosen for its proximity to the study area and reliable power infrastructure. See Data Acquisition for details on camera setup.

## Hardware Configuration

The server infrastructure consists of three primary server types: Input Servers, Processing Servers, and Storage Servers. Each type is configured with specific hardware to optimize its role. We utilize a clustered architecture for redundancy and scalability, as detailed in our Clustering Guide.

### Input Servers

These servers receive data streams from the remote camera network. They perform initial validation and buffering before forwarding data to the Processing Servers.

| Component | Specification |
|-----------|---------------|
| CPU | 2 x Intel Xeon Silver 4310 (12 cores, 2.1 GHz) |
| RAM | 64 GB DDR4 ECC Registered |
| Storage (temporary) | 2 x 1 TB NVMe SSD (RAID 1) – for buffering incoming data |
| Network interface | 2 x 10 Gbps Ethernet |
| Operating system | Ubuntu Server 22.04 LTS |

### Processing Servers

These are the workhorses of the project, running the machine learning models to analyze the incoming data. They require significant computational power. For more information on the AI models used, see Machine Learning Models.

| Component | Specification |
|-----------|---------------|
| CPU | 4 x AMD EPYC 7763 (64 cores, 2.45 GHz) |
| RAM | 256 GB DDR4 ECC Registered |
| GPU | 4 x NVIDIA A100 (80 GB HBM2e) |
| Storage (local) | 4 x 4 TB NVMe SSD (RAID 0) – for rapid model loading and temporary data processing |
| Network interface | 2 x 25 Gbps Ethernet |
| Operating system | CentOS Stream 9 |
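As a rough illustration of how incoming frames might be spread across the four A100s, the sketch below batches frames and assigns batches round-robin. The batch size and the scheduling policy are assumptions for this example; the project's actual dispatcher is described in Machine Learning Models.

```python
from typing import Iterable, Iterator

NUM_GPUS = 4      # one queue per A100 in a Processing Server
BATCH_SIZE = 32   # illustrative; in practice tuned to model and GPU memory

def make_batches(frames: list, batch_size: int = BATCH_SIZE) -> Iterator[list]:
    """Yield fixed-size batches of frames for GPU inference
    (the final batch may be smaller)."""
    for i in range(0, len(frames), batch_size):
        yield frames[i:i + batch_size]

def assign_round_robin(batches: Iterable[list], num_gpus: int = NUM_GPUS) -> dict[int, list]:
    """Spread batches across GPU queues round-robin so all
    four devices stay busy."""
    queues: dict[int, list] = {g: [] for g in range(num_gpus)}
    for n, batch in enumerate(batches):
        queues[n % num_gpus].append(batch)
    return queues
```

For example, 100 frames yield batches of 32, 32, 32, and 4, which land one per GPU queue.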

### Storage Servers

These servers provide persistent storage for the raw data, processed data, and model artifacts. Data archiving is critical, as discussed in Data Archiving Strategy.

| Component | Specification |
|-----------|---------------|
| CPU | 2 x Intel Xeon Gold 6338 (32 cores, 2.0 GHz) |
| RAM | 128 GB DDR4 ECC Registered |
| Storage | 12 x 16 TB SAS HDD (RAID 6) – total usable: ~160 TB (≈145 TiB) after dual parity |
| Network interface | 2 x 10 Gbps Ethernet |
| Operating system | Red Hat Enterprise Linux 8 |
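The usable-capacity figure follows directly from RAID 6's dual-parity overhead: two drives' worth of capacity is reserved for parity, leaving ten data drives. A quick check:

```python
def raid6_usable_tb(num_drives: int, drive_tb: float) -> float:
    """RAID 6 reserves two drives' worth of capacity for dual parity,
    so usable capacity is (n - 2) * drive size."""
    if num_drives < 4:
        raise ValueError("RAID 6 requires at least 4 drives")
    return (num_drives - 2) * drive_tb

# 12 x 16 TB array -> 160 TB raw usable. Filesystem overhead and the
# TB-vs-TiB distinction (160 TB is about 145 TiB) reduce what is
# actually available for data.
```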

## Software Configuration

The software stack is built around open-source technologies for maximum flexibility and cost-effectiveness. See Software Licensing for details on our licensing policy.
